HamidBee
Impactful Individual

What are the most effective Activities you use when transforming data via a pipeline?

I am relatively new to Azure Synapse and I'm trying to build some pipelines. There is a wide variety of activities to choose from. What are the most effective activities you use when building pipelines? For example, there are Databricks Notebook and Synapse Notebook activities. I'm not sure which one to choose, but I would imagine one of those notebooks would be enough to transform an entire dataset in a single script.

1 ACCEPTED SOLUTION
GeethaT-MSFT
Community Support

Hi @HamidBee, thanks for posting your question in the Microsoft Fabric Community.

Databricks Notebook and Synapse Notebook are both powerful activities for data transformation. A Databricks notebook runs on Azure Databricks, a fully managed, cloud-based platform for Apache Spark workloads; a Synapse notebook is a serverless, web-based environment for running Spark workloads within Synapse itself. The basic guideline is: if your workload already lives in Synapse, use a Synapse notebook; if it lives in Databricks, use a Databricks notebook. Either notebook can transform an entire dataset in a single script, so the choice comes down to your specific requirements. If you need a fully managed platform with advanced features, the Databricks Notebook activity may be the better choice; if you need a serverless environment with a simpler interface, the Synapse Notebook activity may be the better fit.

Regards

Geetha


2 REPLIES

Thank you very much for clarifying this for me.
