eason
Frequent Visitor

DataMart data source credential and connection

A datamart only lets us choose either a gateway or a non-gateway connection for a data source, but I need to combine gateway data and non-gateway data in one datamart. It's proving difficult for me.

 

1 ACCEPTED SOLUTION
v-zhangti
Community Support

Hi, @eason 

 

To combine gateway data and non-gateway data in a datamart, you can use Azure Data Factory. First, create a data gateway to move data between on-premises sources and the cloud. In the Data Factory Editor, click ... More on the toolbar and then click New data gateway; alternatively, right-click Data Gateways in the tree view and click New data gateway. On the Create page, enter a name for the gateway and click OK. On the Configure page, click Install directly on this computer. This downloads the gateway installation package, then installs, configures, and registers the gateway on that computer.
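Purely as an illustrative sketch (not part of the steps above): in the current Azure Data Factory (v2) Python SDK the on-premises gateway role is played by a self-hosted integration runtime, and the same registration step can be scripted roughly like this. Every name below is a placeholder, not something from the original post.

# Sketch: register a self-hosted integration runtime with the ADF v2 Python SDK.
# Assumes azure-identity and azure-mgmt-datafactory are installed; all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    SelfHostedIntegrationRuntime,
)

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Create the runtime definition in the factory; the runtime software still has to be
# installed and registered on the on-premises machine, as described above.
ir = client.integration_runtimes.create_or_update(
    resource_group,
    factory_name,
    "OnPremisesRuntime",
    IntegrationRuntimeResource(
        properties=SelfHostedIntegrationRuntime(
            description="Bridges on-premises sources to the cloud"
        )
    ),
)
print(ir.name)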

 

Once the data gateway is set up, you can use Azure Data Factory to move data between on-premises and the cloud. The Copy Data tool copies data from a source to a destination, and that copy runs inside a pipeline, which is a logical grouping of activities that together perform a task. You can create a pipeline by clicking the + (plus) button in the Data Factory Editor and then clicking Pipeline.

 

To copy data from both sources to a destination, create a pipeline that includes two Copy Data activities: one that copies data from the gateway data source to the destination, and one that copies data from the non-gateway data source to the destination. Add each Copy Data activity by dragging it from the Activities toolbox to the pipeline canvas, then click it and open the Settings tab to configure it. In the Settings tab you specify the source and destination data stores, the data integration runtime, and the copy behavior.
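For illustration only, the same two-activity pipeline can be defined with the ADF Python SDK. This is a minimal sketch that assumes the linked services and datasets already exist; the dataset and pipeline names below are placeholders, not from the original post.

# Sketch: one pipeline with two Copy activities, one per source. Assumes datasets
# named OnPremSqlTable (reached through the gateway / self-hosted runtime),
# CloudSourceBlob, and DestinationBlob already exist; all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource,
    CopyActivity,
    DatasetReference,
    SqlSource,
    BlobSource,
    BlobSink,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

destination = DatasetReference(reference_name="DestinationBlob")

# Copy 1: the on-premises (gateway) source.
copy_onprem = CopyActivity(
    name="CopyFromOnPremSource",
    inputs=[DatasetReference(reference_name="OnPremSqlTable")],
    outputs=[destination],
    source=SqlSource(),
    sink=BlobSink(),
)

# Copy 2: the cloud (non-gateway) source.
copy_cloud = CopyActivity(
    name="CopyFromCloudSource",
    inputs=[DatasetReference(reference_name="CloudSourceBlob")],
    outputs=[destination],
    source=BlobSource(),
    sink=BlobSink(),
)

client.pipelines.create_or_update(
    "<resource-group>",
    "<data-factory-name>",
    "CombineSourcesPipeline",
    PipelineResource(activities=[copy_onprem, copy_cloud]),
)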

 

If you need to transform data as you copy it, you can use the Data Flow tool in Azure Data Factory. A data flow is a set of transformations applied to data as it moves from a source to a destination. You can create one by clicking the + (plus) button in the Data Factory Editor and then clicking Data Flow.

 

To run the transformation as part of the copy, add a Data Flow activity to the pipeline by dragging it from the Activities toolbox to the pipeline canvas. Then click the activity and open the Settings tab to configure it: there you specify the source and destination data stores, the data integration runtime, and the data flow to run.
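Again purely as an illustrative sketch (assuming a mapping data flow named CombineAndTransform has already been authored in the factory; the name is a placeholder), the Data Flow activity can be added to a pipeline with the same SDK:

# Sketch: run an existing mapping data flow from a pipeline. ExecuteDataFlowActivity
# is the SDK model behind the Data Flow activity; all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource,
    ExecuteDataFlowActivity,
    DataFlowReference,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The sources, sinks, and transformation logic live inside the authored data flow;
# the activity only points at it.
run_data_flow = ExecuteDataFlowActivity(
    name="RunCombineAndTransform",
    data_flow=DataFlowReference(reference_name="CombineAndTransform"),
)

client.pipelines.create_or_update(
    "<resource-group>",
    "<data-factory-name>",
    "TransformPipeline",
    PipelineResource(activities=[run_data_flow]),
)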

Power Query activity in Azure Data Factory - Azure Data Factory | Microsoft Learn

Understand datamarts (preview) - Power BI | Microsoft Learn

 

Best Regards,

Community Support Team _Charlotte

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

