Hello everyone,
We are working on a project where we need to ingest data from an on-premises server through a data gateway. The ingestion process uses a Gen2 dataflow with several queries that connect to the data source. However, we have hit an issue because we are also pulling metadata from a metadata table created in a lakehouse in the Fabric workspace.
The problem is that, according to the errors we are seeing, all queries in a dataflow must come from a single source when a gateway is used. This limitation is blocking the transformations we want to perform between the metadata table and the data ingested from the on-premises server.
The metadata table contains the batch ID and the last-ingested watermark, which are essential for incremental ingestion. We need this information to collect only the new entries that are not yet present in the existing data and store them in the lakehouse.
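To make the intent concrete, here is a minimal Python sketch of the watermark logic described above. The metadata row, the column names (`batch_id`, `last_ingested`, `loaded_at`), and the sample data are all assumptions for illustration, not the actual schema:

```python
# Hypothetical sketch: the metadata table holds a batch ID and a
# last-ingested watermark; only source rows newer than the watermark
# are kept, and the watermark is then advanced for the next batch.
from datetime import datetime

# Assumed single metadata row kept in the lakehouse.
metadata = {"batch_id": 7, "last_ingested": datetime(2024, 1, 15)}

# Assumed shape of the rows pulled from the on-premises server.
source_rows = [
    {"id": 1, "loaded_at": datetime(2024, 1, 10)},  # already ingested
    {"id": 2, "loaded_at": datetime(2024, 1, 20)},  # new
    {"id": 3, "loaded_at": datetime(2024, 1, 25)},  # new
]

def incremental_rows(rows, watermark):
    """Keep only rows that arrived after the stored watermark."""
    return [r for r in rows if r["loaded_at"] > watermark]

new_rows = incremental_rows(source_rows, metadata["last_ingested"])

# Advance the metadata for the next incremental run.
metadata = {
    "batch_id": metadata["batch_id"] + 1,
    "last_ingested": max(r["loaded_at"] for r in new_rows),
}
```

In the real dataflow this filter would be a Power Query step (or a source-side WHERE clause) rather than Python, but the comparison against the stored watermark is the same.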
Given this context, we have two main questions:
We understand that the use case is fairly complex, but we are eager to use Fabric for this problem. Please feel free to ask for clarification if it would help you provide a better answer.
Regards
@Rocco_Praelexis , I do not think Dataflow Gen2 is limited to a single data source. When adding a table, you can choose another data source, use Edit connection to rename the connection, and provide the credentials and gateway again.
As of now, you cannot pass parameters to a dataflow; you can only pass them between data pipeline activities.
So to pass values to the dataflow, store them in a table and have the dataflow read that table. I have done that for incremental loads.
#microsoftfabric- How to use Append data option of Data Flow Gen 2 and Data Pipeline| Incremental- https://youtu.be/E5axX6auBN4
Microsoft Fabric: Load incremental data from local SQL server to Warehouse using on-premise Python- https://youtu.be/gBGiWGJS5Cs
Microsoft Fabric: How to append only Incremental data using Data Pipeline in Lakehouse- https://youtu.be/xzhmaQnB2fM
Microsoft Fabric: Incremental ETL for Warehouse using Data Pipeline, SQL Procedure- https://youtu.be/qsOIfTzjCSQ
Microsoft Fabric: Incremental ETL for Warehouse using Dataflow Gen 2, SQL Procedure, Data Pipeline- https://youtu.be/mpFRnZBXsvQ
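The parameter-table workaround described above can be sketched as follows. This is a hypothetical simulation of the pattern, not Fabric API code: the function names and the single-row table layout are assumptions for illustration:

```python
# Hypothetical sketch of the workaround: since a Dataflow Gen2 cannot
# receive parameters directly, the pipeline writes the run parameters to
# a small table, and the dataflow reads that table at refresh time.

# In-memory stand-in for a single-row parameter table in the lakehouse.
parameter_table = []

def pipeline_write_parameters(table, batch_id, cutoff):
    """Simulates the pipeline activity that stores run parameters."""
    table.clear()  # keep exactly one current row
    table.append({"batch_id": batch_id, "cutoff": cutoff})

def dataflow_read_parameters(table):
    """Simulates the dataflow's first query, which reads the table."""
    return table[0]

# Pipeline run: write parameters, then trigger the dataflow refresh.
pipeline_write_parameters(parameter_table, batch_id=8, cutoff="2024-01-25")
params = dataflow_read_parameters(parameter_table)
```

In practice the write step would be a Copy activity or stored procedure in the pipeline, and the read step would be a query in the dataflow that filters the source using the stored cutoff.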
Hi @Rocco_Praelexis
We haven't heard from you since the last response and were just checking back to see whether you have a resolution yet. If not, please reply with more details and we will try to help.
Thanks