
Dataflows deployed from a Deployment Pipeline - invalid data source

I have created a dataflow that has a single data source (a SQL database). Here it is in lineage view in my Dev workspace.

 

Dataflow lineage from DEV workspace

 

 

 

I am using a Power BI Deployment Pipeline to deploy that dataflow from Workspace Dev to Workspace Test. There is a rule applied that changes the data source from the Dev database and server specified in Workspace Dev to a different Test database and server in Workspace Test. (In various labels on these screenshots, you'll see "rt" in the naming convention which represents Dev, and "pp" which represents Test).
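For context, the dataflow's query originally hard-codes the connection. A simplified sketch of the Power Query (M) behind it might look like this (server, database, and table names here are placeholders, not the real ones); the deployment rule is supposed to rewrite the literal server/database values when deploying from Dev to Test:

```
// Hypothetical M for the dataflow query; the "rt" names stand in for
// the Dev connection that the deployment rule is meant to replace.
let
    Source = Sql.Database("rt-server.database.windows.net", "rt-db"),
    Data   = Source{[Schema = "dbo", Item = "MyTable"]}[Data]
in
    Data
```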

 

This is the rule I've defined:

 

Deployment rule to map datasource from Dev to Test

 

 

When I look at the deployed dataflow in Workspace Test, it now shows lineage for two data sources, both Dev and Test. 


 

OCA Dataflow lineage in Test workspace

The top SQL Server database is the Dev connection (unused and invalid), the bottom one is the Test connection.

 

When I export the JSON of the dataflow in Workspace Test, it shows only a single data source, and it correctly points to the Test database.
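For reference, the exported JSON embeds the dataflow's Power Query document, so the data source appears as a literal inside the mashup text. A heavily simplified, hypothetical excerpt (the real export has a richer schema, and these names are placeholders) would contain only the Test connection:

```
{
  "name": "OCA Dataflow",
  "pbi:mashup": {
    "document": "section Section1; shared MyTable = let Source = Sql.Database(\"pp-server.database.windows.net\", \"pp-db\") in Source;"
  }
}
```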

 

This looks like a bug in the deployment pipeline.

 

The problem for me is that I need to configure the gateway connection for the SQL database. Due to company security policies, the gateway configured for the Test environment should only have access to the Test Azure SQL Server instances in a virtual network, not the Dev instances. However, the data source configuration for the Test dataflow requires me to map *both* data sources to the same gateway. Since the Dev database isn't actually used, I can configure a dummy data source in the gateway for the Dev instance; because it never gets queried, it won't fail. But I'm still left with both Dev and Test data sources in the Test gateway.

 


 

OCA dataflow gateway config TEST

If the dataflow lineage were correct, the first data source wouldn't be here at all, yet I have to configure this (invalid and unused) data source in the gateway in order to map both data sources to the same gateway.

 

What I'm *expecting* is that the dataflow deployed into the Test workspace has only one data source in its lineage, the one that is updated by the deployment rule. The original data source should not be present in the lineage of the dataflow. 

 

Mike

Status: New
Comments
v-robertq-msft
Community Support

Hi,

According to my research, during deployment, the following item properties are copied and overwrite item properties at the target stage:

Data sources (deployment rules are supported)

Parameters (deployment rules are supported)

Report visuals

Report pages

Dashboard tiles

Model metadata

Item relationships

 

You can also follow this link to check whether you are hitting one of the limitations of this process:

https://docs.microsoft.com/en-us/power-bi/create-reports/deployment-pipelines-process

 

Best Regards,

Community Support Team _Robert Qin

xhead
Helper II

I have reported this issue through a Power BI Support ticket, and it has been acknowledged as a bug which involves the Data Flow team, the Lineage team, and the Deployment team. The estimated time the bug will be resolved is Feb 2022.

 

I have been able to partially work around this issue by parameterizing the database and server names in the connection and applying deployment rules to those parameter values. In the target workspace, the old data source dependency still appears in lineage view, but that "phantom" data source no longer needs to be mapped to a data gateway in the dataflow's settings.

 

When I didn't parameterize the data source server and database, I had to map the "phantom" data source to the same data gateway as the valid data source.
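The parameterized version of the query looks roughly like this in M (parameter and table names are illustrative); a parameter rule in the deployment pipeline then overrides the values per stage instead of rewriting the data source itself:

```
// ServerName and DatabaseName are dataflow parameters; deployment-pipeline
// parameter rules override their values per stage (e.g. "rt" in Dev, "pp" in Test).
let
    Source = Sql.Database(ServerName, DatabaseName),
    Data   = Source{[Schema = "dbo", Item = "MyTable"]}[Data]
in
    Data
```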