harold
Regular Visitor

Request: Deployments pipelines should support centralized datasets

The link below recommends using a centralized workspace for datasets.

https://docs.microsoft.com/en-us/power-bi/create-reports/deployment-pipelines-best-practices


However, in Deployment Pipelines -> Deployment Settings, when creating a "Dataset Rule", the only way to use another dataset is to add a new connection string, which creates a new dataset in the workspace.

 

This seems to contradict the suggestion to use Workspace with centralized datasets.

"Deployment pipelines best practices

If you're using centralized datasets that are used across the organization, we recommend that you create two types of workspaces:

  • Modeling and data workspaces - These workspaces will contain all the centralized datasets"

 

From my understanding, it should be possible to link a Dataset Rule to a dataset in another workspace.

This would make it possible to have one dataset per environment (e.g. DEV) shared by all pipelines.
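In the meantime, the repointing that such a rule would perform can be scripted: the Power BI REST API's "Reports - Rebind Report In Group" call rebinds a report to a different dataset, including one in another workspace. Below is a minimal Python sketch, assuming an Azure AD access token with the appropriate scope (Report.ReadWrite.All); all workspace/report/dataset IDs are hypothetical placeholders.

```python
# Sketch of a cross-workspace rebind via the Power BI REST API.
# "Reports - Rebind Report In Group":
#   POST /v1.0/myorg/groups/{workspaceId}/reports/{reportId}/Rebind
# All IDs below are hypothetical placeholders.
import json
from urllib import request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def rebind_request(workspace_id: str, report_id: str, target_dataset_id: str):
    """Build the URL and JSON body for the Rebind Report In Group call."""
    url = f"{API_ROOT}/groups/{workspace_id}/reports/{report_id}/Rebind"
    body = {"datasetId": target_dataset_id}
    return url, body

def rebind_report(token: str, workspace_id: str, report_id: str,
                  target_dataset_id: str) -> int:
    """POST the rebind (needs an AAD token with Report.ReadWrite.All scope)."""
    url, body = rebind_request(workspace_id, report_id, target_dataset_id)
    req = request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status
```

Run once per report after each deployment, pointing the deployed copy at the centralized dataset for that stage.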


1 ACCEPTED SOLUTION
Nimrod_Shalit
Power BI Team

Hi @harold ,

Reports can be managed in pipelines even if they are connected to external datasets.

Whenever you deploy a report update across the stages, it will keep pointing to the same shared dataset.

You can also manage the 'central datasets' workspace in pipelines.

 

However, managing both datasets and connected reports in separate pipelines does not yet work in an optimal way, as there's no built-in way to connect each dataset in each stage with the correlated reports in that stage.

 

We recommend that, for now, you connect all reports in all stages to the Prod dataset. In the 'central datasets' pipeline, keep a few reports for testing in each stage, so that any change to the datasets is verified against reports before hitting Prod.

 

If you still encounter problems in some reports, you can use the Dev/Test stages to fix them in each pipeline, and then deploy to Prod.

 

Hope it was clear.


7 REPLIES

This is not an acceptable solution, in my honest opinion. Here is an example of where the Deployment Pipeline functionality breaks down.

Let's say I have my "Finance Dataset" in the production centralized workspace "Certified Datasets", and a "Finance" app with an associated deployment pipeline. Within that pipeline, there is a P&L report. I want to test an update we just made to one of our database views, to fix an issue in the existing production P&L report. I obviously don't do this immediately in the production Finance Dataset; I want to do it in a test dataset. So I deploy the test dataset with my dataset updates to our "Test Datasets" workspace.

What I would like is the ability to set a "Dataset rule" for my P&L front-end .pbix file so that when I deploy it from the Test workspace to the Prod workspace, it switches the connection from the Finance Dataset in the Test Datasets workspace to the Finance Dataset in the Certified Datasets workspace.


Right now, that isn't possible. I only have the option of retaining the connection that was established in the Test version of the P&L report throughout the deployment pipeline process.

What is even more frustrating is the lack of clarity on Microsoft's part for how to handle this situation (which many customers will run into, if they are following Microsoft's documented guidance on an Enterprise Power BI deployment).

If this functionality is not planned on coming to Deployment Pipelines, what is the recommended solution?

@daxman thanks for the thorough explanation. We understand the difficulties in implementing the desired use case. I can share that we are planning to address this issue and have multiple pipelines be 'aware' of each other when different items are connected across pipelines. 

We will start working on it this year, but I don't have a timeline to share yet.

Hi @Nimrod_Shalit , 

 

we just decided to introduce Power BI Premium at our company and are stumbling upon quite a few missing features in Power BI (compared to BI products from other vendors). Among them is the missing support for the centralized datasets you promote in combination with deployment pipelines, as mentioned by @harold. This is a critical issue for us.

 

Can you give us an update on the issue, please? Is there an idea available to vote on, or can you point us to another solution, e.g. based on PowerShell scripts?

 

And I do not understand how this question can be declared as "Solved", since it is not!

 

Thanks

Daniel 

@daamruth @daxman,

We are now working to support this scenario. 

It is part of the release notes for the Oct-Apr 22 timeframe.

See here: Deployment pipelines - multiple pipelines working together | Microsoft Docs

Thank you for keeping everyone in the loop! Your quick response is much appreciated. Cheers

lbendlin
Super User

This may be a mixing of two concepts - or me misunderstanding the issue.

 

Concept 1: the ability to use shared datasets across multiple workspaces. This allows disparate audiences to be served from the same set of master data, filtered/restricted according to the audience. It reduces development effort, avoids duplication of data, and gets you closer to the "one source of truth" fallacy. Sorry, ideal. I meant to say ideal.

 

Concept 2: the ability to speed up the development and testing process without impacting actual production systems. DEV and ITG workspaces can be pointed to datasets that hold a reduced but still representative subset of the data, and can also emulate flush&fill events that may not be desirable in production.

 

Neither concept needs deployment pipelines, but the second concept sure benefits from them.
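Concept 2 can be scripted today with the "Datasets - Update Parameters In Group" REST call, provided the dataset exposes its connection details as Power Query parameters. A sketch under that assumption; the parameter name "ServerName" and all IDs are hypothetical placeholders:

```python
# Sketch: point a DEV workspace dataset at a reduced test source by updating
# its Power Query parameters via 'Datasets - Update Parameters In Group'.
# The parameter name and all IDs are hypothetical placeholders.
API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_request(workspace_id: str, dataset_id: str,
                              params: dict):
    """URL and body for the UpdateParameters call."""
    url = (f"{API_ROOT}/groups/{workspace_id}/datasets/{dataset_id}"
           "/Default.UpdateParameters")
    body = {"updateDetails": [{"name": name, "newValue": value}
                              for name, value in params.items()]}
    return url, body
```

POSTing the result (with a bearer token) and then refreshing the dataset points the DEV stage at the smaller source without touching production.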
