DebbieE
Community Champion

Deployment Pipelines and Azure Devops

I have just been asked a question that I can't answer. I presented the Power BI June update to the company today, which mentioned that dataflows are now supported in deployment pipelines.

 

Someone mentioned that you can actually do the deployment in Azure DevOps using a DevOps pipeline, and because of this, what is the point of the deployment pipelines in the Power BI Service?

 

They said the governance around this should be that deployment pipelines IN the Power BI Service are for people without Azure DevOps, and if you do have Azure DevOps you should do it there. And actually, isn't it quite confusing having the two options?

 

What does everyone else think about this? Is that the case? If you have DevOps, should you do the pipelines there rather than in the Service?

 

Are they both kept equally up to date? Are there reasons to use one or the other? I would love some opinions on this.


7 REPLIES
Nimrod_Shalit
Power BI Team

I'll elaborate on @jeffshieldsdev's answer:

Since May, Power BI deployment pipelines have REST APIs that can be triggered from ADO pipelines to promote content through the stages of a Power BI pipeline. These APIs do essentially the same operation you can do through the Power BI Service UI, but with all the value that ADO provides: scheduling deployments, cascading multiple deployments, approvals, tests and more.
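As a rough illustration (not an official sample), a call from an automated job such as an Azure DevOps step might look like the Python sketch below. It assumes a service principal that has been granted access to the pipeline; the IDs are placeholders, and the request/response shapes should be checked against the 'Pipelines - Deploy All' documentation.

```python
# Rough sketch only: trigger a "Deploy All" on a Power BI deployment pipeline
# from an automated job (e.g. an Azure DevOps step). Tenant/app/pipeline IDs
# are placeholders; assumes the service principal has access to the pipeline.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"
PIPELINE_ID = "<deployment-pipeline-id>"

# Token for the Power BI REST API
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

# Promote everything from the source stage (0 = Development) to the next stage
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "sourceStageOrder": 0,
        "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
    },
)
response.raise_for_status()
print("Deployment request accepted:", response.status_code)
```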

 

What your colleagues asked about (I think) is why even use Power BI pipelines when you can just use ADO pipelines and trigger the regular Power BI REST APIs.

Well, there are quite a few advantages. To name some:

  1. Power BI pipelines allow for a holistic deployment of almost all Power BI content together, while the regular REST APIs mostly support single artifacts that aren't related.
  2. In pipelines we also maintain the original connections between artifacts, as they were in the source stage, so there's no need for 'rebind' APIs (see the sketch after this list).
  3. Rules: PBI pipeline rules allow you to keep the connection and data pointing at the target-stage source, while with the regular APIs you need to set it after the import and refresh again, which causes downtime.
  4. An easier way to publish reports and datasets separately, even if you didn't separate them into different PBIX files.
  5. We are now working on APIs that will enable devs to build PBI pipelines from scratch, so that everything can be automated end to end.
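To illustrate point 2, this is roughly the manual rebind step that pipelines spare you when you deploy with the regular APIs: after importing a report into the target workspace, you would have to point it at the right dataset yourself. The call sketched is the 'Reports - Rebind Report In Group' API; the token and IDs are placeholders.

```python
# Rough sketch of the manual rebind you'd otherwise need (placeholder IDs);
# a deployment pipeline keeps the report-to-dataset connection for you.
import requests

TOKEN = "<bearer-token>"                # acquired as in the earlier sketch
TARGET_WORKSPACE_ID = "<workspace-id>"
REPORT_ID = "<report-id>"
TARGET_DATASET_ID = "<dataset-id>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{TARGET_WORKSPACE_ID}"
    f"/reports/{REPORT_ID}/Rebind",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"datasetId": TARGET_DATASET_ID},
)
resp.raise_for_status()
```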

Hi @Nimrod_Shalit, I am at the start of the process of setting up deployment pipelines (not sure if I should use PBI Pipelines or Azure Pipelines yet) and I came across this thread.
In point 5, you mention devs will be able to build PBI pipelines from scratch. Would you be able to elaborate a bit more on this? Does it mean we will be able to add triggers/schedules to the PBI pipeline service? For example, if I use Azure Repos for version control and then upload a new version of a report, will PBI pipelines pick this up and trigger a deployment?

Also, is Azure Repos the right version control tool for Power BI dashboards?

 

Thanks.

@shyammayhs,

  1. 'Building pipelines from scratch' means creating them through APIs (create a new pipeline, assign workspaces, grant access), not triggers for deployment from within the Power BI service. There's a rough sketch of those calls after this list.
  2. To trigger a deployment today, you can use the 'Deploy' API. You still need a way to call it; I believe an Azure pipeline is a good way to implement that. We are working on an ADO extension for deployments, so it will be even easier to integrate into Azure pipelines.
  3. Power BI dashboards, unlike reports that are managed through PBIX files, don't have a file format; hence, there's no way to manage version control for them through Azure Repos.
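For a sense of what point 1 looks like in practice, here is a rough sketch based on the documented 'Pipelines - Create Pipeline' and 'Pipelines - Assign Workspace' REST calls. The token, workspace ID and pipeline name are placeholders, and the necessary permissions are assumed to be in place.

```python
# Rough sketch (placeholder IDs/names): create a deployment pipeline and
# assign an existing workspace to its Development stage via the REST API.
import requests

TOKEN = "<bearer-token>"
DEV_WORKSPACE_ID = "<workspace-id>"
BASE = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Create the pipeline ('Pipelines - Create Pipeline')
pipeline = requests.post(
    f"{BASE}/pipelines",
    headers=headers,
    json={"displayName": "My deployment pipeline"},
)
pipeline.raise_for_status()
pipeline_id = pipeline.json()["id"]

# 2. Assign a workspace to the Development stage (stage order 0)
#    ('Pipelines - Assign Workspace')
assign = requests.post(
    f"{BASE}/pipelines/{pipeline_id}/stages/0/assignWorkspace",
    headers=headers,
    json={"workspaceId": DEV_WORKSPACE_ID},
)
assign.raise_for_status()
```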

Hi. Thank you for the reply. My colleague who is testing this asked about the following point: "PBI pipeline rules allow you to maintain connection and data to the target source, while with regular APIs you will need to set it after the import and refresh again, which causes downtime."

 

She said that after testing in DevOps, "once something is deployed it seemed to maintain that connection to the data source, but you will need to refresh if you update the data source." So I am wondering if we have misread your advice on this one?

If you are using the same data source connection across Dev/Test/Prod, then there's no problem using the 'Import' API. But if you want to switch connections between environments, as many customers do, this is where rules have the advantage over the regular 'Import' API.
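For anyone comparing the two approaches, here is a rough sketch of the 'regular API' route being described: import the PBIX into the target workspace, repoint the connection via a parameter, then refresh. Until that refresh finishes, the dataset has no (or stale) production data, which is the downtime that deployment rules avoid. The workspace/dataset IDs, parameter name and file name are placeholders.

```python
# Rough sketch of the regular-API route (placeholder IDs, parameter and file).
import requests

TOKEN = "<bearer-token>"
PROD_WORKSPACE_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"          # id of the dataset created by the import
BASE = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Import the PBIX into the Prod workspace ('Imports - Post Import In Group')
with open("SalesReport.pbix", "rb") as f:
    requests.post(
        f"{BASE}/groups/{PROD_WORKSPACE_ID}/imports?datasetDisplayName=SalesReport.pbix",
        headers=headers,
        files={"file": f},
    ).raise_for_status()

# 2. Switch the connection via a parameter ('Datasets - Update Parameters In Group')
requests.post(
    f"{BASE}/groups/{PROD_WORKSPACE_ID}/datasets/{DATASET_ID}/Default.UpdateParameters",
    headers=headers,
    json={"updateDetails": [{"name": "ServerName", "newValue": "prod-sql.contoso.com"}]},
).raise_for_status()

# 3. Refresh so the dataset actually contains Prod data
#    ('Datasets - Refresh Dataset In Group')
requests.post(
    f"{BASE}/groups/{PROD_WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes",
    headers=headers,
).raise_for_status()
```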

jeffshieldsdev
Solution Sage

My understanding is you can use Azure DevOps to issue Deployment Pipeline commands via REST API. Your reports, datasets and dataflows still need to reside in development, test and production Workspaces--you can just use DevOps to promote between Workspaces.
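One practical note if you go this route: the deployment call is asynchronous, so an Azure DevOps step will usually want to poll the operation before promoting further. A rough sketch follows, using the 'Pipelines - Get Pipeline Operation' endpoint with placeholder IDs; the status strings are an assumption based on the documented operation states.

```python
# Rough sketch: poll a deployment operation started by the earlier
# 'deployAll' call so a DevOps stage only proceeds once it finishes.
# Pipeline/operation IDs are placeholders.
import time
import requests

TOKEN = "<bearer-token>"
PIPELINE_ID = "<deployment-pipeline-id>"
OPERATION_ID = "<operation-id>"   # returned when the deployment was started

url = (
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}"
    f"/operations/{OPERATION_ID}"
)

status = None
while status in (None, "NotStarted", "Executing"):
    time.sleep(15)  # wait between polls
    op = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    op.raise_for_status()
    status = op.json().get("status")

print("Deployment finished with status:", status)
```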

I possibly didn't explain this clearly. The issue is that there are deployment pipelines in the Power BI Service, and you can also run those deployment pipelines from DevOps. I think there have been some updates so you can actually create pipelines now from DevOps.

 

I've been questioned on the point of having deployment pipelines in the Service when you have DevOps.

I understand that the objects are still in your workspaces in the Service.

 

1. If you have DevOps and can run the deployment pipelines from there, should you, or is there a case for using deployment pipelines in the Service?

2. Is the use case for deployment pipelines in the Service when you don't have DevOps?

3. Are the two very similar or is one better than the other?
