dungar
Frequent Visitor

Refreshing one dataflow for multiple workspaces

For the record, we're in a Premium capacity environment.

 

In our environment, to help with the large number of reports we need to make, we've set up apps/workspaces for each department we distribute to, in large part because of how easy this makes handling permissions, since they're granted at the app level, not per report.

 

So, for example, we have an Audit workspace/app and a Security workspace/app.

 

Report 1 is unique to Audit; report 2, however, is used by both Audit and Security.

 

What I'm trying to accomplish is setting up one dataflow (ideally in a "Flow" workspace to keep things consolidated) with everything needed for report 2, and then publishing it to the Flow workspace. I'd like to have only ONE refresh per day on this data, and then share it out (and I need that refresh to be incremental).

 

So for next steps, I've created a report, pulled the data from the dataflow, and published it to both the Security and Audit workspaces.

 

From what I've seen, this does not accomplish what I want, as now both Audit and Security have a dataset that connects to the dataflow. The dataflow will refresh, but the Audit and Security datasets will not. Further, I'm not sure whether triggering the refresh will run incrementally (just pulling the recent data) or refresh the ENTIRE thing.

 

Does anyone know if this is possible? If not, how is this sort of distribution/shared-data problem supposed to be approached? I really don't want to have to clone my data everywhere and refresh it multiple times.

1 ACCEPTED SOLUTION

Almost!

1. Create the dataflow and set up my incremental refresh (archive everything older than 2 weeks, refresh the previous 2 weeks). Good!

2. Open Power BI Desktop, import everything I need from the dataflow, and create the report. Publish it to my "Flow" workspace. Good!

3. This will create a report and a dataset in the workspace. We don't care about the report; we care about the dataset. Good!

4. Copy the report I just made, this time importing from the dataset in the Flow workspace (should be trivial). Publish this to the Security and Audit workspaces.

Alternatively for step 4: open your report in the Flow workspace and save a copy to the Security and Audit workspaces. This will also create lineage from those workspaces back to the Flow workspace.

(screenshot: lineage view)

5. Set a scheduled refresh on your dataset in the Flow workspace to run after your dataflow refreshes.

The dataset will not refresh incrementally; it will do a full load. If you want the dataset to refresh incrementally, incremental refresh has to be configured in the query of the dataset itself.
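The two-week policy in step 1 can be sketched as simple date arithmetic. This is a hypothetical illustration in Python, not how Power BI implements it internally: in the service, the incremental-refresh policy injects `RangeStart`/`RangeEnd` boundaries into the source query and only reloads partitions inside that window, leaving older partitions archived untouched.

```python
from datetime import date, timedelta

def refresh_window(today, refresh_days=14):
    """Boundaries of the incremental-refresh window: only rows whose date
    falls in [range_start, range_end) are re-queried on each refresh;
    everything older stays in the archived partitions and is not touched."""
    range_end = today
    range_start = today - timedelta(days=refresh_days)
    return range_start, range_end

def rows_to_reload(rows, today, refresh_days=14):
    """Filter a row list (dicts with a 'Date' key, a made-up column name
    for this sketch) down to the rows a refresh would actually reload."""
    start, end = refresh_window(today, refresh_days)
    return [r for r in rows if start <= r["Date"] < end]
```

With `today = date(2022, 3, 22)` the window runs from 2022-03-08 up to (but not including) 2022-03-22, so a row dated 2022-03-10 is reloaded while a row dated 2022-02-01 stays archived.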

 


3 REPLIES
dungar
Frequent Visitor

Ok, to make sure I understand this, could you confirm this workflow is correct:

1. Create the dataflow and set up my incremental refresh (archive everything older than 2 weeks, refresh the previous 2 weeks).

2. Open Power BI Desktop, import everything I need from the dataflow, and create the report. Publish it to my "Flow" workspace.

3. This will create a report and a dataset in the workspace. We don't care about the report; we care about the dataset.

4. Copy the report I just made, this time importing from the dataset in the Flow workspace (should be trivial). Publish this to the Security and Audit workspaces.

Doing this seems to give me the lineage I would expect, but it leaves me with one question: will the dataSET refresh incrementally like the dataFLOW, or will it pull from the entire dataflow each time?

Tutu_in_YYC
Resident Rockstar

Hi Dungar,

A report is driven by a dataset, not a dataflow. A dataset can pull data from dataflows, so if you use both a dataflow and a dataset, you need to configure refresh for both.

Refreshes have to be scheduled in a specific sequence: the dataflow should refresh first. Once that completes, the dataset can refresh by pulling data from the dataflow.
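That dataflow-first sequencing can be sketched against the Power BI REST API. This is a hedged sketch in Python, assuming you already have an Azure AD bearer token and the workspace (group), dataflow, and dataset IDs; the endpoint paths follow the documented Refresh Dataflow, Get Dataflow Transactions, and Refresh Dataset operations, but the exact status strings returned by the transactions endpoint are an assumption here.

```python
import json
import time
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def _call(method, path, token, body=None):
    """Minimal helper for authenticated Power BI REST calls."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        f"{API}{path}", data=data, method=method,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        raw = resp.read()
        return json.loads(raw) if raw else None

def wait_until_done(poll_status, interval=30, timeout=3600, sleep=time.sleep):
    """Poll `poll_status()` until it reports success; raise on failure/timeout."""
    waited = 0
    while waited <= timeout:
        status = poll_status()
        if status == "Success":
            return status
        if status in ("Failed", "Cancelled"):
            raise RuntimeError(f"refresh ended as {status!r}")
        sleep(interval)
        waited += interval
    raise TimeoutError("refresh did not complete within the timeout")

def refresh_dataflow_then_dataset(token, group_id, dataflow_id, dataset_id):
    # 1. Kick off the dataflow refresh first.
    _call("POST", f"/groups/{group_id}/dataflows/{dataflow_id}/refreshes",
          token, body={"notifyOption": "NoNotification"})

    # 2. Wait for the latest dataflow transaction to finish
    #    ("status" field and its values are assumptions in this sketch).
    def latest_status():
        txns = _call("GET",
                     f"/groups/{group_id}/dataflows/{dataflow_id}/transactions",
                     token)
        return txns["value"][0]["status"] if txns["value"] else "InProgress"
    wait_until_done(latest_status)

    # 3. Only then refresh the dataset that imports from the dataflow.
    _call("POST", f"/groups/{group_id}/datasets/{dataset_id}/refreshes", token)
```

In practice the service's built-in scheduled refresh (dataset scheduled shortly after the dataflow) is the simpler route; this kind of chaining only matters if you want the dataset refresh to start the moment the dataflow finishes.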

For your case:
1 dataflow > 2 datasets > 2 reports

 

You can simplify to:
1 dataflow > 1 dataset > 2 reports

Check out this link on how to use one dataset for different reports in different workspaces:
https://docs.microsoft.com/en-us/power-bi/connect-data/service-datasets-across-workspaces

 

 
