
Anonymous
Not applicable

Multiple reports from a single dataset in desktop

Hi All, 

 

Background:

My company has a DWH and divisional datamarts.  Recently, they have published a divisional Data Model to the BI team, who are now in the process of building out a suite of operational reports - working with IT to extend the datamart/data model as new business requirements are received over time.

 

As a senior BA, however, I need to conduct ad hoc and investigative data analysis to support multiple projects within the same division.  My requirement is for access to an editable dataset, drawn from the same datamart, so I can build out custom columns, measures, and tables, and even join new data sources to this dataset, according to the differing project requirements.

 

It's unlikely that anything more than a handful of the new elements I create would ever need to be built into the divisional data model, as they are transient and project-specific in nature.

 

I have been supplied a .pbix file by IT, which contains 4 tables - each connecting to the datamart as a source.

 

Problem Statement

I know I can create multiple reports from a single dataset using the PBI service; however, as I need to be able to write custom DAX and Power Query (M), I need to be able to create these reports using PBI Desktop.

 

Over time, it's likely that IT will need to add tables beyond those already exposed in my project-specific dataset, or pull additional columns through from the datamart for the existing tables in my dataset.

 

I'm wondering if there is a way for me to have multiple reports hanging off the same dataset, but with these reports editable in PBI Desktop?  If so, when the underlying dataset needs to be updated/extended, I need only make the change in a single dataset and it will update all the associated reports.

 

At present, the only method I can see is to create my reports in PBI Desktop, publishing each to a workspace when complete, but effectively duplicating the underlying dataset.  When I need to update/extend this dataset, I'll need to make the changes in each standalone copy for each of the differing reports I've created.

 

In researching this, I've come across the following article, but it doesn't seem to match my specific scenario - https://community.powerbi.com/t5/Community-Blog/Power-BI-Usage-Models-in-Pictures/ba-p/1342820 

 

If anyone can offer any advice, I'd be very grateful!

 

Many thanks,

 

Chris

 


7 REPLIES
DataInsights
Super User

@Anonymous,

 

It sounds as if a composite model is a good option for your requirements. In Power BI Desktop, connect to the master published dataset. Then, add other data sources, measures, etc. This will result in a composite model (combination of DirectQuery and Import sources).

 

You'll need to enable the Preview feature below in Power BI Desktop (Options):

 

"DirectQuery for PBI datasets and AS"

 

https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-composite-models 
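As a rough sketch of the "add other data sources" step (the file path, sheet, and column names here are hypothetical), a local Import table added in Power Query alongside the live connection might look like this:

let
    // Hypothetical project-specific source imported alongside the
    // live connection to the published dataset. Adding any local
    // table like this is what turns the model into a composite model
    // (DirectQuery to the dataset + Import).
    Source = Excel.Workbook(File.Contents("C:\Projects\ProjectRisks.xlsx"), null, true),
    RisksSheet = Source{[Item="Risks",Kind="Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(RisksSheet, [PromoteAllScalars=true])
in
    PromotedHeaders

Once imported, you can relate this table to the dataset's tables and write measures across both.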









Anonymous
Not applicable

Hi @DataInsights 

 

Just wondering if you've had any thoughts based on my response above perhaps?

 

I've had the concept of Template files shared with me elsewhere, but am not 100% convinced this will do what I need.

😞

@Anonymous,

 

Regarding your statement below:

 

"If the SQL queries against the datamart populated a single source, I could surely just connect my queries to this single source so that any changes are automatically pulled through to each of my .pbix files."

 

This sounds like a use case for a golden dataset (the one IT prepared with 4 tables), and multiple pbix files based on the golden dataset (with additional tables and measures, resulting in a composite model). Would you be able to draw up a simple diagram illustrating the structure and why the golden dataset/composite model approach won't work?

 

Also, you mentioned not having the full use of DAX. That's a limitation of DirectQuery, but there may be some workarounds we can explore.









Anonymous
Not applicable

Thanks @DataInsights 🙂

 

Here's a view of the current architecture:

[Image: current architecture diagram - CWillson_0-1647510104362.png]

The BI reports connect to the Ops Mart, but are too constrained for my purposes - e.g. I cannot connect to additional sources or write custom queries or DAX - which is why the Analysis reports have been set up.

 

The trouble is, the SQL queries against the mart all sit in these reports, meaning any changes have to be replicated to each individual report.

 

It sounds like the Golden Dataset approach is what might work for us, but where is this built - in PBI Desktop itself or elsewhere?

 

Just to check my understanding here: as new composite elements are built from the mart, this could be done using SQL queries in the Golden Dataset.  As long as these changes do not affect the existing structure and content of the tables, the new elements would simply appear in the downstream reports - is that right?  I'm a little concerned that any DAX or M queries in the downstream reports might 'break' if the Golden Dataset changes key components they depend on.  (Hopefully that makes sense in my noddy language!  lol)
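For instance, I'm picturing a change like this to one of the Golden Dataset's queries (the table and column names are invented for illustration):

let
    // Hypothetical extension of an existing golden-dataset query:
    // Region is a newly added column. Provided the existing columns
    // keep their names and types, downstream reports should simply
    // pick it up on the next refresh.
    Source = Sql.Database("CRR_DWHOUSE", "Ops_Mart",
        [Query="SELECT OrderID, OrderDate, Amount, Region FROM dbo.Orders"])
in
    Source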

 

Edit: Just found this resource, which seems to explain it really well. https://exceleratorbi.com.au/new-power-bi-reports-golden-dataset/

 

N.B. I've changed from DirectQuery to Import, which has opened up the full use of DAX and seems to be working fine.

 

Thanks again for your continued assistance with this. It's very much appreciated. 🙂

 

 

@Anonymous,

 

That link on golden datasets is the basis for how I structure my environments. We create a golden dataset in Power BI Desktop, and publish it to a golden workspace. The golden dataset contains centralized logic that serves as a single source of truth. Then, we create two types of reports based on the golden dataset: live connection (thin reports, with no additional data sources), and composite models (additional data sources).









Anonymous
Not applicable

Spot on!  Thanks for your support on this @DataInsights  🙂

Anonymous
Not applicable

Thanks @DataInsights .

 

Yes, this is what I need in terms of my ability to connect to additional data sources, and this looks to be working OK.

 

Where I'm struggling specifically is with the following:

 

At present, I have 4 tables in my desktop dataset.  The query for each uses the same syntax, and appears to be extracting data directly from the datamart, e.g. "= Sql.Database("CRR_DWHOUSE", "Ops_Mart", [Query="SELECT ....."

 

Logically, if additional tables are added to the dataset from the datamart using this method, each will require a new query of its own.  Also, if additional fields are added to an existing table (query), the code will need to be updated.
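For example, pulling a brand-new table through would presumably mean another query of the same shape (table and column names invented):

let
    // Hypothetical additional table from the mart: each new table
    // needs its own query like this, repeated in every standalone
    // copy of the dataset.
    Source = Sql.Database("CRR_DWHOUSE", "Ops_Mart",
        [Query="SELECT CaseID, OpenedDate, Status FROM dbo.Cases"])
in
    Source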

 

I'm aware that we can hang multiple reports off a single dataset within the PBI service/workspace, but this doesn't permit the full use of DAX, so files need to be downloaded to PBI Desktop.

 

I will need to create multiple reports using the same dataset, and it looks like the only way to do this in PBI Desktop will be to duplicate the dataset multiple times.

 

If/as/when new tables/fields are added to the dataset, it looks like I'm going to have to update the code in each query for each Desktop report.  Surely there must be a more efficient method than this?!

 

If the SQL queries against the datamart populated a single source, I could surely just connect my queries to this single source so that any changes are automatically pulled through to each of my .pbix files. 
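i.e. if the SELECT logic sat in, say, a single view on the mart (vw_OpsReporting is a made-up name), each .pbix could just point at that one object and pick up column changes on refresh:

let
    // Hypothetical: the SQL lives in a database view rather than in
    // each report, so every .pbix references the same object.
    Source = Sql.Database("CRR_DWHOUSE", "Ops_Mart"),
    OpsReporting = Source{[Schema="dbo", Item="vw_OpsReporting"]}[Data]
in
    OpsReporting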

 

Hope this makes sense!  I'm no data engineer!  lol

 

Thanks
