Currently I'm struggling with the problem of repeating calculated tables in multiple reports.
I have used a few datasets from a SQL Server and Excel files to calculate a final dataset, which is used for the visualisations in different reports.
So dataset A + dataset B + dataset C are used to calculate the final dataset D and the final dataset D is necessary to make some visualisations in different reports.
The final dataset D is calculated with DAX / calculated tables.
However, I have to copy those dataset scripts and the calculated table, including the DAX measures, into every report. This is a very time-consuming process and carries a high risk of errors.
Does somebody know an alternative that calculates the final dataset D only once and makes it accessible from multiple reports?
Thank you in advance!
Thank you for your responses.
I will give additional explanation of the problem with an example:
Note: all tables are imported in PBI desktop and afterwards the report is published to the PBI webservice.
The purpose of this report is to define which bonus you can expect from a vendor at the end of a year.
Table 1 (based on a SQL script):
Table 2 (Excel file):
| Vendor | Zone 1 | Zone 2 | Zone 3 | Result zone 1 | Result zone 2 | Result zone 3 |
Calculatedtable1 (a calculated table created after the import of Table 1 and Table 2, i.e. after the query editor):
| Vendor A | € 32.500 | € 2.438 |
| Vendor B | € 10.000 | € - |
This table is created based on the following script:
The bonus column is a calculated column based on the following script:
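(The actual scripts are not shown above. For illustration only, a calculated table and bonus column along these lines might look roughly as follows — every table, column, and threshold name here is an assumption, not the poster's real script.)

```dax
-- Hypothetical sketch: one row per vendor with the summed turnover from Table 1
-- ('Amount' is an assumed column name)
Calculatedtable1 =
SUMMARIZE (
    'Table 1',
    'Table 1'[Vendor],
    "Turnover", SUM ( 'Table 1'[Amount] )
)

-- Hypothetical bonus column: compare the turnover with the zone thresholds in Table 2
-- and return the result for the highest zone reached
Bonus =
VAR Turnover = Calculatedtable1[Turnover]
VAR Vendor   = Calculatedtable1[Vendor]
RETURN
    SWITCH (
        TRUE (),
        Turnover >= LOOKUPVALUE ( 'Table 2'[Zone 3], 'Table 2'[Vendor], Vendor ),
            LOOKUPVALUE ( 'Table 2'[Result zone 3], 'Table 2'[Vendor], Vendor ),
        Turnover >= LOOKUPVALUE ( 'Table 2'[Zone 2], 'Table 2'[Vendor], Vendor ),
            LOOKUPVALUE ( 'Table 2'[Result zone 2], 'Table 2'[Vendor], Vendor ),
        Turnover >= LOOKUPVALUE ( 'Table 2'[Zone 1], 'Table 2'[Vendor], Vendor ),
            LOOKUPVALUE ( 'Table 2'[Result zone 1], 'Table 2'[Vendor], Vendor ),
        0
    )
```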
Do these datasets have a similar data structure? If that is the case, you can parameterize your connection string and save the file as a template.
Then you can change the connection string in the template to generate the different reports with similar DAX formulas.
Let's make sure the terminology is clear here.
A .pbix file is a collection of queries that point to data sources.
Together with the data model these queries combine into a dataset that you load into the workspace.
What you can do at this moment is share the dataset, both with reports inside the workspace/app and with reports in other workspaces (assuming the workspaces are "new", v2).
This gives you control over data acquisition, ETL, and the data model, and gives the subscribers to the shared dataset the flexibility to create their own visualizations, all off the same common dataset.
Next step is to promote and certify the dataset so more developers know about it or are forced to use it.
Next step is to use an actual data modeling tool (like CDM) to ensure cross-company consistency.
Yes, I know, wishful thinking. One can but dream.