Hi folks,
Power BI novice at work here! I am looking for some feedback on how to design a data model and transform data to meet my requirements.
Background
Every year we produce data that is a snapshot of our industry: there are Accounts (companies), the roles they play (Administrator, Custodian, Legal Adviser, etc.) and the Groups they service. Here is a simplified data model:
So, this means that from year-to-year the data can look like:
Year 1
Year 2
…and next year:
Every year we will ingest this full dataset into the model so we can generate reports such as 'Top Administrators' or 'Largest Group' for that year. I would also like to see the evolution of a particular Account or Group over time: business won through new Groups, or business lost.
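To make the "won/lost business" idea concrete, here is a minimal sketch of comparing two yearly snapshots. The account and group names are invented for illustration; in the real model this comparison would be done with DAX measures over the snapshot fact table rather than in Python.

```python
# Each yearly snapshot reduces to a set of (Account, Group) service pairs.
# Comparing the sets between two reporting years shows won and lost Groups.
year1 = {("Acme", "G1"), ("Acme", "G2")}  # pairs serviced in Year 1 (hypothetical)
year2 = {("Acme", "G2"), ("Acme", "G3")}  # pairs serviced in Year 2 (hypothetical)

won = year2 - year1   # Groups gained since last year
lost = year1 - year2  # Groups no longer serviced
```

In DAX the same idea would typically be expressed with EXCEPT over the distinct Account/Group pairs filtered to each reporting date.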
Solution?
I have followed the 'Star Schema' pattern and consolidated imported tables to reduce the number of Fact tables (not shown):
Questions
1. Year on year, I will be appending a new dataset to the Fact tables. Is that a good approach? Any tips here?
2. Assuming yes to the above, there will be duplication of the Account, Service Provider and Group records. In the Transform, should I create a composite key for each record each year, e.g. based on the AccountID, ServiceProviderID and GroupID combined with the Reporting Date, and use that in the Fact table relationships? Does that make sense, or is there a better way?
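The composite key described in question 2 can be sketched as below. The column names (AccountID, ServiceProviderID, GroupID, ReportingDate) come from the post; the delimiter and the Python representation are illustrative only, since in practice this step would be a Power Query merged-column transform.

```python
# Sketch: build a composite key per snapshot row so the same
# Account/Provider/Group combination stays distinct per reporting year.
def composite_key(row):
    # '|' is an arbitrary delimiter; pick one that cannot appear in the IDs.
    return f"{row['AccountID']}|{row['ServiceProviderID']}|{row['GroupID']}|{row['ReportingDate']}"

facts = [
    {"AccountID": "A1", "ServiceProviderID": "SP9", "GroupID": "G3", "ReportingDate": "2023-12-31"},
    {"AccountID": "A1", "ServiceProviderID": "SP9", "GroupID": "G3", "ReportingDate": "2024-12-31"},
]
keys = [composite_key(r) for r in facts]
```

The two rows describe the same relationship in different years, so they get distinct keys.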
Your feedback is very much appreciated.
Many thanks
Hi @bexbissell ,
For the optimization of the model, I can also provide some suggestions.
If you are using DirectQuery connection mode, you can optimize your data model using the following tips:
For more model optimization guidelines, you can refer to the following documentation: Optimization guide for Power BI - Power BI | Microsoft Docs
Looking forward to your feedback.
Best Regards,
Henry
If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.
@bexbissell, the model looks good.
A numeric key for composite keys is fine whenever needed, but if you generate it in Power BI it will add a lot of cost at load time.
Thanks for the feedback on the model schema @amitchandak - I've learnt a lot about model schemas, good modelling practice and model relationships, and now I'm just trying to put it into practice.
The dataset is relatively small, so generating the composite keys on load/transform does not take long (around 20 seconds).