Hi everyone,
I'm thinking of including dataflows and datamarts in our reporting solutions.
My idea is to keep a 1:1 copy of the tables stored in Azure Databricks in dataflows, and then consume them with Power BI Datamarts, creating a dedicated datamart for each project.
There are multiple reasons why we're choosing to have a copy of the DB in dataflows instead of connecting directly to Azure DBK so please let's not focus on this point.
My question is: is there any best practice on how to set up dataflows? How many tables should a dataflow include? For now we have diverging opinions in the team, ranging from 1 dataflow per table (allowing maximum flexibility and different refresh times per table), to 1 dataflow per source system, to 1 dataflow for everything (around 40 tables, all <1.5mln rows).
What should we take into consideration to choose and prevent issues in the future, when more tables might be included?
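To make the trade-off concrete, here is a minimal Python sketch of the three grouping strategies mentioned above. The table names, source systems, and refresh cadences are hypothetical, invented only for illustration; the point is that the finer the grouping, the more independent the refresh schedules, while coarser grouping forces every table in the dataflow to refresh at the cadence of its most frequently updated member.

```python
from collections import defaultdict

# Hypothetical table metadata: (table_name, source_system, refresh_every_n_hours)
tables = [
    ("sales", "erp", 1),
    ("customers", "erp", 24),
    ("tickets", "crm", 4),
    ("agents", "crm", 24),
]

def one_dataflow_per_table(tables):
    # Max flexibility: every table gets its own dataflow and refresh schedule.
    return {name: [name] for name, _, _ in tables}

def one_dataflow_per_source(tables):
    # Middle ground: tables from the same source system share one dataflow,
    # so they also share one refresh schedule and one set of credentials.
    groups = defaultdict(list)
    for name, source, _ in tables:
        groups[source].append(name)
    return dict(groups)

def one_dataflow_for_all(tables):
    # Simplest to manage, but the whole dataflow must refresh at the
    # cadence of the most frequently updated table in it.
    return {"all": [name for name, _, _ in tables]}
```

For example, `one_dataflow_per_source(tables)` yields two dataflows (`erp` and `crm`), whereas `one_dataflow_for_all(tables)` yields a single dataflow whose refresh would need to run hourly because of the `sales` table.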
Thanks in advance for the help
Hi @alexpbi88 ,
I have also found a similar post; please refer to it to see if it helps you.
Solved: All tables to dataflows - Microsoft Power BI Community
Best Regards
Community Support Team _ Polly
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.