Hello,
I'd like to ask a question.
I love Power BI, but there is one thing nobody has clearly explained to me so far:
Everybody knows we should follow dimensional modelling, and that's fine. But if I have to develop operational reports with 10K, 1M, or 10M rows and 10 or 50 columns (not even considering long text descriptions, DirectQuery, and so on), I'd like to ask: why are Power BI and DAX so slow? Or, to put it more kindly, what should I use within this ecosystem? Should I use Power BI paginated reports, or Excel Pivot Tables, and would they be much quicker?
Don't get me wrong, I think Microsoft has been doing a great job. But this seems very strange to me: there are lots of projects migrating from rival products to Power BI, yet the experience for this type of report is not good, and the same applies to other situations related to the complexity of DAX. For example, if you have a report with 10 columns, and the last 8 columns reference the measure in the 2nd column, does the engine perform the calculation 9 (8 + 1) times per row? If so, with 1M rows that would be 9M calculations (I'm not completely sure about this, but you get the picture).
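To make the repeated-evaluation concern concrete, here is a minimal DAX sketch (table and measure names are hypothetical, not from this thread). Whether a referenced measure is re-evaluated or served from cache in each cell depends on the engine, but within a single measure a variable declared with VAR is evaluated only once and can then be reused explicitly:

```dax
-- Hypothetical base measure referenced by several other measures
Base Amount = SUM ( Sales[Amount] )

-- Naive dependent measures: each one references [Base Amount] again
Amount x2 = [Base Amount] * 2
Amount x3 = [Base Amount] * 3

-- Reusing the value via a variable inside one measure:
-- the VAR is evaluated once per cell, then reused in RETURN
Amounts Combined =
VAR BaseAmt = [Base Amount]
RETURN
    BaseAmt * 2 + BaseAmt * 3
```

This doesn't remove the per-visual-cell evaluation model, but it avoids paying for the same sub-expression several times within one formula.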
Please be kind with the answers :), this is a provocative but constructive question about a technology I like, which nevertheless seems unfit for operational reports and for the whole grey area between operational reports and dashboards...
Regards
Has this post solved your problem? Please mark it as a solution so that others can find it quickly and to let the community know your problem has been solved.
If you found this post helpful, please give Kudos.
I work as a trainer and consultant for Microsoft 365, specialising in Power BI and Power Query.
Hello @AllisonKennedy,
So you create all the calculated columns in Power Query / M.
At the moment I don't know/remember whether I have access to Power Query to create calculated columns when building reports from Power BI (Desktop, Desktop for Report Server, or the Service) connected to an Analysis Services Tabular model, or to a SQL data warehouse.
I'm not totally sure I understood this part: "save DAX for measures. If you create DAX measures, they don't need to perform the calculation for 9M rows, but only as many times as needed based on the aggregation you have specified in the visual. I wouldn't think Excel Pivot tables would be any faster." Can I assume that measures used in visuals and/or referenced repeatedly across table columns perform better (some kind of reuse or caching)?
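The distinction being quoted can be sketched in DAX (again with hypothetical table and column names). A calculated column is computed row by row at refresh time and stored for every row; a measure is computed at query time, once per cell of the visual, over whatever aggregation the visual requests:

```dax
-- Calculated column: evaluated for every row of Sales at refresh
-- time and materialised in the model
Sales[Margin] = Sales[Amount] - Sales[Cost]

-- Measure: evaluated at query time, once per visual cell
-- (e.g. once per product shown, not once per source row)
Total Margin = SUM ( Sales[Amount] ) - SUM ( Sales[Cost] )
```

So a visual summarising 1M rows into, say, 100 product rows evaluates the measure roughly 100 times, not 1M times.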
I'll check the RADACAD link 🙂 the other part I already know. Thanks!
The fact that almost nobody answers, in particular from Microsoft, is in itself an answer. Oh Jesus 😞 ...
@Ricardo77 You are right, you don't always have access to Power Query or even to DAX calculated columns, depending on your data source: if you're using DirectQuery you can't, but if you use Import mode then you can use Power Query.
Import mode on large datasets isn't always a good idea, but you could look at incremental refresh, dataflows, etc.