
Ricardo77
Helper II
Power BI (Desktop) for Operational Reports - an Architectural Question

Hello,

I'd like to ask a question.

I love Power BI, but I have a question that nobody has clearly answered for me to this day:

 

Everybody knows that we should follow dimensional modelling, and that's fine. But if I have to develop operational reports with 10K, 1M or 10M rows, and with 10 or 50 columns (not even considering long descriptions, DirectQuery and so on), why are Power BI and DAX so slow? Or, to put it more kindly, what should I use inside this ecosystem? Should I use Power BI paginated reports, or Excel Pivot Tables? Would they be much quicker?

Don't get me wrong, I think Microsoft has been doing a great job. But this seems very strange to me: there are lots of projects migrating from rival tools to Power BI, yet the experience for this type of report is not good, and the same applies in other situations related to the complexity of DAX. For example, if you have a report with 10 columns, and the last 8 columns use the measure in the 2nd column, do you have to perform that calculation 9 (8+1) times per row? If so, with 1M rows that's 9M calculations (I'm not completely sure about this, but you get the picture).
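To make the scenario concrete, here is a minimal sketch in DAX of the pattern I mean (table and measure names are hypothetical): one base measure referenced by several other measures, which are then shown side by side as columns of the same table visual.

// Hypothetical Sales table with Amount and Cost columns
Base Amount = SUM ( Sales[Amount] )

// Each of these reuses [Base Amount]; shown together in a 1M-row table visual,
// this is the "8+1 calculations per row" concern
Amount incl VAT = [Base Amount] * 1.23
Amount Net of Cost = [Base Amount] - SUM ( Sales[Cost] )
Margin Pct = DIVIDE ( [Base Amount] - SUM ( Sales[Cost] ), [Base Amount] )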

 

Be kind with the answers :). This is a provocative but constructive question about a technology I like, which nevertheless seems unfit for operational reports and for all the grey area between operational reports and dashboards...

 

Regards 

3 REPLIES
AllisonKennedy
Super User

Really good question, and I'm posting here mainly to hear how other people respond and to get discussion going. I do not have a solid answer or solution for it.

This is one reason I try not to use DAX for the data modelling side of things - it can be REALLY slow. I use Power Query M instead, and save DAX for measures. If you create DAX measures, they don't perform the calculation for 9M rows, but only as many times as needed based on the aggregation you have specified in the visual. I wouldn't expect Excel Pivot Tables to be any faster.
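For example (a minimal sketch, with hypothetical table and measure names): a DAX calculated column is evaluated for every row of the table at refresh time and stored in the model, while a measure is only evaluated per cell of the visual, at whatever grain the visual aggregates to.

// Calculated column on the Sales table: evaluated for every one of the 1M rows at refresh
Amount incl VAT = Sales[Amount] * 1.23

// Measure: evaluated per visual cell, e.g. once per product/month combination
Total Amount incl VAT = SUMX ( Sales, Sales[Amount] * 1.23 )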

Do you have a specific report in mind to test/sample? Have you played with the performance analyzer to see which components are taking the most time?
https://docs.microsoft.com/en-us/power-bi/create-reports/desktop-performance-analyzer

https://radacad.com/m-or-dax-that-is-the-question/?ref=818


Please @mention me in your reply if you want a response.

Copying DAX from this post? Click here for a hack to quickly replace it with your own table names

Has this post solved your problem? Please Accept as Solution so that others can find it quickly and to let the community know your problem has been solved.
If you found this post helpful, please give Kudos.

I work as a Microsoft trainer and consultant, specialising in Power BI and Power Query.
www.excelwithallison.com

Hello @AllisonKennedy ,

So you create all the calculated columns in Power Query / M.
At this moment I don't know/remember whether I have access to Power Query to create calculated columns when creating reports from Power BI (Desktop, Desktop for RS, or Service) connected to Analysis Services Tabular, and also to a SQL DW.

 

I'm not totally sure I understood this part: "Save DAX for measures. If you create DAX measures, they don't need to perform the calculation for 9M rows, but only as many times as needed based on the aggregation you have specified in the visual. I wouldn't think Excel Pivot tables would be any faster." Can I assume that measures used in visuals, and/or reused across table columns, perform better (some kind of reuse or cache)?

 

I'll look at the RADACAD link 🙂; the other part I already know. Thanks!

The fact that almost nobody answers, particularly from Microsoft, is in itself an answer. Oh Jesus 😞 ...

@Ricardo77 You are right, you don't always have access to Power Query or even DAX calculated columns depending on your data source - if you're using Direct Query you can't, but if you IMPORT then you can use Power Query.

Import on large datasets isn't always a good idea, but you could look at incremental refresh, dataflows, etc. 


