grggmrtn
Post Patron

HUGE query, M times out, DAX runs out of memory... I'm lost

Sorry, this is a VERY broad question - not even sure it's acceptable, I just don't know where to turn.

 

Our data is in an Analysis Services cube, and the cube is huge - there's no way to just import it all and work from there.

So there's a LOT of filtering, custom columns, and a couple of index columns that I create and merge to get some "Previous" columns for tracking changes... and it constantly times out. It used to run, but our data grows exponentially.
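For context, the index-and-merge pattern described above usually looks something like this in M (the table and column names here are made up for illustration):

```m
// Sketch of the self-merge "previous row" pattern - hypothetical
// table/column names. The merge step is a likely timeout culprit,
// because it forces the whole table to be scanned and buffered locally.
let
    Sorted   = Table.Sort(Source, {{"Date", Order.Ascending}}),
    // Index 0,1,2,... on the current rows
    Indexed  = Table.AddIndexColumn(Sorted, "Index", 0, 1),
    // Index 1,2,3,... so row n lines up with row n-1
    Shifted  = Table.AddIndexColumn(Sorted, "PrevIndex", 1, 1),
    Merged   = Table.NestedJoin(Indexed, {"Index"}, Shifted, {"PrevIndex"},
                                "Prev", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Merged, "Prev",
                                       {"Amount"}, {"Previous Amount"})
in
    Expanded
```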

I tried converting the flat file to a star schema, but running the same logic in DAX through measures and calculated columns uses every bit of my 32 GB of RAM and just dies.

I've tried reducing the number of columns but I need a minimum of 8 to get the report we need.

The server the data is on might be part of the problem, but I'm not the server admin here - just the data monkey. My latest timeout happens when I just try to sort 4 of the columns.

 

What I probably need is for someone to look at this with me, but unfortunately all the data is GDPR-protected. Still, maybe someone has SOME sort of strategy that could reduce server load and/or save my RAM?

4 REPLIES
parry2k
Super User

@grggmrtn performance is a very tricky topic - there are many moving parts, and without looking at the model and the measures it is very hard to say what needs to be done.

 

- a star schema is the way to go

- remove unwanted columns

- make sure column types are correct

- avoid merging and grouping in Power Query

- make sure the relationship cross-filter direction is set to single, not both

- after that it all comes down to the measures, which you can review with Performance Analyzer to see which ones take longer than others.
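As a small illustration of that last point, the "previous value" logic can often move out of Power Query merges and into a measure (hypothetical table, column, and measure names - a measure is evaluated at query time and doesn't materialize a column in memory):

```dax
-- Hedged sketch with assumed names: Sales[Amount] and a 'Date' table
-- marked as the model's date table. DATEADD shifts the filter context
-- back one month, so no extra column has to be stored.
Previous Month Amount =
CALCULATE (
    SUM ( Sales[Amount] ),
    DATEADD ( 'Date'[Date], -1, MONTH )
)
```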

 

So many factors..

 


Yeah, I can see I can't avoid the star schema here. I just have a couple of algorithms I need to figure out in DAX.

 

Thanks for the tips

lbendlin
Super User

On the M side, look at how the work is distributed between the cube and the Power Query engine (query folding). Check whether any folding happens at all, and whether the query runs faster or slower with it.
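To illustrate the idea (hypothetical SQL source and column names - against a live Analysis Services cube the details differ, but the principle is the same): keep foldable steps such as row filters and column removal early, and push non-foldable steps such as index columns to the end.

```m
let
    Source   = Sql.Database("myserver", "mydb"),   // hypothetical source
    Fact     = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Foldable steps first - these can be translated into the
    // source query, so the server does the heavy lifting
    Filtered = Table.SelectRows(Fact, each [Year] >= 2023),
    Slim     = Table.SelectColumns(Filtered, {"Year", "Customer", "Amount"}),
    // Adding an index breaks folding - everything after this line
    // runs locally in the Power Query engine
    Indexed  = Table.AddIndexColumn(Slim, "Index", 0, 1)
in
    Indexed
```

Right-clicking a step and checking whether "View Native Query" is enabled is a quick way to see where folding stops.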

 

On the DAX side there's a simple solution - download and use DAX Studio. Use it often, use it a lot.  Best thing since sliced bread.

Thanks @lbendlin - folding breaks WAY too soon, and no matter how I "rearrange" things, I was only able to keep folding going for two more steps.

I'll look into DAX Studio - thanks for the tip!
