h4tt3n
Resolver II

Extensive memory consumption causing problems and errors

Hello everyone,

 

I am running Power BI on an 8 GB Core i7 ThinkPad laptop, and I am experiencing memory-related problems even with smaller tables. One report with about 40k rows suddenly won't refresh and throws the following error for every table in the report, some of which consist of just a single row:

 

Table
Failed to save modifications to the server. Error returned: 'There's not enough memory to complete this operation. Please try again later when there may be more memory available.'
 
According to Task Manager, Power BI consumes every bit of available RAM when I hit the refresh button, and then throws the error when it runs out of memory.
 
I have been reading up on tips & tricks and best practices regarding table setup and data management, but I can't even apply the changes that might solve the problem, because the query editor never finishes applying them, so I am sort of caught in a trap.
 
I'm fairly new to Power BI and could really use some advice and help. Installing more RAM is not an option.
 
Cheers, Mike

5 REPLIES
v-frfei-msft
Community Support

Hi @h4tt3n ,

 

Please refer to this video.

https://www.sqlbi.com/tv/my-power-bi-report-is-slow-what-should-i-do/

 

You can use DAX Studio or the Performance Analyzer to optimize your model.

 

 

Community Support Team _ Frank
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
parry2k
Super User

@h4tt3n so what you are saying is that it is just a simple table with 40k rows and you are running into the memory limit. Do you have any DAX expressions or visuals in the report?

 

Can you try a new Power BI report, just load the table without any DAX etc., and see what happens?




Yes, that is exactly what is happening. 

 

There are some DAX measures and calculated columns, and there are some visuals with slicers. I'll try copying the entire report and deleting at least the visuals, since everything table-wise will still work without them. My problem is that I can't even apply the memory-saving changes in the query editor, because there isn't enough memory for the save & upload - it throws an error and aborts. So I'm stuck with the queries I have, for now.

 

Cheers, Mike

@h4tt3n I think it is not the query, it is the DAX that is causing the issue. Badly written DAX or a poor data model can have a big impact on performance and slow everything down.




@parry2k Yes, you are exactly right! I found the culprit through simple trial and error, by deleting memory-consuming parts of the report until the refresh started working again. It turned out to be a single calculated column whose only purpose is to subtract the previous row's value from the current one (sorted by timestamp):

 

DeltaIntakeTempCC =
Data[IntakeTemp]
    - LOOKUPVALUE(
        Data[IntakeTemp],
        Data[Timestamp],
            CALCULATE(
                MIN( Data[Timestamp] ),
                FILTER(
                    Data,
                    Data[Timestamp] > EARLIER( Data[Timestamp] )
                        && Data[UnitId] = EARLIER( Data[UnitId] )
                )
            ),
        Data[UnitId], Data[UnitId]
    )
 
This single DAX expression consumes 16 GB of RAM when evaluated over just 42k rows of data. Data is the table name, and IntakeTemp, Timestamp, and UnitId are integer columns.
 
So, now I'll look for a way to make this less memory intensive.
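 
One idea is to rewrite the column with variables, so the neighbouring row is located once per row instead of through the nested LOOKUPVALUE/EARLIER calls. Below is a rough, untested sketch using the same table and column names as above; I haven't verified that it actually uses less memory on this model, and it assumes each (UnitId, Timestamp) pair is unique (which the LOOKUPVALUE version also requires):
 
DeltaIntakeTempCC =
// Sketch: variable-based version of the column above (untested)
VAR CurrentTimestamp = Data[Timestamp]
VAR CurrentUnit = Data[UnitId]
VAR CurrentTemp = Data[IntakeTemp]
// Earliest timestamp after the current row, for the same unit
VAR NextTimestamp =
    CALCULATE(
        MIN( Data[Timestamp] ),
        FILTER(
            ALL( Data ),
            Data[UnitId] = CurrentUnit
                && Data[Timestamp] > CurrentTimestamp
        )
    )
// IntakeTemp of that row; assumes (UnitId, Timestamp) identifies a single row
VAR NextTemp =
    CALCULATE(
        MAX( Data[IntakeTemp] ),
        FILTER(
            ALL( Data ),
            Data[UnitId] = CurrentUnit
                && Data[Timestamp] = NextTimestamp
        )
    )
RETURN
    CurrentTemp - NextTemp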
 
Cheers, Mike
