Hi!
We've got a customer on Premium EM2 with its 5 GB memory limit. We get dataset refresh errors due to the memory limit, yet the model itself is only 200 MB in size. So somehow 200 MB becomes more than 5 GB while the dataset is being processed/refreshed.
Could someone please explain:
1) How do we find the culprit (calculated columns and measures)?
2) How do we estimate memory requirements for Power BI models? Clearly the size of the "compressed" model isn't the only relevant metric.
Thanks,
Kaarel.
The quickest way to monitor memory usage is via the temporary Analysis Services instance that is started when you open a Power BI Desktop file. 1) Open Task Manager and the Power BI Desktop file with the problematic report(s). 2) Hit "Refresh all". 3) Watch for RAM usage spikes in Task Manager; the process that spikes is the right instance.
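If you want to watch that spike with timestamps rather than eyeballing Task Manager, you can poll the embedded engine's process from the command line. A minimal sketch, assuming you are on Windows and that the local Analysis Services engine runs as msmdsrv.exe (the usual process name for the Power BI Desktop data model host):

```python
# Sketch, not an official tool: polls msmdsrv.exe's "Mem Usage" column from
# the Windows `tasklist` command while you hit "Refresh all" in Power BI Desktop.
import re
import subprocess
import time

def parse_tasklist_mem_kb(line: str) -> int:
    """Extract the trailing 'Mem Usage' figure (e.g. '5,242,880 K') as kilobytes."""
    match = re.search(r"([\d,]+)\s*K\s*$", line.strip())
    if not match:
        raise ValueError(f"no memory figure found in: {line!r}")
    return int(match.group(1).replace(",", ""))

def poll_msmdsrv(interval_s: float = 1.0) -> None:
    """Print msmdsrv.exe memory once per interval (Windows only)."""
    while True:
        out = subprocess.run(
            ["tasklist", "/fi", "imagename eq msmdsrv.exe"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            if line.lower().startswith("msmdsrv.exe"):
                print(f"msmdsrv.exe: {parse_tasklist_mem_kb(line) / 1024:.0f} MB")
        time.sleep(interval_s)

if __name__ == "__main__":
    # Example tasklist output line, parsed without needing Windows:
    sample = "msmdsrv.exe   4712 Services    0  5,242,880 K"
    print(parse_tasklist_mem_kb(sample))
```

Running `poll_msmdsrv()` during a refresh gives you a per-second trace, which makes it easier to correlate the spike with whichever table is refreshing at that moment.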
Hi @kaarel
Use measures instead of calculated columns. Have a look at VertiPaq Analyzer, which runs in DAX Studio.
Reference:
Power-BI-in-memory-RAM-Overload
Power BI Desktop – Memory Usage
Performance Tip for Power BI; Enable Load Sucks Memory Up
Regards,
OK, thanks, but how do I specifically track down problematic calculated columns? Not once they are in their processed/refreshed/compressed state, but while the model is being processed/refreshed and memory consumption is growing. VertiPaq Analyzer doesn't help in that regard, does it?
As I said, the model is 200 MB, but while processing/refreshing it grows to over 5 GB.
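As a back-of-envelope illustration of why a small stored model can blow past the limit: during a full refresh the engine keeps the old committed copy resident while it builds the new copy, and the new data passes through uncompressed column buffers before VertiPaq compression is applied. The multipliers below are assumptions for illustration, not an official formula:

```python
# Rough rule-of-thumb sketch (assumed multipliers, not an official formula):
# peak refresh RAM ~ old compressed copy + new copy + uncompressed buffers,
# where the uncompressed buffers scale with the model's compression ratio.
def estimate_peak_refresh_mb(model_mb: float, compression_ratio: float = 10.0,
                             overhead: float = 1.2) -> float:
    """Crude upper-bound guess for peak RAM during a full refresh (MB)."""
    old_copy = model_mb                          # committed model stays resident
    new_copy = model_mb                          # new version built alongside it
    uncompressed = model_mb * compression_ratio  # pre-compression buffers
    return (old_copy + new_copy + uncompressed) * overhead

# A 200 MB model with a 20x compression ratio (plausible for high-cardinality
# columns, which calculated columns often create) lands above 5 GB:
print(estimate_peak_refresh_mb(200, compression_ratio=20))
```

The point of the sketch is the shape of the estimate, not the exact numbers: the compressed size on disk says little about peak refresh memory, because the dominant term is the uncompressed working set, which is exactly what high-cardinality calculated columns inflate.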