
amir_mm
Helper II

Incremental refresh and memory usage/DAX Studio

Hi,

 

I'm currently using incremental refresh for 2 large tables (to refresh the last 12 months of data and archive the past 5 years), hoping to avoid hitting the memory limit of our premium capacity. I understand that the initial refresh in PBI service creates the partitions, with subsequent refreshes being incremental.

Following these refreshes, I checked the total amount of memory used by the model after the last refresh using DAX Studio's VertiPaq Analyzer, and it was 1.35 GB.

Then I changed the refresh policy to refresh the last 6 months instead of the last 12, and after the next refresh the memory usage reported in DAX Studio remained at exactly 1.35 GB.

 

Wouldn't reducing the data volume typically result in decreased memory usage?

 

I've verified through the Analysis Services (XMLA) endpoint that the partitions are correctly created and that each refresh adheres to the defined policy.
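(For anyone who wants to run the same check: a minimal Tabular Editor script sketch that lists a table's partitions over the XMLA endpoint; "FactSales" is a placeholder for one of the large tables.)

// List the partitions of one table to confirm that the
// incremental-refresh policy created them as expected.
// "FactSales" is a placeholder - substitute your own table name.
var sb = new System.Text.StringBuilder();
foreach(var p in Model.Tables["FactSales"].Partitions)
{
    sb.AppendLine(p.Name + " | mode: " + p.Mode);
}
Info(sb.ToString());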


6 REPLIES
sergej_og
Super User

1.35 GB is not that much; no need to worry about hitting the memory limit of a Premium capacity.
If I'm right, you get 25 GB of RAM per semantic model on a P1 (F64) capacity.
But part of that RAM is needed for operations like queries and refreshes, so in practice you have less than 25 GB available.

I have run big semantic models (over 150 million rows) with incremental refresh and never hit the memory limit.

Regards

Thank you @sergej_og 

We have a PPU (embedded) license with A2 capacity, which has a 5 GB memory limit.

Previously, during refreshes, DAX Studio showed a memory usage of 1.95 GB, which consistently resulted in memory-limit errors. To resolve this, I moved some calculated columns to SQL Server and others to Power Query, and removed many unused columns. With these transformations in place alongside incremental refresh, memory usage during a refresh has decreased to 1.35 GB. Refreshes are generally successful, but we still see some failures throughout the day (we refresh the dataset every 30 minutes).
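(As a sketch for anyone doing the same cleanup: a small Tabular Editor script that lists any calculated columns still left in the model, as candidates to push upstream to SQL Server or Power Query.)

// List remaining calculated columns - candidates to move upstream.
var sb = new System.Text.StringBuilder();
foreach(var c in Model.AllColumns)
{
    if(c is CalculatedColumn)
        sb.AppendLine(c.Table.Name + "[" + c.Name + "]");
}
Info(sb.ToString());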

I don't know what additional steps I can take to further reduce memory usage.

 

A2, that's a pity.
So you don't have the XMLA capabilities.
Via Tabular Editor (using the XMLA endpoint) you could shrink your model size by disabling MDX.
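Roughly like this in Tabular Editor's Advanced Scripting window (the widely used IsAvailableInMDX snippet; try it on a copy of your model first):

// Disable the MDX attribute hierarchy for every column in the model.
// This can shrink the model size noticeably, but the columns become
// unavailable to Excel pivot tables - test before deploying broadly.
foreach(var c in Model.AllColumns)
{
    c.IsAvailableInMDX = false;
}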

I don't have the tool (DAX Studio) to hand at the moment, so I'm not sure whether you can tweak the size of the columns in your model a bit without XMLA capabilities.

Regards

Thanks again @sergej_og 

Yes, we do have the XMLA endpoint capabilities as well.

Disabling MDX seems great! I just disabled it for all columns in some large tables through Tabular Editor, and the memory usage for the last refresh dropped from 1.35 GB to 1040 MB, which is great. But I'll have to check the reports and visuals to make sure it doesn't have any negative effects on them.

 

And if you know any other methods to further reduce memory usage through the XMLA endpoint, I would greatly appreciate it if you could share them. Thanks!

Oh, that's cool.
Yes, please check your model for unexpected impacts.
Please also check filter interactions regarding the rendering speed of your visuals.

Be careful about disabling MDX across your whole model:
MDX affects how the model can be used from Excel, so hopefully you don't have that many Excel users.
Test for yourself what impact disabling MDX has in Excel.
This article is a very good one on that topic: https://data-mozart.com/hidden-little-gem-that-can-save-your-power-bi-life/
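If you do have Excel users, a more careful variant (my suggestion, not from the article) is to disable it only for hidden columns, which Excel users never see in a pivot table anyway:

// Safer variant: disable the MDX attribute hierarchy only for
// hidden columns (key columns etc.) so the Excel experience of
// visible columns is unchanged.
foreach(var c in Model.AllColumns)
{
    if(c.IsHidden)
        c.IsAvailableInMDX = false;
}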

Another approach to reduce the size a bit is disabling the Auto date/time table(s) (hidden tables).
They can have an impact on your model size.

[screenshot: the Auto date/time option in Power BI Desktop]
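The setting itself lives in Power BI Desktop's options, but over the XMLA endpoint you can check whether such hidden tables exist. A script sketch, assuming the LocalDateTable_/DateTableTemplate_ prefixes Desktop uses for them:

// List the hidden tables generated by the Auto date/time feature;
// Desktop creates one "LocalDateTable_<guid>" per date column plus
// a "DateTableTemplate_<guid>" template table.
var sb = new System.Text.StringBuilder();
foreach(var t in Model.Tables)
{
    if(t.Name.StartsWith("LocalDateTable_") || t.Name.StartsWith("DateTableTemplate_"))
        sb.AppendLine(t.Name);
}
Info(sb.Length > 0 ? sb.ToString() : "No Auto date/time tables found.");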


Remove unused columns (I think you did it already).

Regards

v-xiaoyan-msft
Community Support

Hi @amir_mm ,

 

Typically, reducing the data volume would result in decreased memory usage. However, in the context of Power BI's incremental refresh, during a full refresh operation the service keeps a snapshot of the model in memory until the refresh operation is complete. This can use as much as double the amount of memory required by the model alone; for a 1.35 GB model, that can mean roughly 2.7 GB at refresh time, which matters on a 5 GB A2 capacity once query memory is added on top.
With incremental refresh, the service dynamically partitions and separates data that needs to be refreshed frequently from data that can be refreshed less frequently. This means that even if you reduce the refresh policy to 6 months, the historical data (the past 5 years) is still kept in memory, so the total model size reported by VertiPaq Analyzer does not change.

Therefore, while it might seem intuitive that reducing the refresh policy would decrease memory usage, the actual memory usage depends on a variety of factors, including the size of the model, the partitioning of the data, and the data-loading process. It's possible that these factors are contributing to the consistent memory usage you're observing.
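If you want to double-check that the 6-month policy actually landed on the deployed model, the refresh-policy settings can be read per table via the XMLA endpoint; a Tabular Editor script sketch (property names as exposed by recent Tabular Editor versions, so treat this as a sketch):

// Print each table's incremental-refresh policy to confirm the
// change from 12 to 6 months was applied.
var sb = new System.Text.StringBuilder();
foreach(var t in Model.Tables)
{
    if(t.EnableRefreshPolicy)
        sb.AppendLine(t.Name
            + " | refresh: " + t.IncrementalPeriods + " " + t.IncrementalGranularity
            + " | archive: " + t.RollingWindowPeriods + " " + t.RollingWindowGranularity);
}
Info(sb.ToString());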

 

For more, you may refer to:

Solved: Re: Dataset incremental refresh end memory consump... - Microsoft Fabric Community

Incremental refresh for semantic models and real-time data in Power BI - Power BI | Microsoft Learn

 

Best regards.
Community Support Team_Caitlyn
