sirahcy
Frequent Visitor

PBI Dataset Not Refreshed Due to Reaching Memory Limit

Hi MS Support & Community,

For a number of large datasets in our primary PBI workspaces, we have seen errors over the last couple of days related to insufficient memory in Premium workspaces. How can we check memory usage against the limit on the capacity? The Capacity Metrics app does not appear to monitor memory usage; only CU% is monitored, and we're well below the capacity limit.

 



8 REPLIES
edhans
Super User

I suspect it is the datasets approaching or exceeding that limit that are causing the issue. It may be time to move to a P2 capacity, or to remove some history from the models.



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

We see ad hoc memory failures on a P2 capacity as well. To check against the max memory per query (6 GB), do we look at the IO on the compute/cluster that processes the PBI refresh, or something else?


 

Jannematz
Frequent Visitor

We tend to scale up the Fabric capacity before a refresh, giving us enough memory for multiple dataset refreshes, and scale back down afterwards. I don't understand why the memory limits are so low in the first place, though.
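The scale-up-before-refresh approach described above can be sketched roughly like this. This is illustrative Python only: `pick_sku` is a hypothetical helper, the per-SKU memory figures are Microsoft's published Premium limits (verify against current docs), and the actual resize would go through the Azure/Fabric management APIs, which are not shown here.

```python
# Illustrative sketch of sizing the capacity before a refresh window.
# SKU memory limits (GB) per Microsoft's published Premium capacity
# table; verify against current documentation before relying on them.
SKUS = [("P1", 25), ("P2", 50), ("P3", 100)]  # (name, max memory in GB)

def pick_sku(expected_peak_gb):
    """Return the smallest SKU whose memory limit covers the expected
    refresh peak (roughly 2x the combined size of datasets refreshing)."""
    for name, mem_gb in SKUS:
        if mem_gb >= expected_peak_gb:
            return name
    raise ValueError(f"No SKU can hold a {expected_peak_gb} GB peak")

# e.g. a 9 GB dataset refreshing fully (~18 GB peak) fits on a P1,
# but two such datasets refreshing together (~36 GB) need a P2.
print(pick_sku(18))  # -> P1
print(pick_sku(36))  # -> P2
```

The point of the sketch: decide the target SKU from the expected peak, resize, run the refreshes, then resize back down to the baseline SKU.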

Thanks. We'll discuss with our admins about enabling this approach.

edhans
Super User

How large is the dataset, and how large is the capacity? You typically need 2x the size of the dataset to refresh because, temporarily, the full dataset is held in memory twice: one copy for the current dataset serving queries, and one for the refreshed data that is then swapped in.
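The 2x rule above can be turned into a quick back-of-the-envelope headroom check (an illustrative sketch, not an official formula; `refresh_headroom_gb` is a hypothetical helper, and the 25/50/100 GB figures are the published P1/P2/P3 memory caps):

```python
# Published per-SKU capacity memory limits in GB (verify against docs).
SKU_MEMORY_GB = {"P1": 25, "P2": 50, "P3": 100}

def refresh_headroom_gb(dataset_gb, sku, other_loaded_gb=0.0):
    """Memory left on the capacity while `dataset_gb` does a full refresh
    (~2x its size resident) alongside `other_loaded_gb` of other models.
    A negative result means the refresh will likely fail with an
    out-of-memory error."""
    return SKU_MEMORY_GB[sku] - (2 * dataset_gb + other_loaded_gb)

print(refresh_headroom_gb(9, "P1"))        # -> 7.0 (9 GB dataset alone)
print(refresh_headroom_gb(12, "P1", 5.0))  # -> -4.0 (12 GB + 5 GB loaded)
```

This also shows why a refresh that fits on an idle capacity can still fail when other large models happen to be loaded at the same time.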




Thanks. Our dataset size is 9 GB. Looking at all the datasets refreshing around the same time, the total is about 21 GB (dataset size x 2), which is still below the P1 capacity memory limit of 25 GB. Or is this cutting it too close?

Ensure you have the large dataset storage format turned on for the dataset @sirahcy, and see this article: Large datasets in Power BI Premium - Power BI | Microsoft Learn. It specifically mentions that a 12 GB dataset in a P1 capacity might cause issues, but a 9 GB dataset may as well, depending on what else is loaded at the time.

Are you using incremental refresh? The entire dataset does not need to be reloaded during refresh if you are using incremental refresh on the large fact table(s); only the recent partitions inside the refresh window are rebuilt.
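A rough way to see the memory saving from incremental refresh (a back-of-the-envelope sketch under the assumptions discussed in this thread, not an exact formula; both helper functions are hypothetical):

```python
def full_refresh_peak_gb(dataset_gb):
    """Full refresh holds roughly two complete copies of the model:
    the current one serving queries and the one being rebuilt."""
    return 2 * dataset_gb

def incremental_refresh_peak_gb(dataset_gb, refreshed_partitions_gb):
    """With incremental refresh, one full copy stays resident and only
    the partitions inside the refresh window are rebuilt alongside it."""
    return dataset_gb + refreshed_partitions_gb

# For a 9 GB model where 2 GB of partitions fall in the refresh window:
print(full_refresh_peak_gb(9))            # -> 18
print(incremental_refresh_peak_gb(9, 2))  # -> 11
```

In other words, the saving depends on how small the refreshed partitions are relative to the whole model; refreshing ~60% of a large fact table still keeps the peak well below the 2x of a full refresh, but not dramatically so.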




Thank you @edhans. Yes, we've enabled the large dataset storage format and incremental refresh on the large fact tables. Some of the datasets actually approached or exceeded 12 GB, so we will definitely look into resizing the capacity. We're using incremental refresh (refreshing roughly 60% of a fact table), but we're not sure how much memory that saves compared with the 2x dataset size needed for a full refresh.
