simoncui
Frequent Visitor

What is the Dataset refresh Limit of Power BI Premium Gen2 P3?

There was an error even though it was only refreshing one table via the API; the table is about 6 GB in memory, checked with DAX.

It's a P3, so there should be 100 GB available for refresh. Even if the refresh needs 10 times the data size, that is enough, so why does it fail with a memory-limit error from time to time?

I noticed there is a dataset refresh limit in the utilization app. What is it, and how can I change it?

 

The M evaluation exceeded the memory limit. To address the issue consider optimizing the M expressions, reducing the concurrency of operations that are memory intensive or upgrading to increase the available memory.
Container exited unexpectedly with code 0x0000DEAD. PID: 23720.
Used features: Odbc.DataSource/Microsoft Amazon Redshift ODBC Driver/1.4.45.1000/Redshift/8.0.2.
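For context, a single-table refresh through the enhanced refresh REST API can also request lower parallelism, which reduces the peak mashup memory the refresh needs. A sketch (the workspace ID, dataset ID, and token below are placeholders, not values from this thread):

```python
import json
import urllib.request

# Placeholders -- substitute your own values.
WORKSPACE_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"
ACCESS_TOKEN = "<aad-access-token>"

def build_refresh_body(table: str) -> dict:
    """Enhanced-refresh request body that targets a single table."""
    return {
        "type": "full",
        "commitMode": "transactional",
        "objects": [{"table": table}],
        # Lower parallelism trades a longer refresh for a smaller
        # peak memory footprint in the mashup containers.
        "maxParallelism": 2,
    }

def start_refresh(table: str) -> None:
    """POST the refresh request; the service returns 202 Accepted."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
           f"/datasets/{DATASET_ID}/refreshes")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_refresh_body(table)).encode(),
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

Whether lower parallelism is enough depends on where the commit-size limit is hit, but it is a cheap first experiment before resizing the capacity.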

[screenshot: simoncui_0-1648953147304.png]

 

Thanks,

Simon

 

1 ACCEPTED SOLUTION
v-kkf-msft
Community Support

Hi @simoncui ,

 

Working set is the physical memory (RAM) used by the mashup processes, while commit size is the amount of space reserved in the paging file for the process. In modern operating systems such as Windows, processes can work around limits on physical memory through paging, where memory referenced by the process is temporarily backed by disk instead of RAM. The commit size/limit governs the amount of virtual memory in a process that can be backed by the paging file. As such, a mashup process can encounter out-of-memory issues (typically denoted by the presence of the 0x0000DEAD error code) when the commit size limit is exceeded, even though there is significant unused physical memory on the capacity.

 

[screenshot: vkkfmsft_0-1649212733047.png]

 

You can use Kusto to check if these refreshes exceed the commit size limit for that time period.

 

If the dataset consistently fails to refresh even when no other dataset refreshes are consuming resources, please try upgrading the capacity or optimizing the model.

 


If the problem is still not resolved, please provide detailed error information and the result you expect. Looking forward to your reply.
Best Regards,
Winniz
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.


8 REPLIES

@v-kkf-msft That's new to me. Is there an official documentation URL for mashup?

 

Thanks,

Simon

Hi @simoncui ,

 

Unfortunately, this is an internal Microsoft document that is not available to the public, so users cannot view it. Also, since I don't have access to Kusto, I cannot help you determine whether your dataset exceeds the commit size limit.

 

If you still have not resolved this issue, please submit a ticket and then the team will contact you and provide service.

How to create a support ticket in Power BI 

 

 

Best Regards,
Winniz

Hi @v-kkf-msft , Thank you anyway.

simoncui
Frequent Visitor

Hi @GilbertQ ,

It was refreshing a single table's partitions in parallel. Do you know what the memory limit is for such an operation, and where to check usage details or change the limit?

 

Thanks,

Simon
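If the parallel partition refresh is what exhausts the mashup memory, one workaround (a sketch, not an officially documented fix; the IDs are placeholders) is to list the partitions in a single enhanced-refresh request and force them to be processed one at a time with `maxParallelism`:

```python
def sequential_refresh_body(table: str, partitions: list[str]) -> dict:
    """Enhanced-refresh body covering several partitions of one table,
    forced to process them sequentially to cap peak memory."""
    return {
        "type": "full",
        "objects": [{"table": table, "partition": p} for p in partitions],
        "maxParallelism": 1,  # serialize partition processing
    }

# The resulting body is POSTed to
# https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>/datasets/<dataset-id>/refreshes
# with an Authorization: Bearer <token> header, as with any enhanced refresh.
```

This keeps the full-refresh semantics but removes the concurrent mashup evaluations that drive commit size up.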

Hi @simoncui 

 

Have a look at these settings, which you might need to tweak:

 

How to configure workloads in Power BI Premium - Power BI | Microsoft Docs

 

I would also suggest trying incremental refresh, which could solve this issue.
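Incremental refresh is normally configured in Power BI Desktop, but on Premium the underlying refresh policy lives in the table's TMSL definition and can be managed over the XMLA endpoint. A sketch of such a policy as a JSON fragment (values are illustrative only, and the source expression is left as a placeholder):

```python
# Illustrative TMSL refreshPolicy fragment for a table.
# Property names follow the TMSL refreshPolicy object; the values
# here are examples, not a recommendation for this dataset.
refresh_policy = {
    "policyType": "basic",
    "rollingWindowGranularity": "year",
    "rollingWindowPeriods": 5,        # keep 5 years of data in the model
    "incrementalGranularity": "day",
    "incrementalPeriods": 10,         # refresh only the last 10 days
    "sourceExpression": "...",        # M query filtered by RangeStart/RangeEnd
}
```

With a policy like this, routine refreshes only reprocess the recent partitions, so only the occasional full refresh ever approaches the memory limit.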






The workloads setting doesn't help.

Yes, incremental refresh is working, but we need to do full refresh sometimes.

 

Thanks,

Simon

GilbertQ
Super User
Super User

Hi @simoncui 

 

From your error, this is not about the dataset's own memory; rather, the Power Query refresh limit is being hit while the data is loading. It looks like you are using Redshift, so make sure to use the batching process for Redshift rather than streaming the entire dataset at once.


If you stream it, it will try to load the entire query result at once, which could be very big.
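The batching idea in plain Python terms: instead of materializing the whole result set, pull it in fixed-size chunks via the DB-API `fetchmany` pattern that ODBC cursors (including the Redshift driver's) expose. Illustrated here with the stdlib sqlite3 module so it runs standalone:

```python
import sqlite3

# Stand-in data source; the real case would be a Redshift ODBC connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(i, i * 1.5) for i in range(10_000)])

def fetch_in_batches(cursor, batch_size=1_000):
    """Yield rows batch by batch instead of loading them all at once,
    so peak memory stays proportional to batch_size, not result size."""
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:
            break
        yield from rows

cur = conn.execute("SELECT id, amount FROM sales")
total = sum(amount for _id, amount in fetch_in_batches(cur))
```

This is the same trade-off the mashup engine makes: bounded memory per batch in exchange for more round trips.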





