Bart_Poelert
Frequent Visitor

Fabric F64 gives an out-of-memory error while refreshing a live dataset in the service

Hello, 

 

We are transferring from Power BI Premium Per User (PPU) to Fabric F64.

 

But my main dataset, a tabular model deployed from Tabular Editor, is not refreshing; it gives an out-of-memory error.

I know the model size limit on PPU is 100 GB and that of F64 is 25 GB.

But the out-of-memory error occurs at around 9 GB.

I have enabled the large model option on the workspace and on the dataset.

Why am I not getting the error at 25 GB database size, but at 9 GB?

When I deploy the model fresh from Tabular Editor, the initial refresh works fine.

 

Fabric capacity size:

[screenshot: Bart_Poelert_2-1713806343850.png]

 

 

Refresh history (the initial load works fine):

[screenshot: Bart_Poelert_0-1713806127812.png]

 

The error I'm getting:

[screenshot: Bart_Poelert_3-1713806387149.png]

 

Can anyone please advise on why the memory limit is not 25 GB, and how to adjust this?

 

Regards,

 

Bart Poelert

 

 

1 ACCEPTED SOLUTION
v-jianpeng-msft
Community Support

Hi, @Bart_Poelert 

Thanks a lot for sharing your solution. Here's an explanation of why your refresh exceeds the 25 GB memory limit:

25 GB = 25,600 MB. Your semantic model had already consumed 16 GB (16,571 MB) when the command started, and the refresh command itself required 9,031 MB. Total consumption: 16,571 + 9,031 = 25,602 MB, which exceeds the limit, so an error is reported.

The effective memory limit for a command is calculated from the amount of memory allowed for the semantic model by the capacity (25 GB, 50 GB, or 100 GB) and the amount of memory already consumed by the semantic model when the command starts executing. For example, a 12 GB semantic model on a P1 capacity leaves an effective memory limit of 13 GB for new commands. The sketch below illustrates the arithmetic.
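Here is a minimal sketch of that arithmetic in Python (illustrative only; the function and variable names are hypothetical, and the numbers are taken from your error above):

def effective_command_limit_mb(capacity_limit_mb: int, model_footprint_mb: int) -> int:
    # Memory left for a new command once the model's current footprint is subtracted.
    return capacity_limit_mb - model_footprint_mb

F64_LIMIT_MB = 25 * 1024             # 25,600 MB model memory limit on an F64/P1 capacity
model_footprint_mb = 16_571          # MB already consumed by the semantic model
refresh_needs_mb = 9_031             # MB the refresh command tried to allocate

headroom_mb = effective_command_limit_mb(F64_LIMIT_MB, model_footprint_mb)
print(headroom_mb)                     # 9029
print(refresh_needs_mb > headroom_mb)  # True -> out of memory, even though the refresh is "only" ~9 GB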

You can use the following XMLA property to adjust the effective memory limit for a request:

<PropertyList>  
   ...  
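   <!-- the limit value is specified in kilobytes (see the docs linked below) -->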
   <DbpropMsmdRequestMemoryLimit>...</DbpropMsmdRequestMemoryLimit>    
   ...  
</PropertyList>  

You can follow the links below to learn more:

Troubleshoot XMLA endpoint connectivity in Power BI - Power BI | Microsoft Learn


DbpropMsmdRequestMemoryLimit Element (XMLA) | Microsoft Learn


 

 

 

How to Get Your Question Answered Quickly

Best Regards

Jianpeng Li

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


3 REPLIES

Thank you, Jianpeng Li,

 

That actually makes sense. For now I will clear the dataset before the refresh, and I am going to tune the dataset to reduce its size; there is a lot of clutter in the tables.

 

Regards,

 

Bart

Bart_Poelert
Frequent Visitor

A small update on how I resolved this issue for now... but I would still like to know why an exception is thrown at memory usage below 25 GB.

 

For now I'm using an ADF pipeline to clear the dataset first with the Power BI API (https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/refresh-dataset)

with the body: {"type":"clearValues"}

This clears the dataset; after that, I refresh the dataset with the same call and an empty body. A sketch of the two calls follows below.
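For reference, here is a minimal sketch of those two REST calls in Python using the requests library. It assumes you already have an Azure AD access token with dataset write permissions; the workspace and dataset IDs below are hypothetical placeholders.

import requests

# The refresh endpoint from the linked docs:
# POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
ACCESS_TOKEN = "<aad-access-token>"   # hypothetical; obtain via MSAL or a service principal
GROUP_ID = "<workspace-id>"           # hypothetical placeholder
DATASET_ID = "<dataset-id>"           # hypothetical placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1) Clear the data from the model so the next refresh starts from an empty footprint.
resp = requests.post(url, headers=headers, json={"type": "clearValues"})
resp.raise_for_status()  # the service returns 202 Accepted when the request is queued

# 2) Trigger a normal refresh; an empty body gives the default full refresh.
resp = requests.post(url, headers=headers, json={})
resp.raise_for_status()

Note that the clearValues operation runs asynchronously, so in practice you would poll the refresh history (a GET on the same /refreshes endpoint) and wait for it to complete before issuing the second call, which is what the ADF pipeline's sequencing handles.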
