fhdalsafi
Frequent Visitor

Incremental refresh problem

Hello, 

 

I am having a problem with the refresh; the error is shown in the picture below.

[screenshot: fhdalsafi_0-1642935781501.png]

 

The error says that the database size before execution is 14.2 GB.

 

When I opened DAX Studio, it showed that the dataset size is 12.6 GB (a couple of Excel sheets are also connected, but they don't account for much of the size).

[screenshot: fhdalsafi_1-1642935929926.png]

 

I have incremental refresh set up to refresh only a small partition of the data.

My question is: why is the database size before execution the same as the full dataset size? It's as if the incremental refresh is not refreshing just a partition of the data but is instead processing the whole dataset.

 

Appreciate any assistance.

 

Thanks in advance.

 

1 ACCEPTED SOLUTION
v-polly-msft
Community Support

Hi @fhdalsafi ,

First, please make sure the incremental refresh can actually work. Incremental refresh is designed for data sources that support query folding; most relational data sources (SQL Server, Azure SQL, and similar) support query folding, while flat files generally do not. Could you please tell me what data source you are using? Please refer to the following document to determine if you have configured it correctly.

Configure incremental refresh and real-time data 
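For intuition, here is a minimal sketch (plain Python, not Power BI configuration; the policy values are made up) of what a correctly working rolling-window policy does: of the many archived partitions, only those inside the refresh window are re-processed.

```python
from datetime import date, timedelta

def partitions_to_refresh(today: date, archive_years: int, refresh_days: int):
    """Sketch of incremental-refresh partition selection: keep
    `archive_years` of history, but re-process only the daily
    partitions inside the last `refresh_days`. The Power BI service
    manages real partitions itself; this only illustrates the logic."""
    archive_start = date(today.year - archive_years, 1, 1)
    window_start = today - timedelta(days=refresh_days)
    all_partitions = [archive_start + timedelta(days=i)
                      for i in range((today - archive_start).days + 1)]
    return [d for d in all_partitions if d >= window_start]

# Example policy: archive 5 years of data, refresh only the last 10 days.
refreshed = partitions_to_refresh(date(2022, 1, 23), 5, 10)
print(f"{len(refreshed)} of ~1850 daily partitions re-processed")
```

If query folding is not available, the date filter cannot be pushed down to the source, and the service may end up reading the full source data anyway, which defeats the purpose of the policy.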

 

Typically, the effective memory limit for a command is calculated from the memory allowed for the dataset by the capacity (25 GB, 50 GB, 100 GB) and how much memory the dataset is already consuming when the command starts executing. For example, a dataset using 12 GB on a P1 capacity (25 GB limit) leaves an effective memory limit of 13 GB for a new command. However, the effective memory limit can be further constrained by the DbPropMsmdRequestMemoryLimit XMLA property when optionally specified by an application. Using the previous example, if 10 GB is specified in the DbPropMsmdRequestMemoryLimit property, then the command's effective limit is further reduced to 10 GB.
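That arithmetic, written out as a quick sketch (the GB figures are the ones from the example above):

```python
from typing import Optional

def effective_memory_limit(capacity_limit_gb: float,
                           dataset_usage_gb: float,
                           request_limit_gb: Optional[float] = None) -> float:
    """Memory available to a new command: the capacity limit minus what
    the dataset already consumes, optionally capped further by the
    DbPropMsmdRequestMemoryLimit XMLA property."""
    limit = capacity_limit_gb - dataset_usage_gb
    if request_limit_gb is not None:
        limit = min(limit, request_limit_gb)
    return limit

print(effective_memory_limit(25, 12))      # P1, dataset using 12 GB -> 13.0
print(effective_memory_limit(25, 12, 10))  # property set to 10 GB   -> 10.0
```

This is also the likely answer to your original question: the "database size before execution" in the error is the full in-memory size of the dataset, because the whole model counts against the limit while the command runs, even when only a small partition is actually being refreshed.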

 

To reduce the chance of exceeding the effective memory limit:

  • Upgrade to a larger Premium capacity (SKU) size for the dataset.
  • Reduce the memory footprint of your dataset by limiting the amount of data loaded with each refresh.
  • For refresh operations through the XMLA endpoint, reduce the number of partitions being processed in parallel. Too many partitions processed in parallel by a single command can exceed the effective memory limit (a sketch follows this list).
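For the last bullet, one way to cap parallelism over the XMLA endpoint is to wrap the refresh in a TMSL sequence command with maxParallelism. Below is a minimal sketch, assembled as a Python dict for readability (the database, table, and partition names are placeholders for your own model); the emitted JSON would be executed against the XMLA endpoint, for example from SQL Server Management Studio.

```python
import json

# TMSL: refresh two partitions one at a time instead of in parallel.
command = {
    "sequence": {
        "maxParallelism": 1,  # process one partition at a time
        "operations": [{
            "refresh": {
                "type": "dataOnly",
                "objects": [
                    {"database": "SalesModel", "table": "FactSales",
                     "partition": "FactSales-2022-01"},
                    {"database": "SalesModel", "table": "FactSales",
                     "partition": "FactSales-2021-12"},
                ],
            }
        }],
    }
}
print(json.dumps(command, indent=2))
```

Setting maxParallelism to 1 trades a longer refresh for a smaller peak memory footprint, since only one partition is processed at a time.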

 

More details: Resource governing command memory limit in Premium Gen 2 

 

Best Regards

Community Support Team _ Polly

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.



aj1973
Community Champion

Hi @fhdalsafi 

The message says it all. Do you have permission to manage the Premium capacity? Otherwise, you will need to reduce the size of your dataset. To answer your question: did you set up the incremental refresh correctly, and did you apply it to the fact table?

Regards
Amine Jerbi

If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
