david-hepstar
Regular Visitor

46GB PPU Semantic Model: "Memory error: maximum allowable memory allocation for your tier"

I have one large Semantic Model with data refreshed incrementally from Google BigQuery. Partitions are by month for the last 24 months, and by default each refresh reprocesses only the last 14 days (usually successfully). The Semantic Model takes up 46 GB of memory and is stored in a Premium Per User (PPU) Workspace with no other large Semantic Models.

 

Reports are in development, and I know there are very few report queries running against these semantic models at any given time. 


 

My challenge comes when I try to refresh the historical partitions. When I run Process Data for 1 or 2 monthly partitions via the XMLA endpoint in SSMS (the command is sketched below, after the error messages), I usually, but not always, receive one of these errors:

 


 

Error:

 

Failed to save modifications to the server. Error returned: 'Memory error: You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory.'

 

Less common:

 

The operation was throttled by the Power BI Premium because of insufficient memory. Please try again later

 

Also less common:

 

The operation was throttled by the Power BI Premium because the operation was unable to reserve enough memory. Please try again later.
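
For reference, the Process Data command I'm sending from SSMS looks roughly like this. It's just a sketch: the database, table, and partition names are placeholders for my actual objects.

{
  "refresh": {
    "type": "dataOnly",
    "objects": [
      {
        "database": "MySemanticModel",
        "table": "fact_events",
        "partition": "fact_events-2023-01"
      },
      {
        "database": "MySemanticModel",
        "table": "fact_events",
        "partition": "fact_events-2023-02"
      }
    ]
  }
}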

 

 

Question: Where has all this extra memory gone? 46 GB isn't small, but it also isn't close to the 100 GB limit for PPU. Even if refreshing 2 months consumes an extra 6 GB of memory, we should still be able to handle a refresh without running into memory limits.

 

Follow up: How can I track my PPU memory usage in real time or close to real time?

 

PPU FAQ: https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-per-user-faq
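
The closest I've found so far is querying the engine's memory DMVs over the XMLA endpoint (for example from SSMS or DAX Studio, running each statement separately), though I'm not sure how well these reflect the capacity-level limit:

-- Memory currently allocated per object on the Analysis Services instance behind the XMLA endpoint
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE

-- Compressed size of each column segment in the connected model (useful for spotting expensive columns)
SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS

If there's a better way to watch PPU memory consumption while a refresh is running, I'd love to hear it.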

1 ACCEPTED SOLUTION
david-hepstar
Regular Visitor

I've finally received a different error message, which I assume is closely related to the ones I was receiving earlier. It makes me think there may be some user settings I could adjust to get around this, though I haven't confirmed that.

Failed to save modifications to the server. Error returned: 'Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 53639 MB, memory limit 49315 MB, database size before command execution 53084 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more.'

Technical Details:
RootActivityId: 344ed8dd-99a4-4539-acbb-392119ea557e
Date (UTC): 3/23/2024 11:06:16 AM

 
In the end I avoided this issue entirely by redesigning my data model to incorporate a second fact table at a different grain. This enabled me to drop an expensive high-cardinality ID column from my original fact table, cutting my memory usage by ~85%. (Note that the error above also shows the model had grown to about 53 GB before the command ran, already above the ~48 GB limit the engine reports, so shrinking the model is what ultimately resolved this.)


6 REPLIES

Hi, 

Could you please explain a bit more about how you incorporated a second fact table at a different grain?

I'm facing kind of the same issue and would like to hear how you resolved it.

Thanks!

Hey @amir_mm, rather than explain my situation, let me point you to the article that I initially found confusing but that helped in the long run.

It's an article by Alberto Ferrari at SQL BI about Header/Detail models: https://www.sqlbi.com/articles/header-detail-vs-star-schema-models-in-tabular-and-power-bi/. It's a good article, but the first few times I skimmed it I assumed the best solution for my case was to use only the "detail" table and then calculate the measures at the "header" grain with DAX. That DAX is simple, but it calculated too slowly for the size of my detail table, and it also required keeping the header_id column in my detail table. That high-cardinality column took up a lot of space even after I converted it to an integer. Finally I stumbled on this really helpful comment at the bottom of the article:

 

(Screenshot of Alberto Ferrari's comment at the bottom of the article.)

As Alberto points out in this comment, the key thing to avoid in a header/detail model is the relationship between the two fact tables. In my case I found that using both a header and a detail table -- without any relationship between them -- enabled me to remove all my high-cardinality fact ID columns, which wound up reducing my memory consumption by 90%, despite adding a second large fact table! The model is also faster, and both tables share the same dimensions. (There was one challenge with a many-to-many relationship between dim_product and my header table, but I handled that with a junction table.)
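
To make that concrete, here is a rough sketch with made-up table and column names: fact_order_header and fact_order_detail are separate fact tables at different grains, each related to the shared dimensions but not to each other.

-- Header-grain measure: reads straight from the header fact, no header_id needed
Order Amount := SUM ( fact_order_header[order_amount] )

-- Detail-grain measure: reads from the line-item fact, sliced by the same shared dimensions
Line Amount := SUM ( fact_order_detail[line_amount] )

-- The detail-only approach I moved away from: pushing header-grain logic through a
-- high-cardinality header_id column was slow, and storing that column was expensive
Order Count (old) := DISTINCTCOUNT ( fact_order_detail[header_id] )

Because both fact tables are filtered through the same dimensions, the header-grain measures no longer need a header_id column at all, which is where the memory saving came from.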

Hope this background helps you a bit. Good luck!🙂 

collinq
Super User

Hi @david-hepstar ,

 

According to this article, Power BI Premium Per User - Power BI | Microsoft Learn, the 100 GB limit for PPU may also include memory reserved for other operations. Also, since you stated it does not always error, I am wondering whether anything else under your PPU license is running at the same time. It wouldn't take much for a refresh of a different report running concurrently to consume the remaining capacity. (According to the documentation, "A Premium capacity might throttle data refresh operations when too many semantic models are being processed concurrently.")

 

Also, according to this article, What is Power BI Premium? - Power BI | Microsoft Learn, for non-Premium capacities the amount of memory required for a semantic model is doubled (for a 46 GB model that would be roughly 92 GB, which is close to the 100 GB limit).

 

Also, it is possible that within the queries themselves you have created some sort of calculation or measure (or a circular reference) that could be running amok. Has anything changed on the query side since the last successful run?

 

Another item to check is your gateway.  Is it being overwhelmed?

 

 








Thanks for this feedback @collinq. Still struggling to pin this down, but you've brought up some helpful ideas, which I'm in the process of investigating. A couple of comments:

 

Also, since you stated it does not always error, I am wondering whether anything else under your PPU license is running at the same time.

 

Very little else is happening in this Workspace. I have another non-PPU workspace with some dataflows; I don't think they're big, and they aren't refreshing, but I'm not sure how to check their size either. As a next step I may try deleting them, just in case. It's also not clear to me whether these dataflows count against my 100 GB memory limit, especially since they're in a non-PPU workspace.

 

 

(according to the documentation "A Premium capacity might throttle data refresh operations when too many semantic models are being processed concurrently.")

 

Thanks for pointing this out; I hadn't thought of it. I do have another small (32 MB) semantic model in the workspace, but it is only refreshed manually and very infrequently. I also have a small data model in another workspace that refreshes hourly. It seems like that shouldn't conflict, but I suppose it might.

 

 

Also, according to this article, What is Power BI Premium? - Power BI | Microsoft Learn, for non-Premium capacities the amount of memory required for a semantic model is doubled.

 

I don't think this applies here, since I am working within Premium Per User and only refreshing part of the model. If I've done my sums right, the data model takes up 46 GB and the incremental refresh would take up another ~6 GB, so I'm wondering where the other ~48 GB has gone. I can imagine that some heavy querying might cause issues from time to time, but that seems unlikely given I'm the primary user.

 

Also, it is possible that within the queries themselves you have created some sort of calculation or measure (or a circular reference) that could be running amok. Has anything changed on the query side since the last successful run?

Another good idea. There are no calculated columns in the model at all. There are some measures I know are slow, but not significantly so, and I'm not sure they would have much impact since they aren't generally being evaluated during the model refresh.

 

 

Another item to check is your gateway.  Is it being overwhelmed?

 

Helpful suggestion, thanks! I'm using a cloud connection to Google BigQuery. If I understand correctly, this does not involve any gateway, so there's not much for me to troubleshoot or configure here.


 

Hey @david-hepstar ,

 

You are right - you are not using a gateway if that is your only connection.

 

It does sound like you don't have much concurrency either.

 

I think it might be time to open a ticket and have Microsoft help you track down the issue. Let us know the result if you go that direction.







