zuheirashraf17
Advocate II

Connecting to Analysis Services - Maximum Allowable Memory Limit

Hi All,

 

I have a report with a live connection to Azure Analysis Services (AAS); it's quite a large model. Since our Premium capacity is almost at its maximum limit, we tried this method instead.

 

I'm encountering this error:

Couldn't load the data for this visual
You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory. 
 
Technical Details: 
RootActivityId: de1517a6-4345-4997-94d9-e9f089daeb7d 
Date (UTC): 6/5/2020 2:41:33 AM
Please try again later or contact support. If you contact support, please provide these details.
Activity ID: fdcdc354-a6c6-454a-9616-a22af2a63153
Request ID: 64011976-c456-00bc-9f48-7d3adba3ab3b
Correlation ID: b2970e71-c380-3a53-4714-6581a5be39f9
Time: Tue Jun 09 2020 17:49:14 GMT+0800 (Singapore Standard Time)
Service version: 13.0.13524.194
Client version: 2006.1.01321-train
Cluster URI: https://wabi-europe-north-b-redirect.analysis.windows.net/

 

  1. My understanding was that the query load from a live connection to AAS should fall on AAS rather than on the Power BI capacity, but I couldn't find any documentation on this, so I'd like confirmation.
  2. In the error message, where it says "...memory allocation for your tier", does this refer to AAS or to the Power BI service?
  3. If I were to upgrade, should it be AAS or the Power BI service? We are currently on a P2, but because a large team shares this one capacity, it has started hitting the maximum memory limit during refreshes.
  4. If anyone has proper documentation or an article related to this, please point me to it.

Thanks!

Zuheir

 

6 Replies
zuheirashraf17
Advocate II

First of all thanks for the response.

 

I tried looking into the Premium capacity metrics, but they don't really show anything when I filter down to the specific dataset that is having the issue, probably because it connects live to AAS.

 

Maybe my description wasn't clear: this happens in only one of the table visuals, which shows roughly 100k rows of data. The table visual displays fine when the data is filtered down to around 2k rows, but it cannot show more than that.

 

I'm not sure where to check the AAS logs/metrics; this AAS instance is shared with another team, so we are not monitoring it. This is a test run before we get our own AAS instance, so we are now trying to determine whether AAS or Power BI is causing the memory-limit issue.

 

There's no issue with the refresh; the capacity is reaching its limit, but we are still managing it. This test is about moving away from Power BI Import mode so we can reduce the current load.

 

lbendlin
Super User

Direct Query/Live connection is a good way to keep memory usage down. You need to go after the Import mode datasets. Remember that during a refresh these datasets use up TWICE the amount of RAM, so a P2 can accommodate at most ONE 25 GB dataset plus a couple of stragglers before it falls over.
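As a rough back-of-the-envelope check (a hypothetical sketch of the arithmetic above, not Microsoft's actual memory model; the function and parameter names are illustrative):

```python
# Hypothetical sketch of the capacity arithmetic claimed above: a full
# (non-incremental) refresh holds the old and new copies of a dataset in
# memory at the same time.
def can_full_refresh(capacity_gb, dataset_gb, other_loaded_gb=0.0):
    peak_gb = other_loaded_gb + 2 * dataset_gb  # old copy + new copy
    return peak_gb <= capacity_gb

# A P2 has 50 GB of RAM, so one 25 GB dataset alone uses it all during refresh:
print(can_full_refresh(50, 25))     # True, but with zero headroom
print(can_full_refresh(50, 25, 5))  # False: 5 GB of other datasets tips it over
```

This is only the "worst case" view debated later in the thread; incremental refresh changes the picture.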

 

Use the Performance Metrics template to identify the culprits, and in a friendly but firm way help the developers of those datasets get their memory cost down. This is not a one-time exercise either; it needs to be ongoing.

Hi there

It is not always the case that datasets use TWICE the amount of RAM; that would only happen if you were not using incremental refresh.

A P2 has a total of 50 GB of RAM available, so even with a 25 GB model there is more than enough capacity for dataset refreshes and other datasets. It all depends on how they are consumed and used.





Yes, I should have mentioned the incremental refresh part, although it would be interesting to hear Microsoft's perspective on that. In my view, incremental refresh only manages the dataset's storage partitions. You still need to load the _entire_ dataset into memory (at least until they implement the promised paging) for rendering. It will certainly help with refreshing (only the current partition), but when it comes to the blitting, that's when it gets interesting. I would think they need to load the entire new copy into the SKU and then remove the old copy.

Hi there

I can confirm that when you use incremental refresh, it ONLY uses the memory required for those partitions.

It will certainly NOT need to load the entire dataset into memory twice when refreshing with incremental refresh.

Whilst a lot of people might think it needs to copy entire datasets, this is not true; there are a lot of smart things happening behind the scenes, which I have seen first-hand.
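The two positions in this exchange can be contrasted with a hypothetical sketch: a full refresh holding old and new copies versus an incremental refresh touching only the current partition. All names and numbers here are illustrative assumptions, not measured behavior:

```python
# Hypothetical peak-memory comparison for the two refresh views debated above.
def refresh_peak_gb(dataset_gb, partition_gb=None):
    if partition_gb is None:
        return 2 * dataset_gb             # full refresh: old + new copy in memory
    return dataset_gb + partition_gb      # incremental: loaded model + one partition

print(refresh_peak_gb(25))      # 50 -> a full refresh of a 25 GB model needs ~50 GB
print(refresh_peak_gb(25, 2))   # 27 -> incremental only adds the refreshed partition
```

Under this sketch, incremental refresh is what makes a 25 GB model comfortably fit a 50 GB P2 alongside other datasets.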






GilbertQ
Super User

Hi there

That error is possibly from Power BI Premium, where you have exceeded the memory allocation for your tier.

The only way to make sure is to install the Power BI Premium Metrics template app and view what is consuming your memory.

It does not appear that your AAS is causing this issue, unless AAS is also running out of memory, which you can check in Azure Log Analytics/Metrics to see whether that is where the error originates.




