Hi All,
I have a report connecting LIVE to Azure Analysis Services (AAS); it's quite a large model. Since our Premium capacity is almost at its maximum limit, we tried out this method instead.
So I encountered this error:
Couldn't load the data for this visual
You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory.
Technical Details:
RootActivityId: de1517a6-4345-4997-94d9-e9f089daeb7d
Date (UTC): 6/5/2020 2:41:33 AM
Please try again later or contact support. If you contact support, please provide these details.
Activity ID: fdcdc354-a6c6-454a-9616-a22af2a63153
Request ID: 64011976-c456-00bc-9f48-7d3adba3ab3b
Correlation ID: b2970e71-c380-3a53-4714-6581a5be39f9
Time: Tue Jun 09 2020 17:49:14 GMT+0800 (Singapore Standard Time)
Service version: 13.0.13524.194
Client version: 2006.1.01321-train
Cluster URI: https://wabi-europe-north-b-redirect.analysis.windows.net/
Thanks!
Zuheir
First of all, thanks for the response.
I tried looking into the Premium Capacity metrics, but it's not really showing anything when I filter down to the specific dataset that is having the issue, probably because it connects LIVE to AAS.
Maybe my description wasn't clear: this happens only on one of the table visuals, which shows around 100k rows of data. The table visual displays fine when the data is filtered down to around 2k rows, but it isn't able to show more than that.
I'm not sure where to check the AAS logs/metrics; this AAS instance is shared with another team, so we aren't monitoring it. This is a test run before we get our own AAS instance, so right now we are trying to determine whether AAS or Power BI is causing the memory limit issue.
No issue with the refresh; it is reaching its limit but we are still managing it. This test is about moving away from Power BI Import mode so we can lessen the current load.
Direct Query/Live connection is a good method to keep memory usage down. You need to go after the Import-mode datasets. Remember that these datasets use up TWICE the amount of RAM, so a P2 can accommodate a maximum of ONE 25 GB dataset plus a couple of stragglers before it falls over.
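To make that arithmetic concrete, here's a minimal sketch of the capacity math described above. It assumes a P2 capacity with roughly 50 GB of model memory and the 2x rule of thumb for Import-mode datasets; the numbers and the helper function are illustrative, not an official Microsoft formula.

```python
P2_MEMORY_GB = 50  # approximate model memory on a P2 capacity (assumption)

def fits_on_capacity(dataset_sizes_gb, capacity_gb=P2_MEMORY_GB):
    """Rough check: Import-mode datasets are assumed to need roughly
    2x their on-disk size in RAM, per the rule of thumb above."""
    required_gb = 2 * sum(dataset_sizes_gb)
    return required_gb <= capacity_gb

# One 25 GB Import dataset needs ~50 GB -> just fits on a P2.
print(fits_on_capacity([25]))       # True
# Add a 3 GB "straggler" and the estimate exceeds the capacity.
print(fits_on_capacity([25, 3]))    # False
```

This is only a back-of-the-envelope check; real memory pressure also depends on refresh timing, query load, and overhead not modeled here.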
Use the Performance Metrics template to identify the culprits and, in a friendly-aggressive way, help the developers of these datasets get their memory cost down. This is not a one-time exercise either; it needs to be ongoing.
Yes, I should have mentioned the incremental refresh part, although it would be interesting to hear Microsoft's perspective on that. In my view, incremental refresh only manages the dataset's storage partitions. You still need to load the _entire_ dataset into memory for rendering (at least until they implement the promised paging). It will certainly help with refreshing (only the current partition), but the swap-over is where it gets interesting: I would think they need to load the entire new copy into the SKU and then remove the old copy.