jgarciabu
Advocate I

visual has exceeded available resources

Hi All,

 

I've been using Power BI for the better part of a year now. For the last month and a half or a little more, we've had annoying failures to refresh for 6 of our 14 tile visualizations. It's also inconsistent. Some days certain tiles of the 6 refresh, and other days they don't. The dashboard in question contains nothing more than 14 pinned single data point cards. They are simple queries made to our Azure SQL Database. On the database, all the queries run immediately with no delay. I have all my visualizations set to connect live to our Azure database. I'm at a loss for what to do. It seems like it's an issue with resources on Microsoft's servers. Here's the error I get:

 

Resources Exceeded
 
This visual has exceeded the available resources. Try filtering to decrease the amount of data displayed.
Please try again later or contact support. If you contact support, please provide these details.

 

Activity ID: c7b563f3-c93d-2aa1-2784-082b8d81f234
Request ID: e4c3c75a-060f-6b40-8ab0-7ed194ba10c3
Correlation ID: 69832fcb-c121-4ed4-ca06-89d5275e48e2
Time: Thu Jan 19 2017 11:09:24 GMT-0500 (EST)
Version: 13.0.1700.1003

 

1 ACCEPTED SOLUTION
v-ljerr-msft
Employee

Hi @jgarciabu,

 

Yes, the issue occurs when a visual has attempted to query too much data for the server to complete the result with the available resources.

 

As suggested in the error, you may need to try filtering the visual to reduce the amount of data in the current result.
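For illustration only, here is a minimal sketch of what constraining a measure to a recent date window can look like in DAX. The names (Sales, Sales[Amount], Sales[OrderDate]) are hypothetical, and the same effect can also be achieved with a visual-level or report-level filter in the service.

// Hypothetical measure that limits the scan to the last 90 days of data
Sales Last 90 Days :=
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER (
        ALL ( Sales[OrderDate] ),
        Sales[OrderDate] >= TODAY () - 90
    )
)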

 

Regards


44 REPLIES
ballade4
Frequent Visitor

Also having this issue. Does not make sense. Need to understand limits better, as it is turning a supposedly enterprise-level tool into a bear for certain critical tasks within my relatively small dataset.

I am having the same issues. The dataset is not huge, but I do have a number of measures because I need the model to be dynamic. The model works on the desktop (slow, but it works), but in the service it sometimes works and sometimes crashes. Also, I have noticed that the "spinning wheel" that used to tell me if the system was still calculating is now gone, so I have no idea when all the calculations are completed (I assume it vanishes when a model is nearing the resource limit). It's quite frustrating because you start to think that the model is stuck, and then all of a sudden the data refreshes.

 

Is this all because we are not signed up to Premium Capacity? Other than major corporations, who can afford Premium Capacity at $4,000/month (or more)? And when you start looking at Microsoft's pricing model for usage, you start to wonder how anyone can really understand what their monthly bill is ever going to be. I call it the "Rube Goldberg Pricing Model". Very frustrating.

Hello,

 

Would someone confirm whether this is a Premium / Pro issue? Either I need to change my DAX, or I don't know what else to do. It seems that every time I upload my data I am going to see this issue.

I just hope Microsoft brings us a solution.

 

Cheers

I got this error today, on a relatively small set of data. It is only a year's worth, but there are some complicated DAX expressions on it.

 

It seems the last straw was adding a PERCENTILEX.INC in a matrix. It won't even show one day's worth of data, which is only about 20 rows in the matrix. The table itself is only 436 rows.
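For context, a minimal sketch of the kind of percentile measure described, with hypothetical table and column names (FactEvents, FactEvents[DurationSec]). Each matrix cell evaluates the expression over its own filter context, which is where the cost can add up.

// Hypothetical 90th-percentile measure, evaluated once per matrix cell
P90 Duration :=
PERCENTILEX.INC (
    FactEvents,               // assumed fact table
    FactEvents[DurationSec],  // assumed numeric column
    0.90
)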

Rajiv
Advocate I

I am having the same issue. I have a function that picks the latest date and time out of 5 million rows, and then uses that against other rows to calculate the relative age, which then becomes the basis of the period for the chosen metric. It works well in my PBIX file, as my desktop has 32 GB of RAM. Our on-premises data gateway runs on a VM with 4 GB of RAM. Is there a possibility that it would work fine if I increase the RAM on the VM?
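A rough sketch of the pattern described, using hypothetical names (FactReadings, FactReadings[LoadDateTime]); the actual measures may well differ.

// Hypothetical measure: latest timestamp across the whole table
Latest Load :=
CALCULATE (
    MAX ( FactReadings[LoadDateTime] ),
    ALL ( FactReadings )
)

// Hypothetical measure: age, in days, of the current filter context
// relative to that latest timestamp
Relative Age Days :=
DATEDIFF ( MAX ( FactReadings[LoadDateTime] ), [Latest Load], DAY )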

I'd be interested in this as well. We're running our data gateway on Azure. Does the memory on the data gateway even matter, given that the report is hosted in the PBI service?

jl20
Helper IV

I'm having the same issue in the service, but not in the desktop model (which is 20 MB, by the way). I have been using PBI for over a year and this is the first time I've experienced this. Please fix.

sonalivt
New Member

Hi, I too am facing this issue. I have created my report using DirectQuery against an Azure SQL Database. My table has about 445 rows and a measure with a sum of all values. Even so, it is showing "Visual has exceeded available resources".

I'm also facing this issue... What are the exact limitations when it comes to the Power BI service?

Anonymous
Not applicable

Yeah, any updates? Power BI on the web is way too slow with a dataset of only 200 MB; it just does not suffice.

Anonymous
Not applicable

I am also having the same issue. It takes too long to load the data into the graphs.  Please help asap.

CahabaData
Memorable Member

As I understand the reply from Microsoft, it is the size of the record set, not the DAX or visuals per se. So, for example, instead of trying to pull in X years of data, segment that / filter it down to a smaller data set.

 

But what is not said is how much is too much... that would be worthy of note.

www.CahabaData.com

There are definitely some limitations in regard to in-memory analysis as well. I imagine that Power BI is running Azure Analysis Services behind the scenes and that a certain amount of memory is allocated to each user.

 

My tabular dataset isn't huge. I've seen this when I'm doing a significant number of date filters in a DAX measure to create a date-dynamic measure for historical trends. It takes a good chunk of my RAM locally when running, but completely borks when it's published.
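For illustration, a minimal sketch of the sort of date-dynamic comparison measure described, using hypothetical names (Sales, 'Date'[Date]). Using variables so each sub-expression is evaluated only once is one common way to keep this kind of measure cheaper.

// Hypothetical year-over-year comparison measure
Sales YoY Delta :=
VAR CurrentSales = SUM ( Sales[Amount] )
VAR PriorSales =
    CALCULATE (
        SUM ( Sales[Amount] ),
        SAMEPERIODLASTYEAR ( 'Date'[Date] )
    )
RETURN
    CurrentSales - PriorSales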

 

I guess the next step would be to set up my own SSAS for my tabular model, rather than relying on Power BI's.

Dan Malagari
Consultant at Headspring

Interesting to hear. In your comparison, how much RAM is it taking locally? Are you up in the 32 GB area on a server, or more?

www.CahabaData.com

I'm able to run some of the measures without issues on my 12 GB RAM Windows VM, but it doesn't cooperate when I have multiple card visuals with different measures on the same report page. On my desktop, I have 32 GB of RAM and I'm able to render everything properly.

 

The "big" DAX calculation can take upwards of 18GB of RAM when making the date comparisons.  It really only gets bad on my desktop when I have more than one visual doing a fairly large calculation against a table of ~40K rows.

 

It's at the point where I'm investigating whether OLAP cubes would help improve the performance of my historical data (again, offloading the strenuous calculations to an SSAS instance).

 

 

Dan Malagari
Consultant at Headspring

Interesting - thanks for sharing.

www.CahabaData.com
v-ljerr-msft
Employee

Hi @jgarciabu,

 

Yes, the issue occurs when a visual has attempted to query too much data for the server to complete the result with the available resources.

 

As suggested in the error, you may need to try filtering the visual to reduce the amount of data in the current result.

 

Regards

This happens so intermittently, and at such random times, that it would be nice to know if there is something that can be done when it occurs. At times it works fine, so there has to be a setup where the stars align. What is that sweet spot, and how do we get our visuals to that nirvana so they can be refreshed more reliably?

This answer is not acceptable.

Anonymous
Not applicable

Why is this issue marked as solved? What are the "available resources", and what is the solution?

 

In my instance, some users are seeing this issue in the Power BI service and some are not. How do I deal with this kind of issue? It would be terrible to believe a visualization is working properly, go out on a sales opportunity with a client, and have it fail like this.

This simply does not make any sense without considering the data size, result size, type of visuals, etc.

For instance, in my POC environment there are 34,000 rows even after all the joins. I have a table visual which is currently displaying no more than 40 rows (8 columns).

As soon as I try to add another column, the visual fails.

One of the columns is a measure returning data in the relevant currency.

 

I can't understand how inefficient my query could be to make resources so scarce!
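Purely as an illustration of why one extra measure column can be disproportionately expensive: a table visual evaluates the measure once per cell, so a per-row iterator like the hypothetical sketch below (FactTransactions and ExchangeRate are made-up names) multiplies the work with every row and column added.

// Hypothetical per-row currency conversion; assumes a many-to-one
// relationship from FactTransactions to ExchangeRate so RELATED works
Amount Local :=
SUMX (
    FactTransactions,
    FactTransactions[Amount] * RELATED ( ExchangeRate[Rate] )
)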
