I've been using Power BI for the better part of a year now. For the last month and a half or so, we've had annoying refresh failures on 6 of our 14 tile visualizations. It's also inconsistent: some days certain tiles of the 6 refresh, and other days they don't. The dashboard in question contains nothing more than 14 pinned single-data-point cards backed by simple queries against our Azure SQL Database. On the database itself, all of the queries run immediately with no delay. All of my visualizations are set to connect live to the Azure database. I'm at a loss for what to do; it seems like an issue with resources on Microsoft's servers. Here's the error I get:
This visual has exceeded the available resources. Try filtering to decrease the amount of data displayed. Please try again later or contact support. If you contact support, please provide these details.
As I understand the reply from Microsoft, it is the size of the record set, not the DAX or the visuals per se. In other words, instead of trying to pull in X years of data, segment it / filter down to a smaller data set.
But what isn't said is how much is too much... that would be worth noting.
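For what it's worth, here is a rough sketch of the kind of filtering I take that to mean, i.e. restricting a measure to a recent window instead of the full history (the table and column names are placeholders, not from the actual report):

```dax
// Hypothetical example: limit the evaluation to the trailing 90 days
// rather than scanning the entire history of the table.
Sales Last 90 Days =
CALCULATE (
    SUM ( Sales[Amount] ),
    DATESINPERIOD ( 'Date'[Date], MAX ( 'Date'[Date] ), -90, DAY )
)
```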
There are definitely some limitations with regard to in-memory analysis as well. I imagine Power BI is running Azure Analysis Services behind the scenes and that a certain amount of memory is allocated to each user.
My tabular dataset isn't huge. I've seen this when I'm applying a significant number of date filters inside a DAX measure to create a date-dynamic measure for historical trends. It takes a good chunk of my RAM locally when it runs, but it falls over completely once published.
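To give a sense of the pattern, here is a simplified sketch of the sort of date-dynamic measure I mean; the table and column names are illustrative, not my actual model:

```dax
// Illustrative only: compare the current period against the same period last year,
// which forces the engine to evaluate the measure over two date ranges.
Sales YoY Change =
VAR CurrentPeriod = SUM ( Sales[Amount] )
VAR PriorPeriod =
    CALCULATE (
        SUM ( Sales[Amount] ),
        SAMEPERIODLASTYEAR ( 'Date'[Date] )
    )
RETURN
    CurrentPeriod - PriorPeriod
```

The real measures layer several date filters like this on top of each other, which is where the memory use climbs.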
I guess the next step would be to set up my own SSAS for my tabular model, rather than relying on Power BI's.
I'm able to run some of the measures without issue on my Windows VM with 12 GB of RAM, but it doesn't cooperate when I have multiple card visuals with different measures on the same report page. On my desktop, which has 32 GB of RAM, everything renders properly.
The "big" DAX calculation can take upwards of 18 GB of RAM when making the date comparisons. It only really gets bad on my desktop when I have more than one visual doing a fairly large calculation against a table of ~40K rows.
It's at the point where I'm investigating whether OLAP cubes would help improve the performance of my historical data (again, offloading the strenuous calculations to an SSAS instance).
I don't think this issue is solved. What are the actual limits, and are they the same for all visuals? I'm having this issue with a simple date filter that queries a view with only a few thousand rows. We barely have 1.5 years of data; this is pathetic.
Hi, I am facing this issue too. I created my report using DirectQuery against an Azure SQL Database. My table has about 445 rows, and I have one measure that sums a single column. Even so, it shows "Visual has exceeded available resources".
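For reference, the measure is nothing more elaborate than a plain sum, roughly like this (the actual table and column names differ):

```dax
// Roughly what the measure looks like: a plain sum over a ~445-row DirectQuery table.
Total Value = SUM ( MyTable[Value] )
```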