My dataflow is retrieving data from Azure blobs.
On refresh, it returns this error:
PipelineException: The evaluation reached the allowed cache entry size limit. Try increasing the allowed cache size.
When I copy and paste the query into Power BI Desktop (Power Query), it runs fine.
How can I fix this?
Thank You!
Thank you. The dataset is already fully validated in Power BI Desktop, and the cache size has also been increased in Power BI Desktop.
The error occurs in the Power BI Service dataflow (i.e., before the data is read by Desktop).
Hi, I'm having the same issue when trying to refresh a Power BI Service Dataflow. Were you able to find a fix for this error? Thanks
Hi @webportal ,
Try increasing the Maximum allowed (MB) cache size for Power BI Desktop.
It's good practice to validate the complete dataset before you schedule the report. One additional help, especially with larger datasets retrieved from an API, is to increase the cache limits; this can improve Power BI's performance and stability. To configure this in Power BI Desktop, go to File > Options and settings > Options > Global > Data Load, and raise the Maximum allowed (MB) value under Data Cache Management Options.
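As a side note, when a dataflow query re-reads a large source in several downstream steps, buffering the table once in Power Query M can reduce repeated evaluation and cache pressure. This is a general M sketch, not a guaranteed fix for the Service error; the storage account URL is a placeholder and the step names are illustrative:

```m
let
    // Placeholder account URL - replace with your own Azure Blob Storage endpoint
    Source = AzureStorage.Blobs("https://<account>.blob.core.windows.net/"),
    // Buffer the table once in memory so later steps reuse it
    // instead of re-evaluating the source on each reference
    Buffered = Table.Buffer(Source)
in
    Buffered
```

Note that `Table.Buffer` loads the whole table into memory, so it only helps when the table fits comfortably in available memory.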
Best Regards,
Liu Yang
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
I'm getting this error and went up to 100000 MB as a test. It works on older versions of Power BI Desktop, but I can't get it to work on the Dec 2022 or Feb 2023 versions. I get this same message on all my larger tables.