Hi,
I read a lot of threads on the forum trying to understand why my semantic model is not refreshing. I optimized it and managed to reduce its size by ~40%, from 1.45 GB to 824 MB (maybe it wasn't well optimized after all). But the refresh keeps failing with the following error:
I'm currently using a Power BI Embedded A2 capacity, which (if I'm understanding the capacity specifications correctly) means I have 5 GB of RAM. I'm aware that memory usage while refreshing is generally about twice the dataset's memory footprint, and that report usage can also have an impact on memory, but the behaviour is the same in the middle of the night as during the day.
I'm trying to understand how a semantic model of 824 MB can consume 5 GB of capacity memory.
Any advice or help would be much appreciated!
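As a rough illustration of the arithmetic involved (the multipliers below are common rules of thumb, not official figures): during a full refresh the existing copy of the model typically stays in memory to serve queries while the new copy is built, and decompression, dictionary building, and transformations add further overhead on top.

```python
# Back-of-the-envelope estimate of peak memory during a full refresh.
# The overhead factor is an assumed multiplier, not an official figure.
model_size_gb = 0.824            # compressed model size after optimization
old_copy_gb = model_size_gb      # existing model stays loaded for queries
new_copy_gb = model_size_gb      # new copy being built during refresh
overhead_factor = 1.5            # decompression + transformation buffers (assumption)

peak_estimate_gb = old_copy_gb + new_copy_gb + overhead_factor * model_size_gb
print(f"estimated peak: {peak_estimate_gb:.2f} GB")  # well under the A2's 5 GB
```

Even this pessimistic estimate lands under 5 GB, which suggests something in the refresh itself (a specific transformation, rather than the model's steady-state size) is blowing the memory up further.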
Hi,
As I understand it, storage and memory are two different components, and memory utilisation depends on your transformations. For example, if you have grouping or merge operations, memory utilisation will be high.
Hi @SaiTejaTalasila, thank you for your response, it helped me find the problem.
Most of my complex transformations are done in my analytics DB using dbt, except one where I unpivot columns on a table containing ~7M rows. I think that is the one causing the refresh failure.
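A toy pandas sketch (illustrative only, with made-up column names, not the actual model) of why an unpivot is so memory-hungry: melting N rows with K value columns produces N×K rows, all materialized in memory before any compression can apply. On a ~7M-row table, even a dozen value columns means tens of millions of intermediate rows, which is why pushing the unpivot upstream into the warehouse via dbt avoids the spike in the refresh engine.

```python
import numpy as np
import pandas as pd

# Small stand-in for the real ~7M-row table: 1,000 rows, 12 value columns.
n_rows, n_value_cols = 1_000, 12
wide = pd.DataFrame(
    np.random.rand(n_rows, n_value_cols),
    columns=[f"month_{i}" for i in range(n_value_cols)],
).assign(key=range(n_rows))

# Unpivot: each row fans out into one row per value column.
tall = wide.melt(id_vars="key", var_name="month", value_name="value")
print(len(wide), "->", len(tall))  # 1000 -> 12000 rows
```

Scaling the same fan-out to 7M rows makes it easy to see how a single unpivot step can dwarf the model's compressed 824 MB footprint during refresh.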
Thank you again!