Hello!
I know this topic has been mentioned before, but I still have a couple of questions about it:
Over the last couple of days I have tried refreshing the biggest table in my dataset, but every attempt took about an hour and ended with the error:
"Failed to save modifications to the server. Error returned: 'There's not enough memory to complete this operation. Please try again later when there may be more memory available.'"
I tried removing relationships, calculated columns, and imported columns from the table, and I still barely managed to refresh it.
The whole dataset is only about 3 GB, yet the refresh is easily interrupted or fails with the out-of-memory error.
My questions are:
1. Which of the factors listed above (many relationships, many rows/columns, calculated columns) has the greatest computational cost?
2. Would a computer with more processing power solve this issue?
Thank you in advance!
The refresh is running out of RAM. This can happen when the data transformations are complex, the data volume is large, the model has many calculated columns, etc. More processing power won't help by itself; more memory (or a smaller, simpler model) will.
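As a rough illustration of why some columns cost far more memory than others: Power BI's VertiPaq engine dictionary-encodes each column, so memory grows with the number of *distinct* values, not just the row count. The sketch below is a simplified Python model of that idea (the per-index byte count and size estimates are assumptions for illustration, not VertiPaq's actual internals):

```python
import sys

def raw_size(values):
    # Approximate in-memory size of an unencoded column:
    # list overhead plus each element stored individually.
    return sys.getsizeof(values) + sum(sys.getsizeof(v) for v in values)

def dictionary_encoded_size(values):
    # Dictionary encoding: store each distinct value once,
    # plus a small integer index per row (assume 4 bytes each).
    distinct = set(values)
    dictionary = sum(sys.getsizeof(v) for v in distinct)
    indexes = len(values) * 4
    return dictionary + indexes

rows = 99_999
low_cardinality = ["red", "green", "blue"] * (rows // 3)   # 3 distinct values
high_cardinality = [f"id-{i}" for i in range(rows)]        # every value unique

# A low-cardinality column compresses well; a unique-per-row column
# (typical of calculated key or concatenated-text columns) barely compresses.
print(raw_size(low_cardinality), dictionary_encoded_size(low_cardinality))
print(raw_size(high_cardinality), dictionary_encoded_size(high_cardinality))
```

This is why a single high-cardinality calculated column can dominate refresh memory even in a "small" 3 GB model.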
Thanks for the feedback!