Hi,
I am getting an error while refreshing the dataset in the Power BI service portal. The error says:
-------------------------------
--------------------------------
When I added a filter in the data source view (Azure SQL Data Warehouse) to restrict the number of rows, the refresh worked. The maximum I can refresh is only about 4 million records. My source view has approximately 16 million records, and it will grow each month. My main fact table in the model has many calculated columns, and I have already kept only those columns which are definitely required in the model.
Please help.
Thanks in advance
-IP
Hi,
I found the actual issue and believe I have a solution for this.
So the error which I posted in my ticket originally is correct but misleading as well. My Power BI data model with 4 million records was just 120 MB, whereas when I tried loading 5 million records I got that error. I don't believe my Power BI model file goes over the 1 GB max limit.
To investigate the issue further, I created a VM on Azure with a big configuration: around 28 GB RAM, a multi-core processor, etc. I tried refreshing my model on that machine while keeping an eye on the Activity Monitor. I found that all the rows were successfully read into the model without any high resource usage, but after reading all the rows, something happened underneath which quickly spiked up to full RAM usage, and it took a while before the refresh completed. I checked the Power BI model file size... surprisingly, it was just under 410 MB.
So what is the issue?
Then I looked at my model to investigate what made it consume such a high amount of RAM after reading all the rows, and there I found the real issue... I had created around 15 calculated columns (mostly int, i.e. 64-bit), and I believe these calculated columns needed a huge amount of memory to process: 15 columns × 8 bytes × 5,000,000 rows ≈ 600 MB of raw column data before compression, and the intermediate working memory needed while evaluating those columns can be several times larger than that.
I discussed this with Microsoft, and they confirmed that because Power BI is software as a service, there is an upper cap on maximum resource usage, and this cannot be altered for any client request. The compression engine in Power BI doesn't like wide data as much as it likes long data.
In the end, I converted all of my calculated columns into measures, and that fixed the issue. Calculated columns have to be computed during refresh and stored as physical columns within the model, whereas measures are calculated later, in memory, at query time.
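For anyone hitting the same wall, here is a sketch of the kind of conversion I mean. The table and column names below are made up for illustration, not from my actual model:

```dax
-- Calculated column: evaluated for every row during refresh
-- and stored physically in the model (costs RAM while refreshing)
LineAmount = Sales[Quantity] * Sales[UnitPrice]

-- Equivalent measure: evaluated at query time, nothing stored per row
Total Line Amount := SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )
```

The measure gives the same totals in visuals, but because nothing is materialized per row, the refresh no longer has to allocate memory for those 15 extra columns.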
Let me know if anyone has any other thoughts on this.
Thanks
IP
Hi @ipsingh79,
Power BI offers two licenses, Power BI (free) and Power BI Pro, with different data capacity limits:
Free users have a maximum 1 GB data capacity.
Power BI Pro users have a maximum 10 GB data capacity.
Pro users can create groups, each with a maximum 10 GB data capacity.
You can load your data into Power BI Desktop, save it as a .pbix file, and check the file size. You will see a prompt indicating you are over the limit, and the service will return that error message when you refresh. If you are a free user, you can upgrade to Pro when the file hits the limit. For more details, please check this article.
Thanks,
Angelia