
Lenihan
Helper III

Data getting too large - breaching 4 GB memory usage. Data framework wrong, maybe?

I'm starting to have a problem with Power BI Desktop crashing due to out-of-memory errors.

We keep a rolling month of data so that we can watch trends on the graphs. So far, our source data is about 800 MB (CSV files). When I watch memory usage in Power BI Desktop, I sometimes see it spiking close to 4 GB. Others are having it crash, so I'm assuming they are peaking over 4 GB. We could increase the application's memory allowance, but our work laptops only have 8 GB of RAM. Hardware upgrades aren't an option, and shouldn't be necessary anyway.

 

The layout is a bit complicated; I'll try to explain, as I can't share or provide any of the data.

We produce monthly KPI metrics on configuration items (CIs) within a CMDB. The metrics we need require comparisons against the full body of data - for example, percent of duplicates, or percent in a specific status (removed, active, etc.). We need to calculate this each month, and we can only compare the data to a snapshot view per month. For easy numbers: if we have 100K configuration items, we load that full 100K of records each month, even if a config item wasn't updated in the reporting month. The reason is that this was the only way I could figure out how to do calculations 'per month'.

 

So, if I'm calculating the percent of configuration items in a specific status each month, I need to do that comparison only against the config item statuses for that month. This is why we load the full set of data each month. When I load the data, I add a field called "Reporting Month" containing the month and year, so that the formulas know which records to compare for calculations. I add the CSV file, and then I append that dataset to the total dataset. Looking at my datasets, I have all the months (Jan 2017, Feb 2017, etc.) and the master "All CIs" dataset that each month gets appended to.
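The per-month comparison described above can be sketched outside Power BI to make the logic concrete. This is a minimal illustration in Python, with hypothetical field names ("ReportingMonth", "Status") standing in for the real CMDB columns; it only shows the calculation shape, not the actual report.

```python
# Hypothetical snapshot rows: every CI is re-loaded for each reporting
# month and tagged with a "ReportingMonth" field, mirroring the append
# approach described above.
rows = [
    {"ci_id": 1, "ReportingMonth": "Jan 2017", "Status": "active"},
    {"ci_id": 2, "ReportingMonth": "Jan 2017", "Status": "removed"},
    {"ci_id": 1, "ReportingMonth": "Feb 2017", "Status": "active"},
    {"ci_id": 2, "ReportingMonth": "Feb 2017", "Status": "active"},
]

def pct_in_status(rows, month, status):
    """Percent of that month's snapshot that is in the given status."""
    snapshot = [r for r in rows if r["ReportingMonth"] == month]
    if not snapshot:
        return 0.0
    hits = sum(1 for r in snapshot if r["Status"] == status)
    return 100.0 * hits / len(snapshot)

print(pct_in_status(rows, "Jan 2017", "active"))  # 50.0
print(pct_in_status(rows, "Feb 2017", "active"))  # 100.0
```

The key point is that the filter on the month column restricts the denominator to that month's snapshot, which is exactly what the "Reporting Month" field enables in the model.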

 

Is there any other way to accomplish this? I'm not a programmer, so it would need to be somewhat simple. We can't use the Power BI service, because it doesn't have all of the calculation features we require that Power BI Desktop does. We are only at month 9 of the year's data that we have merged so far.

 

Rob

6 REPLIES
Lenihan
Helper III

Any other ideas? Or is it that Power BI Desktop just isn't a valid tool for large data sets?

Power BI is definitely a memory hog. I use it on a 16 GB machine; 8 GB works for most of our users. Restart if you get problems, and make sure the memory has been released, which can take a while. Close other memory hogs like browsers.

 

Obviously, remove unneeded columns.

 

Aggregate in M to a lower grain before you load, if you don't need all the detail.
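To illustrate what aggregating to a lower grain buys you, here is a minimal Python sketch (hypothetical data; in Power BI you would do this in Power Query/M before load). If the report only ever shows counts per month and status, the CI-level detail rows can be collapsed into one row per combination:

```python
from collections import Counter

# Hypothetical detail rows at CI grain: (reporting month, status).
detail = [
    ("Jan 2017", "active"), ("Jan 2017", "active"),
    ("Jan 2017", "removed"), ("Feb 2017", "active"),
]

# Collapse to one row per (month, status) with a count -- far fewer
# rows to store, while the percentages can still be computed.
aggregated = Counter(detail)
print(aggregated[("Jan 2017", "active")])  # 2
print(len(detail), "->", len(aggregated))  # 4 -> 3 rows
```

In this thread's scenario the CI grain is probably needed for the duplicate metric, but any measure that only needs counts per month could come from a much smaller aggregated table.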

 

Aim for a star schema if possible, with joins on ID keys. Try adding a date dimension with your reporting month and joining it to your dates.

 

Turn off the Time Intelligence Auto Date/Time option if you don't need it, as it creates a date dimension in the background for every date field, spanning from the smallest to the largest date.

 

Also, reduce the precision of times if you don't need them to be highly accurate.

Ideally, load date and time separately, both as integers, e.g. minutes after midnight.
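That split can be sketched like this (a Python illustration of the idea, with a hypothetical epoch; in Power BI you would do the equivalent transform in Power Query): one datetime becomes two small integers, a date key and minutes after midnight, deliberately dropping seconds.

```python
from datetime import datetime

# Hypothetical: split one datetime column into two small integers --
# a date key (days since an arbitrary epoch) and minutes after midnight.
def split_datetime(dt, epoch=datetime(2017, 1, 1)):
    date_key = (dt.date() - epoch.date()).days
    minutes = dt.hour * 60 + dt.minute   # seconds dropped: lower precision
    return date_key, minutes

print(split_datetime(datetime(2017, 1, 2, 13, 30)))  # (1, 810)
```

Two low-cardinality integer columns (at most ~1,440 distinct minute values) compress much better in VertiPaq than one high-cardinality datetime column.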

 

I read here that ordering the data can help compression if the data is very large (> 1 million rows).

https://powerpivotpro.com/2016/02/sort-data-load-improved-compression/

 

Here are some details on inspecting what is using storage in your model.

https://www.sqlbi.com/articles/data-model-size-with-vertipaq-analyzer/

 

If your model is still too big, then you may need to use either SSAS Tabular on a server or Azure Analysis Services (which can import your Power BI model).

https://azure.microsoft.com/en-gb/services/analysis-services/

 

However, 800 MB+ of CSV should be fine with the desktop version.

 

 

How big is the final Power BI document? I have a couple of documents that are 0.5 GB.

 

Phil

 

 

The final size of the .pbix file is only 127 MB. Our machines only have 8 GB of RAM, though.

 

Thank you for the information. I will look at what I need to do and what I can apply from your comments.

Lenihan
Helper III

I tried to review whether there are any extra columns of data I can remove, but I need all of them. I'm unsure what can be done to reduce the size. Is this app just not meant for large data sets?

tjd
Impactful Individual

Are you using the 64-bit Power BI application?

Hi - yes, I'm using the 64-bit version.

 

I thought I replied to this yesterday, but my post seems to have disappeared.
