QQCFA
Frequent Visitor

Complex Power BI model with multiple fact tables, over 200 measures

Hi Power BI community,
I am looking for some assistance regarding my Power BI model. For context, I currently work on a desktop PC with an Intel Core i7 CPU and 64 GB of RAM. The model is a relatively complex P&L and balance sheet model that produces 60-month projections, with over 200 measures (all kinds of DAX used here), multiple fact tables, 10 calculated tables (primarily breaking the slowest key measures down with SUMMARIZECOLUMNS across all available dimensions, for use in visuals), and 6 dimension tables. I perform a full model refresh every month and have noticed that during the refresh my PC's RAM is exercised heavily, consistently reaching approximately 45-48 GB of use.
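To illustrate, here is a simplified sketch of what one of those calculated tables looks like; the table, column, and measure names below are placeholders rather than my actual ones:

Slow Measure Breakdown =
SUMMARIZECOLUMNS (
    'Dim Date'[Month],                          -- placeholder dimension columns
    'Dim Entity'[Entity],
    'Dim Account'[Account],
    "Projected Amount", [Projected P&L Amount]  -- placeholder for one of the slow measures
)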
 
I use SQL queries to aggregate the data in Import mode, so my fact tables are relatively small (a maximum of 20k rows), and the file size, for context, is only 4.5 MB.

My concern is that, moving forward, as business requirements change I will need to add more dimensions/measures, causing the model to exceed the 64 GB of available memory on refresh.
My questions are:
1) Is it normal for such complex models, especially those where one measure is used inside another, and then another, and so forth (derivative measures, illustrated in the sketch after these questions), to use that much memory at refresh time?
2) If that is not normal, which area of the model should I focus on optimizing? (I have written the DAX as efficiently as I currently know how.)
3) Would upgrading to 128 GB of memory solve future issues associated with the increasing size/complexity of my model, or do I have a more fundamental design/DAX/model issue I need to address?
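To clarify what I mean by derivative measures in question 1, here is a simplified chain with placeholder names (my real measures go several levels deeper):

Revenue = SUM ( 'Fact P&L'[Revenue Amount] )           -- base measure
COGS = SUM ( 'Fact P&L'[COGS Amount] )                 -- base measure
Gross Profit = [Revenue] - [COGS]                      -- built on other measures
Gross Margin % = DIVIDE ( [Gross Profit], [Revenue] )  -- built on an already derived measure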
 
FYI, I have used only one-to-many relationships among all tables, and I also have some inactive relationships that I activate in DAX when required.
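For example, one of those inactive relationships is activated in a measure roughly like this (placeholder names again):

Amount by Invoice Date =
CALCULATE (
    [Total Amount],                                                  -- placeholder base measure
    USERELATIONSHIP ( 'Fact P&L'[InvoiceDateKey], 'Dim Date'[DateKey] )
)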
I am familiar with DAX Studio and have used it heavily to speed up my slowest measures.
 
Any help on this is highly appreciated!
2 REPLIES
colivier
Advocate I

Hi @QQCFA 

 

You may want to check that your report is actually using all the columns that you are importing from your data sources. Getting rid of unused columns can dramatically reduce the size of your dataset as well as the time it takes to refresh the data.

 

I have built a tool, PBI Spy, that will, amongst other things, show you which columns in your dataset are not used by visualizations, filters, relationships, measures, etc.

You can download it from: www.pbispy.com

Let me know if you need any assistance with it.

Anonymous
Not applicable

All I can tell you is that if tables are processed in parallel (and that is what really happens), the amount of memory required can indeed be huge. Microsoft says in its documentation (I won't look for the link right now) that processing in SSAS can take 2-3 times as much memory as the loaded model. But this is only an approximation, and everything depends on how much data you have and what the data really is. Some columns require more memory to compress, some less, which also depends on the sparsity of the data (or its uniqueness). And if re-encoding is needed during the compression phase, that can also affect the time and memory needed. Basically, all one can say is that there's no telling.

But you say you utilize SQL queries to aggregate the data... Does the source SQL database sit on the same computer? If so, I would attribute the memory consumption to the relational engine. If you don't have the right structures in place in the database, an analytical query can consume a lot of resources. If that is the case, you should focus on optimizing the SQL. In SQL Server, a table that you run analytical queries against should have a clustered columnstore index.
