power_user_123
Frequent Visitor

suggestion needed to optimize report size

Hello!

In our project, we have 20 individual reports served by nearly 50 tables, all packed into a single .pbix file. The file size is already nearly 2 GB, and we are using import mode (our priority). Because the file has become so large, we are unable to move ahead with development of the report, and report performance is very poor. If we switch to either DirectQuery or composite mode, what will be the disadvantages? Any suggestions on which mode is appropriate: Import/DirectQuery/Dual?
FYI: Our reports are mostly table visuals.

Thanks

3 REPLIES
KNP
Super User

Difficult to answer with the information provided; it's very much an "it depends".

 

That said, here are some things to look at:

  1. Do you actually need all 50 tables?
  2. Is the model a proper star schema? (this is critical)
  3. Have you removed ALL unnecessary/unused columns?
  4. Have you enabled incremental refresh on the large fact tables? Combined with setting the RangeStart and RangeEnd parameters to a small window, this should let you continue development and ensure quick refresh times once deployed to the service.
  5. Import mode is generally preferred unless the data is extremely large.
  6. Have you considered using Dataflows for the initial ETL?
  7. Have you separated the data model from the report files? (Based on your comments, I assume not.)
  8. Do you have Premium licenses?
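To illustrate point 4, here is a minimal Power Query (M) sketch of the incremental-refresh filter, assuming a SQL source and a fact table with an `OrderDate` datetime column (the server, database, table, and column names are placeholders, not from the original post):

```m
// RangeStart and RangeEnd must exist as DateTime parameters
// (Home > Manage Parameters) for Power BI to detect the
// incremental refresh window.
let
    Source = Sql.Database("your-server", "your-database"),  // placeholder source
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter to the refresh window. Use >= and < (not <=) so
    // partition boundaries do not overlap, and keep the filter
    // foldable so the predicate is pushed down to the source.
    Filtered = Table.SelectRows(
        Fact,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

During development you can set RangeStart/RangeEnd to a narrow window (e.g. one month) so the local .pbix stays small; once published, the service builds the full set of partitions per your incremental refresh policy.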

 

 

Questions:

  1. What is the data source (or sources)? Does it fold?
  2. How many rows are in your largest fact tables?
  3. What specifically about the performance is "horrible"?

 

Hopefully this will give you a few things to look at.

 

Have I solved your problem?
Please click Accept as Solution so I don't keep coming back to this post, oh yeah, others may find it useful also ;).

If you found this post helpful, please give Kudos.
It gives me a sense of instant gratification and, if you give me Kudos enough times, magical unicorns will appear on your screen.
If you find my signature vaguely amusing, please give Kudos.
Proud to be a Super User!
ashishg
Advocate I

Hi,
You can use Power BI Dataflows: import all your tables into Dataflows, then import those dataflows into your .pbix file. You should get better dataset compression and improved performance; in my project the dataset size came down to nearly 300 MB, and it helped a lot.

Hope it's helpful!

amitchandak
Super User

@power_user_123 , If you are a Premium customer, try reducing the size of the development file using incremental refresh settings, and load the full data on the service.

Deployment Pipeline, Load More Data on Test/Prod: https://youtu.be/l69cnWkoGX0

 

For other Pro-license options, also refer to:

https://www.youtube.com/watch?v=RnDdDlozcdo

https://www.youtube.com/watch?v=qZOEDBedATA

https://apexinsights.net/blog/top-5-tips-to-optimise-data-model

 

 
