
sguenther
Advocate II

Maximum size for .pbix files

Hey everybody,

 

So I created a report by querying data from our PostgreSQL server. It's quite a bit of data, so the file size is above the 250 MB limit and I can't upload it to the online dashboard.

 

Is there any way to remove the data from the .pbix file so that I can upload it and then refresh the data online only? Otherwise this would mean that Power BI can't be used for larger data sets.

 

Best

Seb

1 ACCEPTED SOLUTION
Bjoern
Continued Contributor

Have you tried using a direct connection (DirectQuery) instead of "import all"? 😉

 

Perhaps that makes sense anyway: a 250 MB file is not really scalable, and you will hit that limit in the long run regardless.

 

I would recommend:

a) using live connect with on-prem SQL

b) using live connect with Azure SQL DB

c) using an AS (Analysis Services) tabular model on top of SQL DB


7 REPLIES
Sacha
Resolver I

Do you need to load every single column?

You can (should) remove columns that you are not using to reduce the size of the data in the model.

 

This can be done through the Queries part (a.k.a. Power Query) and it can be done from the data model too (a.k.a. Power Pivot).
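As a rough sketch of the Power Query approach, keeping only the columns the report uses (server, database, table, and column names below are placeholders, not from the original post):

```
let
    // Connect to PostgreSQL (server/database names are hypothetical)
    Source = PostgreSQL.Database("myserver", "mydb"),
    Sales = Source{[Schema = "public", Item = "sales"]}[Data],
    // Keep only the columns the report actually needs;
    // everything else is dropped before it reaches the data model
    KeepNeeded = Table.SelectColumns(
        Sales,
        {"order_id", "order_date", "customer_id", "amount"}
    )
in
    KeepNeeded
```

Dropping columns in the query (rather than hiding them in the model) means the data never gets loaded, which is what actually shrinks the .pbix file.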

 

Regards

Sacha

 

Hey,

 

I removed all the columns I don't need, but there are more than 12 million rows and I need all of them. There's no possibility to scale the data model down any further.

 

There must be a different solution.

Have you just loaded it as one big table? I.e., is it currently a denormalised table containing duplication within columns?

Is there an opportunity for you to convert it to a star schema and thus remove some data duplication?

 

There is potential to drastically reduce the file size that way.

 

Sacha
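To illustrate the kind of split Sacha describes, here is a hedged Power Query M sketch (all table and column names hypothetical) that carves a dimension out of a flat table:

```
let
    Source = BigFlatTable,
    // Dimension: one row per customer, built by de-duplicating
    // the repeated customer attribute columns
    DimCustomer = Table.Distinct(
        Table.SelectColumns(
            Source,
            {"customer_id", "customer_name", "customer_city"}
        )
    ),
    // Fact: keep the key plus the measures and drop the
    // repeated text columns that caused the duplication
    FactSales = Table.SelectColumns(
        Source,
        {"customer_id", "order_date", "amount"}
    )
in
    FactSales
```

In practice you would define DimCustomer and FactSales as two separate queries and relate them on customer_id; the compression win comes from the long text columns no longer repeating on every fact row.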

Already had it in a star schema. No duplicates 😉

If you really need all the data and can't reduce/trim/dedupe/pre-agg then you are likely to need something like SSAS to store the data and use a live connection until/if they increase the limit.

 

SSAS might offer some additional benefits for your scenario too.

 

[Update: Have you looked at the data types you are using in the model? Can you use more efficient data types anywhere?]

 

Sacha
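A sketch of the data-type tightening Sacha mentions, again in Power Query M with hypothetical table and column names:

```
let
    Source = FactSales,
    // Int64 for keys, fixed decimal (Currency.Type) instead of
    // floating point for money, and date instead of datetime --
    // lower-precision types generally compress better
    Typed = Table.TransformColumnTypes(Source, {
        {"customer_id", Int64.Type},
        {"amount", Currency.Type},
        {"order_date", type date}
    })
in
    Typed
```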

 


Besides optimizing the .pbix file, I don't know any way to make a 250+ MB data model fit within the 250 MB limit.

Only by putting the data in SSAS, which most of the time is not an option, or by running it on-premises where SSRS is possible.

 

I hope they will raise the limit soon, or make it possible to pay for an increase.
