yoshihirok
Post Prodigy

Can I upload a 1.5 GB .pbix file?

I want to upload a 1.5 GB .pbix file to the Power BI Service.

 

Currently, I cannot upload this file to the Power BI Service by publishing from Power BI Desktop.

 

Regards,

Yoshihiro Kawabata

4 ACCEPTED SOLUTIONS
Phil_Seamark
Employee

Hi there,

 

Sorry, the size limit is 1.0 GB.

 

You can probably reduce the size of your model by removing unnecessary columns, keeping less data, or aggregating it more before or during the load.

 

Get rid of any ID columns that you don't genuinely need for a visual.
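As a rough sketch of "aggregating before the load" (shown here with pandas and made-up column names; in practice you could do the same in Power Query), dropping a high-cardinality ID column and rolling detail rows up to a coarser grain shrinks the data before Power BI ever sees it:

```python
# Minimal sketch (pandas, hypothetical column names): shrink a fact table
# before loading it into Power BI Desktop by dropping an unneeded ID column
# and aggregating detail rows up to a date/product grain.
import pandas as pd

df = pd.DataFrame({
    "transaction_id": [1, 2, 3, 4],  # high-cardinality ID, not used in any visual
    "date": ["2017-05-01", "2017-05-01", "2017-05-02", "2017-05-02"],
    "product": ["A", "B", "A", "A"],
    "amount": [10.0, 20.0, 5.0, 7.0],
})

# Drop the ID column, then aggregate to one row per date/product.
reduced = (
    df.drop(columns=["transaction_id"])
      .groupby(["date", "product"], as_index=False)["amount"]
      .sum()
)
print(len(reduced))  # 3 rows instead of 4
```

High-cardinality columns like `transaction_id` compress poorly in the VertiPaq engine, so removing them often saves far more space than their share of the row count suggests.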


To learn more about DAX visit : aka.ms/practicalDAX

Proud to be a Datanaut!


GilbertQ
Super User

Hi @yoshihirok,

 

I agree with @Phil_Seamark, as I am sure there are columns that are not needed, as well as possibly some text columns that could be excluded to bring down the file size.

 

You could use this Power BI report from Kasper de Jonge: https://www.kasperonbi.com/new-ssas-memory-usage-report-using-power-bi/

 

And then use the connection information from DAX Studio to query your Power BI model, which in turn will tell you which columns are taking up the most space.
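For reference, the kind of query involved is a DMV query run from DAX Studio while connected to the Power BI Desktop model; this sketch uses the standard Analysis Services `DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS` rowset, which is what memory-usage reports like Kasper's are built on:

```sql
-- Run from DAX Studio against the Power BI Desktop model.
-- Lists per-column segment sizes so the largest columns stand out.
SELECT DIMENSION_NAME, COLUMN_ID, USED_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS
ORDER BY USED_SIZE DESC
```

The `USED_SIZE` column is in bytes; columns that dominate this list are the first candidates for removal or aggregation.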





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

Power BI Blog


Hi @Phil_Seamark

 

Thank you for your reply.

I will try to reduce my model, but I want to use fact data with more than 50,000,000 rows without any other Azure service.

 

I will post an idea requesting a 2 GB limit.

 

Regards,

Yoshihiro Kawabata


Hi @GilbertQ, thank you for your reply.

 

I will read the Kasper de Jonge report and check the sizes with DAX Studio.

 

Regards,

Yoshihiro Kawabata


11 REPLIES



I posted the idea "Extend a file size to 2GB at Power BI Service."

https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/18356608-extend-a-file-size-to-2g...

 

I hope you will vote for it.

Yoshihiro Kawabata

Why stop at 2GB? 😉



Hi @Phil_Seamark

 

The reason for "2GB" is that it is a starting point for extending the file size.

The current limit is 1 GB, and my data is near 1.5 GB.

So I hope for 2 GB,

and eventually 5 GB, 10 GB, and more.

 

Regards,

Yoshihiro Kawabata

Another option is to use a Tabular cube instead (on-premises, or in Azure).

 

The data modelling engine is the same as Power BI's, so it wouldn't be a big jump, and you'd get around your 1.5 GB problem.



Hi @Phil_Seamark

 

Yes, a 1.5 GB file can work with SQL Server, Azure SQL Database, etc.

But I want 1.5 GB file support for non-developers and non-IT users,

who use only CSV files, Power BI Desktop, a PC, and the Power BI Service.

 

They shouldn't have to change from Power Query to some other service,

or build a database.

 

They are increasing their row counts from 10 million to 50 million,

which can still load from CSV into Power BI Desktop within an hour.

 

Regards,

Yoshihiro Kawabata

There would be other advantages to going down the Tabular cube route in terms of data refresh times (due to partitioning, etc.).



Hi @Phil_Seamark

 

Yes, Tabular has a lot of great advantages,

and the Power BI Service also has a lot of advantages: no developers, no IT pros, no DBAs, etc.

 

So, I hope the Power BI Service will support 1.5 GB or 2 GB files.

 

I will try Azure Data Factory, Azure SQL Database, and Power BI Desktop/Power BI Service

to go from CSV to BI.

 

Regards,
Yoshihiro Kawabata

