
RonaldMussche
Frequent Visitor

Best solution for files that are too large

Hi,

 

Currently we have a .pbix file that is exactly 1 GB. I have already analysed the file with VertiPaq Analyzer and removed all the columns we don't need. The file publishes and refreshes now, but once it grows it will of course start to fail.
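(Aside: the idea behind the VertiPaq Analyzer pass is that VertiPaq dictionary-encodes each column, so a column's memory cost grows with its number of distinct values. A rough sketch outside Power BI, with made-up sample data and column names, shows how counting per-column cardinality flags the expensive columns:)

```python
# Sketch only: VertiPaq compresses column by column via dictionary
# encoding, so high-cardinality columns dominate model size.
# The rows and column names below are hypothetical illustration data.
rows = [
    {"order_id": i,
     "country": "NL" if i % 2 else "DE",
     "ts": f"2024-01-01T00:00:{i:02d}"}
    for i in range(60)
]

# Distinct-value count per column: a quick proxy for dictionary size.
cardinality = {col: len({r[col] for r in rows}) for col in rows[0]}

# 'order_id' and 'ts' are unique per row, so they compress worst;
# they are prime candidates to drop, or to truncate (e.g. keep the
# date only) before import.
print(cardinality)
```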

 

I have investigated several solutions:

- Premium capacity (way too expensive for only this feature: more than 4,200 euros per month).

- Premium per user (but then nobody without a Premium license can see the report).

- Azure Analysis Services, connecting the Power BI report to the cube.

- Using DirectQuery (too slow for more than 180M rows).

- An aggregated table for the fact table, but I think the granularity is too low to really save data.
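(To make the aggregation option concrete: an aggregated table collapses the fact table to a coarser grain before import, so the row count the model has to hold shrinks. A minimal sketch using the stdlib sqlite3 module, with hypothetical table and column names, shows the idea; whether it saves anything depends on how many fine-grained rows share each coarse key, which is exactly the granularity concern above:)

```python
import sqlite3

# Hypothetical fact table: one row per individual sale (fine grain).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (sale_date TEXT, product TEXT, amount REAL)")
rows = [
    ("2024-01-01", "A", 10.0), ("2024-01-01", "A", 5.0),
    ("2024-01-01", "B", 7.0),  ("2024-01-02", "A", 3.0),
    ("2024-01-02", "A", 4.0),  ("2024-01-02", "B", 1.0),
]
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Aggregate to one row per (date, product): the coarser the grain,
# the fewer rows the Power BI model has to import. Totals survive.
con.execute("""
    CREATE TABLE sales_agg AS
    SELECT sale_date, product, SUM(amount) AS amount, COUNT(*) AS n_sales
    FROM sales
    GROUP BY sale_date, product
""")

fine = con.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
coarse = con.execute("SELECT COUNT(*) FROM sales_agg").fetchone()[0]
print(fine, coarse)  # 6 fine-grained rows collapse to 4 aggregated rows
```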

 

Currently my gut feeling is to create an SSAS cube in Azure, but I was wondering whether I am right here, whether there is an easier solution, or whether anyone has advice based on experience.

 

Thanks in advance,

Ronald

1 ACCEPTED SOLUTION

Yes, Azure Analysis Services would be a good option, and as @SwayamSinha suggested, pause Azure Analysis Services when it is not in use to save money.

4 REPLIES
SwayamSinha
Employee

I would agree with the first two points.

Connecting the report live to an Azure Analysis Services cube is definitely possible. Note that there is some cost and provisioning overhead involved there too, though you can always pause AAS when it is not in use 🙂

I would also recommend trying DirectQuery against the data source. I believe DQ is now a lot faster, given that query folding is applied, and also because of query-fusion capabilities; read Announcing “Horizontal Fusion,” a query performance optimization in Power BI and Analysis Services |....

Hi, 

 

I tried DirectQuery, but I run into the 1-million-row limit:

https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-about

The underlying data in the report indeed has more rows than this. So I guess this is not an option. Am I correct that SSAS or Premium is the only option here?


If your data source is architected well (with indexes, materialised views, etc.), your Power BI report will be fast enough.
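(To illustrate that last point: under DirectQuery every visual becomes a query against the source, so an index on the filtered column is what keeps those queries fast. A minimal sketch, again with stdlib sqlite3 and illustrative table and column names, uses EXPLAIN QUERY PLAN to confirm the index is actually picked up:)

```python
import sqlite3

# Sketch of "architect the source well": with DirectQuery, visuals
# issue queries like the SELECT below, so the filter column should be
# indexed. Table and column names here are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact (customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO fact VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(1000)])
con.execute("CREATE INDEX ix_fact_customer ON fact(customer_id)")

# EXPLAIN QUERY PLAN shows whether the query scans the whole table
# or searches via the index.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM fact WHERE customer_id = 7"
).fetchall()
print(plan)  # the plan detail should mention ix_fact_customer
```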
