RonaldMussche
Frequent Visitor

Best solution for files that are too large

Hi,

 

Currently we have a .pbix file that is exactly 1 GB. I already analysed the file with VertiPaq Analyzer and removed all the columns we don't need. The file publishes and refreshes now, but when it grows it will of course start to fail.

 

I have investigated several solutions:

- Premium capacity (way too expensive for only this feature: more than 4,200 euros per month).

- Premium per user (but then users without a Premium license can't see the report).

- Azure Analysis Services, connecting the Power BI report to the cube.

- DirectQuery (too slow for more than 180M rows).

- An aggregation table over the fact table (but I think the required granularity is too low to really save data).
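The aggregation-table option is worth quantifying before ruling it out: whether it saves memory depends entirely on the ratio between the fact grain and the grain the reports actually need. Here is a minimal pure-Python sketch of that check; the table shape, column names, and sizes are all hypothetical, not taken from the actual model:

```python
import random
from collections import defaultdict

random.seed(42)

# Hypothetical miniature stand-in for a large fact table:
# one row per (day, product, store) with a sales amount.
days = range(365)       # one year at daily grain
products = range(200)   # 200 products
stores = range(10)      # 10 stores

fact_rows = [
    (d, p, s, random.uniform(1, 100))
    for d in days for p in products for s in stores
]

# Aggregate to a coarser grain: (month, product), dropping store and day.
agg = defaultdict(float)
for day, product, store, amount in fact_rows:
    month = day // 31  # rough month bucket, good enough for the sketch
    agg[(month, product)] += amount

print(f"fact rows: {len(fact_rows):,}")                    # 730,000
print(f"agg rows:  {len(agg):,}")                          # 2,400
print(f"reduction: {len(fact_rows) / len(agg):.0f}x")      # 304x
```

If most visuals only need the coarse grain, the aggregation table is orders of magnitude smaller than the fact table; if they need the full grain, aggregation saves little, which is exactly the concern raised above.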

 

Currently my gut feeling is to create an SSAS cube in Azure, but I was wondering if I am right here, or if there is an easier solution or advice based on experience.

 

Thanks in advance,

Ronald

1 ACCEPTED SOLUTION

Yes, Azure Analysis Services would be a good option, and as @SwayamSinha suggested, pause the Azure Analysis Services server when it's not in use to save money.



4 REPLIES
SwayamSinha
Microsoft

I would agree with the first two points.

Connecting the report live to an Azure Analysis Services cube is definitely possible. Note that there is some cost and provisioning overhead involved there too, though you can always pause AAS when it is not in use 🙂

I would also recommend trying DirectQuery against the data source. I believe DirectQuery is now a lot faster, provided query folding is applied, and also because of query fusion capabilities; see "Announcing 'Horizontal Fusion,' a query performance optimization in Power BI and Analysis Services". If your data source is well architected (with indexes, materialised views, etc.), your Power BI report can be fast enough.
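Query folding is the key assumption behind that DirectQuery recommendation: a folded step is translated into the source's native query, so filtering and aggregation happen inside the database rather than in Power BI. A rough analogy in plain Python using SQLite as a stand-in source (table and column names invented for the sketch):

```python
import sqlite3

# In-memory SQLite database standing in for the DirectQuery source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EU", 10.0), ("EU", 20.0), ("US", 5.0)] * 1000,
)

# "Folded" pattern: filter and aggregation run inside the source engine,
# so only a single small result row crosses the wire.
folded = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'EU'"
).fetchone()[0]

# "Unfolded" pattern: every row is pulled to the client and processed
# there -- this is what makes a non-folding query slow at scale.
all_rows = conn.execute("SELECT region, amount FROM sales").fetchall()
unfolded = sum(amount for region, amount in all_rows if region == "EU")

print(folded, unfolded)  # same answer, very different data transfer
```

Both paths compute the same total, but the folded version transfers one row while the unfolded version transfers every row, which is why folding (plus source-side indexes) matters so much for DirectQuery performance.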

Hi,

I tried DirectQuery, but I ran into the 1M-row limit (each query sent to the source can return at most one million rows):

https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-about

The underlying data in the report indeed has far more rows than this, so I guess this is not an option. Am I correct that SSAS or Premium is the only option here?

