Member
Posts: 55
Registered: 02-02-2016

Dealing with Large Datasets

Power BI Community,


How do you deal with large datasets? Do you always use a live connection to an SSAS cube, or do you import the data?


I am at my wit's end trying to figure out the best way to handle a large dataset in Power BI. I have a dataset of about 21M+ rows and around 40 columns. Several of these columns are "value" columns, like Sales, COGS, etc.; the rest are dimension columns that are required to create one of several slices of the data. Unfortunately, I can't seem to reach a successful outcome. When I import this data, it balloons the file size to over 1 GB. Surprisingly, the Power BI Service does allow me to upload the .pbix file (I thought there was a 1 GB limit), but once it is there, report performance is too slow to be acceptable for an end user. I'm not using very complicated DAX, just "standard" time intelligence functions and simple sums, subtractions, and divisions.
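To give a sense of what I mean, my measures are along these lines (a simplified sketch with hypothetical table and column names):

    // Base measure over one of the "value" columns
    Total Sales = SUM ( Sales[Sales] )

    // "Standard" time intelligence over a Date dimension
    Sales YTD = TOTALYTD ( [Total Sales], 'Date'[Date] )
    Sales PY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

    // Simple arithmetic between measures
    Margin % = DIVIDE ( [Total Sales] - SUM ( Sales[COGS] ), [Total Sales] )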


My data sits in a SQL Server database, and I do have access to an SSAS Tabular instance. In the community's experience, am I more likely to be successful with a live connection to an SSAS cube or with importing the data? If importing, how can I optimize file size and/or report performance?


Help, you're my only hope...

Moderator
Posts: 9,878
Registered: 03-10-2016

Re: Dealing with Large Datasets


@TheOckieMofo,

Firstly, Power BI Premium supports uploads of Power BI Desktop (.pbix) files up to 10 GB in size. For more details, please review this article.

Secondly, if you choose import mode, you can follow the guide in this article to optimize your data model.
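For example, one common optimization is to replace calculated columns with measures, since measures are evaluated at query time and add nothing to the stored model. A simplified sketch, assuming Sales and COGS columns as in your description:

    // A calculated column like this is materialized for all 21M rows:
    // Margin = Sales[Sales] - Sales[COGS]

    // A measure is computed on the fly and takes no storage:
    Total Margin = SUM ( Sales[Sales] ) - SUM ( Sales[COGS] )

Removing columns you don't report on and reducing column cardinality (for example, splitting a datetime column into separate date and time columns, or dropping the time portion) also tend to shrink the file significantly, because the engine compresses data column by column.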

Thirdly, if you don't have a Power BI Premium license, it is better to use DirectQuery or Connect Live mode to connect to the data source, as the 1 GB limit doesn't apply to DirectQuery datasets. Note that in this case, your report performance depends largely on the performance of the underlying data source.

Regards,
Lydia

Community Support Team _ Lydia Zhang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Member
Posts: 125
Registered: 03-07-2016

Re: Dealing with Large Datasets

Is it not possible to remove the dimension attributes from the fact table and create dimension tables keyed back to the fact by IDs? A rough sketch of what I mean is below.
I have an 80M-row, 9-column fact table and it is only about 0.6 GB.
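Something like this, with made-up table and column names (a DAX calculated table; building the dimension upstream in Power Query or SQL is even better, because then the descriptive columns can be dropped from the fact entirely):

    // Build a dimension table from the distinct attribute values in the fact
    DimCustomer =
    DISTINCT (
        SELECTCOLUMNS (
            Sales,
            "CustomerID", Sales[CustomerID],
            "CustomerName", Sales[CustomerName]
        )
    )

Then relate DimCustomer[CustomerID] to Sales[CustomerID] and slice by the dimension's columns; a fact table that carries only keys and numeric values compresses far better.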