jitendrabi07 (Frequent Visitor)

how to handle large data size

Hi,

I have approximately 50 million records (only required columns considered) in SQL Server. What is the best way (Import or DirectQuery) to build a report on this data size without any performance issues or slowness? Is adding a parameter a good idea when connecting through DirectQuery?

Thanks,

 

@stretcharm 

@AnujMishra_09 

@v-xicai 

 

 

1 ACCEPTED SOLUTION
stretcharm (Senior Member)

Re: how to handle large data size

I tend to avoid DirectQuery when performance is important: unless you are filtering down to small data sets on well-indexed sources, it can be slow.

 

Take a look at aggregations. They give you the best of both worlds: imported, pre-aggregated data for summary queries, and DirectQuery for individual records.

 

https://docs.microsoft.com/en-us/power-bi/desktop-aggregations
https://www.sqlbi.com/tv/aggregations-in-power-bi/

https://powerbi.microsoft.com/en-us/blog/aggregations-for-petabyte-scale-bi-available-in-the-power-b...

https://radacad.com/power-bi-aggregation-step-1-create-the-aggregated-table
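The idea behind aggregations can be sketched in plain SQL. This is only an illustration (using Python's stdlib sqlite3 as a stand-in for SQL Server; the `sales` and `sales_agg` table names are hypothetical): summary queries hit a small pre-aggregated table, while drill-downs to individual records still go to the large detail table. Power BI's aggregations feature does this routing transparently.

```python
# Sketch of the aggregation pattern, using sqlite3 as a stand-in source.
# Table names (sales, sales_agg) are made up for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Detail table: one row per transaction (stands in for the 50M-row source).
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("East", 10.0), ("East", 20.0), ("West", 5.0)])

# Pre-computed aggregation table: small enough to import into memory.
cur.execute("""
    CREATE TABLE sales_agg AS
    SELECT region, SUM(amount) AS total, COUNT(*) AS n
    FROM sales GROUP BY region
""")

# Summary query is answered from the tiny aggregated table...
totals = dict(cur.execute("SELECT region, total FROM sales_agg"))
print(totals)

# ...while a drill-down to individual records still hits the detail table
# (the DirectQuery side of the pattern).
east_rows = cur.execute(
    "SELECT amount FROM sales WHERE region = 'East'").fetchall()
print(east_rows)
```

In Power BI you would import the equivalent of `sales_agg` and leave the detail table in DirectQuery (or Dual) mode.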

3 REPLIES

kcantor (Super Contributor)

Re: how to handle large data size

@jitendrabi07 

It really depends on what type of data you have. Power Query is really good at compressing data. In addition, if you research query folding, you will be able to determine the best place for your query to fold to maintain performance. Here are a couple of good resources for you.

https://powerpivotpro.com/2015/12/compression-with-power-pivot/

https://blog.crossjoin.co.uk/2015/12/08/working-with-compression-in-power-query-and-power-bi-desktop...

https://powerpivotpro.com/2010/02/surprising-example-of-powerpivot-compression/

https://www.mssqltips.com/sqlservertip/3635/query-folding-in-power-query-to-improve-performance/

https://blog.pragmaticworks.com/power-bi-checking-query-folding-with-view-native-query
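Query folding is about where the filtering work happens. A minimal sketch of the difference, again with sqlite3 standing in for the source database (the `orders` table is hypothetical): a folded step pushes the filter into the source's native query, while an unfolded step pulls every row across and filters client-side.

```python
# Sketch of why query folding matters, using sqlite3 as a stand-in source.
# The orders table is made up for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, year INTEGER)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, 2018 + i % 2) for i in range(1000)])

# Folded: the WHERE clause runs at the source ("View Native Query" in
# Power BI shows this), so only matching rows travel.
folded = cur.execute("SELECT id FROM orders WHERE year = 2019").fetchall()

# Not folded: fetch everything, then filter in the client.
all_rows = cur.execute("SELECT id, year FROM orders").fetchall()
unfolded = [(i,) for i, y in all_rows if y == 2019]

print(len(all_rows), len(folded))  # 1000 rows moved vs 500
assert folded == unfolded  # same answer, very different data movement
```

Same result either way; the folded version simply moves far less data, which is the whole point at 50 million rows.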







Super User

Re: how to handle large data size

Hi there. I have a model with about 80,000,000 rows in the fact table and would never even consider DirectQuery mode if I can use Import. Import mode is the best option out there. You cannot beat it with DirectQuery or Dual mode, because the engine compresses data by a factor of roughly 10 to 100x and the data is all stored in memory. You just cannot beat it.
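One intuition for those compression ratios: columnar engines like the one behind Import mode dictionary-encode low-cardinality columns, storing each distinct value once and a small integer code per row. A rough toy sketch (illustration only; the real engine is far more sophisticated, and the numbers here are made up):

```python
# Toy dictionary encoding, the basic trick behind columnar compression.
# A fact-table column: many rows, only a handful of distinct values.
column = ["East", "West", "North", "South"] * 250_000  # 1,000,000 rows

# Store each distinct value once, then one small code per row.
dictionary = sorted(set(column))
code_of = {v: i for i, v in enumerate(dictionary)}
codes = bytes(code_of[v] for v in column)  # 1 byte per row here

raw_bytes = sum(len(v) for v in column)    # 4-5 bytes of text per row
encoded_bytes = len(codes) + sum(len(v) for v in dictionary)
print(raw_bytes // encoded_bytes)  # roughly 4x on this toy column
```

Real fact tables with repeated keys and dates compress far harder than this toy example, which is where the 10 to 100x figures come from.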

It's as easy as that.

Best
Darek
