
Anonymous
Not applicable

Importing 6 billion rows from Snowflake

Hi friends,

 

I am trying to import 6 billion rows from Snowflake into Power BI using Power BI Desktop. I have an 8 GB machine, and the import takes 40 minutes, which is a very long wait for us, especially since the same query returns all 6 billion records in Snowflake itself in 23 seconds. I am using the ODBC connector for the Snowflake connection.

 

Could you please suggest what I can do in my case to improve the report's data refresh time?

 

The same thing (a 40-minute data refresh) happens in the Power BI Service as well when I publish the report to a workspace.

 

Thanks,

Prabhat Omker  

1 ACCEPTED SOLUTION
JarroVGIT
Resident Rockstar

Hi @Anonymous ,

Regardless of the execution speed of Snowflake itself, the data still has to be transferred to your local machine. A table with 6 billion rows is a massive amount of data to move from Snowflake onto your laptop or into the Power BI service. To give an example: if every row holds only 1 KB of data (which is not a lot, to be honest), the total data size would amount to 6,000,000,000 KB / 1,024 (= MB) / 1,024 (= GB) ≈ 5,722 GB. Some smart compression will be going on, but this illustrates that the time to get results from a query against an external source is not just the query execution time; it is also the transfer time, especially in cases like this.
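To make that estimate concrete, here is a minimal back-of-envelope sketch in Python; the 1 KB per row figure is an assumption, not your actual row size:

    # Rough estimate of the raw data volume to transfer.
    rows = 6_000_000_000
    bytes_per_row = 1024  # assumed 1 KB per row; actual size depends on your columns

    total_gb = rows * bytes_per_row / 1024**3
    print(f"Uncompressed size: {total_gb:,.0f} GB")  # ~5,722 GB

    # Even on a sustained 1 Gbit/s link, moving that much data uncompressed would take:
    link_bytes_per_sec = 1_000_000_000 / 8
    hours = rows * bytes_per_row / link_bytes_per_sec / 3600
    print(f"Transfer time at 1 Gbit/s: {hours:.1f} hours")  # ~13.7 hours

That the refresh finishes in 40 minutes rather than many hours already suggests the compression mentioned above is doing a lot of work.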

If this is too long for you, you might want to look into incremental refresh (but that requires Power BI Premium):

https://docs.microsoft.com/en-us/power-bi/service-premium-incremental-refresh 
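For intuition on what incremental refresh does under the hood, here is a minimal sketch using the snowflake-connector-python package; the table name EVENTS, the column EVENT_DATE, and all connection values are hypothetical placeholders:

    import snowflake.connector
    from datetime import date

    # Hypothetical connection details; substitute your own account settings.
    conn = snowflake.connector.connect(
        user="MY_USER",
        password="MY_PASSWORD",
        account="MY_ACCOUNT",
        warehouse="MY_WAREHOUSE",
        database="MY_DATABASE",
        schema="PUBLIC",
    )

    # An incremental refresh only re-reads a recent window of rows
    # instead of all 6 billion rows on every refresh.
    range_start, range_end = date(2024, 4, 1), date(2024, 4, 8)
    cur = conn.cursor()
    cur.execute(
        "SELECT * FROM EVENTS WHERE EVENT_DATE >= %s AND EVENT_DATE < %s",
        (range_start, range_end),
    )
    recent_rows = cur.fetchall()

In Power BI you do not write this code yourself; you define RangeStart/RangeEnd parameters and a filter in Power Query, and the service generates the bounded queries for you, as described in the linked docs.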

Hope this gives you some clarification as to why this takes longer than you expected 🙂

 

Kind regards

Djerro123

-------------------------------

If this answered your question, please mark it as the Solution. This also helps others to find what they are looking for.

Keep those thumbs up coming! 🙂





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!




