Hi friends,
I am trying to import 6 billion rows from Snowflake into Power BI using Power BI Desktop. I have an 8 GB machine, and the import takes 40 minutes, which is a very long wait for us, especially since the same query returns all 6 billion records in 23 seconds in Snowflake itself. I am using the ODBC connector for the Snowflake connection.
Could you please suggest what I can do in my case to improve the report's data refresh time?
The same thing happens (a 40-minute data refresh) in the Power BI service when I publish the report to a workspace.
Thanks,
Prabhat Omker
Hi @Anonymous ,
Regardless of how fast Snowflake executes the query, the data still has to be transferred to your local machine. A table with 6 billion rows is a massive amount of data to move from Snowflake onto your laptop or into the Power BI service. To give an example: if every row holds only 1 KB of data (which is not a lot, to be honest), the total would amount to 6,000,000,000 KB / 1,024 (= MB) / 1,024 (= GB) ≈ 5,722 GB. Some smart compression does take place along the way, but this illustrates that the time to get results from an external source is not just the query execution time; it is also the transfer time, especially in a case like this.
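To make the arithmetic above concrete, here is a small Python sketch of the back-of-envelope estimate. The 1 KB/row figure is just the illustrative assumption from the paragraph, not a measurement of the actual table:

```python
# Rough estimate of how much data a full import would transfer,
# assuming an average row size of 1 KB (an illustrative guess --
# real row size depends on the columns and on compression).
rows = 6_000_000_000
kb_per_row = 1  # assumed average row size in KB

total_gb = rows * kb_per_row / 1024 / 1024  # KB -> MB -> GB
print(f"~{total_gb:,.0f} GB to transfer")   # ~5,722 GB (about 5.6 TB)
```

Even at an optimistic sustained 1 Gbit/s, moving multiple terabytes takes hours, so a 40-minute refresh already implies heavy compression or smaller rows than this worst-case guess.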
If this is too long for you, you might want to look into incremental refresh (note that this requires Power BI Premium):
https://docs.microsoft.com/en-us/power-bi/service-premium-incremental-refresh
Hope this gives you some clarification as to why this takes longer than you expected 🙂
Kind regards
Djerro123
-------------------------------
If this answered your question, please mark it as the Solution. This also helps others to find what they are looking for.
Keep those thumbs up coming! 🙂
Proud to be a Super User!