One of the sources of my Power BI file is a mammoth view. This view aggregates data from three huge tables in a SQL database, each with more than 10 million rows, and filters the output down to transactions from 1 July 2018 onwards. Even so, well in excess of 7 million rows remain.
Because of the size of the Power BI file (650 MB, badly in need of optimization), the automatic data refresh sometimes fails, and the visuals do not always display the expected content.
In short, the file does not function properly.
My question for you is: do you know of a method to import data efficiently from such a view? My understanding is that incremental refresh does not work on views, only on tables without any filtering.
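(For reference: as far as I know, incremental refresh is not limited to physical tables. It can be configured over a view, provided the RangeStart/RangeEnd date filter folds back to the source as a WHERE clause. A minimal Power Query (M) sketch, where the server, database, view, and column names are illustrative only:)

```
// Sketch, assuming a view dbo.vw_Transactions with a datetime
// column TransactionDate. RangeStart and RangeEnd are the two
// parameters Power BI requires for incremental refresh.
let
    Source = Sql.Database("myserver", "mydb"),
    View = Source{[Schema = "dbo", Item = "vw_Transactions"]}[Data],
    // If this filter folds, only the requested partition's rows
    // are read from SQL Server on each refresh.
    Filtered = Table.SelectRows(
        View,
        each [TransactionDate] >= RangeStart and [TransactionDate] < RangeEnd
    )
in
    Filtered
```

You can check whether folding happens by right-clicking the last step and looking for "View Native Query"; if it is greyed out, the filter is being applied locally and incremental refresh will not help.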
Maybe you should try removing unnecessary fields and then converting fields with many decimals to a lower precision (2 decimals, or whole numbers wherever possible).
This way I reduced my data from 200 MB to 100 MB.
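The column pruning and precision reduction described above can be applied in Power Query before the data is loaded into the model. A sketch with hypothetical column names (replace them with the fields your report actually uses):

```
// Sketch: drop unused columns and reduce decimal precision before load.
// All table and column names here are illustrative.
let
    Source = Sql.Database("myserver", "mydb"),
    Raw = Source{[Schema = "dbo", Item = "vw_Transactions"]}[Data],
    // Fewer columns means a smaller, faster model.
    Pruned = Table.RemoveColumns(Raw, {"AuditUser", "RowGuid"}),
    // Fixed decimal (Currency.Type) stores at most 4 decimal places
    // and generally compresses much better than floating point.
    Typed = Table.TransformColumnTypes(Pruned, {{"Amount", Currency.Type}}),
    // Rounding to 2 decimals reduces distinct values, which helps
    // VertiPaq compression further.
    Rounded = Table.TransformColumns(
        Typed, {{"Amount", each Number.Round(_, 2), Currency.Type}}
    )
in
    Rounded
```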
Hi @amirabedhiafi ,
Check the blog below:
https://www.sqlbi.com/articles/data-import-best-practices-in-power-bi/
Best Regards,
Kelly