Hi,
I have a Power BI file that is currently 1.8 GB, and I'm trying to bring it down to 1-1.5 GB. The data is huge, and it won't go below 1 GB unless we split the reports, which is difficult.
I have one entity with two date columns: one takes up 600 MB and the other 390 MB, so between them they account for almost 1 GB.
The columns are full datetimes, and the data is read from an Azure Data Lake file.
The file itself is 2.5 GB, and the data type is datetime with millisecond precision in the file, though in Power BI it gets trimmed to second precision.
Is there any way of optimising the storage of these datetime columns?
I do not need hierarchies for these two columns, but I don't see a way of disabling the hierarchy for only selected columns.
I tried splitting the date and time parts, but that did not help (does the split need to be done before the data is loaded into the PBIX for the optimisation to work?).
I also have quite a lot of calculated columns, which I'm trying to move to data lake storage in case that helps a little with compression, as I read somewhere that calculated columns have a very poor compression rate.
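As an aside, the reason a pre-load split can help is that the VertiPaq engine dictionary-encodes each column, so column size scales with the number of distinct values. A minimal Python sketch (a hypothetical illustration with made-up row counts, not Power BI itself) of how splitting shrinks the combined dictionaries:

```python
from datetime import datetime, timedelta

# Hypothetical illustration, not Power BI itself: VertiPaq dictionary-encodes
# each column, so a column's size grows with its number of DISTINCT values.
# Splitting a datetime into separate date and time columns shrinks the
# combined dictionaries, because each part repeats far more often.

start = datetime(2024, 1, 1)
# ~10 days of events, one every 7 seconds -> every timestamp is unique
rows = [start + timedelta(seconds=s) for s in range(0, 10 * 86_400, 7)]

combined = {dt for dt in rows}          # distinct full datetimes (one per row)
dates = {dt.date() for dt in rows}      # distinct date parts (at most 10)
times = {dt.time() for dt in rows}      # distinct time parts (at most 86,400)

print(f"distinct datetimes:     {len(combined):>7}")
print(f"distinct dates + times: {len(dates) + len(times):>7}")
```

The catch is where the split happens: if it is done as calculated columns after load, the original high-cardinality datetime column is still stored in the model, so nothing is saved. Doing the split in Power Query (or upstream in the data lake file) and removing the original column should let the savings materialise.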
Any other suggestions on what can be optimised?
Hi @dilkushpatel,
Based on my research, you could refer to the link below about Power BI performance:
https://docs.microsoft.com/en-us/power-bi/power-bi-reports-performance
You could also refer to GilbertQ's reply in the issue below:
Regards,
Daniel He