10-05-2016 06:02 AM
I have a large data model (200 MB+) with a star schema (dimension and fact tables) and many measures (500+).
In the beginning, adding measures was very quick, but now it takes forever to make any changes to the data model.
Adding a single measure takes around 1-2 minutes:
- 20+ sec waiting for the formula bar to be ready
- 20+ sec waiting for the measure to be added
- 20+ sec for formatting changes
The same goes for other changes, like adding a calculated column or renaming columns.
It's very frustrating to work with, and it takes forever to make changes.
Has anyone else been experiencing something similar as the data model gets big?
PS: I have an i7 processor and 24 GB RAM, so computing power should not be an issue.
02-22-2017 01:09 PM
I haven't dealt with this problem in the desktop app; however, thinking tabularly here, I can advise the following:
1. Reduce the size of the data model by pointing its data sources to something with a smaller footprint.
2. Once you've added the new measure, save the file as a PBIT template.
3. Open the PBIT and point it to the full data set; this way you load the entire data model into the final PBIX.
It would be good to have a look at your trace files to get an idea of why it is taking that long to load.
03-01-2017 03:49 AM
Thanks for the input.
I have tried working with just a Power BI template (metadata only) and experience the same issue (slow, long waits) when adding measures or making adjustments to the data model. The number of measures seems to affect how long changes take, but it doesn't make sense that it should! The same data model with fewer than 100 measures is really easy to work with and adjust. I get a similar experience developing SSAS Tabular models.
Good idea to look at the trace file. I will do that.