I'm pretty new to Power BI and still learning about all the possible options. I'm now in a situation where a customer, who is using Dynamics 365 on-premise, wants to use Power BI for reporting purposes. The amount of data in their database runs into the millions of rows. When configuring my data source, the initial load took several hours because of the heavy volume. I'm now wondering what the best approach is for handling this amount of data while providing a good customer experience without long load times when refreshing the dataset. My first thoughts were enabling a continuous data refresh or only loading data from a specific time period. So my questions are:
- Is Power BI Report Server my only option for enabling a planned data refresh?
- Is there any other way to load data incrementally in Power BI Desktop?
- When configuring a data source, how can I restrict the data to a specific time period?
I'm sorry if this isn't the correct place to ask this question.
When dealing with large datasets, it is a good idea to implement incremental loading of the model. Microsoft is working on an incremental refresh feature for Power BI models, and it currently shows as 'planned' on the roadmap. There are some workarounds as well, but the most straightforward way is to use a Tabular model in SQL Server Analysis Services (SSAS). Basically, you load your data incrementally through a "Process Add" operation, which appends only new rows to an existing partition instead of reprocessing the whole table. Note that this requires the load to happen on SSAS and not in the Power BI scheduler; if you are working with data at this scale and need more control over loads, SSAS is something you should consider. The article below has the details on how to implement a true incremental load that Power BI can leverage for reporting.
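As a rough illustration of the "Process Add" approach, against a tabular SSAS model you can issue a TMSL refresh command with type `add`, which appends the rows returned by the partition's query to the existing partition data. This is only a sketch: the database, table, and partition names below are placeholders, and in practice the partition's source query must be scoped so it returns only the new rows.

```json
{
  "refresh": {
    "type": "add",
    "objects": [
      {
        "database": "SalesModel",
        "table": "FactSales",
        "partition": "FactSales_Current"
      }
    ]
  }
}
```

You can run a script like this from SQL Server Management Studio (as an XMLA/TMSL query against the SSAS instance) or schedule it via a SQL Server Agent job, so the incremental load happens on a cadence you control rather than through the Power BI refresh scheduler.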