Hey folks,
I have a shared dataset with many tables, and Incremental Refresh (IR) is configured on the tables where I know there will be large volumes of data.
I'm on a Pro license, so the initial refresh takes time every time I re-publish the dataset. In my case, the refresh runs for 2 hours and then fails with a time-out error. When I then refresh it again in the service (on demand), the dataset refreshes successfully.
One option to avoid this is Premium capacity.
But even on Premium capacity, the first refresh would still take time. One solution I know of for that is the ALM Toolkit, which publishes and refreshes only the differential changes to the service.
Are there any other options to reduce the initial refresh time?
If you have Premium, you can use Tabular Editor to apply the refresh policy without loading data (i.e., deploy the dataset, but don't refresh it). Applying the policy creates the partitions but doesn't populate them. You can then connect to the dataset via SQL Server Management Studio, where you can access each table and its partitions, and refresh the partitions manually one-by-one (or in batches).
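The partition-by-partition refresh can also be scripted rather than clicked through in the SSMS UI. A minimal TMSL sketch you could run in an XMLA query window against the workspace endpoint — the database, table, and partition names below are placeholders for your own objects:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyDataset",
        "table": "Sales",
        "partition": "2022Q1"
      }
    ]
  }
}
```

Listing several partitions in the `objects` array refreshes them in one batch, which lets you work through the historical partitions in chunks that stay under the time-out.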
IIRC, this requires Premium, because you need write access to the XMLA endpoint.
Microsoft has more information here: https://docs.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-xmla#prevent-timeouts-on-...