Hi,
I have a few dataflows which have evolved out of desktop models. In the desktop versions, the refresh takes an average of around 15 minutes to refresh all the data - even when published to the Service, they take roughly the same sort of duration to refresh.
But the dataflows take hours to refresh - literally anywhere from 3 hours upwards - and that's with incremental refresh enabled in the dataflows (which the desktop/Service model doesn't even use). Without incremental refresh, they take anywhere from around 3 hours to many, many hours.
Is there something underlying that causes this?
Is there a configuration setting to improve this?
Should I be worried about the strain this is putting on the Gateway/servers?
Is there a best-practice for how to configure these and ensure that refreshes are optimally managed?
As my use of dataflows grows, I'm concerned that they will all grind to a halt as they're doing so much work at the same time.
Any advice/thoughts very welcome!
What's the memory size on your gateway cluster members? Does your gateway have cluster members that are specced substantially lower than others, or that sit in locations with poor network performance? Are you making sure they have plenty of free disk space?
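If it helps to script the free-disk-space part of that checklist across gateway members, here's a minimal sketch using Python's standard library (assuming Python is available on the machines; the spool path shown is hypothetical - substitute the directory your gateway actually spools to):

```python
import shutil

# Hypothetical gateway spooling directory; replace with your gateway's
# actual spool/temp path before running on a cluster member.
SPOOL_DIR = "."

def free_space_gb(path: str) -> float:
    """Return free disk space at `path` in gigabytes."""
    usage = shutil.disk_usage(path)  # (total, used, free) in bytes
    return usage.free / 1024 ** 3

if __name__ == "__main__":
    gb = free_space_gb(SPOOL_DIR)
    print(f"Free space at {SPOOL_DIR}: {gb:.1f} GB")
```

Running this on each cluster member (or wrapping it in whatever monitoring you already have) makes it easy to spot the member that's quietly running out of room.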