I have multiple dataflows that siphon data from an on-premises ERP database. I'm experiencing occasional, random refresh failures across the dataflows (roughly a 20% failure rate on scheduled refreshes, 0% on manual refresh tests), and I wondered how you are dealing with this issue, as most posts about the same problem ended with no answer.
If you have a series of dataflows feeding a couple of related reports — covering everything from staging, through transformation and enrichment, to the published dataset — how do you distribute that series? Would you span it across multiple workspaces or keep it in a single workspace, so as to minimize the delay before the data is presented (target: 30 minutes)?
Refreshes, like queries, require the model be loaded into memory. If there is insufficient memory, the Power BI service will attempt to evict inactive models, and if this isn't possible (as all models are active), the refresh job is queued. Refreshes are typically CPU-intensive, even more so than queries. For this reason, a limit on the number of concurrent refreshes, calculated as the ceiling of 1.5 x the number of backend v-cores, is imposed. If there are too many concurrent refreshes, the scheduled refresh is queued until a refresh slot is available, resulting in the operation taking longer to complete. On-demand refreshes such as those triggered by a user request or an API call will retry three times. If there still aren't enough resources, the refresh will then fail.
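To make the concurrency formula concrete, here's a quick Python sketch. The 4-backend-v-core figure below is just an illustrative capacity size (assumption on my part — check the v-core count for your own SKU):

```python
import math

def max_concurrent_refreshes(backend_vcores: int) -> int:
    # Concurrent refresh limit: ceiling of 1.5 x the number of backend v-cores.
    return math.ceil(1.5 * backend_vcores)

# Example: a capacity with 4 backend v-cores allows ceil(1.5 * 4) = 6
# concurrent refreshes; a 7th scheduled refresh would be queued until
# a slot frees up.
print(max_concurrent_refreshes(4))  # 6
```

So if many of your dataflows share a schedule window, spreading their start times apart (or staggering them across the refresh window) reduces the chance of hitting the queue and seeing those intermittent failures.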
Hope this helps explain some of the transient refresh failures you're seeing.