Hi Community,
since 16.02.2022 we have suddenly been experiencing trouble updating our datasets connected to Dynamics 365 (without any recent changes to the dataset or model).
The affected datasets are only about 8 MB and 16 MB in size, so they are not large datasets as far as I understand the definition.
I have no idea what could be wrong, since the error message is:
The refresh operation failed because it took too long to complete. Consider reducing the size of your dataset or breaking it up into smaller datasets. Please try again later or contact support. If you contact support, please provide these details.
Does anyone else experience similar issues recently? Or maybe you have an idea where we should look for a possible reason.
Thank you
Alexander
Hi @AlexF_HH
How are you refreshing your dynamics 365 data?
It appears to be a timeout issue rather than a sizing issue.
Have you tried incremental refresh to solve this, so you refresh per day instead of the entire dataset?
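In case it helps, here is a minimal sketch of such a filter in Power Query (M). The org URL, the "accounts" entity, and the modifiedon column are placeholders for your own environment; RangeStart and RangeEnd are the two DateTime parameters that Power BI's incremental refresh feature expects:

    let
        // Connect to the Dynamics 365 OData endpoint (placeholder org URL)
        Source = OData.Feed(
            "https://yourorg.api.crm.dynamics.com/api/data/v9.2",
            null,
            [Implementation = "2.0"]
        ),
        // Pick the entity to load (placeholder entity name)
        Accounts = Source{[Name = "accounts"]}[Data],
        // Filter on a date column so each refresh loads only one partition;
        // RangeStart and RangeEnd must be defined as DateTime parameters
        Filtered = Table.SelectRows(
            Accounts,
            each [modifiedon] >= RangeStart and [modifiedon] < RangeEnd
        )
    in
        Filtered

With a filter like that in place you can switch on incremental refresh in the dataset settings and refresh per day instead of the whole history.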
Hi @AlexF_HH
First, try using a Dynamics 365 query function in Power Query to filter the dataset instead of doing a full load.
Second, you can increase the timeout limit for the query.
Another solution is to migrate your Dynamics 365 data to an Azure SQL DB by using Data Factory; Power BI can then get the data from the SQL DB, where you can enable query folding as well as incremental refresh.
Best.
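To illustrate the first two suggestions, here is a rough sketch in Power Query (M); the URL, entity, and column names are assumptions, not something from your model. It filters rows and columns at the source and raises the timeout via the optional Timeout setting on OData.Feed:

    let
        // Optional Timeout setting; up to 2 hours here (an assumption,
        // tune it to your environment)
        Source = OData.Feed(
            "https://yourorg.api.crm.dynamics.com/api/data/v9.2",
            null,
            [Timeout = #duration(0, 2, 0, 0)]
        ),
        Opportunities = Source{[Name = "opportunities"]}[Data],
        // Keep only the columns the report needs ...
        Slim = Table.SelectColumns(Opportunities, {"name", "createdon", "estimatedvalue"}),
        // ... and only the recent rows; simple filters like this can fold
        // back to the OData source instead of loading everything
        Recent = Table.SelectRows(Slim, each [createdon] >= #datetime(2021, 1, 1, 0, 0, 0))
    in
        Recent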
Hi @DavisBI , how do I increase the timeout limit for the query? Is it by using the command timeout in the source settings of the SQL table? If yes, will that affect the scheduled refresh?
Thanks
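For reference, in M the command timeout can be passed as an option on the source function, e.g. for an Azure SQL source (the server, database, and table names below are made up):

    let
        // CommandTimeout raises the per-query limit; 1 hour here
        Source = Sql.Database(
            "myserver.database.windows.net",
            "SalesDb",
            [CommandTimeout = #duration(0, 1, 0, 0)]
        ),
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data]
    in
        Orders

Scheduled refresh runs the same query definition, so the setting carries over, though the service's overall refresh time limit still applies on top of it.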
I have been having the same problem since the same date with our datasets as well. They are below 200 MB in size and just time out all the time. I have not tried incremental refresh, since I want our tables to re-evaluate historical data, but I have tried to reduce the size of the data by limiting date intelligence, without any luck. Have there been any backend changes recently that might cause this?
Hi,
the solutions by @GilbertQ and by @DavisBI are definitely helpful for some users with a similar problem.
But in my particular case the incremental refresh solution doesn't really solve the problem, since every now and then I need to update the whole dataset.
Increasing the timeout limit for queries only works if you run a Premium capacity. At least I didn't find such an option as a Pro user.
'Migrating Dynamics 365 data to Azure SQL DB by using Data Factory - then Power BI can get data from SQL DB' is a very time-consuming option in my case.
My solution in the end is plain and simple: move from Pro to Premium Per User (PPU), which can handle large datasets. Even though I assume that some structural improvements to my current, pretty small datasets (max. 16 MB) could solve the issues, the time and effort to do so seem inappropriate compared to the extra costs my team will have to pay for the upgrade. (Btw, we are 20 users & 1 part-time Power BI amateur.)
I already tested the solution with a PPU trial. It works. But be careful when moving a workspace back from PPU to Pro again: the reports won't work until you upload the desktop files again.
Hope that helps somebody.
Best
Alex
Hi @GilbertQ,
thank you very much! Of course I didn't try that 🙂
I will do that right away. That should solve my problem in the short run.
But still, my main problem will come back as soon as I have to update the whole dataset, e.g. because some of my data had to be updated ex post for any reason.