Hi All,
I have a question about performance and best practice. Currently, I have a very large dataset built from two tables with OData sources.
The current setup applies all the transformations in the two separate tables and then appends them into one big table. See the example screenshot.
I added a timeout to the OData connection, but it is not working. My file is only 8 MB, yet for the past few days we keep getting timeout errors from the service.
I want to segment my dataset using date filters to create static tables, so that I only need to refresh one of them and then manually add the new dates. My questions are:
If there is another possibility, please let me know. At the moment I have tried both approaches, and the issue is that segmenting the big table takes a long time for each segment. I don't know how to measure the exact time, but it is more than 30 minutes. On the other hand, the append takes almost the same time, 25-30 minutes.
So if anyone knows how to measure the loading time for a dataset, I would also like to know.
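A minimal Power Query (M) sketch of the date-segmentation idea described above. The feed URL, entity name, and the `Date` column are placeholders, not from the original post; the cutoff date is an arbitrary example:

```
// Static segment: rows before a fixed cutoff date, loaded once.
// After loading, refresh can be disabled for this query so only
// the "current" segment is refreshed against the OData service.
let
    Source = OData.Feed("https://example.com/odata/Sales"),  // placeholder URL
    StaticRows = Table.SelectRows(
        Source,
        each [Date] < #datetime(2023, 1, 1, 0, 0, 0)  // placeholder cutoff
    )
in
    StaticRows
```

A second query with the mirrored filter (`[Date] >= …`) would hold the live segment, and the two can then be appended.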
Thanks in advance,
J.
Hi @Jmenas,
Based on experience, it is usually better to segment the small tables first and then append them all into the big one using Power Query.
In addition, could you try using the DAX UNION function to create a new calculated table that unions the two tables in Data view instead, to see if there is any improvement in your scenario?
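A short DAX sketch of the UNION suggestion above; `Table1` and `Table2` are placeholder names for the two source tables (UNION requires both tables to have the same number of columns, matched by position):

```
-- Calculated table (Modeling > New table) that unions the two tables
CombinedTable = UNION ( Table1, Table2 )
```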
Regards
Hi @v-ljerr-msft,
I will try it that way. It seems that loading the data before the append is what is creating this performance issue.
Thanks,
J.