We are using a custom-built OData service to bring our dashboards to life. However, as our databases grow, a refresh can take a long time, in some cases up to an hour. This is because a refresh fetches all the data from all connected OData feeds every time, which is hugely annoying when you are still building the dashboard and have to wait for each refresh.
This blog post outlined a possible implementation of reference caching, which would, at least in our case, alleviate the problem, since a lot of our data doesn't actually change (much) over time. However, it seems this was never actually included in the OData specs. Can somebody clarify the current state of things, or, failing that, propose a different approach to improving the performance of large, semi-static OData feeds in Power BI?
One thought would be to create a version of your OData feed that only returns the changes from, say, the last 24 hours. You could then use an Append query to combine this with the larger dataset coming from the original OData feed.
Lots of technical things to work out, but just an idea.
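To make the idea concrete, here is a minimal Python sketch of the merge step that the Append approach implies. Everything in it is an assumption for illustration: the `Id` key, the field names, and the existence of a change-only endpoint (e.g. an OData `$filter` on a last-modified timestamp) are not part of the original feeds.

```python
# Sketch: combine a cached full snapshot with a 24-hour delta feed.
# 'cached' stands in for rows fetched once from the full OData feed;
# 'delta' stands in for rows from a hypothetical change-only endpoint,
# e.g. .../Feed?$filter=Modified ge 2024-01-01T00:00:00Z
# The 'Id' key and 'Value' field are made-up names for illustration.

def merge_delta(cached, delta, key="Id"):
    """Overwrite cached rows with their delta versions; append new rows."""
    merged = {row[key]: row for row in cached}
    for row in delta:
        merged[row[key]] = row  # an updated or newly created record wins
    return list(merged.values())

cached = [
    {"Id": 1, "Value": "old"},
    {"Id": 2, "Value": "stable"},
]
delta = [
    {"Id": 1, "Value": "new"},    # changed in the last 24 hours
    {"Id": 3, "Value": "added"},  # created in the last 24 hours
]

rows = merge_delta(cached, delta)
# rows now holds Id 2 unchanged, Id 1 updated, Id 3 appended
```

In Power Query terms the same logic would be an Append followed by a remove-duplicates keyed on the record ID (keeping the delta rows), but as the reply says, the details would still need working out.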
Did I answer your question? Mark my post as a solution!
I'm sure your suggested method would work, but unfortunately it would create a lot of overhead on the dev's (my) part as well as for the analysts, since we currently have over 20 different OData feed types that are used by several clients.
Reference caching seems like the obvious and proper solution, except that it doesn't seem to work :-/