Dual storage mode is working fairly well for one of my reports; however, once I publish it to the service, it fails to refresh. The error states that the query result is too large. I tried to set up incremental refresh for these tables, but apparently DirectQuery doesn't work with the parameters, because when I filter on the parameters it says "this transformation is not supported for DirectQuery." How do I get my large, Dual-mode, composite-model dataset to refresh in the service?
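For context, the incremental-refresh filter step I set up looks roughly like this (a sketch in Power Query M; the server, table, and `ModifiedDT` column names are placeholders), using the required `RangeStart`/`RangeEnd` Date/Time parameters:

```m
let
    // DirectQuery/Dual source (placeholder server and database names)
    Source = Sql.Database("myserver", "mydatabase"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // For DirectQuery/Dual tables this filter must fold to the source query;
    // if a step does not fold, the service reports
    // "this transformation is not supported for DirectQuery".
    Filtered = Table.SelectRows(
        Fact,
        each [ModifiedDT] >= RangeStart and [ModifiedDT] < RangeEnd
    )
in
    Filtered
```

A simple comparison like this normally folds; in my experience the error usually means an earlier step in the query has already broken folding.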
If the query result is too large, you can try some of the standard best practices for optimizing your model.
For more information on optimizing data sources for DirectQuery, see DirectQuery in SQL Server 2016 Analysis Services.
Thanks, I will give some of that a shot, but I doubt it will do much. I know the dataset is large, and I don't need to refresh the whole dataset each time; I would rather set up change detection or incremental refresh. Also, why does one have to set up ranged loads for incremental refresh instead of just doing change detection (i.e., looking at a modified datetime column)?
We are having a somewhat similar issue. Our Dual mode datasets do not always refresh: refreshing from Desktop works fine, but when refreshing in the service, not all of the records come across. I ran a trace with SQL Server Profiler and it showed the calls were being made. They messed something up badly.