Dual storage mode is working fairly well for one of my reports; however, once I publish it to the service, it fails to refresh. The error code states that the query result is too large. I tried to set up incremental refresh for these tables, but apparently DirectQuery doesn't work with the parameters, because when I filter on the parameters it says "this transformation is not supported for DirectQuery." How do I get my large, Dual mode, composite model dataset to refresh in the service?
Thanks,
Andy
I get the same error. Refresh works fine in Desktop, but a scheduled refresh through the gateway fails because the query result is too large.
Hi @soldstatic
If the query result is too large, you can try some of the best practices below to optimize your model:
For more information on optimizing data sources for DirectQuery, see DirectQuery in SQL Server 2016 Analysis Services.
Thanks, I will give some of that a shot, but I doubt it will do much. I know the dataset is large, and I don't need to refresh the whole dataset each time. I would rather set up change detection or incremental refresh. Also, why does one have to set up ranged loads for incremental refresh instead of just doing change detection (i.e., looking at a modified datetime column)?
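For reference, incremental refresh in Power BI expects two Date/Time parameters named exactly RangeStart and RangeEnd, plus a filter step on a datetime column that uses them; the service substitutes partition boundaries into those parameters at refresh time. A minimal Power Query M sketch of that filter, assuming a hypothetical `FactSales` table with a `LastModified` datetime column (server, database, table, and column names here are placeholders, not from the thread):

```
// RangeStart and RangeEnd must be Date/Time parameters defined
// under Manage Parameters; Power BI fills them in per partition.
let
    Source = Sql.Database("myserver", "mydb"),   // placeholder source
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Use >= on one bound and < on the other so no row falls
    // into two partitions.
    Filtered = Table.SelectRows(
        FactSales,
        each [LastModified] >= RangeStart and [LastModified] < RangeEnd
    )
in
    Filtered
```

One caveat: for DirectQuery or Dual tables this filter step has to fold back to the source query; a step that can't be folded is what produces the "this transformation is not supported for DirectQuery" error described earlier in the thread.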
We are having a sort of similar issue. The Dual mode datasets do not always refresh. Refreshing via Desktop works fine, but when refreshing in the service, not all of the records come across. I ran a trace with SQL Server Profiler and it showed the calls were being made. They messed something up badly.
Hi,
Did you manage to resolve this? I am facing a similar issue.
Thanks
Mike