I have a Power BI report which uses Cosmos DB and a few xlsx files on a local server. The report works fine in Power BI Desktop: when I refresh it there, everything loads with no errors in the data, and all custom columns and measures work. So everything is fine on the Desktop side.
But when I publish the report to the Power BI Service and try a manual or scheduled refresh, the refresh has started to give me errors:
"There was an error when processing the data in the dataset.
Message: The command has been canceled. Table: Contract"
The error message is always the same, BUT sometimes it refers to a different table. I removed one 1-to-* relationship from the model just as a test, and now it refers to another table, the Contract table.
I have around 10 tables from Cosmos DB, with some query steps involved. I made some optimizations to the queries (deselecting unnecessary columns, etc.), but no luck.
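For reference, the column trimming looks roughly like this in Power Query M. This is only a sketch: the account URL, database, collection, and column names are made up for illustration, and it assumes the legacy Azure Cosmos DB (DocumentDB) connector:

```
let
    // Connect via the Azure Cosmos DB connector (hypothetical account/database/collection)
    Source = DocumentDB.Contents("https://myaccount.documents.azure.com", "MyDatabase", "Contracts"),
    // Expand only the fields the report needs from the document records
    Expanded = Table.ExpandRecordColumn(Source, "Document", {"id", "contractNumber", "status"}),
    // Drop everything else so less data crosses the wire on refresh
    Trimmed = Table.SelectColumns(Expanded, {"id", "contractNumber", "status"})
in
    Trimmed
```

Selecting columns early like this should reduce the amount of data pulled per refresh, though in my case it did not make the Service refresh succeed.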
This is driving me nuts, because when I demo the report in Power BI Desktop everything works fine, but when I try to auto-refresh the data in the Service using The Great Cloud Data (Cosmos DB), it does not work.
I also tried renaming the report and publishing it again, but no luck: same errors.
Any ideas, dear Power BI gurus? We have 'a great situation' in this environment, because at the moment we don't have an actual DW, just a connection to the live ERP Cosmos DB data.
The lurking controller who misses SQL on-prem data.
Do you use the Azure Cosmos DB connector in Desktop?
As a test, how about using only the few local server xlsx files: can the report be refreshed successfully in the Power BI Service then?
Have you configured a gateway to connect to the data sources?
See a similar thread.