Hi,
My Power BI Gen 2 Dataflow has started failing after running successfully for 2 weeks. The error is:
"Mashup Exception Data Source Error: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: The specified path already exists."
The dataflow appends data into a table that I have created in the Fabric Lakehouse, so of course the table already exists. The dataflow takes data from 25 on-premises tables and appends them into one table, which I then use in a notebook to cleanse before appending into another lakehouse table.
I have included an image of the dataflow.
The only thing I changed around the time the failures started is that I created a second dataflow pulling similar data into a separate table; however, it is in the same lakehouse and the same workspace. Any help greatly appreciated.
Note it is the same error every time it fails, but it shows against different tables used in the dataflow (i.e. it is not always the same table out of the 25).
Hi @higgy7
Based on the error message you provided, the problem appears to be related to a "Mashup Exception Data Source Error".
The error message indicates that there may be a conflict in the paths used by the dataflows.
Make sure that the paths each dataflow uses to store data in the lakehouse are unique. If there is an overlap, this may result in the above error.
Since you mentioned that a second dataflow was created around the time the failures started, it is worth checking whether there are any conflicts or resource-contention issues between these two dataflows.
It is possible that both dataflows are running in the same workspace and targeting the same lakehouse.
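To make the "specified path already exists" failure mode concrete, here is a minimal, hypothetical sketch using plain filesystem directories to stand in for lakehouse storage paths. The function name and path names are illustrative only, not the actual Fabric internals; the point is that two writers reusing the same path collide, while distinct paths do not:

```python
import os
import tempfile

def write_staging(base, staging_name):
    """Create a storage directory for a dataflow run.

    Mirrors the failure mode: creation fails with FileExistsError
    if the path already exists (exist_ok=False), analogous to
    'The specified path already exists'.
    """
    path = os.path.join(base, staging_name)
    os.makedirs(path, exist_ok=False)
    return path

base = tempfile.mkdtemp()

# Two dataflows using distinct paths: no conflict.
write_staging(base, "dataflow_a_staging")
write_staging(base, "dataflow_b_staging")

# Two runs reusing the same path: the second one fails.
write_staging(base, "shared_staging")
try:
    write_staging(base, "shared_staging")
except FileExistsError as e:
    print("collision:", e)
```

This is why checking that the two dataflows do not target the same table (and the same underlying storage location) is the first thing to rule out.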
Regards,
Nono Chen
If this post helps, then please consider Accepting it as the solution to help the other members find it more quickly.
Hi Nono,
So this is what I am trying to understand about Lakehouses. The two dataflows pull the same data, but I was appending them into two different tables in the same Lakehouse. Is the issue that a Lakehouse isn't actually storing the data in tables in the traditional sense? I assume this is the issue, but I want to get an understanding of how the storage works. See the images showing the same data from the two dataflows appending to two different tables, but in the same lakehouse.
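On the storage question: a Lakehouse table is not a row store in the traditional database sense; each table is a folder of parquet data files plus a _delta_log transaction log, so two tables are simply two separate directories. A rough pure-Python sketch of that on-disk shape (folder and file names are illustrative, not the exact layout Fabric produces):

```python
import os
import tempfile

def create_table_folder(lakehouse_root, table_name):
    """Sketch of a Delta table's on-disk shape: a directory of
    parquet data files plus a _delta_log folder of JSON commits."""
    table_dir = os.path.join(lakehouse_root, "Tables", table_name)
    os.makedirs(os.path.join(table_dir, "_delta_log"))
    # An "append" adds a new data file and a new commit entry.
    open(os.path.join(table_dir, "part-0000.parquet"), "w").close()
    open(os.path.join(table_dir, "_delta_log", "00000.json"), "w").close()
    return table_dir

root = tempfile.mkdtemp()
# Two tables in the same lakehouse live in two distinct folders,
# so appends to different tables touch different paths.
a = create_table_folder(root, "table_from_dataflow_1")
b = create_table_folder(root, "table_from_dataflow_2")
print(a != b)  # True
```

Under this model, two dataflows appending to two different tables should not contend on the table folders themselves, which is why the conflict is more likely in a shared intermediate location than in the destination tables.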