Advocate I

DataFlow Refresh Error and Fix for added Fields in Dynamics 365

I work with Power BI Desktop and the Power BI service and have begun using dataflows to extract data from Dynamics 365. I noticed that my reports would not refresh if a field was added to the entity in Dynamics 365. The error stated that there were more columns than expected, which caused all of my report refreshes to fail.

 

Here is what I have found works. 

 

1. Go to the Power BI service, open the dataflow for editing, and then exit; this validates all of the queries.

2. Go to the report in Power BI Desktop and clear all permissions for the dataflow under Data source settings.

3. Refresh the dataflow.

4. Wait ~20 minutes.

5. Refresh the Power BI report in Desktop and publish it to the service.

 

This has worked for me. Please message me if you have any faster ways to do this!
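
If anyone wants to avoid clicking through the service each time, steps 3-5 can likely be scripted against the Power BI REST API (Dataflows - Refresh Dataflow and Datasets - Refresh Dataset In Group). The sketch below (Python) is only a rough outline: the workspace, dataflow, and dataset IDs are placeholders, acquiring the Azure AD access token isn't shown, and it refreshes the published dataset in the service rather than refreshing in Desktop and republishing.

# Hedged sketch: script the dataflow refresh, the wait, and the dataset refresh.
import time
import requests

ACCESS_TOKEN = "<Azure AD access token>"   # placeholder; acquire via MSAL or similar
GROUP_ID = "<workspace id>"                # placeholder
DATAFLOW_ID = "<dataflow id>"              # placeholder
DATASET_ID = "<dataset id of the published report>"  # placeholder

BASE = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Step 3: trigger a refresh of the dataflow.
resp = requests.post(
    f"{BASE}/groups/{GROUP_ID}/dataflows/{DATAFLOW_ID}/refreshes",
    headers=headers,
    json={"notifyOption": "NoNotification"},
)
resp.raise_for_status()

# Step 4: crude stand-in for "wait ~20 minutes"; polling the dataflow's refresh
# history would be more robust, but a sleep keeps the sketch simple.
time.sleep(20 * 60)

# Step 5 (service-side variant): refresh the published dataset instead of
# refreshing in Desktop and republishing.
resp = requests.post(
    f"{BASE}/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers=headers,
)
resp.raise_for_status()
print("Dataflow and dataset refreshes requested.")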

 

 

3 REPLIES
Microsoft

Re: DataFlow Refresh Error and Fix for added Fields in Dynamics 365

Hi @achandler,

 

Thanks for kindly sharing. It might help other users who have a similar problem.

 

Best regards,

Yuliana Gu

Community Support Team _ Yuliana Gu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Kudo Kingpin

Re: DataFlow Refresh Error and Fix for added Fields in Dynamics 365

I can confirm that this worked for me.

Advocate I

Re: DataFlow Refresh Error and Fix for added Fields in Dynamics 365

I had the same problem but figured out the root cause: it's the model.json file, which gets updated with the schema of your dataflow. If a new column gets added to your dataflow from the source (whether it's another unlinked dataflow or a SELECT * statement from a database), dataflows don't seem to automatically support schema drift.

 

High-level process of what's happening:

  1. The data source changes to now have 11 columns.
  2. You refresh the dataflow without editing, validating, and saving it, which is the step that triggers the model.json file to update.
  3. You query the dataflow from Power BI Desktop: the model.json file says to expect 10 columns, but the actual CSV data that the JSON file points to has 11 columns, so you get this error.

If you are using Disable load to consume data from other dataflows, using Premium linked entities would likely fix the problem, since the metadata is linked instead of duplicated. But in your scenario it sounds like the schema drifted in Dynamics, and dataflows don't yet support source schema drift, since the model.json is only compiled when you validate and save your dataflow.

 

Workaround if the source of your dataflow changes (i.e. gains or loses columns):

  1. Edit the dataflow (no actual edits required).
  2. Click Validate and Save.
  3. Refresh your dataflow after saving.
  4. Refresh the Power BI report; your new column will be available without error.

You can verify all of this by reviewing the exported model.json file before validating and saving your dataflow and then viewing it again after saving: you'll notice that before, the JSON did not have the new column, and afterwards it will.
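
If you want to check this without eyeballing the JSON, below is a minimal sketch (in Python) that counts the attributes for one entity in the exported model.json and compares that to the number of columns in a downloaded CSV partition. It assumes the standard CDM-folder model.json layout (entities[].attributes[]); the file paths and the entity name are placeholders you'd swap for your own.

# Hedged sketch: compare the exported model.json schema to the columns
# actually present in the dataflow's CSV output.
import csv
import json

MODEL_JSON_PATH = "model.json"       # exported dataflow model.json (placeholder path)
CSV_PATH = "entity-snapshot.csv"     # downloaded CSV partition (placeholder path)
ENTITY_NAME = "account"              # hypothetical entity name

with open(MODEL_JSON_PATH, encoding="utf-8") as f:
    model = json.load(f)

# Find the entity and list the columns model.json expects.
entity = next(e for e in model["entities"] if e["name"] == ENTITY_NAME)
expected = [a["name"] for a in entity["attributes"]]

# Dataflow CSV partitions are typically headerless, so compare column counts only.
with open(CSV_PATH, newline="", encoding="utf-8") as f:
    first_row = next(csv.reader(f))

print(f"model.json expects {len(expected)} columns; the CSV has {len(first_row)}.")
if len(first_row) != len(expected):
    print("Schema drift detected: re-edit, validate, and save the dataflow "
          "so model.json picks up the new column(s).")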

 

 
