cvillegas
Advocate IV

Gateway Performance Monitoring PBIT Dataformat error

Just upgraded to the latest version of the Enterprise gateway across our cluster and pulled up the GatewayPerformanceMonitoring.pbit that we've used in the past. Now, when it attempts to read two of the tables, QueryExecutionReport and DataError, we get an error stating:

DataFormat.Error: There were more columns in the result than expected. Details: Count=11

I've checked and the number of columns is valid; the process just breaks on the last two steps of the ETL. Has anyone seen this or have any ideas?
1 ACCEPTED SOLUTION
Anonymous
Not applicable

I've experienced the same. I believe this is a regression in the April 2020 version of Power BI Desktop, not the Gateway; it works fine with March 2020. I've reported it here: https://community.powerbi.com/t5/Issues/April-2020-throwing-DataFormat-Error-when-using-Gateway/idi-...

View solution in original post

8 REPLIES
Dariusz_Karpiej
New Member

Unfortunately, the same error is showing up again. We updated the gateway to the October 2021 version, and now, after connecting to the generated logs, the following error appears: DataFormat.Error: There were more columns in the result than expected.


Hey @Anonymous, it looks as though the May 2020 release fixed the issue.

I'm having the same issue, and I'm running the May 2020 update of Power BI Desktop.

 

Looking at the log files produced by the latest version of the gateway (3000.40.15 (May 2020)) I can see that entries with QueryType equal to "Refresh" often have one or two extra values between the "DataReadingAndSerializationDuration(ms)" and "DataProcessingEndTimeUTC" columns.

 

Below is a screenshot of a QueryExecutionReport log file pulled into Excel. I performed a Text to Columns operation using a comma as the delimiter. Manual inspection of the log file also shows that these rows have more columns than they should.

It looks like the logging of Refresh queries in this version of the gateway is not performing to spec.

[screenshot.png: QueryExecutionReport log opened in Excel, showing Refresh rows with extra columns]
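The extra-column problem described above can also be checked without Excel. Below is a small Python sketch, assuming the log is plain comma-delimited text with a header row (the file path and function name are hypothetical, not from this thread); it flags every row whose field count differs from the header's:

```python
import csv

def find_malformed_rows(path):
    """Return (line_number, field_count) pairs for rows in a
    comma-delimited gateway log whose field count differs from
    the header row's field count."""
    bad = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)          # first line defines the expected width
        expected = len(header)
        # start=2 because the header occupies line 1 of the file
        for lineno, row in enumerate(reader, start=2):
            if row and len(row) != expected:
                bad.append((lineno, len(row)))
    return bad
```

Running this against a QueryExecutionReport log from an affected gateway version should list the Refresh rows that carry one or two extra values.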

Hi all, I think I've found a solution.

See the official documentation: https://docs.microsoft.com/en-us/data-integration/gateway/service-gateway-performance

Edit the gateway config file Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config in \Program Files\On-premises data gateway.

Find the parameter ReportFileCount and set its value to '11'. Save the file and restart the gateway service. I think it should help; it fixed my problem.
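For reference, settings in that file follow the standard .NET application-settings XML layout. A sketch of what the edited entry might look like, with the surrounding structure abbreviated (the exact layout can differ between gateway versions, so treat this as illustrative rather than authoritative):

```xml
<!-- Excerpt from Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config (illustrative) -->
<setting name="ReportFileCount" serializeAs="String">
  <!-- '11' per the workaround described above -->
  <value>11</value>
</setting>
```

After saving the change, restart the "On-premises data gateway" Windows service so the new value is picked up.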

Hello, I've run into this too. Is there any way to work around this problem?

Getting the same error with the June 2020 release. Does anyone know how to fix it?

