idcom
New Member

OLE DB or ODBC [DataFormat.Error] We reached the end of the buffer

Hi all,

I have two CSV files of the same size, in each of which I parse a JSON column. In the Query Editor both tables look fine, but when loading to the model I get the error "OLE DB or ODBC [DataFormat.Error] We reached the end of the buffer", and only for the second file; the first one loads fine.

Does anyone know why this happens and how to fix it?

I use this code for both tables:

let
Source = Csv.Document(File.Contents("H:\.shortcut-targets-by-id\16gK5NCGM9X2dE4f_v02OgQOsNtpY4Wrd\Proyectos\1. Done Projects\RETAILIGENT\Modelorama\Data Original\Face Recog\20200930_2352_natsface\20200930_2352_natsface-000.csv"),[Delimiter=",", Columns=22, Encoding=1252, QuoteStyle=QuoteStyle.None]),
#"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
#"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"_id", type text}, {"Data.CameraInfo.CameraId", type text}, {"Data.CameraInfo.CameraDescription", type text}, {"Data.CameraInfo.HandlesContent", type logical}, {"Data.CameraInfo.FocalDistance", type number}, {"Data.CameraInfo.SensorHeight", type number}, {"Data.CameraInfo.FrameSize", Int64.Type}, {"Data.TotalFaces", Int64.Type}, {"Data.IdentifiedFaces", Int64.Type}, {"Data.DemographicInfo", type text}, {"FrameTimestamp", type datetimezone}, {"EndpointId", type text}, {"EndpointName", type text}, {"StoreId", type text}, {"CustomerId", type text}, {"CameraName", type text}, {"Location", type logical}, {"Modality", Int64.Type}, {"NotificationType", Int64.Type}, {"DwellTime", type datetime}, {"unixTimestamp", Int64.Type}, {"order", Int64.Type}}),
#"Parsed JSON" = Table.TransformColumns(#"Changed Type",{{"Data.DemographicInfo", Json.Document}}),
#"Expanded Data.DemographicInfo" = Table.ExpandListColumn(#"Parsed JSON", "Data.DemographicInfo"),
#"Expanded Data.DemographicInfo1" = Table.ExpandRecordColumn(#"Expanded Data.DemographicInfo", "Data.DemographicInfo", {"LocalTrackingId", "LocalPersistedId", "BioRecordId", "IdentityName", "IdentityNameFromDB", "IsIdentityConfirmed", "MatchConfidence", "FaceRectangle", "IsFaceEngaged", "Age", "AgeConfidence", "AgeConfidenceLevel", "Gender", "GenderConfidence", "GenderConfidenceLevel", "Ethnicity", "EthnicityConfidence", "EthnicityConfidenceLevel", "Emotion"}, {"Data.DemographicInfo.LocalTrackingId", "Data.DemographicInfo.LocalPersistedId", "Data.DemographicInfo.BioRecordId", "Data.DemographicInfo.IdentityName", "Data.DemographicInfo.IdentityNameFromDB", "Data.DemographicInfo.IsIdentityConfirmed", "Data.DemographicInfo.MatchConfidence", "Data.DemographicInfo.FaceRectangle", "Data.DemographicInfo.IsFaceEngaged", "Data.DemographicInfo.Age", "Data.DemographicInfo.AgeConfidence", "Data.DemographicInfo.AgeConfidenceLevel", "Data.DemographicInfo.Gender", "Data.DemographicInfo.GenderConfidence", "Data.DemographicInfo.GenderConfidenceLevel", "Data.DemographicInfo.Ethnicity", "Data.DemographicInfo.EthnicityConfidence", "Data.DemographicInfo.EthnicityConfidenceLevel", "Data.DemographicInfo.Emotion"}),
#"Expanded Data.DemographicInfo.Emotion" = Table.ExpandRecordColumn(#"Expanded Data.DemographicInfo1", "Data.DemographicInfo.Emotion", {"Anger", "Happiness", "Neutral", "Sadness", "Surprise", "Contempt", "Disgust", "Fear"}, {"Data.DemographicInfo.Emotion.Anger", "Data.DemographicInfo.Emotion.Happiness", "Data.DemographicInfo.Emotion.Neutral", "Data.DemographicInfo.Emotion.Sadness", "Data.DemographicInfo.Emotion.Surprise", "Data.DemographicInfo.Emotion.Contempt", "Data.DemographicInfo.Emotion.Disgust", "Data.DemographicInfo.Emotion.Fear"}),
#"Added Custom" = Table.AddColumn(#"Expanded Data.DemographicInfo.Emotion", "HOUR", each Time.Hour([FrameTimestamp])),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Minute", each Time.Minute([FrameTimestamp])),
#"Changed Type1" = Table.TransformColumnTypes(#"Added Custom1",{{"Data.DemographicInfo.Gender", type text}}),
#"Replaced Value" = Table.ReplaceValue(#"Changed Type1","1","Male",Replacer.ReplaceText,{"Data.DemographicInfo.Gender"}),
#"Replaced Value1" = Table.ReplaceValue(#"Replaced Value","2","Female",Replacer.ReplaceText,{"Data.DemographicInfo.Gender"}),
#"Renamed Columns" = Table.RenameColumns(#"Replaced Value1",{{"Data.DemographicInfo.Emotion.Anger", "Anger"}, {"Data.DemographicInfo.Emotion.Happiness", "Happy"}, {"Data.DemographicInfo.Emotion.Neutral", "Neutral"}, {"Data.DemographicInfo.Emotion.Sadness", "Sad"}, {"Data.DemographicInfo.Emotion.Surprise", "Surprise"}, {"Data.DemographicInfo.Emotion.Contempt", "Contempt"}, {"Data.DemographicInfo.Emotion.Disgust", "Disgust"}, {"Data.DemographicInfo.Emotion.Fear", "Fear"}, {"Data.DemographicInfo.Age", "AGE"}, {"FrameTimestamp", "Fecha"}})
in
#"Renamed Columns"

I noticed the error happens after the #"Expanded Data.DemographicInfo" step; before that step, the data loads fine.

Thanks all

1 ACCEPTED SOLUTION
V-lianl-msft
Community Support

Hi @idcom ,

 

You may try the Table.Buffer function. It keeps the whole table in memory at that step, which can stop the later JSON parse and expansions from re-streaming the source file and hitting the end-of-buffer error.

It could be like this:

#"Changed Type" = Table.Buffer(Table.TransformColumnTypes(#"Promoted Headers",{{"_id", type text}, {"Data.CameraInfo.CameraId", type text}, {"Data.CameraInfo.CameraDescription", type text}, {"Data.CameraInfo.HandlesContent", type logical}, {"Data.CameraInfo.FocalDistance", type number}, {"Data.CameraInfo.SensorHeight", type number}, {"Data.CameraInfo.FrameSize", Int64.Type}, {"Data.TotalFaces", Int64.Type}, {"Data.IdentifiedFaces", Int64.Type}, {"Data.DemographicInfo", type text}, {"FrameTimestamp", type datetimezone}, {"EndpointId", type text}, {"EndpointName", type text}, {"StoreId", type text}, {"CustomerId", type text}, {"CameraName", type text}, {"Location", type logical}, {"Modality", Int64.Type}, {"NotificationType", Int64.Type}, {"DwellTime", type datetime}, {"unixTimestamp", Int64.Type}, {"order", Int64.Type}}))

 

Best Regards,
Liang
If this post helps, please consider accepting it as the solution to help other members find it more quickly.


3 REPLIES

Is this solution specific to this post? I also got this error, but I'm using Kusto queries for my data. Where do I use the Table.Buffer function?

amitchandak
Super User

@idcom , if this was working previously and the error appeared only in the last few days, and you are using the September 2020 version, then check and log the issue at https://community.powerbi.com/t5/Issues/idb-p/Issues
