intrasight
Helper IV

We found extra characters at the end of JSON input.

Getting this error when doing a "From Web" import.

Details: "We found extra characters at the end of JSON input."

 

When I send the JSON to a validator it says that it is fine. How can I get more specific details on why it is complaining? There is no further detail in the trace file.

32 REPLIES
scotttaft
Regular Visitor

This might not help everyone but thought I'd share:

 

I wrapped my JSON code in brackets ([...]) and it worked.

I was getting this error and couldn't find a solution. My JSON was formatted like the following:

 

{"name":"value"...},

{"name":"value"...}

 

and PBI kept throwing this error. I just added brackets and it worked, like the following:

 

[

{"name":"value"...},

{"name":"value"...}

]

AnalystPower
Frequent Visitor

Are you still having an issue with this? I was able to find a solution. In my case, the JSON just needed to be reformatted from JSON Lines to regular JSON format, and PBI handled it fine after that.
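
If reformatting the files by hand isn't practical, the same idea can be expressed inside Power Query by parsing each line as its own document. A minimal sketch, assuming a newline-delimited file at a made-up path:

let
    // Made-up path; point this at your own JSON Lines file
    Source = File.Contents("C:\data\records.jsonl"),
    // One text value per line, decoded as UTF-8
    RawLines = Lines.FromBinary(Source, null, null, 65001),
    // Drop blank lines so empty text isn't fed to the parser
    NonEmpty = List.Select(RawLines, each Text.Trim(_) <> ""),
    // Parse each line as a standalone JSON document
    Records = List.Transform(NonEmpty, Json.Document),
    // Turn the resulting list of records into a table
    Result = Table.FromRecords(Records)
in
    Result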
mericok
Frequent Visitor

Hi, when I refresh my Power BI query I get the following error message. Could you please help me?

Thanks.

 

[Screenshot attachment: 1.JPG, showing the error message]

I had a continuing problem with IoT data streamed to an Azure Blob store (then got "we found extra...") when attempting to combine JSON files in Power BI.

 

Here's how I solved it.  Works fine for me now.

 

1) Don't use the Combine feature (the double arrow that gives you the error). Instead, make a query that trains a pattern on one of your JSON files. To do this, just make a new query against your source:

- Pick the first JSON file.

- Click the binary.

- You should see a file icon; right-click it and choose "convert to TXT".

- Now you should see a single column with the JSON data, but smooshed all into one column.

- Convert that column to JSON (right-click the column, Convert, JSON).

- Now you have one good file-export pattern that you can use as a function for all your others.

- Edit this query in the Advanced Editor and add the 'let' statement that turns the query into a function (mine is below).

- Replace the example file name with the variable you created.

- Click OK and try it out: put in a full path to a JSON file and see if you can invoke it correctly.

- If it works, create another query and source all your JSON files. Use "Add Column" -> "Invoke Custom Function", fill in the name of your function, and pick the file path column as the function's argument.

- It will run your function for every file and append the results.

 

Here's my function code:

 

let
    // Function: takes a blob name and returns its JSON lines parsed into a table
    FxFixJson = (jsonfilepathname as text) =>
let
    Source = AzureStorage.Blobs("https://MYBLOBSTOREACCT.blob.core.windows.net/"),
    // Expose each blob's content and name
    #"Expanded Data" = Table.ExpandTableColumn(Source, "Data", {"Content", "Name"}, {"Data.Content", "Data.Name"}),
    // Keep only the blob that was asked for
    #"Filtered Rows" = Table.SelectRows(#"Expanded Data", each ([Data.Name] = jsonfilepathname)),
    #"Data Content" = #"Filtered Rows"{0}[Data.Content],
    // Read the blob as text, one row per line
    #"Imported Text" = Table.FromColumns({Lines.FromBinary(#"Data Content",null,null,1252)}),
    // Parse each line as its own JSON document
    #"Parsed JSON" = Table.TransformColumns(#"Imported Text",{},Json.Document),
    // Expand the parsed records into columns
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Parsed JSON", "Column1", {"deviceId", "messageId", "temperature", "humidity", "EventProcessedUtcTime", "PartitionId", "EventEnqueuedUtcTime", "IoTHub"}, {"deviceId", "messageId", "temperature", "humidity", "EventProcessedUtcTime", "PartitionId", "EventEnqueuedUtcTime", "IoTHub"})
in
    #"Expanded Column1"
in
    FxFixJson
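
To try it out, invoke the function with a blob's full path name (a Data.Name value). Something like the following, with a made-up blob name:

let
    // "device01/2018/05/01/22.json" is a made-up name; use one of your own Data.Name values
    Result = FxFixJson("device01/2018/05/01/22.json")
in
    Result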

Thank you for your response.

The example below helped me fix the error, but it only worked for one record.

When I tried to apply the FxFixJson function, I was confused about what (jsonfilepathname as text) should be.

 

 

let
    Source = AzureStorage.Blobs("XXXX"),
    // Note: this step is defined but never referenced; the next step expands Source directly
    wrfm1 = Source{[Name="XXXX"]}[Data],
    #"Expanded Data" = Table.ExpandTableColumn(Source, "Data", {"Content", "Folder Path", "Name"}, {"Data.Content", "Data.Folder Path", "Data.Name"}),
    // Only the first blob ({0}) is read, which is why only one file's records come through
    #"Data Content" = #"Expanded Data"{0}[Data.Content],
    #"Imported Text" = Table.FromColumns({Lines.FromBinary(#"Data Content",null,null,1252)}),
    #"Parsed JSON" = Table.TransformColumns(#"Imported Text",{},Json.Document),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Parsed JSON", "Column1", {"basicException", "internal", "context"}, {"Column1.basicException", "Column1.internal", "Column1.context"}),
    #"Expanded Column1.internal" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1.internal", {"data"}, {"internal.data"}),
    #"Expanded Column1.context" = Table.ExpandRecordColumn(#"Expanded Column1.internal", "Column1.context", {"application", "data", "device", "user", "session", "operation", "location", "custom"}, {"context.application", "context.data", "context.device", "context.user", "context.session", "context.operation", "context.location", "context.custom"})
in
    #"Expanded Column1.context"

Anonymous
Not applicable

Hi,

Just want to emphasize to the developers that this needs solving.

I get this issue with three independent sources of JSON files.

This must be something that can be fixed for everyone in PBI's built-in JSON parsing.

HendrixSpirit
Helper I

I'm having the same issue when loading JSON from a folder. The first load works OK, but on refresh I get "We found extra characters at the end of JSON input."

Please fix this old error, which dates back to 2016.

csiege
New Member

I was able to enter an advanced query to resolve my problem.

I had to replace carriage returns with commas and surround the entire thing with brackets to form a valid JSON doc.

 

let
    Source = DataLake.Contents("https://SOMETHINGHERE.azuredatalakestore.net"),
    TwitterData = Source{[Name="TwitterData"]}[Content],
    // Wrap the raw text in brackets and turn carriage returns into commas to form a valid JSON array
    #"TwitterStream-Output_0_9999 json" = Text.Replace(Text.Combine({"[", Text.FromBinary(TwitterData{[Name="TwitterStream-Output_0_9999.json"]}[Content]), "]" }), "#(cr)", ","),
    #"Imported JSON" = Json.Document(#"TwitterStream-Output_0_9999 json",1252),
    // Take the first element of the parsed array
    #"Imported JSON1" = #"Imported JSON"{0}
in
    #"Imported JSON1"

yssant
New Member

Does this have a solution? I have the same error.

skaranam
Frequent Visitor

Does anyone have a solution to this?

 

 

Anonymous
Not applicable

I'm stuck with the same problem. I'm trying to build a Power BI custom connector, and I'm not even sure that the Web.Contents request is executed: I've tried inserting the JSON as a string in the Power Query and that works, but if I place the same string as a JSON file on App Service or as a blob, it fails. I'm stranded without a way to trace or debug what really happens.

In my case it's working beautifully in Power BI Desktop, but for some reason I can't get it to run in Visual Studio for the custom connector project.

Any ideas and feedback are most welcome, as I expect this to be a simple error or misconfiguration on my side rather than an error in the SDK.

 

Brgds Brian

I am getting the same error while importing Azure Blob files (created by Application Insights and stored in Azure Blob Storage) into Power BI and transforming the JSON. Please help!

I don't have an exact solution for this, but I faced the same issue. I emailed the team that generates the JSON, and they fixed it on their side.

Thank you for your response.

I have enabled continuous export of Application Insights data into Azure Blob Storage, which is completely managed by Microsoft, so I have no control over getting the JSON fixed at the source. Is there something we could do while importing the blobs, combining the files, and parsing the JSON?

yssant
New Member

does this have a solution?

DolEgon22
Helper II

This is not unique to Power BI, however; it also reproduces with Excel connecting to the blobs created by Stream Analytics. It appears that while other services can parse the JSON from ASA, PBI and Power Query cannot.

 

My troubleshooting steps:

 

  • Using a 2-line sample from one of the JSON files, I ran it through a parser and got parser errors (expecting 'EOF')
  • When I wrapped the JSON in square brackets and separated the lines with a comma, it parsed fine
  • I downloaded one of the JSON files and made 2 copies of it. I then edited one to have the brackets and comma delimiter and left the other alone. I was unable to load the unedited JSON into PBI, but had no problem with the one I edited.

This appears to be an issue with how Power Query parses JSON from ASA. 

Just a side note: my steps above don't solve our problem of connecting to Azure Blob Storage using PBI and automatically parsing the JSON stored there; they simply reveal the issue in PBI/Power Query.
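
As a workaround (not a fix), the manual edit from the second bullet can be reproduced in M. A minimal sketch, assuming a line-delimited ASA output file at a made-up path:

let
    // Made-up path; replace with your own ASA output blob or file
    RawLines = Lines.FromBinary(File.Contents("C:\data\asa-output.json")),
    // Drop blank lines so we don't emit a trailing comma
    NonEmpty = List.Select(RawLines, each Text.Trim(_) <> ""),
    // Join the lines with commas and wrap the whole thing in square brackets
    Wrapped = Text.Combine({"[", Text.Combine(NonEmpty, ","), "]"}),
    // The text is now a valid JSON array that Json.Document can parse
    Parsed = Json.Document(Wrapped)
in
    Parsed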

sasmpu00
Frequent Visitor

Same issue here. The same URL output (JSON) works when saved to a text file and loaded from there, but not when loading directly from the URL.

Basjuh84
Frequent Visitor

Still no definitive solution for this problem? I want to read multiple JSON files from a folder. Individual JSON files work fine, but reading them all in from a folder gives me the same error!

BNajlis
New Member

Found the same issue when trying to load a JSONL file (http://jsonlines.org/). To solve it, I ended up importing it as a text file (which gives one record per line) and then using the JSON parse transform in Power Query to parse the JSON in each line.
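
In query terms, those UI steps come out to roughly the following sketch (the file path is a made-up example):

let
    // Made-up path to a JSONL file; the text import yields one row per line
    Source = Table.FromColumns({Lines.FromBinary(File.Contents("C:\data\records.jsonl"), null, null, 65001)}),
    // The Parse > JSON transform compiles to Json.Document applied to each cell
    Parsed = Table.TransformColumns(Source, {}, Json.Document)
in
    Parsed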
