Frequent Visitor

missing columns when trying to expand json files on PowerBI service

Hi guys

 

I created a dataflow whose data source is a set of JSON files. Each JSON file has a nested structure several levels deep.

 

When I transform the data on the Power BI service and expand all the columns, I end up with 24 columns.

I have realized some columns are missing.

 

Therefore, I saved the data source on my local laptop and got the data in Power BI Desktop instead.

After expanding all the columns there, I found 52 columns.

 

May I ask, is there any limit on the number of nesting levels that can be expanded in the Power BI service?

I have also read from a web search that if some fields are null in the first 1000 rows of the data source, those columns can disappear in the Power BI service. Is that correct?
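If that theory is right, the likely mechanism is that the auto-generated expand step only lists the record fields Power Query detected in its sample of rows. A minimal M sketch (table and field names are illustrative, not from the actual data source) showing how listing the fields explicitly in `Table.ExpandRecordColumn` keeps a column that is null in the sampled rows:

```m
let
    // Illustrative table with one record column; the nested "details"
    // field is null in the first row, as it might be in a sampled subset.
    Source = Table.FromRecords({
        [payload = [id = 1, name = "a", details = null]],
        [payload = [id = 2, name = "b", details = [score = 9]]]
    }),
    // Auto-detection can omit a field that was null in every sampled row.
    // Listing every known field explicitly guarantees the column survives.
    Expanded = Table.ExpandRecordColumn(Source, "payload",
        {"id", "name", "details"},
        {"payload.id", "payload.name", "payload.details"})
in
    Expanded
```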

 

Many thanks.

J

 

 

1 ACCEPTED SOLUTION
Frequent Visitor

I have done a test. 

First, I placed the data source on my local laptop.

A. On the Power BI service: create a dataflow > select the folder > get the JSON files and expand all the columns.

B. In Power BI Desktop: get data from the folder > get the JSON files and expand all the columns.

 

Then I compared the M query of A with that of B.

Interestingly, I found some columns were missing in A: when a nested document contains more than two nested documents, the third one was missing, so the attributes inside that missing nested document could not be expanded any further.

 

As a result, I developed my best practice for dealing with JSON files.

It does not matter where the data source is:

1. Always transform the data in Power BI Desktop; it is much faster and more accurate. Very important: always check "Use original column name as prefix". (I usually uncheck this box because it makes the field names very long, but I got errors when pasting the M query into the Power BI service. So I started over, checking that box every time I expanded columns, and renamed all the attributes at the end. After that, the M query pasted into the service with no errors.)

2. Copy the M query from the Advanced Editor.

3. In the Power BI service, create a dataflow and select the method you want to use to connect to the data source.

4. Once connected, open the Advanced Editor and paste the M query into the right place.

5. save and close 🙂
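The reason the prefix checkbox matters in step 1 is that it keeps the generated column names unique. A small M sketch of what the generated expand step looks like with the box checked (all table, column, and field names here are hypothetical, for illustration only):

```m
let
    // Illustrative table; "order" stands in for a nested record column.
    Source = Table.FromRecords({
        [order = [id = 1, date = "2020-12-01"]],
        [order = [id = 2, date = "2020-12-02"]]
    }),
    // With "Use original column name as prefix" checked, the generated
    // names ("order.id") stay unique even when other nesting levels
    // also contain a field called "id".
    ExpandedWithPrefix = Table.ExpandRecordColumn(Source, "order",
        {"id", "date"}, {"order.id", "order.date"}),
    // Optional final step: rename to shorter names once, at the end,
    // rather than unchecking the prefix box during each expansion.
    Renamed = Table.RenameColumns(ExpandedWithPrefix,
        {{"order.id", "OrderId"}, {"order.date", "OrderDate"}})
in
    Renamed
```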

 

@ibarrau Thanks for the hints you provided here; I finally worked this out.

Thanks to the community here, very responsive and professional.


3 REPLIES 3

Super User II

Hi. This is strange; they should return the same number of columns. Please try this: copy the Power Query code from the Advanced Editor in Power BI Desktop into the dataflow, or just right-click the table in Transform Data (the query editor) to copy it, then right-click and paste it in the dataflow's query editor.

Let's check whether dataflows return something different for the exact same code.

Regards,





Good idea, I will try this soon.
