jenniferho
Frequent Visitor

Missing columns when trying to expand JSON files in the Power BI service

Hi guys

 

I created a dataflow whose data source is a set of JSON files. Each JSON file has a nested structure that goes quite a number of levels deep.

 

When I transformed the data in the Power BI service, I tried to expand all the columns and eventually counted 24 columns.

I then realized some columns were missing.

 

Therefore, I saved the data source on my local laptop and got the data in Power BI Desktop instead.

After expanding all the columns there, I found 52 columns.

 

May I ask: is there any limitation on the number of levels of nested documents that can be expanded in the Power BI service?

I have also read (via Google) that if the data source has fields that are null in the first 1,000 rows, those columns can disappear in the Power BI service. Is that correct?

 

Many thanks.

J

 

 

1 ACCEPTED SOLUTION
jenniferho
Frequent Visitor

I have done a test.

First, I placed the data source on my local laptop.

A. In the Power BI service: create a dataflow > select the folder > get the JSON files and expand all the columns

B. In Power BI Desktop: get data from a folder > get the JSON files and expand all the columns

 

Then I compared the M query of A with that of B.

Interestingly, I found that some columns were missing in A: whenever a nested document contained more than two nested documents, the third one was dropped, so the attributes inside that missing document could not be expanded any further.

 

As a result, I developed my best practice for dealing with JSON files.

It does not matter where the data source is:

1. Always transform the data in Power BI Desktop first; it is much faster and more accurate. Very important: always check "Use original column name as prefix" when expanding columns. (I usually uncheck this box because it makes the field names very long, but I got errors when I pasted the M query into the Power BI service. So I started over, checking that box whenever I expanded columns, and renamed all the attributes at the end. After that, the M query pasted into the Power BI service without errors.)

2. Then copy the M query from the Advanced Editor.

3. Go to the Power BI service, create a dataflow, and select the method you want to use to connect to the data source.

4. Once connected, go to the Advanced Editor and paste the M query into the right place.

5. Save and close 🙂
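To illustrate why step 1 matters: the expand steps Desktop generates list every field explicitly, and with the prefix option checked the new column names cannot collide. A minimal sketch of what such a generated M query looks like (the file path, column name "details", and field names are placeholders for illustration, not taken from this thread):

```
let
    // Load one JSON file from the local folder (path is a placeholder)
    Source = Json.Document(File.Contents("C:\data\sample.json")),
    AsTable = Table.FromRecords({Source}),
    // Expand a nested record column. The field list is spelled out
    // explicitly, and with "Use original column name as prefix" checked
    // the new names are prefixed ("details.id", ...), so nothing collides
    // or gets dropped when the query is pasted into a dataflow.
    Expanded = Table.ExpandRecordColumn(
        AsTable,
        "details",
        {"id", "name", "more"},
        {"details.id", "details.name", "details.more"}
    )
in
    Expanded
```

Because the field names are hard-coded in `Table.ExpandRecordColumn`, the same columns come back regardless of which rows the service samples first.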

 

@ibarrau Thanks for the hints you provided here; they helped me finally work this out.

Thanks to the community here for being so responsive and professional.


3 REPLIES 3

ibarrau
Super User

Hi. This is strange; they should return the same number of columns. Please try this: copy the Power Query code from the Advanced Editor in Power BI Desktop into the dataflow, or just right-click the table in Transform Data (the query editor) to copy it, then right-click and paste it in the dataflow's query editor.

Let's check whether dataflows return something different for the exact same code.

Regards,


If this post helps, please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog

Good idea, I will try this soon.
