Hi! I have a problem in Power Query that seems so easy, but I just can't find the answer.
I have a table with some duplicated values:
ID | Region
A  | X
A  | Y
B  | X
C  | Y
C  | Z
And I need a unique ID column, so I want to spread the duplicated rows across columns, something like this:
ID | Region 1 | Region 2
A  | X        | Y
B  | X        | null
C  | Y        | Z
I very much appreciate your help,
Best regards,
Gabriela
@Anonymous
Hi, let's try this:
let
    // Sample data; replace with your own [ID, Region] table
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUYpQitWBsCLBLCe4mDNcDMSKgqsDsmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [ID = _t, Region = _t]),
    // Group by ID, keeping each ID's rows as a nested table
    #"Grouped Rows" = Table.Group(Source, {"ID"}, {{"Regions", each _, type table [ID=text, Region=text]}}),
    // Turn each nested table's Region column into a list of values
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.ToList(Table.SelectColumns([Regions], "Region"))),
    // Join each list into one comma-separated string
    #"Extracted Values" = Table.TransformColumns(#"Added Custom", {"Custom", each Text.Combine(List.Transform(_, Text.From), ","), type text}),
    #"Removed Columns" = Table.RemoveColumns(#"Extracted Values", {"Regions"}),
    // Split the string back into one column per region (add names here if you expect more regions)
    #"Split Column by Delimiter" = Table.SplitColumn(#"Removed Columns", "Custom", Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv), {"Custom.1.1", "Custom.1.2", "Custom.1.3"})
in
    #"Split Column by Delimiter"
Regards
Victor
Hi @Anonymous
Can you try:
1. Right-click the Region column and choose "Duplicate Column".
2. With the newly created column selected, choose "Pivot Column".
3. In the dialog box that appears, choose the Region column as the Values Column, and under Advanced Options make sure "Don't Aggregate" is selected.
Below is the code you can paste into the query editor.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUYpQitWBsCLBLCe4mDNcDMSKUoqNBQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}}),
    #"Duplicated Column" = Table.DuplicateColumn(#"Changed Type", "Column2", "Column2 - Copy"),
    #"Pivoted Column" = Table.Pivot(#"Duplicated Column", List.Distinct(#"Duplicated Column"[#"Column2 - Copy"]), "Column2 - Copy", "Column2")
in
    #"Pivoted Column"
Hope that helps!
Hi everyone,
Thanks @bheepatel for your solution. I understood your step by step, but unfortunately it didn't work out >< The good thing is I finally found the Advanced Editor, tried your solution there, and it is correct!
I also tried the other solutions and they all look good. I didn't know you could see the step by step so clearly; I'm very grateful to have learned that.
Thank you @Vvelarde for your clear solution; that step by step is easy to remember and helped me understand the logic. To truly understand is the most valuable thing to me as I explore this beautiful world of Power BI.
Best regards, have a nice week!
Gabriela
Hi everyone, thank you for replying so soon, I appreciate your help.
@camargos88 I couldn't download the pbix file. Could you please write out the code or the step-by-step sequence of your solution?
@Anonymous and @Vvelarde, could you please explain your code to me a bit more? I don't know much about Power Query and I'd like to better understand the solution I will apply to my model.
Thank you for your time,
Best regards,
Gabriela
Hi @Anonymous ,
Just paste it into the Advanced Editor:
let
    // Sample data; replace with your own [ID, Region] table
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUYpQitWBsCLBLCe4mDNcDMSKArNcgKxwOMsZzoLoCAayHOEsJzjLGc5ygbNc4Sw3pdhYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [ID = _t, Region = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"ID", type text}, {"Region", type text}}),
    // Unpivot so every column except ID becomes Attribute/Value rows
    #"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Changed Type", {"ID"}, "Attribute", "Value"),
    // Group by ID, keeping each ID's rows as a nested table
    #"Grouped Rows" = Table.Group(#"Unpivoted Columns", {"ID"}, {{"Count", each _, type table [ID=text, Attribute=text, Value=text]}}),
    // Number the rows inside each group (1, 2, 3, ...)
    Indexed = Table.TransformColumns(#"Grouped Rows", {{"Count", each Table.AddIndexColumn(_, "GroupIndex", 1, 1)}}),
    #"Removed Columns1" = Table.RemoveColumns(Indexed,{"ID"}),
    #"Expanded Count" = Table.ExpandTableColumn(#"Removed Columns1", "Count", {"ID", "Attribute", "Value", "GroupIndex"}, {"ID", "Attribute", "Value", "GroupIndex"}),
    // Build the final column headers, e.g. "Region 1", "Region 2"
    #"Added Custom" = Table.AddColumn(#"Expanded Count", "Custom", each [Attribute] & " " & Text.From([GroupIndex])),
    #"Removed Columns2" = Table.RemoveColumns(#"Added Custom",{"GroupIndex", "Attribute"}),
    // Pivot the numbered headers into columns, one per region
    #"Pivoted Column" = Table.Pivot(#"Removed Columns2", List.Distinct(#"Removed Columns2"[Custom]), "Custom", "Value")
in
    #"Pivoted Column"
Ricardo
Hi @Anonymous ,
I've created this Power Query that handles this situation -> Download PBIX
It should work with an unlimited number of columns, please take a look.
Ricardo
Hi @Anonymous ,
Try this code. Assign your input in the Source step.
let
    // Sample data; replace with your own [ID, Region] table
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUYpQitWBsCLBLCe4mDNcDMSKUoqNBQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [ID = _t, Region = _t]),
    // Group by ID, transposing each group's Region column into a one-row table (Column1, Column2, ...)
    #"Grouped Rows" = Table.Group(Source, {"ID"}, {{"Regions", each Table.Transpose(Table.SelectColumns(_, {"Region"}))}}),
    // Expand those one-row tables into "Region 1" / "Region 2" columns
    #"Expanded Regions" = Table.ExpandTableColumn(#"Grouped Rows", "Regions", {"Column1", "Column2"}, {"Region 1", "Region 2"})
in
    #"Expanded Regions"
This may require a bit of manual adjustment in the "Expanded Regions" step if you have more than 2 regions. If the count varies greatly from case to case, it can be automated - it just requires a bit more coding. In that case, please let me know and I will add it.
Kind regards,
John
This solution is simple and elegant. Any tips on the 'bit more coding' part to make it automated if more than 2 Regions exist?
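For reference, the "bit more coding" could be sketched roughly like this. This is an untested sketch, not a solution posted in the thread: it reuses the grouped step from John's code above and assumes a Source table with columns [ID, Region]; the MaxRegions, OldNames, and NewNames step names are my own additions.
let
    // Group by ID, transposing each group's Region column into a one-row table (Column1, Column2, ...)
    #"Grouped Rows" = Table.Group(Source, {"ID"}, {{"Regions", each Table.Transpose(Table.SelectColumns(_, {"Region"}))}}),
    // Largest number of regions held by any single ID
    MaxRegions = List.Max(List.Transform(#"Grouped Rows"[Regions], Table.ColumnCount)),
    // Generate the name lists: {"Column1", ...} as produced by Table.Transpose, renamed to {"Region 1", ...}
    OldNames = List.Transform({1..MaxRegions}, each "Column" & Text.From(_)),
    NewNames = List.Transform({1..MaxRegions}, each "Region " & Text.From(_)),
    // Expand with the generated lists, so no manual edits are needed when the region count changes
    #"Expanded Regions" = Table.ExpandTableColumn(#"Grouped Rows", "Regions", OldNames, NewNames)
in
    #"Expanded Regions"
IDs with fewer regions than the maximum should simply get null in the extra columns, which matches the requested output.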
Hi @Greg_Deckler I have to say it's not the most elegant way out of this, but I'm so relieved it finally worked! The only problem is what will happen when there are more than two regions, a likely scenario in the near future.
@Anonymous I figured that would be a likely scenario, which is why I called in the big guns. I'm more of a DAX wonk than a PQ one.