Hello, I would like to standardize some additional information that I derive from a cross join in Power Query. As an example, take NameMerge in the following table:
| LookupKey | KeyMerge | NameMerge | Quantity |
|---|---|---|---|
| E2345 | E2345F_100 | Shampoo Delicate Skin 500 ml | 1 000 |
| E2345 | E2345F_102 | Shampoo Delicate Skin 500ml | 500 |
| E2456 | E2456T | .... | .... |
My model is quite complex and I cannot benefit from a lookup Product table with standardized unique names. For now I have come up with a solution by adding a calculated column with the following formula:

CALCULATE (
    FIRSTNONBLANK ( crossjoin[NameMerge], 0 ),
    FILTER ( ALL ( crossjoin ), EARLIER ( crossjoin[LookupKey] ) = crossjoin[LookupKey] )
)
It works: I now have a uniform name for every position. But I am asking myself whether this is the recommended way. Maybe I should find a way to perform the standardization in Power Query instead? I don't think creating a measure is possible here, since I want the names on the rows of the pivot table (I tried, and it gives me a Cartesian product of names).
Maybe someone can come up with another solution?
IMD
Hi @Anonymous
It would be better to do it in Power Query, so that the table is already clean when it comes into the model. You could do something simple like this. Place the following M code in a blank query to see the steps of an example:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
#"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge", "NameMerge", "Quantity"}, {"KeyMerge", "NameMerge", "Quantity"})
in
#"Expanded Res"
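The core of the query above is the Table.Group / Table.Min pair: for each LookupKey, Table.Min returns the whole row whose NameMerge sorts first, and that row's name becomes the standard one for the group. The same pattern on hand-typed rows (sample values only, matching the table in the question) looks like this:

```
let
    // Hand-typed sample rows, same shape as the query above
    Source = Table.FromRecords({
        [LookupKey = "E2345", KeyMerge = "E2345F_100", NameMerge = "Shampoo Delicate Skin 500 ml", Quantity = 1000],
        [LookupKey = "E2345", KeyMerge = "E2345F_102", NameMerge = "Shampoo Delicate Skin 500ml", Quantity = 500]
    }),
    // For each LookupKey, keep the entire row whose NameMerge sorts first
    Grouped = Table.Group(Source, {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    // Turn the record back into ordinary columns
    Expanded = Table.ExpandRecordColumn(Grouped, "Res", {"KeyMerge", "NameMerge", "Quantity"})
in
    Expanded
```

Note that "first alphabetically" is an arbitrary but deterministic choice of canonical name; any other per-group pick (e.g. Table.Max, or the longest string) would slot into the same Table.Group call.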
Please mark the question solved when done and consider giving a thumbs up if posts are helpful.
Contact me privately for support with any larger-scale BI needs, tutoring, etc.
Cheers
Thanks very much for your prompt response; I accept your solution. I didn't specify that I also need to preserve the KeyMerge information, so I just added a "loopback" merge:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge", "NameMerge", "Quantity"}, {"KeyMerge", "NameMerge", "Quantity"}),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Res", {"LookupKey"}, #"Changed Type", {"LookupKey"}, "Expanded Res", JoinKind.LeftOuter),
    #"Expanded Expanded Res" = Table.ExpandTableColumn(#"Merged Queries", "Expanded Res", {"Quantity"}, {"Expanded Res.Quantity"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Expanded Res",{"Quantity"})
in
    #"Removed Columns"
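The merge-back added here is a standard self-join pattern: left-join the per-LookupKey canonical rows back onto the original detail table on LookupKey, then expand the columns you want to recover. A minimal sketch on hand-typed rows (table and column values here are illustrative only):

```
let
    // One canonical row per key, as produced by the grouping step
    Canonical = Table.FromRecords({
        [LookupKey = "E2345", NameMerge = "Shampoo Delicate Skin 500 ml"]
    }),
    // The original detail rows to be recovered
    Detail = Table.FromRecords({
        [LookupKey = "E2345", KeyMerge = "E2345F_100", Quantity = 1000],
        [LookupKey = "E2345", KeyMerge = "E2345F_102", Quantity = 500]
    }),
    // Left-join the detail rows back onto the canonical names
    Joined = Table.NestedJoin(Canonical, {"LookupKey"}, Detail, {"LookupKey"}, "Rows", JoinKind.LeftOuter),
    // One output row per detail row, each carrying the standardized name
    Expanded = Table.ExpandTableColumn(Joined, "Rows", {"KeyMerge", "Quantity"})
in
    Expanded
```

One thing to check in the full query: after the group step, the KeyMerge column holds only the representative row's key, so if every original KeyMerge must survive, it should be expanded from the joined detail table as in this sketch rather than taken from the grouped side.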
AIB, for sure I will have your contact in mind in case of some difficulties with setting up models.
AIB