
Anonymous
Not applicable

Standardize (harmonize) field values (e.g. names) to avoid unnecessary row duplication

Hello, I would like to standardize some additional information that I bring in from a cross join in Power Query. As an example, take NameMerge in the following table:

| LookupKey | KeyMerge   | NameMerge                    | Quantity |
|-----------|------------|------------------------------|----------|
| E2345     | E2345F_100 | Shampoo Delicate Skin 500 ml | 1 000    |
| E2345     | E2345F_102 | Shampoo Delicate Skin 500ml  | 500      |
| E2456     | E2456T...  | ...                          | ...      |

My model is quite complex and I cannot benefit from a lookup Product table with standardized unique names. I actually came up with a solution by adding a calculated column with the following formula:

 

CALCULATE(
    FIRSTNONBLANK(crossjoin[NameMerge], 0),
    FILTER(ALL(crossjoin), EARLIER(crossjoin[LookupKey]) = crossjoin[LookupKey])
)

 

and it's working: I now have a uniform name for every position. But I'm asking myself whether this is the recommended way. Maybe I should find a way to perform the standardization in Power Query instead? I don't think creating a measure is possible here, since I want the names in the rows of the pivot table (I tried, and it gives me a Cartesian product of names).
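Logically, that calculated column assigns each row the first non-blank NameMerge (in ascending sort order) among all rows sharing its LookupKey. A small Python sketch of that per-row logic, with illustrative sample values mirroring the table above:

```python
# Per-row standardization, mirroring the DAX CALCULATE/FIRSTNONBLANK/EARLIER
# pattern: each row gets the ascending-first non-blank NameMerge of its group.
# Sample rows are illustrative, not real data.
rows = [
    {"LookupKey": "E2345", "NameMerge": "Shampoo Delicate Skin 500 ml"},
    {"LookupKey": "E2345", "NameMerge": "Shampoo Delicate Skin 500ml"},
]

def standard_name(row, table):
    """First non-blank NameMerge, ascending, among rows with the same key."""
    names = sorted(
        r["NameMerge"] for r in table
        if r["LookupKey"] == row["LookupKey"] and r["NameMerge"]
    )
    return names[0] if names else None

for r in rows:
    r["StandardName"] = standard_name(r, rows)
```

Note that Python's default string ordering stands in for DAX's culture-aware sort here, so edge cases (accents, casing) may differ from what FIRSTNONBLANK returns in the model.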

Maybe someone can come up with another solution?

 

IMD

 

1 ACCEPTED SOLUTION
AlB
Super User
Super User

Hi @Anonymous 

It would be better to do it in PQ and have the table already clean when it comes into the model. You could do something simple like this. Place the following M code in a blank query to see the steps of an example:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),

    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge", "NameMerge", "Quantity"}, {"KeyMerge", "NameMerge", "Quantity"})
in
    #"Expanded Res"
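For readers outside Power Query: the effect of the Table.Group + Table.Min pair above is to keep, for each LookupKey, the full record whose NameMerge sorts first. A rough Python sketch of that grouping step, with illustrative sample rows:

```python
# Sketch of Table.Group + Table.Min(_, "NameMerge"): per key, keep the whole
# record whose NameMerge value is smallest. Sample rows are illustrative.
rows = [
    {"LookupKey": "E2345", "KeyMerge": "E2345F_100",
     "NameMerge": "Shampoo Delicate Skin 500 ml", "Quantity": 1000},
    {"LookupKey": "E2345", "KeyMerge": "E2345F_102",
     "NameMerge": "Shampoo Delicate Skin 500ml", "Quantity": 500},
]

def group_min(rows, key, order_col):
    """For each distinct key, keep the record with the minimal order_col."""
    best = {}
    for r in rows:
        k = r[key]
        if k not in best or r[order_col] < best[k][order_col]:
            best[k] = r
    return list(best.values())

result = group_min(rows, "LookupKey", "NameMerge")
```

Python's default string comparison stands in for M's comparer, so culture-sensitive ordering may differ slightly in Power Query itself.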

 

Please mark the question solved when done and consider giving a thumbs up if posts are helpful.

Contact me privately for support with any larger-scale BI needs, tutoring, etc.

Cheers 

 



Anonymous
Not applicable

Thanks very much for your prompt response; I accept your solution. I didn't mention that I also need to preserve the KeyMerge information, so I just added some "look-back" merging:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),

    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge","NameMerge", "Quantity"}, {"KeyMerge","NameMerge", "Quantity"}),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Res", {"LookupKey"}, #"Changed Type", {"LookupKey"}, "Expanded Res", JoinKind.LeftOuter),
    #"Expanded Expanded Res" = Table.ExpandTableColumn(#"Merged Queries", "Expanded Res", {"Quantity"}, {"Expanded Res.Quantity"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Expanded Res",{"Quantity"})
in
    #"Removed Columns"
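The merge-back step (Table.NestedJoin, expand, remove column) amounts to a left join of every original row onto its group's canonical record: KeyMerge and NameMerge come from the chosen record, while each row keeps its own Quantity. A rough Python equivalent of that extra step, again with illustrative data:

```python
# Sketch of the merge-back: one canonical record per LookupKey, left-joined
# back so every original Quantity survives. Sample rows are illustrative.
rows = [
    {"LookupKey": "E2345", "KeyMerge": "E2345F_100",
     "NameMerge": "Shampoo Delicate Skin 500 ml", "Quantity": 1000},
    {"LookupKey": "E2345", "KeyMerge": "E2345F_102",
     "NameMerge": "Shampoo Delicate Skin 500ml", "Quantity": 500},
]

# Step 1: canonical record per key (the NameMerge that sorts first).
canon = {}
for r in rows:
    k = r["LookupKey"]
    if k not in canon or r["NameMerge"] < canon[k]["NameMerge"]:
        canon[k] = r

# Step 2: left join every original row onto its canonical record, keeping
# the original row's Quantity (the "Expanded Res.Quantity" in the M code).
merged = [
    {"LookupKey": r["LookupKey"],
     "KeyMerge": canon[r["LookupKey"]]["KeyMerge"],
     "NameMerge": canon[r["LookupKey"]]["NameMerge"],
     "Quantity": r["Quantity"]}
    for r in rows
]
```

Every original row survives with a harmonized name, which is exactly what removes the unwanted duplication in the pivot table rows.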

AlB, for sure I will keep your contact in mind in case of any difficulties with setting up models.

 

AIB

