Hi All,
Is it possible to create a measure in Power BI that removes duplicate values per group and keeps the first occurrence of each value? Perhaps not a measure but a new table — I'm unsure how to work around this. For context, the database I'm working with treats some data in a funky way that makes sense for the back-end but not when trying to visualise user behaviour.
Currently, the data looks like:
| USER | ENTRY ID | ENTRY | STATUS | TIME SUBMITTED (HH:mm:ss) |
|---|---|---|---|---|
| userA | 1 | unicorn | fail | 10:23:10 |
| userA | 2 | forest | pass | 10:30:49 |
| userA | 1 | unicorn | fail | 10:30:49 |
| userB | 1 | unicorn | fail | 13:40:22 |
| userB | 1 | fairy | pass | 13:43:59 |
I want to clean it so it looks like:
| USER | ENTRY ID | ENTRY | STATUS | TIME SUBMITTED (HH:mm:ss) |
|---|---|---|---|---|
| userA | 1 | unicorn | fail | 10:23:10 |
| userA | 2 | forest | pass | 10:30:49 |
| userB | 1 | unicorn | fail | 13:40:22 |
| userB | 1 | fairy | pass | 13:43:59 |
| ... | ... | ... | ... | ... |
Note the row I want to remove (userA | 1 | unicorn | fail | 10:30:49) has the same USER and ENTRY as an earlier row, just a later TIME SUBMITTED.
Also, the ENTRY ID on its own cannot be used to spot duplicates — userB has two different entries that both carry ID 1.
Any pointers would be greatly appreciated 🙂
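[Editor's note: one way to phrase the requirement in Power Query M — a sketch only, assuming a loaded table called Source with columns named User, Entry and Time — is to group by the fields that identify a duplicate and keep each group's earliest row:]

```
let
    // Group on the key that defines a duplicate (may need more columns),
    // then keep only the earliest row of each group.
    Grouped = Table.Group(
        Source,
        {"User", "Entry"},
        {{"FirstRow", each Table.First(Table.Sort(_, {{"Time", Order.Ascending}})), type record}}
    ),
    // One surviving record per group -> back to a table
    Result = Table.FromRecords(Grouped[FirstRow])
in
    Result
```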
Here's the M code that does what you want:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WKi1OLXJU0lEyBOLSvMzk/KI8ICstMTMHJGhgZWRsZWigFKuDUGkEks8vSi0uATIKEouLIQqNDaxMLFEU4jISVaUTTpXGViZA+40wVALliyqR7AaqM7YyBZoYCwA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [User = _t, EntryID = _t, Entry = _t, Status = _t, Time = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"User", type text}, {"EntryID", Int64.Type}, {"Entry", type text}, {"Status", type text}, {"Time", type time}}),
    // Flag each row: is its Time the earliest among all rows with the same User + Entry?
    #"Added Custom" = Table.AddColumn(#"Changed Type", "FirstTimeEntry",
        each List.Min(
            Table.SelectRows(
                #"Changed Type",
                (r) => r[User] = [User] and r[Entry] = [Entry]
            )[Time]
        ) = [Time]),
    #"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([FirstTimeEntry] = true)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"FirstTimeEntry"})
in
    #"Removed Columns"
Best
D
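[Editor's note: a lighter-weight alternative, not from the accepted answer above — Table.Distinct keeps the first occurrence it encounters for each key, so sorting earliest-first and buffering the result (to pin the row order) achieves the same "keep the earliest row per group". A sketch, assuming the same #"Changed Type" step:]

```
let
    // Sort earliest-first, then buffer so the row order is fixed in memory;
    // Table.Distinct then keeps the first (earliest) row per key combination.
    Sorted = Table.Buffer(Table.Sort(#"Changed Type", {{"Time", Order.Ascending}})),
    Deduped = Table.Distinct(Sorted, {"User", "EntryID", "Entry"})
in
    Deduped
```

This avoids the per-row Table.SelectRows scan in the accepted answer, which grows quadratically with table size.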
Thanks @Anonymous, had to have a little play around and add in r[EntryID] = [EntryID], but it's working well now 👍 cheers again
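[Editor's note: based on this reply, the adjusted inner filter in the accepted answer would presumably read:]

```
// Dedupe on User + EntryID + Entry rather than User + Entry alone
(r) => r[User] = [User] and r[EntryID] = [EntryID] and r[Entry] = [Entry]
```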
Thanks @Pragati11 , the buffer got me halfway there! Now it's just removing the wrong duplicate, hopefully the other reply will resolve this 🙂