Olemat
Regular Visitor

Creating M function which would conditionally and selectively delete a row

Hello everyone!

I have an interesting issue: in the server activity log I have plenty of repetitive entries such as LOGOUT followed immediately by another LOGOUT, which is senseless because once you have logged out you cannot log out again.

 

So I've created an additional Index column which I plan to use as a reference for a conditional statement. For instance: each LOGOUT may have only an odd number as its Index.

 

So IF the first LOGOUT has an odd Index value THEN it should be deleted, so that all the repetitions would be removed at the transformation stage.

 

I'm not sure how that logic would work. Should I use Table.SelectRows?

 

 

(screenshot of the activity log)

 

Thank you so much for sharing your ideas and/or experience.

Regards,

Alexei

 

 

2 ACCEPTED SOLUTIONS
Jakinta
Super User

If that always happens at the end of the list per UserID, you can Group By the UserID column, choose All Rows, then remove the last row of each group with Table.RemoveLastN.

If LOGOUTs might also repeat somewhere in the middle of the list, not only at the end, different operations must be applied.
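A minimal sketch of those two steps, assuming the stray LOGOUT is always each user's last row and the column names from the sample in the accepted answer (UserID, Attempt Date, Attempt Result), with Source standing in for your log table:

let
    // group all rows per user, dropping the last row of each group
    Grouped = Table.Group(Source, {"UserID"}, {{"Trimmed", each Table.RemoveLastN(_, 1), type table}}),
    // expand the trimmed groups back into a flat table
    Expanded = Table.ExpandTableColumn(Grouped, "Trimmed", {"Attempt Date", "Attempt Result"})
in
    Expanded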


v-jingzhang
Community Support

Hi @Olemat 

 

My idea is to always keep the first row when duplicated values appear (no matter whether LOGIN or LOGOUT is duplicated).

 

The operations are:

1. Add an Index column starting from 0.

2. Add a custom column that checks whether the Attempt Result of the current row is the same as that of the previous row; if so, return 1, otherwise 0.

3. Filter the custom column to keep rows with value 0.

4. Finally, remove the Index column and the custom column.

let
    // Sample data (compressed JSON) with Attempt Date, Attempt Result and UserID columns
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMtM3NNA3MlQwNLMyNVbSUfLxd/f0A9JGSrE6SLLmcFn/0BBMaQu8mi3xajYywC9tiM9sIyP8mo1xaDYEyeK0GCxriEvWiAhZXI6GyOJyFUTWBNPoWAA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Attempt Date" = _t, #"Attempt Result" = _t, UserID = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Attempt Date", type datetime}, {"Attempt Result", type text}, {"UserID", Int64.Type}}),
    // Step 1: add a 0-based index
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    // Step 2: flag rows whose Attempt Result equals the previous row's;
    // on the first row {[Index]-1} raises an error, which try ... otherwise turns into 0
    #"Added Custom" = Table.AddColumn(#"Added Index", "Duplicated", each let previousResult = #"Added Index"{[Index]-1}[Attempt Result] in try if [Attempt Result] = previousResult then 1 else 0 otherwise 0),
    // Step 3: keep only non-duplicated rows
    #"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([Duplicated] = 0)),
    // Step 4: drop the helper columns
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows", {"Index", "Duplicated"})
in
    #"Removed Columns"

(screenshot of the filtered result)

 

Attached is the pbix for your reference. Let me know if you have any questions.

 

Regards,
Community Support Team _ Jing
If this post helps, please Accept it as the solution to help other members find it.


5 REPLIES

Hi Jing Zhang!

Thanks a lot for that particular piece of code: each let previousResult = #"Added Index"{[Index]-1}[Attempt Result] in try if [Attempt Result] = previousResult then 1 else 0 otherwise 0

It helped a lot. The M language is just a great thing!
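As I understand it, the try ... otherwise part is what handles the first row: there [Index]-1 is -1, the positional lookup raises an error, and otherwise turns that error into 0 so the row is kept. A standalone illustration with a made-up two-row table (not from this thread):

let
    t = Table.FromRecords({[x = "a"], [x = "b"]}),
    // t{-1} raises an error; try ... otherwise replaces the error with a fallback value
    safe = try t{-1}[x] otherwise "no previous row"
in
    safe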

Accepted as a solution!

All the best to you, Colleague

@Olemat Glad that you liked my solution, but seems you forgot to accept it as a solution...😉


Olemat
Regular Visitor

Thanks for sharing the idea, that was my first guess too, but no: LOGOUTs and LOGINs are distributed very randomly, so Group By didn't work for me. Please see the accepted solution.
