eric98
Helper II

Delete rows that have the same ID and keep the last one

Hello guys,

Hope you're well. I'm facing an issue and hope you can help me with it!

Let's suppose I have a table like this:

 

client_ID | client_VTA | client_NAME | client_SCORE
8754869   | 99883366   | Coca Cola   | 19
5637350   | 98333253   | Nike        | 19

 

Sometimes the client_SCORE value changes. When that happens, I get a new row with the same values in the other columns and the new client_SCORE.

Now I'm looking for a way to delete the old row, or to replace it with the new one.

 

client_ID | client_VTA | client_NAME | client_SCORE
8754869   | 99883366   | Coca Cola   | 19
5637350   | 98333253   | Nike        | 19
5637350   | 98333253   | Nike        | 20

 

Can you help me, please?

7 REPLIES
v-jayw-msft
Community Support

Hi @eric98 ,

 

Are the new records always under the original records?

If so, you could create an index column in the Power Query Editor.


Then you will be able to get the max index for each client_ID and filter for the records whose index equals that max.
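
A minimal sketch of that filter in Power Query M, using made-up sample data and step names (a real query would start from your own source rather than Table.FromRecords):

let
    // Made-up sample rows; replace with your own data source
    Source = Table.FromRecords({
        [client_ID = 5637350, client_NAME = "Nike", client_SCORE = 19],
        [client_ID = 8754869, client_NAME = "Coca Cola", client_SCORE = 19],
        [client_ID = 5637350, client_NAME = "Nike", client_SCORE = 20]
    }),
    // Index follows load order, so the newest row for each client has the highest value
    #"Added Index" = Table.AddIndexColumn(Source, "Index", 1, 1, Int64.Type),
    // Keep only the rows whose Index equals the max Index within their client_ID group
    #"Kept Last" = Table.SelectRows(#"Added Index", (row) =>
        row[Index] = List.Max(Table.SelectRows(#"Added Index", each [client_ID] = row[client_ID])[Index]))
in
    #"Kept Last"

This relies on new rows always arriving below the old ones. The reply further down gets the same result with a grouped helper query and an inner merge, which avoids re-filtering the table for every row.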

 

Best Regards,

Jay


Hello @v-jayw-msft 

Your solution could work because the new values always come after the existing values.

But I don't really know how to make a filter like the one you describe.

Can you help me, please?

Hi @eric98 ,

 

After creating the index, copy the table as a second query, group it by client_ID, and take the max index for each client_ID.

let
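    // Sample data pasted in with Enter Data; replace this step with your own source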
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WsjA3NbEwU9JRsrS0tLAwNjYDsZ3zkxMVnPNzEoFsQ0ulWJ1oJVMzY3NjUyDfwBKoytjI1BjI9svMTiVOiZGBUmwsAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [client_ID = _t, client_VTA = _t, client_NAME = _t, client_SCORE = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"client_ID", Int64.Type}, {"client_VTA", Int64.Type}, {"client_NAME", type text}, {"client_SCORE", Int64.Type}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1, Int64.Type),
    #"Grouped Rows" = Table.Group(#"Added Index", {"client_ID"}, {{"max", each List.Max([Index]), type number}})
in
    #"Grouped Rows"


Then inner join the two tables on Index = max and delete the extra column.


let
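    // Same sample source, column types, and index as in the grouped query above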
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WsjA3NbEwU9JRsrS0tLAwNjYDsZ3zkxMVnPNzEoFsQ0ulWJ1oJVMzY3NjUyDfwBKoytjI1BjI9svMTiVOiZGBUmwsAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [client_ID = _t, client_VTA = _t, client_NAME = _t, client_SCORE = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"client_ID", Int64.Type}, {"client_VTA", Int64.Type}, {"client_NAME", type text}, {"client_SCORE", Int64.Type}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1, Int64.Type),
    #"Merged Queries" = Table.NestedJoin(#"Added Index", {"Index"}, #"Table (2)", {"max"}, "Table (2)", JoinKind.Inner),
    #"Removed Columns" = Table.RemoveColumns(#"Merged Queries",{"Table (2)"})
in
    #"Removed Columns"


 

Best Regards,

Jay

 

 


That works perfectly!

Thank you.

eric98
Helper II

No, there's no other data; I don't even have a date column.

And the new score can be larger or smaller than the previous one.

I was wondering whether, during the update of the table, I could insert a column that stores the date of the refresh.
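
A minimal sketch of that idea in M, with a made-up Source table standing in for the real one. Note that Power Query re-evaluates the whole query on every refresh, so each row simply gets the latest refresh date; the column does not keep a history of earlier refreshes.

let
    // Made-up stand-in for the existing client table
    Source = Table.FromRecords({
        [client_ID = 5637350, client_VTA = 98333253, client_NAME = "Nike", client_SCORE = 20]
    }),
    // Stamp every row with the time the query is evaluated, i.e. the refresh time
    #"Added Refresh Date" = Table.AddColumn(Source, "refresh_date", each DateTime.LocalNow(), type datetime)
in
    #"Added Refresh Date"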

lkalawski
Memorable Member

@eric98 

Unfortunately, this is not possible at the Power BI level. However, any external ETL process can handle it.
Before you update the data in the target table, follow these steps (sketched below):
1. Compare the source and the target.
2. From the target, remove the rows that also appear in the source (matched on the three key columns).
3. Add the records from the source to the target.
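
To make steps 2 and 3 concrete, here is a rough sketch of the compare/remove/append logic, expressed in Power Query M purely for illustration with hypothetical Target and Source tables. Since Power Query cannot write the result back to the original target table, a real implementation would live in an external ETL tool or database.

let
    // Hypothetical Target (existing data) and Source (incoming data) tables
    Target = Table.FromRecords({
        [client_ID = 5637350, client_VTA = 98333253, client_NAME = "Nike", client_SCORE = 19]
    }),
    Source = Table.FromRecords({
        [client_ID = 5637350, client_VTA = 98333253, client_NAME = "Nike", client_SCORE = 20]
    }),
    KeyColumns = {"client_ID", "client_VTA", "client_NAME"},
    // Steps 1-2: drop target rows whose keys also appear in the source (left anti join)
    #"Target Without Matches" = Table.RemoveColumns(
        Table.NestedJoin(Target, KeyColumns, Source, KeyColumns, "SourceRows", JoinKind.LeftAnti),
        {"SourceRows"}),
    // Step 3: append the incoming rows so each client keeps only its newest score
    #"Updated Target" = Table.Combine({#"Target Without Matches", Source})
in
    #"Updated Target"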

lkalawski
Memorable Member

Hi @eric98 ,

Do you have any other data that would help identify the new record?
How is the score calculated? Is it a constant value or a measure?

 

If you only get these 4 columns from your source, I'm afraid you can't do it, unless the new score is always larger than the previous one. Is there any logic to this?
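
If the score did only ever increase, a plain group keeping the maximum score per client would already be enough. A minimal sketch with made-up data (as noted above, the score can also decrease here, so this shortcut does not apply to this thread):

let
    // Made-up sample rows with two scores for the same client
    Source = Table.FromRecords({
        [client_ID = 5637350, client_NAME = "Nike", client_SCORE = 19],
        [client_ID = 5637350, client_NAME = "Nike", client_SCORE = 20]
    }),
    // Keep the highest score per client; only valid if a newer score is always the larger one
    #"Max Score" = Table.Group(Source, {"client_ID", "client_NAME"}, {{"client_SCORE", each List.Max([client_SCORE]), Int64.Type}})
in
    #"Max Score"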
