
Anonymous
Not applicable

Equivalent to a new column using OVER functionality

Two columns: one is an ID column with non-unique values; the second column, NUMS, contains integers. I want to create a third column containing the maximum value of NUMS for each value of ID, without dropping any rows.

So I want to go from:

ID    NUMS
1     0
1     4
1     7
2     0
2     0

And I want to produce:

ID    NUMS    MAXNUMS
1     0       7
1     4       7
1     7       7
2     0       0
2     0       0


5 REPLIES
v-kelly-msft
Community Support

Hi @Anonymous,

Because rows 4 and 5 of the sample have the same values in every column, the two rows will be aggregated when the data is put in a table visual, so you need to create an index column first:

Go to Query Editor > Add Column > Index Column.
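If you prefer to add the index in M directly, the UI action corresponds to the line below (taken from the full script later in this thread; #"Changed Type" stands for whatever your previous step is named):

#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1)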

Then create a measure as below:

 

Measure = MAXX(FILTER(ALL('Table'), 'Table'[ID] = SELECTEDVALUE('Table'[ID])), 'Table'[NUMS])

 

And you will see:

[Screenshot: Annotation 2020-04-17 135806.png]

 

Or you can create a calculated column as below:

 

Column = CALCULATE(MAX('Table'[NUMS]), ALLEXCEPT('Table', 'Table'[ID]))

 

And you will see:

[Screenshot: Annotation 2020-04-17 140331.png]

For the related .pbix file, please click here.

 

Best Regards,
Kelly
Did I answer your question? Mark my post as a solution!
Anonymous
Not applicable

Is it possible to do the same in M, so that I can push it into a Dataflow?

Hi @Anonymous,

Use the M code below:

 

let
    // Sample data (compressed literal produced by "Enter Data")
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlQ4tACOlHSUIEwDpVgdXHImeOTMwXJGeMzEKRcLAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [ID = _t, NUMS = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"ID", Int64.Type}, {"NUMS", Int64.Type}}),
    // The index keeps otherwise-identical rows distinct
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1),
    // For each row, take the record with the largest NUMS among rows sharing its ID.
    // Note: Table.SelectRows is evaluated once per row, so this can be slow on large tables.
    #"Added Custom" = Table.AddColumn(#"Added Index", "Custom", each
        let i = [ID]
        in Table.Max(Table.SelectRows(#"Changed Type", each [ID] = i), "NUMS")[NUMS])
in
    #"Added Custom"

 

And you will see:

[Screenshot: Annotation 2020-04-17 182907.png]

 

For the related .pbix file, please click here.

 

Best Regards,
Kelly
Did I answer your question? Mark my post as a solution!
Anonymous
Not applicable

I've come up with a performant solution - the ones so far did not work well on 120k rows.  Basically, I created a (temporary) table of the grouped data, then merged this back into the main table.  Something like this:

let
    Source = whatever, // your source table, with columns ID and NUMS
    // One row per ID, holding the max of NUMS for that ID
    Grouped = Table.Group(Source, {"ID"}, {{"MAXNUMS", each List.Max([NUMS]), type number}}),
    // Join the per-ID max back onto every original row, then expand it
    Merged = Table.NestedJoin(Source, {"ID"}, Grouped, {"ID"}, "Max", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Merged, "Max", {"MAXNUMS"})
in
    Expanded

Because this uses table processing rather than row processing, it runs pretty quickly.
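To try the sketch above end-to-end, you can replace "whatever" with an inline copy of the question's sample data (a hypothetical stand-in for the real source, which the poster did not share):

Source = Table.FromRecords({
    [ID = 1, NUMS = 0], [ID = 1, NUMS = 4], [ID = 1, NUMS = 7],
    [ID = 2, NUMS = 0], [ID = 2, NUMS = 0]
})

The resulting Expanded table matches the desired output in the question: MAXNUMS is 7 for the ID 1 rows and 0 for the ID 2 rows.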

parry2k
Super User

@Anonymous add the following measure:

 

Max Nums = CALCULATE ( MAX ( Table[Nums] ), ALLEXCEPT ( Table, Table[Id] ) )


Did I answer your question? Mark my post as a solution. Proud to be a Super User!
