I am looking for a way to keep only the first two (earliest) dates for each grouping in a table like the following.
Group | Date      | Quantity
A     | 1/5/2018  | 10
A     | 3/10/2018 | 8
A     | 5/12/2018 | 15
B     | 2/20/2018 | 5
C     | 2/25/2018 | 20
C     | 7/19/2018 | 28
C     | 10/5/2018 | 15
So the query would find the two earliest dates for each group and keep only those rows. The resulting table would look like this:
Group | Date      | Quantity
A     | 1/5/2018  | 10
A     | 3/10/2018 | 8
B     | 2/20/2018 | 5
C     | 2/25/2018 | 20
C     | 7/19/2018 | 28
Any thoughts on how to do this with DAX or even M?
I find Power Query better suited to this kind of heavy lifting. I attached the PBIX below so you can see all the steps, but basically I just grouped the data by your Group column, added an index column within each group, and used that index to create a simple measure.
Here's the final table from Power Query:
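The grouping and indexing described above can be sketched in M roughly like this (a sketch only; the step names, and the assumption that the source table is called Table1 with columns Group, Date, and Quantity, are mine, so check the attached PBIX for the actual steps):

```m
let
    Source = Table1,
    // Group by "Group"; within each group, sort by Date ascending
    // and number the rows starting at 1
    Grouped = Table.Group(
        Source,
        {"Group"},
        {{"AllRows",
          each Table.AddIndexColumn(
              Table.Sort(_, {{"Date", Order.Ascending}}),
              "Index", 1, 1),
          type table}}),
    // Expand back to a flat table carrying the per-group Index
    Expanded = Table.ExpandTableColumn(Grouped, "AllRows", {"Date", "Quantity", "Index"})
in
    Expanded
```

With the per-group Index in place, rows with Index <= 2 are exactly the two earliest dates per group, which is what the measure below filters on.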
Then all you need is a simple calculate measure:
Quantity Earliest 2 = CALCULATE( SUM ( Table1[Quantity] ), Table1[Index] <= 2)
which gives you this:
For me (and this is just my opinion) that way is much simpler, but to each their own!
PBIX File:
Hi @Anonymous
In DAX you can do something like:
GENERATE (
    VALUES ( Data[Iteam] ),
    CALCULATETABLE (
        TOPN (
            2,
            SELECTCOLUMNS ( Data, "Item", Data[Iteam], "Date", Data[Date], "Value", Data[Value] ),
            [Date], ASC
        )
    )
)
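Adapted to the asker's column names (Group, Date, Quantity), the same GENERATE/TOPN pattern might look like this as a calculated table (a sketch; the table name Data and the result name First2PerGroup are assumptions):

```dax
First2PerGroup =
GENERATE (
    -- One row per distinct group
    VALUES ( Data[Group] ),
    -- Context transition makes TOPN see only the current group's rows
    CALCULATETABLE (
        TOPN (
            2,
            SELECTCOLUMNS ( Data, "Date", Data[Date], "Quantity", Data[Quantity] ),
            [Date], ASC
        )
    )
)
```

One caveat worth knowing: TOPN can return more than two rows per group if the second-earliest date is tied.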
Did I answer your question? Mark my post as a solution!
Proud to be a Datanaut!