Capstone
Resolver I

latest transaction rows

 

 

[Screenshot: Capture.PNG — sample daily transactions, with the rows to isolate highlighted in yellow]

 

I have the following dataset of daily transactions for each location per customer. Each day has multiple locations and they won't be grouped, but for ease of understanding I have shown only one location, sorted in descending order of the StartTime column. For each day I want to pick out the section highlighted in yellow, i.e. the latest block of transaction records per location before the CustomerID changes. I cannot simply group by CustomerID, because on 18/10/2018 there are two transactions from Customer 1000 sitting between the transactions by Customer 1001: after those two transactions by Customer 1000 there are three transactions by Customer 1001, and those three are what I want to isolate. I hope that is clear.

The measure I'm looking for is the average duration of those latest transactions, i.e. the average of (EndTime - StartTime). The measure is the easy bit, but isolating the records is proving difficult. I hope someone has a solution!

1 ACCEPTED SOLUTION

@Capstone

 

You are going to need to modify your table a little bit and add an additional 'flag' column in order to make the DAX really easy.

 

The modification is done in Power Query, as shown below. You just need to change the Source step to point to your real source:

 

let
    // Sample data; change this Source step to point at your real source
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldRbisMwDAXQrQz5LlRXDz+0ldL9b6MxM32MBKb6MijixI6vcrsdoCvGlQnjuBxGhHPBc3k++mF2olQyHPfL1wSt/v8luPUCgenKSdW2I8KuMVwkqTIqRHfWpPKsHKQ5LKtUIcypZbVyI5D1+ZPKFYJdR1alQsBlZlV3RAgByCVe0qlagZjOcV/kvE1WEIYjmdimO7R3J4kkvYPZP4Dfw8Z2c2isaC8A6mSJpAJw/hJaIlEAsDYcSS4AtGY5kvIN0P/eJ2uUo6k7YQaB1yQn1CoE1iRHFa1CkL8C9lbpjMP9AQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Date = _t, LocationID = _t, CustomerID = _t, StartTime = _t, EndTime = _t]),

    ChangedType = Table.TransformColumnTypes(Source, {{"Date", type date}, {"LocationID", Int64.Type}, {"CustomerID", Int64.Type}, {"StartTime", type datetime}, {"EndTime", type datetime}}),

    // Sort so that each location's rows are in StartTime order
    SortTable = Table.Sort(
                    ChangedType,
                    {
                        {"LocationID", Order.Ascending},
                        {"StartTime", Order.Ascending}
                    } ),

    // GroupKind.Local starts a new group every time the key changes, so each
    // consecutive run of the same customer at a location becomes its own nested table
    GroupTable = Table.Group(
                    SortTable,
                    {"LocationID", "CustomerID"},
                    {
                        //{ "MaxStartTime", each List.Max( [StartTime] ), type datetime },
                        {"FullTable", each _, type table}
                    },
                    GroupKind.Local
    ),

    // Latest StartTime per location across all runs
    ReGroupTable = Table.Group( GroupTable, {"LocationID"}, {"MaxDate", each List.Max( Table.Combine( [FullTable] )[StartTime] ), type datetime}, GroupKind.Global ),

    MergeTables = Table.NestedJoin( GroupTable, {"LocationID"}, ReGroupTable, {"LocationID"}, "MaxDateTable" ),

    // Flag the run whose latest StartTime equals the location's overall latest StartTime
    AddFlag = Table.AddColumn( MergeTables, "Flag", each List.Max( [FullTable][StartTime] ) = List.Max( [MaxDateTable][MaxDate] ), type logical ),

    SelectCols = Table.SelectColumns( AddFlag, {"FullTable", "Flag"} ),

    ExpandedFullTable = Table.ExpandTableColumn( SelectCols, "FullTable", {"Date", "LocationID", "CustomerID", "StartTime", "EndTime"}, {"Date", "LocationID", "CustomerID", "StartTime", "EndTime"} ),

    Final = Table.TransformColumnTypes( ExpandedFullTable, {{"Date", type date}, {"LocationID", Int64.Type}, {"CustomerID", Int64.Type}, {"StartTime", type datetime}, {"EndTime", type datetime}} )
in
    Final
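
The heavy lifting is done by GroupKind.Local in the GroupTable step: instead of merging every row with the same key, it starts a new group each time the key value changes, so each consecutive run of a customer at a location becomes its own nested table. Here is a minimal, self-contained sketch of the effect (the sample rows below are made up purely for illustration):

let
    SampleRows = Table.FromRecords( {
        [CustomerID = 1000, StartTime = #datetime(2018, 10, 18, 8, 0, 0)],
        [CustomerID = 1000, StartTime = #datetime(2018, 10, 18, 8, 30, 0)],
        [CustomerID = 1001, StartTime = #datetime(2018, 10, 18, 9, 0, 0)],
        [CustomerID = 1000, StartTime = #datetime(2018, 10, 18, 10, 0, 0)],
        [CustomerID = 1001, StartTime = #datetime(2018, 10, 18, 11, 0, 0)]
    } ),
    // GroupKind.Local yields 4 groups (1000, 1001, 1000, 1001);
    // the default GroupKind.Global would yield only 2 (1000, 1001)
    LocalGroups = Table.Group(
        SampleRows,
        {"CustomerID"},
        { {"Rows", each Table.RowCount(_), Int64.Type} },
        GroupKind.Local
    )
in
    LocalGroups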

 

 

 

[Screenshot: Capture.PNG — resulting table in Power Query with the Flag column added]

Afterwards, I normalized the LocationID and CustomerID and built the model below:

 

[Screenshot: Capture.PNG — data model]

 

 

Then the DAX becomes really easy:

 

 

Average Duration ( Minutes ) =
AVERAGEX (
    CALCULATETABLE ( Transactions, Transactions[Flag] = TRUE () ),
    ( Transactions[EndTime] - Transactions[StartTime] )
        * 1440
)
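
Subtracting two datetime columns gives a duration in days, which is why the result is multiplied by 1440 to convert it to minutes. If you prefer, here is a sketch of an equivalent measure using DATEDIFF (a variant, not the measure above; note that DATEDIFF counts whole minute boundaries crossed, so it can differ slightly for partial minutes):

Average Duration ( Minutes ) Alt =
AVERAGEX (
    CALCULATETABLE ( Transactions, Transactions[Flag] = TRUE () ),
    DATEDIFF ( Transactions[StartTime], Transactions[EndTime], MINUTE )
)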

 

[Screenshot: Capture.PNG — Average Duration ( Minutes ) measure result]

 

Let me know if you need me to share the .pbix 

 


 


Did I answer your question correctly? Mark my answer as a solution!


Proud to be a Datanaut!  


4 REPLIES
LivioLanzo
Solution Sage

Hi @Capstone

 

Could you share a version I could easily copy / paste? Thanks!

 


 


Did I answer your question correctly? Mark my answer as a solution!


Proud to be a Datanaut!  


Thank you, that was brilliant! I applied the solution to my real data and it worked like a charm. The only addition I had to make was in the MergeTables step, where I had to add "Date" along with "LocationID":

 

MergeTables = Table.NestedJoin( GroupTable, {"LocationID","Date"}, ReGroupTable, {"LocationID","Date"}, "MaxDateTable" )
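
For anyone adapting this: for the join on "Date" to work, "Date" also needs to exist in both grouped tables, so the GroupTable and ReGroupTable steps presumably need "Date" among their grouping keys as well. A sketch of what those steps might look like with that change (an assumption on my part, not shown above):

    GroupTable = Table.Group(
                    SortTable,
                    {"Date", "LocationID", "CustomerID"},
                    { {"FullTable", each _, type table} },
                    GroupKind.Local
    ),

    ReGroupTable = Table.Group( GroupTable, {"LocationID", "Date"}, {"MaxDate", each List.Max( Table.Combine( [FullTable] )[StartTime] ), type datetime}, GroupKind.Global ),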

I definitely learnt a few tricks along the way.
