Hi,
I've been searching for the answer with no luck. What I am trying to do is reference a previous row using a DAX calculation. I have a list of call receive times and call end times by operator. What I am trying to determine is which calls were received before the previous call ended.
For example, from the data below, operator 277BRANDON received a call on May 7 at 10:18:43 AM that ended at 11:02:49 AM; the next call for 277BRANDON was received at 10:50:31 AM the same day. I need to be able to flag the instances where this occurs.
Call Date | Receive Time | End Time | Operator |
---|---|---|---|
Friday, May 04, 2018 | 4:23:11 PM | 4:47:38 PM | 277BRANDON |
Friday, May 04, 2018 | 4:47:19 PM | 5:38:12 PM | 277BRANDON |
Friday, May 04, 2018 | 7:34:28 PM | 8:53:04 PM | 277BRANDON |
Friday, May 04, 2018 | 9:03:45 PM | 9:33:09 PM | 277BRANDON |
Monday, May 07, 2018 | 6:55:40 AM | 8:17:26 AM | 277BRANDON |
Monday, May 07, 2018 | 8:11:48 AM | 9:04:02 AM | 277BRANDON |
Monday, May 07, 2018 | 9:08:33 AM | 10:05:44 AM | 277BRANDON |
Monday, May 07, 2018 | 10:18:43 AM | 11:02:49 AM | 277BRANDON |
Monday, May 07, 2018 | 10:50:31 AM | 11:46:37 AM | 277BRANDON |
Monday, May 07, 2018 | 11:39:39 AM | 12:17:21 PM | 277BRANDON |
Monday, May 07, 2018 | 12:19:16 PM | 12:28:07 PM | 277BRANDON |
Monday, May 07, 2018 | 12:46:18 PM | 1:30:32 PM | 277BRANDON |
Monday, May 07, 2018 | 2:18:51 PM | 2:27:32 PM | 277BRANDON |
Monday, May 07, 2018 | 2:24:59 PM | 2:38:50 PM | 277BRANDON |
Monday, May 07, 2018 | 4:14:49 PM | 4:42:58 PM | 277BRANDON |
Thanks,
You need to add an index column then use the following
Column =
VAR _EndTimeLookup =
    LOOKUPVALUE ( Table1[End Time], Table1[Index], Table1[Index] - 1 )
RETURN
    IF (
        NOT ( ISBLANK ( _EndTimeLookup ) ),
        IF ( Table1[Receive Time] < _EndTimeLookup, 1, 0 )
    )
This will create a column that returns 1 if the row should be flagged and 0 if not.
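To make the previous-row lookup concrete outside of DAX, here is a minimal Python sketch of the same logic (illustrative only, not part of the original answer): for each call, sorted by receive time, compare its receive time against the end time of the row before it. The sample rows are the two overlapping May 7 calls from the question.

```python
# Flag a call whose receive time is earlier than the end time of the
# preceding call (the same comparison the DAX column performs).
from datetime import datetime

calls = [  # (receive, end), already sorted by receive time
    (datetime(2018, 5, 7, 10, 18, 43), datetime(2018, 5, 7, 11, 2, 49)),
    (datetime(2018, 5, 7, 10, 50, 31), datetime(2018, 5, 7, 11, 46, 37)),
]

def flag_overlaps(rows):
    """Return 1 for each row whose receive time precedes the previous row's end time."""
    flags = []
    for i, (receive, _end) in enumerate(rows):
        if i == 0:
            flags.append(0)  # no previous row, so nothing to compare against
        else:
            flags.append(1 if receive < rows[i - 1][1] else 0)
    return flags

print(flag_overlaps(calls))  # the second call starts before the first one ends
```

Note that, like the DAX version, this assumes the rows are already ordered the way the index column orders them; if calls from several operators are mixed, you would sort (or group) by operator first.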
If you prefer, you can also do it in Power Query, which is how I prefer to handle items like this if I don't need it to be dynamic:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldMxT8QwDAXgvxJ17mA7dpO87RBi64FYTzecxMICEtv9e9xLeywF4dHD9/yaJqfT8PT1/na5jmm+XBPpmIS4DuOgkAzm9DLfBi3ItQ9SysPr4fj4fBzO4x/eCbdOzDFYIt73eYV1ZYVlkEZ8A2WoddKQ3bddP39+/Phy9xPMoJQOfT8XyNSH/3knDK2deBkFScQ7qd66EyaQt9FIgBuu0C2BfT+0BROMkPmeoBNyCSUwsp/9upXldowc+Q2LaeCpG5+kgkowwXvzepO8kH/S/k38JUCWc7StNaSEvShsu3zLSzCKeAXr8ue2lyiwnZd4/gY=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [#"Call Date" = _t, #"Receive Time" = _t, #"End Time" = _t, Operator = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Call Date", type date}, {"Receive Time", type time}, {"End Time", type time}, {"Operator", type text}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1),
    #"Added Custom" = Table.AddColumn(#"Added Index", "Last Row End", each try #"Added Index"{[Index]-1}[End Time] otherwise null, Time.Type),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Filter", each if [Last Row End] >= [Receive Time] and [Last Row End] <> null then 1 else 0, Int64.Type),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Last Row End"})
in
    #"Removed Columns"
Thank you so much, this worked!!
is this recommended for large tables?
I have the same issue; it's taking absolutely ages to load.
@ovetteabejuela, @doubleclick This is definitely not the best way for large tables. Using a self-merge instead of a lookup should be better, though you might still run into memory issues if the table is too large.
For example,
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldMxT8QwDAXgvxJ17mA7dpO87RBi64FYTzecxMICEtv9e9xLeywF4dHD9/yaJqfT8PT1/na5jmm+XBPpmIS4DuOgkAzm9DLfBi3ItQ9SysPr4fj4fBzO4x/eCbdOzDFYIt73eYV1ZYVlkEZ8A2WoddKQ3bddP39+/Phy9xPMoJQOfT8XyNSH/3knDK2deBkFScQ7qd66EyaQt9FIgBuu0C2BfT+0BROMkPmeoBNyCSUwsp/9upXldowc+Q2LaeCpG5+kgkowwXvzepO8kH/S/k38JUCWc7StNaSEvShsu3zLSzCKeAXr8ue2lyiwnZd4/gY=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [#"Call Date" = _t, #"Receive Time" = _t, #"End Time" = _t, Operator = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Call Date", type date}, {"Receive Time", type time}, {"End Time", type time}, {"Operator", type text}}),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index0", 0, 1),
#"Added Index1" = Table.Buffer(Table.AddIndexColumn(#"Added Index", "Index1", 1, 1, Int64.Type)),
#"Merged Queries" = Table.NestedJoin(#"Added Index1", {"Index0"}, #"Added Index1", {"Index1"}, "Merge", JoinKind.LeftOuter),
#"Expanded Added Index1" = Table.ExpandTableColumn(#"Merged Queries", "Merge", {"End Time"}, {"Last End Time"}),
#"Added Custom" = Table.AddColumn(#"Expanded Added Index1", "Filter", each [Last End Time] <> null and [Last End Time] >= [Receive Time], type logical)
in
#"Added Custom"
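The self-merge idea above can be sketched in plain Python (a rough illustration of the join, not the original M code): build a lookup keyed on `Index0 + 1`, which plays the role of the shifted `Index1` column, so each row picks up the previous row's end time in one pass instead of a per-row positional lookup. The sample rows are hypothetical values based on the question's data.

```python
# Emulate the left outer self-merge: join each row's Index0 against the
# shifted index (Index0 + 1) of the same table to fetch the previous End time.
from datetime import time

rows = [
    {"Index0": 0, "Receive": time(10, 18, 43), "End": time(11, 2, 49)},
    {"Index0": 1, "Receive": time(10, 50, 31), "End": time(11, 46, 37)},
    {"Index0": 2, "Receive": time(12, 19, 16), "End": time(12, 28, 7)},
]

# Right side of the self-merge: the previous row's End, keyed by Index0 + 1.
prev_end_by_index1 = {r["Index0"] + 1: r["End"] for r in rows}

for r in rows:
    last_end = prev_end_by_index1.get(r["Index0"])  # left outer join: None when unmatched
    r["Filter"] = last_end is not None and last_end >= r["Receive"]

print([r["Filter"] for r in rows])  # → [False, True, False]
```

A dictionary (hash) lookup here mirrors why the merge scales better than the `#"Added Index"{[Index]-1}` pattern: each previous-row fetch is constant time rather than a positional probe into an unbuffered table.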