Hi,
I am new to Power BI. I am not sure whether I need DAX or M for what I need.
I would like to create a cycle count in a new column based on the following rules:
*for every device
*a cycle changes when State reaches "1"
*every device gets its own cycle counts
*cycles increment by 1
*cycles must be in chronological order per "Time", for every device
I tried the following without success (the Cycles column is the result I want):
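The rules above amount to a per-device running count of State transitions back to 1. As an illustration of the logic only (plain Python, not Power BI code; the tuple layout is my own assumption), it looks like this:

```python
def cycle_counts(rows):
    """rows: (device, state) pairs, assumed pre-sorted by device then Time,
    as in the sample table. Returns the Cycles value for each row."""
    prev_state = {}   # device -> State of that device's previous row
    cycle = {}        # device -> current cycle number
    out = []
    for device, state in rows:
        if device not in cycle:
            cycle[device] = 1          # a device's first row opens cycle 1
        elif state == 1 and prev_state[device] != 1:
            cycle[device] += 1         # State returned to 1 -> new cycle
        prev_state[device] = state
        out.append(cycle[device])
    return out
```

This is only a sketch of the intended result, useful for checking any DAX or M solution against the sample data.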
Time | device | m1 | m2 | m3 | State | Cycles |
9/1/2020 0:01 | 4 | -0.103 | 0 | 100 | 1 | 1 |
9/1/2020 12:56 | 4 | -0.103 | 0 | 100 | 1 | 1 |
9/1/2020 13:26 | 4 | -0.103 | 0 | 100 | 1 | 1 |
9/1/2020 13:26 | 4 | -0.594 | 0 | 100 | 4 | 1 |
9/1/2020 13:31 | 4 | -0.594 | -0.1 | 100 | 4 | 1 |
9/1/2020 13:36 | 4 | -0.654 | -15 | 99.9 | 4 | 1 |
9/1/2020 20:31 | 4 | 0.02 | -24.7 | 77.3 | 4 | 1 |
9/1/2020 20:33 | 4 | 16.927 | 0 | 77.8 | 3 | 1 |
9/1/2020 20:38 | 4 | 17.407 | 1.5 | 79.1 | 3 | 1 |
9/1/2020 20:58 | 4 | 17.331 | 7.3 | 84.4 | 3 | 1 |
9/1/2020 21:23 | 4 | 17.282 | 14.5 | 91.1 | 3 | 1 |
9/1/2020 21:27 | 4 | 17.284 | 15.9 | 92.4 | 3 | 1 |
9/1/2020 21:28 | 4 | 15.926 | 16.2 | 92.6 | 2 | 1 |
9/1/2020 23:23 | 4 | 2.269 | 25.6 | 101.3 | 2 | 1 |
9/1/2020 23:43 | 4 | 1.862 | 26.4 | 101.9 | 2 | 1 |
9/1/2020 23:44 | 4 | -0.1 | 0 | 100 | 1 | 2 |
9/2/2020 0:14 | 4 | -0.1 | 0 | 100 | 1 | 2 |
9/2/2020 7:19 | 4 | -0.102 | 0 | 100 | 1 | 2 |
9/2/2020 7:24 | 4 | -0.102 | 0 | 100 | 1 | 2 |
9/2/2020 7:26 | 4 | -0.175 | 0 | 100 | 4 | 2 |
9/2/2020 7:31 | 4 | -1.253 | -0.1 | 99.9 | 4 | 2 |
9/2/2020 20:01 | 4 | -0.025 | -5 | 79.7 | 4 | 2 |
9/2/2020 20:06 | 4 | -0.025 | -15 | 79.7 | 4 | 2 |
9/2/2020 20:11 | 4 | -0.025 | -22.2 | 79.7 | 4 | 2 |
9/2/2020 20:15 | 4 | 7.011 | 0 | 80.1 | 3 | 2 |
9/2/2020 20:20 | 4 | 17.448 | 1.5 | 81.5 | 3 | 2 |
9/2/2020 20:35 | 4 | 17.376 | 5.9 | 85.5 | 3 | 2 |
9/2/2020 20:55 | 4 | 17.344 | 11.6 | 90.8 | 3 | 2 |
9/2/2020 20:58 | 4 | 16.897 | 12.5 | 91.6 | 2 | 2 |
9/2/2020 21:18 | 4 | 9.272 | 15.6 | 94.4 | 2 | 2 |
9/2/2020 23:08 | 4 | 2.274 | 22.6 | 100.8 | 2 | 2 |
9/2/2020 23:11 | 4 | 1.002 | 0 | 100 | 1 | 3 |
9/3/2020 9:35 | 4 | -0.102 | 0 | 100 | 1 | 3 |
9/3/2020 9:39 | 4 | -1.319 | 0 | 100 | 4 | 3 |
9/3/2020 9:49 | 4 | -0.661 | -0.1 | 99.9 | 4 | 3 |
9/3/2020 14:09 | 4 | -3.817 | -15.4 | 85.9 | 4 | 3 |
9/3/2020 20:34 | 4 | -0.294 | -33.2 | 69.6 | 4 | 3 |
9/3/2020 20:36 | 4 | 17.005 | 0 | 70.1 | 3 | 3 |
9/3/2020 21:21 | 4 | 17.305 | 13.1 | 82 | 3 | 3 |
9/3/2020 21:56 | 4 | 17.398 | 23.2 | 91.3 | 3 | 3 |
9/3/2020 21:57 | 4 | 15.347 | 23.4 | 91.5 | 2 | 3 |
9/3/2020 22:57 | 4 | 4.322 | 30.8 | 98.3 | 2 | 3 |
9/4/2020 0:32 | 4 | 2.248 | 34.9 | 102.1 | 2 | 3 |
9/4/2020 0:34 | 4 | 0.622 | 0 | 100 | 1 | 4 |
9/4/2020 7:14 | 4 | -0.103 | 0 | 100 | 1 | 4 |
9/4/2020 7:18 | 4 | -0.716 | 0 | 100 | 4 | 4 |
9/4/2020 7:23 | 4 | -1.818 | -0.2 | 99.9 | 4 | 4 |
9/4/2020 12:23 | 4 | -7.451 | -7 | 93.6 | 4 | 4 |
9/4/2020 13:58 | 4 | -3.936 | -14.9 | 86.3 | 4 | 4 |
9/4/2020 18:48 | 4 | -0.704 | -26.9 | 75.3 | 4 | 4 |
9/4/2020 18:51 | 4 | 11.033 | 0 | 75.8 | 3 | 4 |
9/4/2020 19:16 | 4 | 17.268 | 7.3 | 82.5 | 3 | 4 |
9/4/2020 19:46 | 4 | 17.223 | 15.9 | 90.4 | 3 | 4 |
9/4/2020 19:51 | 4 | 16.994 | 17.2 | 91.6 | 2 | 4 |
9/4/2020 22:11 | 4 | 2.249 | 27.9 | 101.4 | 2 | 4 |
9/4/2020 22:12 | 4 | -0.102 | 0 | 100 | 1 | 5 |
9/4/2020 23:52 | 4 | -0.104 | 0 | 100 | 1 | 5 |
9/1/2020 0:01 | 7 | -0.103 | 0 | 100 | 1 | 1 |
9/1/2020 12:56 | 7 | -0.103 | 0 | 100 | 1 | 1 |
9/1/2020 13:26 | 7 | -0.103 | 0 | 100 | 1 | 1 |
9/1/2020 13:26 | 7 | -0.594 | 0 | 100 | 4 | 1 |
9/1/2020 13:31 | 7 | -0.594 | -0.1 | 100 | 4 | 1 |
9/1/2020 13:36 | 7 | -0.654 | -15 | 99.9 | 4 | 1 |
9/1/2020 20:31 | 7 | 0.02 | -24.7 | 77.3 | 4 | 1 |
9/1/2020 20:33 | 7 | 16.927 | 0 | 77.8 | 3 | 1 |
9/1/2020 20:38 | 7 | 17.407 | 1.5 | 79.1 | 3 | 1 |
9/1/2020 20:58 | 7 | 17.331 | 7.3 | 84.4 | 3 | 1 |
9/1/2020 21:23 | 7 | 17.282 | 14.5 | 91.1 | 3 | 1 |
9/1/2020 21:27 | 7 | 17.284 | 15.9 | 92.4 | 3 | 1 |
9/1/2020 21:28 | 7 | 15.926 | 16.2 | 92.6 | 2 | 1 |
9/1/2020 23:23 | 7 | 2.269 | 25.6 | 101.3 | 2 | 1 |
9/1/2020 23:43 | 7 | 1.862 | 26.4 | 101.9 | 2 | 1 |
9/1/2020 23:44 | 7 | -0.1 | 0 | 100 | 1 | 2 |
@JoeJoe666 - Ah, you want Cthulhu - https://community.powerbi.com/t5/Quick-Measures-Gallery/Cthulhu/m-p/509739#M211
Actually, no, not Cthulhu; you can do it with 2 columns. I am still working on the single-column version. The PBIX is attached below my signature; the table is Table (13).
Cycle Change =
VAR __Table = FILTER('Table (13)',[device]=EARLIER([device]) && [Time]<=EARLIER([Time]))
VAR __Previous = MAXX(FILTER(__Table,[Time]<EARLIER([Time])),[Time])
VAR __PreviousState = MAXX(FILTER(__Table,[Time]=__Previous),[State])
RETURN
IF(([State]=1 && __PreviousState<>1),1,0)
Cycles = SUMX(FILTER('Table (13)',[device]=EARLIER([device]) && [Time]<=EARLIER([Time])),[Cycle Change])
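The 2-column pattern here is a 0/1 change flag followed by a running sum of that flag. As a hedged illustration outside DAX (plain Python, assuming a single device's State values already in Time order, and treating the first row as a cycle start):

```python
def cycle_change(states):
    # Per-row flag: 1 on the first row, or when State becomes 1
    # after a non-1 value; 0 otherwise (mirrors the Cycle Change column).
    flags, prev = [], None
    for s in states:
        flags.append(1 if prev is None or (s == 1 and prev != 1) else 0)
        prev = s
    return flags

def cycles(states):
    # Running sum of the flags (mirrors the Cycles column).
    total, out = 0, []
    for f in cycle_change(states):
        total += f
        out.append(total)
    return out
```

Splitting the work this way means the expensive previous-row lookup happens once per row, and the second column is just a cumulative sum.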
OK, got this into a single column:
Single Column non working =
VAR __BaseTable = 'Table (13)'
VAR __Table =
ADDCOLUMNS(
ADDCOLUMNS(
ADDCOLUMNS(
FILTER(__BaseTable,[device]=EARLIER([device]) && [time]<=EARLIER([Time])),
"__Previous",MAXX(FILTER(__BaseTable,[time]<EARLIER([Time])),[Time])
),
"__PreviousState",MAXX(FILTER(__BaseTable,[Time]=[__Previous]),[State])
),
"__CycleChange",IF(([State]=1 && [__PreviousState]<>1),1,0)
)
RETURN
SUMX(__Table,[__CycleChange])
Updated the PBIX.
@JoeJoe666 - Just in case you missed it, I updated my original response with a DAX solution. Two column and single column solutions.
Here's the M code to do it:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("pZdbcqQwDEW3MsV34kjym62ksv9tjCw/iW0SOh80dMFBtnx1sT4/D4gfgB8EBP8ATsDj7TB8vINC0HwBfCDIrxxfbxcG6bTuKaNP+iNjo7kwZsNo/M6kgD9jQyhn5QIt/8ao4g4j6NFAASWIjPJ89l7pO6zddCqSLxNjKPBJ76BQIa8MJAhVGqKPMr8dZgdMy3jz2IJRZovhSbpjFNLc0Ei4iDfhmPMjJxdWUhjpPl4bJj8uC8+5ocylf7TkdB8nKXIpDll5HgFlmjvOtPmp4NJD5GR8iYt3nBl0PKmYGkGtwvAh4U+MAyG6+pkh8wIz1qS3U32tmF5fqMjqPq2hUiaOrj4DlGK9F/H6O8pNFP4CwzkYkajpJ9CWm14B1sUK0BS/Yqj5CtelCa0uQz7tMG07pn2aZ66TYG8xO2IiRkQRfITmHkuu1ZdTIYp7UK3nWl8zhydWLirylOtTwmX7WGP6hIpx2r2cqZRlHuaOayuHChYq1g3RRZGxJ3Kj/CXTKoxdQsrtqvwVY4aqdA6Xyp84NCc0TquAPmtYkhfsHkwKGUqa8qdMa1Gxi8rdgfUmawSg1rXvMp4hNmDskBYItQBi/jvKDqF0lJXNQ4zZfrdc+0xYlrHPnMmcLfqYOeqcUZrkoaynGJrbd8w0D9bU5Sglqo2knvWi8I6rS8BLTrO0zIT4q90vNjpLJnTGo5vkuGLah48lHOQFSSZXOU4cb946x3ZlRcYpp1E3Sc2U7vbBIo6ir3fMKQyubXVmMJxmnBrIBX9pE+jtLWibHtkKdM0jQ9XlZiieOMiRXHqybHaomeoKMyMmCaqbFmiblhXXx8hTivUFx8VUJ45V3FwuyVF2G77IEZurLrkq443V2Ynhlbsw5ob51hT4vYa3TcEjpmzw/8A8agouzO+bgoI9bQpkU/+8KfDHC02BP15qCgr2tCko2OOmoHEPmwJ/vNYU+OO1piAn8XlTUHU8qZj3OV//AQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Time = _t, device = _t, m1 = _t, m2 = _t, m3 = _t, State = _t, Cycles = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Time", type datetime}, {"device", Int64.Type}, {"m1", type number}, {"m2", type number}, {"m3", type number}, {"State", Int64.Type}, {"Cycles", Int64.Type}}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Cycles"}),
#"Sorted Rows" = Table.Sort(#"Removed Columns",{{"device", Order.Ascending}, {"Time", Order.Ascending}}),
#"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 0, 1, Int64.Type),
#"Reordered Columns" = Table.ReorderColumns(#"Added Index",{"Index", "Time", "device", "m1", "m2", "m3", "State"}),
#"Added Custom" = Table.AddColumn(#"Reordered Columns", "PrevState",
each
let
PrevRowState = List.SingleOrDefault(
Table.SelectRows(#"Reordered Columns",
(r) => r[Index] = [Index] - 1 and r[device] = [device])[State],
null
)
in
PrevRowState
),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "ShouldAddOne",
each
if [PrevState] is null or ([PrevState] <> 1 and [State] = 1)
then 1 else 0
),
#"Added Custom2" = Table.AddColumn(#"Added Custom1", "Cycle",
each
// this is just a cumulative sum of ShouldAddOne
List.Sum(
Table.SelectRows(#"Added Custom1",
(r) => r[Index] <= [Index] and r[device] = [device]
)[ShouldAddOne]
)
),
#"Removed Columns1" = Table.RemoveColumns(#"Added Custom2",{"PrevState", "ShouldAddOne"})
in
#"Removed Columns1"
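Note that this M script, like the DAX above, looks up previous rows with a per-row filter (Table.SelectRows inside Table.AddColumn), which rescans the table for every row, roughly O(n²). The same result can be produced in a single linear pass; here is a hedged Python sketch of that idea (not M code, and the pair layout is my own assumption):

```python
from itertools import groupby
from operator import itemgetter

def cycles_one_pass(rows):
    """rows: (device, state) pairs pre-sorted by device then Time.
    One linear scan, with no per-row table filtering."""
    out = []
    for _, group in groupby(rows, key=itemgetter(0)):
        count, prev = 0, None
        for _, state in group:
            # New cycle on the device's first row, or when State
            # returns to 1 after a non-1 value.
            if prev is None or (state == 1 and prev != 1):
                count += 1
            prev = state
            out.append(count)
    return out
```

At 400,000+ rows the difference between one scan and a scan per row is exactly the kind of thing that turns minutes into seconds.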
@daxer-almighty Thank you. However, I am not sure how even to copy and paste this into M; I will try to learn it. The DAX solution is correct, but it fails for me because of my 400,000+ line items. Would this give me the same memory error?
@JoeJoe666 - Here is one improvement to the column that will hopefully help; I had missed a filter. Let me see what other tricks I can find to optimize the calculation, but try this in the meantime. There is a lot going on with this thing; what you are trying to accomplish is not pretty. For the Power Query solution, you open the Advanced Editor and paste in the code, but that can be a little hairy if you have never done it before. Also, the 2-column approach should require a lot less processing.
Cycles Count Column =
VAR __BaseTable = 'Table'
VAR __Time = [Time]
VAR __device = [device]
VAR __Table =
ADDCOLUMNS(
ADDCOLUMNS(
ADDCOLUMNS(
FILTER(__BaseTable,[device]=__device && [time]<=__Time),
"__Previous",MAXX(FILTER(__BaseTable,[device]=__device && [time]<EARLIER([Time])),[Time])
),
"__PreviousState",MAXX(FILTER(__BaseTable,[device]=__device && [Time]=[__Previous]),[State])
),
"__CycleChange",IF(([State]=1 && [__PreviousState]<>1),1,0)
)
RETURN
SUMX(__Table,[__CycleChange])
@JoeJoe666 - OK here is another improvement I think that should run much faster
Cycles Count Column =
VAR __Time = [Time]
VAR __device = [device]
VAR __WorkingTable = FILTER('Table',[device]=__device && [time]<=__Time)
VAR __Table =
ADDCOLUMNS(
ADDCOLUMNS(
ADDCOLUMNS(
__WorkingTable,
"__Previous",MAXX(FILTER(__WorkingTable,[time]<EARLIER([Time])),[Time])
),
"__PreviousState",MAXX(FILTER(__WorkingTable,[Time]=[__Previous]),[State])
),
"__CycleChange",IF(([State]=1 && [__PreviousState]<>1),1,0)
)
RETURN
SUMX(__Table,[__CycleChange])
I tried the 2-column approach, and the 1st column never even comes back; same memory issue. One thing: my data always arrives in chronological order by the Time column. Would it be less processing if [Time]<EARLIER([Time]) were not required? My table is pre-sorted by Time and device.
@JoeJoe666 - Well, you could try adding an Index to the table in Power Query. Then you could use this:
1 column:
Cycles Count Column 1 =
VAR __Index = [Index]
VAR __device = [device]
VAR __WorkingTable = FILTER('Table',[device]=__device && [Index]<=__Index)
VAR __Table =
ADDCOLUMNS(
ADDCOLUMNS(
ADDCOLUMNS(
__WorkingTable,
"__Previous",MAXX(FILTER(__WorkingTable,[time]<EARLIER([Time])),[Time])
),
"__PreviousState",MAXX(FILTER(__WorkingTable,[Time]=[__Previous]),[State])
),
"__CycleChange",IF(([State]=1 && [__PreviousState]<>1),1,0)
)
RETURN
SUMX(__Table,[__CycleChange])
2 column:
Cycle Change Step 1 =
VAR __Table = FILTER('Table',[device]=EARLIER([device]) && [Index]<=EARLIER([Index]))
VAR __Previous = MAXX(FILTER(__Table,[Index]<EARLIER([Index])),[Index])
VAR __PreviousState = MAXX(FILTER(__Table,[Index]=__Previous),[State])
RETURN
IF(([State]=1 && __PreviousState<>1),1,0)
Cycles 2 = SUMX(FILTER('Table',[device]=EARLIER([device]) && [Index]<=EARLIER([Index])),[Cycle Change Step 1])
I'm working through another method.
@JoeJoe666 - OK, here is another method for the 2-step process that should perform better:
Cycle Change Step 1a =
VAR __device = [device]
VAR __Index = [Index]
VAR __Previous = CALCULATE(MAX([Index]),FILTER('Table',[device]=__device && [Index]<__Index))
VAR __PreviousState = CALCULATE(MAX([State]),FILTER('Table',[Index]=__Previous))
RETURN
IF(([State]=1 && __PreviousState<>1),1,0)
Cycles 2a =
VAR __device = [device]
VAR __Index = [Index]
RETURN
CALCULATE(SUM([Cycle Change Step 1a]), FILTER('Table',[device]=__device && [Index]<=__Index))
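The reason the Index column helps is that "previous row of the same device" can be resolved by remembering the last row seen per device during a single ordered pass, instead of filtering on Time. An illustrative Python sketch (the dict layout is my own assumption, not Power BI code):

```python
def add_prev_state(rows):
    """rows: dicts with 'Index', 'device', 'State', in Index order.
    Returns {Index: State of the same device's previous row, or None}."""
    last_seen = {}    # device -> State of its most recent row
    prev_state = {}
    for r in rows:
        prev_state[r["Index"]] = last_seen.get(r["device"])
        last_seen[r["device"]] = r["State"]
    return prev_state
```

Each row then only needs a constant-time dictionary lookup, which is the behavior the Index-based DAX above is trying to approximate.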
@JoeJoe666 - Just following up, did you ever get this operational?
@Greg_Deckler - Thank you so much. I would not have figured this out at all.
My data has over 400,000 rows. I tried this, and Power BI keeps saying "working on it"; then 10-15 minutes later it says "There's not enough memory to complete this operation. Please try this later...." I have 32 GB of memory.