I'm quite new to Power BI and coding in general, and I'm completely stumped. I'm hoping I can get some help to put me on the correct path.
I'm working with a SQL database that contains alarm information. One of the requested KPIs is "alarm flooding".
An alarm flood has been defined by ANSI/ISA 18.2 as being 10 or more annunciated alarms in any 10-minute period.
I am trying to report the time an alarm flood begins, the end time, and the duration if the flood lasts longer than 10 minutes, as well as all the alarm names and timestamps involved in each flood period.
The data contains two columns used to calculate alarm flooding: the alarm name (e.g. PT-0001) and a timestamp of when the alarm occurred (e.g. 8/2/2019 6:26:32 AM).
My initial thought was to create a date/time table and divide the time into 10-minute bins. However, this didn't work, since a flood's 10-minute window can start at any time and doesn't align to fixed bins.
I was thinking this might be possible using DAX, but I have been unsuccessful in implementing it.
Something like: sort the data by time, add 10 minutes to the first entry's timestamp, and count the number of occurrences until the current row's timestamp reaches that created timestamp. If more than 10 occurrences fall within the 10-minute window, that would be a flood. If the next 10 minutes after the created timestamp also contain more than 10 alarms, that would be the same flood period. If it doesn't, then the flood ended at the last alarm's time.
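To make the idea concrete, here is roughly the logic I have in mind, sketched in plain Python rather than DAX (the function name and toy data are just for illustration):

```python
from datetime import datetime, timedelta

# Sketch of the windowing rule: anchor a 10-minute window at an alarm's
# timestamp and count how many alarms fall inside it.
def count_in_window(timestamps, start, window=timedelta(minutes=10)):
    """Count alarms t with start <= t <= start + window."""
    end = start + window
    return sum(1 for t in timestamps if start <= t <= end)

# Toy data: 11 alarms 30 seconds apart, starting at 01:32:33.
first = datetime(2019, 7, 1, 1, 32, 33)
alarms = [first + timedelta(seconds=30 * k) for k in range(11)]
print(count_in_window(alarms, first))  # all 11 fall in the first window
```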
I've spent an embarrassing amount of time on this; I appreciate any help given!
Sample data is attached below, along with the desired output.
Timestamp              Tag
7/1/2019 1:32:33 AM    PT-0001
7/1/2019 1:32:33 AM    PT-0002
7/1/2019 1:32:33 AM    PT-1023
7/1/2019 1:32:33 AM    TT-1111
7/1/2019 1:32:33 AM    TT-1112
7/1/2019 1:32:34 AM    PT-2222
7/1/2019 1:32:34 AM    xx-xxxx
7/1/2019 1:32:34 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 1:32:35 AM    xx-xxxx
7/1/2019 2:32:35 AM    xx-xxxx
7/1/2019 2:33:35 AM    xx-xxxx
7/1/2019 2:37:45 AM    xx-xxxx
7/1/2019 6:42:12 AM    xx-xxxx
7/1/2019 6:42:12 AM    xx-xxxx
7/1/2019 6:43:35 AM    xx-xxxx
7/1/2019 6:44:04 AM    xx-xxxx
7/1/2019 6:44:04 AM    xx-xxxx
7/1/2019 6:44:35 AM    xx-xxxx
7/1/2019 6:44:40 AM    xx-xxxx
7/1/2019 6:44:40 AM    xx-xxxx
7/1/2019 6:44:40 AM    xx-xxxx
7/1/2019 6:44:48 AM    xx-xxxx
7/1/2019 6:44:48 AM    xx-xxxx
7/1/2019 6:44:48 AM    xx-xxxx
7/1/2019 6:44:54 AM    xx-xxxx
7/1/2019 6:44:54 AM    xx-xxxx
7/1/2019 6:45:00 AM    xx-xxxx
7/1/2019 6:45:05 AM    xx-xxxx
7/1/2019 6:45:19 AM    xx-xxxx
7/1/2019 6:45:22 AM    xx-xxxx
7/1/2019 6:45:22 AM    xx-xxxx
7/1/2019 6:45:22 AM    xx-xxxx
7/1/2019 6:45:22 AM    xx-xxxx
7/1/2019 6:45:22 AM    xx-xxxx
7/1/2019 6:45:23 AM    xx-xxxx
7/1/2019 6:45:26 AM    xx-xxxx
7/1/2019 6:45:26 AM    xx-xxxx
7/1/2019 6:45:29 AM    xx-xxxx
7/1/2019 6:45:29 AM    xx-xxxx
7/1/2019 6:45:29 AM    xx-xxxx
7/1/2019 6:45:29 AM    xx-xxxx
7/1/2019 6:45:31 AM    xx-xxxx
7/1/2019 6:45:33 AM    xx-xxxx
7/1/2019 6:46:00 AM    xx-xxxx
7/1/2019 6:46:00 AM    xx-xxxx
7/1/2019 6:46:00 AM    xx-xxxx
7/1/2019 6:46:00 AM    xx-xxxx
7/1/2019 6:46:09 AM    xx-xxxx
7/1/2019 6:46:09 AM    xx-xxxx
7/1/2019 6:46:09 AM    xx-xxxx
7/1/2019 6:46:09 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:13 AM    xx-xxxx
7/1/2019 6:46:27 AM    xx-xxxx
7/1/2019 6:46:28 AM    xx-xxxx
7/1/2019 6:46:28 AM    xx-xxxx
7/1/2019 6:46:28 AM    xx-xxxx
7/1/2019 6:46:28 AM    xx-xxxx
7/1/2019 6:46:28 AM    xx-xxxx
7/1/2019 6:46:29 AM    xx-xxxx
7/1/2019 6:46:30 AM    xx-xxxx
7/1/2019 6:46:30 AM    xx-xxxx
7/1/2019 6:48:29 AM    xx-xxxx
7/1/2019 6:48:29 AM    xx-xxxx
7/1/2019 6:48:29 AM    xx-xxxx
7/1/2019 6:48:29 AM    xx-xxxx
7/1/2019 6:48:29 AM    xx-xxxx
7/1/2019 6:48:29 AM    xx-xxxx
7/1/2019 6:49:09 AM    xx-xxxx
7/1/2019 6:49:49 AM    xx-xxxx
7/1/2019 6:49:49 AM    xx-xxxx
7/1/2019 6:50:05 AM    xx-xxxx
7/1/2019 6:50:05 AM    xx-xxxx
7/1/2019 6:50:05 AM    xx-xxxx
7/1/2019 6:50:05 AM    xx-xxxx
7/1/2019 6:50:06 AM    xx-xxxx
7/1/2019 6:50:07 AM    xx-xxxx
7/1/2019 6:50:07 AM    xx-xxxx
7/1/2019 6:50:07 AM    xx-xxxx
7/1/2019 6:50:09 AM    xx-xxxx
7/1/2019 6:50:09 AM    xx-xxxx
7/1/2019 6:50:09 AM    xx-xxxx
7/1/2019 6:50:30 AM    xx-xxxx
7/1/2019 6:50:31 AM    xx-xxxx
7/1/2019 6:50:32 AM    xx-xxxx
7/1/2019 6:51:19 AM    xx-xxxx
7/1/2019 6:51:58 AM    xx-xxxx
7/1/2019 6:51:58 AM    xx-xxxx
7/1/2019 6:51:58 AM    xx-xxxx
7/1/2019 6:51:58 AM    xx-xxxx
7/1/2019 6:51:59 AM    xx-xxxx
7/1/2019 6:51:59 AM    xx-xxxx
7/1/2019 6:59:19 AM    xx-xxxx
7/1/2019 6:59:20 AM    xx-xxxx
Desired output:

Begin                  End                    Alarm Count    Duration
7/1/2019 1:32:32 AM    7/1/2019 1:32:34 AM    19             0:00:02
7/1/2019 6:42:12 AM    7/1/2019 6:59:20 AM    87             0:17:08
In Power Query:
1. Group entries by datetime and, for each datetime, count the number of alarms. You'll now have a table
with unique datetimes and the count of alarms at each datetime (2 columns).
2. Then create another column that stores the end of the 10-minute period (EndDatetime = StartDatetime + 10 mins).
3. Cross-join the table with itself (you have to create a copy of it in PQ) and, for each row from the
first table, keep only those rows from the other where the StartDatetime is within the bounds. So,
if T1 and T2 are copies of the summarized table, then for each row in T1 you have to grab all the
rows from T2 where T1[StartDatetime] <= T2[StartDatetime] <= T1[EndDatetime] and sum up all the counts
that come from T2. This way, for each distinct StartDatetime you'll have the count of alarms that
fall into the interval [StartDatetime, StartDatetime + 10 mins]. Name the column with the counts [Alarm Counts].
4. Once you have this, filter out the rows where [Alarm Counts] < 10.
5. So, now you have all the intervals that are potentially "alarm flooding."
6. You now have to get rid of the intervals/rows that are overlapping with others, so that only disjoint intervals remain.
The last step can, of course, also be carried out in PQ, but you have to think about how to do it.
I'll think about it as well. I just need to find a good and reliable way to do it in M. But it's 100% feasible.
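Just to pin down the logic of the steps above, here's a rough sketch in plain Python (illustrative only — the real implementation would be in M, and the function name and parameters here are made up):

```python
from collections import Counter
from datetime import datetime, timedelta

def flood_intervals(timestamps, threshold=10, window=timedelta(minutes=10)):
    # Step 1: group by datetime and count alarms per distinct timestamp.
    counts = Counter(timestamps)
    # Steps 2-3: for each distinct StartDatetime, sum the counts that fall
    # into [StartDatetime, StartDatetime + 10 min] (the cross-join step).
    flagged = []
    for s in sorted(counts):
        total = sum(c for t, c in counts.items() if s <= t <= s + window)
        # Step 4: keep only windows with at least `threshold` alarms.
        if total >= threshold:
            flagged.append((s, s + window))
    # Steps 5-6: merge overlapping candidate windows into disjoint intervals.
    merged = []
    for s, e in flagged:
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged
```

For example, 10 alarms spaced one minute apart produce exactly one qualifying window anchored at the first alarm, so the function returns a single 10-minute interval.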
Best
Darek
Very smart. Thank you so much. I think this will work. I'll update once I implement it.
Apologies and thank you again for your help, let me try to explain a bit better.
The ultimate goal is to find the total flood event duration; this duration can be greater or less than 10 minutes.
For example, if the first alarm starts at 11:00am and between 11:00am-11:10am 11 alarms are set, with the last alarm at 11:10am, the time span 11:00am-11:10am would be considered in flood.
If between 11:01:42am-11:11:42am only 9 alarms are set, then this span would not be in flood.
However, if between 11:10am-11:20am there are 10 alarms, with the last alarm at 11:14am, the timespan 11:10am-11:14am is in flood.
This can occur many times: if any ten-minute period contains 10 or more alarms, that period is in flood, and if it overlaps with other flood periods, the total time span is the total flood event duration.
In the attached example
Since 11:00am - 11:10am , 11:10am - 11:14am , and 11:11:42am - 11:21:40am are all in flood and overlap the entire time span of 11:00am to 11:21:40am would be the total flood event duration.
The next flood period can start anytime after the last, and a flood event duration can be less than 10 minutes. So if between 11:27am-11:37am 10 or more alarms are set, with the last alarm setting at 11:32am, and no other alarms set until 11:42:01am, the total flood event duration would be from 11:27am to 11:32am.
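To restate these rules in one place, here's a rough Python sketch of what I mean (the names are made up; the end of an event is the last alarm inside a qualifying window, and overlapping flood periods merge into one event):

```python
from datetime import datetime, timedelta

def flood_events(alarms, threshold=10, window=timedelta(minutes=10)):
    """Merged (begin, end) flood events. `end` is the last alarm observed
    inside a qualifying 10-minute window, so an event can be shorter than
    10 minutes; overlapping flood periods are merged into one event."""
    ts = sorted(alarms)
    spans = []
    for i, start in enumerate(ts):
        inside = [t for t in ts[i:] if t <= start + window]
        if len(inside) >= threshold:       # 10 or more alarms -> in flood
            spans.append((start, inside[-1]))
    merged = []
    for s, e in spans:
        if merged and s <= merged[-1][1]:  # overlapping floods join up
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged
```

With 10 alarms between 11:27am and 11:32am and nothing further until 11:42:01am, this yields a single event from 11:27am to 11:32am (duration 0:05:00), matching the last example above.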
I hope that made more sense and once again I really appreciate your time.
Hi there. I think I've nailed it down. Please download this file
https://1drv.ms/u/s!ApyQEauTSLtO6nD1kk5LMOFywpOi
and have a careful look. The code could be streamlined. Some steps are not totally necessary. You can clean it up if you want. Just try to test the M code on a dataset that's big enough to simulate real conditions.
I've used the data from Sample2. The end result is correct but you have to try this code out on other sets.
Best
Darek
Works perfectly, thank you Darek.
While searching the forum I found others doing similar work with alarm systems, so I'll be sure to upload the final .pbix file.
Thanks again!
Hi @Anonymous
What is the business rule to group the times?
From the sample data in Excel I notice that rows 3-21 are grouped as one,
rows 22 to 24 as one,
and rows 25 to 111 as one.
How do you identify the start and end of a group?
Cheers
CheenuSing