
Anonymous
Not applicable

Impossible in Power BI? Counting the number of occurrences within a dynamic time range

I'm quite new to Power BI and coding in general and am completely stumped. I'm hoping I can get some help to put me on the correct path.

 

I'm working with a SQL database that contains alarm information. One of the requested KPIs is "alarm flooding."

 

An alarm flood has been defined by ANSI/ISA 18.2 as being 10 or more annunciated alarms in any 10-minute period.

 

I am trying to report the time an alarm flood begins, the time it ends, and its duration if the flood lasts longer than 10 minutes, as well as all the alarm names and timestamps involved in each flood period.

 

The data contains two columns used to calculate alarm flooding: the alarm name (e.g., PT-0001) and a timestamp for when the alarm occurred (e.g., 8/2/2019 6:26:32 AM).

 

My initial thought was to create a date/time table and divide the time into 10-minute bins. However, this didn't work since the time bins are not always the same 10-minute spans.

 

I was thinking this might be possible using DAX, but I have been unsuccessful in implementing it.

Something like: organize the data by time, add 10 minutes to the first entry's timestamp, and count the number of occurrences until the created timestamp matches the current row's timestamp. If more than 10 occurrences are within the 10-minute window, that would be a flood. If the next 10 minutes after the created timestamp also contains more than 10 alarms, that would be the same flood period. If it doesn't, then the flood ended at the last alarm's time.
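The idea above can be sketched outside Power BI for illustration. This is a minimal Python rendering of the sliding-window count, assuming timestamps are already parsed; `flagged_windows` is a hypothetical helper name, and the 10-alarm/10-minute values follow the ISA 18.2 definition quoted earlier:

```python
from datetime import datetime, timedelta

def flagged_windows(times, window=timedelta(minutes=10), threshold=10):
    """For each alarm timestamp, count the alarms (including itself)
    that fall inside [t, t + window); flag the window as a potential
    flood start when the count reaches the threshold."""
    times = sorted(times)
    flagged = []
    for i, start in enumerate(times):
        count = sum(1 for t in times[i:] if t < start + window)
        if count >= threshold:
            flagged.append((start, count))
    return flagged

# Hypothetical usage: ten alarms one second apart -> one flagged window.
base = datetime(2019, 7, 1, 11, 0, 0)
burst = [base + timedelta(seconds=i) for i in range(10)]
print(flagged_windows(burst))
```

A real M or DAX version would need to avoid the quadratic scan over rows, but the window logic is the same.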

 

I've spent an embarrassing amount of time on this; I appreciate any help given!

 

Sample data is attached along with the desired output.

Sample Data & Output

 

Timestamp	Tag
7/1/2019 1:32:33 AM	PT-0001
7/1/2019 1:32:33 AM	PT-0002
7/1/2019 1:32:33 AM	PT-1023
7/1/2019 1:32:33 AM	TT-1111
7/1/2019 1:32:33 AM	TT-1112
7/1/2019 1:32:34 AM	PT-2222
7/1/2019 1:32:34 AM	xx-xxxx
7/1/2019 1:32:34 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 1:32:35 AM	xx-xxxx
7/1/2019 2:32:35 AM	xx-xxxx
7/1/2019 2:33:35 AM	xx-xxxx
7/1/2019 2:37:45 AM	xx-xxxx
7/1/2019 6:42:12 AM	xx-xxxx
7/1/2019 6:42:12 AM	xx-xxxx
7/1/2019 6:43:35 AM	xx-xxxx
7/1/2019 6:44:04 AM	xx-xxxx
7/1/2019 6:44:04 AM	xx-xxxx
7/1/2019 6:44:35 AM	xx-xxxx
7/1/2019 6:44:40 AM	xx-xxxx
7/1/2019 6:44:40 AM	xx-xxxx
7/1/2019 6:44:40 AM	xx-xxxx
7/1/2019 6:44:48 AM	xx-xxxx
7/1/2019 6:44:48 AM	xx-xxxx
7/1/2019 6:44:48 AM	xx-xxxx
7/1/2019 6:44:54 AM	xx-xxxx
7/1/2019 6:44:54 AM	xx-xxxx
7/1/2019 6:45:00 AM	xx-xxxx
7/1/2019 6:45:05 AM	xx-xxxx
7/1/2019 6:45:19 AM	xx-xxxx
7/1/2019 6:45:22 AM	xx-xxxx
7/1/2019 6:45:22 AM	xx-xxxx
7/1/2019 6:45:22 AM	xx-xxxx
7/1/2019 6:45:22 AM	xx-xxxx
7/1/2019 6:45:22 AM	xx-xxxx
7/1/2019 6:45:23 AM	xx-xxxx
7/1/2019 6:45:26 AM	xx-xxxx
7/1/2019 6:45:26 AM	xx-xxxx
7/1/2019 6:45:29 AM	xx-xxxx
7/1/2019 6:45:29 AM	xx-xxxx
7/1/2019 6:45:29 AM	xx-xxxx
7/1/2019 6:45:29 AM	xx-xxxx
7/1/2019 6:45:31 AM	xx-xxxx
7/1/2019 6:45:33 AM	xx-xxxx
7/1/2019 6:46:00 AM	xx-xxxx
7/1/2019 6:46:00 AM	xx-xxxx
7/1/2019 6:46:00 AM	xx-xxxx
7/1/2019 6:46:00 AM	xx-xxxx
7/1/2019 6:46:09 AM	xx-xxxx
7/1/2019 6:46:09 AM	xx-xxxx
7/1/2019 6:46:09 AM	xx-xxxx
7/1/2019 6:46:09 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:13 AM	xx-xxxx
7/1/2019 6:46:27 AM	xx-xxxx
7/1/2019 6:46:28 AM	xx-xxxx
7/1/2019 6:46:28 AM	xx-xxxx
7/1/2019 6:46:28 AM	xx-xxxx
7/1/2019 6:46:28 AM	xx-xxxx
7/1/2019 6:46:28 AM	xx-xxxx
7/1/2019 6:46:29 AM	xx-xxxx
7/1/2019 6:46:30 AM	xx-xxxx
7/1/2019 6:46:30 AM	xx-xxxx
7/1/2019 6:48:29 AM	xx-xxxx
7/1/2019 6:48:29 AM	xx-xxxx
7/1/2019 6:48:29 AM	xx-xxxx
7/1/2019 6:48:29 AM	xx-xxxx
7/1/2019 6:48:29 AM	xx-xxxx
7/1/2019 6:48:29 AM	xx-xxxx
7/1/2019 6:49:09 AM	xx-xxxx
7/1/2019 6:49:49 AM	xx-xxxx
7/1/2019 6:49:49 AM	xx-xxxx
7/1/2019 6:50:05 AM	xx-xxxx
7/1/2019 6:50:05 AM	xx-xxxx
7/1/2019 6:50:05 AM	xx-xxxx
7/1/2019 6:50:05 AM	xx-xxxx
7/1/2019 6:50:06 AM	xx-xxxx
7/1/2019 6:50:07 AM	xx-xxxx
7/1/2019 6:50:07 AM	xx-xxxx
7/1/2019 6:50:07 AM	xx-xxxx
7/1/2019 6:50:09 AM	xx-xxxx
7/1/2019 6:50:09 AM	xx-xxxx
7/1/2019 6:50:09 AM	xx-xxxx
7/1/2019 6:50:30 AM	xx-xxxx
7/1/2019 6:50:31 AM	xx-xxxx
7/1/2019 6:50:32 AM	xx-xxxx
7/1/2019 6:51:19 AM	xx-xxxx
7/1/2019 6:51:58 AM	xx-xxxx
7/1/2019 6:51:58 AM	xx-xxxx
7/1/2019 6:51:58 AM	xx-xxxx
7/1/2019 6:51:58 AM	xx-xxxx
7/1/2019 6:51:59 AM	xx-xxxx
7/1/2019 6:51:59 AM	xx-xxxx
7/1/2019 6:59:19 AM	xx-xxxx
7/1/2019 6:59:20 AM	xx-xxxx

output:

Begin	                     End	              Alarm Count         Duration
7/1/2019 1:32:32 AM	7/1/2019 1:32:34 AM     	19	           0:00:02
7/1/2019 6:42:12 AM	7/1/2019 6:59:20 AM	        87	           0:17:08

7 REPLIES
Anonymous
Not applicable

In Power Query:


1. Group entries by datetime and, for each datetime, count the number of alarms. Now you'll have a table with unique datetimes and the count of alarms at each datetime (2 columns).
2. Then create another column that will store the end of the 10-minute period (EndDatetime = StartDatetime + 10 mins).
3. Cross-join the table with itself (you have to create a copy of it in PQ) and, for each row from the first table, leave only those rows from the other where the StartDatetime is within the bounds. So, if T1 and T2 are copies of the summarized table, then for each row in T1 you have to grab all the rows from T2 where T1[StartDatetime] <= T2[StartDatetime] <= T1[EndDatetime] and sum up all the counts that come from T2. This way, for each distinct StartDatetime you'll have the count of alarms that fall into the interval [StartDatetime, StartDatetime + 10 mins]. Name the column with the counts [Alarm Counts].
4. Once you have this, filter out the rows where [Alarm Counts] < 10.
5. Now you have all the intervals that are potentially "alarm flooding."
6. You now have to get rid of the intervals/rows that overlap with others, so that only disjoint intervals remain.

The last step still has to be worked out; you have to think about how to do it. I'll think about it as well. I just need to find a good and reliable way to do it in M, but it's 100% feasible.
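The six steps above can be sketched as follows. This is an illustrative Python version, not the actual M code; `flood_periods` is a hypothetical helper, and the window end is treated as inclusive to match T1[StartDatetime] <= T2[StartDatetime] <= T1[EndDatetime] in step 3:

```python
from collections import Counter
from datetime import datetime, timedelta

def flood_periods(timestamps, window=timedelta(minutes=10), threshold=10):
    # Step 1: group by datetime and count alarms per datetime.
    counts = sorted(Counter(timestamps).items())
    # Steps 2-4: for each start, sum counts in [start, start + window]
    # and keep only windows with at least `threshold` alarms.
    candidates = []
    for i, (start, _) in enumerate(counts):
        end = start + window
        total = sum(c for t, c in counts[i:] if t <= end)
        if total >= threshold:
            candidates.append((start, end))
    # Step 6: merge overlapping candidate windows into disjoint periods.
    merged = []
    for start, end in candidates:
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```

Since the candidates come out sorted by start time, the merge in step 6 is a single pass: each window either extends the previous period or opens a new one.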

 

Best

Darek

Anonymous
Not applicable

Very smart. Thank you so much. I think this will work. I'll update once I implement it.

Anonymous
Not applicable

I've almost done it but unfortunately, your explanation is not entirely clear. You should give a simple example (not the one above since it explains almost nothing about the algorithm) and show manually (in slow motion, so to speak) what you want to achieve, step by step.

For instance, it's not clear what you want to do when there are 2 consecutive datetimes which together have 10 entries and are 1 minute apart. If there is another datetime that's 1 minute after the later of the two entries, does it potentially start another alarm flood, or is it only after 10 minutes from the first datetime that we can have another flood? This is really not clear...

Best
Darek
Anonymous
Not applicable

Apologies and thank you again for your help, let me try to explain a bit better.

 

The ultimate goal is to find the total flood event duration; this duration can be greater or less than 10 minutes.

 

For example, if the first alarm starts at 11:00am and between 11:00am and 11:10am 11 alarms are set, with the last alarm at 11:10am, the time span of 11:00am-11:10am would be considered in flood.

 

If between 11:01:42am and 11:11:42am 9 alarms are set, then this span would not be in flood.

 

However, if between 11:10am and 11:20am there are 10 alarms, with the last alarm at 11:14am, the timespan 11:10am-11:14am is in flood.

 

This can occur many times: if within any ten-minute period 10 or more alarms exist, that period is in flood, and if it overlaps with other flood periods, the combined time span is the total flood event duration.

 

In the attached example

Since 11:00am-11:10am, 11:10am-11:14am, and 11:11:42am-11:21:40am are all in flood and overlap, the entire time span of 11:00am to 11:21:40am would be the total flood event duration.

 

 

The next flood period can start anytime after the last, and a flood event duration can be less than 10 minutes. So if between 11:27am and 11:37am 10 or more alarms are set, with the last alarm setting at 11:32am, and no other alarms set until 11:42:01am, the total flood event duration would be from 11:27am to 11:32am.
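These rules can be sketched as follows. This is a Python illustration, not the eventual Power Query solution; `flood_events` is a hypothetical helper, and each flood is taken to end at the last alarm covered by any overlapping qualifying window, per the examples above:

```python
from datetime import datetime, timedelta

def flood_events(times, window=timedelta(minutes=10), threshold=10):
    """Return (begin, end, alarm count, duration) per flood event.
    A flood runs from the first alarm of a qualifying 10-minute window
    to the last alarm covered by any overlapping qualifying window."""
    times = sorted(times)
    events = []
    cur = None  # [first alarm index, last alarm index] of current flood
    for i, start in enumerate(times):
        # Index j of the last alarm inside [start, start + window].
        j = i
        while j + 1 < len(times) and times[j + 1] <= start + window:
            j += 1
        if j - i + 1 >= threshold:          # window qualifies as flood
            if cur and start <= times[cur[1]]:
                cur[1] = max(cur[1], j)     # overlaps: extend the flood
            else:
                if cur:
                    events.append(cur)
                cur = [i, j]                # start a new flood
    if cur:
        events.append(cur)
    return [(times[a], times[b], b - a + 1, times[b] - times[a])
            for a, b in events]

# The 11:00am-11:10am example: 11 alarms, one per minute, last at 11:10.
base = datetime(2019, 7, 1, 11, 0, 0)
print(flood_events([base + timedelta(minutes=k) for k in range(11)]))
```

Note that when the alarms stop before the window does (the 11:27am-11:32am case above), the reported end is the last alarm's timestamp, not the window boundary.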

 

I hope that made more sense and once again I really appreciate your time.

 

Sample2

Anonymous
Not applicable

Hi there. I think I've nailed it down. Please download this file

 

https://1drv.ms/u/s!ApyQEauTSLtO6nD1kk5LMOFywpOi

 

and have a careful look. The code could be streamlined. Some steps are not totally necessary. You can clean it up if you want. Just try to test the M code on a dataset that's big enough to simulate real conditions.

 

I've used the data from Sample2. The end result is correct but you have to try this code out on other sets.

 

Best
Darek

Anonymous
Not applicable

Works perfectly, thank you Darek. 

 

While searching the forum I found others doing similar work with alarm systems, so I'll be sure to upload the final .pbix file.

 

Thanks again!

CheenuSing
Community Champion

Hi @Anonymous 

 

 

What is the business rule to group the times?

 

From the sample data in Excel I notice that rows 3-21 are grouped as one, rows 22 to 24 as one, and rows 25 to 111 as one.

 

How do I identify the start and end of a group?

 

 

Cheers

 

CheenuSing

 

 

