AgoZ_KH
Resolver I

Manage time slots over more days

Hello to all!

My problem today is to calculate an average measure over 4 time slots for a six-month period of collected data.

 

The time slots are 0-6, 7-12, 13-18, and 19-24 (24-hour clock).

The goal is to filter my data over a six-month timestamp range and compare the averages between these time slots.

Has anyone experienced this kind of problem? Thanks in advance for any suggestions.

Ago

 

1 ACCEPTED SOLUTION

Hi @AgoZ_KH,

You can extract the hour from the datetime field and create a calculated column based on IF conditions; then you can build a table visual with the new category column and the aggregated (average) raw table value fields to achieve your requirement:

Time Range =
IF (
    HOUR ( [operation_timestamp] ) >= 19,
    4,
    IF (
        HOUR ( [operation_timestamp] ) >= 13,
        3,
        IF ( HOUR ( [operation_timestamp] ) >= 7, 2, 1 )
    )
)
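The same logic can also be written with the SWITCH ( TRUE () ) pattern, which reads top to bottom and avoids the nesting (a sketch over the same [operation_timestamp] column; the first condition that matches wins):

```dax
Time Range =
SWITCH (
    TRUE (),
    HOUR ( [operation_timestamp] ) >= 19, 4,  -- 19-24 slot
    HOUR ( [operation_timestamp] ) >= 13, 3,  -- 13-18 slot
    HOUR ( [operation_timestamp] ) >= 7, 2,   -- 7-12 slot
    1                                         -- everything else: 0-6 slot
)
```

Either form produces the same 1-4 bucket numbers; SWITCH ( TRUE () ) simply scales better if more slots are added later.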

Regards,

Xiaoxin Sheng

Community Support Team _ Xiaoxin
If this post helps, please consider accepting it as the solution to help other members find it more quickly.


6 REPLIES
AgoZ_KH
Resolver I

Hello to all! Thanks to @v-shex-msft, @lbendlin & @Greg_Deckler for your suggestions!

 

I used some of the suggestions to resolve my issue in the following way:

 

First, I extracted the time for each row into a new calculated column: Time = FORMAT(myDB[operation_timestamp],"hh:mm:ss")

Then I calculated the hour for each row in a second calculated column, DayTime = HOUR(myDB[Time]), and recorded the data in text format.

Third, I added a new calculated column using the DAX SWITCH function:

DayQuarter =
SWITCH (
    [DayTime],
    "0", "12am - 6am", "1", "12am - 6am", "2", "12am - 6am",
    "3", "12am - 6am", "4", "12am - 6am", "5", "12am - 6am",
    "6", "6 am - 12pm", "7", "6 am - 12pm", "8", "6 am - 12pm",
    "9", "6 am - 12pm", "10", "6 am - 12pm", "11", "6 am - 12pm",
    "12", "12pm - 6 pm", "13", "12pm - 6 pm", "14", "12pm - 6 pm",
    "15", "12pm - 6 pm", "16", "12pm - 6 pm", "17", "12pm - 6 pm",
    "6pm - 12 am"
)
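For what it's worth, the hour-by-hour SWITCH can be collapsed by integer-dividing the hour by 6, so no hour has to be listed individually (a sketch, assuming the same myDB[operation_timestamp] column and the slot labels used above):

```dax
DayQuarter =
VAR h = HOUR ( myDB[operation_timestamp] )   -- 0-23
RETURN
    SWITCH (
        INT ( h / 6 ),       -- 0, 1, 2 or 3
        0, "12am - 6am",     -- hours 0-5
        1, "6 am - 12pm",    -- hours 6-11
        2, "12pm - 6 pm",    -- hours 12-17
        "6pm - 12 am"        -- hours 18-23
    )
```

This also skips the intermediate text-typed Time and DayTime columns, since HOUR works directly on the datetime field.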

 

Obviously I don't think it's the most elegant and simplest way to solve my problem, but at the moment it works, and all the suggested approaches should equally achieve the intended goal.

 


I obviously reserve the right to go back to the solution to see if I can find an even easier way ...

 

Ago

Greg_Deckler
Super User

@AgoZ_KH Can you share sample data and expected output? Would make it more clear in my head.


@ me in replies or I'll lose your thread!!!
Instead of a Kudo, please vote for this idea
Become an expert!: Enterprise DNA
External Tools: MSHGQM
YouTube Channel!: Microsoft Hates Greg
Latest book!:
The Definitive Guide to Power Query (M)

DAX is easy, CALCULATE makes DAX hard...

Suppose my table is as shown:

 

Field1     Field2     Field3   operation_timestamp
759,3254   103,6743   3,45     27/02/2020 08:00
756,9019   104,9481   4,69     27/02/2020 08:00
324,9738   102,6734   21,98    27/02/2020 08:01
664,9268   105,0859   14,28    27/02/2020 08:01
608,652    103,8442   11,52    27/02/2020 08:02
710,9264   103,9425   5,56     27/02/2020 08:02
236,5321   104,1536   19,28    27/02/2020 08:02
272,6447   102,0826   24,69    27/02/2020 08:03
305,73     103,3464   22,4     27/02/2020 08:04
399,3463   98,92466   30,13    27/02/2020 08:04
301,5051   103,6406   20,8     27/02/2020 08:05
233,4147   62,07999   192,59   27/02/2020 08:05
750,2224   104,1767   5,16     28/02/2020 14:00
790,4665   103,965    5,31     28/02/2020 14:00
712,0485   105,5373   13,93    28/02/2020 14:01
370,0733   103,5125   24,33    28/02/2020 14:01
388,4965   104,0848   10,71    28/02/2020 14:02
259,4267   103,7636   17,59    28/02/2020 14:02
722,3729   104,4462   5,07     28/02/2020 14:03
269,6838   100,0554   26,86    28/02/2020 14:03
340,3696   103,8541   24,7     28/02/2020 14:04
844,4793   104,3328   4,65     29/02/2020 21:00
772,1326   104,3125   5,21     29/02/2020 21:00
677,9255   104,615    13,12    29/02/2020 21:01
323,4487   103,2005   24,05    29/02/2020 21:01
353,381    103,8914   178      29/02/2020 21:02
218,8718   104,1466   15,05    29/02/2020 21:02
706,2962   104,2135   5,43     29/02/2020 21:03
250,2747   100,4057   25,99    29/02/2020 21:03
320,1574   103,6253   22,68    29/02/2020 21:04

 

I need to classify each row with a Time_slot number, i.e. 1 for hours between 0-6, 2 between 7-12, 3 between 13-18, and 4 between 19-24, so that in the report I can show the average for each slot (and field):

Timeslot   Average F1
2          464,5733
3          511,4622
4          496,3297
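Once a Time_slot column exists, the report side only needs a plain average measure, sliced by Time_slot in a table visual (a sketch; myDB and Field1 are the names from the sample above):

```dax
Average F1 = AVERAGE ( myDB[Field1] )
```

Dropping Time_slot and Average F1 into a table visual then yields one row per slot, matching the expected output above.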

 

As suggested by the user lbendlin, I probably need a calculated column holding the Time_slot number for each row, based on the operation_timestamp column.

Hope this is more clear ...

 

Ago


Here's how your table "Buckets" would look after the transform in Power Query:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("dZRNbiwxCISv8jRrpPCP/a4S5f7XCI1xL9Lu1Vgjfw1FFf7+/oRNEDb9wIdQwEMljwJq+cPxhfzFyPgPx3/Ezw9chMNEmkUoTB2URwWfb4RwXgsZRXDWkKscE8zxRKgQ90TYF2KAw6qeAr8jOMCNW8hQrSNB/fWH4KWE8CqytU/lS7SB+RvB4mDC1NrJ5LpK89hWI5GKVaPF4+D6+nlesuaVikO6K9FqkBn0eV/X/TmvaxcxRypSv0oIAskrgjkYNNq2K1ZXCAcd1tIzFFQ6PGXEnOXIZLCDEOuoIDCz7nGFRw2Yqti4CdI7XBMh27dtiS9HhF4BymZ0WAfFZA0uUXkiKygSmP3LHrBR2Z6OyDsyBujstjRN1JVMiENj7XsulrJHVwlfUYke1wmJNFmC92apOpd6jCexksI+wUdvVqqyWmR2GIcJd7gUQXz63hNTWuoPRVZWcpVAY0q3JVJhzwDX2OZNMN2mZOZJ2DdBvVi1OEfCIzK4tgfsZMtF4ifRlnDmUceeLyO2i3hoqxHLdRo782NSBTPG835bSANG0NjxXXtFdizRFqIDT+dGmGQpr1f1D9EWXjsS/T5k+NHKCIParyMiuadkcb/ZbLJeCD9ISQ9/fgE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Field1 = _t, Field2 = _t, Field3 = _t, operation_timestamp = _t]),
    #"Split Column by Delimiter" = Table.SplitColumn(Source, "operation_timestamp", Splitter.SplitTextByDelimiter("/", QuoteStyle.Csv), {"operation_timestamp.1", "operation_timestamp.2", "operation_timestamp.3"}),
    #"Added Custom" = Table.AddColumn(#"Split Column by Delimiter", "Custom", each [operation_timestamp.2] & "/" & [operation_timestamp.1] & "/" & [operation_timestamp.3]),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"Custom", type datetime}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"Field1", "Field2", "Field3", "Custom"}),
    #"Replaced Value" = Table.ReplaceValue(#"Removed Other Columns",",","",Replacer.ReplaceText,{"Field1", "Field2", "Field3"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Replaced Value",{{"Field1", Int64.Type}, {"Field2", Int64.Type}, {"Field3", Int64.Type}}),
    #"Added Custom1" = Table.AddColumn(#"Changed Type1", "Bucket", each  Number.RoundDown(4* Number.From(Time.From([Custom]))))
in
    #"Added Custom1"

 

And then the calculated column would be

 

DayQuarter = SWITCH(Buckets[Bucket],0,"12am - 6am",1,"6 am - 12pm",2,"12pm - 6 pm","6pm - 12 am")
 
(or you can add that to the Power Query transforms)

 

lbendlin
Super User
Super User

Assuming you have a datetime field for your data, you want to do two things:

 

1. Create a calculated column that only uses the date value of your field.  Then connect that date field to your calendar table

2. Create another calculated column that creates the quarter day buckets. For example

 

DayQuarter = INT ( MOD ( [datetimefield] * 4, 4 ) )

 

This will give you buckets of 0,1,2 and 3 which you can then map to your time ranges
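As a quick sanity check of the quarter-day arithmetic (assuming DAX's usual day-serial encoding of datetimes, in which 28/02/2020 is serial 43889 and 14:00 is the day fraction 14/24):

```dax
-- 28/02/2020 14:00  =  43889 + 14/24  =  43889.5833
-- 43889.5833 * 4    =  175558.3333
-- MOD ( 175558.3333, 4 )  =  2.3333  --  i.e. bucket 2 (12pm - 6 pm) after truncation
```

The fractional remainder is why the result needs truncating (INT, or Number.RoundDown in Power Query) before mapping it to a label.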

 

DayQuarter = SWITCH ( INT ( MOD ( [datetimefield] * 4, 4 ) ), 0, "12am - 6am", 1, "6 am - 12pm", 2, "12pm - 6 pm", "6pm - 12 am" )
