pagaczpagacz
Frequent Visitor

Aggregation by hour:minute for an hour:minute:second timestamp

Hello there,

 

I have a question about aggregating by timestamp.

My timestamp is in the format day:month:year hour:minute:second, and after aggregation I need to show my data at day:month:year hour:minute granularity.

Do you have any ideas on how I should do that?

Sometimes there is only one "value" per minute and the following 59 seconds are 'null'.

 

Thanks in advance 🙂

 

This is the dataset for one minute:

 

11.10.2019 08:44:00    62,0625    53             58,9375
11.10.2019 08:44:01    62,0625    53             58,9375
11.10.2019 08:44:02    62,0625    53             58,9375
11.10.2019 08:44:03    62,0625    53             58,9375
11.10.2019 08:44:04    62,0625    53             58,9375
11.10.2019 08:44:05    62,0625    53             58,9375
11.10.2019 08:44:06    62,0625    52,93762207    58,9375
11.10.2019 08:44:07    62,0625    52,9375        58,9375
11.10.2019 08:44:08    62,0625    52,9375        58,9375
11.10.2019 08:44:09    62,0625    52,9375        58,9375
11.10.2019 08:44:10    62,0625    52,9375        58,9375
11.10.2019 08:44:11    62,0625    52,9375        58,9375
11.10.2019 08:44:12    62,0625    52,9375        58,9375
11.10.2019 08:44:13    62,0625    52,9375        58,9375
11.10.2019 08:44:14    62,0625    52,9375        58,9375
11.10.2019 08:44:15    62,0625    52,9375        58,9375
11.10.2019 08:44:16    62,0625    52,9375        58,9375
11.10.2019 08:44:17    62,0625    52,9375        58,9375
11.10.2019 08:44:18    62,0625    52,9375        58,9375
11.10.2019 08:44:19    62,0625    52,9375        58,9375
11.10.2019 08:44:20    62,0625    52,8828125     58,9375
11.10.2019 08:44:21    62,0625    52,87500763    58,9375
11.10.2019 08:44:22    62,0625    52,875         58,9375
11.10.2019 08:44:23    62,0625    52,875         58,9375
11.10.2019 08:44:24    62,0625    52,875         58,9375
11.10.2019 08:44:25    62,0625    52,875         58,9375
11.10.2019 08:44:26    62,0625    52,875         58,9375
11.10.2019 08:44:27    62,0625    52,875         58,9375
11.10.2019 08:44:28    62,0625    52,875         58,9375
11.10.2019 08:44:29    62,0625    52,875         58,9375
11.10.2019 08:44:30    62,0625    52,875         58,99993896
11.10.2019 08:44:31    62,0625    52,875         59
11.10.2019 08:44:32    62,0625    52,875         59
11.10.2019 08:44:33    62,0625    52,875         59
11.10.2019 08:44:34    62,0625    52,875         59
11.10.2019 08:44:35    62,0625    52,81262207    59
11.10.2019 08:44:36    62,0625    52,8125        59
11.10.2019 08:44:37    62,0625    52,8125        59
11.10.2019 08:44:38    62,0625    52,8125        59
11.10.2019 08:44:39    62,0625    52,8125        59
11.10.2019 08:44:40    62,0625    52,8125        59
11.10.2019 08:44:41    62,0625    52,8125        59
11.10.2019 08:44:42    62,0625    52,8125        59
11.10.2019 08:44:43    62,0625    52,8125        59
11.10.2019 08:44:44    62,0625    52,8125        59
11.10.2019 08:44:45    62,0625    52,8125        59
11.10.2019 08:44:46    62,0625    52,8125        59
11.10.2019 08:44:47    62,0625    52,8125        59
11.10.2019 08:44:48    62,0625    52,8125        59
11.10.2019 08:44:49    62,0625    52,8125        59
11.10.2019 08:44:50    62,0625    52,8125        59
11.10.2019 08:44:55    62,0625    52,8125        59
11.10.2019 08:44:56    62,0625    52,8125        59,06054688
11.10.2019 08:44:57    62,0625    52,8125        59,0625
11.10.2019 08:44:58    62,0625    52,8125        59,0625
11.10.2019 08:44:59    62,0625    52,8125        59,00006104

 

1 ACCEPTED SOLUTION
pagaczpagacz
Frequent Visitor

Ok, I think I have my answer.

 

I just split my timestamp column 3 characters from the right, which gave me a column like day:month:year hour:minute:00, and the aggregation then happens automatically at the visualisation level (the only thing left is to change the aggregation from Sum to Average).

 

So simple 🙂 
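
In case a code version helps, here is a minimal Power Query (M) sketch of the same idea. The sample rows and the column names (Timestamp, Value1 to Value3) are assumptions made for illustration; the step simply rebuilds each datetime with the seconds set to 0, which has the same effect as cutting the last three characters off the text stamp.

    let
        // Two sample rows modelled on the data above (column names are assumed;
        // decimal commas are written as dots here)
        Source = #table(
            type table [Timestamp = datetime, Value1 = number, Value2 = number, Value3 = number],
            {
                { #datetime(2019, 10, 11, 8, 44, 0),  62.0625, 53,      58.9375 },
                { #datetime(2019, 10, 11, 8, 44, 59), 62.0625, 52.8125, 59.00006104 }
            }
        ),
        // Rebuild each datetime with seconds = 0, i.e. truncate the stamp to the minute
        AddMinuteStamp = Table.AddColumn(
            Source,
            "TimestampMinute",
            each #datetime(
                Date.Year([Timestamp]), Date.Month([Timestamp]), Date.Day([Timestamp]),
                Time.Hour([Timestamp]), Time.Minute([Timestamp]), 0
            ),
            type datetime
        )
    in
        AddMinuteStamp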

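If you would rather pre-aggregate in the query instead of relying on the visual's Average aggregation, a Table.Group step on the new column does the same job. This is a sketch that continues from the AddMinuteStamp step above (column names again assumed):

    Grouped = Table.Group(
        AddMinuteStamp,
        {"TimestampMinute"},
        {
            {"AvgValue1", each List.Average([Value1]), type number},
            {"AvgValue2", each List.Average([Value2]), type number},
            {"AvgValue3", each List.Average([Value3]), type number}
        }
    )

List.Average ignores nulls, so a minute that only has one real reading keeps that reading. Either way the result is one row (one plotted point) per minute.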

