Anonymous
Not applicable

Removing Null values before a dataset w/o removing entire rows

I'm working with data where a driver makes a run through a particular section of road each day. The raw data has a date/time, GPS coordinates, and a speed, all aggregated into a single .csv file.

 

DateTime               Lat     Long     Speed

12/25/18 9:00:01    xxx      xxx          25

12/25/18 9:00:02    xxx      xxx          26

...

12/26/18 9:00:01    xxx      xxx          24

12/26/18 9:00:02    xxx      xxx          23

...

 

I've managed to create a new column for each date, with the speed data filling the rows, but there are many "null" cells I cannot remove.

 

Elapsed Time    12/25/18    12/26/18    ...    ...
0               25          null
1               26          null
...
72              null        24
73              null        23
...

 

My goal is to collapse the data by shifting each date's column up so they all share the same "elapsed time" counter, like so:

Elapsed Time    12/25/18    12/26/18    ...    ...
0               25          24
1               26          23
...

 

I expect each run to take a different amount of time (+/- 10 seconds) and don't mind having null values after the data ends, but I need each column to align its data to the same elapsed-time counter. Any suggestions?


8 REPLIES
Greg_Deckler
Super User

I took your original table and added these two columns:

 

Date = DATE(YEAR([DateTime]), MONTH([DateTime]), DAY([DateTime]))

Elapsed Time =
VAR __min =
    MINX(
        FILTER(ALL('Table6'),
            DATE(YEAR([DateTime]), MONTH([DateTime]), DAY([DateTime]))
                = DATE(YEAR(EARLIER([DateTime])), MONTH(EARLIER([DateTime])), DAY(EARLIER([DateTime])))),
        [DateTime])
RETURN DATEDIFF(__min, [DateTime], SECOND)

Then I could create a matrix that looked like what you wanted using Elapsed Time for my rows, Date for my Columns and Speed for my Value.

 

I attached the PBIX; you want Page 2, Table6.


Anonymous
Not applicable

Does this work within Power Query as well?

To do this in the query editor, you would first add a new column that extracts the seconds from your DateTime column:

 

[screenshot omitted]

 

Then convert the DateTime column to a date and pivot on it (with Speed as the value and "don't summarize" selected in the advanced options).
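A minimal M sketch of those two steps, added to a query whose previous step is assumed to be named Source and to hold a DateTime column (type datetime) and a Speed column (the step and column names are illustrative, not taken from the attached file):

let
    // 1) Add a "Second" column extracted from the DateTime value
    AddedSecond = Table.AddColumn(Source, "Second", each Time.Second(DateTime.Time([DateTime])), Int64.Type),
    // 2) Keep only the date part of DateTime ...
    DateOnly    = Table.TransformColumns(AddedSecond, {{"DateTime", DateTime.Date, type date}}),
    // ... convert it to text (Table.Pivot expects text pivot values) ...
    DateAsText  = Table.TransformColumnTypes(DateOnly, {{"DateTime", type text}}),
    // ... and pivot it, with Speed as the value and no aggregation function ("don't summarize")
    Pivoted     = Table.Pivot(DateAsText, List.Distinct(DateAsText[DateTime]), "DateTime", "Speed")
in
    Pivoted

(Second here is the second-of-minute, which is why the follow-up below points out that it only works for runs shorter than 60 seconds.)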

Imke Feldmann (The BIccountant)


Anonymous
Not applicable

Thanks for the prompt responses! Your solution works great when my data starts at the same second, but when one data log starts at 5 seconds and another at 15, it produces unusable data. I'm not familiar with the M code used within Power Query, but taking (current time) - (the initial time from each date) would produce an "elapsed time" number.

 

Another solution could be to test that the interval between consecutive datetime entries is one second and have a column count upward from 0, resetting to 0 (i.e., starting a new column once we create a matrix) when a jump of more than, say, 5 seconds is detected.
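(For reference, the "(current time) - (initial time from each date)" idea above can be sketched in M roughly as follows. RawData stands in for whatever your previous step is, the column names match the sample data, and the row-by-row lookup is simple but slow on large tables.)

let
    Source       = RawData,  // assumed previous step holding DateTime and Speed columns
    AddedDate    = Table.AddColumn(Source, "Date", each DateTime.Date([DateTime]), type date),
    // For each row, subtract the first timestamp of the same date to get seconds elapsed
    AddedElapsed = Table.AddColumn(
        AddedDate,
        "Elapsed Time",
        (row) => Duration.TotalSeconds(
            row[DateTime] - List.Min(Table.SelectRows(AddedDate, each [Date] = row[Date])[DateTime])),
        type number)
in
    AddedElapsed

Pivoting on Date with Elapsed Time as the row key would then line the runs up from 0.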

Sorry, but I don't understand your request. Please provide sample data that makes it clearer.

Imke Feldmann (The BIccountant)


Anonymous
Not applicable

DateTime                  Speed (mph)    Second
12/16/2018 11:17:17 AM    32.37          17
12/16/2018 11:17:18 AM    33.08          18
12/16/2018 11:17:19 AM    34.72          19
12/16/2018 11:17:20 AM    36.64          20
12/16/2018 11:17:21 AM    38.14          21
12/16/2018 11:17:22 AM    39.57          22
12/16/2018 11:17:23 AM    40.89          23
12/16/2018 11:17:24 AM    42.46          24
12/16/2018 11:17:25 AM    43.44          25
12/16/2018 11:17:26 AM    43.02          26
12/16/2018 11:17:27 AM    42.86          27
12/16/2018 11:17:28 AM    42.88          28
12/16/2018 11:17:29 AM    43.71          29
12/16/2018 11:17:30 AM    44.4           30
12/16/2018 11:17:31 AM    45.21          31
12/16/2018 11:17:32 AM    45.14          32
12/16/2018 11:17:33 AM    45.21          33
12/16/2018 11:17:34 AM    45.9           34
12/16/2018 11:17:35 AM    46.62          35
12/16/2018 11:17:36 AM    46.19          36
12/16/2018 11:17:37 AM    47.45          37
12/16/2018 11:17:38 AM    48.23          38
12/16/2018 11:17:39 AM    47.24          39
12/16/2018 11:17:40 AM    48.36          40
12/16/2018 11:17:41 AM    47.85          41
12/16/2018 11:17:42 AM    48.74          42
12/16/2018 11:17:43 AM    48.47          43
12/16/2018 11:17:44 AM    49.15          44
12/16/2018 11:17:45 AM    49.44          45
12/16/2018 11:17:46 AM    49.59          46
12/16/2018 11:17:47 AM    49.12          47
12/16/2018 11:17:48 AM    48.77          48
12/16/2018 11:17:49 AM    48.94          49
12/16/2018 11:17:50 AM    48.32          50
12/16/2018 11:17:51 AM    48.63          51
12/16/2018 11:17:52 AM    47.76          52
12/16/2018 11:17:53 AM    47.74          53
12/16/2018 11:17:54 AM    47.29          54
12/16/2018 11:17:55 AM    47.18          55
12/16/2018 11:17:56 AM    47.49          56
12/16/2018 11:17:57 AM    47.27          57
12/16/2018 11:17:58 AM    47.07          58
12/16/2018 11:17:59 AM    47.11          59
12/16/2018 11:18:00 AM    47.51          0
12/16/2018 11:18:01 AM    49.01          1
12/16/2018 11:18:02 AM    48.03          2
12/16/2018 11:18:03 AM    47.83          3
12/16/2018 11:18:04 AM    48.63          4
12/16/2018 11:18:05 AM    47.78          5
12/16/2018 11:18:06 AM    48.56          6
12/16/2018 11:18:07 AM    48.65          7
12/16/2018 11:18:08 AM    48.25          8
12/16/2018 11:18:09 AM    48.54          9
12/16/2018 11:18:10 AM    47.69          10
12/16/2018 11:18:11 AM    48.07          11
12/16/2018 11:18:12 AM    48.72          12
12/16/2018 11:18:13 AM    48.5           13
12/16/2018 11:18:14 AM    48.99          14
12/16/2018 11:18:15 AM    48.27          15
12/16/2018 11:18:16 AM    48.05          16
12/16/2018 11:18:17 AM    48.56          17
12/16/2018 11:18:18 AM    47.71          18
12/16/2018 11:18:19 AM    47.49          19
12/16/2018 11:18:20 AM    47.74          20
12/16/2018 11:18:21 AM    48.18          21
12/16/2018 11:18:22 AM    48.85          22
12/16/2018 11:18:23 AM    49.53          23
12/16/2018 11:18:24 AM    48.77          24
12/16/2018 11:18:25 AM    48.92          25
12/16/2018 11:18:26 AM    49.3           26
12/16/2018 11:18:27 AM    48.16          27
12/16/2018 11:18:28 AM    49.19          28
12/16/2018 11:18:29 AM    48.34          29
12/16/2018 11:18:30 AM    48.81          30
12/16/2018 11:18:31 AM    49.12          31
12/16/2018 11:18:32 AM    47.83          32
12/16/2018 11:18:33 AM    47.98          33
12/16/2018 11:18:34 AM    47.69          34
12/16/2018 11:18:35 AM    47.69          35
12/16/2018 11:18:36 AM    46.89          36
12/16/2018 11:18:37 AM    46.57          37
12/16/2018 11:18:38 AM    45.95          38
12/16/2018 11:18:39 AM    45.37          39
12/16/2018 11:18:40 AM    46.64          40
12/16/2018 11:18:41 AM    46.57          41
12/16/2018 11:18:42 AM    45.88          42
12/16/2018 11:18:43 AM    46.82          43
12/16/2018 11:18:44 AM    47.27          44
12/16/2018 11:18:45 AM    46.77          45
12/16/2018 11:18:46 AM    47.33          46
12/16/2018 11:18:47 AM    48.72          47
12/16/2018 11:18:48 AM    47.78          48
12/16/2018 11:18:49 AM    48.3           49
12/16/2018 11:18:50 AM    47.89          50
12/16/2018 11:18:51 AM    47.98          51
12/16/2018 11:18:52 AM    48.23          52
12/16/2018 11:18:53 AM    48.9           53
12/16/2018 11:18:54 AM    49.84          54
12/16/2018 11:18:55 AM    51.16          55
12/16/2018 11:18:56 AM    50.49          56
12/16/2018 11:18:57 AM    51.58          57
12/16/2018 11:18:58 AM    51.05          58
12/16/2018 11:18:59 AM    52.97          59
12/16/2018 11:19:00 AM    52.79          0
12/16/2018 11:19:01 AM    52.55          1
12/16/2018 11:19:02 AM    52.99          2
12/16/2018 11:19:03 AM    51.72          3
12/16/2018 11:19:04 AM    53.04          4
12/16/2018 11:19:05 AM    51.76          5
12/16/2018 11:19:06 AM    53.24          6
12/16/2018 11:19:07 AM    53.08          7
12/16/2018 11:19:08 AM    53.22          8
12/16/2018 11:19:09 AM    52.72          9
12/16/2018 11:19:10 AM    53.02          10
12/16/2018 11:19:11 AM    52.59          11
12/16/2018 11:19:12 AM    51.85          12
12/16/2018 11:19:13 AM    51.76          13
12/16/2018 11:19:14 AM    52.03          14
12/16/2018 11:19:15 AM    51.65          15
12/16/2018 11:19:16 AM    51.76          16
12/16/2018 11:19:17 AM    51.47          17
12/16/2018 11:19:18 AM    51.47          18
12/16/2018 11:19:19 AM    51.79          19
12/16/2018 11:19:20 AM    51.94          20
12/16/2018 11:19:21 AM    52.23          21
12/16/2018 11:19:22 AM    52.05          22
12/16/2018 11:19:23 AM    52.01          23
12/16/2018 11:19:24 AM    51.47          24
12/16/2018 11:19:25 AM    52.12          25
12/16/2018 11:19:26 AM    51.85          26
12/16/2018 11:19:27 AM    50.87          27
12/16/2018 11:19:28 AM    51.36          28
12/16/2018 11:19:29 AM    51.38          29
12/16/2018 11:19:30 AM    49.77          30
12/16/2018 11:19:31 AM    52.46          31
12/16/2018 11:19:32 AM    49.91          32
12/16/2018 11:19:33 AM    49.55          33
12/16/2018 11:19:34 AM    50.67          34
12/16/2018 11:19:35 AM    50.29          35
12/16/2018 11:19:36 AM    50.33          36
12/16/2018 11:19:37 AM    50.02          37
12/16/2018 11:19:38 AM    49.17          38
12/16/2018 11:19:39 AM    48.54          39
12/16/2018 11:19:40 AM    46.98          40
12/16/2018 11:19:41 AM    49.21          41
12/16/2018 11:19:42 AM    49.91          42
12/16/2018 11:19:43 AM    49.12          43
12/16/2018 11:19:44 AM    49.93          44
12/16/2018 11:19:45 AM    49.03          45
12/16/2018 11:19:46 AM    49.1           46
12/16/2018 11:19:47 AM    50.89          47
12/16/2018 11:19:48 AM    51.79          48
12/16/2018 11:19:49 AM    51.43          49
12/16/2018 11:19:50 AM    51.36          50
12/16/2018 11:19:51 AM    52.28          51
12/16/2018 11:19:52 AM    52.61          52
12/16/2018 11:19:53 AM    52.3           53
12/16/2018 11:19:54 AM    51.27          54
12/16/2018 11:19:55 AM    51.2           55
12/16/2018 11:19:56 AM    51.83          56
12/16/2018 11:19:57 AM    51.65          57
12/16/2018 11:19:58 AM    51.27          58
12/16/2018 11:19:59 AM    50.67          59
12/16/2018 11:20:00 AM    49.95          0
12/16/2018 11:20:01 AM    50.78          1
12/16/2018 11:20:02 AM    49.08          2
12/16/2018 11:20:03 AM    50             3
12/16/2018 11:20:04 AM    50.78          4
12/16/2018 11:20:05 AM    49.88          5
12/16/2018 11:20:06 AM    50.09          6
12/16/2018 11:20:07 AM    50.13          7
12/16/2018 11:20:08 AM    50.55          8
12/16/2018 11:20:09 AM    50.44          9
12/16/2018 11:20:10 AM    50.09          10
12/16/2018 11:20:11 AM    51.05          11
12/16/2018 11:20:12 AM    50.09          12
12/16/2018 11:20:13 AM    49.97          13
12/16/2018 11:20:14 AM    50.42          14
12/16/2018 11:20:15 AM    51             15
12/16/2018 11:20:16 AM    50.24          16
12/16/2018 11:20:17 AM    49.39          17
12/16/2018 11:20:18 AM    48.92          18
12/16/2018 11:20:19 AM    47.98          19
12/16/2018 11:20:20 AM    47.96          20
12/16/2018 11:20:21 AM    47.42          21

The Second column would only work if the data set is less than 60 seconds long. Is there a way to start counting seconds at 0 and continue until there is a large gap in the datetime (a few hours, or the start of the next calendar day)?

ACCEPTED SOLUTION

Sure, everything is possible with Power Query (it's just a question of whether it finishes in time 😉).

 

Please check the enclosed file. It creates a new column whenever there is a gap of more than one second; a parameter "DurationToStartNew" lets you edit the time interval at which the next column starts:

 

[screenshot omitted]
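(The attached query itself isn't reproduced here, but the gap-based idea it describes can be sketched in M roughly as follows. RawData again stands in for the source step, and the one-second threshold is only an illustration of what the DurationToStartNew parameter controls.)

let
    DurationToStartNew = #duration(0, 0, 0, 1),  // gap that starts a new run; exposed as a parameter in the attached file
    Sorted   = Table.Sort(RawData, {{"DateTime", Order.Ascending}}),
    Times    = Sorted[DateTime],
    // Gap to the previous row (the first row is forced to start a new run)
    Gaps     = List.Transform({0 .. List.Count(Times) - 1},
                   each if _ = 0 then DurationToStartNew + #duration(0, 0, 0, 1) else Times{_} - Times{_ - 1}),
    // Running run number: increases by 1 whenever a gap exceeds the threshold
    RunIds   = List.Skip(
                   List.Accumulate(Gaps, {0},
                       (state, gap) => state & {List.Last(state) + (if gap > DurationToStartNew then 1 else 0)}),
                   1),
    WithRun  = Table.FromColumns(Table.ToColumns(Sorted) & {RunIds}, Table.ColumnNames(Sorted) & {"Run"})
in
    WithRun

Elapsed time within each run is then the row's DateTime minus the run's first DateTime, and pivoting on Run (Speed as value, no aggregation) gives one column per run.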

 

Imke Feldmann (The BIccountant)


Anonymous
Not applicable

That works beautifully! Thank you so much for helping.
