Linnil
Helper II

Group By Based on Sets of Dates / MIN Date

Hi Everyone

 

I have timesheets and Employees who take Leave, and the Leave might stop and start in the same pay period.
In my example, the EMP has taken the Leave in 2 "blocks".

 

What I have:

| EMP NO | Leave Type | Hours | Date       |
|--------|------------|-------|------------|
| 1055   | 68         | 8     | 15/01/2024 |
| 1055   | 68         | 7     | 16/01/2024 |
| 1055   | 68         | 8     | 17/01/2024 |
| 1055   | 68         | 8     | 18/01/2024 |
| 1055   |            |       | 19/01/2024 |
| 1055   |            |       | 20/01/2024 |
| 1055   | 68         | 8     | 21/01/2024 |
| 1055   | 68         | 8     | 22/01/2024 |
| 1055   | 68         | 8     | 23/01/2024 |
| 1055   | 68         | 8     | 24/01/2024 |
| 1055   | 68         | 8     | 25/01/2024 |
| 1055   |            |       | 26/01/2024 |

What I would like:

| EMP NO | Leave Type | Hours | Start Date | End Date   |
|--------|------------|-------|------------|------------|
| 1055   | 68         | 31    | 15/01/2024 | 18/01/2024 |
| 1055   | 68         | 40    | 21/01/2024 | 25/01/2024 |

 

I am running a Group By on EMP NO and Leave Type with a Min of Date.
However, the Group By returns only one Min Date: 15 Jan 24.
1) If I could get two Min Dates (one per block of leave), the Group By would work better.


Because the Group By only finds one Min Date, I get 71 hours, and it can count that there were 9 leave days (I just gave each date line a value of 1 and summed), BUT it adds those 9 days to the Min Date, so I get a Max Date of 23Jan24.

Hopefully you get the idea.
I can't remove dates as they are being used by other columns in the table.

I could perhaps "short circuit" the block of dates with a dummy Leave Type code, so that if the code changes, the Min Date resets and looks for the "next" Leave Type code again.

I can get my PQ code to work perfectly if the dates are continuous, but of course it's never that easy ...

Any help appreciated - TIA

PS In "What I would like", the empty rows are not necessary - they are only there to explain my problem.

1 ACCEPTED SOLUTION
dufoq3
Super User

Hi @Linnil, try grouping with the 4th argument, GroupKind.Local.

 

Comment: I'm not sure whether you want to group by Leave Type as well or not.

 

Result

[Result screenshot: dufoq3_0-1711093438256.png]

v1 GroupKind.Local

 

let
    // sample data as posted (decoded from compressed base64 text)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjQwNVXSUbIwAxFAbGiqb2Cob2RgZKIUq4MqbQ6SNsMpDdZtjl/aApu0AhQbWuKTNTLAa7SRIX5pI/zSxvilTfBLYw0zuMORgiwWAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"EMP NO" = _t, #"Leave Type" = _t, Hours = _t, Date = _t]),
    ChangedType = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Hours", Int64.Type}}),
    // GroupKind.Local starts a new group whenever EMP NO or Leave Type changes
    // between consecutive rows, so each block of leave days is grouped separately
    GroupedRowsLocal = Table.Group(ChangedType, {"EMP NO", "Leave Type"}, {{"Hours", each List.Sum([Hours]), type nullable number}, {"Start Date", each List.Min([Date]), type nullable date}, {"End Date", each List.Max([Date]), type nullable date}}, GroupKind.Local),
    // drop the groups made of non-leave days (null Hours)
    FilteredRows = Table.SelectRows(GroupedRowsLocal, each ([Hours] <> null))
in
    FilteredRows
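
For intuition, GroupKind.Local behaves much like Python's itertools.groupby: it only groups *consecutive* rows that share the same key, so each leave block becomes its own group. Here is a rough Python sketch of the same logic (illustrative only - the data and the function name `local_group` are made up to mirror the example above, this is not the M query):

```python
from itertools import groupby
from datetime import date

# Rows mirroring the forum example: (emp_no, leave_type, hours, date)
rows = [
    ("1055", "68", 8, date(2024, 1, 15)),
    ("1055", "68", 7, date(2024, 1, 16)),
    ("1055", "68", 8, date(2024, 1, 17)),
    ("1055", "68", 8, date(2024, 1, 18)),
    ("1055", None, None, date(2024, 1, 19)),
    ("1055", None, None, date(2024, 1, 20)),
    ("1055", "68", 8, date(2024, 1, 21)),
    ("1055", "68", 8, date(2024, 1, 22)),
    ("1055", "68", 8, date(2024, 1, 23)),
    ("1055", "68", 8, date(2024, 1, 24)),
    ("1055", "68", 8, date(2024, 1, 25)),
    ("1055", None, None, date(2024, 1, 26)),
]

def local_group(rows):
    """Group only consecutive rows sharing (emp_no, leave_type),
    analogous to Table.Group with GroupKind.Local."""
    out = []
    for (emp, leave), grp in groupby(rows, key=lambda r: (r[0], r[1])):
        grp = list(grp)
        if leave is None:  # mimic the FilteredRows step: drop non-leave groups
            continue
        out.append({
            "EMP NO": emp,
            "Leave Type": leave,
            "Hours": sum(r[2] for r in grp),
            "Start Date": grp[0][3],
            "End Date": grp[-1][3],
        })
    return out
```

Running this on the sample rows yields two groups: 31 hours (15-18 Jan) and 40 hours (21-25 Jan), matching the "What I would like" table.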

 

v2 GroupKind.Global (if you don't want to use v1 with GroupKind.Local for some reason, you can try this one, which uses the default grouping method)

let
    // sample data as posted (decoded from compressed base64 text)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjQwNVXSUbIwAxFAbGiqb2Cob2RgZKIUq4MqbQ6SNsMpDdZtjl/aApu0AhQbWuKTNTLAa7SRIX5pI/zSxvilTfBLYw0zuMORgiwWAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"EMP NO" = _t, #"Leave Type" = _t, Hours = _t, Date = _t]),
    ChangedType = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Hours", Int64.Type}}),
    // row index, used to build a per-block key
    AddedIndex = Table.AddIndexColumn(ChangedType, "Index", 0, 1, Int64.Type),
    // mark each non-leave row (null Hours) with its own index
    Ad_GroupHelper = Table.AddColumn(AddedIndex, "GroupHelper", each if [Hours] = null then [Index] else null, Int64.Type),
    // fill upwards so every leave row inherits the index of the NEXT non-leave row,
    // giving each contiguous block of leave days a unique key
    FilledUp = Table.FillUp(Ad_GroupHelper,{"GroupHelper"}),
    FilteredRows = Table.SelectRows(FilledUp, each ([Hours] <> null)),
    GroupedRows = Table.Group(FilteredRows, {"EMP NO", "GroupHelper"}, {{"Hours", each List.Sum([Hours]), type nullable number}, {"Start Date", each List.Min([Date]), type nullable date}, {"End Date", each List.Max([Date]), type nullable date}})
in
    GroupedRows
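
The idea behind v2 can be summarised: each contiguous leave block gets tagged with the index of the *next* null-hours row (that is what the fill-up does), so even a default, order-independent group-by keeps the blocks apart. A hypothetical Python sketch of just that key-building step (the function name `group_helper_keys` is invented for illustration):

```python
def group_helper_keys(hours):
    """For each row, return the index of the next null-hours row,
    mirroring the Index + FillUp trick: rows in the same contiguous
    leave block end up with the same key."""
    keys = [None] * len(hours)
    next_null = None
    for i in range(len(hours) - 1, -1, -1):  # walk backwards = fill up
        if hours[i] is None:
            next_null = i
        keys[i] = next_null
    return keys
```

With the example's Hours column [8, 7, 8, 8, None, None, 8, 8, 8, 8, 8, None], the first block is keyed 4 and the second 11; after filtering out the null rows, grouping on (EMP NO, key) splits the two blocks correctly.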

 


Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.


4 REPLIES 4

Hi @dufoq3 - thanks for taking the time to look at this.
v1 worked fine. I had actually removed the null values for my Hours earlier (though I showed them in my example), so I reinstated the null Hours values and it worked very well.

FYI: I actually find I have to group on 4 or more fields, so to save any pain, I just merge the fields I need to group by into one delimited column, do my Group By, and then split them by the delimiter once I have my grouping results. It makes Group By a lot easier when there's just one field (and I'm always nervous grouping by more than one field - that's just me!)
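
The merge-then-split trick above can be sketched as follows (a Python analogy with invented sample data, not the actual PQ steps): join the key fields with a delimiter that never occurs in the data, group on the single combined key, then split it back afterwards.

```python
# Illustrative rows with several key fields (names and values made up)
rows = [
    {"EMP NO": "1055", "Leave Type": "68", "Dept": "A", "Hours": 8},
    {"EMP NO": "1055", "Leave Type": "68", "Dept": "A", "Hours": 7},
    {"EMP NO": "1055", "Leave Type": "70", "Dept": "B", "Hours": 4},
]
key_fields = ["EMP NO", "Leave Type", "Dept"]
DELIM = "|"  # must not appear in any of the key values

totals = {}
for r in rows:
    key = DELIM.join(r[f] for f in key_fields)   # merge key fields into one
    totals[key] = totals.get(key, 0) + r["Hours"]

# split the combined key back into its fields after grouping
result = [dict(zip(key_fields, k.split(DELIM)), Hours=v)
          for k, v in totals.items()]
```

The caveat, as in Power Query, is choosing a delimiter guaranteed not to occur inside any key value.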

Thanks again and best wishes 🙂

wdx223_Daniel
Super User
Super User

NewStep = Table.Group(
    WhatYouHave,
    {"EMP NO", "Leave Type"},
    {
        {"Hours", each List.Sum([Hours])},
        {"Start Date", each List.Min(List.RemoveNulls([Date]))},
        {"End Date", each List.Max(List.RemoveNulls([Date]))}
    },
    0,
    (x, y) => Byte.From(x[EMP NO] <> y[EMP NO] or y[Leave Type] = null)
)

Hi @wdx223_Daniel, thanks for your input. I tried that solution - it gave me a total of 71 hours with Start Date 15Jan24 and End Date 26Jan24, which was the same point I got stuck on myself when I was coding.
The solutions above captured the break in the days / the 2 sets of subtotals.
Thanks again
