Anonymous

Sum based on date range

Hello, I am currently having an issue calculating a sum based on a condition in adjacent columns in Power Query M.

 

I want to sum the column "Value" over the rows where the Start_CW and END_CW dates fall within the range defined by the Valid_from and Valid_to columns.

 

Thanks in advance, 

Best regards

 

Index  CW  Start_CW    END_CW      Valid_from  Valid_to    Value
1      50  12/6/2021   12/12/2021  12/14/2021  12/25/2021  15
1      51  12/13/2021  12/19/2021  12/14/2021  12/25/2021  20
1      52  12/20/2021  12/26/2021  12/14/2021  12/25/2021  25
1 ACCEPTED SOLUTION
edhans
Super User

None of your start/end dates are between the valid from/to dates, so nothing will be returned.

I added some rows where the condition was valid, but the query returns the same sum for every row.

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTI1ABKGRvpm+kYGRoYQNhAhcUyQOEamcI6pUqwO1AiYSmNkbZaEzTAyQJhhBJU0QFZpRoQZSO4whqpE0WZOhF8g7gA5wdQUmxkWhM0AOiMWAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Index = _t, CW = _t, Start_CW = _t, END_CW = _t, Valid_from = _t, Valid_to = _t, Value = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Valid_from", type date}, {"Valid_to", type date}, {"Value", Int64.Type}, {"Start_CW", type date}, {"END_CW", type date}}),
    GroupedSum =
        Table.AddColumn(
            #"Changed Type",
            "Grouped Sum",
            each
            List.Sum(
                Table.SelectRows(
                    #"Changed Type",
                    // Note: every [field] reference inside this inner "each" resolves
                    // against the SelectRows row, not the outer row being expanded,
                    // so the same filter (and the same sum) applies to every row.
                    each [Start_CW] >= [Valid_from] and [END_CW] <= [Valid_to]
                )[Value]
            )
        )
in
    GroupedSum

 

But I am sure that is not what you want.
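If the intent is to sum Value over the rows whose Start_CW/END_CW fall inside each row's own Valid_from/Valid_to window, the inner `each` needs an explicit parameter for the outer row so the two row contexts don't collide. A minimal sketch of that fix, reusing the step names from the query above:

```m
    GroupedSum =
        Table.AddColumn(
            #"Changed Type",
            "Grouped Sum",
            // "outer" is the row the new column is being computed for;
            // [field] inside the inner "each" still refers to the candidate
            // row supplied by Table.SelectRows.
            (outer) =>
                List.Sum(
                    Table.SelectRows(
                        #"Changed Type",
                        each [Start_CW] >= outer[Valid_from]
                            and [END_CW] <= outer[Valid_to]
                    )[Value]
                )
        )
```

This gives each row its own filtered sum rather than one global sum repeated everywhere; whether that matches the desired grouping depends on the expected output, which is why sample data would help.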


 

Can you provide some valid data, as well as an example of the expected output?

By the way, this will NOT perform well on large datasets. Power Query isn't set up to scan tables like this. A few hundred rows will be no problem; a few thousand will be noticeable. Over 10K-15K records it will be painful, and over 100K it won't finish. This is best done in DAX.

How to get good help fast. Help us help you.

How To Ask A Technical Question If you Really Want An Answer

How to Get Your Question Answered Quickly - Give us a good and concise explanation
How to provide sample data in the Power BI Forum - Provide data in a table format per the link, or share an Excel/CSV file via OneDrive, Dropbox, etc. Provide expected output using a screenshot of Excel or another image. Do not provide a screenshot of the source data; I cannot paste an image into Power BI tables.



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

View solution in original post

