
AnnaSA
Helper I

RUNNING SUM BY DATE BY GROUP BY LANE

Hi everyone,

I hope you are doing well.

I am struggling a lot with the following problem and would appreciate any help SO MUCH!!

I want to calculate the running sum of PICK.LOAD_DIFF by hour, by date, by store, and by lane.

CONFIRMATION_TIME.HOUR is the date-and-hour group (e.g. 26/10/2019 18:00:00).

The calculated column I have:

 
 
CummTotal =
CALCULATE (
    SUM ( CROSS_TABLE_2[PICK.LOAD_DIFF] ),
    FILTER (
        CROSS_Table,
        CROSS_TABLE_2[RANK_LANE] = EARLIER ( CROSS_TABLE_2[RANK_LANE] )
            && CROSS_TABLE_2[RANK_DSTGROUP] = EARLIER ( CROSS_TABLE_2[RANK_DSTGROUP] )
            && CROSS_TABLE_2[CONFIRMATION_TIME.HOUR] <= EARLIER ( CROSS_TABLE_2[CONFIRMATION_TIME.HOUR] )
    )
)


This is the DAX I tried, but it just returns the PICK.LOAD_DIFF value itself rather than the running sum.

I would really appreciate any help.
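A hedged guess at what is going wrong: FILTER iterates CROSS_Table, while every column reference inside it points at CROSS_TABLE_2, so the rows being filtered and the columns being compared may not come from the same table. A minimal sketch with both references pointing at CROSS_TABLE_2 (assuming that is the intended table):

CummTotal =
CALCULATE (
    SUM ( CROSS_TABLE_2[PICK.LOAD_DIFF] ),
    FILTER (
        CROSS_TABLE_2,  -- same table as the column references below
        CROSS_TABLE_2[RANK_LANE] = EARLIER ( CROSS_TABLE_2[RANK_LANE] )
            && CROSS_TABLE_2[RANK_DSTGROUP] = EARLIER ( CROSS_TABLE_2[RANK_DSTGROUP] )
            -- keep only rows up to and including the current row's date-hour
            && CROSS_TABLE_2[CONFIRMATION_TIME.HOUR] <= EARLIER ( CROSS_TABLE_2[CONFIRMATION_TIME.HOUR] )
    )
)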


9 REPLIES
parry2k
Super User

@AnnaSA it would be helpful if you put sample data in an Excel file with the expected output and shared it here; that could be used to put together a solution.




Hi @parry2k 

I apologize; I am new to Power BI and this community.

Will this help?

So, I have multiple stores. One store can be assigned to two lanes, and there can be more than one store in a lane. There are units going in and out throughout the day, and I have to calculate how many units were in a specific hour group on a day, in a specific lane, for a specific store.

I would truly appreciate any help.

Please let me know if you need more information.

DATE | HOUR | STORE | LANE | IN | OUT | DIFF | CUMTOTAL
02/02/2020 | 08:00:00 | STORE 1 | LANE A | 3 | 2 | 1 | 1
02/02/2020 | 09:00:00 | STORE 1 | LANE A | 5 | 4 | 1 | 2
02/02/2020 | 10:00:00 | STORE 1 | LANE A | 6 | 1 | 5 | 7
02/02/2020 | 11:00:00 | STORE 1 | LANE A | 1 | 7 | -6 | 1
02/02/2020 | 12:00:00 | STORE 1 | LANE A | 10 | 1 | 9 | 10
02/02/2020 | 13:00:00 | STORE 1 | LANE A | 6 | 1 | 5 | 15
02/02/2020 | 14:00:00 | STORE 1 | LANE A | 0 | 0 | 0 | 15
02/02/2020 | 15:00:00 | STORE 1 | LANE B | 0 | 9 | -9 | 0
02/02/2020 | 16:00:00 | STORE 1 | LANE B | 7 | 2 | 5 | 5
03/02/2020 | 17:00:00 | STORE 1 | LANE B | 5 | 3 | 2 | 7
03/02/2020 | 18:00:00 | STORE 1 | LANE B | 3 | 0 | 3 | 10
03/02/2020 | 19:00:00 | STORE 2 | LANE C | 7 | 0 | 7 | 0
03/02/2020 | 20:00:00 | STORE 2 | LANE C | 6 | 2 | 4 | 4
03/02/2020 | 21:00:00 | STORE 2 | LANE C | 10 | 8 | 2 | 6
03/02/2020 | 22:00:00 | STORE 2 | LANE C | 2 | 1 | 1 | 7
03/02/2020 | 23:00:00 | STORE 2 | LANE C | 8 | 1 | 7 | 14
03/02/2020 | 00:00:00 | STORE 2 | LANE C | 0 | 1 | -1 | 13
03/02/2020 | 01:00:00 | STORE 2 | LANE C | 8 | 1 | 7 | 20
03/02/2020 | 02:00:00 | STORE 2 | LANE C | 1 | 1 | 0 | 20

DATE | HOUR | STORE | LANE | IN | OUT | DIFF | CUMTOTAL
02/02/2020 | 08:00:00 | STORE 1 | LANE A | 3 | 2 | 1 | 1
02/02/2020 | 09:00:00 | STORE 1 | LANE A | 5 | 4 | 1 | 2
02/02/2020 | 10:00:00 | STORE 1 | LANE A | 6 | 1 | 5 | 7
02/02/2020 | 11:00:00 | STORE 1 | LANE A | 1 | 7 | -6 | 1
02/02/2020 | 12:00:00 | STORE 1 | LANE A | 10 | 1 | 9 | 10
02/02/2020 | 13:00:00 | STORE 1 | LANE A | 6 | 1 | 5 | 15
02/02/2020 | 14:00:00 | STORE 1 | LANE A | 0 | 0 | 0 | 15
02/02/2020 | 15:00:00 | STORE 1 | LANE B | 10 | 9 | 1 | 1
02/02/2020 | 16:00:00 | STORE 1 | LANE B | 7 | 2 | 5 | 6
03/02/2020 | 17:00:00 | STORE 1 | LANE B | 5 | 3 | 2 | 8
03/02/2020 | 18:00:00 | STORE 1 | LANE B | 3 | 0 | 3 | 11
03/02/2020 | 19:00:00 | STORE 2 | LANE C | 7 | 0 | 7 | 7
03/02/2020 | 20:00:00 | STORE 2 | LANE C | 6 | 2 | 4 | 11
03/02/2020 | 21:00:00 | STORE 2 | LANE C | 10 | 8 | 2 | 13
03/02/2020 | 22:00:00 | STORE 2 | LANE C | 2 | 1 | 1 | 14
03/02/2020 | 23:00:00 | STORE 2 | LANE C | 8 | 1 | 7 | 21
03/02/2020 | 00:00:00 | STORE 2 | LANE C | 0 | 1 | -1 | 20
03/02/2020 | 01:00:00 | STORE 2 | LANE C | 8 | 1 | 7 | 27
03/02/2020 | 02:00:00 | STORE 2 | LANE C | 1 | 1 | 0 | 27


I see there is a slight error in the first table; the second one is correct. (DIFF = IN - OUT, and CUMTOTAL is the running sum of DIFF within each store and lane.)

 

@AnnaSA have a look at the attached solution: a table called Table (ignore the other tables in the pbix) contains your sample data, and a table visual shows the running total. I did it two ways, so you can pick and choose.
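The two ways live in the attached pbix rather than in the post, so here is a hedged sketch of two common running-total patterns over the sample above (the table name 'Table' and the columns DATE, HOUR, STORE, LANE, DIFF are assumptions taken from the sample data):

-- Way 1: a measure, for a table visual with DATE, HOUR, STORE, LANE on the rows
Running Total (measure) =
VAR CurrentDate = MAX ( 'Table'[DATE] )
VAR CurrentHour = MAX ( 'Table'[HOUR] )
RETURN
    CALCULATE (
        SUM ( 'Table'[DIFF] ),
        FILTER (
            -- drop the date/hour filters but keep the current store and lane
            ALLEXCEPT ( 'Table', 'Table'[STORE], 'Table'[LANE] ),
            'Table'[DATE] < CurrentDate
                || ( 'Table'[DATE] = CurrentDate && 'Table'[HOUR] <= CurrentHour )
        )
    )

-- Way 2: the same logic as a calculated column, using EARLIER for the current row
Running Total (column) =
CALCULATE (
    SUM ( 'Table'[DIFF] ),
    FILTER (
        ALLEXCEPT ( 'Table', 'Table'[STORE], 'Table'[LANE] ),
        'Table'[DATE] < EARLIER ( 'Table'[DATE] )
            || ( 'Table'[DATE] = EARLIER ( 'Table'[DATE] )
                && 'Table'[HOUR] <= EARLIER ( 'Table'[HOUR] ) )
    )
)

Note that both sketches deliberately let the running total continue across midnight within a store and lane, which matches the corrected sample (e.g. LANE B goes 1, 6, 8, 11 across 02/02 and 03/02).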

 


Thank you VERY much for the help, @parry2k. I really appreciate it!!

When I try that measure, the table visual crashes, and the store (DSTGRP) is also not included.

Here is a snapshot of the actual table (filtered to a few days and a few stores).
PICK.LOAD_DIFF = Picking_HUs - Loading_HUs - OUT_HUs

The running total will have to be by hour, by day, by lane, and by DSTGRP, which makes it difficult for me to find a solution 😞

Do you maybe know how I can create a calculated column in this table for the running total of PICK.LOAD_DIFF?
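As an aside, the difference formula above is presumably just a row-level calculated column; a hedged one-liner, using the column names from the snapshot below and 'Table' as the table name:

PICK.LOAD_DIFF = 'Table'[Picking_HUs] - 'Table'[Loading_HUs] - 'Table'[OUT_HUs]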

CONFIRMATION_TIME.HOUR | DSTGRP | LANE_PICKING | Picking_HUs | Loading_HUs | OUT_HUs | PICK.LOAD_DIFF
2019/10/01 00:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/01 00:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 00:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/01 00:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 01:00 | GC04 | JJ012 | 1 | 1 | 0 | 0
2019/10/01 01:00 | GC04 | AA016 | 2 | 10 | 0 | -8
2019/10/01 01:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/01 01:00 | GC02 | MM065 | 2 | 0 | 0 | 2
2019/10/01 02:00 | GC04 | AA016 | 1 | 1 | 0 | 0
2019/10/01 02:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 02:00 | GC04 | JJ012 | 2 | 0 | 0 | 2
2019/10/01 02:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 03:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/01 03:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 03:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 03:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 04:00 | GC04 | AA016 | 4 | 0 | 0 | 4
2019/10/01 04:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/01 04:00 | GC04 | JJ012 | 2 | 0 | 0 | 2
2019/10/01 04:00 | GC02 | MM065 | 3 | 0 | 0 | 3
2019/10/01 05:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/01 05:00 | GC02 | BB037 | 2 | 0 | 0 | 2
2019/10/01 05:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 05:00 | GC02 | MM065 | 1 | 0 | 0 | 1
2019/10/01 06:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/01 06:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/01 06:00 | GC04 | JJ012 | 4 | 0 | 1 | 3
2019/10/01 06:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 07:00 | GC02 | BB037 | 1 | 0 | 1 | 0
2019/10/01 07:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/01 07:00 | GC04 | AA016 | 1 | 0 | 1 | 0
2019/10/01 07:00 | GC02 | MM065 | 1 | 0 | 0 | 1
2019/10/01 08:00 | GC04 | AA016 | 3 | 0 | 1 | 2
2019/10/01 08:00 | GC02 | BB037 | 3 | 0 | 0 | 3
2019/10/01 08:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 08:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 09:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/01 09:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/01 09:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 09:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 10:00 | GC04 | AA016 | 2 | 0 | 0 | 2
2019/10/01 10:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/01 10:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 10:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 11:00 | GC04 | AA016 | 2 | 0 | 0 | 2
2019/10/01 11:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 11:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/01 11:00 | GC02 | MM065 | 1 | 0 | 0 | 1
2019/10/01 12:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/01 12:00 | GC02 | BB037 | 1 | 0 | 1 | 0
2019/10/01 12:00 | GC04 | JJ012 | 3 | 0 | 0 | 3
2019/10/01 12:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 13:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 13:00 | GC04 | AA016 | 0 | 1 | 0 | -1
2019/10/01 13:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/01 13:00 | GC02 | MM065 | 0 | 0 | 1 | -1
2019/10/01 14:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/01 14:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 14:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 17:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/01 17:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 17:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 17:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 18:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 18:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/01 19:00 | GC04 | AA016 | 2 | 0 | 0 | 2
2019/10/01 19:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 20:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/01 20:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 20:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 20:00 | GC02 | MM065 | 0 | 3 | 0 | -3
2019/10/01 21:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/01 21:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/01 21:00 | GC02 | BB037 | 0 | 4 | 0 | -4
2019/10/01 21:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 22:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/01 22:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 22:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/01 22:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/01 23:00 | GC04 | AA016 | 1 | 2 | 0 | -1
2019/10/01 23:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/01 23:00 | GC04 | JJ012 | 1 | 1 | 0 | 0
2019/10/01 23:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 00:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/02 00:00 | GC02 | BB037 | 2 | 0 | 0 | 2
2019/10/02 00:00 | GC04 | JJ012 | 2 | 0 | 0 | 2
2019/10/02 00:00 | GC02 | MM065 | 1 | 0 | 0 | 1
2019/10/02 01:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/02 01:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 01:00 | GC04 | JJ012 | 2 | 0 | 0 | 2
2019/10/02 01:00 | GC02 | MM065 | 1 | 0 | 0 | 1
2019/10/02 02:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/02 02:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 02:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 04:00 | GC04 | AA016 | 5 | 0 | 0 | 5
2019/10/02 04:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 04:00 | GC04 | JJ012 | 4 | 0 | 0 | 4
2019/10/02 04:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 05:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/02 05:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 05:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 05:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 06:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/02 06:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/02 06:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 06:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 07:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 07:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/02 07:00 | GC04 | AA016 | 2 | 0 | 0 | 2
2019/10/02 07:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 08:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/02 08:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 08:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 08:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 09:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/02 09:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/02 09:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 09:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 10:00 | GC04 | AA016 | 3 | 0 | 0 | 3
2019/10/02 10:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 10:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 10:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 11:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/02 11:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 11:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 11:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 12:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/02 12:00 | GC04 | AA016 | 1 | 0 | 0 | 1
2019/10/02 12:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 12:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 13:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/02 13:00 | GC04 | AA016 | 2 | 0 | 1 | 1
2019/10/02 13:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 13:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 14:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/02 14:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/02 14:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 14:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 15:00 | GC04 | JJ012 | 0 | 0 | 0 | 0
2019/10/02 15:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/02 15:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 15:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 16:00 | GC04 | AA016 | 2 | 0 | 0 | 2
2019/10/02 16:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/02 17:00 | GC02 | MM065 | 1 | 0 | 0 | 1
2019/10/02 18:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 18:00 | GC04 | AA016 | 0 | 0 | 0 | 0
2019/10/02 18:00 | GC02 | BB037 | 1 | 0 | 0 | 1
2019/10/02 18:00 | GC02 | MM065 | 0 | 0 | 0 | 0
2019/10/02 19:00 | GC02 | BB037 | 0 | 0 | 0 | 0
2019/10/02 19:00 | GC04 | JJ012 | 1 | 3 | 0 | -2
2019/10/02 19:00 | GC04 | AA016 | 0 | 17 | 0 | -17
2019/10/02 20:00 | GC04 | JJ012 | 1 | 0 | 0 | 1
2019/10/02 20:00 | GC02 | MM065 | 0 | 0 | 0 | 0

Hi @AnnaSA 

 

Kindly check the results below; pbix attached.

Measure =
CALCULATE (
    SUM ( 'Table'[PICK.LOAD_DIFF] ),
    FILTER (
        ALL ( 'Table' ),
        -- keep only rows up to and including the current date-hour in the visual
        [CONFIRMATION_TIME.HOUR] <= MAX ( 'Table'[CONFIRMATION_TIME.HOUR] )
            && [Hour] <= MAX ( [Hour] )
    ),
    -- restore the lane and store (DSTGRP) filters that ALL removed
    VALUES ( 'Table'[LANE_PICKING] ),
    VALUES ( 'Table'[DSTGRP   ] )
)

(screenshot 6.PNG attached)

 

Community Support Team _ Dina Ye

Oh my goodness @v-diye-msft, THANK YOU so much. It does work.

Unfortunately, for the purpose of my analysis I will need that as a calculated column, since I will have to take its max / find the 90th percentile, etc.

Do you maybe know what alterations I can make to have it as a calculated column?
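For what it's worth, once the running total exists as a calculated column (it is named Column in the reply below), the follow-up aggregations become one-liners; a hedged sketch using that assumed name, with PERCENTILEX.INC for the 90th percentile:

Max RT = MAX ( 'Table'[Column] )

P90 RT = PERCENTILEX.INC ( 'Table', 'Table'[Column], 0.90 )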

Hi @AnnaSA 

 

Please use this one:

Column =
CALCULATE (
    SUM ( 'Table'[PICK.LOAD_DIFF] ),
    FILTER (
        -- clear all filters except the current row's lane and store (DSTGRP)
        ALLEXCEPT ( 'Table', 'Table'[LANE_PICKING], 'Table'[DSTGRP   ] ),
        -- keep only rows up to and including the current row's date-hour
        [CONFIRMATION_TIME.HOUR] <= EARLIER ( 'Table'[CONFIRMATION_TIME.HOUR] )
            && [Hour] <= EARLIER ( [Hour] )
    )
)

(screenshot 8.PNG attached)

Pbix attached.

Community Support Team _ Dina Ye
