Sssasa
Frequent Visitor

Buffer level - recursive calculation not possible?

Dear all,

I've been hitting my head against this problem for a while now; I've read around a lot and tried many solutions, but nothing seems to fit. My hunch is that it has to do with recursive calculations and the fact that Power BI cannot perform them.

 

I'm trying to create a dashboard in Power BI from a very simple Excel file, but I can't seem to manage it.

The main source data are a buffer level and a valve: the buffer gets (partially) emptied when the valve is open (valve value = 1), and I'd like to calculate the level decrease every time the valve is opened (and then closed again, valve value = 0).

In Excel, I simply keep summing up the level decrease until the next value of the valve is 0 (valve closed). To do that I reference the previous row of the same column (in principle it is a cumulative measure based on the previous row). The cumulation needs to stop when the valve is closed, and start over when the valve is opened again.
I've recreated the calculations in Power BI but I get stuck on this last calculation, because the cumulation does not stop and restart as I'd like. The next time the valve opens, whatever was calculated for the previous unloading gets added on top!

 

This is the formula I'm using now, with some screenshots below (unfortunately, due to the restrictions on my work computer, I cannot upload any files):

Cumulative level difference =
VAR calc =
    CALCULATE (
        SUM ( 'Table'[Level difference during unloading] ),
        TOPN (
            100,
            FILTER ( 'Table', 'Table'[Index] <= EARLIER ( 'Table'[Index] ) ),
            'Table'[Index], DESC
        )
    )
VAR _if = calc * 'Table'[Valve open/close]
RETURN
    _if

 

[Screenshots: Example.PNG, Example 2.PNG]

 

Any help will be extremely appreciated! 

Thanks

Sara

1 ACCEPTED SOLUTION

Ah, I have misunderstood the ask. You want it the other way round.

 

let
    Query2 = let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("VZTZkQMhDERT2fI3rpKEDnAqLuefhjl0eL+YYXigVjfzfj9AXtBfAI/2QBtN1rif//bYhMZ5/rS7UH3hgEa+EJt1u4tFzzsOSmAW0B2ABkMuwHt8QmOFIJCKGElIn/8JAE1CglilJDFseFGn0NapgFEAOrC2VPYzaO/yXN9MAiGMDs3Ge7wrVPEXoSV9JMKB4GnqQaxN9rr6Hp+j8cjCyBxRPd06iKxNrySe806ozUA6BAJHy0G0ofJFTqfXxBYXSLgto0Eg1MD0FmZyJ1QwES2EHFnnue885u1gl57ELILTFLR+CXaCLMVzGC8zbcS1pQsxt3VwAlLSCwAmt+QeJZjCeRRQ0YJO3l2OMGZ8BYsI3TspeD3kk8qVAxBLJGwXbTOQ3ig8hHkndGRdErZLr6T01tkRkTth0/O43QgE65SVszl/Qr/zGB5qXXKBCteKsLmJAo6QJaKFYMkn8MLIXVSaiYTvbMcUD1fHe0qfHi7DHkjeeCkfqU2/WqzjvPMsQAqI39Uui+3XlVVW3CytK88jrd/iUd0V7xesOj+fLw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, #"Buffer level" = _t, #"Valve open/close" = _t, #"Level difference" = _t, #"rolling average level difference" = _t, #"corrected level difference" = _t]),
    #"Grouped Rows" = Table.Group(Source, {"Valve open/close"}, {{"rows", each process(_)}},GroupKind.Local),
    process = (t)=>
       let 
    #"Changed Type" = Table.TransformColumnTypes(t,{{"Valve open/close", Int64.Type},{"corrected level difference", type number}},"nl"),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "Unloading", each [#"Valve open/close"]*[corrected level difference]),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Cumul difference", (k)=> List.Sum(Table.SelectRows(#"Added Custom",each [Index]<=k[Index])[Unloading]))
in
    #"Added Custom1"
in 
   #"Grouped Rows",
    #"Expanded rows" = Table.ExpandTableColumn(Query2, "rows", {"Timestamp", "Buffer level", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"}, {"Timestamp", "Buffer level", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Expanded rows",{"Timestamp", "Buffer level", "Valve open/close", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"})
in
    #"Reordered Columns"


9 REPLIES
Sssasa
Frequent Visitor

Hi, thank you, I thought I had replied but I do not see it anymore.

Because of the restrictions on my work computer I cannot upload any files to external services such as WeTransfer. I've copied the tables below with the expected result and with what I get instead.

Unfortunately Power Query doesn't seem to be an option, as one of my columns is a rolling average and that takes too much time/power in Power Query (unless I'm doing something wrong). Do you have a solution for that?

And if yes, which arguments should List.Accumulate have? Thank you!

What I expect:

Excel formulas (column letters refer to the table below): corrected level difference = D - E; Level difference during unloading = IF(C = 0; 0; F); Cumulative level difference = IF(G = 0; 0; H[n-1] + G)

Timestamp | Buffer level | Valve open/close | Level difference | rolling average level difference | corrected level difference | Level difference during unloading | Cumulative level difference | Filter trucks
05:03:00 | 178,5 | 0 | | 0,528 | | 0 | |
05:06:00 | 180,2 | 0 | 1,737 | 0,556 | 1,182 | 0 | 0 | 0
05:09:00 | 180,3 | 0 | 0,085 | 0,545 | -0,460 | 0 | 0 | 0
05:12:00 | 180,8 | 0 | 0,539 | 0,545 | -0,006 | 0 | 0 | 0
05:15:00 | 181,7 | 0 | 0,878 | 0,552 | 0,326 | 0 | 0 | 0
05:18:00 | 181,1 | 0 | -0,649 | 0,527 | -1,175 | 0 | 0 | 0
05:21:00 | 179,4 | 1 | -1,661 | 0,527 | -2,188 | -2,187712627 | -2,187712627 | 0
05:24:00 | 171,5 | 1 | -7,948 | 0,538 | -8,486 | -8,486121171 | -10,6738338 | 0
05:27:00 | 166,3 | 1 | -5,180 | 0,499 | -5,679 | -5,679383612 | -16,35321741 | 0
05:30:00 | 160,1 | 1 | -6,164 | 0,485 | -6,649 | -6,648694671 | -23,00191208 | 0
05:33:00 | 158,0 | 1 | -2,076 | 0,575 | -2,651 | -2,651090943 | -25,65300302 | 0
05:36:00 | 158,2 | 1 | 0,137 | 0,489 | -0,353 | -0,352762115 | -26,00576514 | 1
05:39:00 | 158,4 | 0 | 0,173 | 0,449 | -0,276 | 0 | 0 | 0
05:42:00 | 159,7 | 0 | 1,354 | 0,470 | 0,884 | 0 | 0 | 0
05:45:00 | 160,7 | 0 | 1,042 | 0,523 | 0,519 | 0 | 0 | 0
05:48:00 | 160,8 | 0 | 0,032 | 0,494 | -0,462 | 0 | 0 | 0
05:51:00 | 160,2 | 1 | -0,618 | 0,439 | -1,057 | -1,057102974 | -1,057102974 | 0
05:54:00 | 156,9 | 1 | -3,280 | 0,409 | -3,689 | -3,688966736 | -4,74606971 | 0
05:57:00 | 153,5 | 1 | -3,340 | 0,455 | -3,795 | -3,795230347 | -8,541300057 | 0
06:00:00 | 151,9 | 1 | -1,599 | 0,552 | -2,151 | -2,150819885 | -10,69211994 | 0
06:03:00 | 150,3 | 1 | -1,677 | 0,450 | -2,127 | -2,126959898 | -12,81907984 | 0
06:06:00 | 150,1 | 1 | -0,200 | 0,429 | -0,629 | -0,62893065 | -13,44801049 | 0
06:09:00 | 147,7 | 1 | -2,317 | 0,395 | -2,713 | -2,712622644 | -16,16063313 | 1
06:12:00 | 150,7 | 0 | 2,961 | 0,468 | 2,493 | 0 | 0 | 0
06:15:00 | 150,5 | 0 | -0,247 | 0,439 | -0,686 | 0 | 0 | 0
06:18:00 | 148,8 | 0 | -1,616 | 0,402 | -2,017 | 0 | 0 | 0



Sssasa
Frequent Visitor

What I get:

Timestamp | Buffer level | Valve open/close | Level difference | rolling average level difference | corrected level difference | Level difference during unloading | Cumulative level difference | Filter trucks
05:03:00 | 178,5 | 0 | | 0,528 | | 0 | |
05:06:00 | 180,2 | 0 | 1,737 | 0,556 | 1,182 | 0 | 0 | 0
05:09:00 | 180,3 | 0 | 0,085 | 0,545 | -0,460 | 0 | 0 | 0
05:12:00 | 180,8 | 0 | 0,539 | 0,545 | -0,006 | 0 | 0 | 0
05:15:00 | 181,7 | 0 | 0,878 | 0,552 | 0,326 | 0 | 0 | 0
05:18:00 | 181,1 | 0 | -0,649 | 0,527 | -1,175 | 0 | 0 | 0
05:21:00 | 179,4 | 1 | -1,661 | 0,527 | -2,188 | -2,187712627 | -2,187712627 | 0
05:24:00 | 171,5 | 1 | -7,948 | 0,538 | -8,486 | -8,486121171 | -10,6738338 | 0
05:27:00 | 166,3 | 1 | -5,180 | 0,499 | -5,679 | -5,679383612 | -16,35321741 | 0
05:30:00 | 160,1 | 1 | -6,164 | 0,485 | -6,649 | -6,648694671 | -23,00191208 | 0
05:33:00 | 158,0 | 1 | -2,076 | 0,575 | -2,651 | -2,651090943 | -25,65300302 | 0
05:36:00 | 158,2 | 1 | 0,137 | 0,489 | -0,353 | -0,352762115 | -26,00576514 | 1
05:39:00 | 158,4 | 0 | 0,173 | 0,449 | -0,276 | 0 | 0 | 0
05:42:00 | 159,7 | 0 | 1,354 | 0,470 | 0,884 | 0 | 0 | 0
05:45:00 | 160,7 | 0 | 1,042 | 0,523 | 0,519 | 0 | 0 | 0
05:48:00 | 160,8 | 0 | 0,032 | 0,494 | -0,462 | 0 | 0 | 0
05:51:00 | 160,2 | 1 | -0,618 | 0,439 | -1,057 | -1,057102974 | -27,06286811 | 0
05:54:00 | 156,9 | 1 | -3,280 | 0,409 | -3,689 | -3,688966736 | -30,75183485 | 0
05:57:00 | 153,5 | 1 | -3,340 | 0,455 | -3,795 | -3,795230347 | -34,5470652 | 0
06:00:00 | 151,9 | 1 | -1,599 | 0,552 | -2,151 | -2,150819885 | -36,69788508 | 0
06:03:00 | 150,3 | 1 | -1,677 | 0,450 | -2,127 | -2,126959898 | -38,82484498 | 0
06:06:00 | 150,1 | 1 | -0,200 | 0,429 | -0,629 | -0,62893065 | -39,45377563 | 0
06:09:00 | 147,7 | 1 | -2,317 | 0,395 | -2,713 | -2,712622644 | -42,16639827 | 1
06:12:00 | 150,7 | 0 | 2,961 | 0,468 | 2,493 | 0 | 0 | 0
06:15:00 | 150,5 | 0 | -0,247 | 0,439 | -0,686 | 0 | 0 | 0
06:18:00 | 148,8 | 0 | -1,616 | 0,402 | -2,017 | 0 | 0 | 0

What's the rolling average window?

Your level difference column also seems to have higher precision than the buffer level column.

 

[Screenshot: lbendlin_0-1715700731327.png]

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("VZTZkQMhDERT2fI3rpKEDnAqLuefhjl0eL+YYXigVjfzfj9AXtBfAI/2QBtN1rif//bYhMZ5/rS7UH3hgEa+EJt1u4tFzzsOSmAW0B2ABkMuwHt8QmOFIJCKGElIn/8JAE1CglilJDFseFGn0NapgFEAOrC2VPYzaO/yXN9MAiGMDs3Ge7wrVPEXoSV9JMKB4GnqQaxN9rr6Hp+j8cjCyBxRPd06iKxNrySe806ozUA6BAJHy0G0ofJFTqfXxBYXSLgto0Eg1MD0FmZyJ1QwES2EHFnnue885u1gl57ELILTFLR+CXaCLMVzGC8zbcS1pQsxt3VwAlLSCwAmt+QeJZjCeRRQ0YJO3l2OMGZ8BYsI3TspeD3kk8qVAxBLJGwXbTOQ3ig8hHkndGRdErZLr6T01tkRkTth0/O43QgE65SVszl/Qr/zGB5qXXKBCteKsLmJAo6QJaKFYMkn8MLIXVSaiYTvbMcUD1fHe0qfHi7DHkjeeCkfqU2/WqzjvPMsQAqI39Uui+3XlVVW3CytK88jrd/iUd0V7xesOj+fLw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, #"Buffer level" = _t, #"Valve open/close" = _t, #"Level difference" = _t, #"rolling average level difference" = _t, #"corrected level difference" = _t]),
    #"Added Index" = Table.AddIndexColumn(Source, "Index", 0, 1, Int64.Type),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Index",{{"Timestamp", type time}, {"Buffer level", type number}, {"Level difference", type number}, {"rolling average level difference", type number}, {"corrected level difference", type number}},"nl"),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Unloading", each if [#"Valve open/close"]="0" then 0 else[corrected level difference],type number),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Cumul difference", (k)=> if k[Unloading]=0 then 0 else List.Sum(Table.SelectRows(#"Added Custom",each [Index]<=k[Index])[Unloading]))
in
    #"Added Custom1"

How to use this code: Create a new Blank Query. Click on "Advanced Editor". Replace the code in the window with the code provided here. Click "Done". Once you have examined the code, replace the Source step with your own source.

Hello,

Thank you for your reply!

Unfortunately this does not solve my problem (I had already arrived at this result). The cumulative difference does not reset to 0 after the valve has been closed and opened again, and that's exactly what I need in order to calculate how much has been unloaded.

The -27 that I have highlighted in the screenshot below needs to be -1,05, which is basically the sum of the previous cell (0) plus the difference in level now that the valve is open again.

It's a very straightforward operation in Excel, but I can't seem to make it work here; every single option I've tried leads to exactly this result 😞

[Screenshot: Knipsel3.PNG]

Ah, I have misunderstood the ask. You want it the other way round.

 

let
    Query2 = let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("VZTZkQMhDERT2fI3rpKEDnAqLuefhjl0eL+YYXigVjfzfj9AXtBfAI/2QBtN1rif//bYhMZ5/rS7UH3hgEa+EJt1u4tFzzsOSmAW0B2ABkMuwHt8QmOFIJCKGElIn/8JAE1CglilJDFseFGn0NapgFEAOrC2VPYzaO/yXN9MAiGMDs3Ge7wrVPEXoSV9JMKB4GnqQaxN9rr6Hp+j8cjCyBxRPd06iKxNrySe806ozUA6BAJHy0G0ofJFTqfXxBYXSLgto0Eg1MD0FmZyJ1QwES2EHFnnue885u1gl57ELILTFLR+CXaCLMVzGC8zbcS1pQsxt3VwAlLSCwAmt+QeJZjCeRRQ0YJO3l2OMGZ8BYsI3TspeD3kk8qVAxBLJGwXbTOQ3ig8hHkndGRdErZLr6T01tkRkTth0/O43QgE65SVszl/Qr/zGB5qXXKBCteKsLmJAo6QJaKFYMkn8MLIXVSaiYTvbMcUD1fHe0qfHi7DHkjeeCkfqU2/WqzjvPMsQAqI39Uui+3XlVVW3CytK88jrd/iUd0V7xesOj+fLw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, #"Buffer level" = _t, #"Valve open/close" = _t, #"Level difference" = _t, #"rolling average level difference" = _t, #"corrected level difference" = _t]),
    #"Grouped Rows" = Table.Group(Source, {"Valve open/close"}, {{"rows", each process(_)}},GroupKind.Local),
    process = (t)=>
       let 
    #"Changed Type" = Table.TransformColumnTypes(t,{{"Valve open/close", Int64.Type},{"corrected level difference", type number}},"nl"),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "Unloading", each [#"Valve open/close"]*[corrected level difference]),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Cumul difference", (k)=> List.Sum(Table.SelectRows(#"Added Custom",each [Index]<=k[Index])[Unloading]))
in
    #"Added Custom1"
in 
   #"Grouped Rows",
    #"Expanded rows" = Table.ExpandTableColumn(Query2, "rows", {"Timestamp", "Buffer level", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"}, {"Timestamp", "Buffer level", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Expanded rows",{"Timestamp", "Buffer level", "Valve open/close", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"})
in
    #"Reordered Columns"

Thank you!! This works indeed.

However I now have the problem that the rolling average, when performed in Power Query, takes way too long (and this cumulative calculation depends on the rolling average column).

Is there a way to make it efficient?
This is what I use now:

= let v=#"Added Index"[Value] in Table.AddColumn(#"Added Index", "MA5_PQ", each let pos1 = List.Max({0, [Index]-4}) in List.Average(List.Range(v, pos1, [Index]-pos1+1)))

Please provide sample data that fully covers your issue.

lbendlin
Super User

You cannot do anything recursive in Power BI. Change your process to make it non-recursive or use Power Query  (List.Accumulate).
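
To illustrate the List.Accumulate idea (a minimal sketch on made-up lists, implementing the reset-on-closed-valve rule from the question; the valve and diff values below are hypothetical):

let
    // Hypothetical sample data: valve state and corrected level difference per row
    valve = {0, 1, 1, 0, 1},
    diff  = {0, -2.2, -8.5, 0, -1.1},
    // List.Accumulate(list, seed, accumulator) folds over the row positions.
    // The state record carries the running sum (cum) and the output list (out);
    // the running sum resets to 0 whenever the valve is closed.
    result = List.Accumulate(
        List.Positions(valve),
        [cum = 0, out = {}],
        (state, i) =>
            let newCum = if valve{i} = 0 then 0 else state[cum] + diff{i}
            in [cum = newCum, out = state[out] & {newCum}]
    )[out]
in
    result    // {0, -2.2, -10.7, 0, -1.1}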

 

Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).

Do not include sensitive information or anything not related to the issue or question.

If you are unsure how to upload data please refer to https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...

Please show the expected outcome based on the sample data you provided.

Want faster answers? https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...
