Anonymous
Not applicable

Conditional Cumulative Calculated Column

Hi, I would like to create a cumulative counting column that adds 1 each time [rsup] changes from "0" back to a non-zero value. The processing order should follow [date], as in the example below. Whenever [rsup] is "0", the [Goal Column] should equal "0". Once it changes to anything but a "0", it should add 1 to the existing [Goal Column] value.

 

As you can see, there are multiple [Well] categories, so the calculation should be grouped (filtered) by [Well].

 

Thank you, and let me know if you need an additional description of the problem! I really appreciate your help; I've been stuck on this for a while.

 

 

 

Well    | date          | rsup | Goal Column
justin  | 6/1/2021 5:00 | 7    | 1
justin  | 6/1/2021 5:10 | 7    | 1
justin  | 6/1/2021 5:20 | 7    | 1
justin  | 6/1/2021 5:30 | 1    | 1
justin  | 6/1/2021 5:40 | 7    | 1
justin  | 6/1/2021 5:50 | 0    | 0
justin  | 6/1/2021 6:00 | 7    | 2
justin  | 6/1/2021 6:10 | 7    | 2
justin  | 6/1/2021 6:20 | 0    | 0
justin  | 6/1/2021 6:30 | 7    | 3
justin  | 6/1/2021 6:40 | 7    | 3
brett   | 6/4/2021 2:10 | 7    | 1
brett   | 6/4/2021 2:20 | 7    | 1
brett   | 6/4/2021 2:30 | 0    | 0
brett   | 6/4/2021 2:40 | 7    | 2
brett   | 6/4/2021 2:50 | 1    | 2
brett   | 6/4/2021 3:00 | 0    | 0
brett   | 6/4/2021 3:10 | 7    | 3
Jihwan_Kim
Super User

Hi, @Anonymous 

Please check the sample pbix file's link down below.

 

https://www.dropbox.com/s/tq8jyocbvb32hj6/jklibv2.pbix?dl=0 

 

step one CC =
VAR currentwell = 'Table'[Well]
VAR currentdatetime = 'Table'[date]
VAR previousdatetime =
    CALCULATE (
        MAX ( 'Table'[date] ),
        FILTER (
            'Table',
            'Table'[Well] = currentwell
                && 'Table'[date] < currentdatetime
        )
    )
RETURN
    SWITCH (
        TRUE (),
        // rows where rsup is 0 never start a new cycle
        'Table'[rsup] = 0, 0,
        // the previous row of the same well is non-zero, so no new cycle starts here
        CALCULATE (
            SUM ( 'Table'[rsup] ),
            FILTER (
                'Table',
                'Table'[Well] = currentwell
                    && 'Table'[date] = previousdatetime
            )
        ) <> 0, 0,
        // otherwise rsup just came back from 0 (or this is the first row of the well)
        1
    )
 
Goal Column CC =
VAR currentwell = 'Table'[Well]
VAR currentdatetime = 'Table'[date]
RETURN
    IF (
        // rows where rsup is 0 are forced to 0
        'Table'[rsup] = 0,
        0,
        // otherwise, running total of the [step one CC] flags for this well up to the current row
        SUMX (
            FILTER (
                'Table',
                'Table'[Well] = currentwell
                    && 'Table'[date] <= currentdatetime
            ),
            'Table'[step one CC]
        )
    )
 
 

Hi, My name is Jihwan Kim.


If this post helps, then please consider accepting it as the solution to help other members find it faster, and give a big thumbs up.


Linkedin: linkedin.com/in/jihwankim1975/

Twitter: twitter.com/Jihwan_JHKIM


Anonymous
Not applicable

Thank you for the quick response and support. However, after running the first formula, I'm receiving the following error:

"There's not enough memory to complete this operation. Please try again later when there may be more memory available."

 

My dataset currently has ~25 million rows. Is this something I should do in the query editor? Perhaps with Python or M? 

 

Thanks!

Hi @Anonymous 

 

You could consider reducing the data model size by removing unnecessary rows/columns in Power Query Editor. Here are some tips on reducing data model size for your reference:

3 ways to REDUCE DATA In Power BI Desktop (No Premium needed!) - YouTube

Top 5 tips to reduce your Power BI data model size — Apex Insights: Power BI tips & tricks

Data reduction techniques for Import modeling - Power BI | Microsoft Docs

 

Or you can use a subset of data to test the formulas when you create a report.
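
On the query editor / Python question: if pre-computing the column upstream works for you, a rough pandas sketch of the same logic (for example in a Run Python script step in Power Query, or outside Power BI entirely) might look like the lines below. The file name and column names simply mirror the sample data and are otherwise assumptions, so treat this as a sketch rather than a drop-in solution.

import pandas as pd

# Load the well data (file name is an assumption); expects Well, date and rsup columns
df = pd.read_csv("well_data.csv", parse_dates=["date"])

# Process each Well in ascending date order, matching the order the DAX columns rely on
df = df.sort_values(["Well", "date"])

# Flag rows where rsup comes back from 0 (or the first row of a Well) - same idea as [step one CC]
prev_rsup = df.groupby("Well")["rsup"].shift()
step_one = ((df["rsup"] != 0) & ((prev_rsup == 0) | prev_rsup.isna())).astype(int)

# Running count of those flags per Well, forced back to 0 wherever rsup is 0 - same idea as [Goal Column CC]
df["Goal Column"] = step_one.groupby(df["Well"]).cumsum().where(df["rsup"] != 0, 0)

Computing the column once in the query (or in the source) is usually much lighter on memory at ~25 million rows than a calculated column that runs a FILTER over the whole table for every row.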

 

Regards,
Community Support Team _ Jing
