Hi,
I'm trying to create a report that forecasts when units will likely need to be replaced. For this I have data about each unit: its age (in days) and an estimate of how old it will become. I have also created a date table with future dates.
The forecast is based on a cumulative normal distribution.
Currently I have managed to make a cumulative sum over all units over the years, and I have tried using DATEADD ( datetable, -1, DAY ) as a filter in a CALCULATE statement to get the value for the previous period. This is likely not optimal, as it iterates over all dates; something that looks within the current scope for the previous period should be better.
The current form of the measure works like [ Norm.dist(period.N) - Norm.dist(period.N-1) ]. Ideally it should be [ Norm.dist(period.N) - Norm.dist(period.N-1) ] / [ 1 - Norm.dist(today()) ] on a per-unit basis before it is aggregated, as shown in the lower part of the linked Excel file. This change is important for a proper forecast when a unit's age is close to its expected age.
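For illustration, the intended per-unit formula can be sketched in Python (the function names are mine, and the 200-day standard deviation is just a placeholder parameter):

```python
from math import erf, sqrt

def norm_cdf(x, mean, sd):
    """Cumulative normal distribution, equivalent to DAX NORM.DIST(x, mean, sd, TRUE)."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

def period_failure_share(age_prev, age_now, age_today, expected_eol, sd=200):
    """[ Norm.dist(period.N) - Norm.dist(period.N-1) ] / [ 1 - Norm.dist(today()) ]
    for a single unit: the probability it fails within the period,
    given that it has survived to today."""
    unconditional = norm_cdf(age_now, expected_eol, sd) - norm_cdf(age_prev, expected_eol, sd)
    survives_today = 1.0 - norm_cdf(age_today, expected_eol, sd)
    return unconditional / survives_today
```

Dividing by the survival probability rescales the distribution so that the shares over all future periods sum to 1 for each unit, which is why the correction matters most for units already close to their expected age.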
Link to files:
Hi,
I believe I solved it myself.
I figured I could use MIN ( 'Date'[Date] ) - 1 to always find the last date just outside the current scope. I also no longer think I need a nested SUMX over dates, as that is effectively included in the SUMX over 'data' and the filter context it is evaluated in.
I now have something like this:
Delta units out =
SUMX (
'data',
DIVIDE (
NORM.DIST (
INT ( MAX ( 'Date'[Date] ) - data[Start] ),
[Expected EOL meas.],
200,
TRUE
)
- IF (
MIN ( 'Date'[Date] ) - 1
< TODAY (),
NORM.DIST ( INT ( TODAY () - data[Start] ), [Expected EOL meas.], 200, TRUE ),
NORM.DIST (
INT ( ( MIN ( 'Date'[Date] ) - 1 ) - data[Start] ),
[Expected EOL meas.],
200,
TRUE
)
),
(
1
- NORM.DIST ( INT ( TODAY () - data[Start] ), [Expected EOL meas.], 200, TRUE )
)
)
)
In addition, I should probably add a filter for when MAX ( 'Date'[Date] ) = TODAY () so we omit that zero point where the time delta is 0.
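The same logic can be sketched outside DAX for readability (Python; the row structure and function names are my assumptions, the 200-day standard deviation is taken from the measure above):

```python
from datetime import date, timedelta
from math import erf, sqrt

def norm_cdf(x, mean, sd):
    """Cumulative normal distribution, like DAX NORM.DIST(x, mean, sd, TRUE)."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

def delta_units_out(units, period_start, period_end, today, sd=200):
    """Mirror of the SUMX: units is a list of (start_date, expected_eol_in_days) rows."""
    total = 0.0
    prev_day = period_start - timedelta(days=1)   # MIN ( 'Date'[Date] ) - 1
    for start, expected_eol in units:
        end_age = (period_end - start).days       # MAX ( 'Date'[Date] ) - data[Start]
        today_age = (today - start).days
        # If the day just before the period lies in the past, anchor the
        # lower bound at today instead (the IF branch in the measure)
        prev_age = today_age if prev_day < today else (prev_day - start).days
        delta = norm_cdf(end_age, expected_eol, sd) - norm_cdf(prev_age, expected_eol, sd)
        total += delta / (1.0 - norm_cdf(today_age, expected_eol, sd))
    return total
```

Because the lower bound of each period is the day just outside the current scope, summing the measure over consecutive future periods telescopes to the value over the whole range, which is the behavior the MIN ( 'Date'[Date] ) - 1 trick is meant to guarantee.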
Hi @Hei_aQing
It's about getting the full formula to work together properly. Just producing an answer to what you quoted should be doable, but I'm having trouble getting it to iterate correctly through the data.