I have received a request for a calculation and need some help creating it.
One of our leadership team has posed the question "On average, how many days does a new rep need to reach a cumulative sales total of $1 million?"
Any suggestions on how to build and filter the necessary calculation based on Total Sales?
@Sean, @KHorseman or @MattAllington could probably do this in their sleep.
Proud to be a Super User!
@kcantor I dunno about in my sleep, but I think I've found something that works. It ain't pretty.
Given a simple table of daily sales (three columns: SalesRep, Amount, Date) and a standard date table, I believe this measure will do it:
DaysToMillion =
AVERAGEX(
    SUMMARIZE(
        Sales,
        Sales[SalesRep],
        "Days",
        VAR first = CALCULATE( MIN( Sales[Date] ) )
        VAR million =
            CALCULATE(
                MIN( Sales[Date] ),
                FILTER(
                    ADDCOLUMNS(
                        Sales,
                        "Cumulative",
                        CALCULATE(
                            SUM( Sales[Amount] ),
                            FILTER(
                                ALLEXCEPT( Sales, Sales[SalesRep] ),
                                Sales[Date] < EARLIER( Sales[Date] )
                            )
                        )
                    ),
                    [Cumulative] >= 1000000
                )
            )
        RETURN
            CALCULATE(
                COUNTROWS( DATESBETWEEN( DateTable[Date], first, million ) )
            )
    ),
    [Days]
)
I only tested this on a sample table of 1000 rows with 6 sales reps. I have no idea how well this scales with larger datasets. If you have a lot of sales reps and a lot of sales, you may find better performance if you trim this down and use parts of it to create an actual summary table and then get an average out of that. If your sales table has a lot of columns you should use a SUMMARIZE inside the ADDCOLUMNS so you just grab the three columns you actually need.
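Not DAX, but the per-rep logic the measure implements can be sketched in plain Python to make it easier to follow. The data and names below are made up for illustration; note this sketch counts the crossing day itself in the running total, which may differ by one day from the `<` in the EARLIER filter above.

```python
from datetime import date

# Hypothetical sample: (rep, date, amount) rows, like the three-column Sales table.
sales = [
    ("Ann", date(2016, 1, 1), 400_000),
    ("Ann", date(2016, 1, 2), 700_000),   # Ann crosses $1M on her 2nd day
    ("Bob", date(2016, 1, 1), 250_000),
    ("Bob", date(2016, 1, 3), 250_000),
    ("Bob", date(2016, 1, 4), 600_000),   # Bob crosses $1M on his 4th day
]

def days_to_million(rows, threshold=1_000_000):
    """Average calendar days each rep takes to reach a cumulative total."""
    reps = {}
    for rep, d, amount in rows:
        reps.setdefault(rep, []).append((d, amount))
    days = []
    for rep, entries in reps.items():
        entries.sort()                     # order by date, like the date table
        first = entries[0][0]              # VAR first: rep's earliest sale date
        running = 0
        for d, amount in entries:
            running += amount
            if running >= threshold:       # VAR million: first date at/over $1M
                # inclusive day count, like DATESBETWEEN(first, million)
                days.append((d - first).days + 1)
                break
    return sum(days) / len(days)           # AVERAGEX over the per-rep [Days]

print(days_to_million(sales))  # Ann: 2 days, Bob: 4 days -> 3.0
```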
It basically ignores date filters in aggregate (i.e. this is lifetime days to $1m, not annual days to $1m). If you want an annual restart you'd have to change the part where it uses ALLEXCEPT. Something something DATESYTD mumble mumble...? I don't know, I haven't figured that out yet.
Here, have some example files.
Here's the sample sales data.
Here's the pbix. It has two methods: the measure above and an alternative summary table and average measure.
Proud to be a Super User!
@KHorseman wrote: "trim this down and use parts of it to create an actual summary table and then get an average out of that."
I added an extra row to the data set provided - @KHorseman with sales 1 short of a million
The results...
@KHorseman Great job!
@KHorseman You solved that beautifully . . . twice!
I was stuck at the middle step, where you added ALLEXCEPT, which solved my problem. I did add a separate filter to only count working days, as our teams do not work on weekends. Other than that it is perfect out of the box.
I used both the measure and the calculated table. Both perform well with 2.5 million rows and 18 reps.
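The working-days tweak mentioned above amounts to counting only weekdays between the first sale and the crossing date. A rough Python stand-in for that filter (the function name and dates are illustrative, not from the thread):

```python
from datetime import date, timedelta

def working_days_between(start, end):
    """Count weekdays (Mon-Fri) from start to end inclusive -- a rough
    stand-in for filtering the date table down to working days."""
    days = 0
    d = start
    while d <= end:
        if d.weekday() < 5:   # 0=Mon .. 4=Fri; skip Sat/Sun
            days += 1
        d += timedelta(days=1)
    return days

# Jan 1, 2016 was a Friday; Jan 1-10 spans two weekends.
print(working_days_between(date(2016, 1, 1), date(2016, 1, 10)))  # 6
```

In the DAX measure itself, the equivalent change would filter the date table inside the final COUNTROWS, e.g. on an "is working day" column if your date table has one.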
Do you plan to attend the Data Insight in June? Perhaps just being in the room with you and folks like @Sean will allow my DAX to improve through osmosis.
That's great. I was worried because this formula looks like it has at least three different performance killers combined, so I'm glad to hear it works with that much data.
I want to go to the summit. I was there last year. I need to pester my boss about approving it again. I'd love to get to meet and chat with you and a bunch of the other regulars on here.
Do those sales earn monthly interest?
If not, why can't you just divide 1,000,000 by the average daily sales per sales rep?
I don't know if you're looking for the equivalent of the Excel NPER function, but if there's no interest, there's no need for it?
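The division suggested here is a quick approximation rather than the measure above: it ignores the actual order and spacing of sales. A minimal sketch (the function name and sample figures are hypothetical):

```python
def naive_days_to_million(daily_amounts, threshold=1_000_000):
    """Estimate days to reach the target by dividing it by a rep's
    average daily sales -- ignores the actual sequence of sales."""
    avg = sum(daily_amounts) / len(daily_amounts)
    return threshold / avg

# A rep averaging $250k/day would be estimated at 4 days.
print(naive_days_to_million([200_000, 300_000]))  # 4.0
```

This gives a rate-based estimate; the cumulative measure instead finds the actual first date each rep crossed $1M, which can differ when sales are uneven or have gaps.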