Cypher294
Frequent Visitor

Average processing time per ticket state / Measure required

Hello everybody,

 

I've got a problem creating a report that shows the average processing time for several tickets, based on a processing table which includes each state for each ticket and the processing time needed in each state (each incident is mentioned several times, one time for each state; the Ticket ID is unique).

 

The sample data is linked to this post.

 

In this example there are 9 unique incidents within August 2018. I've already created a measure that calculates the average processing time for each team within this month, based on the number of unique tickets within this month (9 unique tickets across 14 rows). In this case it is 56.26 hours (the total of 506 hours divided by 9, not by 14).
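
For reference, a minimal sketch of such a measure (the name is only illustrative; it assumes the columns Sample_Data[Time in Hours] and Sample_Data[Incident ID] from the sample file):

Avg Processing Time =
// Total hours in the current filter context, divided by the number of
// distinct incidents (here 9 incidents across 14 rows).
DIVIDE (
    SUM ( Sample_Data[Time in Hours] ),
    DISTINCTCOUNT ( Sample_Data[Incident ID] )
)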

 

[Screenshot: pic_1.JPG]

The issue I have is that I now want to calculate the average processing time per state for each UNIQUE incident.

The expectation is that, for example, the 66 hours for the state "waiting" will be divided by the 9 unique incidents for August (7.33 hours).

Of course the full dataset includes several months, so the report needs logic that divides by the number of unique incidents of each month. The result should be that the average of 56 hours is spread across the states, with the states adding up to a total of 56 hours. (Currently the sum is > 150 hours because the single states are summed up individually.)

 

[Screenshot: pic_2.JPG]

The PBIX can be found here:

 

Sample Data

 

Thanks for your help.

8 REPLIES
v-frfei-msft
Community Support

Hi @Cypher294,

 

Please use this measure.

 

Measure =
CALCULATE ( SUM ( Sample_Data[Time in Hours] ) )
    / CALCULATE (
        DISTINCTCOUNT ( Sample_Data[Incident ID] ),
        ALLSELECTED ( Sample_Data )
    )
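
(A note on how this should work: ALLSELECTED ( Sample_Data ) removes the filters that come from the rows of the visual itself, such as the Ticket State of each bar, while keeping your own selections, so every state should be divided by the same distinct incident count of the current selection.)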

[Screenshot: Capture.PNG]

For more details, please check the attached pbix.

 

Regards,

Frank

Community Support Team _ Frank
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Hi @Cypher294,

 

Please use the measure below.

 

Measure =
CALCULATE ( SUM ( Sample_Data[Time in Hours] ) )
    / CALCULATE (
        DISTINCTCOUNT ( 'Incidents 2'[Incident ID] ),
        ALLEXCEPT ( 'Date', 'Date'[Date].[MonthNo] )
    )
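
(The idea, as far as I can tell: ALLEXCEPT ( 'Date', 'Date'[Date].[MonthNo] ) clears the other filters on the Date table but keeps the month, so each month's hours are divided by that month's own distinct incident count.)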

[Screenshot: 3.PNG]

Regards,

Frank

Community Support Team _ Frank
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Hi @v-frfei-msft,

 

It looks good so far ... but I have one more issue which currently causes some problems.

 

Normally not every ticket ID that is recorded in "Incidents 2" is also recorded in "Sample_Data", which causes some problems.

The measure divides by all the unique IDs recorded in the "Incidents 2" table, not just by the ones registered in BOTH tables.
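
I guess the distinct count would have to run on Sample_Data[Incident ID] instead of 'Incidents 2'[Incident ID], so that only the IDs present in both tables are counted.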

 

If I adjust the measure (page 2 --> Measure 2), it counts correctly for the "Group" separation but not for "State".

 

I've uploaded adjusted sample data. Hopefully this is the last problem to be solved ...

 

Sample Data 4

Hi @Cypher294,

 

Please create another measure to work on the State:

 

STATE =
CALCULATE ( SUM ( Sample_Data[Time in Hours] ) )
    / CALCULATE (
        DISTINCTCOUNT ( Sample_Data[Incident ID] ),
        ALLEXCEPT ( Sample_Data, Sample_Data[Ticket State] )
    )
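
(A note on the intent, as I read it: ALLEXCEPT ( Sample_Data, Sample_Data[Ticket State] ) should clear every filter on Sample_Data except the state, so the denominator becomes the distinct count of incidents that passed through each state.)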

[Screenshot: 4.PNG]

 

Regards,

Frank

Community Support Team _ Frank
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Hi @v-frfei-msft,

 

sorry, it doesn't work yet ... For example, the total hours for October are 1198.96. Divided by 17 unique incidents for October, the average should be ~ 70 hours. These 70 hours should be separated by team and by state within the two graphs, but in both cases the total should stay 70. If you select, for example, Team 1 (which has less than 70 hours), its value should also be separated by the states, but the total should remain the same. Hopefully you understand what I mean.

 

Currently, in the last sample data, the average in both graphs is not ~ 70 hours, and it additionally differs between the two perspectives (sum by Group: 135 hours, by State: 110 hours).

 

Maybe there is a way to fix it.

 

Thanks

 

Dennis

 

Hi @v-frfei-msft,

 

Thanks for your fast support. I've tried it out, but there is one more issue: the calculation only works for a single month. If there is data for additional months, the average seems to be calculated across all IDs, not just the ones within the respective month.

 

I've attached additional sample data with more rows to show the issue. If you select an explicit month, it counts correctly, but not if you don't select one.

 

Sample Data 2

 

Thanks Dennis

Hi @Cypher294,

 

Please update the measure as below.

 

Measure =
CALCULATE ( SUM ( Sample_Data[Time in Hours] ) )
    / CALCULATE (
        DISTINCTCOUNT ( Sample_Data[Incident ID] ),
        ALLEXCEPT ( Sample_Data, Sample_Data[Date].[MonthNo] )
    )
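
(As far as I can tell, ALLEXCEPT ( Sample_Data, Sample_Data[Date].[MonthNo] ) clears the other filters on Sample_Data but keeps the month from the auto date/time hierarchy, so each month should be divided by its own distinct incident count even when no single month is selected.)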

[Screenshot: 2.PNG]

 

Regards,

Frank

Community Support Team _ Frank
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Hi @v-frfei-msft,

 

I thought I could adapt the sample data to my real database table environment with all relations etc., but if I use the statement within this environment, the calculation seems to be faulty.

 

I've attached a new pbix and separated and related the tables like they are in the real environment.

 

The calculation with the separation by team seems to be correct on a monthly basis, but if I use the same measure to separate by state, it seems not to be. The expectation is that the 77 + 58 hours (sum by team = 135) will be distributed across the states, but currently the sum is bigger than 135.

 

Maybe you could help again.

 

Thanks for your great support, it helps me a lot.

 

Sample 3
