1. The 80/20 Rule for Desktop KPI’s:
Less is More!
Jeff Rumburg • MetricNet
2. More than 1,100 Desktop Support Benchmarks
Global Database
30+ Key Performance Indicators
Nearly 60 Industry Best Practices
IT Support Benchmarking Database
7. But Is There an Even More Fundamental Building Block?
METRICS
PEOPLE
PROCESS
TECH
© 2014 MetricNet, LLC, www.metricnet.com
8. Six-Part Model for Desktop Support Best Practices
Model components, and what each covers, all driving toward Customer Enthusiasm:
Strategy: Defining Your Charter and Mission
Human Resources: Proactive, Life-cycle Management of Personnel
Process: Expeditious Delivery of Customer Service
Technology: Leveraging People and Processes
Performance Measurement: A Holistic Approach to Performance Measurement
Stakeholder Communication: Proactively Managing Stakeholder Expectations
9. Best Practices in Performance Measurement
1. Desktop Support tracks the Mean Time to Resolve (MTTR) and the percentage of tickets resolved within 24 hours.
2. Technician Satisfaction is measured, recorded, and tracked.
3. Technician Utilization is measured, recorded, and tracked on an ongoing basis.
4. Desktop Support understands key correlations and cause/effect relationships between the various KPI's. This enables Desktop Support to achieve desired performance goals by leveraging and driving the underlying "causal" metrics.
5. Desktop Support conducts benchmarking at least once per year.
6. Desktop Support KPI's are used to establish "stretch" goals.
7. Cost per Ticket is measured, recorded, and tracked on an ongoing basis.
8. Customer Satisfaction is measured, recorded, and tracked on an ongoing basis.
9. Desktop Support measures are used holistically and diagnostically to identify performance gaps, and to prescribe actions that will improve performance.
10. Desktop Support maintains a balanced scorecard that provides a single, all-inclusive measure of Desktop Support performance.
11. First Contact Resolution Rate for Incidents is measured, recorded, and tracked on an ongoing basis.
12. First Level Resolution is measured, recorded, and tracked on an ongoing basis.
13. Desktop Support tracks the number of tickets resolved by Desktop Support that could have been resolved at level 1.
14. Desktop Support conducts event-driven customer surveys, whereby the results of customer satisfaction surveys can be linked back to a specific ticket and to the specific technician handling the contact.
10. Ranking Explanation
1 No Knowledge of the Best Practice.
2 Aware of the Best Practice, but not applying it.
3 Aware of the Best Practice, and applying at a rudimentary level.
4 Best Practice is being effectively applied.
5 Best Practice is being applied in a world-class fashion.
Best Practices Evaluation Criteria
12. 10 Mega Trends and the Holistic Use of KPI’s
Holistic use of KPI’s and Benchmarking
Process Optimization (ITIL, ITSM)
User Self-Help
Desktop Virtualization
Marketing the Service Desk
Understanding of TCO and First Level Resolution
Process Optimization (ITIL, ITSM)
Knowledge Management and Remote Diagnosis
The Widespread Adoption of ROI Analysis
Service and Support as a Business
13. The Premise Behind Desktop Support KPI’s
We’ve all heard the expression…
“If you’re not measuring it,
you’re not managing it!”
But there’s more to the story…Lots more!
14. Two Paradigms for Desktop Support KPI’s
The Historical Approach: Measurement (75%), Analysis (15%), Prescription (7.5%), Action (2.5%)
The Holistic Approach: Measurement (5%), Analysis (20%), Prescription (30%), Action (45%)
Value increases as effort shifts from measurement toward action.
15. An Industry MegaTrend: The Holistic Use of KPI’s
A four-step cycle, centered on Customer Enthusiasm:
1. Measure: Measure help desk performance on an ongoing basis
2. Diagnose: Benchmark performance and conduct a gap analysis
3. Prescribe: Define actions to close the gap
4. Implement: Implement your action plan and improve performance
16. The Most Common Desktop Support KPI’s
Cost: Cost per Ticket; Cost per Incident; Cost per Service Request
Productivity: Technician Utilization; Tickets per Technician-Month; Incidents per Technician-Month; Service Requests per Technician-Month; Ratio of Technicians to Total Headcount
Service Level: Average Incident Response Time (min); % of Incidents Resolved in 24 Hours; Mean Time to Resolve Incidents (hours); Mean Time to Complete Service Requests (days)
Quality: Customer Satisfaction; First Contact Resolution Rate (Incidents); % Resolved Level 1 Capable; % of Tickets Re-opened
Technician: Technician Satisfaction; New Technician Training Hours; Annual Technician Training Hours; Annual Technician Turnover; Technician Absenteeism; Technician Tenure (months); Technician Schedule Adherence
Ticket Handling: Average Incident Work Time (min); Average Service Request Work Time (min); Average Travel Time per Ticket (min)
Workload: Tickets per Seat per Month; Incidents per Seat per Month; Service Requests per Seat per Month; Incidents as a % of Total Ticket Volume
And there are hundreds more!!
17. Controllable vs. Non-Controllable KPI’s
Cost and Quality are the Macro Measures
The Macro Measures tell the story of your performance, and are good for communicating the performance of Desktop Support
But you cannot control them directly
Workload metrics are driven by Causal Factors
The Causal Factors define the volume and mix of work performed by Desktop Support
Desktop Support has very little control over the Causal Factors; they are a function of your IT environment
Productivity, Service Level, Technician, and Ticket Handling metrics are the underlying drivers of performance
You can control these metrics directly
It is through these metrics that you can influence the Macro Measures, and improve your performance
18. Causal Factors: The Workload Drivers
Causal Factors include:
Device count and mix: mix of desktop vs. laptop computers, number of mobile devices, average age of devices
Standardization of the desktop environment
User population density: high-rise vs. campus vs. field
User work location: office vs. home vs. field
And numerous other factors…
19. There Has Got to Be a Better Way!
Cost: Cost per Ticket; Cost per Incident; Cost per Service Request
Productivity: Technician Utilization; Tickets per Technician-Month; Incidents per Technician-Month; Service Requests per Technician-Month; Ratio of Technicians to Total Headcount
Service Level: Average Incident Response Time (min); % of Incidents Resolved in 24 Hours; Mean Time to Resolve Incidents (hours); Mean Time to Complete Service Requests (days)
Quality: Customer Satisfaction; First Contact Resolution Rate (Incidents); % Resolved Level 1 Capable; % of Tickets Re-opened
Technician: Technician Satisfaction; New Technician Training Hours; Annual Technician Training Hours; Annual Technician Turnover; Technician Absenteeism; Technician Tenure (months); Technician Schedule Adherence
Ticket Handling: Average Incident Work Time (min); Average Service Request Work Time (min); Average Travel Time per Ticket (min)
Workload: Tickets per Seat per Month; Incidents per Seat per Month; Service Requests per Seat per Month; Incidents as a % of Total Ticket Volume
And there are hundreds more!!
20. The 80/20 Rule for Desktop Support KPI’s
Cost: Cost per Ticket
Productivity: Technician Utilization
Quality: Customer Satisfaction
Call Handling: First Contact Resolution Rate (Incidents)
Technician: Technician Satisfaction
Aggregate: Balanced Scorecard
TCO: % Resolved Level 1 Capable
Service Level: Mean Time to Resolve
21. Desktop KPI’s: Which Ones Really Matter?
Cost: Cost per Ticket
Productivity: Technician Utilization
Quality: Customer Satisfaction
Call Handling: First Contact Resolution Rate (Incidents)
Technician: Technician Satisfaction
Aggregate: Balanced Scorecard
TCO: % Resolved Level 1 Capable
Service Level: Mean Time to Resolve
22. Aggregate Metrics: The Balanced Scorecard
Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded
Step 5: Your score for each metric is calculated: (worst case – actual performance) / (worst case – best case) X 100
Step 6: Your balanced score for each metric is calculated: metric score X weighting

Performance Metric | Metric Weighting | Worst Case | Best Case | Your Actual Performance | Metric Score | Balanced Score
Cost per Incident | 15.0% | $312.00 | $19.00 | $48.00 | 90.1% | 13.5%
Cost per Service Request | 15.0% | $556.00 | $41.00 | $113.00 | 86.0% | 12.9%
Customer Satisfaction | 25.0% | 67.0% | 94.0% | 83.0% | 59.3% | 14.8%
Technician Utilization | 15.0% | 36.0% | 84.0% | 59.0% | 47.9% | 7.2%
First Contact Resolution Rate (incidents) | 15.0% | 38.0% | 84.0% | 61.0% | 50.0% | 7.5%
% of Incidents Resolved in 24 Hours | 5.0% | 19.0% | 71.0% | 58.0% | 75.0% | 3.8%
Mean Time to Complete Service Requests (days) | 5.0% | 18.4 | 1.8 | 5.8 | 75.9% | 3.8%
Technician Satisfaction | 5.0% | 59.0% | 93.0% | 84.0% | 73.5% | 3.7%
Total | 100.0% | N/A | N/A | N/A | N/A | 67.1%
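The six steps translate directly into code. A minimal Python sketch using the slide's weights and benchmark figures; note that the Step 5 formula handles both "lower is better" metrics (costs, completion times) and "higher is better" metrics (satisfaction, utilization) without a direction flag, because the worst/best values carry the direction:

```python
def metric_score(worst, best, actual):
    """Step 5: (worst - actual) / (worst - best), as a 0-1 fraction.
    For 'higher is better' metrics the worst case is the smaller
    number, which flips both numerator and denominator signs."""
    return (worst - actual) / (worst - best)

# (metric, weight, worst case, best case, your actual performance)
scorecard = [
    ("Cost per Incident",                         0.15, 312.0, 19.0, 48.0),
    ("Cost per Service Request",                  0.15, 556.0, 41.0, 113.0),
    ("Customer Satisfaction",                     0.25, 0.67,  0.94, 0.83),
    ("Technician Utilization",                    0.15, 0.36,  0.84, 0.59),
    ("First Contact Resolution Rate (incidents)", 0.15, 0.38,  0.84, 0.61),
    ("% of Incidents Resolved in 24 Hours",       0.05, 0.19,  0.71, 0.58),
    ("Mean Time to Complete Service Requests",    0.05, 18.4,  1.8,  5.8),
    ("Technician Satisfaction",                   0.05, 0.59,  0.93, 0.84),
]

# Step 6: weight each metric score, then sum for the balanced score
total = sum(weight * metric_score(worst, best, actual)
            for _, weight, worst, best, actual in scorecard)
# total is ~0.671, the 67.1% balanced score shown in the table
```

Recomputing the table row by row like this is a quick sanity check that a scorecard spreadsheet's worst/best values point the right way.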
23. Desktop Support Scorecard Trend
[Line chart: monthly Desktop Support Balanced Score, Jan through Dec, with the 12-month average, plotted on a 40%-85% scale.]
24. Desktop Support Balanced Scorecard Benchmark
[Bar chart: Desktop Support Balanced Scores across the benchmark, 0%-100%.]
Key Statistics:
High: 86.2%
Average: 51.9%
Median: 50.1%
Low: 16.0%
Your Score: 67.1%
27. Characteristics of World-Class Desktop Support
Desktop Support consistently exceeds customer expectations
Result is high levels of customer satisfaction (> 93%)
MTTR is below average for Incidents and Service Requests
< 0.7 days for Incidents
< 3.8 days for Service Requests
Costs are managed at or below industry average levels
Cost per Ticket, per Incident, and per Service Request is below average
Minimizes Total Cost of Ownership (TCO)
Desktop Support follows industry best practices
Industry best practices are defined and documented
Desktop Support follows industry best practices
Every transaction adds value
A positive customer experience
Drives a positive view of IT overall
28. Tickets, Incidents, and Service Requests
Incident Volume + Service Request Volume = Ticket Volume
Incidents: unplanned work that requires a physical touch to a device (hardware break/fix, device failure, connectivity failure)
Service Requests: planned work that requires a physical touch to one or more devices (Move/Add/Change, hardware or software upgrade, device refresh, device set-up)
29. Benchmarking Data Summary

Metric Type | Deskside Support KPI's | Your Performance | Peer Average | Peer Min | Peer Max
Cost | Cost per Ticket | $81 | $68 | $55 | $117
Cost | Cost per Incident | $57 | $45 | $34 | $85
Cost | Cost per Service Request | $144 | $128 | $107 | $196
Quality | Customer Satisfaction | 71% | 81% | 71% | 89%
Quality | First Contact Resolution Rate (Incidents) | 49% | 55% | 44% | 82%
Quality | % Resolved Level 1 Capable | 9% | 17% | 4% | 23%
Quality | % of Tickets Re-opened | 2.0% | 6.0% | 1.0% | 8.0%
Productivity | Technician Utilization | 46% | 54% | 36% | 84%
Productivity | Tickets per Technician per Month | 65 | 81 | 58 | 93
Productivity | Incidents per Technician per Month | 81 | 100 | 72 | 131
Productivity | Service Requests per Technician per Month | 42 | 53 | 35 | 71
Productivity | Ratio of Technicians to Total Headcount | 73% | 84% | 69% | 88%
Service Level | Average Incident Response Time (hours) | 9.1 | 6.3 | 4.2 | 12.1
Service Level | % of Incidents Resolved in 24 Hours | 33% | 51% | 33% | 62%
Service Level | Mean Time to Resolve Incidents (days) | 2.6 | 1.9 | 0.7 | 4.3
Service Level | Mean Time to Complete Service Requests (days) | 12.7 | 7.4 | 4.6 | 12.7
Technician | Technician Satisfaction | 71% | 84% | 66% | 87%
Technician | New Technician Training Hours | 10 | 46 | 10 | 80
Technician | Annual Technician Training Hours | 0 | 21 | 0 | 28
Technician | Annual Technician Turnover | 53% | 24% | 8% | 53%
Technician | Technician Absenteeism | 19% | 14% | 9% | 22%
Technician | Technician Tenure (months) | 11 | 47 | 11 | 64
Technician | Technician Schedule Adherence | 64% | 87% | 64% | 91%
Ticket Handling | Average Incident Work Time (min) | 65 | 49 | 38 | 77
Ticket Handling | Average Service Request Work Time (min) | 119 | 94 | 80 | 110
Ticket Handling | Average Travel Time per Ticket (min) | 18 | 23 | 12 | 49
Workload | Tickets per Seat per Month | 0.79 | 0.90 | 0.70 | 1.68
Workload | Incidents per Seat per Month | 0.57 | 0.65 | 0.5 | 1.2
Workload | Service Requests per Seat per Month | 0.22 | 0.25 | 0.2 | 0.48
Workload | Incidents as a % of Total Ticket Volume | 72% | 72% | 71% | 71%
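Reading a table like this is a diagnostic exercise: flag every KPI where your performance trails the peer average, remembering that the direction of "better" differs by metric. A hypothetical sketch (the helper name and the choice of rows are illustrative, using four rows from the summary above):

```python
# Each row: (KPI, your performance, peer average, higher_is_better).
# The higher_is_better flag matters because cost metrics improve
# downward while satisfaction and utilization metrics improve upward.
kpi_rows = [
    ("Cost per Ticket",            81.0, 68.0, False),
    ("Customer Satisfaction",      0.71, 0.81, True),
    ("Technician Utilization",     0.46, 0.54, True),
    ("Technician Tenure (months)", 11,   47,   True),
]

def performance_gaps(rows):
    """Return the KPIs where 'your' performance trails the peer average."""
    return [kpi for kpi, yours, peer_avg, higher_is_better in rows
            if (yours < peer_avg) == higher_is_better]

gaps = performance_gaps(kpi_rows)
# every example row trails its peer average, so all four KPIs are flagged
```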
30. Key Metrics Magnified
KPI Company Peer Group Average
Cost per Ticket $81 $68
Customer Satisfaction 71% 81%
31. The Two Foundation Metrics: Cost and Customer Satisfaction
Cost per Ticket Customer Satisfaction
32. Cost and Quality: Nothing Else Matters!
[Quadrant chart: Customer Satisfaction (Effectiveness) plotted against Cost per Ticket (Efficiency), with World-Class Desktop Support shown against the Peer Group.]
Top quartile: Efficient and Effective
Middle quartiles: Effective but not Efficient, or Efficient but not Effective
Lower quartile: neither efficient nor effective
33. Technician Utilization: The Key Driver of Cost
[Causal diagram: Technician Utilization drives Cost per Ticket; Customer Satisfaction stands alongside as the other foundation metric.]
35. Desktop Support Technician Utilization Defined

Technician Utilization =
[(Average number of Incidents handled by a technician in a month) X (Average Incident Work Time) +
 (Average number of Service Requests handled by a technician in a month) X (Average Service Request Work Time) +
 (Average number of Tickets handled by a technician in a month) X (Average Travel Time per Ticket)]
÷ [(Average number of days worked in a month) X (Number of work hours in a day) X (60 minutes/hr)]

Technician Utilization is a measure of technician work and travel time, divided by total time at work during the month.
It takes into account both incidents and service requests handled by the technicians.
But it does not make adjustments for sick days, holidays, training time, project time, or idle time.
36. Example: Desktop Support Technician Utilization
Incidents per Technician per Month = 60
Service Requests per Technician per Month = 24
Average Tickets per Technician per Month = 84
Average Incident Work Time = 32 minutes
Average Service Request Work Time = 59 minutes
Average Travel Time per Ticket = 41 minutes

Technician Utilization =
[(60 Incidents per Month) X (32 minutes) + (24 Service Requests per Month) X (59 minutes) + (84 Tickets per Month) X (41 minutes)]
÷ [(21.5 working days per month) X (7.5 work hours per day) X (60 minutes/hr)]
= 6,780 ÷ 9,675 ≈ 70%
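The worked example translates directly into code. A minimal Python sketch of the slide-35 formula; the function name is illustrative, and the default schedule values simply follow the example rather than any universal standard:

```python
def technician_utilization(incidents, incident_work_min,
                           service_requests, request_work_min,
                           tickets, travel_min,
                           days_per_month=21.5, hours_per_day=7.5):
    """Work plus travel minutes, divided by total scheduled minutes."""
    handled_minutes = (incidents * incident_work_min
                       + service_requests * request_work_min
                       + tickets * travel_min)
    available_minutes = days_per_month * hours_per_day * 60
    return handled_minutes / available_minutes

# The slide-36 inputs reproduce the ~70% result:
util = technician_utilization(60, 32, 24, 59, 84, 41)
```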
37. The Drivers of Customer Satisfaction
[Causal diagram: FCR (Incidents), Technician Satisfaction, and Service Levels (MTTR) drive Customer Satisfaction; Technician Utilization drives Cost per Ticket.]
40. Incident MTTR vs. Customer Satisfaction
[Scatter plot: Customer Satisfaction (0%-100%) against Incident MTTR (0.0-8.0 days); satisfaction declines as MTTR grows.]
41. The Drivers of Technician Satisfaction
[Causal diagram: Coaching, Career Path, and Training Hours drive Technician Satisfaction, which, with FCR (Incidents) and Service Levels (MTTR), drives Customer Satisfaction; Technician Utilization drives Cost per Ticket.]
42. New Technician Training Hours vs. Technician Satisfaction
[Scatter plot: Technician Satisfaction (20%-100%) against New Technician Training Hours (0-250).]
43. Annual Training Hours vs. Technician Job Satisfaction
[Scatter plot: Technician Satisfaction (40%-100%) against Annual Technician Training Hours (0-140).]
44. Technician Experience vs. Incident FCR
[Scatter plot: Incident FCR (20%-90%) against Technician Time on Job (0-70 months).]
45. Measuring Defects: % Resolved Level 1 Capable
[Causal diagram: % Resolved Level 1 Capable joins the KPI cause-and-effect map alongside Technician Utilization (driving Cost per Ticket), FCR (Incidents), Technician Satisfaction, and Service Levels: MTTR (driving Customer Satisfaction), with Coaching, Career Path, and Training Hours behind Technician Satisfaction.]
46. Two Metrics You Should Know
% Resolved Level 1 Capable (PRLC)
The percentage of tickets resolved by desktop support
that could have been resolved at level 1 support.
First Level Resolution Rate (FLR)
The number of tickets resolved at level 1 divided by all
tickets that can potentially be resolved at level 1.
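The two definitions differ only in their denominators, which is easy to confuse in practice: PRLC is a defect measure for desktop support, while FLR is an efficiency measure for the service desk. A sketch with hypothetical ticket counts (all names and numbers are illustrative, not from the deck):

```python
# Hypothetical monthly counts for one support organization
desktop_tickets = 500     # tickets resolved by desktop support
l1_capable_leaked = 45    # of those, tickets level 1 could have handled
resolved_at_l1 = 800      # tickets actually resolved at level 1
l1_resolvable = resolved_at_l1 + l1_capable_leaked  # all level-1-resolvable tickets

# % Resolved Level 1 Capable (PRLC): share of desktop support's
# tickets that could have been resolved at level 1
prlc = l1_capable_leaked / desktop_tickets

# First Level Resolution Rate (FLR): level-1 resolutions over all
# tickets that could potentially be resolved at level 1
flr = resolved_at_l1 / l1_resolvable
```

Note the denominators: PRLC divides by desktop support's ticket volume, while FLR divides by everything level 1 could have resolved, including the tickets that leaked downstream.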
47. Cost of Resolution: North American Averages
Support Level | Cost per Ticket
Level 1: Service Desk | $22
Level 2: Desktop Support | $62
Level 3 IT (apps, networking, NOC, etc.) | $85
Field Support | $196
Vendor | $471
48. A SPOC Service Desk is Highly Leveraged
[Diagram: the User Community contacts the Level 1 Service Desk, which coordinates Level 2 Desktop Support, Field Support, Level 3 IT Support, and Vendor Support.]
49. Key SPOC Principles
Enterprise takes an end-to-end view of user support
User/Customer has a single point of contact for all IT-related incidents, questions, problems, and work requests
The Level 1 Service Desk is the SPOC
Level 1 is responsible for:
Ticket triage
Resolution at Level 1 if possible
Effective handoffs to level n support
Resolution coordination and facilitation
Ticket closure
Desktop “drive-bys”, “fly-bys”, and “snags” are strongly discouraged
50. SPOC Support Reduces Total Cost of Ownership
[Bar chart: % Resolved Level 1 Capable, with vs. without a SPOC service desk.]
Average without SPOC: 22.8%
Average with SPOC: 15.3%
51. A Summary of the Major KPI Correlations
[Causal diagram, summarized:]
Cost per Ticket is driven by Technician Utilization
Technician Utilization reflects Work/Travel Time, the ratio of Techs to Total FTE’s, Absenteeism/Turnover, and Scheduling Efficiency
Customer Satisfaction is driven by FCR (Incidents), Technician Satisfaction, and Service Levels (MTTR)
Technician Satisfaction is driven by Coaching, Career Path, and Training Hours
% Resolved Level 1 Capable measures defects: tickets that should have been resolved at level 1
52. Benchmarking Case Study: The Diagnosis

Metric Type | Deskside Support KPI's | Your Performance | Peer Average | Peer Min | Peer Max
Cost | Cost per Ticket | $81 | $68 | $55 | $117
Cost | Cost per Incident | $57 | $45 | $34 | $85
Cost | Cost per Service Request | $144 | $128 | $107 | $196
Quality | Customer Satisfaction | 71% | 81% | 71% | 89%
Quality | First Contact Resolution Rate (Incidents) | 49% | 55% | 44% | 82%
Quality | % Resolved Level 1 Capable | 9% | 17% | 4% | 23%
Quality | % of Tickets Re-opened | 2.0% | 6.0% | 1.0% | 8.0%
Productivity | Technician Utilization | 46% | 54% | 36% | 84%
Productivity | Tickets per Technician per Month | 65 | 81 | 58 | 93
Productivity | Incidents per Technician per Month | 81 | 100 | 72 | 131
Productivity | Service Requests per Technician per Month | 42 | 53 | 35 | 71
Productivity | Ratio of Technicians to Total Headcount | 73% | 84% | 69% | 88%
Service Level | Average Incident Response Time (hours) | 9.1 | 6.3 | 4.2 | 12.1
Service Level | % of Incidents Resolved in 24 Hours | 33% | 51% | 33% | 62%
Service Level | Mean Time to Resolve Incidents (days) | 2.6 | 1.9 | 0.7 | 4.3
Service Level | Mean Time to Complete Service Requests (days) | 12.7 | 7.4 | 4.6 | 12.7
Technician | Technician Satisfaction | 71% | 84% | 66% | 87%
Technician | New Technician Training Hours | 10 | 46 | 10 | 80
Technician | Annual Technician Training Hours | 0 | 21 | 0 | 28
Technician | Annual Technician Turnover | 53% | 24% | 8% | 53%
Technician | Technician Absenteeism | 19% | 14% | 9% | 22%
Technician | Technician Tenure (months) | 11 | 47 | 11 | 64
Technician | Technician Schedule Adherence | 64% | 87% | 64% | 91%
Ticket Handling | Average Incident Work Time (min) | 65 | 49 | 38 | 77
Ticket Handling | Average Service Request Work Time (min) | 119 | 94 | 80 | 110
Ticket Handling | Average Travel Time per Ticket (min) | 18 | 23 | 12 | 49
Workload | Tickets per Seat per Month | 0.79 | 0.90 | 0.70 | 1.68
Workload | Incidents per Seat per Month | 0.57 | 0.65 | 0.5 | 1.2
Workload | Service Requests per Seat per Month | 0.22 | 0.25 | 0.2 | 0.48
Workload | Incidents as a % of Total Ticket Volume | 72% | 72% | 71% | 71%
53. Low Customer Satisfaction Diagnosed
KPI Company Peer Group Average
Customer Satisfaction 71% 81%
Incident FCR 49% 55%
New Tech Training Hrs 10 46
Annual Tech Training Hrs 0 21
54. High Costs Diagnosed
KPI Company Peer Group Average
Cost per Ticket $81 $68
Technician Utilization 46% 54%
% Resolved in 24 Hrs 33% 51%
Annual Turnover 53% 24%
55. A Summary of the Major KPI Correlations
[Causal diagram, summarized:]
Cost per Ticket is driven by Technician Utilization
Technician Utilization reflects Work/Travel Time, the ratio of Techs to Total FTE’s, Absenteeism/Turnover, and Scheduling Efficiency
Customer Satisfaction is driven by FCR (Incidents), Technician Satisfaction, and Service Levels (MTTR)
Technician Satisfaction is driven by Coaching, Career Path, and Training Hours
% Resolved Level 1 Capable measures defects: tickets that should have been resolved at level 1
57. The Paradox of IT Support
Less than 5% of all IT spending is allocated to end-user support (service desk, desktop support, field support)
This leads many to erroneously assume that there is little upside opportunity in IT support
The result is that most support organizations are managed with the goal of minimizing costs
But the most effective support strategies focus on maximizing value
Corporate IT Spending Breakdown:
End-User Support: 4%
Non-support functions: 96% (application development, application maintenance, network operations, mainframe and midrange computing, desktop computing, and contract services such as disaster recovery)
59. Because Desktop Support is a Major Driver of Customer Satisfaction
Factors Contributing to IT Customer Satisfaction (% saying very important):
Service Desk: 84%
Desktop Support: 47%
Network Outages: 31%
VPN: 29%
Training: 22%
Enterprise Applications: 19%
Desktop Software: 8%
n = 1,044; global large cap companies; survey type: multiple choice; 3 responses allowed per survey
84% cited the service desk as a very important factor in their overall satisfaction with corporate IT
47% cited desktop support as a very important factor in their overall satisfaction with corporate IT
60. Because Support Has an Opportunity to Minimize TCO
Support Level | Cost per Ticket
Level 1: Service Desk | $22
Level 2: Desktop Support | $62
Level 3 IT (apps, networking, NOC, etc.) | $85
Field Support | $196
Vendor | $471
61. Because Quality of Support Drives End-User Productivity
[Chart and table: productive hours lost per employee per year rise as support performance falls by quartile; n = 60.]

Support Function | Key Performance Indicator | Quartile 1 (top) | Quartile 2 | Quartile 3 | Quartile 4 (bottom)
Service Desk | Customer Satisfaction | 93.5% | 84.5% | 76.1% | 69.3%
Service Desk | First Contact Resolution Rate | 90.1% | 83.0% | 72.7% | 66.4%
Service Desk | Mean Time to Resolve (hours) | 0.8 | 1.2 | 3.6 | 5.0
Desktop Support | Customer Satisfaction | 94.4% | 89.2% | 79.0% | 71.7%
Desktop Support | First Contact Resolution Rate | 89.3% | 85.6% | 80.9% | 74.5%
Desktop Support | Mean Time to Resolve (hours) | 2.9 | 4.8 | 9.4 | 12.3
Average Productive Hours Lost per Employee per Year | | 17.1 | 25.9 | 37.4 | 46.9
62. The 80/20 Rule for Desktop Support KPI’s
Cost: Cost per Ticket
Productivity: Technician Utilization
Quality: Customer Satisfaction
Call Handling: First Contact Resolution Rate (Incidents)
Technician: Technician Satisfaction
Aggregate: Balanced Scorecard
TCO: % Resolved Level 1 Capable
Service Level: Mean Time to Resolve
63. Some Final Thoughts on Desktop Support KPI’s
When it comes to Desktop Support KPI’s, the 80/20 Rule applies
Less really is more!
Seven KPI’s, plus the Balanced Scorecard, are all you need to holistically measure and manage your Desktop Support organization
Understand the cause-and-effect relationships between KPI’s
This gives you the power to achieve desired outcomes in Desktop Support!
Not all Desktop Support KPI’s are controllable
Focus on the controllable KPI’s
Effective application of Desktop Support KPI’s can help to:
Drive high levels of Customer Satisfaction for all of IT
Reduce and minimize Total Cost of Ownership for End-User Support
Return productive hours to end users
65. Metrics: The Foundation of Success in Desktop Support!
METRICS
PEOPLE
PROCESS
TECH
67. Please make sure to complete an
evaluation on your smartphone by going to
www.HDIConference.com/Eval, or by going
to the HDI 2014 mobile conference app.
Thank you for attending this session
70. Benchmarking is MetricNet’s Core Business
Call Centers: Technical Support, Customer Service, Telemarketing/Telesales, Collections
Information Technology: Service Desk, Desktop Support, Field Support
Telecom: Price Benchmarking
Satisfaction: Customer Satisfaction, Employee Satisfaction
72. More than 1,100 Desktop Support Benchmarks
Global Database
30+ Key Performance Indicators
Nearly 60 Industry Best Practices
25 Years of Desktop Support Data
73. Your Presenter: Jeff Rumburg
Jeff Rumburg is a co-founder and Managing Partner at MetricNet,
LLC. Jeff is responsible for global strategy, product development,
and financial operations for the company. As a leading expert in
benchmarking and re-engineering, Mr. Rumburg authored a best-selling book on benchmarking, and has been retained as a
benchmarking expert by such well-known companies as American
Express, Hewlett-Packard, and GM. Prior to co-founding
MetricNet, Mr. Rumburg was president and founder of The Verity
Group, an international management consulting firm specializing in
IT benchmarking. While at Verity, Mr. Rumburg launched a number
of syndicated benchmarking services that provided low cost
benchmarks to more than 1,000 corporations worldwide.
Mr. Rumburg has also held a number of executive positions at META Group, and Gartner, Inc. As
a vice president at Gartner, Mr. Rumburg led a project team that reengineered Gartner's global
benchmarking product suite. And as vice president at META Group, Mr. Rumburg's career was
focused on business and product development for IT benchmarking. Mr. Rumburg's education
includes an M.B.A. from the Harvard Business School, an M.S. magna cum laude in Operations
Research from Stanford University, and a B.S. magna cum laude in Mechanical Engineering. He
is author of A Hands-On Guide to Competitive Benchmarking: The Path to Continuous Quality
and Productivity Improvement, and has taught graduate-level engineering and business courses.
Mr. Rumburg serves on the Strategic Advisory Board for HDI.
74. Meet a Sampling of Our Clients
MetricNet conducts benchmarking for IT Service and Support
organizations worldwide, and across virtually every industry sector.
75. You Can Reach MetricNet…
By Phone…
703-992-7559
On Our Website…
www.metricnet.com
Or E-mail us…
info@metricnet.com
76. The 80/20 Rule for Desktop KPI’s:
Less is More!
Jeff Rumburg • MetricNet