IT benchmarking (the comparison of IT costs and IT prices with those of market leaders, other companies in your industry, or even another sector) can be a valuable management tool. Its effectiveness, though, depends on the quality of the reference data and on a professionally conducted benchmark (reference model).
LEXTA is a leader in IT benchmarking; this profile offers a deeper look into its practices and experience in the field.
2. For technically-oriented benchmarking, LEXTA has access to current data from
approx. 1,500 individual contracts containing approx. 50,000 price and cost items
IT BENCHMARKING METHODOLOGY
IT benchmarking splits into two strategies, each with its own procedure and results:

Technically-oriented benchmarking
Procedure:
• Analysis of IT service scope, with demarcation of standard and non-standard service components
• Determination of IT costs, with projection onto a standardised service scope for all IT services under consideration
Results:
• Comprehensive statement on the gap between the IT service providers with excellent cost / performance positions (lessons learned)
• Reliable identification of levers and approaches for the optimisation of costs, with an estimation of potential

Business-oriented benchmarking
Procedure:
• Analysis of IT costs
• Determination of business-related KPIs (see the sketch below), e.g.:
  – IT costs vs. sales or as a share of overall costs
  – IT costs per user / employee
  – IT costs per subscriber, etc.
Results:
• Business- and cost-related IT KPIs for industry-specific trend comparisons
• Indicator of customary market practice for user requirements
• Limited consideration of quality standards, functional scope and company-specific characteristics
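The business-related KPIs above reduce to simple ratios. A minimal sketch, with all input figures hypothetical and for illustration only:

```python
# A minimal sketch of the business-oriented KPIs listed above.
# All input figures below are hypothetical.

def business_kpis(it_costs: float, sales: float, total_costs: float,
                  users: int, employees: int) -> dict:
    """Compute the business-related KPIs named on the slide."""
    return {
        "it_costs_vs_sales_pct": 100 * it_costs / sales,
        "it_share_of_total_costs_pct": 100 * it_costs / total_costs,
        "it_costs_per_user_eur": it_costs / users,
        "it_costs_per_employee_eur": it_costs / employees,
    }

# Hypothetical company: EUR 40m IT costs, EUR 2bn sales,
# EUR 1.6bn total costs, 18,000 IT users, 20,000 employees.
for kpi, value in business_kpis(40e6, 2e9, 1.6e9, 18_000, 20_000).items():
    print(f"{kpi}: {value:,.2f}")
```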
3. Standardisation is the key in IT management – LEXTA can provide benchmarks for all standard IT services across the entire lifecycle (plan, build, operate, support)
IT VALUE CREATION PYRAMID
From the top to the base of the pyramid:
• Individual applications
• Standard applications (e.g. SAP/ERP)
• Databases (e.g. Oracle)
• Operating systems (e.g. UNIX, Windows)
• Hardware (processors, storage, network)
• Data centre facilities (building, air conditioning, etc.)
Coverage (colour-coded on the original chart): standard IT benchmarking, LEXTA-specific cost estimate (ProBIT), and LEXTA-specific application benchmarking.
4. All IT benchmarking is based on clearly defined service categories
TYPICAL CATEGORIES FOR IT BENCHMARKING
Networks and Telephony: LAN / wireless LAN, WAN, VPN, Telephony, Internet access (incl. firewall)
Data Centre: Server, Data centre LAN, Storage, Backup and restore, Archive, Co-location, Output management
Platforms and Databases: Databases, Middleware, Web services / portals
Applications: Standard applications (SAP, incl. IS modules), Custom applications, Application development
End-User Computing: Desktop, Notebook, Thin client, Tablet, Smartphone, Printer / peripherals, Virtual client / terminal services, Messaging (e.g. email), Collaboration platforms, Standard software / software distribution, File / print services, Basic services (e.g. ADS)
Cross-category: Service desk / user help desk, Field service (Break & Fix, IMACD), Daily rates, Overheads and general expenses
5. BITKOM’s extensive benchmarking standards…
CONFORMITY WITH BITKOM BENCHMARKING STANDARDS (1 OF 2)
Methodological competence
• Standardised approach towards data collection and
analysis
• Identification and quantification of cost drivers
• Verification of data and procedures to identify possible
inconsistencies and gaps
Experience
• Methodological competence in data collection, measurement
and comparison
• Seniority, experience and continuous availability of
the person responsible for benchmarking
• Substantial references from benchmarking projects
Principles and code of conduct
• Open and fair conduct
• Observation of legal requirements and confidentiality in
data handling
• Comprehensive and truthful data collection
• Conformity with schedules and milestones
• Continuity of project team
• Target-oriented preparation and hosting of project meetings
Provisioning of an up-to-date, comprehensive pool of data
• Coverage of all service components
• Coverage of service range and quality
• Up-to-date data (< 15 months)
• Database granularity as regards service depth
• In-house database
• First-hand data collection
• Data quality assurance
• Uniform and standardised services
• Frequent updates
• Geographical coverage: Europe and North America (global within GBN)
6. …are covered almost perfectly by LEXTA
CONFORMITY WITH BITKOM BENCHMARKING STANDARDS (2 OF 2)
Data collection and analysis
• Compilation of technical performance indicators
• Collection of service quality data, with particular focus on
cost drivers
• Recording of the individual company setting
• Recording of governance and business processes
• Creation of peer groups
• Development of categories based on SLAs and / or services
• Assessment of special financial items (e.g. asset or human
resource transfers, ‘cosmetic’ pricing adjustments, risk
spreading)
• Data standardisation using price deductions and premiums
• Transparent standardisation
Full compilation of all factors influencing price
• Service specifications
• Quality
• Complexity
• Volumes / amounts
• Contract duration
• Special arrangements
Project organisation
• Application of a three-tiered organisational model (steering
committee, project management and team)
• Organisation of data collection workshops and
coaching sessions
• Realistic project planning
Presentation of results
• Market conformity shown for the results
• Best in class results shown
• Extensive presentation of the results
• Comparison within the same industry sector
• Comparison with other industries
• Peer group sample size (>= 6)
• Indicator analysis
• Derivation of levers
• Clear and tangible results
• Proof of experience via existing references
7. All IT benchmarks are based on continuously updated data, hand-picked and quality-
assured by LEXTA consultants
LEXTA BENCHMARKING DATABASE
Criteria and core content:

Data quality
• Collected at first hand by LEXTA consultants during benchmarking, sourcing and cost optimisation projects; all data is quality-assured
• Data is gathered in close cooperation with the customer during contract and cost analyses, and is verified explicitly in joint workshops
• No third-party data, no research data

Data currency
• Benchmarks use recent data with a maximum age of 12 months
• For trend extrapolation, time series data is available from the last 10 years

Data scope
• Current data pool: approx. 250 companies, approx. 1,500 individual contracts, approx. 50,000 records (2,500 records for application operation benchmarks)

Levers and recommendations
• All benchmarking categories can draw on an extensive collection of levers, price drivers and recommendations
8. A typical IT benchmarking project consists of six phases
IT BENCHMARKING PROCEDURE
Phase 1 – Preparation
Content:
• Kick-off
• Select contact persons for benchmarking
• Classify service modules
• Scheduling
Result: project framework

Phase 2 – Collection of initial data
Content:
• Collect information on service portfolio, quantity structures, SLAs, security and costs
• Analyse overheads (if necessary)
• Research company's specific regulations
Result: report on current situation

Phase 3 – Normalisation
Content:
• Ensure comparability of IT services
• Consider SLA criteria
• Scale client data to parameters of comparable systems
Result: report on correction factors

Phase 4 – Selection of peer group
Content:
• Define selection criteria for peer groups
• Select inner and outer peer groups as benchmark references
• Establish clarity as regards comparability and potential limitations
Result: peer group selected

Phase 5 – Execution of benchmarking
Content:
• Calculate costs per IT service
• Determine ranking in comparison to competitors
• Show costs in comparison to peer groups
Result: benchmarking results

Phase 6 – Identification of levers
Content:
• Identify causes of cost differences versus peer group
• Identify potential for improvement
• Plan implementation
Results: improvement potential; overview of levers
9. Data recording in templates is based on a detailed examination of the service descriptions for the services in scope
TEMPLATE STRUCTURE
Worksheet basic data
• Responsible persons for data delivery and quality
• Data recording version
Worksheet service scope
• Definition of service scope
Worksheet quantity structure
• Recording of relevant quantities
Worksheet service levels
• Definition of service levels
Worksheet technology
• Specification of technologies used for service delivery
Worksheet costs
• Recording of costs (labour, hardware, software, third-
party and material costs) for the service
Worksheet special data
• Identification of additional company-specific information
and cost drivers
Template content and structure (example header of the form):
• IT Benchmarking – Service: Service desk
• Version recorded: 0.1
• Date of data recording
• Customer contact
• LEXTA consultant
An extra form (cover sheet) is available for cross-subject data (security, contract conditions, etc.).
10. Within the normalisation process, comparability between benchmarking participant
data is ensured by normalising it against a common reference model
APPROACH TO NORMALISATION
Data from each benchmarking participant (company 1, company 2, …, company n) and from the client company is normalised against a common reference model. Both normalised and non-normalised data from the companies is available in the benchmarking database.
11. Normalisation is not a 'black box' but a transparent computation
NORMALISATION – SERVICE DESK PROJECT EXAMPLE
In EUR per ticket; number of tickets per month: 2,500.

Normalisation requirement 1 (service desk): no customer identification prior to call pick-up by agent
• Approach: manual identification in approx. 50 % of the calls (LEXTA estimate); identification cost of 20 seconds per call (LEXTA estimate); customary hourly market rate for qualified support of €65.50 (benchmark mean value)
• Agreed by contract: no; normalisation of: peer group; markup: +0.22

Normalisation requirement 2: fault analysis only partially included
• Approach: estimate of 4 hours per month (LEXTA estimate) for the additional fault analysis; customary hourly market rate for qualified support of €65.50 (benchmark mean value)
• Agreed by contract: yes; normalisation of: peer group; markup: +0.10

Normalisation requirement 3: extended reporting partly included
• Approach: estimate of 4 hours per month (LEXTA estimate) for the preparation of further reports; customary hourly market rate for reporting specialist of €87.50 (benchmark mean value)
• Agreed by contract: yes; normalisation of: peer group; markdown: –0.14

Normalisation total (applied to the peer group): +0.18
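A minimal sketch of the per-ticket arithmetic above. The rates and effort estimates come from the table; spreading a monthly effort estimate across the ticket volume is our reading of the example, not LEXTA's actual tooling, and the per-call conversion in requirement 1 is not fully recoverable from the slide:

```python
TICKETS_PER_MONTH = 2_500
SUPPORT_RATE_EUR_H = 65.50    # qualified support (benchmark mean value)
REPORTING_RATE_EUR_H = 87.50  # reporting specialist (benchmark mean value)

def effort_per_ticket(hours_per_month: float, hourly_rate: float) -> float:
    """Spread a monthly effort estimate across the monthly ticket volume."""
    return hours_per_month * hourly_rate / TICKETS_PER_MONTH

# Requirement 2: additional fault analysis -> markup (~ +0.10)
fault_analysis = effort_per_ticket(4, SUPPORT_RATE_EUR_H)

# Requirement 3: extended reporting -> markdown (~ -0.14)
extended_reporting = -effort_per_ticket(4, REPORTING_RATE_EUR_H)

# Requirement 1: manual caller identification (50 % of calls, 20 s each).
# Naive per-call arithmetic (0.5 * 20/3600 * 65.50) gives ~0.18 EUR, so the
# slide's +0.22 presumably also reflects a call-to-ticket ratio that is not
# recoverable here; we therefore use the published figure.
caller_identification = 0.22

total = caller_identification + fault_analysis + extended_reporting
print(f"fault analysis:      {fault_analysis:+.2f} EUR/ticket")
print(f"extended reporting:  {extended_reporting:+.2f} EUR/ticket")
print(f"normalisation total: {total:+.2f} EUR/ticket")  # +0.18, as on the slide
```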
12. The peer groups are selected on the basis of quantitative and qualitative criteria
SELECTION OF PEER GROUP
Service level and security requirements are rated on a three-step scale (shown as symbols on the original slide): comparable without normalisation / minor normalisation necessary / needs to be discussed.

No. | Industry | Region | Number of users | IT provided internally, integrated or externally
Peer company | – | EMEA, FTAA, AFTA | 150,000 | External
1 | Transport and logistics | EMEA | 60,000 | Integrated
2 | Utility company | EMEA | 95,000 | Integrated
3 | Automotive | EMEA, FTAA, AFTA | 100,000 | External
4 | IT service provider | EMEA | 100,000 | External
5 | Chemical industry | EMEA, FTAA, AFTA | 110,000 | Integrated
6 | Transport and logistics | EMEA, FTAA, AFTA | 120,000 | Integrated
7 | Financial services | EMEA, FTAA, AFTA | 150,000 | Internal
8 | Industry | EMEA, FTAA, AFTA | 180,000 | External
9 | Telecommunications | EMEA, FTAA, AFTA | 240,000 | Integrated
10 | Automotive | EMEA, FTAA, AFTA | 260,000 | External
PROJECT EXAMPLE
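As an illustration of the quantitative side of this selection, the sketch below filters peer candidates by user count and regional overlap. The ±80 % size band and the rule itself are assumptions for illustration, not LEXTA's actual criteria; the qualitative ratings (service level, security requirements) still need manual review:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    industry: str
    regions: set
    users: int

def is_peer(candidate: Candidate, client: Candidate,
            size_band: float = 0.8) -> bool:
    """Keep candidates whose user count lies within the size band around
    the client and whose regions overlap the client's regions."""
    lower = client.users * (1 - size_band)
    upper = client.users * (1 + size_band)
    return (lower <= candidate.users <= upper
            and bool(candidate.regions & client.regions))

client = Candidate("Peer company", {"EMEA", "FTAA", "AFTA"}, 150_000)
candidate = Candidate("Transport and logistics", {"EMEA"}, 60_000)
print(is_peer(candidate, client))  # True: 60,000 users lie within 30,000..270,000
```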
13. Each benchmark result is presented in an anonymised chart, indicating how the prices of service providers and other market participants compare
BENCHMARKING RESULT SERVICE DESK (PROJECT EXAMPLE)
In EUR per ticket; example client price: 16.45

Benchmark prices by rank:
Rank 1: 12.39
Rank 2: 12.88
Rank 3: 13.25
Rank 4: 15.16
Rank 5: 15.57
Rank 6: 16.82
Rank 7: 19.14
Rank 8: 20.17
Rank 9: 21.86
Rank 10: 24.22
Arithmetic mean: 17.15
1st quartile: 13.73
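The chart's summary statistics can be reproduced from the ten ranked prices; Python's 'inclusive' quantile method matches the slide's first-quartile value of 13.73:

```python
import statistics

prices = [12.39, 12.88, 13.25, 15.16, 15.57,
          16.82, 19.14, 20.17, 21.86, 24.22]   # ranks 1..10, EUR per ticket
client_price = 16.45

mean = sum(prices) / len(prices)                               # 17.15
q1 = statistics.quantiles(prices, n=4, method="inclusive")[0]  # 13.73

print(f"arithmetic mean: {mean:.2f}, 1st quartile: {q1:.2f}")
rank = sum(p < client_price for p in prices) + 1
print(f"client at {client_price:.2f} would rank {rank} of {len(prices) + 1}")
```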
14. RECOMMENDED ACTIONS – SERVICE DESK (PROJECT EXAMPLE)
1. Observation: the number of tickets per user is significantly above the market level (customer: 0.85; market: 0.65). Recommended actions: check the ticket volume after successful completion of the Windows 7 migration; investigate staff training needs.
2. Observation: the proportion of forwarded tickets is 50 percent and thus higher than the market average of 30 percent. Recommended actions: check the troubleshooting database and the procedure for knowledge transfer within the various departments; check the process steps used for problem management.
3. Observation: staff have very good language qualifications; the service desk is offered in four languages. Recommended action: perform a requirements analysis with the goal of reducing the number of languages offered.
4. Observation: the average service desk response time is 350 seconds and thus much higher than the market (20 seconds). Recommended actions: increase staffing levels at peak times; reduce the maximum problem resolution time before forwarding.
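A minimal sketch of the KPI gap check behind these observations. The KPI names and values come from the slide; the 10 % tolerance used to flag a KPI is an assumption for illustration:

```python
MARKET = {"tickets_per_user": 0.65, "forwarded_tickets_share": 0.30,
          "response_time_s": 20.0}
CUSTOMER = {"tickets_per_user": 0.85, "forwarded_tickets_share": 0.50,
            "response_time_s": 350.0}

for kpi, market_value in MARKET.items():
    gap = CUSTOMER[kpi] / market_value - 1          # relative deviation
    flag = "investigate" if gap > 0.10 else "ok"    # assumed 10 % tolerance
    print(f"{kpi}: customer {CUSTOMER[kpi]} vs market {market_value} "
          f"({gap:+.0%}) -> {flag}")
```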
15. During application benchmarking, LEXTA distinguishes between standard and
customised applications
APPLICATION BENCHMARKING METHODS
Standard application benchmark
• Approach: standard applications (e.g. SAP) share specific 'standards' and are therefore compared only with other standard applications; the applications compared have the same functionality (e.g. SAP BW)
• Services included: application operations, basic operations, infrastructure
• Comparison group: standard applications from other companies, comparable in size / industry and with a comparable quantity structure (users, incidents, changes, etc.)

Customised application benchmark
• Approach: customised applications are unique – they are compared to other applications on the basis of operational cost drivers; the applications compared may be functionally distinct
• Services included: application operations, infrastructure (optional)
• Comparison group: applications with comparable operational cost drivers (incidents, changes, service time, interfaces (APIs), etc.)
16. Reliable end-to-end benchmarks are available for the operation of SAP and standard
applications
BENCHMARKING APPROACH TO SAP AND OTHER STANDARD APPLICATIONS
Applications covered:
• SAP: ERP, IS-U, BI, CRM, HCM, Mobile, Portal, other
• Microsoft: Dynamics NAV, Dynamics AX
• Other: Salesforce CRM, EDI, Oracle E-Business Suite, other (e.g. CAD, PLM)

Benchmark database metrics (end-to-end benchmarks): EUR per SAP landscape, EUR per SAP product, EUR per SAPS*, EUR per user, EUR per invoice, EUR per infocube, EUR per account, EUR per transaction, EUR per master record, …

End-to-end benchmarks include:
• Licenses
• Cost of maintenance
• Basic application operation
• Infrastructure: databases, servers, storage, backup and restore, DC LAN, co-location

Not included:
• Larger implementation / application integration projects
• Service desk / user help desk

* SAPS: SAP Application Performance Standard
17. Benchmarks for the operation of individual applications are based on the parameters that constitute the primary cost drivers in application operations
BENCHMARKING APPROACH TO INDIVIDUAL APPLICATIONS
Primary cost drivers (correlated with operational effort):
• Volume of incidents (easy / medium / complex)
• Volume of problems (easy / medium / complex)
• Volume of changes (easy / medium / complex)
• Time of service, incl. call-out service
• Volume of logical partitions
• Volume of APIs
• Volume of users
• Platform
• Availability
• Service request volume
• Parametrisation volume

Benchmark database: best-practice costs for application management, based on over 1,000 benchmarked application instances and regression analyses (a sketch of such a regression follows below).

Infrastructure components covered alongside application operations: databases, server, storage, backup and restore, DC LAN, co-location, service desk / user help desk, …
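An illustration of correlating cost drivers with operational effort: an ordinary least-squares fit of monthly effort against a subset of the drivers listed above. All data here is synthetic and the coefficients are invented; only the technique is shown, not LEXTA's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic application instances

incidents = rng.poisson(40, n)        # incidents per month
changes = rng.poisson(10, n)          # changes per month
apis = rng.integers(1, 30, n)         # number of interfaces (APIs)
users = rng.integers(50, 5_000, n)    # number of users

# Assumed 'true' relationship plus noise, in person hours per month
effort = (20 + 0.8 * incidents + 3.0 * changes + 1.5 * apis
          + 0.01 * users + rng.normal(0, 5, n))

X = np.column_stack([np.ones(n), incidents, changes, apis, users])
coef, *_ = np.linalg.lstsq(X, effort, rcond=None)
print(dict(zip(["base", "per_incident", "per_change", "per_api", "per_user"],
               coef.round(3))))
```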
18. The customer gains a competitive advantage by using SAP BI, but it comes at a price
KPI OVERVIEW FOR SAP BI SALES (PROJECT EXAMPLE)
Each KPI is rated on a scale of 1 - 5 (axis labels on the original chart).

Improvement of business agility:
• Age / lifecycle
• Differentiation of functionality
• Customer interface
• Number of stakeholders
• Maturity of documentation
• Number of major release changes
• Build / run ratio
• Number of weighted changes
• Implementation time for changes
• Projects in time / in quality / in budget
• Maturity of programming language
• Number of interfaces
• Innovation strategy
• Disaster recovery

Reduction of application costs:
• Cost per infocube
• Average cost per change
• Incident / problem ratio
• Distribution on- / near- / offshore
• Number of 3rd-level tickets
• External maintenance costs
• Number of incidents p. a.
• Ratio hotfixes / changes
• Post-release incidents / changes

Further architecture-related KPIs shown on the chart: nucleus; availability, SaaS, cloud; standardised platform and programming environment; tier levels; virtualisation.
19. Three methods are used for benchmarking application development
METHODS FOR BENCHMARKING APPLICATION DEVELOPMENT
Application development effort (volume in person days, costs in EUR)

1. Benchmarking of development costs (see the sketch after this list)
• Use of Function Point Analysis to evaluate product complexity
• Method: market price or cost benchmark, i.e. compare development costs with similarly complex projects from other companies
• Demonstrate differences compared to the very best on the market and identify proper instruments

2. Analysis of productivity ('Productivity Index')
• Evaluation of product complexity and size in comparison with the market ('XL, L, M, S, XS')
• Evaluation of the productivity index as a percentage
• Use of a standardised record sheet for data collection
• Method: interviews with 6 - 8 key stakeholders, e.g. project manager, developer, tester, division, service provider, operations
• Identification of proper instruments to improve productivity

3. Hourly rate benchmarking
• Evaluation of the appropriateness of the hourly development rate in comparison with the market
• Method: market price or cost benchmark using the SFIA* model, either according to skills (e.g. IT Engineer, Consultant, Project Manager) or according to bundles (e.g. R1 – creation of product specification and solution concept)
• Identification of proper instruments

* SFIA: Skills Framework for the Information Age
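A sketch of the first method: comparing development cost per function point against similarly complex projects. All figures, including the market references, are hypothetical:

```python
def cost_per_fp(total_cost_eur: float, function_points: int) -> float:
    """Normalise development cost by functional size."""
    return total_cost_eur / function_points

project = cost_per_fp(480_000, 600)   # hypothetical client project: 800 EUR/FP
market_best = 550.0                   # EUR/FP, best in class (illustrative)
market_mean = 820.0                   # EUR/FP, market mean (illustrative)

print(f"client: {project:.0f} EUR/FP, "
      f"gap to best in class: {project / market_best - 1:+.0%}, "
      f"gap to market mean: {project / market_mean - 1:+.0%}")
```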
20. Application development productivity is evaluated using an adapted standardised
record sheet
ANALYSIS OF PRODUCTIVITY: ACTUAL SUBMISSION AND RESULTS FORMAT
Results are visualised as a radar chart (scale 10 % - 100 %) across five dimensions: project organisation, staff, product, processes and environment.

EXAMPLE record sheet (criteria | specification | value):
Project organisation:
• Project sponsor motivation | Low | 2
• Project sponsor influence | Low | 2
• Goal uniqueness | Low | 2
• Goal conformity | High | 4
• Risk management effectivity and efficiency | Medium | 3
• Development site volume | High | 1
• Proportion of near- / offshoring | Medium | 2
Staff:
• Team experience | High | 4
• Team cohesion | High | 4
• Project manager experience | Medium | 3
• Availability of key personnel | Low | 2
• Quality of communication / information exchange within the team | Medium | 3
• Employment of specialists | Medium | 3
• Employee turnover | Low | 3
Product:
• Stability of requirements | Medium | 3
• Quality of reused products | High | 4
• Quality of development environment used | High | 4
• …
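One plausible way to aggregate the record sheet into the percentage index described above is to average the criterion scores per dimension and divide by the maximum score. The five-point maximum and equal weighting are assumptions; the values are those shown in the example sheet:

```python
SHEET = {
    "Project organisation": [2, 2, 2, 4, 3, 1, 2],
    "Staff": [4, 4, 3, 2, 3, 3, 3],
    "Product": [3, 4, 4],   # truncated in the example sheet ('…')
}
MAX_SCORE = 5  # assumed maximum of the rating scale

for dimension, scores in SHEET.items():
    print(f"{dimension}: {sum(scores) / (len(scores) * MAX_SCORE):.0%}")

all_scores = [s for scores in SHEET.values() for s in scores]
overall = sum(all_scores) / (len(all_scores) * MAX_SCORE)
print(f"Overall productivity index: {overall:.0%}")
```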
21. Development project size, complexity and productivity can be correlated to produce
a comprehensive evaluation
APPLICATION DEVELOPMENT BENCHMARK – ILLUSTRATION OF RESULTS
Projects are plotted with application size / complexity (XS, S, M, L, XL) on the horizontal axis and the productivity index (0 % - 100 %) on the vertical axis; bubble size indicates project cost in EUR thousand (100 / 50 / 25 / 10). The chart quadrants distinguish:
• Basic project, well managed
• Difficult project, well managed
• Basic project, badly managed (costs too high)
• 'Disastrous project' (costs much too high)
In the example, projects 1 - 4 are plotted across these quadrants.
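Reading the four quadrant labels as a function of size/complexity and productivity, a sketch of the classification might look like this; the 50 % threshold and the size split are assumptions for illustration:

```python
def classify(size: str, productivity: float) -> str:
    """Map a project's size class and productivity index to a quadrant."""
    difficult = size in ("L", "XL")   # assumed split between basic and difficult
    if productivity >= 0.5:           # assumed threshold for 'well managed'
        return ("Difficult project, well managed" if difficult
                else "Basic project, well managed")
    return ("'Disastrous project' (costs much too high)" if difficult
            else "Basic project, badly managed (costs too high)")

print(classify("M", 0.70))   # Basic project, well managed
print(classify("XL", 0.30))  # 'Disastrous project' (costs much too high)
```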
22. Business processes can be benchmarked in their entirety or in terms of specific
sub-processes
BENCHMARKING PROCESS FOR HUMAN RESOURCES
ILLUSTRATIVE KPIs per HR sub-process:

HR Marketing:
• Number of FTEs per 1,000 staff
• Ø personnel costs per FTE

HR Planning:
• Number of FTEs per 1,000 staff
• Ø personnel costs per FTE

HR Procurement:
• Number of FTEs per 1,000 staff
• Ø personnel costs per FTE
• Processing time per application
• Hiring rate
• Application rate

Payroll:
• Number of FTEs per 1,000 staff
• Ø personnel costs per FTE
• Costs per payslip
• Payroll processing time
• Number of payslips with errors per 1,000 staff

HR Management:
• Number of FTEs per 1,000 staff
• Ø personnel costs per FTE

HR Development:
• Number of FTEs per 1,000 staff
• Ø personnel costs per FTE
• Training and CPD costs / total personnel costs
• Ø personnel costs per trainee
• Ø number of CPD days per employee

HR department (overall):
• Number of FTEs per 1,000 staff
• Average personnel costs per HR dept. employee
• Average HR dept. costs per employee
• Staff turnover
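For illustration, the overall HR department KPIs reduce to simple ratios; the inputs below are hypothetical:

```python
def hr_kpis(hr_fte: float, staff: int, hr_personnel_costs: float,
            hr_dept_costs: float, leavers: int) -> dict:
    """Compute the overall HR department KPIs listed above."""
    return {
        "hr_fte_per_1000_staff": 1_000 * hr_fte / staff,
        "avg_personnel_costs_per_hr_fte_eur": hr_personnel_costs / hr_fte,
        "avg_hr_dept_costs_per_employee_eur": hr_dept_costs / staff,
        "staff_turnover_pct": 100 * leavers / staff,
    }

# Hypothetical: 120 HR FTEs serving 10,000 staff
for kpi, value in hr_kpis(120, 10_000, 9.6e6, 14e6, 800).items():
    print(f"{kpi}: {value:,.1f}")
```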
23. Please contact us for further information
CONTACT DETAILS
Matthias Seidl
Managing Director
Dorotheenstraße 37
10117 Berlin
Germany
http://www.lexta.com
Phone +49 30 887124-122
Fax +49 30 887124-20
Mobile +49 179 6617730
Email seidl@lexta.com
24. LEXTA offers a performance portfolio based on IT benchmarking that covers all of the key IT management issues
LEXTA PERFORMANCE PORTFOLIO
• IT Benchmarking
• IT Governance
• IT Security
• IT Service Catalogue
• IT Key Performance Indicators
• IT Sourcing
• IT Cost Optimisation
• IT Strategy / IT Business Value
• Project Management