The document describes Compuware's Performance Journey℠ assessment process. The assessment identifies an organization's current application performance management (APM) capabilities, maps business and IT goals, and analyzes any gaps. It involves workshops with stakeholders to review current practices, score capabilities, and identify improvement options. The assessment results typically show gaps in APM capability adoption, differences between aspirations and strategies, process deficiencies, organizational blockers, skills deficiencies, and gaps in available tooling. The goal is to develop a roadmap to achieve APM excellence based on the organization's capabilities and goals.
2. What Does it Take to be Great?
• Dribbling
• Passing
• Tackling
• Shooting
• Heading
• Teamwork
3. What Does it Take to be Great at APM?
• Performance Measurement: the ability to measure application performance from the end-user's perspective across the entire application delivery chain
• Problem Resolution: the ability to identify, isolate the fault domain, determine root cause, and resolve application performance problems
• Performance Improvement: the ability to continuously identify, prioritize, implement, and measure the results of application improvement opportunities
• Production Readiness: the ability to ensure user experience can scale with load prior to launching new applications or deploying infrastructure changes
• Performance Reporting: the ability to provide role-specific insight using common metrics, enabling superior business-oriented IT decision-making
4. A Model of APM Maturity
Level 1 (REACTIVE):
• Performance Measurement: No awareness of user experience
• Problem Resolution: Reactive problem resolution
• Performance Improvement: Ad hoc "gut feel" approach to improvements
• Production Readiness: "Test in Production" approach to new technologies
• Performance Reporting: Few SLAs / reporting on technology
Level 2 (AWARE):
• Performance Measurement: Basic awareness and ownership
• Problem Resolution: Reactive resolution but can validate
• Performance Improvement: Some improvements using baseline
• Production Readiness: Best effort focused on code / infrastructure
• Performance Reporting: SLAs / reporting have basic end-user metrics
Level 3 (EFFECTIVE):
• Performance Measurement: Own user experience across the chain
• Problem Resolution: Increasingly proactive awareness
• Performance Improvement: Can pinpoint specific causes of issues
• Production Readiness: Load testing focused on the application
• Performance Reporting: SLAs / reporting on end-to-end performance
Level 4 (OPTIMIZED):
• Performance Measurement: Deeper level of understanding tied to business
• Problem Resolution: Automation of issue analysis and diagnostics
• Performance Improvement: Improvement based on Six Sigma or ITIL
• Production Readiness: Load testing focused on user experience
• Performance Reporting: SLAs / reporting tied to business metrics
Level 5 (PERVASIVE, Best Practice):
• Performance Measurement: Real-time visibility drives service delivery
• Problem Resolution: Automation of resolution before business impact
• Performance Improvement: Improvement processes & auto implementation
• Production Readiness: Designed with performance in mind
• Performance Reporting: SLAs / reporting is competitive advantage
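The matrix lends itself to a simple machine-readable form. Below is a minimal, purely illustrative Python sketch (not part of any Compuware tool; all names are assumptions) that encodes two of the rows and looks up the description for a given aspect and level:

```python
# Hypothetical encoding of the APM maturity matrix for scoring purposes.
# Level descriptions are taken from the model above; the code structure
# itself is illustrative only.
MATURITY_LEVELS = ["REACTIVE", "AWARE", "EFFECTIVE", "OPTIMIZED", "PERVASIVE"]

MATURITY_MODEL = {
    "Performance Measurement": [
        "No awareness of user experience",
        "Basic awareness and ownership",
        "Own user experience across the chain",
        "Deeper level of understanding tied to business",
        "Real-time visibility drives service delivery",
    ],
    "Problem Resolution": [
        "Reactive problem resolution",
        "Reactive resolution but can validate",
        "Increasingly proactive awareness",
        "Automation of issue analysis and diagnostics",
        "Automation of resolution before business impact",
    ],
    # Performance Improvement, Production Readiness and Performance
    # Reporting follow the same pattern.
}

def describe(aspect: str, level: int) -> str:
    """Return the description for a 1-based maturity level of an aspect."""
    return f"Level {level} ({MATURITY_LEVELS[level - 1]}): {MATURITY_MODEL[aspect][level - 1]}"

print(describe("Problem Resolution", 3))
# Level 3 (EFFECTIVE): Increasingly proactive awareness
```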
5. Improving APM Maturity
The improvement framework drills down from Core APM Aspects to Key Capabilities for Each Aspect to Best Practices for Each Capability. For example, the Performance Measurement aspect includes these key capabilities:
• Measure end-user experience based on key repeatable "control" transactions
• Measure the end-user experience based on real end-user response time
• Measure transaction-level performance across all tiers of the application delivery chain
An example best practice for the "control" transaction capability: apply synthetic or robotic monitoring, using scripted performance and availability measurements, to identify problems specific to geographies, to alert on availability problems, and to provide controlled, repeatable end-user experience measurements ideal for response-time SLAs.
The other aspects (Performance Reporting, Problem Resolution, Performance Improvement, and Production Readiness) are broken down the same way.
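As a rough sketch of that synthetic (robotic) monitoring best practice, the following Python snippet scripts a single "control" transaction, records availability and response time, and flags an SLA breach. The URL, threshold, and function names are hypothetical placeholders, not any particular monitoring product's API:

```python
import time
import urllib.request
import urllib.error

# Placeholder values; a real deployment would script key business
# transactions and run them from multiple geographies on a schedule.
CONTROL_TRANSACTION_URL = "https://example.com/checkout"
RESPONSE_TIME_SLA_SECONDS = 2.0

def run_control_transaction(url: str) -> dict:
    """Execute one scripted measurement and report availability and timing."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            available = 200 <= response.status < 400
    except (urllib.error.URLError, TimeoutError):
        available = False
    elapsed = time.monotonic() - start
    return {
        "available": available,
        "response_time_s": elapsed,
        "sla_breached": (not available) or elapsed > RESPONSE_TIME_SLA_SECONDS,
    }

result = run_control_transaction(CONTROL_TRANSACTION_URL)
if result["sla_breached"]:
    # A real probe would raise an alert here (email, pager, APM console).
    print(f"ALERT: availability/SLA problem: {result}")
else:
    print(f"OK: {result['response_time_s']:.2f}s")
```

Because the same scripted transaction runs repeatedly, the measurements stay comparable over time, which is what makes them suitable for response-time SLAs.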
9. “If you want to build a ship,
don't drum up people together to collect wood
and don't assign them tasks and work,
but rather teach them to long for the endless
immensity of the sea”
Antoine De Saint-Exupery
An approach to achieving application
performance excellence and developing a
roadmap based on organizational
capabilities and goals
Performance Journey℠ Vision
10. Performance Journey℠ Assessment
• Interactive series of workshops with key stakeholders
• Identifies current capability baseline, maps business and IT goals, and analyzes gaps
[Chart: gap between the current capability Baseline and the Goal for each APM aspect]
12. Performance Journey℠ Assessment Results
Example results plot each aspect's current baseline and target goal against the maturity model (Level 1 REACTIVE through Level 5 PERVASIVE), exposing the gap between them:
• Performance Reporting: baseline Level 1 (Few SLAs / reporting on technology); goal Level 3 (SLAs / reporting on end-to-end performance) (GAP)
• Production Readiness: baseline Level 1 ("Test in Production" approach to new technologies); goal Level 4 (Load testing focused on user experience) (GAP)
• Performance Improvement: baseline Level 2 (Some improvements using baseline); goal Level 4 (Improvement based on Six Sigma or ITIL) (GAP)
• Problem Resolution: baseline Level 1 (Reactive problem resolution); goal Level 4 (Automation of issue analysis and diagnostics) (GAP)
• Performance Measurement: baseline Level 2 (Basic awareness and ownership); goal Level 4 (Deeper level of understanding tied to business) (GAP)
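A minimal sketch of how such baseline/goal results could be tabulated and the gaps computed, using the example levels shown above (the data structure and output format are assumptions for illustration, not Compuware's assessment tooling):

```python
# Example baseline/goal levels taken from the assessment results above.
assessment = {
    "Performance Measurement": {"baseline": 2, "goal": 4},
    "Problem Resolution":      {"baseline": 1, "goal": 4},
    "Performance Improvement": {"baseline": 2, "goal": 4},
    "Production Readiness":    {"baseline": 1, "goal": 4},
    "Performance Reporting":   {"baseline": 1, "goal": 3},
}

for aspect, levels in assessment.items():
    gap = levels["goal"] - levels["baseline"]
    print(f"{aspect}: Level {levels['baseline']} -> Level {levels['goal']} (gap: {gap})")
```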
Typical Challenges Identified
• Gaps in APM capability adoption
• Aspiration/strategy differences
• Process deficiencies
• Organizational blockers
• Skills deficiencies
• Gaps in available APM toolset