Sharon Niemi, Practice Director at Software Quality Associates (SQA), explains how the right combination of predictive and reactive metrics can help you build a measurement portfolio that improves product quality and release consistency. You’ll learn how to build a measurement system that combines leading and lagging indicators to help your team deliver quality products on time and within budget.
Predictive metrics that drive successful product releases
1. Software Quality Associates
Use the Windshield, Not the Mirror: Predictive Metrics that Drive Successful Product Releases
Presented by: Sharon Niemi, Practice Director, Lifecycle Optimization
2. Agenda
My “measurement aha! moment”
Measures Today: The Missing Links
Why Measure: The ROI
What to Measure: The Portfolio of Measures
How to Measure: The Measurement System
Pulling It Together: A Case Study
Seapine Tools Demonstration
3. Measures Today
Our benchmark studies and other sources of data have revealed:
< 50% of projects are delivered successfully
40% of a project team’s effort is wasted on unproductive rework
70% of defects uncovered in production are requirements-related
90% of our clients most often measure and use: Schedule, Budget, and Defects (found in Systems Test)
So what’s missing? The Portfolio of Measures and the Measurement System.
4. The Basics – Why Measure
Use Measures… to Effectively & Proactively Manage It!
Key terms: Measure, Metric, Measurement Technique, Baseline, Benchmark, Action
Measures are a Means ~ Not an End!
“You can’t manage what you don’t measure”
5. The Portfolio of Measures
Outcome Measures – The Rear View Mirror (Reactive)
How Well Did You Execute? (Performance)
Time, Cost, Quality, and Customer Satisfaction
6. The Portfolio of Measures
Predictive Measures – The Missing Set (The Windshield)
Resources: Capability & Capacity
Tools: Efficiency and Effectiveness
Internal Process Activities*: Process Compliance, Requirements Stability, Change Request Backlog, Velocity, Trends
* Methodology agnostic
7. Creating the Line of Sight
Predictive (Proactive) measures are the cause side: Resources, Tools, and Internal Process Activities (Process Compliance, Requirements Stability, Change Request Backlog, Velocity, Trends)
Outcomes (Reactive) follow from them
Learning & Feedback ties Outcomes back to the predictive measures: How much do you want to improve?
For the Portfolio of Measures to provide value, it needs to be holistic.
8. The Measurement System
How to Turn the Portfolio of Measures into Action!
A Measurement-Based Technique is applied to Processes, Tools, and Capabilities to supply Fact-Based Data for Decisions, which in turn improve those same Processes, Tools, and Capabilities.
Techniques: Balanced Scorecard; Goal – Question – Metric; Six Sigma’s DMAIC; ISO / CMMi / ITIL / GxP
The Portfolio of Measures must be Owned, Defined, Visible, Used, and Improved.
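To make the Goal – Question – Metric technique concrete, here is a minimal sketch that models a GQM breakdown as a plain data structure. The goal, questions, and metric names are illustrative assumptions, not taken from the deck.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list[str] = field(default_factory=list)  # metrics that answer this question

@dataclass
class Goal:
    statement: str
    questions: list[Question] = field(default_factory=list)

# Hypothetical GQM breakdown for one of the case-study goals.
goal = Goal(
    statement="Achieve consistency in delivery",
    questions=[
        Question("How stable are the requirements?",
                 ["requirements added/changed per week", "% change vs. baseline"]),
        Question("Is the defined SDLC actually followed?",
                 ["process compliance audit score"]),
    ],
)

for q in goal.questions:
    print(f"Q: {q.text} -> metrics: {', '.join(q.metrics)}")
```

The point of the structure is traceability: every metric in the portfolio hangs off a question, and every question hangs off a stated goal.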
9. The Four Main Points
Remember…
Develop a Portfolio of Measures ~ balanced and integrated.
Tie measures to Key Business Drivers and Goals ~ make them meaningful and relevant.
Implement a Measurement System that Drives Action ~ openly communicate progress, gaps, and action plans.
Continue to update the Portfolio of Measures as goals are attained and new goals are identified!
11. Step One – Identify Goals and Perceived Issues
Three organizational goals were established:
Avoid Client Impact
Consistency in Delivery
Strive to be Best in Class
Perceived issues:
Testing was perceived to be the roadblock to on-time delivery.
The Software Development Process was defined and iterative in nature.
A Test Management tool had recently been implemented, and it was questioned whether it was fully utilized.
12. Step Two – Develop the Questions
Where do we begin to look?
Predictor Questions
Tools: Are the tools being utilized appropriately? How integrated are they?
Resources: How well trained are the testers? What are their workloads?
Process: We hear that the SDLC is defined, but is it followed and effective?
Process: How stable are the requirements? When are they baselined or “frozen”?
Trends: Is there any trending data that we can use?
Outcome Questions
Time: How much time is allocated for testing, and how much effort does it actually take?
Quality: What is the state of the builds being deployed to Test, and what defects are being uncovered throughout the development process?
Cost: What is the cost of migrating defects?
Customer Satisfaction: Are customers finding defects, and if so, where?
13. Step Three – Build the Portfolio of Measures
What measures are available?

Outcome Measures (Category | Description | Data Source | Data Elements):
Satisfaction | Defects/Errors Reported by Customer | Help Desk | Excel Spreadsheets
Time | Planned Testing Effort vs. Actual Testing Effort | Test Analysts | LOE Actual vs. Planned by Project/Release
Time | Tests Planned vs. Actual | Test Analysts | # of Test Cases Planned vs. # of Test Cases Run
Quality | Defects Identified (Full Lifecycle) | QA Analysts | # of Defects by Type and Priority
Cost | Cost of Rework | QA Director | LOE / Fully Loaded Cost / # of Defects Found

Predictive Measures (Category | Description | Data Source | Data Elements):
Process Adherence | SDLC | PMO | Excel Spreadsheets
Process Adherence | Change Request Backlog | PMO | Excel Spreadsheets
Process Adherence | Requirements and Testing Practices | BA and Test Managers | Excel Spreadsheets
Skill Levels | Skills Matrix by Job Description | HR Performance Management | Excel Spreadsheets
Training | Training Planned and Completed | Test Manager | Training Plans and Performance Reviews
Tools | Utilization and Guidelines | Test Manager | Review of Data and Adherence to Guidelines
Trending Metrics | Test Cost per Defect | QA Director | LOE / Fully Loaded Cost / # of Defects Found
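A portfolio like this is easier to keep honest when its rows live in one structured place rather than in scattered spreadsheets. A minimal sketch, using a subset of the rows above; the grouping logic is illustrative, not part of the deck:

```python
from collections import defaultdict

# (kind, category, description, data_source) rows drawn from the tables above.
portfolio = [
    ("Outcome",    "Satisfaction",      "Defects/Errors Reported by Customer", "Help Desk"),
    ("Outcome",    "Time",              "Planned vs. Actual Testing Effort",   "Test Analysts"),
    ("Outcome",    "Quality",           "Defects Identified (Full Lifecycle)", "QA Analysts"),
    ("Outcome",    "Cost",              "Cost of Rework",                      "QA Director"),
    ("Predictive", "Process Adherence", "Requirements and Testing Practices",  "BA and Test Managers"),
    ("Predictive", "Trending Metrics",  "Test Cost per Defect",                "QA Director"),
]

# Group measures by kind so imbalances (e.g., many outcome measures but
# few predictive ones) are visible at a glance.
by_kind = defaultdict(list)
for kind, category, description, source in portfolio:
    by_kind[kind].append(f"{category}: {description} (source: {source})")

for kind, measures in by_kind.items():
    print(kind)
    for m in measures:
        print("  " + m)
```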
14. Step Four – Gather and Analyze the Measures
Predictive Measures
Tools:
Available
Suitable for Intended Use
Integrated
Widespread Adoption (Test Organization)
Applicable
Resources Trained Appropriately
Used Consistently
Data Integrity
Capability: Resources / Skills
15. Step Four – Gather and Analyze the Measures
Predictive Measures, continued
[Chart: Requirements, 1.22.13 Release – # of Requirements, Planned vs. Actual, across Design, Code, Test Week 1, and Test Week 2; counts range from 65 to 93.]
Requirements Stability
Requirements were never frozen
Requirements continued to be unstable and changing: a 36% increase to plan during week 2 of Systems Testing (see the sketch below)
Requirements reviews do not include a representative from Test
Added effort was required to address rework due to the impact of the changes
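The 36% figure is straightforward arithmetic on planned vs. actual requirement counts. A minimal sketch; the per-phase pairings below are my reading of the chart, so treat them as assumptions:

```python
# Planned vs. actual requirement counts per checkpoint (values assumed
# from the chart in the deck; the planned/actual pairing is illustrative).
checkpoints = {
    "Design":        (65, 70),
    "Code":          (67, 81),
    "Test - Week 1": (68, 82),
    "Test - Week 2": (68, 93),
}

for phase, (planned, actual) in checkpoints.items():
    growth = (actual - planned) / planned * 100  # % increase over plan
    flag = "  <-- unstable" if growth > 25 else ""
    print(f"{phase:14s} planned={planned:3d} actual={actual:3d} "
          f"increase={growth:5.1f}%{flag}")

# Test - Week 2: (93 - 68) / 68 = 36.8% -> the "36% increase" on the slide.
```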
Process:
Defined and Documented
Appropriate to the Culture
Used Consistently
Associated Metrics / Quantifiable
Roadblocks
Duplicate Work
Gaps
Outcomes Measured (time / cost / quality)
16. Step Four – Gather and Analyze the Measures
Outcome Measures
Effort
With requirements unstable and changing, sixteen new builds were passed to the Test Environment in four weeks
Test Analysts averaged 8 or more hours of overtime in the weeks ending 11/22, 11/29, and 12/13
3 Test Analysts were out sick the week ending 12/6
Due to the number of defects being found, additional test cases were selected for execution
Test Analysts do not possess the right level of skills to complete the job
17. Step Four – Gather and Analyze the Measures
Outcome Measures
[Chart: Escalated Calls by Category – # of Escalated Calls per Call Category (Options, Trading, Positions, Accounts, Quote, PDF, Balances, Quicken) for the weeks ending 11/22, 11/29, 12/6, and 12/13/2012.]
Customer Satisfaction
Over a four-week period, 53% of the calls were due to Options, Quote, and Trading issues
Only 10% of the test cases run are against Options, Quote, and Trading
Options, Quote, and Trading are not fully covered in the Regression Suites

Share of escalated calls by category:
Options 24.7% | Trading 13.2% | Positions 7.6% | Accounts 12.2% | Quote 15.1% | PDF 7.9% | Balances 7.6% | Quicken 11.8%
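The 53% falls directly out of the category shares above. A minimal sketch that recomputes it and contrasts call share with test coverage (the single 10% coverage figure is taken from the slide; everything else is arithmetic):

```python
# Share of escalated calls by category, from the slide's table.
call_share = {
    "Options": 24.7, "Trading": 13.2, "Positions": 7.6, "Accounts": 12.2,
    "Quote": 15.1, "PDF": 7.9, "Balances": 7.6, "Quicken": 11.8,
}

hot_categories = ["Options", "Quote", "Trading"]
hot_share = sum(call_share[c] for c in hot_categories)
print(f"Options + Quote + Trading = {hot_share:.1f}% of escalated calls")
# -> 53.0%, the figure on the slide.

# The deck says only 10% of executed test cases target these categories.
test_coverage = 10.0  # % of test cases run against the three categories
print(f"Call share {hot_share:.1f}% vs. test coverage {test_coverage:.1f}% "
      f"-> under-tested hot spots")
```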
18. Step Four – Gather and Analyze the Measures
Outcome Measures
Cost
Total Production Defects = 251

Phase | Unit Cost Multiplier (Industry Trend) | Calculated Cost per Defect | # of Defects Reported | Min $ | Max $
Requirements | 1 | $39 | | $3,120 |
Design | 3 to 5 | $117 / $195 | | $9,360 | $15,600
Code | 10 | $390 | | $31,200 |
Systems Test | 15 to 40 | $585 / $1,560 | | $46,800 | $124,800
User Acceptance | 30 to 70 | $1,170 / $2,730 | | $293,670 | $685,230
Production | 40 to 1000 | $1,560 / $39,000 | 80 | $124,800 | $3,120,000

Phase: Software Development Lifecycle phase.
Unit Cost Multiplier per Defect: based on industry trend.
Calculated Cost per Defect: based on $585 per defect if found in the Systems Test phase (cost of Testers / average # of defects found = average cost per defect in that phase).
# of Defects Reported: based on 80 requirements-related defects reported in the Production environment over a three-month period.
Min / Max $: cost to fix if found in that phase.
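The table is driven by a single anchor: $585 for a defect found in Systems Test at multiplier 15, which implies a $39 base unit cost. A minimal sketch that regenerates per-phase cost ranges for the 80 requirements-related defects; note that the slide's User Acceptance row appears to apply all 251 production defects instead, so the mapping there is an assumption:

```python
BASE_COST = 585 / 15   # $39: implied unit cost from $585 at multiplier 15
DEFECTS = 80           # requirements-related defects found in Production

# Industry-trend cost multipliers by lifecycle phase (from the slide).
multipliers = {
    "Requirements":    (1, 1),
    "Design":          (3, 5),
    "Code":            (10, 10),
    "Systems Test":    (15, 40),
    "User Acceptance": (30, 70),
    "Production":      (40, 1000),
}

for phase, (lo, hi) in multipliers.items():
    min_cost = DEFECTS * lo * BASE_COST
    max_cost = DEFECTS * hi * BASE_COST
    print(f"{phase:15s} ${min_cost:>12,.0f} to ${max_cost:>12,.0f}")

# Requirements: 80 * 1 * $39 = $3,120; Production: up to 80 * 1000 * $39
# = $3,120,000 -- matching the slide's min/max columns. (The slide's User
# Acceptance row appears to use all 251 production defects, hence its
# larger figures.)
```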
19. Step Five – Take Action!
Alignment of Measures is Key to Success
The action plan maps the findings onto a progression from Avoid Client Impact through Drive Consistency to Strive to be Best in Class (World Class); each stage was shown with a Time / $ / Quality probability-vs-target chart.

Protect Production – Reactive (Rear View Mirror):
Satisfaction, Cost, & Quality: Defect Propagation & Customer-Found Defects
Time: Effort Planned vs. Actual, Test Cases Planned vs. Actual due to the changing landscape
Improve Regression Testing
Need Full Lifecycle Defect Mgt

Control – Proactive (Windshield):
Process: Abandoned When Schedule Pressures Arose
Tools: Expansion Required
Resources: Weren't Skilled and Completely Capable
Need Application Training
Need Process Adherence
Need Stable Requirements

Assure:
Trends: Data Available (CpD)
Measurement: Need an integrated Portfolio of Measures, Baselines, and Benchmarks
Need Expansion to BA / Development Activities

Manage:
Need Measurement System fully established
Need Governance and Oversight: Improvement through Measurement as a way of life