CMMI High Maturity Best Practices HMBP 2010: Demystifying High Maturity Implementation Using Statistical Tools & Techniques by Sreenivasa M. Gangadhara, Ajay Simha and Archana V. Kumar
Demystifying High Maturity Implementation Using Statistical Tools & Techniques
- Sreenivasa M. Gangadhara
Ajay Simha
Archana V. Kumar
(Honeywell Technology Solutions Lab)
presented at
1st International Colloquium on CMMI High Maturity Best Practices held on May 21, 2010, organized by QAI
1. Demystifying High Maturity Implementation Using Statistical Tools & Techniques
-Sreenivasa M. Gangadhara
Ajay Simha
Archana V. Kumar
(Honeywell Technology Solutions Lab)
3. Introduction
• Interpretation & implementation of High Maturity practices in projects is a
challenge
• This paper attempts to “Demystify” the High Maturity Implementation by
using simple Statistical & Simulation Tools & Techniques
• The analytical approach presented in this paper is one of the many best
practices used in the organization
• Project-specific dynamics need to be factored in when applying these practices to projects
4. Key Takeaways…
At the end of this presentation, we will see one of the ways of…
• Assessing the project’s confidence in meeting its multiple goals
• Identifying the Critical Sub-Process with Quantitative justification
• Setting Quantitative project improvement goal
• Defining Sub-Process level Model and arriving at Critical & Controllable factors
• Arriving at “Probabilistic” Model from a “Deterministic” Model
• Doing “What-if” analysis for a proposed process improvement
• Demonstrating whether the proposed solution will meet the project’s objective
(end process result), before deploying the solution
• Demonstrating the usage of models at different stages of the project lifecycle
• Demonstrating that the improved process is statistically significant
5. Multi Goal Simulation Model
(Getting the confidence at the beginning of the project)
6. Problem Statement
• We have a new product release, in a similar product line
• Estimated Size of project is 195 Requirements
• Estimated Effort of project is 140 Person Months
• Goal is to complete the project
- Within 5% effort variance even in the worst scenario
- With a Quality goal of NOT more than 0.1 defects / requirement after
release
What confidence does the team have in meeting this project goal?
7. Prediction Model
Note: The model is designed using the Crystal Ball simulation tool.
Input factor distributions are derived from the performance baseline.
8. Certainty Levels
Prediction:
• 94.45% certain the project will complete within 140 person months
• 98.71% certain the project will complete even with 5% more effort allowed
• 82.83% certain the project will complete with 5% less effort
• 78.51% certain the project can deliver the product within the quality goal of 0.1 defects / Req
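The certainty levels above come from a Crystal Ball simulation; the same idea can be sketched as a plain Monte Carlo loop. All distribution parameters below are illustrative assumptions, not the organization's actual performance baseline:

```python
import random

random.seed(7)

EFFORT_GOAL = 140.0   # person months (from the problem statement)
QUALITY_GOAL = 0.1    # post-release defects / requirement

# Hypothetical baseline distributions (mean, std dev) standing in for the
# organization's fitted performance-baseline distributions.
TOTAL_EFFORT = (136.5, 2.2)     # person months
LEAKAGE_RATE = (0.085, 0.02)    # defects / requirement

def confidence(trials=50_000):
    """Estimate the probability of meeting each goal by sampling."""
    effort_ok = quality_ok = 0
    for _ in range(trials):
        if random.gauss(*TOTAL_EFFORT) <= EFFORT_GOAL:
            effort_ok += 1
        if random.gauss(*LEAKAGE_RATE) <= QUALITY_GOAL:
            quality_ok += 1
    return effort_ok / trials, quality_ok / trials

p_effort, p_quality = confidence()
print(f"P(effort <= {EFFORT_GOAL} PM)  ~ {p_effort:.1%}")
print(f"P(leakage <= {QUALITY_GOAL}/req) ~ {p_quality:.1%}")
```

With distributions fitted from a real baseline, the two estimates play the role of the effort and quality certainty levels on this slide.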
9. Model Representation
[Diagram — Effort Component vs Defect Component: for the Requirements and Design phases, development effort pairs with a Defect Injection Rate; review effort pairs with Defect Removal Efficiency (DRE) / Defect Detection Rate; rework effort pairs with a Defect Fix Rate; defects not detected flow to the next phase as the Defect Leakage Rate]
Input Assumptions
Historical Performance Baseline Measures:
• Effort / Req for each of the Development, Review and Test execution phases
• Defect Injection Rate for each of the development phases
• Defect Removal Efficiency (DRE) for each of the Review & Test phases
• Defect Fix Rate for each of the phases
Calculations:
DRE = Detected / Total Present = Detected / (Injected + Leaked)
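The DRE definition above can be written as a small helper (a minimal illustration; the function name and the sample counts are hypothetical):

```python
def dre(detected: int, injected: int, leaked_in: int) -> float:
    """Defect Removal Efficiency for one phase:
    defects detected / total defects present, where total present
    = defects injected in this phase + defects leaked in from
    earlier phases."""
    total_present = injected + leaked_in
    return detected / total_present

# e.g. a review that finds 60 of the 80 injected + 20 leaked-in defects
print(f"DRE = {dre(60, 80, 20):.0%}")  # → DRE = 60%
```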
10. Control Factors…
• Control Injection Rate (Reduce Injection Rate)
- Adopt the best Development Process from the existing Process Composition, one that takes less effort and injects fewer defects
• Control Detection Rate (Increase Detection Rate)
- Adopt the best Review Process from the existing Process Composition, one that takes less effort and uncovers more defects
Next step is to find the control factors at sub-process level
16. Investigating Defect Removal Activities
Control Chart: Defect Detection Density
[I Chart of DD by Phase (Req, Design, Code, DIT, SIT, Post Release): UCL = 0.177, X̄ = 0.064, LCL = -0.049; out-of-control points flagged]
Is SIT a Critical Sub-Process…!!!???
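An individuals (I) chart like the one above uses the standard moving-range limits, UCL/LCL = X̄ ± 2.66·MR̄. A minimal sketch with made-up data (the deck's charts come from a Minitab-style tool):

```python
import statistics

def i_chart_limits(values):
    """Individuals-chart centre line and control limits:
    UCL/LCL = mean ± 2.66 * average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.fmean(moving_ranges)
    x_bar = statistics.fmean(values)
    return x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar

# hypothetical per-observation defect detection densities
dd = [0.05, 0.07, 0.06, 0.04, 0.08, 0.06, 0.05, 0.07]
lcl, centre, ucl = i_chart_limits(dd)
out_of_control = [v for v in dd if not lcl <= v <= ucl]
print(f"LCL={lcl:.3f} mean={centre:.3f} UCL={ucl:.3f} signals={out_of_control}")
```

Points falling outside the computed limits (like the flagged SIT observations above) are the candidates for a critical sub-process.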
17. Investigating Defect Removal Activities
Trend Chart: Defect Detection Density
[Trend chart: Defect Detection Density by phase — Req, Design, Code, DIT, SIT, Post Release]
18. Investigating Defect Removal Activities
Trend Chart: Defect Detection Density
[Trend chart: Defect Detection Density by phase, annotated with Min, Max and Mean values per phase]
22. Investigating Defect Removal Activities
Trend Chart: Defect Detection Density
[Trend chart as on slide 18: Defect Detection Density by phase with Min, Max and Average values]
23. Comparing Detection with Injection
Trend Chart: Comparing Defect Density of Detection with Injection
[Trend chart: Defect Detection Density vs Defect Injection Density by phase (Min, Max, Mean); the gap between injection and detection marks the Improvement Opportunity]
24. Sub-Process Identification
Comparing Detection with Injection Defect Density:

                          Requirement Phase   Design Phase   Coding Phase
Defect Detection   Min          0.154            0.053          0.783
Density            Max          1.250            0.667          1.154
                   Mean         0.559            0.280          0.925
Defect Injection   Min          1.111            0.577          1.000
Density            Max          2.833            1.464          1.923
                   Mean         1.806            0.966          1.533
Mean Difference                 1.247            0.686          0.608

The Requirement phase defect density mean difference is relatively larger than that of the other phases: the Requirement phase needs attention.
The Requirement Phase is the Critical Sub-Process.
25. Sub-Process Identification
Statistical Justification: Test of Hypothesis
H0: μ1 = μ2
H1: μ1 ≠ μ2
If P ≤ 0.05, reject H0; if P > 0.05, accept (fail to reject) H0

Variance in DD between Injection and Detection:

Module / Feature             Req     Design   Code
Exception Service            1.591   0.955    0.318
External Interface           1.619   0.905    0.429
DL Scheduler                 1.679   1.071    0.464
Alert registry module        1.292   0.708    0.667
Rendering                    0.947   0.737    0.474
GGF                          1.586   0.690    0.586
Launchpad                    1.059   0.647    0.471
CCD                          0.778   0.630    0.667
Semaphore Service            1.609   0.826    0.652
FSS                          1.211   0.947    0.684
File System Service          1.462   0.769    0.769
ECLF                         1.733   0.467    0.467
Socket Library               1.792   0.625    0.958
Installation                 1.500   0.833    0.778
GPC                          1.238   0.714    0.333
MTL                          1.250   0.375    1.000
Alert response module        1.250   0.625    0.938
Notification Service         1.136   1.045    0.864
Blackberry Thick Client      1.308   0.462    0.769
Power Backup service         0.931   0.793    0.828
Share Point Client           1.583   0.583    0.167
Process Service              1.087   0.304    0.652
Platform Resource Service    1.077   0.385    0.385
Power on/off                 0.950   0.500    0.900
Thread Service               0.889   0.556    0.370
License Management           0.815   0.593    0.667
Periodic IPC Service         0.800   0.867    0.800
PDD                          1.308   0.769    0.154
CALF                         0.913   0.696    0.609
Alert System                 1.038   0.500    0.423

Req phase DD is different from Design & Code: it is statistically proven that the Req phase needs attention.
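The test of hypothesis above can be sketched with a two-sample Welch t statistic. This minimal version uses the first eight modules from the table and a large-sample normal approximation for the p-value to stay within the standard library (the deck uses a dedicated stats package):

```python
import math
import statistics
from statistics import NormalDist

def welch_t_test(a, b):
    """Two-sample Welch t statistic with a large-sample (normal)
    approximation for the two-sided p-value."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    t = (statistics.fmean(a) - statistics.fmean(b)) / se
    p = 2 * (1 - NormalDist().cdf(abs(t)))
    return t, p

# first eight modules from the table above: Req vs Design columns
req    = [1.591, 1.619, 1.679, 1.292, 0.947, 1.586, 1.059, 0.778]
design = [0.955, 0.905, 1.071, 0.708, 0.737, 0.690, 0.647, 0.630]
t, p = welch_t_test(req, design)
print(f"t = {t:.2f}, p = {p:.4f}")
if p <= 0.05:
    print("Reject H0: Req phase DD is significantly different")
```

With small samples a proper t distribution (rather than the normal approximation) should be used for the p-value, as a stats package would do.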
27. Improvement Alternatives
1. By reducing the Defect Injection Rate by strengthening the
development process
2. By increasing the Defect Detection Rate by strengthening the
defect removal process
The second alternative is considered for this discussion.
28. Req Defect Density Mean Shift
[Histogram with normal fits of Req Detection DD vs Req Injection DD:
Req Detection DD — Mean 0.5586, StDev 0.2732, N = 30
Req Injection DD — Mean 1.806, StDev 0.4581, N = 30]
The Req Defect Detection mean needs a shift from 0.5586 to 1.806.
29. Project Goal
Assume project sets a goal of 40% improvement in
Requirement Defect Detection Density mean
[Histogram with normal fits of Req Detection DD, Req Injection DD and 40% Improved Detection Req DD:
Req Detection DD — Mean 0.5586, StDev 0.2732, N = 30
Req Injection DD — Mean 1.806, StDev 0.4581, N = 30
40% Imp Detection Req DD — Mean 0.7820, StDev 0.3825, N = 30]
Note: The project team has to document the rationale for selecting 40% improvement.
A 40% improvement is a mean shift from 0.56 to 0.78 Defects / Req.
31. Sub-Process Analysis
SW Development Process: Requirement Phase → Design Phase → Code Phase, each with Develop, Review and Rework steps, feeding the next process steps.

Requirement Phase Elaboration: Req Planning → Req Capture → Req Analyze → Document → Review → Rework → Baseline, with Change Management (Planning, Development, Review and Change Management processes).

Probable Process, Product & People Attributes:
x1 - Author's Domain Expertise
x2 - Req Complexity
x3 - Development Effort / Req
x4 - Risk of Completeness of Req
x5 - Risk of Ambiguity of Req
x6 - Risk of Non Testable Req
x7 - Risk of Late arrival of Req
x8 - Reviewer's Domain Expertise
x9 - Review Effort / Req
x10 - Req Volatility
Which are the Critical Sub-Process Parameters?
Consider factors related to Process, Product & People
32. Sub-Process Analysis
[Same SW Development Process and Requirement Phase Elaboration diagram as slide 31; the attributes x1–x10 are now the available Process, Product & People measures]
Sub-Process Output Measure
Y1 = f (x1, x3, x8, x9, x10)
Req Defect Density = f (Author's Domain Expertise, Dev Effort / Req, Reviewer's Domain Expertise, Review Effort / Req, Req Volatility)
33. Metrics Definition of selected input factors
x     Parameter Name                Metrics Type   Data Type    Unit        Definition / Guidelines
x1    Author's Domain Expertise     Objective      Continuous   Years       Years of experience of the author in the same or similar domain
x3    Development Effort / Req      Objective      Continuous   Hrs / Req   Time spent by the author on developing the requirements of the feature or module
x8    Reviewer's Domain Expertise   Objective      Continuous   Years       Average years of experience of the reviewers in the same or similar domain
x9    Review Effort / Req           Objective      Continuous   Hrs / Req   Time spent by the entire team in reviewing the requirement document
x10   Req Volatility                Objective      Continuous   Ratio       (# of Req [# of times] changed) / (Total # of Req in the feature or module)
35. Sub-Process Analysis
Req Defect Density: Output Measure – Req Defect Density (Y1)
[I Chart of Req DD: UCL = 1.235, X̄ = 0.559, LCL = -0.118; one point flagged above the UCL]
36. Sub-Process Analysis
Output Measure (Y1) Comparison with Input Measures (x’s)
An effect is seen in the output measure when the input measures change.
37. Sub-Process Analysis
Analyze the Correlation
[Scatterplot matrix of Req DD vs Author's Domain Experience, Dev Effort / Req, Reviewer's Domain Experience, Review Effort / Req and Req Volatility]
Inference:
• Reviewer's Domain Experience, Review Effort / Req and Req Volatility have a positive correlation
• Dev Effort / Req has a negative correlation
• Author's Domain Experience has no correlation
38. Model Building
Regression Analysis
P ≤ 0.05
R-Sq (adj) > 70%
(rule of thumb)
39. Model Building
Regression Analysis – Reduced Model
Note:
Though Dev Effort / Req & Req Volatility
are not statistically significant, they are
considered in the reduced model
Req Defect Density = 0.153 - 0.0618 Dev Effort / Req + 0.0608 Reviewers Domain
Experience + 0.48 Review Effort / Req + 0.23 Req Volatility
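The reduced regression model above can be written directly as a small prediction function (coefficients taken from the slide; the function and variable names are illustrative):

```python
def predict_req_dd(dev_effort_per_req: float,
                   reviewers_domain_exp: float,
                   review_effort_per_req: float,
                   req_volatility: float) -> float:
    """Reduced regression model for Requirement Defect Density."""
    return (0.153
            - 0.0618 * dev_effort_per_req
            + 0.0608 * reviewers_domain_exp
            + 0.48 * review_effort_per_req
            + 0.23 * req_volatility)

# e.g. 1 hr/req development effort, 2 yrs reviewer experience,
# 1 hr/req review effort, 20% requirement volatility
print(round(predict_req_dd(1.0, 2.0, 1.0, 0.2), 4))  # → 0.7388
```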
40. Statistical vs. Practical
Project objective is to “Uncover” more defects in the Requirement phase
Req Defect Density = 0.153
- 0.0618 Dev Effort / Req
+ 0.0608 Reviewers Domain Experience
+ 0.48 Review Effort / Req
+ 0.23 Req Volatility
To detect a higher defect density in the Requirement phase, Dev Effort / Req should be low, Reviewer's Domain Experience should be high, Review Effort should be high, and Req Volatility should be high (some or all of these).

It does not make practical sense that, to raise Req DD, Req Volatility should be high or less time should be spent on development activities; doing so would mean intentionally introducing more defects rather than taking proactive / systemic measures to uncover more defects in the Req phase.

Reviewer's Domain Experience & Review Effort / Requirement are the factors that can help uncover more defects.

This means that, although "Dev Effort / Req, Reviewers Domain Experience, Review Effort / Req & Req Volatility" are all Critical Parameters, only "Reviewers Domain Experience, Review Effort / Req" are Control Parameters.
41. How to use the model…?
At the beginning of the project:
Use the planned or anticipated values of the x's to predict the defect density; if the predicted defect density is not within the acceptable range, take appropriate action by changing the values of the control factors.
During execution of the project:
Use the actual values of the x’s to predict the defect density and validate
the model by actual values of the defect density
Calibrate the model with new data set and enhance the model
42. Probabilistic Model from Deterministic Model
(Study the process behavior by knowing the input distribution)
(“What-If” Analysis)
43. Probabilistic Model by Simulation
Use the Crystal Ball tool to arrive at a simulation model: define the simulation model for the regression equation by fitting distributions to the input parameters and defining a forecast for the predicted output.
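Without Crystal Ball, the same deterministic-to-probabilistic step can be sketched as a Monte Carlo loop over the regression equation (the input distributions below are hypothetical stand-ins for the fitted baseline distributions):

```python
import random
import statistics

random.seed(11)

def predict_req_dd(dev_effort, rev_exp, rev_effort, volatility):
    # reduced regression model from the earlier slide
    return (0.153 - 0.0618 * dev_effort + 0.0608 * rev_exp
            + 0.48 * rev_effort + 0.23 * volatility)

def simulate(trials=20_000):
    """Turn the deterministic regression into a probabilistic model by
    sampling each input from an assumed distribution."""
    outcomes = []
    for _ in range(trials):
        dd = predict_req_dd(
            dev_effort=random.gauss(1.0, 0.2),    # hrs / req
            rev_exp=random.gauss(1.7, 0.5),       # years
            rev_effort=random.gauss(1.0, 0.25),   # hrs / req
            volatility=random.gauss(0.2, 0.05),   # ratio
        )
        outcomes.append(dd)
    return statistics.fmean(outcomes), statistics.stdev(outcomes)

mean_dd, sd_dd = simulate()
print(f"Simulated Req DD ~ mean {mean_dd:.3f}, sd {sd_dd:.3f}")
```

Changing an input distribution and re-running the loop is exactly the "What-If" analysis described on the following slides.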
45. Process Improvement Steps
1. Do Root Cause Analysis (RCA) and identify the causes for defect leakage in Req
phase
2. Prioritize the causes (using Pareto)
3. Identify improvement alternatives in Req phase
4. Study the process behavior by simulating the process for the proposed
improvements (What-If analysis)
5. Study whether the process improvement has an impact on the process output measure (Goal)
6. Pilot the process in a few projects
7. Analyze results
8. Institutionalize and deploy the process improvement in other projects
46. “What If” Analysis…???!!!
Assume the newly proposed process improvement suggests a balanced composition of reviewers with experienced people (a minimum of 1.5 years and an average of 2.4, versus the earlier minimum of 0.5 years and average of 1.72), plus an improvement in the review process that results in additional review effort with a mean of 10 hrs and a standard deviation of 1.5 per inspection. The new input parameter distributions then look like:
[Distribution plots: Reviewers Domain Experience and Review Effort / Req — old vs new]
47. “What If” Analysis…???!!!
Does the new proposed process meet the project objective of 40% improvement in Requirement Defect Detection Density mean?

[Forecast charts: Req DD distribution, old vs new process]

Req DD of old process = 0.556
Req DD of new proposed process = 0.847
% improvement over the earlier process = (0.847 – 0.556) / 0.556 = 52.34%

The new proposed process will improve the Req DD mean by 52.34%.
48. Probable Improvements in End Result
(Probable change in Post Release Defects and Effort Estimation)
49. What are the possible changes in "End Measures"?

                                   Req Review Process    Req Defect Removal Efficiency (DRE)
                                   Mean      Std Dev     Mean      Std Dev
Current Performance Measure        0.697     0.359       0.302     0.099
New Proposed Performance Measure   1.210     0.464       0.474     0.108

Note: Change the input distributions for Req Review Effort / Req & Req phase DRE
50. Possible changes in “Effort”
[Effort forecast charts: Current Process vs New Proposed Process]
51. Possible changes in “Quality”
[Quality forecast charts: Current Process vs New Proposed Process]
Observation:
• Though there is an increase in Req review effort, there is NOT much change in Total Effort, because it is compensated by a reduction in the effort needed to fix defects in later phases
• However, there is an improvement in the post-release defect leakage measure
• The certainty of meeting the quality goal of 0.1 defects / Req has increased from 78.5% to 83.0%
The new proposed process can be piloted.
52. Pilot Improvements in new Project
(Validating the predicted improvements)
53. At the beginning of Project
Predict Req Detection DD from Planned or anticipated values of x’s
Regression Equation:
Req Defect Density = 0.153 - 0.0618 Dev Effort / Req + 0.0608 Reviewers Domain
Experience + 0.48 Review Effort / Req + 0.23 Req Volatility
[Bar chart: Predicted Req DD from planned x's for components 1–11, values ranging from 0.68 to 1.22]
54. During the Execution of Project
Monitor & Control the Input Parameters & Monitor Output Predictor
Output Measure (Y1) Input Measures (x’s)
[I Chart of Predicted Req DD from actual x's: UCL = 1.466, X̄ = 0.931, LCL = 0.396]
55. During the Execution of Project
Predict Req Detection DD from actual values of x’s
Regression Equation:
Req Defect Density = 0.153 - 0.0618 Dev Effort / Req + 0.0608 Reviewers Domain
Experience + 0.48 Review Effort / Req + 0.23 Req Volatility
[Bar chart: Predicted Req DD from planned x's vs Predicted Req DD from actual x's for components 1–11]
56. During the Execution of Project
Compare the actual Defect Density with Predict from planned values of x’s
and actual values of x’s
[Bar chart: Predicted Req DD from planned x's, Predicted Req DD from actual x's and Actual Req DD for components 1–11]
Note: Existing Regression equation may not be valid, because of change in process (Process Improvement)
Calibrate the Prediction Equation with New data set
57. Is Improvement Statistically Significant?
Staged Comparison:
[I Chart of Req DD (Actual) by Process Stage, Before vs After: UCL = 1.674, X̄ = 1.038, LCL = 0.402]
A mean shift is observed!
58. Is Improvement Statistically Significant?
Statistical Justification: Test of Hypothesis
H0: μ1 = μ2 — the means are the same; there is NO significant difference in DD between the data samples
H1: μ1 ≠ μ2 — the means are different; there is a significant difference in DD between the data samples
If P ≤ 0.05, reject H0; if P > 0.05, accept (fail to reject) H0

The means of the two data sets are different: the improvement is statistically significant.
Measure and compare the end results after the
completion of the project…
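A lightweight way to check the before/after mean shift without a stats package is a large-sample confidence interval on the difference of means; if the interval excludes zero, the shift is significant. The sample values below are hypothetical, centred near the slide's before and after means:

```python
import math
import statistics
from statistics import NormalDist

def mean_shift_ci(before, after, level=0.95):
    """Large-sample confidence interval for the after-minus-before
    mean shift; if the interval excludes 0, the shift is significant."""
    diff = statistics.fmean(after) - statistics.fmean(before)
    se = math.sqrt(statistics.variance(before) / len(before)
                   + statistics.variance(after) / len(after))
    z = NormalDist().inv_cdf(0.5 + level / 2)
    return diff - z * se, diff + z * se

# hypothetical per-observation Req DD, before and after the pilot
before = [0.52, 0.61, 0.48, 0.58, 0.55, 0.63, 0.50, 0.57]
after  = [1.01, 1.10, 0.95, 1.08, 1.00, 1.12, 0.98, 1.05]
lo, hi = mean_shift_ci(before, after)
print(f"95% CI for mean shift: ({lo:.3f}, {hi:.3f})")
print("Significant" if lo > 0 or hi < 0 else "Not significant")
```

With small samples a t-based interval would be slightly wider; the normal quantile here keeps the sketch within the standard library.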
59. Looking back…
We have seen one of the ways of…
• Assessing the project’s confidence in meeting its multiple goals
• Identifying the Critical Sub-Process with Quantitative justification
• Setting Quantitative project improvement goal
• Defining Sub-Process level Model and arriving at Critical & Controllable factors
• Arriving at a “Probabilistic” Model from a “Deterministic” Model
• Doing “What-if” analysis for a proposed process improvement
• Demonstrating whether the proposed solution will meet the project’s objective
(end process result), before deploying the solution
• Demonstrating the usage of models at different stages of the project lifecycle
• Demonstrating that the improved process is statistically significant
61. Acknowledgement
The authors wish to thank the management of Honeywell Technology Solutions Pvt. Ltd., Bangalore, for the opportunity to present this paper.
Thanks to Venkatachalam V. & Dakshina Murthy for their guidance & support.
62. Contact Details
Office Address:
Honeywell Technology Solutions Ltd.,
151/1, Doraisanipalya, Bannerghatta Road
Bangalore – 560 226, Karnataka State, India.
Phone: +91-80-2658 8360 / +91-80-4119 7222
Fax: +91-80-2658 4750

Sreenivasa M Gangadhara
Six Sigma Black Belt, Functional Specialist-Process
Sreenivasa.gangadhara@honeywell.com
Mobile: +91-98804 24780

Ajay Simha
Six Sigma Green Belt, Principal Engineer
Ajay.simha@honeywell.com
Mobile: +91-98864 99404

Amit Bhattacharjee
Six Sigma Black Belt, Principal Engineer
Amit.bhattacharjee@honeywell.com
Mobile: +91-99860 22908

Archana Kumar
Principal Engineer
Archana.kumar@honeywell.com
Mobile: +91-97407 77667