Agile werkt - Hennie Huijgens - NESMA najaarsbijeenkomst 2012
- 2. Agenda
• Background
• Set-up of the study
• Data collection
• Analysis of the core metrics
• Analysis of the performance indicators
• Characteristics of best practices
• Characteristics of worst practices
• Supplementary study on striking factors
• Index numbers
Agile werkt - © Goverdson 2012 2
- 3. Background
• Hennie Huijgens
• Working as a measurement & analysis expert for 20 years
• Clients: large information-intensive organizations (e.g. banking, insurance, pension funds, government, telco)
• Specialisms: Measurement & Analysis, Information Management, Risk Management
• NESMA board member from 1999 to 2007
• MSc in Information Management (University of Amsterdam) in 2010
- 4. Set-up of the study
[Diagram] Data → Analysis:
• Core metrics: Size, Duration, Effort, Cost, Defects & Incidents
• Performance indicators: Time-to-market, Productivity, Process Quality & Product Quality
• Best Practices & Worst Practices: success factors and fail factors
• Supplementary study on striking factors: Predictability, Project planning, Scalability, Delivery model
• Index numbers
- 5. Data collection
• 2 comparable information-intensive organizations
• 278 finalized IT projects; 23 running IT projects
• 249 waterfall projects; 29 agile projects (Scrum)
• For all projects the size was measured according to the ISO/IEC 24570 (NESMA) counting practice (function points)
• All assessed IT projects concerned solution delivery, with a focus on software development (new builds or enhancements); in some cases hardware or middleware implementations fell within the project scope
• The investigated population was diverse in subject (e.g. internet, mobile apps, call center solutions, marketing & sales, products, client-based systems, transactional services, business intelligence)
• Both organisations started off with a process improvement program (CMMI) and (at a later stage) moved from waterfall towards Scrum
- 6. Analysis of the Core Metrics
• Size (function points)
• Duration (months)
• Effort / Cost (hours / euros)
• Quality (defects / incidents)
- 7. Size
[Chart] Scalability based on numbers of projects (agile / waterfall / total, by size class)
• Size measured in function points, using the ISO/IEC 24570 (NESMA) counting practice
• Smallest project was 9 FP; largest was 4.600 FP
• 60% of the projects (representing 32% of the project cost) were smaller than 200 FP (small)
• 31% of the projects (42% of the project cost) were between 200 and 600 FP (medium)
• 9% of the projects were larger than 600 FP, representing 27% of the project cost (large)
• Medium and large projects deliver most end-user functionality: respectively 41% and 37% of the function points are delivered by these projects. Small projects delivered 21% of the function points
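The three size classes used throughout the study can be sketched as a small classifier (a sketch, not from the deck; the boundaries follow the slide: small < 200 FP, medium 200–600 FP, large > 600 FP):

```python
# Bucket a project into the study's size classes based on its
# function point (FP) count: small (< 200), medium (200-600), large (> 600).
def size_class(fp: int) -> str:
    if fp < 200:
        return "small"
    if fp <= 600:
        return "medium"
    return "large"

# Hypothetical sizes; the real study ranged from 9 FP to 4.600 FP.
sizes = [9, 150, 250, 600, 4600]
counts: dict[str, int] = {}
for fp in sizes:
    counts[size_class(fp)] = counts.get(size_class(fp), 0) + 1
print(counts)  # {'small': 2, 'medium': 2, 'large': 1}
```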
- 8. Duration
[Chart] Duration (months) versus Size (function points), log-log scale; agile / waterfall with average trend lines
• Duration measured in months, from start-up phase to aftercare
• Mean duration waterfall projects: 9,25 months
• Mean duration agile projects: 7,94 months
- 9. Project Cost
[Chart] Project cost (euros) versus Size (function points), log-log scale; agile / waterfall with average trend lines
• Project cost measured in euros, from start-up phase to aftercare
• Including supplier cost; excluding investment cost (e.g. software licences, hardware / middleware investment)
• Mean cost waterfall projects: € 781 K
• Mean cost agile projects: € 834 K
- 10. Quality (Defects)
[Chart] Process quality (defects) versus Size (function points), log-log scale; agile / waterfall with average trend lines
• Quality measured in number of defects (findings) during development (unit test to go live)
• Mean quality waterfall projects: 80 defects
• Mean quality agile projects: 128 defects
- 11. Analysis of the Performance Indicators
[Diagram] The core metrics (Size in function points, Duration in months, Effort / Cost in hours / euros, Quality in defects / incidents) are combined into performance indicators:
• Time-to-market (days / FP)
• Productivity (cost / FP)
• Process Quality (defects / FP)
• Product Quality (incidents / FP) — not in the study
- 12. Time-to-market
[Chart] Time-to-market (calendar days per FP) versus Size (FP), log-log scale; agile / waterfall with average trend lines
• Time-to-market is expressed in (calendar) days per function point (‘how fast is a function point delivered?’)
• Mean time-to-market waterfall projects: 2,91 days / FP
• Mean time-to-market agile projects: 1,93 days / FP
• Remark: an alternative measure for time-to-market is a weighted average of days per FP, where size is the weighting factor
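The remark about the weighted alternative can be sketched as follows (the project data is hypothetical, not taken from the study). Weighting days/FP by size reduces to total days over total FP, so the few large projects no longer get drowned out by the many small ones:

```python
# Plain mean of days/FP versus a size-weighted mean, with project size (FP)
# as the weighting factor. Data below is hypothetical.
projects = [(50, 150), (400, 520), (1200, 960)]  # (size in FP, duration in calendar days)

# Plain mean: every project counts equally, so small projects dominate.
plain = sum(days / fp for fp, days in projects) / len(projects)

# Size-weighted mean of days/FP: sum(fp * (days/fp)) / sum(fp) = total days / total FP.
weighted = sum(days for _, days in projects) / sum(fp for fp, _ in projects)

print(round(plain, 2), round(weighted, 2))  # 1.7 0.99
```

Because large projects tend to have a better days/FP ratio, the weighted figure comes out lower — consistent with the weighted index numbers later in the deck.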
- 13. Productivity
[Chart] Productivity (euros per FP) versus Size (FP), log-log scale; agile / waterfall with average trend lines
• Productivity is expressed in project cost per function point (the price of one function point)
• Mean productivity waterfall projects: 4.613 euro / FP
• Mean productivity agile projects: 3.360 euro / FP
• Not in scope of the study: net versus gross productivity
• Remark: an alternative measure for productivity is a weighted average of cost per FP, where size is the weighting factor
- 14. Process Quality
[Chart] Process quality (defects per FP) versus Size (FP), log-log scale; agile / waterfall with average trend lines
• Process quality (in-process product quality) is expressed in number of defects per function point
• Coherence with product quality (before versus after go live)
• Mean process quality waterfall projects: 0,38 defects / FP
• Mean process quality agile projects: 0,30 defects / FP
• Remark: an alternative measure for process quality is a weighted average of defects per FP, where size is the weighting factor
- 15. Best Practices & Worst Practices
Analysis of the performance scores based on star rating
• For every performance indicator that scores better than average (under the trend line in the figures) a project gains a star. For every performance indicator that scores above sigma+1 (above the highest dotted line in the figures) a project loses a star
• 3-star projects performed better than average for all 3 performance indicators → characteristics of best practices
• 0-star projects performed worse than average for all 3 performance indicators (or no data was measured) → characteristics of worst practices
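The star-rating rule can be sketched in Python, assuming "better than average" means a score below the trend-line value and "above sigma+1" means a score above the mean-plus-one-sigma bound (lower is better for all three indicators; the threshold values below are hypothetical):

```python
# Star rating over the three performance indicators
# (time-to-market, productivity, process quality); lower scores are better.
def stars(scores, averages, sigma1_bounds):
    n = 0
    for score, avg, hi in zip(scores, averages, sigma1_bounds):
        if score < avg:
            n += 1   # better than average: gain a star
        elif score > hi:
            n -= 1   # above sigma+1: lose a star
    return max(n, 0)

# A project beating the average on all three indicators is a 3-star project.
print(stars([1.5, 3000, 0.2], [2.5, 4000, 0.35], [4.0, 6000, 0.6]))  # 3
```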
- 16. Success factors for IT-projects
[Pie chart] The 7 reasons behind best practices*:
1. Single application (no application clustering) — 26%
2. Working in releases — 20%
3. Fixed and experienced project team — 20%
4. Scrum — 15%
5. Close cooperation with external party (same party mentioned several times) — 7%
6. Dedicated test resources (e.g. test environment, deployment tools) — 7%
7. Project type was business intelligence — 5%
*) Based on 30 measured projects that scored better than average for productivity, time-to-market and process quality.
- 17. Fail factors for IT-projects
[Pie chart] The 10 reasons behind worst practices*:
1. Complex environment (e.g. back-offices, infrastructure, many stakeholders) — 18%
2. New technology (causing technical problems) — 14%
3. Dependencies with other projects — 13%
4. Preconditioned or ‘technical’ projects — 11%
5. Pilot or PoC in project (incl. complex RFP/RFI) — 11%
6. Complex legacy environment (e.g. bad documentation) — 9%
7. Scope changes during project (exceptions) — 7%
8. Dependencies with other domains — 7%
9. Bad performing external supplier — 5%
10. Package with high amount of customization — 5%
*) Based on 15 measured projects that scored worse than average for productivity, time-to-market and process quality.
- 18. Supplementary study on 4 striking factors
Factors that patently influenced the performance:
• Predictability
• Project planning
• Scalability
• Delivery model
- 19. Predictability
[Charts] F/A plots (cost and schedule): Forecast / Actual ratio against project completion
Cost predictability
• On average almost a perfect match between planning and realisation of project cost
• However: good steering on cost expenditure is not the same as aiming for a good performance…
Schedule predictability
• On average the planned go-live date was 3 months too early
• The measure shows a bias towards underestimating the delivery date
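The Forecast/Actual (F/A) measure behind the plots can be sketched as a ratio: 1.0 means the forecast matched the actual outcome, and a ratio below 1.0 means the forecast (cost or schedule) was lower than what actually happened. The project figures below are hypothetical:

```python
# F/A ratio: forecast divided by actual; 1.0 is a perfect prediction.
def fa_ratio(forecast: float, actual: float) -> float:
    return forecast / actual

# Hypothetical project: cost well predicted, go-live date planned too early.
cost_fa = fa_ratio(800_000, 810_000)  # close to 1.0: good cost predictability
schedule_fa = fa_ratio(6, 9)          # planned 6 months, needed 9: under-forecast
```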
- 20. Project planning
In the period preceding the big-bang move towards Scrum, 3 dilemmas occurred:
1. Experts do not plan for improvement
2. Managing uncertainties is not on the agenda
3. Managers steer on cost expenditure:
‘Flying an airplane with only one instrument… a fuel meter…’
The result: planning realised; performance declined
- 21. Scalability: the effect of size
[Chart] Scale based on number of projects (agile / waterfall / total, by size class)
Number of projects
More than half of the projects are small projects (size less than 200 FP). Slightly less than one third of the projects are medium sized (between 200 and 600 FP).
[Chart] Scale based on size in FP (agile / waterfall / total, by size class)
Added value expressed in function points
However, medium and large sized projects deliver the greater part of the value, measured in end-user functionality (the number of function points). Large agile projects (> 600 FP) even deliver more than 60% of the value.
Lesson: the majority of finalized projects is small in size, while conversely medium and large sized projects deliver the greater part of the value for the end-user.
- 22. Delivery model: agile wins
[Chart] Cost per FP by size class (agile / waterfall / total)
Productivity
The production of a function point in small waterfall projects costs on average more (4.728 euros) than in medium sized (3.344 euros) and large projects (2.423 euros). Agile projects show a different pattern: one function point in a small project costs on average 3.787 euros, while the production cost of a function point in medium and large sized projects is almost equal (respectively 1.515 euros and 1.475 euros).
Lesson: large projects are cheaper than small projects.
Lesson: agile projects show a better productivity than waterfall projects.
[Chart] Days per FP by size class (agile / waterfall / total)
Time-to-market
The delivery of a function point within both waterfall and agile projects takes more time in small projects than in medium sized projects. Large projects in turn deliver faster than medium sized projects. What strikes most is that agile projects in all cases deliver a function point faster than waterfall projects.
Lesson: larger projects deliver a function point faster than small projects.
Lesson: agile projects show a better time-to-market than waterfall projects.
- 23. Some index numbers
                                         Waterfall   Agile   Improvement
Time-to-Market (calendar days / FP)           2,91    1,93           34%
Productivity (cost in euros / FP)            4.613   3.360           27%
Process Quality (defects / FP)                0,38    0,30           21%
Standard error (avg. of 3 indicators)         0,78    0,63           19%
Time-to-Market (calendar days / FP)*          1,13    0,56           51%
Productivity (cost in euros / FP)*           3.374   1.814           46%
*) Weighted average with size as weighting factor
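The "Improvement" column follows directly from the two mean columns as (waterfall − agile) / waterfall, rounded to a whole percentage; recomputing it from the table's own figures reproduces the reported values:

```python
# Recompute the improvement percentages from the (waterfall, agile) means.
rows = {
    "Time-to-Market (days/FP)": (2.91, 1.93),
    "Productivity (EUR/FP)": (4613, 3360),
    "Process Quality (defects/FP)": (0.38, 0.30),
}
improvements = {
    name: round(100 * (wf - ag) / wf) for name, (wf, ag) in rows.items()
}
print(improvements)
# {'Time-to-Market (days/FP)': 34, 'Productivity (EUR/FP)': 27, 'Process Quality (defects/FP)': 21}
```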
- 24. Summary
Lessons from the study:
• The majority of finalized projects is small in size, while conversely medium and large sized projects deliver the greater part of the value for the end-user
• Large projects are cheaper, deliver a function point faster, and show fewer defects per function point than small projects
• Agile projects show a better productivity, time-to-market and process quality than waterfall projects, meaning:
• Agile teams work faster, cheaper and deliver better quality
• Standard error (r²): agile trend lines are a more reliable source for project estimates