Companies often go to great lengths to collect metrics. Yet even the most rigorously collected data tends to be ignored, despite its findings and its potential for improving practices. One metric that cannot be ignored today is customer satisfaction. Customers are more than willing to share their thoughts in ways that can affect your bottom line; social media gives consumers a stronger voice than ever, and damage to your brand is only one tweet away. The question is: Are you listening to your customers? Paul Fratellone helps you break down your current process metrics so you can build them back up with business and customer value at the forefront. With feedback on how well you are attaining your objectives, you can create a powerful action plan for change that will receive the attention it deserves. If you are serious about improving the value of your projects to the business, join this session and let the right data drive your improvement actions.
Measure Customer and Business Feedback to Drive Improvement
1. BT12
Concurrent Session
11/14/2013 3:45 PM
"Measure Customer and Business Feedback to Drive Improvement"
Presented by:
Paul Fratellone
uTest
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888.268.8770 / 904.278.0524 / sqeinfo@sqe.com / www.sqe.com
2. Paul Fratellone
uTest
Paul Fratellone’s career in quality assurance and testing began in the mid-1980s and has spanned multiple industries and domains. Through the years, Paul has recognized certain patterns and pain points that all organizations need to deal with. Building a business case to justify investments in quality and testing has enabled organizations not only to measure success but to continually improve. Paul’s perspective and passion are clearly rooted in ensuring that teams provide value to the business and, ultimately, to end users. Knowing what is important to the customer is how Paul has quantitatively articulated the cost-risk-benefit equation of quality to business owners and IT management.
3. 9/10/13
Measure Customer and Business Feedback to Drive Improvement
Presented by Paul Fratellone
November 2013
Session Name: B12 Metrics
4.
What’s the landscape?
How many are working in B2C ____ B2B ____ Internal ____?
How many are collecting project/product metrics? ____
  Financial ____ Schedule ____ Effort ____ Quality ____ Scope ____
Is there a process improvement initiative?
  Yes ____ No ____ Planned ____
For those with a process improvement initiative, what has been the progress?
  Good ____ Slow ____ None ____
How many have organization- and department-level goals and objectives? Yes ____ No ____
Are there financial incentives associated with:
  Goal attainment: Yes ____ No ____
  Improvements: Yes ____ No ____

Defining what metrics are
Metrics are information that tells leadership the probability of success or failure in attaining goals and objectives. The many data points, components, and processes involved in producing the results are why it is difficult to improve.
5.
Underlying principles
• Focused on specific goals
  – The purpose of measurement
• Object(s) to be measured
  – Applied to all life-cycle products, processes, and resources
• Perspective
  – The viewpoint from which the measure is taken
Victor R. Basili, "Software Modeling and Measurement: The Goal Question Metric Paradigm"
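Basili’s Goal-Question-Metric structure (goal, with its purpose, object, and perspective; questions that characterize the goal; metrics that answer the questions) can be sketched as a small data model. This is an illustrative sketch, not material from the talk; every name and number in it is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str          # the quantity measured
    value: float = 0.0

@dataclass
class Question:
    text: str          # characterizes how the goal will be assessed
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str       # why we measure, e.g. "improve"
    obj: str           # object measured: product, process, or resource
    perspective: str   # viewpoint from which the measure is taken
    questions: list = field(default_factory=list)

# Hypothetical example: a defect-containment goal
goal = Goal(
    purpose="improve",
    obj="system-test process",
    perspective="business owner",
    questions=[Question(
        text="What fraction of defects do we catch before production?",
        metrics=[Metric("QA/ST defect containment %", 71.4)],
    )],
)
print(goal.questions[0].metrics[0].value)  # 71.4
```

The point of the structure is that no metric exists on its own: each one traces back through a question to a stated goal and perspective.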
Why bother?
• Documents that goals/targets are being attained
• Articulates trends and gives early-warning notification
• Provides a quantifiable reference and baseline for improvements
• Increases the accuracy of cost allocation
• Enables accelerated root cause analysis
• Surfaces hidden costs and/or unmeasured costs that could negatively impact goals
• Gives visibility into maturity and readiness for the release/production deployment
• Enables effective management of the software/product delivery life cycle
6.
Roadblocks to actions/improvements
• Organizational / corporate
  – Nonexistent or non-cascading goals
  – No incentive to change
  – Uncertain market conditions
  – Management indecision
• Operational layers
  – Insufficient / inaccurate data
  – No analytics/intelligence data
  – Poor planning/estimating models
  – Unreliable historical data

All too true for some organizations.
7.
What’s in your middle?
Product | Marketing | Sales | Technology

Around and around…
1. Marketing identifies a need
2. Product specifies a solution
3. Engineering builds the product
4. QA tests the application
5. Help desk listens to complaints
6. Blame game and finger pointing
7. Customers comment in social media
8. Repeat and hope for better results…
8.
Customer’s perspective: outcomes
• "I used to be able to do… but now I cannot. What happened?"
• "Great product, but I wish it could…"
• "…I had to input all my data all over again. This sucks."
• "I could not install it."
• "Your application keeps on crashing. FIX IT!!!"

Customers are chatting: outcomes
• Production outages/severity
• Transaction processing
• Promotions and campaigns
• Credits and refunds
• Customer service calls
• User experience
10.
What’s needed to action change
• Social media negative reviews: what went wrong…
• Risk and impact to business goals
• What was being measured (and what needs to be)
• What warning signals need to be in place
• What processes are involved in delivering the results
• Baselines / benchmarks
• Articulate the changes/improvements
• State the target (measurable) improvement
• Measure results and make comparisons
• Success attainment and course corrections
Defining the relationships

Goal: Schedule
  Deviations (actual to plan): resource utilization; delivery cycle duration
  Result: portfolio disruption; missed window of opportunity and revenue opportunities; customer satisfaction; loss of credibility
  Assessment focus: estimation & planning (resource, duration); risk-based testing

Goal: Content
  Deviations (actual to plan): new features and enhancements; sustained maintenance
  Result: missed revenue opportunities; portfolio disruption; customer satisfaction; loss of credibility; time-to-market delays
  Assessment focus: estimation & planning; SDLC project delivery; test coverage & regression; test modeling & design; requirements management

Goal: Quality
  Deviations (actual to plan): predictability; reliability; performance; customer satisfaction
  Result: increased incidents; missed revenue opportunities; increase in customer service; production outages/hot fixes; customer satisfaction
  Assessment focus: defect root cause analysis; project metrics reporting; risk-based testing; regression suites

Goal: Cost
  Deviations (actual to plan): CapEx & OpEx; release costs
  Result: production quality; increased incidents in production; time-to-market delays
  Assessment focus: risk-based testing; tools & automation
11.
Models & estimating
Estimating: the process of forecasting or approximating the time and cost (effort) of completing activities, i.e., deliverables.
Contingencies (buffers): created to offset uncertainty and reduce the likelihood of…

Estimates get better over time.
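One common way to quantify a contingency buffer, not specific to this talk, is a three-point (PERT) estimate: expected effort is a weighted average of optimistic, likely, and pessimistic estimates, and the spread between the extremes sizes the buffer. A minimal sketch; the task names and numbers are hypothetical.

```python
# Three-point (PERT) estimate with a simple contingency buffer.
# Illustrative sketch; tasks and figures are hypothetical.

def pert(optimistic: float, likely: float, pessimistic: float):
    """Return (expected effort, standard deviation) in person-days."""
    expected = (optimistic + 4 * likely + pessimistic) / 6
    stdev = (pessimistic - optimistic) / 6
    return expected, stdev

tasks = [
    ("test design",    3.0,  5.0, 10.0),
    ("test execution", 8.0, 12.0, 20.0),
]

total_expected = sum(pert(o, m, p)[0] for _, o, m, p in tasks)
# A simple contingency policy: add one standard deviation per task.
buffer = sum(pert(o, m, p)[1] for _, o, m, p in tasks)

print(f"expected: {total_expected:.1f} days, contingency: {buffer:.1f} days")
```

As the slide notes, estimates get better over time: feeding actuals back into the optimistic/likely/pessimistic inputs narrows the spread and shrinks the buffer needed.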
12.
Real-world case study
Client profile: Fortune 500 financial services
Situation:
• Perceived low value
• Cost overruns
• Increasing quality issues
• Customer service issues
• Unable to scale

Problems and challenges:
• Low confidence in delivering "on" quality, content, schedule, and budget
• Racing to the finish line: unplanned effort (overtime) to deliver a release
• Ineffective quality gates and checkpoints throughout the SDLC to mitigate risk to the business
• Weak change management process
• Inefficient utilization of user (UAT) resources
• Insufficient implementation of process improvements
• Deficient root cause analysis
13.
Where will we be: high-level roadmap
Starting point → Wave 1 → Wave 2 → Wave 3

Wave 1 – Controlled (C): bring the process under control
• Test case repository improvements
• Enhanced test case and defect metrics
• Enhanced testbed approach
• Root cause enabled

Wave 2 (1Q-2013) – Efficient (E): establish an efficient process
• Enhanced tools
• Reliable data
• Actionable improvements
• Feedback & comparison

Wave 3 (2Q-2013) – Optimising (O): Kaizen, and evolve towards higher maturity
• Aligned delivery goals
• Portfolio delivery management
• Baselines & cost of quality
• Process focused/monitored
• Enhanced SDLC
• Enhanced estimation
• Warnings & triggers
What did we promise… What’s the ROI?

Description                                  | Year 1 Cost | Year 1 Benefits | Year 2 Cost | Year 2 Benefits
QC Administrator                             |      60,000 |               - |      60,000 |               -
QC training                                  |       5,000 |               - |       2,500 |               -
Staff training on enhanced QC                |      35,000 |               - |      15,000 |               -
Test coordination reporting & administration |           - |           9,000 |           - |           9,000
Test case efficiency                         |      40,000 |          50,000 |           - |         108,000
Test case reuse                              |           - |          30,000 |           - |          36,000
Test/impact analysis                         |           - |          70,000 |           - |          84,000
Test data/bed reuse & test data mgmt.        |       5,000 |           7,000 |           - |          12,000
Requirements & traceability                  |      30,000 |          94,000 |           - |         124,000
Subtotal                                     |     175,000 |         260,000 |      77,500 |         373,000
Net benefits                                 |             |          85,000 |             |         295,500
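The net-benefit arithmetic behind the ROI table can be reproduced with a short script. This is a sketch for checking the totals; the per-item figures are the ones from the slide, and the tuple layout is just one convenient encoding.

```python
# Reproduce the ROI arithmetic from the case-study table.
# Per line item: (year-1 cost, year-1 benefit, year-2 cost, year-2 benefit).
items = {
    "QC Administrator":                  (60_000,      0, 60_000,       0),
    "QC training":                       ( 5_000,      0,  2_500,       0),
    "Staff training on enhanced QC":     (35_000,      0, 15_000,       0),
    "Test coordination reporting/admin": (     0,  9_000,      0,   9_000),
    "Test case efficiency":              (40_000, 50_000,      0, 108_000),
    "Test case reuse":                   (     0, 30_000,      0,  36_000),
    "Test/impact analysis":              (     0, 70_000,      0,  84_000),
    "Test data/bed reuse & mgmt.":       ( 5_000,  7_000,      0,  12_000),
    "Requirements & traceability":       (30_000, 94_000,      0, 124_000),
}

c1 = sum(v[0] for v in items.values())   # year-1 cost subtotal
b1 = sum(v[1] for v in items.values())   # year-1 benefit subtotal
c2 = sum(v[2] for v in items.values())   # year-2 cost subtotal
b2 = sum(v[3] for v in items.values())   # year-2 benefit subtotal

print(c1, b1, b1 - c1)   # 175000 260000 85000
print(c2, b2, b2 - c2)   # 77500 373000 295500
```

The subtotals and net benefits (85,000 in year 1, 295,500 in year 2) match the slide, which is the point: an ROI promise should be a calculation anyone can re-run, not just a table.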
14.
Test process improvements – detailed

Test coordination reporting & administration
  Cost / time to implement: 2 months of effort
  Savings: 9,000/yr.
  Benefit detail: 10–15% reduction in test coordination effort; additional savings can be realized for test analysts who also dedicate time to report generation.
  Calculation: 60,000 * 25% (share spent in report generation) = 15,000 per annum.

Test case efficiency
  Cost / time to implement: 2 months to prepare; 40,000 to ready QC
  Savings: 50,000 in year 1; 108,000 in yr. 2 and beyond
  Benefit detail: potentially 20–40% of test cases are redundant / non-value-add (NVA), consuming effort in test preparation and execution.
  Calculation: 15% of overall STLC test effort saved in preparation and execution; 9,000/month.

Test case reuse
  Cost: related to TC efficiency; no additional cost
  Savings: 30,000 yr. 1; 36,000 yr. 2
  Benefit detail: 25–50% test case design effort reduction. As long as training starts in 4Q 2012, benefits can be realized immediately.
  Calculation: 3,000/month * 10 months (15% reduction in effort; 2,250 * 4 TCs).

Test/impact analysis
  Cost: related to TC efficiency; no additional cost
  Savings: 70,000 yr. 1; 84,000 yr. 2
  Benefit detail: faster impact/CR analysis speeds test preparation by a factor of 40%, leading to an overall test cycle effort savings of 4–10%.
  Calculation: 7,000/month * 10 months.

Test data/bed reuse & test data mgmt.
  Cost: 5,000; 1 month of effort to design a reusable test bed
  Savings: 7,000 yr. 1; 12,000 yr. 2
  Benefit detail: 10–25% test bed/data reuse effort savings. This is specific to G.I., as the Life data is more complicated.
  Calculation: 1,000/month.

Requirements & traceability
  Cost: 30,000 investment to load historical data
  Savings: 94,000 yr. 1; 124,000 yr. 2
  Benefit detail: at a minimum, a 5% reduction in production incidents next year will lead to a savings of 124K GBP.
  Calculation: 124,000 savings due to the reduction in defects found in production.
Analyzing the data points: defect analysis

Findings:
• Avg. % of defects found in QA/ST is 71.4%, and 67.8% for high-severity defects (Critical & Serious).
• The highest production and UAT defect percentages are found in BAU releases (34% UAT, 24% production) and in the non-mainstream releases, like Releases O1 and K1, contributing 32.5% and 24% of production defects, respectively.
• Recent Release P has done comparatively better, with 90% of defects found in QA/ST; this can be analyzed further for future improvements.
• The average across products is 70.4% found in QA/ST.
• Finance and All Enterprise products have a very low QA/ST defect %.
• 48% (45% coming from Misc. Depts.) and 59.5% (24% coming from production).

Focus areas (root causes):
• Delayed/minimal collaboration
• Test resource planning
• Test cycle estimation
• Requirements management
• Clarity/completeness of requirements
• Insufficient requirements analysis
• Change management, including poor impact analysis
• Inadequate TC design/strategy
• Weak/insufficient regression
• Inadequate test bed
• Requirements/TC traceability
• Levels of testing

Deliverables:
• Root causes to support/identify process improvement areas
• Patterns and common areas of risk/improvement that will assist other products across the portfolio
• Metrics and KPIs to compare the results of improvements
• Common utilization of tools, test standards, and techniques
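The containment percentages above (share of all defects caught in QA/system test rather than in UAT or production) come from a simple ratio. A minimal sketch; the phase names and defect counts are hypothetical, chosen so the result lands on the 71.4% average quoted in the analysis.

```python
# Defect containment: share of all defects caught in QA/system test
# before UAT or production. The counts below are hypothetical.
def containment(found_by_phase: dict) -> float:
    total = sum(found_by_phase.values())
    return 100.0 * found_by_phase.get("QA/ST", 0) / total

release = {"QA/ST": 357, "UAT": 98, "Production": 45}
print(f"{containment(release):.1f}% caught in QA/ST")  # 71.4% caught in QA/ST
```

Tracking this ratio per release and per product is what lets outliers (like the BAU and non-mainstream releases above) surface for root cause analysis.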
15.
Portfolio management

[Release-calendar Gantt chart spanning May 2012 through April 2013: named releases (P, O, O1) tracked week by week against SDLC milestones, including high-level estimates; BSC team print analysis (incl. final wordings); print analysis; last specifications signed off; policy wordings signed off (RRD); product analysis; build (from NFD code freeze lifted); MI utility dev/testing (MI team); testing changes in release (NRT); testing defects (NRT); last requirements build; last defect build; regression testing in NFT (automated); regression testing in system test (NFQ); testing signed off; analysis (core); business requirements signed off (core); specifications signed off (core); target demo of sprints; CR/defect testing agreed by NFUM (latest date).]

If we have to manage this manually, it will be costly, inefficient, and ineffective. The ability to proactively respond to warnings and risk areas will be challenged.
Strike zone reporting

Project health as of 3/15/2012. For each project, every critical success factor is tracked in a strike zone: a target, a threshold, a status, a mitigation (required if the status is other than green), a last-update date, and optional remarks.

• Pipeline & delivery awareness — client/project technical & requirements assessment; resource estimation; schedule; budget; SOW & agreements. Status: Green.
• Planning — project schedule/timeline; project milestones & deliverables; project organization; roles & responsibilities; develop resource plans; finalized requirements.
• Execute & launch — milestones.
• Close & maintain.
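A strike-zone status reduces to a small comparison against the target and the threshold: green while within target, yellow in the zone between target and threshold, red beyond it. A sketch; the metric (schedule slip in days) and its limits are hypothetical examples, not from the case study.

```python
# Strike-zone status: green while within target, yellow between
# target and threshold, red beyond threshold. Values hypothetical.
def strike_zone(actual: float, target: float, threshold: float) -> str:
    if actual <= target:
        return "green"
    if actual <= threshold:
        return "yellow"
    return "red"

# e.g. schedule slip in days, with target 5 and threshold 10:
print(strike_zone(3, 5, 10))   # green
print(strike_zone(7, 5, 10))   # yellow
print(strike_zone(12, 5, 10))  # red
```

Anything other than green then demands the mitigation column be filled in, which is what turns the report from a status page into an action list.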
16.
Are we going to make it?

Code Drop 1 — counts tracked:
• # of TCs planned to execute
• # of TCs actually executed
• % completed
• # of TCs with defects
• Days
Questions: What are the causes for deviations? Is this growing / larger than estimated?

Code Drop 2 — counts tracked:
• # of carry-over TCs — what were the causes for the carry-over?
• Defect testing (estimated) — is there enough contingency/buffer?
• Increase from CD1 — additional (unplanned) effort to account for
• Planned CD2 — is this in line with plan? Can we complete on time?
• Total TCs — are we going to make it? How close?
• Ability to execute in the timeframe
• # of TCs planned vs. actually executed — are we tracking to plan? Are we going to make it?
• % completed and # of TCs with defects — how much carry-over for the next drop?
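The per-code-drop tracking above is a simple burn calculation: percent complete and the carry-over that flows into the next drop. A minimal sketch; the function name and the counts are hypothetical.

```python
# Per-code-drop execution tracking: % complete and carry-over into
# the next drop. The counts are hypothetical.
def drop_status(planned: int, executed: int, with_defects: int):
    pct_complete = 100.0 * executed / planned
    carry_over = planned - executed       # TCs pushed to the next drop
    return pct_complete, carry_over

pct, carry = drop_status(planned=400, executed=340, with_defects=55)
print(f"{pct:.0f}% complete, {carry} TCs carried over")  # 85% complete, 60 TCs carried over
```

Comparing this number drop over drop is what answers the slide's question: if carry-over grows while the contingency buffer shrinks, you are not going to make it.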
Dashboard & reporting
• Transparent
• Accessible
• Actionable
• Root cause enabled
• Key performance indicators
• Defined success criteria
• Quality thresholds
• Data trending
17.
Stay the course

Changing your perspective
• Brand image
• Brand recognition
• Customer preference
• Visibility in the marketplace
• Market share
• Stock value
Report on deviations to plan in terms of impacts to the business.
18.
Think and test like an end user
Spend the time to learn about your product, users, and market:
• Which quality dimensions are important to each
• How to measure them
Protect your brand image and goodwill, because bad news travels very fast in the social-networking era:
• When customers delay their purchase, there is an immediate impact on operating revenue.
• They might start to lose confidence in your software; this translates into lost opportunities and negatively impacts future revenue streams.
Web/mobile analytics
http://www.flurry.com/
http://www.appannie.com/
http://www.applause.com/
http://www.sitejabber.com/
http://www.appsfire.com/
19.
Thanks for attending
Please fill out an evaluation form and drop it in the collection basket located at the back of the room.

Contact information
Paul Fratellone, Test Evangelist
uTest Inc.
HQ: Framingham, MA 01772
Voice: +1 570.269.5342
Email: paul.fratellone@utest.com
www.utest.com