Transcript: An Introduction to Hypothesis-Based Testing
1. Introduction to HBT
"Hypothesis-Based Testing"
A scientific personal test methodology for delivering clean software
Copyright STAG Software Private Limited, 2010-11 www.stagsoftware.com
2. Test Methodologies in vogue
…focus on activities
…are driven by process
…are powered by tools
…depend on experience
Test methodologies focus on activities that are driven by a process and powered by tools; successful outcomes still depend a lot on experience.
3. Testing - A Fishing Analogy
Assume that you are a fisherman (or fisherwoman). You fish in a lake and make Rs 100 per day. What do you have to do to get higher returns, say Rs 200 per day?
Constraints - you cannot:
... fish anywhere else
... control the selling price
... increase production of fish.
What would you do to extract higher business value (or returns)?
4. HBT in a nutshell
Focus on the goal first, then on the activities.
What defects are you looking for? - what 'fishes' to catch
When is each detectable earliest? Formulate a staged quality growth model - when and where to catch
Create a 'complete' set of test cases - big net, small holes
Use appropriate tooling
Measure the right stuff & course correct - cover more, move fast, course correct
5. Hypothesis-Based Testing
...is a scientific personal test methodology powered by a defect detection technology that enables an individual to rapidly & effectively deliver "Clean Software".
What is clean software? Identify 'Cleanliness Criteria'.
The cycle: Goal -> Hypothesize (PDT) -> Devise Proof -> Tooling Support -> Assess & Analyze
6. How is HBT different from other methodologies?
Typical methodology: Activities, powered by experience, hopefully result in the Goal.
HBT: The Goal drives the Activities, powered by defect detection technology (STEM).
7. Comparing Methodologies
Each methodology is rated against engineering attributes (Effective, Efficient, Consistent, Scalable) and management attributes (Visible, Agile).
Methodology | Characteristic
Process driven | Well laid out process
Domain centered | Experience based
Ad-hoc | Individual creativity
Exploratory | Individual analytical skills
Automation driven | Tool based
Agile | Frequent evaluation
HBT | Scientific & goal-focused
8. 1. Be clear where you want to go! (Clear goal)
Expectations come from the marketplace, the deployment environment, and business value - how good should it be? These yield the Cleanliness Criteria.
Needs come from end users: requirements, i.e. features, attributes, and usage - what do I want?
Example: Clean water implies
1. Colourless
2. No suspended particles
3. No bacteria
4. Odourless
Accelerates understanding/ramp-up.
9. 2. Know the route clearly
3. Use learnings from others
From the cleanliness criteria, hypothesize potential defect types: what types of defects do I need to uncover?
Example potential defect types:
Data validation
Timeouts
Resource leakage
Calculation
Storage
Presentation
Transactional ...
Accelerates goal clarity.
16. Hypothesis Based Testing (HBT)
A goal-focused methodology for validation, consisting of SIX stages of "doing":
S1 Understand Expectations
S2 Understand Context
S3 Formulate Hypothesis
S4 Devise Proof
S5 Tooling Support
S6 Assess & Analyze
The central theme of HBT is "hypothesize potential defects that can cause loss of expectations and prove that they will not exist".
The focus is on the goal and how we shall achieve it, rather than the various activities, i.e. goal-centric vs. activity-based.
17. The SIX stages of "doing" are powered by EIGHT thinking disciplines
HBT - a "methodology" (a system of ways of doing): a goal-centered scientific approach to validation, with the goal 'deliver clean software quickly & cost-effectively'.
STEM - a "method" (a particular way of doing something): the defect detection technology from STAG, whose eight disciplines (D1-D8) power the six stages (S1-S6).
18. STEM 2.0
STAG Test Engineering Method
Consists of EIGHT disciplines and THIRTY-TWO scientific concepts (the STEM Core):
D1 Business value understanding
D2 Defect hypothesis
D3 Strategy & planning
D4 Test design
D5 Tooling
D6 Visibility
D7 Execution & reporting
D8 Analysis & management
A discipline consists of steps, each of which is aided by scientific concept(s).
19. STEM Core - Provides the Scientific Basis
Consists of 32 core concepts, for example:
D1 Business value understanding: Landscaping, Viewpoints, Reductionist principle, Interaction matrix, Operational profiling, Attribute analysis, GQM
D2 Defect hypothesis: EFF model (Error-Fault-Failure), Defect centricity principle, Negative thinking, Orthogonality principle, Defect typing
D3 Test strategy & planning: Orthogonality principle, Tooling needs assessment, Defect centered AB, Quality growth principle, Techniques landscape, Process landscape
D4 Test design: Reductionist principle, Input granularity principle, Box model, Behavior-Stimuli approach, Techniques landscape, Complexity assessment, Operational profiling, Test coverage evaluation
21. Needs & Expectations
Needs are construction oriented: they result in features in the software, implemented using technology(ies) by developers, and are captured in user requirements/technical specifications.
Expectations describe how well the needs should be met. They are normally not as clear or purposeful, and are the focus of the test staff.
HBT enables extraction of Cleanliness Criteria to set up a clear goal.
22. HBT Overview
Needs and expectations (user types, requirements, features, attributes, usage profile, business logic, data) feed into cleanliness criteria, potential defect types, quality levels, test types, test techniques, test scenarios/cases, and requirements traceability, which in turn drive risk assessment, quality index, test outcome, cycle scoping, test scripts, tooling architecture, metrics, and fault traceability.
24. Stage S1: Understand Expectations
Understand the marketplace for deployment
Understand the technology(ies) used
Understand the software environment
Identify end user types & #users for each type
Identify business requirements for each user type
25. Stage S1: Understand Expectations
STEM Core concepts applied:
Landscapes
Viewpoints
26. Stage S1: Understand Expectations
Outcomes:
Overview document
Requirement/feature map
User type list
27. Stage S2: Understand Context
Understand features and dependencies
Understand usage
Identify the technical profile and baseline it
Identify critical success factors
Prioritize value of end user(s) and features
Set up cleanliness criteria
Ensure attributes are testable
28. Stage S2: Understand Context
STEM Core concepts applied:
Reductionist principle
Interaction matrix
Operational profiling
Attribute analysis
GQM (Goal-Question-Metric)
29. Stage S2: Understand Context
Outcomes:
Feature/value prioritization matrix
Usage profile
Key attributes list
Cleanliness assessment criteria
30. Stage S3: Formulate Hypothesis
Identify PD due to data, logic
Identify PD due to structure, technology
Identify potential faults based on usage
Identify potential failures and therefore PD
Identify error injection opportunities and therefore PD
Group PDs to form PDTs
Map PDTs to requirements/features
(PD = potential defects, PDT = potential defect types)
31. Stage S3: Formulate Hypothesis
STEM Core concepts applied:
EFF model (Error-Fault-Failure)
Defect centricity principle
Negative thinking
Orthogonality principle
Defect typing
32. Stage S3: Formulate Hypothesis
Outcomes:
Potential defects catalog
Fault propagation chart
Fault traceability matrix
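The grouping step above (PDs into PDTs, each mapped to the requirements they threaten) can be sketched as a small data transformation. The defect names, IDs, and the `group_pds` helper below are illustrative placeholders, not part of HBT/STEM itself.

```python
from collections import defaultdict

# Hypothetical potential defects (PDs), each tagged with a defect type (PDT)
# and the requirements it can affect. All names are illustrative only.
potential_defects = [
    {"id": "PD1", "pdt": "Data validation", "requirements": ["R1", "R2"]},
    {"id": "PD2", "pdt": "Timeouts",        "requirements": ["R2"]},
    {"id": "PD3", "pdt": "Data validation", "requirements": ["R3"]},
]

def group_pds(pds):
    """Group PDs into PDTs and map each PDT to the requirements it threatens."""
    pdt_to_pds = defaultdict(list)
    pdt_to_reqs = defaultdict(set)
    for pd in pds:
        pdt_to_pds[pd["pdt"]].append(pd["id"])
        pdt_to_reqs[pd["pdt"]].update(pd["requirements"])
    return dict(pdt_to_pds), {k: sorted(v) for k, v in pdt_to_reqs.items()}

pdt_to_pds, pdt_to_reqs = group_pds(potential_defects)
print(pdt_to_pds)   # {'Data validation': ['PD1', 'PD3'], 'Timeouts': ['PD2']}
print(pdt_to_reqs)  # {'Data validation': ['R1', 'R2', 'R3'], 'Timeouts': ['R2']}
```

The PDT-to-requirements map is exactly what feeds the fault traceability matrix produced as an outcome of this stage.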
33. Stage S4: Devise Proof - Part 1 of 3: Test strategy and plan
Understand scope
Formulate quality levels
Identify types of test to be performed
Identify test techniques
Identify risks
Identify defect detection process
Estimate effort using defect-based activity breakdown
Formulate cycles and their scope
Identify tooling needs
34. Stage S4: Devise Proof - Part 1 of 3: Test strategy and plan
STEM Core concepts applied:
Orthogonality principle
Tooling needs assessment
Defect centered AB
Quality growth principle
Techniques landscape
Process landscape
35. Stage S4: Devise Proof - Part 1 of 3: Test strategy and plan
Outcomes:
Test strategy
Test plan
36. Stage S4: Devise Proof - Part 2 of 3: Test design
Identify the test level to design for & identify entities
Partition each entity & understand business logic/data
Model the intended behavior semi-formally
Generate the test scenarios
For each scenario, generate test cases
Refine scenarios/cases using structural properties
Trace the scenarios to the PDTs (requirement tracing is built in)
Assess test adequacy by fault coverage analysis
37. Stage S4: Devise Proof - Part 2 of 3: Test design
STEM Core concepts applied:
Reductionist principle
Input granularity principle
Box model
Behavior-Stimuli approach
Techniques landscape
Complexity assessment
Operational profiling
Test coverage evaluation
38. Stage S4: Devise Proof - Part 2 of 3: Test design
Outcomes:
Test scenarios and test cases (conforming to the HBT test case architecture)
Fault traceability matrix
Requirements traceability matrix
39. Stage S4: Devise Proof - Part 3 of 3: Metrics design
Identify progress and adequacy (coverage) aspects
For each aspect, identify the intended goal to meet
For each of these goals, identify questions to ask
To answer these questions, identify metrics
Identify when you want to measure and how to measure
40. Stage S4: Devise Proof - Part 3 of 3: Metrics design
STEM Core concepts applied:
GQM
Quality quantification model
41. Stage S4: Devise Proof - Part 3 of 3: Metrics design
Outcomes:
Measurements chart
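The Goal-Question-Metric chain used in metrics design can be sketched as a small data structure. The goal, questions, metric names, and formulas below are invented for illustration; they are not the measures STEM prescribes.

```python
# A minimal Goal-Question-Metric (GQM) sketch. The goal, questions, and
# metrics are illustrative examples only.
gqm = {
    "goal": "Track test progress objectively",
    "questions": {
        "How much of the planned scope is executed?": {
            "metric": "execution_rate",
            "measure": lambda executed, planned: executed / planned,
        },
        "How clean is the build so far?": {
            "metric": "defect_find_rate",
            "measure": lambda defects, executed: defects / executed,
        },
    },
}

# Evaluate the metrics for a hypothetical cycle: 45 of 60 cases run, 9 defects.
execution_rate = gqm["questions"]["How much of the planned scope is executed?"]["measure"](45, 60)
defect_find_rate = gqm["questions"]["How clean is the build so far?"]["measure"](9, 45)
print(f"execution_rate={execution_rate:.2f}, defect_find_rate={defect_find_rate:.2f}")
# execution_rate=0.75, defect_find_rate=0.20
```

Keeping each metric attached to the question it answers is the point of GQM: a number with no question behind it is noise.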
43. Devise Proof: Requirements traceability
Every test case is mapped to a requirement (R1 -> TC1, R2 -> TC2, ... Rm -> TCi), or equivalently, every requirement does indeed have a test case.
The intention is to ensure that each requirement can indeed be validated.
It is seen as an indicator of "test adequacy".
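A requirements traceability check like the one above can be sketched in a few lines; the requirement and test-case IDs are placeholders for illustration.

```python
# Requirements traceability: map test cases to requirements and flag any
# requirement with no test case. IDs are illustrative placeholders.
test_to_req = {"TC1": "R1", "TC2": "R2", "TC3": "R3"}
requirements = ["R1", "R2", "R3", "R4"]

covered = set(test_to_req.values())
uncovered = [r for r in requirements if r not in covered]
print(uncovered)  # ['R4'] - R4 cannot be validated yet
```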
44. Devise Proof: Fault traceability
Map potential defects to requirements (PD1 -> R1, ... PDn -> Rm), and map test cases to the potential defects they can detect (TC1 -> PD1, ... TCi -> PDn).
Tracing the potential defects to the requirements & test cases is Fault Traceability.
It allows us to confirm that the intended potential defects can indeed be uncovered.
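Extending the previous idea, a fault traceability check verifies that every hypothesized defect has at least one test case aimed at it; all IDs here are illustrative placeholders.

```python
# Fault traceability: every potential defect (PD) should be detectable by
# at least one test case. All IDs are illustrative placeholders.
pd_to_req = {"PD1": "R1", "PD2": "R2", "PD3": "R3"}
test_to_pds = {"TC1": ["PD1"], "TC2": ["PD2", "PD3"]}

detectable = {pd for pds in test_to_pds.values() for pd in pds}
undetected = sorted(set(pd_to_req) - detectable)
print(undetected)  # [] - every hypothesized defect has a detecting test case
```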
45. Requirement & Fault Traceability
Requirements traceability is "necessary but not sufficient". Assume that each requirement has just one test case; this implies we have a good RTM, i.e. each requirement has been covered. But could there be additional test cases needed for some of the requirements? So the RTM is a necessary condition but NOT a sufficient one.
What does it take to be sufficient? If we have a clear notion of the types of defects that could affect the customer experience and then map these to test cases, we have a Fault Traceability Matrix (FTM, as proposed by HBT). This allows us to be sure that our test cases can indeed detect those defects that will impact customer experience.
46. Key concepts: Focused defect identification
Each evaluation scenario (ES) is focussed on uncovering certain PDTs, and each PDT can be detected by a specific test technique (TT).
A test is a collection of evaluation scenarios: a technique (e.g. TT1) enables scenarios (ES1, ES2, ...), and each scenario consists of test cases that will uncover its target PDTs (e.g. PDT1, PDT2).
Quality (cleanliness) grows stage by stage over time as PDT1..PDT8 are uncovered by TT1..TT5.
47. STEM Test Case Architecture
Test cases are categorized by quality levels (QL1, QL2, QL3 ...) and then by test types (Test Type #1, #2, #3 ...), each aimed at potential defect types (PDT).
Results in:
>> Excellent clarity
>> Purposefulness, i.e. defect oriented
>> Clear insight into quality
>> Higher coverage
48. STEM Test Case Architecture
Organized by quality levels,
sub-ordered by items (features/modules...),
segregated by type,
ranked by importance/priority,
sub-divided into conformance (+) and robustness (-),
classified by early (smoke)/late-stage evaluation,
tagged by evaluation frequency,
linked by optimal execution order,
classified by execution mode (manual/automated).
A well-architected set of test cases is like an effective bait that can 'attract defects' in the system. It is equally important to ensure that they are well organized to enable execution optimization and have the right set of information to ensure easy automation.
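The organization above can be sketched as a sort over test-case metadata. The fields, sample cases, and sort key below are invented for illustration and cover only a subset of the full scheme.

```python
from dataclasses import dataclass

# Sketch of the test-case organization above; fields and data are illustrative.
@dataclass
class TestCase:
    id: str
    quality_level: int      # QL1, QL2, ...
    item: str               # feature/module
    test_type: str
    priority: int           # lower = more important
    conformance: bool       # True = positive (+), False = robustness (-)

cases = [
    TestCase("TC3", 2, "login", "functional", 1, True),
    TestCase("TC1", 1, "login", "smoke", 1, True),
    TestCase("TC2", 1, "report", "functional", 2, False),
]

# Order by quality level, then item, then priority (a subset of the scheme).
ordered = sorted(cases, key=lambda tc: (tc.quality_level, tc.item, tc.priority))
print([tc.id for tc in ordered])  # ['TC1', 'TC2', 'TC3']
```

Adding further keys (type, conformance, execution mode) extends the same tuple without changing the structure.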
49. Test Adequacy Analysis
Breadth - types of tests (test breadth)
Depth - quality levels, QL1..QL4 (test depth)
Porosity - test case "fine-ness", conformance vs. robustness (test porosity)
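One way to sketch these three adequacy dimensions is as simple ratios; the formulas below are an assumption for illustration, not STEM's actual definitions.

```python
# Illustrative adequacy ratios for breadth, depth, and porosity.
# These formulas are an assumption, not STEM's definitions.
planned_test_types = {"functional", "load", "security", "usability"}
covered_test_types = {"functional", "load"}

planned_quality_levels = 4
covered_quality_levels = 3

total_cases = 100
robustness_cases = 20   # negative tests probe the 'holes in the net'

breadth = len(covered_test_types) / len(planned_test_types)
depth = covered_quality_levels / planned_quality_levels
porosity = 1 - robustness_cases / total_cases   # higher = more porous net

print(f"breadth={breadth:.2f}, depth={depth:.2f}, porosity={porosity:.2f}")
# breadth=0.50, depth=0.75, porosity=0.80
```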
50. Clear Assessment (Better Visibility)
Clear assessment implies that we are able to objectively state that an element under test is able to meet the intended cleanliness criteria.
The quality report is a matrix of elements (E1..E5) against cleanliness criteria (CC1..CC4), with each cell marked Met, Not met, or Partially met.
51. Stage S5: Tooling Support
Perform tooling benefit analysis
Identify automation scope
Assess automation complexity
Identify the order in which scenarios need to be automated
Evaluate tools
Design automation architecture
Develop scripts
Debug and baseline scripts
52. Stage S5: Tooling Support
STEM Core concepts applied:
Automation complexity assessment
Minimal babysitting principle
Clear separation of concerns principle
53. Stage S5: Tooling Support
Outcomes:
Needs and benefits document
Complexity assessment report
Automation architecture
Tool requirements
Automation phasing and scope
Automation scripts
54. Stage S6: Assess and Analyze
Identify test cases/scripts to be executed
Execute test cases, record outcomes
Record defects
Record learnings from the activity and the context
Record status of execution
Analyze execution progress
Quantify quality and identify risk to delivery
Update strategy, plan, scenarios, cases/scripts
55. Stage S6: Assess and Analyze
STEM Core concepts applied:
Contextual awareness
Defect rating principle
Gating principle
56. Stage S6: Assess and Analyze
Outcomes:
Execution status report
Defect report
Progress report
Cleanliness report
Updated test scenarios/cases
Updated strategy/plan
Key observations/learnings
57. HBT Case Study
• Web based product – Version 2.x
• One month pilot
• 5-day orientation on HBT/STEM 2.0 to the team
• Two teams were involved – an HBT team and a non-HBT team
58. HBT case study - Test case details
Test case details:
Module | STEM method | Normal method | Increase
M1 | 100 | 28 | 257%
M2 | 85 | 52 | 63%
M3 | 95 | 66 | 44%
M4 | 132 | 72 | 83%
M5 | 127 | 28 | 354%
M6 | 855 | 116 | 637%
TOTAL | 1394 | 362 | 285%
Test case details (STEM method):
Module | Total | Positive | Negative
M1 | 100 | 59 | 41
M2 | 85 | 68 | 17
M3 | 95 | 67 | 28
M4 | 132 | 112 | 20
M5 | 127 | 85 | 42
M6 | 855 | 749 | 106
TOTAL | 1394 | 1140 | 254
Nearly 3x improvement in test cases, increasing the probability of higher defect yield. 2x improvement in negative cases, increasing the probability of defect yield.
59. HBT case study - Defect, effort details
Defect details:
#Defects | STEM method 32 | Normal method 16 | Increase 100%
The STEM method did yield 2x the #defects: 20 major, 12 minor. Note that out of the 32 defects found, a few were residual defects. One of them was a critical one that corrupts the entire data.
Effort details (Test analysis & design):
STEM method 30 hours* | Normal method 20 hours
*Observations:
1. The STEM team found key defects, lowering the cost of support.
2. In the case of the Normal method, they would have spent higher effort post-release.
60. Results & Benefits
Results
• STEM found important residual defects
• STEM team was confident of the "guarantee"
• Slight increase in initial effort, but well compensated by the quality of defects
• Customer was happy with the results and approach
Benefits
• PDT-propelled test design
• Quality of defects yielded during the pilot was high
• Robust test design – a good input for test automation
61. HBT Results
50%-5x reduction in post-release defects
Re-architecting test assets increases test coverage by 250%
30% defect leakage reduction from early stage
Test assessment accelerates integration
Terse requirement - Holes found & fixed at Stage#1
Smart automation - 3x reduction in time
62. STAG Solutions & Services based on HBT & STEM 2.0
Product: Optimization (design asset re-engineering, maintenance optimization); Quality enhancement (LSPS solution, coverage enhancement); Productivity enhancement (tool adoption, test acceleration)
Organization: Diagnostics & control (UT assessment, IT assessment); System enhancement (DevQ system, QA system); validation suites (E-Learning, Mobile app, ERP, Bluetooth)
People: Skill enhancement (COMPASS(TM), Finishing school)
Test services: Outsourced testing, Managed QA, JumpStart QA, Assessment services, Custom tooling
Training (STEM(TM), Corp & Retail): HBT Series, Robust TD, Purposeful strategy, Successful automation, Functional automation; Quality injection E&T (Req validation, Archi. validation, LSPS validation); EBA
63. Thank you!
STEMTM is the trademark of STAG Software Private Limited