EuroSTAR Software Testing Conference 2008 presentation on ET, Best of Both Worlds by Derk-Jan de Grood. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
1. Leiden Singapore Minneapolis Edinburgh
Best of both worlds
By: Derk-Jan de Grood
Date: November 2008
Location: EuroSTAR
Exploratory testing in a structured way
2. www.collis.nl
Objectives for this presentation
Some basic principles of exploratory and structured testing
Insight into how different approaches can benefit from each other
An answer to the question ‘Why should I?’
3. Mail exchange
DJ: I got the impression, also during a discussion with John Bach, that in the US a lot of exploratory testing is done as a separate thing. Thus, although structured, it is not embedded in a test method. Do you disagree?
Lee: Depends on the organization –
some see it as two different
approaches (even done by different
people). With others ET is very
integrated. It’s difficult to
generalize.
5.
Schools of testing
Analytic School
Aka Structural testing
Code coverage
Verification
Standard School
Sees testing as a way to
measure progress with emphasis
on cost and repeatable standards
Make sure each requirement is tested
Quality School
Emphasizes process and acts as
Gatekeeper
Protect users from bad software
Testing is a stepping stone to process
improvement
Context-Driven School
Emphasizes people,
seeking bugs that
stakeholders care about
Exploratory testing
Commercial market driven
software
Agile School
Uses testing to prove that
development is complete;
emphasizes automated testing
Test Driven development
[Pettichord 2007]
6.
Which school would you apply?
Which school would apply for testing:
Auto-pilot system for airplane (analytical)
SAP system (standard)
Website supporting a campaign (context)
Innovative system (agile)
Mobile phone (standard?)
Off-shore build system (quality?)
No specs available (context or agile)
Game (context)
12.
What is Exploratory Testing?
An approach to unscripted testing based on the skills and experience of the tester. ET is a risk-based technique using a formal procedure, test charters and heuristics.
The no. 1 excuse for not having to prepare our test design in full detail:
“We do exploratory testing!”
“Exploratory testing is simultaneous
learning, test design, and test execution.”
James Bach
13. Real-life situation
A Colleague said:
“The specs are outdated and incomplete. There is a need for manual regression scripts. In the preparation phase our test scripts contain only the basic info (test purposes). During execution we write down the details.”
Exploratory testing or not?
16.
ET
Errors in the software
Coverage of the test design
1. Points of Interest (POI)
2. First tests executed
3. Plan next step based
upon test results
4. Define new POI
5. Cont. with next POI
6. Conclusion: thus finished!
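The six-step loop above can be sketched as a small routine; a minimal sketch in Python, where the function and parameter names are illustrative and not taken from the presentation:

```python
# Hypothetical sketch of the six-step ET session loop; all names are
# illustrative, not from the slides.

def exploratory_session(points_of_interest, run_tests, derive_new_pois, time_box):
    """Work through POIs, letting each result steer the next step."""
    findings = []
    backlog = list(points_of_interest)            # 1. initial Points of Interest
    while backlog and time_box > 0:
        poi = backlog.pop(0)                      # 5. continue with the next POI
        results = run_tests(poi)                  # 2. execute tests for this POI
        findings.extend(results)                  # 3. plan the next step on results...
        backlog.extend(derive_new_pois(results))  # 4. ...by defining new POIs
        time_box -= 1                             #    within a bounded session
    return findings                               # 6. conclude: session finished
```

The point of the sketch is the feedback edge: `derive_new_pois` feeds the backlog, so what you test next depends on what you just observed, which a fixed script cannot do.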
21.
Building our test design
Syntax testing
PCT (process cycle test)
BVA (boundary value analysis)
EP (equivalence partitioning)
Exploratory
“the puzzle changes the puzzling.”
James Bach
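Two of the listed techniques, BVA and EP, are mechanical enough to sketch in code. A minimal illustration, assuming a numeric field with an accepted range of 1..100 (the range and function names are invented for the example):

```python
# Hypothetical sketch: deriving test values for a numeric input field
# that accepts values 1..100 (assumed spec, not from the slides).

def equivalence_partitions(low, high):
    """EP: one representative value per equivalence class."""
    return {
        "below_range": low - 10,        # invalid class: under the range
        "in_range": (low + high) // 2,  # valid class: inside the range
        "above_range": high + 10,       # invalid class: over the range
    }

def boundary_values(low, high):
    """BVA: values on and directly around each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]
```

For the assumed 1..100 range, `boundary_values(1, 100)` yields `[0, 1, 2, 99, 100, 101]`: the scripted techniques produce a fixed, reviewable set of test values, which is exactly the test ware the exploratory approach trades away for flexibility.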
22.
ET fits: yes or no?
Y
Rapid feedback
Extension to scripted tests
Find most important bug in
shortest time
Check work of other tester
Little or no specifications
Domain knowledge available
N
Feedback loop breaks down
Detailed calculations
(Life-)critical functions
Testing of reliability/performance
Test ware is important
Testers are less skilled
Source: ISEB Practitioner – Improve QS
30.
Evaluation
The ET session gave us a clear understanding of the quality of the system. This was achieved in a very short period.
The fun about ET is that its
fundamentals are easily understood.
Jaap Azier (KPN)
In order to use ET effectively we need to take the lessons learned into account, especially the logging and scenario testing. Still, I am glad we did ET. It enabled other people to gain insight into the quality of the system.
Carin Smits (KPN)
Exploratory testing is testing on the edge.
ET means getting the most out of people; this implies you are dealing with people issues.
It is exciting to find the edge of ‘we have tested all the essentials’.
Hugo Achthoven
Implementation Manager (KPN)
The project went well, with great teamwork. Together we worked towards the best working method, and in the end we certainly found it. This resulted in clear and traceable test results.
Jasper Overgaauw
Test expert (Collis)
Close collaboration between users and a test expert provided a judgement on the quality of the system in a short time.
Matthijs Jorissen
Test expert (Sogeti)