MA
Full day Tutorial
4/7/2014  8:30 AM

“Critical Thinking for Software Testers”

Presented by:
Michael Bolton
DevelopSense

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Michael Bolton
DevelopSense
 
Tester, consultant, and trainer Michael Bolton is the coauthor (with James Bach) of
Rapid Software Testing, a course that presents a methodology and mindset for testing
software expertly in uncertain conditions and under extreme time pressure. Michael is
a leader in the context-driven software testing movement with twenty years of
experience testing, developing, managing, and writing about software. Currently, he
leads DevelopSense, a Toronto-based consultancy. Prior to DevelopSense, he was with
Quarterdeck Corporation, where he managed the company’s flagship products and
directed project and testing teams—both in-house and worldwide. Contact Michael
at michael@developsense.com.
 
Critical Thinking for Testers
James Bach
http://www.satisfice.com
james@satisfice.com
Twitter: @jamesmarcusbach
Michael Bolton
http://www.developsense.com
michael@developsense.com
Twitter: @michaelbolton
Simple Problems
A Simple Puzzle
“Steve, an American man, is very shy and withdrawn,
invariably helpful but with little interest in people or in the
world of reality. A meek and tidy soul, he has a need for
order and structure, and a passion for detail.”
Is Steve more likely to be
a librarian? a farmer?
Wait, let’s try something really simple…
Can we agree?
Can we share common ground?
“There are four geometric figures on this slide.”
“There is one square among those figures.”
“The square is shaded in blue.”
Wait, let’s try something really simple…
Beware of Shallow Agreement!
Reflex is IMPORTANT
But Critical Thinking is About Reflection
REFLEX (System 1): faster, looser
REFLECTION (System 2): slower, surer; get more data
See Thinking Fast and Slow, by Daniel Kahneman
The Nature of Critical Thinking
• “Critical thinking is purposeful, self-regulatory
judgment which results in interpretation,
analysis, evaluation, and inference, as well as
explanation of the evidential, conceptual,
methodological, criteriological, or contextual
considerations upon which that judgment is
based.” - Critical Thinking: A Statement of Expert Consensus for Purposes
of Educational Assessment and Instruction, Dr. Peter Facione
(Critical thinking is, for the most part, about getting all the benefits of
your “System 1” thinking reflexes while avoiding self-deception and
other mistakes.)
Bolton’s Definition of Critical Thinking
Testing is the enactment of critical thinking about software.
Critical thinking must begin with our belief in the
likelihood of errors in our thinking.
- Michael Bolton
The Nature of Critical Thinking
• We call it critical thinking whenever we
systematically doubt something that the “signs”
tell us is probably true. Working through the
doubt gives us a better foundation for our beliefs.
• Critical thinking is a kind of de-focusing tactic,
because it requires you to seek alternatives to
what is already believed or what is being claimed.
• Critical thinking is also a kind of focusing tactic,
because it requires you to analyze the specific
reasoning behind beliefs and claims.
Why You Should Care
Technology is way more
tricky than regular life.
But testers
are not supposed
to get tricked.
Don’t Be A Turkey
• Every day the turkey adds one more data
point to his analysis proving that the farmer
LOVES turkeys.
• Hundreds of observations
support his theory.
• Then, a few days before
Thanksgiving…
Based on a story told by Nassim Taleb, who stole it from Bertrand
Russell, who stole it from David Hume.
[Graph: “Graph of My Fantastic Life! Page 25!” (by the most intelligent turkey in the
world). Well-being rises steadily day after day, then plummets just before Thanksgiving;
data after Thanksgiving estimated posthumously. Final entry: “Corn meal a little off
today!”]
Don’t Be A Turkey
• No experience of the past can LOGICALLY be
projected into the future, because we have no
experience OF the future.
• No big deal in a world of
stable, simple patterns.
• BUT SOFTWARE IS NOT
STABLE OR SIMPLE.
• “PASSING” TESTS CANNOT
PROVE SOFTWARE GOOD.
Based on a story told by Nassim Taleb, who stole it from Bertrand
Russell, who stole it from David Hume.
Themes
• Technology consists of complex and ephemeral relationships that
can seem simple, fixed, objective, and dependable even when
they aren’t.
• Testers are people who ponder and probe complexity.
• Basic testing is a straightforward technical process.
• But, excellent testing is a difficult social and psychological process
in addition to the technical stuff.
“A tester is someone who knows that
things can be different.”
- Jerry Weinberg
Critical Thinking About Testing:
Call this “Checking” not Testing
Checking means operating a product to check specific facts about it…
• Observe: interact with the product in specific ways to collect specific observations.
• Evaluate: apply algorithmic decision rules to those observations.
• Report: report any failed checks.
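To make the distinction concrete, here is a minimal sketch of a check in Python. It is a hypothetical illustration: the product function add, the decision rule, and run_check are all invented for this example.

def add(a, b):
    """Stand-in for some observable product behavior."""
    return a + b

def run_check(observe, decision_rule, description):
    # A check: a specific observation, an algorithmic decision rule,
    # and a report of the outcome. Nothing here requires judgment.
    observation = observe()              # interact in a specific way
    passed = decision_rule(observation)  # apply the algorithmic rule
    verdict = "PASSED" if passed else "FAILED"
    print(f"{verdict}: {description} (observed {observation!r})")
    return passed

run_check(lambda: add(2, 2),
          lambda result: result == 4,
          "add(2, 2) returns 4")

Note what the check cannot do: it will never notice a problem that its decision rule does not encode. That is why checking is an ingredient of testing, not a substitute for it.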
Testing is…
acquiring the competence, motivation, and credibility to…
create the conditions necessary to…
evaluate a product by learning about it through experimentation, which includes to
some degree: questioning, study, modeling, observation and inference, including…
operating a product to check specific facts about it…
…so that you help your clients to make informed decisions about risk.
This is what people think testers do:
“Compare the product to its specification.” (described vs. actual)
This is more like what testers really do:
“Compare the idea of the product to a description of it.” (imagined vs. described)
“Compare the actual product to a description of it.” (described vs. actual)
“Compare the idea of the product to the actual product.” (imagined vs. actual)
This is what you find…
• The designer INTENDS the product to be Firefox compatible, but never says so, and it
actually is not.
• The designer INTENDS the product to be Firefox compatible, SAYS SO IN THE SPEC, but it
actually is not.
• The designer assumes the product is not Firefox compatible, and it actually is not, but
the ONLINE HELP SAYS IT IS.
• The designer INTENDS the product to be Firefox compatible, SAYS SO, and IT IS.
• The designer assumes the product is not Firefox compatible, but it ACTUALLY IS, and the
ONLINE HELP SAYS IT IS.
• The designer INTENDS the product to be Firefox compatible, MAKES IT FIREFOX COMPATIBLE,
but forgets to say so in the spec.
• The designer assumes the product is not Firefox compatible, and no one claims that it
is, but it ACTUALLY IS.
Exercise
Calculator Test
“I was carrying a calculator.
I dropped it!
Perhaps it is damaged!
What might you do to test it?”
Did I drop it into water? If so, I might dry it out before attempting to test it.
When did I drop it? Was I in the middle of a calculation? If so part of my testing might be to visually inspect
the status of the display to determine whether the calculator appears to still be in the state it was at the
time I dropped it. If so, I might continue the calculation from that point, unless I believe that the drop
probably damaged the calculator.
Did I drop it on a hard surface with a force that makes me suspect internal damage? If so then I would
expect possible hairline fractures. I imagine that would lead to intermittent or persistent short circuits or
broken circuits. I also suspect damage to moving parts, battery or solar cell connections, or screen.
Did I drop it into a destructive chemical environment? If so, I might worry more about the progressive
decay of the components.
Did I drop it into a dangerous biological or radiological environment? If so, the functions of the calculator
may be of less concern than contaminants. I may have to test it with a Geiger counter.
Was the calculator connected to anything else whereby the connection (data cable or AC/cable or duct
tape that fastened it to a Faberge egg) could have been damaged, or could have damaged the thing it was
connected to?
Did I detect anything while it was dropping that leads me to suspect any damage in particular (e.g. an
electrical flash, or maybe a loud popping sound)?
Am I aware of a history of "drop" related problems with this calculator? Have I ever dropped it before?
Is the calculator ruggedized? Is it designed to be dropped in this way?
What is my relationship to this calculator? Is it mine or someone else's? Maybe I'm just borrowing it.
What is the value of this calculator? I assume that this is not a precious artifact from a museum. The
exercise as presented appears to be about a calculator as calculating machine, rather than as a precious
Minoan urn that happens to have calculator functions built into it.
Exercise
What makes an assumption more dangerous?
• Not “what specific assumptions are more
dangerous?”…
• But “what factors would make one
assumption more dangerous than another?”
• Or “what would make the same assumption
more dangerous from one time to another?”
What makes an assumption more dangerous?
1. Consequential: required to support critical plans and activities. (Changing the
assumption would change important behavior.)
2. Unlikely: may conflict with other assumptions or evidence that you have. (The
assumption is counter-intuitive, confusing, obsolete, or has a low probability of
being true.)
3. Blind: regards a matter about which you have no evidence whatsoever.
4. Controversial: may conflict with assumptions or evidence held by others. (The
assumption ignores controversy.)
5. Impolitic: expected to be declared, by social convention. (Failing to disclose the
assumption violates law or local custom.)
6. Volatile: regards a matter that is subject to sudden or extreme change. (The
assumption may be invalidated unexpectedly.)
7. Unsustainable: may be hard to maintain over a long period of time. (The
assumption must be stable.)
8. Premature: regards a matter about which you don’t yet need to assume.
9. Narcotic: any assumption that comes packaged with assurances of its own safety.
10. Latent: otherwise critical assumptions that we have not yet identified and dealt
with. (The act of managing assumptions can make them less critical.)
Assumptions vs. Inferences
• An assumption is something that we take to
be true without sufficient evidence
• An inference is a conclusion formed from data
or evidence
• Conclusions may be incorrect because of
faulty models or faulty logic
How to Think Critically:
Introducing Pauses
• “Huh?” You may not understand.
– you may have misheard
– errors in interpreting and modeling a situation
– communication errors
• “Really?” What you understand may not be true.
– missing information
– observations not made
– tests not run
• “So?” The truth may not matter, or may matter much more than you think.
– poor understanding of risk
Giving System 2 time to wake up!
To What Do We Apply Critical Thinking?
• The Product
• What it is
• Descriptions of what it is
• Descriptions of what it does
• Descriptions of what it's
supposed to be
• Testing
• Context
• Procedures
• Coverage
• Oracles
• Strategy
• The Project
• Schedule
• Infrastructure
• Processes
• Social orders
• Words
• Language
• Pictures
• Problems
• Biases
• Logical fallacies
• Evidence
• Causation
• Observations
• Learning
• Design
• Behavior
• Models
• Measurement
• Heuristics
• Methods
• …
“Huh?”
Critical Thinking About Words
• Among other things, testers question premises.
• A suppressed premise is an unstated premise that
an argument needs in order to be logical.
• A suppressed premise is something that should
be there, but isn’t…
• (…or is there, but it’s invisible or implicit.)
• Among other things, testers bring suppressed
premises to light and then question them.
• A diverse set of models can help us to see the
things that “aren’t there.”
Example: Generating Interpretations
• Selectively emphasize each word in a
statement; also consider alternative meanings.
MARY had a little lamb.
Mary HAD a little lamb.
Mary had A little lamb.
Mary had a LITTLE lamb.
Mary had a little LAMB.
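The same move can be made mechanically. A tiny sketch in Python that generates one emphasized reading per word:

# Emphasize each word of a statement in turn, one variant per line.
statement = "Mary had a little lamb"
words = statement.split()

for i in range(len(words)):
    variant = [w.upper() if j == i else w for j, w in enumerate(words)]
    print(" ".join(variant))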
“Really?”
The Data Question
“So?”
Critical Thinking About Risk
How do you test this?
“The system shall operate at an input voltage range
of nominal 100 - 250 VAC.”
Poor answer: “Try it with an input voltage in the range of 100-250.”
Heuristic Model:
The Four-Part Risk Story
• Victim: someone who experiences the impact of a problem.
Ultimately no bug can be important unless it victimizes a human.
• Problem: something the product does that we wish it wouldn’t do.
• Vulnerability: something about the product that causes or allows
it to exhibit a problem, under certain conditions.
• Threat: some condition or input external to the product that, were it
to occur, would trigger a problem in a vulnerable product.
Some person may be hurt or annoyed
because of something that might go wrong
while operating the product,
due to some vulnerability in the product
that is triggered by some threat.
How Do We Know What “Is”?
“We know what is because we see what is.”
We believe
we know what is because we see
what we interpret as signs that indicate
what is
based on our prior beliefs about the world
and our (un)awareness of things around us.
How Do We Know What “Is”?
“If I see X, then probably Y, because probably A, B, C, D, etc.”
• THIS CAN FAIL:
– Ice cream that wasn’t
– Getting into a car – oops, not my car
– Bad driving – why?
– Bad work – why?
– Ignored people at my going away party – why?
– Couldn’t find soap dispenser in restroom – why?
– Ordered orange juice at seafood restaurant – waitress misunderstood
Remember this, you testers!
Models Link Observation and Inference
• A model is an idea, activity, or object…
(such as an idea in your mind, a diagram, a list of words, a spreadsheet, a
person, a toy, an equation, a demonstration, or a program)
• …that represents another idea, activity, or object…
(such as something complex that you need to work with or study)
• …whereby understanding the model may help you understand
or manipulate what it represents.
- A map helps navigate across a terrain.
- 2+2=4 is a model for adding two apples to a basket that already has two apples.
- Atmospheric models help predict where hurricanes will go.
- A fashion model helps understand how clothing would look on actual humans.
- Your beliefs about what you test are a model of what you test.
Models Link Observation & Inference
• Testers must distinguish
observation from inference!
• Our mental models form the
link between them
• Defocusing is lateral thinking.
• Focusing is logical (or “vertical”)
thinking.
[Diagram: “I see…” → my model of the world → “I believe…”]
Modeling Bugs as Magic Tricks
Magic tricks work for the same reasons that bugs exist:
• Our thinking is limited
– We misunderstand probabilities
– We use the wrong heuristics
– We lack specialized knowledge
– We forget details
– We don’t pay attention to the right things
• The world is hidden
– states
– sequences
– processes
– attributes
– variables
– identities
Studying magic can help you develop the imagination to find better bugs.
Testing magic is indistinguishable from testing sufficiently advanced technology.
How many test cases are needed to test
the product represented by this flowchart?
Critical Thinking About Diagrams
Analysis
• [pointing at a box] What if the function in this box fails?
• Can this function ever be invoked at the wrong time?
• [pointing at any part of the diagram] What error checking do
you do here?
• [pointing at an arrow] What exactly does this arrow mean?
What would happen if it was broken?
[Diagram: Browser → Web Server → App Server → Database Layer]
Guideword Heuristics for Diagram Analysis
What’s there? What happens? What could go wrong?
Boxes
• Interfaces (testable)
• Missing/Drop-out
• Extra/Interfering/Transient
• Incorrect
• Timing/Sequencing
• Contents/Algorithms
• Conditional behavior
• Limitations
• Error Handling
Lines
• Missing/Drop-out
• Extra/Forking
• Incorrect
• Timing/Sequencing
• Status Communication
• Data Structures
Paths
• Simplest
• Popular
• Critical
• Complex
• Pathological
• Challenging
• Error Handling
• Periodic
Testability!
Visualizing Test Coverage: Annotation
[Diagram: the Browser → Web Server → App Server → Database Layer system, annotated with
test coverage ideas: build, error monitor, survey, coverage analysis, force fail,
man-in-middle, data generator, build stressbots, server stress, performance data,
inspect reports, table consistency oracle, datagen oracle, performance history,
build history oracle, history oracle, review error output]
Beware Visual Bias!
• setup
• browser type & version
• cookies
• security settings
• screen size
• review client-side scripts & applets
• usability
• specific functions
[Diagram: Browser → Web Server → App Server → Database Layer]
One way to cope with
really complex diagrams
• Consider making a
special diagram that
includes only the things
that are worth testing,
then put the annotations
as bullets on the
bottom…
[Diagram: an intercom system with two headset assemblies (Head (P) and Head (S)), each
with DB, !!Hook, PTT, PC, Mic, and Covert elements; an optional splitter; an optional
Power DB and Torso with PTT, PC, and Mic; and extender boxes between components.]
Coverage
• DB != DB, DB == DB
• Disconnect/Connect
• Start/Stop/Restart/Reset
• On hook/off hook
• PTT Y/N
• Signal arriving at antenna
• No testing for extender box?!
Oracles
• Screen Match
• Contrast/Volume Independence
• Muted/Unmuted
• Reset on Disconnect
• Reset on System Error
• Pops
Ideas
• Happy path
• Spam test
• Connection tests (failover)
• DB interactions
• Pairwise interactions
• Head interactions
• Time (leave it sitting)
Testing against requirements
is all about modeling.
How do you test this?
“The system shall operate at an input voltage range
of nominal 100 - 250 VAC.”
Poor answer: “Try it with an input voltage in the range of 100-250.”
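A better answer begins by modeling the input domain. As a small illustration, here is a sketch of boundary and out-of-range value selection for this requirement; the margins are assumptions made up for the example, and a real test would also have to model what “operate” means, plus sag, surge, frequency, and what happens at the moment the input drifts out of range.

# Candidate input voltages for the "nominal 100-250 VAC" requirement.
# The specific margins here are illustrative assumptions, not the spec.
NOMINAL_MIN, NOMINAL_MAX = 100.0, 250.0

candidates = [
    0.0,                 # no power at all: does it fail safely?
    NOMINAL_MIN - 15.0,  # well below range
    NOMINAL_MIN - 0.1,   # just below the lower boundary
    NOMINAL_MIN,         # at the lower boundary
    120.0, 230.0,        # common mains voltages
    NOMINAL_MAX,         # at the upper boundary
    NOMINAL_MAX + 0.1,   # just above the upper boundary
    NOMINAL_MAX + 50.0,  # well above range: shutdown? damage?
]

for v in candidates:
    in_spec = NOMINAL_MIN <= v <= NOMINAL_MAX
    print(f"{v:6.1f} VAC -> within nominal range: {in_spec}")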
“Pass Rate” is a Popular Metric
[Graph: “Pass Rate” rising over time, from 10% on 2/1 to about 90% by 3/1]
Totally different situations…
same exact graph!
Pass Rate  Passed  Failed  Total
      10%     100     900   1000
      50%     500     500   1000
      70%     700     300   1000
      80%     800     200   1000
      90%     900     100   1000
      90%     900     100   1000

Pass Rate  Passed  Failed  Total
      10%       1       9     10
      50%       5       5     10
      70%       7       3     10
      80%       8       2     10
      90%       9       1     10
      90%       9       1     10

Pass Rate  Passed  Failed  Total
      10%       1       9     10
      50%      25      25     50
      70%      70      30    100
      80%     160      40    200
      90%     450      50    500
      90%     900     100   1000

Pass Rate  Passed  Failed  Total
      10%     100     900   1000
      50%      75      75    150
      70%      70      30    100
      80%      40      10     50
      90%      36       4     40
      90%      27       3     30
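A few lines of Python make the trap visible: using the counts from the first two tables above, the pass-rate series is identical even though one has a hundred times the evidence behind each point.

# Same pass-rate trend, wildly different amounts of evidence.
steady = [(100, 1000), (500, 1000), (700, 1000),
          (800, 1000), (900, 1000), (900, 1000)]
tiny = [(1, 10), (5, 10), (7, 10), (8, 10), (9, 10), (9, 10)]

for name, series in (("steady", steady), ("tiny", tiny)):
    rates = [f"{passed / total:.0%}" for passed, total in series]
    print(name, rates)

# Both lines print ['10%', '50%', '70%', '80%', '90%', '90%']:
# the metric erases the difference between 1000 checks and 10.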
Critical Thinking About Measurement
from an actual client
DER – Defect Escape Rate
The Defect Escape Rate measures the number of undiscovered
defects that escaped detection in the product development cycle
and were released to customers. An escape is a defect found
while using a released product. DER is defined as:
DER = (Defect Escapes / Total Defects) × 100
DER is a lagging indicator of product quality. The number of
escapes is always zero until after product is released. It is
reported as a percentage and a low number is desired. Each
business unit has a target DER percentage and an Escape
Analysis should be performed on each defect to improve test
coverage. It is desirable for the DER for a product line to decline
over time. See appendix for calculation details.
Look! Measurement!
What could possibly go wrong?
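The formula is trivial to compute; the trouble is in the constructs behind it. One sketch (with invented counts) of how it misleads: both inputs keep changing after release, so the same product yields different DERs depending on when you ask.

# The client's DER formula, exactly as written. All counts invented.
def defect_escape_rate(escapes, total_defects):
    return escapes / total_defects * 100

# One month after release: 5 known escapes out of 100 total defects.
print(f"{defect_escape_rate(5, 100):.1f}%")   # 5.0%

# A year later, same product: more escapes have surfaced, and each
# new escape also grows the "total defects" denominator.
print(f"{defect_escape_rate(25, 120):.1f}%")  # 20.8%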
Construct Validity & External Validity
• Construct validity is (informally) the degree to which
your attributes and measurements can be justified
within an experiment or observation
– How do you demarcate the difference between one of
something and not-one of something?
– How do you know that you’re measuring what you think
you’re measuring?
• External validity is the degree to which your
experiment or observation can be generalized to the
world outside
– How do you know that your experiment or observation
will be relevant at other times or in other places?
Kaner & Bond’s Tests for Construct Validity
from http://www.kaner.com/pdfs/metrics2004.pdf
• What is the purpose of your measurement? The scope?
• What is the attribute you are trying to measure?
• What are the scale and variability of this attribute?
• What is the instrument you’re using? What is its scale and
variability?
• What function (metric) do you use to assign a value to the
attribute?
• What’s the natural scale of the metric?
• What is the relationship of the attribute to the metric’s value?
• What are the natural, foreseeable side effects of using this
measure?
The essence of good measurement is a model that incorporates
answers to questions like these.
If you don’t have solid answers, you aren’t doing measurement;
you are just playing with numbers.
Test Framing
• Test framing is the set of logical connections
that structure and inform a test and its result
• The framing of a test consists of
– premises; essentially ordinary statements
– logical “connectors”
• formal: if, then, else, and, or
• informal: although, maybe, …
• A change in ONE BIT in the framing of the test
can invert its result.
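As a sketch of that last point, treat a test’s framing as a conjunction of premises that must all hold for “pass” to mean anything (the premises below are hypothetical examples):

# Flip one premise and the same observation no longer supports
# the conclusion "the product works".
premises = {
    "the build I tested is the build we will ship": True,
    "the expected result (the oracle) is itself correct": True,
    "the test environment resembles production closely enough": True,
    "the observed result matched the expected result": True,
}

print(all(premises.values()))   # True: the test "passed"

premises["the expected result (the oracle) is itself correct"] = False
print(all(premises.values()))   # False: same observation, inverted result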
Exercise:
Test Framing
• “I performed the tests. All my tests passed.
Therefore, the product works.”
• “The programmer said he fixed the bug. I can’t
reproduce it anymore. Therefore it must be
fixed.”
• “Microsoft Word frequently crashes while I am
using it. Therefore it’s a bad product.”
• “It’s way better to find bugs earlier than to find
them later.”
Safety Language
(aka “epistemic modalities”)
• “Safety language” in software testing means to qualify
or otherwise draft statements of fact so as to avoid
false confidence.
• Examples: instead of “The feature worked,” try…
So far…
It seems…
I think…
It appears…
apparently…
I infer…
I assumed…
I have not yet seen any
failures in the feature…
Safety Language In Action
Safety Language In Action
Who Says?
Critical Thinking About Research
• Research varies in quality
• Research findings often contradict one another
• Research findings do not prove conclusions
• Researchers have biases
• Writers and speakers may simplify or distort
• “Facts” change over time
• Research happens in specific environments
• Human desires affect research outcomes
Asking the Right Questions: A Guide to Critical Thinking
M. Neil Browne & Stuart M. Keeley
ALL of these things apply to testing, too.
Critical Thinking About
Common Beliefs About Testing
• Every test must have an expected, predicted result.
• Effective testing requires complete, clear, consistent, and
unambiguous specifications.
• Bugs found earlier cost less to fix than bugs found later.
• Testers are the quality gatekeepers for a product.
• Repeated tests are fundamentally more valuable.
• You can’t manage what you can’t measure.
• Testing at boundary values is the best way to find bugs.
Critical Thinking About
Common Beliefs About Testing
• Test documentation is needed to deflect legal liability.
• The more bugs testers find before release, the better the testing
effort has been.
• Rigorous planning is essential for good testing.
• Exploratory testing is unstructured testing, and is therefore
unreliable.
• Adopting best practices will guarantee that we do a good job of
testing.
• Step by step instructions are necessary to make testing a
repeatable process.
Critical thinking about practices
What does “best practice” mean?
• Someone: Who is it? What do they know?
• Believes: What specifically is the basis of their belief?
• You: Is their belief applicable to you?
• Might: How likely is the suffering to occur?
• Suffer: So what? Maybe it’s worth it.
• Unless: Really? There’s no alternative?
• You do this practice: What does it mean to “do” it? What does it
cost? What are the side effects? What if you do it badly? What if
you do something else really well?
Beware of…
• Numbers: “We cut test time by 94%.”
• Documentation: “You must have a written plan.”
• Judgments: “That project was chaotic. This project was a success.”
• Behavior Claims: “Our testers follow test plans.”
• Terminology: Exactly what is a “test plan?”
• Contempt for Current Practice: CMM Level 1 (initial) vs.
CMM Level 2 (repeatable)
• Unqualified Claims: “A subjective and unquantifiable requirement
is not testable.”
Look For…
• Context: “This practice is useful when you want the power of creative
testing but you need high accountability, too.”
• People: “The test manager must be enthusiastic and a real hands-on
leader or this won’t work very well.”
• Skill: “This practice requires the ability to tell a complete story about
testing: coverage, techniques, and evaluation methods.”
• Learning Curve: “It took a good three months for the testers to get
good at producing test session reports.”
• Caveats: “The metrics are useless unless the test manager holds daily
debriefings.”
• Alternatives: “If you don’t need the metrics, you can ditch the daily
debriefings and the specifically formatted reports.”
• Agendas: “I run a testing business, specializing in exploratory testing.”
Critical Thinking About Processes
• This is a description of a bug investigation process
that a particular company uses. Does it make sense?
See James Bach, Investigating Bugs: A Testing Skills Study
http://www.satisfice.com/articles/investigating-bugs.pdf
Appendices
Some Verbal Heuristics:
“A vs. THE”
• Example: “A problem…” instead of “THE problem…”
• Using “A” instead of “THE” helps us to avoid several
kinds of critical thinking errors
– single path of causation
– confusing correlation and causation
– single level of explanation
When trying to explain something,
prefer "a" to "the".
Some Verbal Heuristics:
“Unless…”
• When someone asks a question based on a false or
incomplete premise, try adding “unless…” to the
premise
• When someone offers a Grand Truth about testing,
append “unless…” or “except in the case of…”
At the end of a statement,
try adding "unless..."
Some Verbal Heuristics:
“And Also…”
• The product gives the correct result! Yay!
• …It also may be silently deleting system files.
Whatever is happening,
something else may ALSO be happening.
Some Verbal Heuristics:
“Or not…”
Whatever has happened,
something else might not have happened.
Whatever is true now
may not be true for long.
Some Verbal Heuristics:
“So far” and “Not yet”
• The product works… so far.
• We haven’t seen it fail… yet.
• No customer has complained… yet.
• Remember: There is no test for ALWAYS.
Some Verbal Heuristics:
“What Else Could This Be?”
The Rule of Three: if you haven’t thought of
at least three plausible and non-trivial interpretations,
you probably haven’t thought enough.
Why is this area blank?
See How Doctors Think, Dr. Jerome Groopman
Critical Thinking About Projects
• “You will have five weeks to test the product”
[Diagram: a five-week timeline]
Exercise
Events Testing
• You want to test the interaction between
two potentially overlapping events.
• How would you test this?
[Diagram: a timeline showing Event A and Event B as potentially overlapping intervals]
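One way to begin: enumerate the qualitative ways two intervals can relate in time before designing any particular test. The sketch below classifies a pair of events by comparing endpoints; Allen’s interval algebra distinguishes thirteen such relations, and this toy version deliberately folds several of them together.

# Classify how two events relate in time, given start/end instants.
def relate(a_start, a_end, b_start, b_end):
    if a_end < b_start:
        return "A entirely before B"
    if a_end == b_start:
        return "A meets B (B starts the instant A ends)"
    if a_start == b_start and a_end == b_end:
        return "A and B coincide exactly"
    if a_start <= b_start and b_end <= a_end:
        return "B happens during A"
    if b_start <= a_start and a_end <= b_end:
        return "A happens during B"
    if a_start < b_start < a_end:
        return "A overlaps the start of B"
    return "B before, meeting, or overlapping A (mirror cases)"

print(relate(0, 5, 6, 9))  # A entirely before B
print(relate(0, 5, 3, 9))  # A overlaps the start of B
print(relate(0, 9, 3, 5))  # B happens during A

Each distinct relation suggests a different test condition: what if A ends exactly as B starts? What if B happens entirely inside A?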
Some Common Thinking Errors
• Reification Error
– giving a name to a concept, and then believing it has
an objective existence in the world
– ascribing material attributes to mental constructs—
“that product has quality”
– mistaking relationships for things—“its purpose is…”
– purpose and quality are relationships, not attributes;
they depend on the person
– how can we count ideas? how can we quantify
relationships?
Some Common Thinking Errors
• Fundamental Attribution Error
– “it always works that way”; “he’s a jerk”
– failure to recognize that circumstance and context
play a part in behaviour and effects
• The Similarity-Uniqueness Paradox
– “all companies are like ours”; “no companies are like
ours”
– failure to consider that everything incorporates
similarities and differences
• Missing Multiple Paths of Causation
– “A causes B” (even though C and D are also required)
Some Common Thinking Errors
• Assuming that effects are linear with causes
– “If we have 20% more traffic, throughput will slow
by 20%”
– this kind of error ignores non-linearity and
feedback loops—c.f. general systems
• Reactivity Bias
– the act of observing affects the observed
– a.k.a. “Heisenbugs”, the Hawthorne Effect
• The Probabilistic Fallacy
– confusing unpredictability and randomness
– after the third hurricane hits Florida, is it time to
relax?
Some Common Thinking Errors
• Binary Thinking Error / False Dilemmas
– “all manual tests are bad”; “that idea never works”
– failure to consider gray areas; belief that something is
either entirely something or entirely not
• Unidirectional Thinking
– expresses itself in testing as a belief that “the
application works”
– failure to consider the opposite: what if the
application fails?
– to find problems, we need to be able to imagine that
they might exist
Some Common Thinking Errors
• Availability Bias
– the tendency to favor prominent or vivid instances in
making a decision or evaluation
– example: people are afraid to fly, yet automobiles are
far more dangerous per passenger mile
– to a tech support person (or to some testers), the
product always seems completely broken
– spectacular failures often get more attention than
grinding little bugs
• Confusing concurrence with correlation
– “A and B happen at the same time; they must be
related”
Some Common Thinking Errors
• Nominal Fallacies
– believing that we know something well because we can
name it
• “equivalence classes”
– believing that we don’t know something because we
don’t have a name for it at our fingertips
• “the principle of concomitant variation”; “inattentional
blindness”
• Evaluative Bias of Language
– failure to recognize the spin of word choices
– …or an attempt to game it
– “our product is full-featured; theirs is bloated”
Some Common Thinking Errors
• Selectivity Bias
– choosing data (beforehand) that fits your preconceptions
or mission
– ignoring data that doesn’t fit
• Assimilation Bias
– modifying the data or observation (afterwards) to fit the
model
– grouping distinct things under one conceptual umbrella
– Jerry Weinberg refers to this as “lumping”
– for testers, the risk is in identifying setup, pinpointing,
investigating, reporting, and fixing as “testing”
Some Common Thinking Errors
• Narrative Bias
– a.k.a. “post hoc, ergo propter hoc”
– explaining causation after the facts are in
• The Ludic Fallacy
– confusing complex human activities with random,
roll-of-the-dice games
– “Our project has a two-in-three chance of success”
• Confusing correlation with causation
– “When I change A, B changes; therefore A must be
causing B”
Some Common Thinking Errors
• Automation bias
– people have a tendency to believe in results from an automated
process out of all proportion to validity
• Formatting bias
– It’s more credible when it’s on a nicely formatted spreadsheet or
document
– (I made this one up)
• Survivorship bias
– we record and remember results from projects (or people) who
survived
– the survivors prayed to Neptune, but so did the sailors who died
– What was the bug rate for projects that were cancelled?
Do you prefer A or B?
Imagine that the US is preparing for the outbreak of an unusual
Asian disease, which is expected to kill 600 people. Two
alternative programs to combat the disease have been
proposed. Assume that the exact scientific estimates of the
consequences of the programs are as follows.
Program A: If Program A is adopted, 200 people will be saved.
Program B: If Program B is adopted, there is 1/3 probability that
600 people will be saved, and 2/3 probability that
no people will be saved.
See Daniel Kahneman, Thinking Fast and Slow
Do you prefer C or D?
Imagine two more programs to combat the disease are
proposed:
Program C: If Program C is adopted, 400 people will die.
Program D: If Program D is adopted, there is 1/3 probability that
600 people will be saved, and 2/3 probability that no people will be saved.
A = C and B = D; yet most surveyed prefer A > B and D > C.
Program A: If Program A is adopted, 200 people will be saved.
(3/4 surveyed prefer this to B)
Program B: If Program B is adopted, there is 1/3 probability that
600 people will be saved, and 2/3 probability that
no people will be saved.
Program C: If Program A is adopted, 400 people will die.
Program D: If Program B is adopted, there is 1/3 probability that
600 people will be saved, and 2/3 probability that
no people will be saved.
(3/4 surveyed prefer this to C)
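Arithmetic confirms what the framing hides: all four programs have the same expected number of lives saved. A short check, using the numbers from the slide:

TOTAL = 600

a = 200                        # "200 people will be saved"
b = (1/3) * TOTAL + (2/3) * 0  # the gamble, gain framing
c = TOTAL - 400                # "400 people will die"
d = (1/3) * TOTAL + (2/3) * 0  # the same gamble, loss framing

print(a, b, c, d)  # 200 200.0 200 200.0 -- identical expected outcomes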
Isn’t it all just semantics?
• What does “semantics” mean?
• How many “events” were there when the twin
towers were hit? Does it matter?
• What’s the difference between “ad hoc”
testing and “exploratory testing”? Don’t they
mean the same thing?
Tacit vs. Explicit Knowledge
• Explicit knowledge is knowledge that has been
told
• Tacit knowledge comes in three forms
– Relational (“weak”) tacit knowledge: knowledge,
residing in a human mind, that could be told, but
has not been for various reasons
– Somatic (“medium”) tacit knowledge: knowledge
that comes from residing in a human body
– Collective (“strong”) tacit knowledge: knowledge
that is embodied in society
See Harry Collins, Tacit and Explicit Knowledge
The Role of Tacit Knowledge
• Although people tend to favour the explicit
(why not? we can talk about it), much of what
we do in testing is based on evolving tacit
knowledge.
What are the elements of tacit knowledge in
your testing? What parts can be made explicit?
What parts cannot?
Conditional Probability
Reasoning about it is HARD.
[Diagram: the population split by test result and by actual disease status]
• Unhappy test result, disease present: Uh-oh. Bad news.
• Unhappy test result, actually fine: Sorry we freaked you out.
• Happy test result, disease present: Yay! But… oops. Missed it.
• Happy test result, actually fine: No problemo. Yay!
Try considering the number of people in the overall population
who have the disease BEFORE considering the test result.
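A worked sketch of that advice, with invented numbers: suppose 1% of 10,000 people have the disease, the test catches 90% of the sick, and it falsely flags 9% of the healthy. Counting people before looking at the test result shows why most unhappy results are false alarms.

# All rates here are assumptions chosen only to make the point.
population = 10_000
prevalence = 0.01           # 1% actually have the disease
sensitivity = 0.90          # 90% of the sick get an unhappy result
false_positive_rate = 0.09  # 9% of the healthy also get one

sick = population * prevalence                               # 100 people
true_positives = sick * sensitivity                          # 90 caught
false_positives = (population - sick) * false_positive_rate  # 891 scared

p_sick_given_unhappy = true_positives / (true_positives + false_positives)
print(f"{p_sick_given_unhappy:.1%}")  # ~9.2%: most unhappy results are false alarms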
Mais conteúdo relacionado

Destaque

The Art of Complex System Testing
The Art of Complex System TestingThe Art of Complex System Testing
The Art of Complex System TestingTechWell
 
Designing for Testability: Differentiator in a Competitive Market
Designing for Testability: Differentiator in a Competitive MarketDesigning for Testability: Differentiator in a Competitive Market
Designing for Testability: Differentiator in a Competitive MarketTechWell
 
Measurement and Metrics for Test Managers
Measurement and Metrics for Test ManagersMeasurement and Metrics for Test Managers
Measurement and Metrics for Test ManagersTechWell
 
Congruent Coaching: An Interactive Exploration
Congruent Coaching: An Interactive ExplorationCongruent Coaching: An Interactive Exploration
Congruent Coaching: An Interactive ExplorationTechWell
 
Testing in the Wild: Practices for Testing Beyond the Lab
Testing in the Wild: Practices for Testing Beyond the LabTesting in the Wild: Practices for Testing Beyond the Lab
Testing in the Wild: Practices for Testing Beyond the LabTechWell
 
Extreme Automation: Software Quality for the Next Generation Enterprise
Extreme Automation: Software Quality for the Next Generation EnterpriseExtreme Automation: Software Quality for the Next Generation Enterprise
Extreme Automation: Software Quality for the Next Generation EnterpriseTechWell
 
Automate Mobile App Testing—Or Go Crazy
Automate Mobile App Testing—Or Go CrazyAutomate Mobile App Testing—Or Go Crazy
Automate Mobile App Testing—Or Go CrazyTechWell
 
Continuous Test Automation
Continuous Test AutomationContinuous Test Automation
Continuous Test AutomationTechWell
 

Destaque (8)

The Art of Complex System Testing
The Art of Complex System TestingThe Art of Complex System Testing
The Art of Complex System Testing
 
Designing for Testability: Differentiator in a Competitive Market
Designing for Testability: Differentiator in a Competitive MarketDesigning for Testability: Differentiator in a Competitive Market
Designing for Testability: Differentiator in a Competitive Market
 
Measurement and Metrics for Test Managers
Measurement and Metrics for Test ManagersMeasurement and Metrics for Test Managers
Measurement and Metrics for Test Managers
 
Congruent Coaching: An Interactive Exploration
Congruent Coaching: An Interactive ExplorationCongruent Coaching: An Interactive Exploration
Congruent Coaching: An Interactive Exploration
 
Testing in the Wild: Practices for Testing Beyond the Lab
Testing in the Wild: Practices for Testing Beyond the LabTesting in the Wild: Practices for Testing Beyond the Lab
Testing in the Wild: Practices for Testing Beyond the Lab
 
Extreme Automation: Software Quality for the Next Generation Enterprise
Extreme Automation: Software Quality for the Next Generation EnterpriseExtreme Automation: Software Quality for the Next Generation Enterprise
Extreme Automation: Software Quality for the Next Generation Enterprise
 
Automate Mobile App Testing—Or Go Crazy
Automate Mobile App Testing—Or Go CrazyAutomate Mobile App Testing—Or Go Crazy
Automate Mobile App Testing—Or Go Crazy
 
Continuous Test Automation
Continuous Test AutomationContinuous Test Automation
Continuous Test Automation
 

Mais de TechWell

Failing and Recovering
Failing and RecoveringFailing and Recovering
Failing and RecoveringTechWell
 
Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization TechWell
 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTechWell
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartTechWell
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyTechWell
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTechWell
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowTechWell
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityTechWell
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyTechWell
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTechWell
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipTechWell
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsTechWell
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GameTechWell
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsTechWell
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationTechWell
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessTechWell
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateTechWell
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessTechWell
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTechWell
 

Mais de TechWell (20)

Failing and Recovering
Failing and RecoveringFailing and Recovering
Failing and Recovering
 
Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization
 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build Architecture
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good Start
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test Strategy
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for Success
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlow
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your Sanity
 
Ma 15
Ma 15Ma 15
Ma 15
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps Strategy
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOps
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—Leadership
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile Teams
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile Game
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps Implementation
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery Process
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to Automate
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for Success
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile Transformation
 

Último

EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilV3cube
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Servicegiselly40
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Scriptwesley chun
 

Último (20)

EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Handwritten Text Recognition for manuscripts and early printed texts

Critical Thinking for Software Testers

• 7. The Nature of Critical Thinking
• Critical thinking is also a kind of focusing tactic, because it requires you to analyze the specific reasoning behind beliefs and claims.

Why You Should Care
Technology is way more tricky than regular life. But testers are not supposed to get tricked.
• 8. Don’t Be A Turkey
• Every day the turkey adds one more data point to his analysis proving that the farmer LOVES turkeys.
• Hundreds of observations support his theory.
• Then, a few days before Thanksgiving…
[Graph: “My Fantastic Life!” by the most intelligent turkey in the world: well-being climbs steadily (“Corn meal a little off today!”) until the data, estimated posthumously, plunges right after Thanksgiving.]
• No experience of the past can LOGICALLY be projected into the future, because we have no experience OF the future.
• No big deal in a world of stable, simple patterns.
• BUT SOFTWARE IS NOT STABLE OR SIMPLE.
• “PASSING” TESTS CANNOT PROVE SOFTWARE GOOD.
Based on a story told by Nassim Taleb, who stole it from Bertrand Russell, who stole it from David Hume.
• 9. Themes
• Technology consists of complex and ephemeral relationships that can seem simple, fixed, objective, and dependable even when they aren’t.
• Testers are people who ponder and probe complexity.
• Basic testing is a straightforward technical process.
• But excellent testing is a difficult social and psychological process in addition to the technical stuff.
“A tester is someone who knows that things can be different.” (Jerry Weinberg)
• 10. Critical Thinking About Testing: Call This “Checking,” Not Testing
Checking means operating a product to check specific facts about it:
• Observe: interact with the product in specific ways to collect specific observations.
• Evaluate: apply algorithmic decision rules to those observations.
• Report: report any failed checks.
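The slides don’t include code, but a minimal sketch may make “algorithmic decision rules” concrete. Everything here (the add function, the expected value) is a hypothetical stand-in for a real product and a real decision rule:

def add(a, b):
    return a + b              # stand-in for the product under test

def check_addition():
    observed = add(2, 2)      # interact: collect a specific observation
    expected = 4              # the decision rule, fixed before the check ran
    if observed == expected:  # evaluate: apply the rule algorithmically
        print("PASS")
    else:
        print(f"FAIL: expected {expected}, observed {observed}")  # report

check_addition()

A machine can run this entire loop; deciding whether “PASS” actually tells us anything important about the product is the part that requires a human tester.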
• 11. Testing is…
…acquiring the competence, motivation, and credibility to…
…create the conditions necessary to…
…evaluate a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation and inference (including operating a product to check specific facts about it)…
…so that you help your clients to make informed decisions about risk.
• 12. This is what people think testers do: “Compare the product to its specification” (the described product vs. the actual product).
This is more like what testers really do, among the imagined, the described, and the actual product:
• “Compare the idea of the product to a description of it.”
• “Compare the actual product to a description of it.”
• “Compare the idea of the product to the actual product.”
• 13. This is what you find…
• The designer INTENDS the product to be Firefox compatible, but never says so, and it actually is not.
• The designer INTENDS the product to be Firefox compatible, SAYS SO IN THE SPEC, but it actually is not.
• The designer assumes the product is not Firefox compatible, and it actually is not, but the ONLINE HELP SAYS IT IS.
• The designer INTENDS the product to be Firefox compatible, SAYS SO, and IT IS.
• The designer assumes the product is not Firefox compatible, but it ACTUALLY IS, and the ONLINE HELP SAYS IT IS.
• The designer INTENDS the product to be Firefox compatible, MAKES IT FIREFOX COMPATIBLE, but forgets to say so in the spec.
• The designer assumes the product is not Firefox compatible, and no one claims that it is, but it ACTUALLY IS.
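These situations are points in a small combinatorial space: designer intent, what the documentation claims, and what the product actually does can vary independently. A sketch (hypothetical, not from the slides) that enumerates all eight combinations:

from itertools import product

# Each flag is a simplification: intent, claim, and actual behavior
# regarding Firefox compatibility.
for intends, claims, actually in product([True, False], repeat=3):
    problems = []
    if claims and not actually:
        problems.append("false claim in the documentation")
    if intends and not actually:
        problems.append("unmet design intent: a functional bug")
    if actually and not claims:
        problems.append("undocumented capability")
    verdict = "; ".join(problems) or "consistent"
    print(f"intends={intends!s:<5} claims={claims!s:<5} actual={actually!s:<5} -> {verdict}")

The slide lists seven of the eight cases; the eighth (no intent, no claim, no compatibility) is the only uninteresting one. Most of what testers find lives in the mismatches.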
Exercise: Calculator Test
“I was carrying a calculator. I dropped it! Perhaps it is damaged! What might you do to test it?”

• 14. …and dry it out before attempting to test it.
• When did I drop it? Was I in the middle of a calculation? If so, part of my testing might be to visually inspect the display to determine whether the calculator appears to still be in the state it was in at the time I dropped it. If so, I might continue the calculation from that point, unless I believe that the drop probably damaged the calculator.
• Did I drop it on a hard surface with a force that makes me suspect internal damage? If so, I would expect possible hairline fractures. I imagine that would lead to intermittent or persistent short circuits or broken circuits. I also suspect damage to moving parts, battery or solar cell connections, or the screen.
• Did I drop it into a destructive chemical environment? If so, I might worry more about the progressive decay of the components.
• Did I drop it into a dangerous biological or radiological environment? If so, the functions of the calculator may be of less concern than the contaminants. I may have to test it with a Geiger counter.
• Was the calculator connected to anything else whereby the connection (a data cable, an AC cable, or duct tape that fastened it to a Fabergé egg) could have been damaged, or could have damaged the thing it was connected to?
• Did I detect anything while it was dropping that leads me to suspect any damage in particular (e.g., an electrical flash, or maybe a loud popping sound)?
• Am I aware of a history of “drop”-related problems with this calculator? Have I ever dropped it before?
• Is the calculator ruggedized? Is it designed to be dropped in this way?
• What is my relationship to this calculator? Is it mine or someone else’s? Maybe I’m just borrowing it.
• What is the value of this calculator? I assume that this is not a precious artifact from a museum. The exercise as presented appears to be about a calculator as a calculating machine, rather than as a precious Minoan urn that happens to have calculator functions built into it.

Exercise: What makes an assumption more dangerous?
• Not “what specific assumptions are more dangerous?”…
• …but “what factors would make one assumption more dangerous than another?”
• Or “what would make the same assumption more dangerous from one time to another?”
• 15. What makes an assumption more dangerous?
1. Consequential: required to support critical plans and activities. (Changing the assumption would change important behavior.)
2. Unlikely: may conflict with other assumptions or evidence that you have. (The assumption is counter-intuitive, confusing, obsolete, or has a low probability of being true.)
3. Blind: regards a matter about which you have no evidence whatsoever.
4. Controversial: may conflict with assumptions or evidence held by others. (The assumption ignores controversy.)
5. Impolitic: expected to be declared, by social convention. (Failing to disclose the assumption violates law or local custom.)
6. Volatile: regards a matter that is subject to sudden or extreme change. (The assumption may be invalidated unexpectedly.)
7. Unsustainable: may be hard to maintain over a long period of time. (The assumption must be stable.)
8. Premature: regards a matter about which you don’t yet need to assume.
9. Narcotic: any assumption that comes packaged with assurances of its own safety.
10. Latent: otherwise critical assumptions that we have not yet identified and dealt with. (The act of managing assumptions can make them less critical.)

Assumptions vs. Inferences
• An assumption is something that we take to be true without sufficient evidence.
• An inference is a conclusion formed from data or evidence.
• Conclusions may be incorrect because of faulty models or faulty logic.
• 16. How to Think Critically: Introducing Pauses
Give System 2 time to wake up!
• Huh? You may not understand: you may have misheard; there may be errors in interpreting and modeling a situation; there may be communication errors.
• Really? What you understand may not be true: missing information; observations not made; tests not run.
• So? The truth may not matter, or may matter much more than you think: poor understanding of risk.

To What Do We Apply Critical Thinking?
• The product: what it is; descriptions of what it is; descriptions of what it does; descriptions of what it’s supposed to be.
• Testing: context, procedures, coverage, oracles, strategy.
• The project: schedule, infrastructure, processes, social orders.
• Words, language, pictures.
• Problems, biases, logical fallacies, evidence, causation, observations.
• Learning, design, behavior, models, measurement, heuristics, methods…
• 17. “Huh?” Critical Thinking About Words
• Among other things, testers question premises.
• A suppressed premise is an unstated premise that an argument needs in order to be logical.
• A suppressed premise is something that should be there, but isn’t…
• (…or is there, but it’s invisible or implicit.)
• Among other things, testers bring suppressed premises to light and then question them.
• A diverse set of models can help us to see the things that “aren’t there.”

Example: Generating Interpretations
Selectively emphasize each word in a statement; also consider alternative meanings.
• MARY had a little lamb.
• Mary HAD a little lamb.
• Mary had A little lamb.
• Mary had a LITTLE lamb.
• Mary had a little LAMB.
• 18. “Really?” The Data Question. “So?” Critical Thinking About Risk
How do you test this?
“The system shall operate at an input voltage range of nominal 100 - 250 VAC.”
Poor answer: “Try it with an input voltage in the range of 100-250.”
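The poor answer treats the requirement as a single test case. A domain-analysis sketch suggests how much more the sentence contains; the probe values below are illustrative assumptions, not a complete analysis:

LOW, HIGH = 100, 250     # nominal VAC range from the requirement

probe_voltages = [
    0,                   # no power at all: does the system fail safely?
    LOW - 1,             # just below the nominal range
    LOW,                 # the lower boundary itself
    (LOW + HIGH) // 2,   # a representative in-range value
    HIGH,                # the upper boundary itself
    HIGH + 1,            # just above the nominal range
    2 * HIGH,            # far out of range: fuse, alarm, or fire?
]
print(probe_voltages)

Even this list dodges the harder questions: what does “operate” mean, what about frequency, sags, and surges, and what happens when the voltage changes while the system is running?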
• 19. Heuristic Model: The Four-Part Risk Story
Some person may be hurt or annoyed because of something that might go wrong while operating the product, due to some vulnerability in the product that is triggered by some threat.
• Victim: someone who experiences the impact of a problem. Ultimately no bug can be important unless it victimizes a human.
• Problem: something the product does that we wish it wouldn’t do.
• Vulnerability: something about the product that causes or allows it to exhibit a problem, under certain conditions.
• Threat: some condition or input external to the product that, were it to occur, would trigger a problem in a vulnerable product.

How Do We Know What “Is”?
Not simply “we know what is because we see what is.” Rather: we believe we know what is because we see what we interpret as signs that indicate what is, based on our prior beliefs about the world and our (un)awareness of things around us.
• 20. How Do We Know What “Is”?
“If I see X, then probably Y, because probably A, B, C, D, etc.”
THIS CAN FAIL:
• Ice cream that wasn’t.
• Getting into a car – oops, not my car.
• Bad driving – why?
• Bad work – why?
• Ignored people at my going-away party – why?
• Couldn’t find the soap dispenser in the restroom – why?
• Ordered orange juice at a seafood restaurant – the waitress misunderstood.
Remember this, you testers!
• 21. Models Link Observation and Inference
• A model is an idea, activity, or object (such as an idea in your mind, a diagram, a list of words, a spreadsheet, a person, a toy, an equation, a demonstration, or a program)…
• …that represents another idea, activity, or object (such as something complex that you need to work with or study)…
• …whereby understanding the model may help you understand or manipulate what it represents.
Examples:
• A map helps navigate across a terrain.
• 2 + 2 = 4 is a model for adding two apples to a basket that already has two apples.
• Atmospheric models help predict where hurricanes will go.
• A fashion model helps understand how clothing would look on actual humans.
• Your beliefs about what you test are a model of what you test.

Models Link Observation & Inference
• Testers must distinguish observation from inference!
• Our mental models form the link between them: “I see…” connects, through my model of the world, to “I believe…”
• Defocusing is lateral thinking.
• Focusing is logical (or “vertical”) thinking.
• 22. Modeling Bugs as Magic Tricks
Magic tricks work for the same reasons that bugs exist.
• Our thinking is limited: we misunderstand probabilities; we use the wrong heuristics; we lack specialized knowledge; we forget details; we don’t pay attention to the right things.
• The world is hidden: states, sequences, processes, attributes, variables, identities.
Studying magic can help you develop the imagination to find better bugs. Testing magic is indistinguishable from testing sufficiently advanced technology.
How many test cases are needed to test the product represented by this flowchart?
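The flowchart isn’t reproduced here, so this sketch counts paths through an assumed five-node flow; the graph is hypothetical, and the point is that “how many test cases?” has no single answer until you choose a coverage model:

from functools import lru_cache

flow = {                       # assumed flowchart: node -> successors
    "start": ["A", "B"],
    "A": ["C", "D"],
    "B": ["C"],
    "C": ["end"],
    "D": ["end"],
}

@lru_cache(maxsize=None)
def count_paths(node):
    """Count distinct start-to-end paths (assumes the flow has no loops)."""
    if node == "end":
        return 1
    return sum(count_paths(successor) for successor in flow[node])

print(count_paths("start"))    # 3 for this little graph

Add a single loop and the number of distinct paths becomes unbounded; and even complete path coverage says nothing about data values, timing, or state. The honest answer begins with “it depends on your model.”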
• 23. Critical Thinking About Diagrams
[Diagram: Browser → Web Server → App Server → Database Layer]
Analysis:
• [pointing at a box] What if the function in this box fails?
• Can this function ever be invoked at the wrong time?
• [pointing at any part of the diagram] What error checking do you do here?
• [pointing at an arrow] What exactly does this arrow mean? What would happen if it was broken?

Guideword Heuristics for Diagram Analysis
What’s there? What happens? What could go wrong?
• Boxes: interfaces (testable); missing/drop-out; extra/interfering/transient; incorrect; timing/sequencing; contents/algorithms; conditional behavior; limitations; error handling.
• Lines: missing/drop-out; extra/forking; incorrect; timing/sequencing; status communication; data structures.
• Paths: simplest; popular; critical; complex; pathological; challenging; error handling; periodic.
Testability!
• 24. Visualizing Test Coverage: Annotation
[Diagram: the same Browser / Web Server / App Server / Database Layer picture, annotated with test ideas: build error monitors, surveys, coverage analysis, forced failures, a man-in-the-middle, data generators, build stressbots, server stress, performance data and performance history, inspection of reports, a table-consistency oracle, build-history and history oracles, and review of error output.]

Beware Visual Bias!
• setup
• browser type & version
• cookies
• security settings
• screen size
• review client-side scripts & applets
• usability
• specific functions
• 25. One way to cope with really complex diagrams
Consider making a special diagram that includes only the things that are worth testing, then put the annotations as bullets at the bottom…
[Example diagram: a push-to-talk headset system with primary and secondary heads (PC, mic, PTT, covert earpiece), an optional splitter, an optional torso unit, power, and extender boxes. One annotation asks: “No testing for the extender box?!” Annotation bullets below the diagram:]
• Coverage: DB != DB, DB == DB; disconnect/connect; start/stop/restart/reset; on hook/off hook; PTT Y/N; signal arriving at antenna; screen match; contrast/volume independence; muted/unmuted; reset on disconnect; reset on system error; pops.
• Oracles and Ideas: happy path; spam test; connection tests (failover); DB interactions; pairwise interactions; head interactions; time (leave it sitting).
• 26. Testing against requirements is all about modeling.
How do you test this?
“The system shall operate at an input voltage range of nominal 100 - 250 VAC.”
Poor answer: “Try it with an input voltage in the range of 100-250.”

“Pass Rate” is a Popular Metric
[Graph: a “Pass Rate” line climbing steadily toward 100% over the weeks from 2/1 through 3/1.]
• 27. Totally different situations… same exact graph!

Pass Rate  Passed  Failed  Total
   10%        100     900   1000
   50%        500     500   1000
   70%        700     300   1000
   80%        800     200   1000
   90%        900     100   1000
   90%        900     100   1000

Pass Rate  Passed  Failed  Total
   10%          1       9     10
   50%          5       5     10
   70%          7       3     10
   80%          8       2     10
   90%          9       1     10
   90%          9       1     10

Pass Rate  Passed  Failed  Total
   10%          1       9     10
   50%         25      25     50
   70%         70      30    100
   80%        160      40    200
   90%        450      50    500
   90%        900     100   1000

Pass Rate  Passed  Failed  Total
   10%        100     900   1000
   50%         75      75    150
   70%         70      30    100
   80%         40      10     50
   90%         36       4     40
   90%         27       3     30

Critical Thinking About Measurement (from an actual client)
DER – Defect Escape Rate
The Defect Escape Rate measures the number of undiscovered defects that escaped detection in the product development cycle and were released to customers. An escape is a defect found while using a released product. DER is defined as:
DER = (Defect Escapes / Total Defects) × 100
DER is a lagging indicator of product quality. The number of escapes is always zero until after the product is released. It is reported as a percentage, and a low number is desired. Each business unit has a target DER percentage, and an Escape Analysis should be performed on each defect to improve test coverage. It is desirable for the DER for a product line to decline over time. See appendix for calculation details.
Look! Measurement! What could possibly go wrong?
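A sketch of the point the four tables make: the pass rate discards the counts behind it. All four of these hypothetical weeks report “90%”:

datasets = [
    (900, 100),   # a stable suite of 1,000 checks
    (9, 1),       # only ten checks ran at all
    (450, 50),    # a suite that grew to 500 this week
    (36, 4),      # a suite that shrank to 40 this week
]
for passed, failed in datasets:
    total = passed + failed
    print(f"{passed:>4} passed of {total:>4} -> {100 * passed / total:.0f}%")

The DER formula invites the same question: the percentage can fall because fewer defects escaped, or merely because more defects were found and counted internally.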
• 28. Construct Validity & External Validity
• Construct validity is (informally) the degree to which your attributes and measurements can be justified within an experiment or observation.
  – How do you demarcate the difference between one of something and not-one of something?
  – How do you know that you’re measuring what you think you’re measuring?
• External validity is the degree to which your experiment or observation can be generalized to the world outside.
  – How do you know that your experiment or observation will be relevant at other times or in other places?

Kaner & Bond’s Tests for Construct Validity (from http://www.kaner.com/pdfs/metrics2004.pdf)
• What is the purpose of your measurement? The scope?
• What is the attribute you are trying to measure?
• What are the scale and variability of this attribute?
• What is the instrument you’re using? What is its scale and variability?
• What function (metric) do you use to assign a value to the attribute?
• What’s the natural scale of the metric?
• What is the relationship of the attribute to the metric’s value?
• What are the natural, foreseeable side effects of using this measure?
The essence of good measurement is a model that incorporates answers to questions like these. If you don’t have solid answers, you aren’t doing measurement; you are just playing with numbers.
• 29. Test Framing
• Test framing is the set of logical connections that structure and inform a test and its result.
• The framing of a test consists of premises (essentially ordinary statements) and logical “connectors”:
  – formal: if, then, else, and, or
  – informal: although, maybe, …
• A change in ONE BIT in the framing of a test can invert its result.

Exercise: Test Framing
• “I performed the tests. All my tests passed. Therefore, the product works.”
• “The programmer said he fixed the bug. I can’t reproduce it anymore. Therefore it must be fixed.”
• “Microsoft Word frequently crashes while I am using it. Therefore it’s a bad product.”
• “It’s way better to find bugs earlier than to find them later.”
• 30. Safety Language (a.k.a. “epistemic modalities”)
• “Safety language” in software testing means to qualify or otherwise draft statements of fact so as to avoid false confidence.
• Examples: “So far…”; “It seems…”; “I think…”; “It appears…”; “apparently…”; “I infer…”; “I assumed…”; not “The feature worked,” but “I have not yet seen any failures in the feature.”
• 31. Safety Language in Action
[Screenshot examples.]

Who Says? Critical Thinking About Research
• Research varies in quality.
• Research findings often contradict one another.
• Research findings do not prove conclusions.
• Researchers have biases.
• Writers and speakers may simplify or distort.
• “Facts” change over time.
• Research happens in specific environments.
• Human desires affect research outcomes.
(From Asking the Right Questions: A Guide to Critical Thinking, by M. Neil Browne & Stuart M. Keeley.)
ALL of these things apply to testing, too.
• 32. Critical Thinking About Common Beliefs About Testing
• Every test must have an expected, predicted result.
• Effective testing requires complete, clear, consistent, and unambiguous specifications.
• Bugs found earlier cost less to fix than bugs found later.
• Testers are the quality gatekeepers for a product.
• Repeated tests are fundamentally more valuable.
• You can’t manage what you can’t measure.
• Testing at boundary values is the best way to find bugs.
• Test documentation is needed to deflect legal liability.
• The more bugs testers find before release, the better the testing effort has been.
• Rigorous planning is essential for good testing.
• Exploratory testing is unstructured testing, and is therefore unreliable.
• Adopting best practices will guarantee that we do a good job of testing.
• Step-by-step instructions are necessary to make testing a repeatable process.
• 33. Critical Thinking About Practices
What does “best practice” mean? Take the implied claim apart: someone believes you might suffer unless you do this practice.
• Someone: Who is it? What do they know?
• Believes: What specifically is the basis of their belief?
• You: Is their belief applicable to you?
• Might: How likely is the suffering to occur?
• Suffer: So what? Maybe it’s worth it.
• Unless: Really? There’s no alternative?
• You do this practice: What does it mean to “do” it? What does it cost? What are the side effects? What if you do it badly? What if you do something else really well?

Beware of…
• Numbers: “We cut test time by 94%.”
• Documentation: “You must have a written plan.”
• Judgments: “That project was chaotic. This project was a success.”
• Behavior Claims: “Our testers follow test plans.”
• Terminology: Exactly what is a “test plan”?
• Contempt for Current Practice: CMM Level 1 (initial) vs. CMM Level 2 (repeatable).
• Unqualified Claims: “A subjective and unquantifiable requirement is not testable.”
• 34. Look For…
• Context: “This practice is useful when you want the power of creative testing but you need high accountability, too.”
• People: “The test manager must be enthusiastic and a real hands-on leader, or this won’t work very well.”
• Skill: “This practice requires the ability to tell a complete story about testing: coverage, techniques, and evaluation methods.”
• Learning Curve: “It took a good three months for the testers to get good at producing test session reports.”
• Caveats: “The metrics are useless unless the test manager holds daily debriefings.”
• Alternatives: “If you don’t need the metrics, you can ditch the daily debriefings and the specifically formatted reports.”
• Agendas: “I run a testing business, specializing in exploratory testing.”

Critical Thinking About Processes
This is a description of a bug investigation process that a particular company uses. Does it make sense?
See James Bach, Investigating Bugs: A Testing Skills Study, http://www.satisfice.com/articles/investigating-bugs.pdf
• 35. Appendices

Some Verbal Heuristics: “A” vs. “THE”
• Example: “A problem…” instead of “THE problem…”
• Using “A” instead of “THE” helps us to avoid several kinds of critical thinking errors:
  – a single path of causation
  – confusing correlation and causation
  – a single level of explanation
When trying to explain something, prefer “a” to “the.”
• 36. Some Verbal Heuristics: “Unless…”
• When someone asks a question based on a false or incomplete premise, try adding “unless…” to the premise.
• When someone offers a Grand Truth about testing, append “unless…” or “except in the case of…”
At the end of a statement, try adding “unless…”

Some Verbal Heuristics: “And Also…”
• The product gives the correct result! Yay!
• …It also may be silently deleting system files.
Whatever is happening, something else may ALSO be happening.
• 37. Some Verbal Heuristics: “Or not…”
Whatever has happened, something else might not have happened. Whatever is true now may not be true for long.

Some Verbal Heuristics: “So far” and “Not yet”
• The product works… so far.
• We haven’t seen it fail… yet.
• No customer has complained… yet.
Remember: there is no test for ALWAYS.
• 38. Some Verbal Heuristics: “What Else Could This Be?”
The Rule of Three: if you haven’t thought of at least three plausible and non-trivial interpretations, you probably haven’t thought enough.
[A deliberately blank area: “Why is this area blank?”]
See How Doctors Think, by Dr. Jerome Groopman.

Critical Thinking About Projects
“You will have five weeks to test the product.”
[Diagram: a bar representing the five weeks.]
• 39. Exercise: Events Testing
• You want to test the interaction between two potentially overlapping events.
• How would you test this? (One way to begin, sketched after this slide: enumerate the distinct ways two events can relate in time.)
[Diagram: a timeline showing Event A and Event B.]

Some Common Thinking Errors
• Reification Error
  – giving a name to a concept, and then believing it has an objective existence in the world
  – ascribing material attributes to mental constructs: “that product has quality”
  – mistaking relationships for things: “its purpose is…”
  – purpose and quality are relationships, not attributes; they depend on the person
  – how can we count ideas? how can we quantify relationships?
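For the events exercise above: before designing tests, it can help to enumerate the qualitatively distinct ways two events can relate in time. This sketch (my framing, not the slides’) reproduces Allen’s thirteen interval relations by enumerating the order types of the four endpoints:

from itertools import product

def order_type(endpoints):
    """Canonicalize times to dense ranks, e.g. (0, 2, 2, 3) -> (0, 1, 1, 2)."""
    distinct = sorted(set(endpoints))
    return tuple(distinct.index(t) for t in endpoints)

cases = set()
for a_start, a_end, b_start, b_end in product(range(4), repeat=4):
    if a_start < a_end and b_start < b_end:   # each event takes nonzero time
        cases.add(order_type((a_start, a_end, b_start, b_end)))

print(len(cases))   # 13: before, meets, overlaps, starts, during, finishes,
                    # equals, plus the inverses of the first six

A strategy that only tries “A, then B” and “B, then A” covers two of the thirteen cases.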
• 40. Some Common Thinking Errors
• Fundamental Attribution Error
  – “it always works that way”; “he’s a jerk”
  – failure to recognize that circumstance and context play a part in behaviour and effects
• The Similarity-Uniqueness Paradox
  – “all companies are like ours”; “no companies are like ours”
  – failure to consider that everything incorporates similarities and differences
• Missing Multiple Paths of Causation
  – “A causes B” (even though C and D are also required)
• Assuming that effects are linear with causes
  – “If we have 20% more traffic, throughput will slow by 20%”
  – this kind of error ignores non-linearity and feedback loops; cf. general systems (a numeric sketch follows this list)
• Reactivity Bias
  – the act of observing affects the observed
  – a.k.a. “Heisenbugs,” the Hawthorne Effect
• The Probabilistic Fallacy
  – confusing unpredictability and randomness
  – after the third hurricane hits Florida, is it time to relax?
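A numeric sketch of the linearity error, using an M/M/1 queueing approximation (my choice of model, not the slides’): mean response time is 1 / (service_rate - arrival_rate), which is anything but linear in traffic.

service_rate = 100.0    # requests/second the system can handle at capacity

for arrival_rate in (50.0, 60.0, 72.0, 86.4, 95.0, 99.0):
    utilization = arrival_rate / service_rate
    response = 1.0 / (service_rate - arrival_rate)   # mean seconds per request
    print(f"load {utilization:4.0%}: mean response {response * 1000:7.1f} ms")

In this model, raising traffic 20% (from 72 to 86.4 requests/second) roughly doubles the mean response time, and one more 20% increase would exceed capacity entirely.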
• 41. Some Common Thinking Errors
• Binary Thinking Error / False Dilemmas
  – “all manual tests are bad”; “that idea never works”
  – failure to consider gray areas; the belief that something is either entirely something or entirely not
• Unidirectional Thinking
  – expresses itself in testing as a belief that “the application works”
  – failure to consider the opposite: what if the application fails?
  – to find problems, we need to be able to imagine that they might exist
• Availability Bias
  – the tendency to favor prominent or vivid instances in making a decision or evaluation
  – example: people are afraid to fly, yet automobiles are far more dangerous per passenger mile
  – to a tech support person (or to some testers), the product always seems completely broken
  – spectacular failures often get more attention than grinding little bugs
• Confusing concurrence with correlation
  – “A and B happen at the same time; they must be related”
• 42. Some Common Thinking Errors
• Nominal Fallacies
  – believing that we know something well because we can name it: “equivalence classes”
  – believing that we don’t know something because we don’t have a name for it at our fingertips: “the principle of concomitant variation”; “inattentional blindness”
• Evaluative Bias of Language
  – failure to recognize the spin of word choices, or an attempt to game it
  – “our product is full-featured; theirs is bloated”
• Selectivity Bias
  – choosing data (beforehand) that fits your preconceptions or mission
  – ignoring data that doesn’t fit
• Assimilation Bias
  – modifying the data or observation (afterwards) to fit the model
  – grouping distinct things under one conceptual umbrella; Jerry Weinberg refers to this as “lumping”
  – for testers, the risk is in identifying setup, pinpointing, investigating, reporting, and fixing as “testing”
• 43. Some Common Thinking Errors
• Narrative Bias
  – a.k.a. “post hoc, ergo propter hoc”
  – explaining causation after the facts are in
• The Ludic Fallacy
  – confusing complex human activities with random, roll-of-the-dice games
  – “Our project has a two-in-three chance of success”
• Confusing correlation with causation
  – “When I change A, B changes; therefore A must be causing B”
• Automation Bias
  – people have a tendency to believe in results from an automated process out of all proportion to their validity
• Formatting Bias
  – it’s more credible when it’s on a nicely formatted spreadsheet or document
  – (I made this one up)
• Survivorship Bias
  – we record and remember results from projects (or people) that survived
  – the survivors prayed to Neptune, but so did the sailors who died
  – what was the bug rate for projects that were cancelled?
• 44. Do you prefer A or B?
Imagine that the US is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows.
• Program A: If Program A is adopted, 200 people will be saved.
• Program B: If Program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved.

Do you prefer C or D?
Imagine two more programs to combat the disease are proposed:
• Program C: If Program C is adopted, 400 people will die.
• Program D: If Program D is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved.
See Daniel Kahneman, Thinking Fast and Slow.
• 45. A = C, B = D… and yet A > B while D > C
• Program A: If Program A is adopted, 200 people will be saved. (3/4 surveyed prefer this to B.)
• Program B: If Program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved.
• Program C: If Program C is adopted, 400 people will die.
• Program D: If Program D is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved. (3/4 surveyed prefer this to C.)
Saving 200 of 600 (A) is the same outcome as letting 400 of 600 die (C), and B and D are word-for-word identical; only the framing differs.

Isn’t it all just semantics?
• What does “semantics” mean?
• How many “events” were there when the twin towers were hit? Does it matter?
• What’s the difference between “ad hoc” testing and “exploratory testing”? Don’t they mean the same thing?
• 46. Tacit vs. Explicit Knowledge
• Explicit knowledge is knowledge that has been told.
• Tacit knowledge comes in three forms:
  – Relational (“weak”) tacit knowledge: knowledge, residing in a human mind, that could be told, but has not been for various reasons.
  – Somatic (“medium”) tacit knowledge: knowledge that comes from residing in a human body.
  – Collective (“strong”) tacit knowledge: knowledge that is embodied in society.
See Harry Collins, Tacit and Explicit Knowledge.

The Role of Tacit Knowledge
Although people tend to favour the explicit (why not? we can talk about it), much of what we do in testing is based on evolving tacit knowledge.
What are the elements of tacit knowledge in your testing? What parts can be made explicit? What parts cannot?
• 47. Conditional Probability
Reasoning about it is HARD.
[Diagram: a population split into people who have the disease and people who are fine, crossed with unhappy and happy test results. The quadrants: true positives (“Uh-oh.” / “Bad news.”), false negatives (“Yay! But… oops. Missed it.”), false positives (“No problemo… sorry we freaked you out.”), and true negatives (“Yay!”).]
Try considering the number of people in the overall population who have the disease BEFORE considering the test result.
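A worked version of the slide’s advice, counting the people in each box before looking at the test result. All of the rates here are assumptions chosen for illustration:

population = 10_000
prevalence = 0.01       # 1% of the population actually has the disease
sensitivity = 0.99      # P(unhappy test result | disease)
specificity = 0.95      # P(happy test result | fine)

sick = population * prevalence                 # 100 people
fine = population - sick                       # 9,900 people
true_positives = sick * sensitivity            # 99 people correctly flagged
false_positives = fine * (1 - specificity)     # 495 healthy people flagged

ppv = true_positives / (true_positives + false_positives)
print(f"P(disease | unhappy test result) = {ppv:.0%}")   # about 17%

Even with a 99%-sensitive test, five of every six people flagged are actually fine, because the fine population is so much larger. The same arithmetic applies to a suite of checks with a modest false-alarm rate: when real bugs are rare, most “failures” may be false alarms.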