You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Michael Bolton introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, ensures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Michael to see how rapid testing focuses on both the mindset and the skill set of the individual tester, who uses tight loops of exploration and critical thinking to help continuously re-optimize testing to match clients' needs and expectations.
A Rapid Introduction to Rapid Software Testing
Full-day Tutorial
4/29/13 8:30AM
A Rapid Introduction to Rapid Software Testing
Presented by:
Michael Bolton
DevelopSense
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
2. Michael Bolton
Tester, consultant, and trainer Michael Bolton is the coauthor (with James Bach) of Rapid Software
Testing, a course that presents a methodology and mindset for testing software expertly in uncertain
conditions and under extreme time pressure. Michael is a leader in the context-driven software testing
movement with twenty years of experience testing, developing, managing, and writing about software.
Currently, he leads DevelopSense, a Toronto-based consultancy. Prior to DevelopSense, he was with
Quarterdeck Corporation, where he managed the company's flagship products and directed project and
testing teams, both in-house and worldwide. Contact Michael at michael@developsense.com.
4. Assumptions About You
• You test software, or any other complex human
creation.
• You have at least some control over the design of your
tests and some time to create new tests.
• You are worried that your test process is spending too
much time and resources on things that aren’t
important.
• You test under uncertainty and time pressure.
• Your major goal is to find important problems quickly.
• You want to get very good at (software) testing.
A Question
What makes testing harder or slower?
5. Premises of Rapid Testing
1. Software projects and products are relationships
between people.
2. Each project occurs under conditions of uncertainty
and time pressure.
3. Despite our best hopes and intentions, some
degree of inexperience, carelessness, and
incompetence is normal.
4. A test is an activity; it is performance, not artifacts.
Premises of Rapid Testing
5. Testing’s purpose is to discover the status of the
product and any threats to its value, so that our
clients can make informed decisions about it.
6. We commit to performing credible, cost‐effective
testing, and we will inform our clients of anything that
threatens that commitment.
7. We will not knowingly or negligently mislead our
clients and colleagues or ourselves.
8. Testers accept responsibility for the quality of their
work, although they cannot control the quality of the
product.
7. Excellent Rapid Technical Work
Begins with You
When the ball comes to you…
Do you know you have the ball?
Can you receive the pass?
Do you know your options?
Do you know what your
role and mission is?
Do you know where
your teammates are?
Is your equipment ready?
Can you read the
situation on the field?
Can you let your teammates help you?
Are you aware of the criticality of the situation?
Are you ready to act, right now?
…but you don’t have to be great at everything.
• Rapid test teams are about diverse talents cooperating
• We call this the elliptical team, as opposed to the team of
perfect circles.
• Some important dimensions to vary:
• Technical skill
• Domain expertise
• Temperament (e.g. introvert vs. extrovert)
• Testing experience
• Project experience
• Industry experience
• Product knowledge
• Educational background
• Writing skill
• Diversity makes exploration far more powerful
• Your team is powerful because of your unique contribution
8. What It Means To Test Rapidly
• Since testing is about finding a potentially infinite
number of problems in an infinite space in a finite
amount of time, we must…
• understand our mission and obstacles to fulfilling it
• know how to recognize problems quickly
• model the product and the test space to know where to look
for problems
• prefer inexpensive, lightweight, effective tools
• reduce dependence on expensive, time‐consuming artifacts,
while getting value from the ones we’ve got
• do nothing that wastes time or effort
• tell a credible story about all that
One Big Problem in Testing
Formality Bloat
• Much of the time, your testing doesn’t need to be very formal*
• Even when your testing does need to be formal, you'll need to
do substantial amounts of informal testing in order to figure out
how to do excellent formal testing.
• Who says? The FDA. See http://www.satisfice.com/blog/archives/602
• Even in a highly regulated environment, you do formal testing
primarily for the auditors. You do informal testing to make sure
you don't lose money, blow things up, or kill people.
* Formal testing means testing that must be done to verify a specific fact,
or that must be done in a specific way.
9. EXERCISE
Test the Famous Triangle
What is testing?
Serving Your Client
If you don’t have an understanding and an agreement
on what is the mission of your testing, then doing it
“rapidly” would be pointless.
11. How Do We Recognize Problems?
An oracle is… a way to recognize a problem.
Learn About Heuristics
Heuristics are fallible, “fast and frugal” methods of solving
problems, making decisions, or accomplishing tasks.
“The engineering method is
the use of heuristics
to cause the best change
in a poorly understood situation
within the available resources.”
Billy Vaughan Koen
Discussion of the Method
17. Purpose
When a product is inconsistent with its designers’ explicit
or implicit purposes, we suspect a problem.
Product
When a product is inconsistent internally—as when it
contradicts itself—we suspect a problem.
18. Statutes and Standards
When a product is inconsistent with laws or widely
accepted standards, we suspect a problem.
Consistency (“this agrees with that”)
an important theme in oracle principles
• Familiarity: The system is not consistent with the pattern of any familiar problem.
• Explainability: The system is consistent with our ability to describe it clearly.
• World: The system is consistent with things that we recognize in the world.
• History: The present version of the system is consistent with past versions of it.
• Image: The system is consistent with an image that the organization wants to project.
• Comparable Products: The system is consistent with comparable systems.
• Claims: The system is consistent with what important people say it's supposed to be.
• Users' Expectations: The system is consistent with what users want.
• Product: Each element of the system is consistent with comparable elements in the same system.
• Purpose: The system is consistent with its purposes, both explicit and implicit.
• Standards and Statutes: The system is consistent with applicable laws, or relevant implicit or explicit standards.
Consistency heuristics rely on the quality of your
models of the product and its context.
19. All Oracles Are Heuristic
An oracle doesn’t tell you that there IS a problem.
An oracle tells you that you might be seeing a problem.
An oracle can alert you to a possible problem,
but an oracle cannot tell you that there is no problem.
Consistency heuristics rely on the quality of
your models of the product and its context.
Rely solely on documented, anticipated sources of oracles,
and your testing will likely be slower and weaker.
Train your mind to recognize patterns of oracles
and your testing will likely be faster
and your ability to spot problems will be sharper.
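As a concrete sketch of an oracle as a fallible heuristic, consider a hypothetical title-casing function checked against Python's built-in str.title() as a comparable product. Both the function under test and the check are illustrative assumptions, not part of the course materials:

```python
def title_under_test(s):
    """Hypothetical function under test: capitalize each space-separated word."""
    return " ".join(word.capitalize() for word in s.split())

def comparable_product_oracle(s):
    """Comparable Products oracle: flag a *possible* problem on disagreement.

    A False result means "inconsistent with a comparable product" -- worth
    investigating, but not proof of a bug; the comparable product itself
    may be the thing that's wrong.
    """
    return title_under_test(s) == s.title()

print(comparable_product_oracle("rapid software testing"))  # True: consistent
print(comparable_product_oracle("it's a test"))             # False: str.title() yields "It'S A Test"
```

Here the disagreement points at a quirk of str.title() (it capitalizes the letter after an apostrophe), not a defect in the function under test, which is exactly why an oracle can only suggest that you might be seeing a problem.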
General Examples of Oracles
things that suggest "problem" or "no problem"
• People
• A person whose opinion matters.
• An opinion held by a person who matters.
• A disagreement among people who matter.
• Mechanisms
• A reference document with useful information.
• A known good example output.
• A known bad example output.
• A process or tool by which the output is checked.
• A process or tool that helps a tester identify patterns.
• Feelings
• A feeling like confusion or annoyance.
• Principles
• A desirable consistency between related things.
20. Oracles from the Inside Out
[Diagram: a quadrant model of oracle sources, running from tacit to explicit and from the tester to other people: Experience (your feelings and mental models), Inference (observable consistencies), Conference (stakeholders' feelings and mental models), and Reference (shared artifacts such as specs and tools).]
Oracle Cost and Value
• Some oracles are more authoritative
• but less responsive to change
• Some oracles are more consistent
• but maybe not up to date
• Some oracles are more immediate
• but less reliable
• Some oracles are more precise
• but the precision may be misleading
• Some oracles are more accurate
• but less precise
• Some oracles are more available
• but less authoritative
• Some oracles are easier to interpret
• but more narrowly focused
21. Feelings As Heuristic Triggers For Oracles
• An emotional reaction is a trigger to attention
and learning
• Without emotion, we don’t reason well
• See Damasio, The Feeling of What Happens
• When you find yourself mildly concerned about
something, someone else could be very
concerned about it
• Observe emotions to help overcome your
biases and to evaluate significance
An emotion is a signal; consider looking into it
All Oracles Are Heuristic
• We often do not have oracles that establish a definite correct or incorrect
result, in advance. Oracles may reveal themselves to us on the fly, or later.
That’s why we use abductive inference.
• No single oracle can tell us whether a program (or a feature) is working
correctly at all times and in all circumstances.
That’s why we use a variety of oracles.
• Any program that looks like it’s working, to you, may in fact be failing in
some way that happens to fool all of your oracles. That’s why we proceed
with humility and critical thinking.
• We never know when a test is finished.
That’s why we try to maintain uncertainty when everyone else on the
project is sure.
• You (the tester) can’t know the deep truth about any result.
That’s why we report whatever seems likely to be a bug.
22. Oracles are Not Perfect
And Testers are Not Judges
• You don’t need to know for sure if something is a bug;
it’s not your job to decide if something is a bug; it’s your
job to decide if it’s worth reporting.
• You do need to form a justified belief that it MIGHT be
a threat to product value in the opinion of someone
who matters.
• And you must be able to say why you think so; you must
be able to cite good oracles… or you will lose credibility.
MIP’ing vs. Black Flagging
Coping With Difficult Oracle Problems
• Ignore the Problem
• Ask “so what?” Maybe the value of the information doesn’t justify the cost.
• Simplify the Problem
• Ask for testability. It usually doesn’t happen by accident.
• Built‐in oracle. Internal error detection and handling.
• Lower the standards. You may be using an unreasonable standard of correctness.
• Shift the Problem
• Parallel testing. Compare with another instance of a comparable algorithm.
• Live oracle. Find an expert who can tell if the output is correct.
• Reverse the function. (e.g. 2 x 2 = 4, then 4/2 = 2)
• Divide and Conquer the Problem
• Spot check. Perform a detailed inspection on one instance out of a set of outputs.
• Blink test. Compare or review overwhelming batches of data for patterns that stand out.
• Easy input. Use input for which the output is easy to analyze.
• Easy output. Some output may be obviously wrong, regardless of input.
• Unit test first. Learn about the pieces that make the whole.
• Test incrementally. Learn about the product by testing over a period of time.
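Two of the "shift the problem" tactics above can be sketched in a few lines. The square-root routine here is a hypothetical stand-in for whatever function is hard to check directly:

```python
import math

def sqrt_under_test(x):
    """Hypothetical function under test."""
    return x ** 0.5

def reverse_the_function(x, tol=1e-9):
    """Reverse the function: square the result and compare with the input."""
    y = sqrt_under_test(x)
    return abs(y * y - x) <= tol * max(1.0, abs(x))

def parallel_test(x, tol=1e-9):
    """Parallel testing: compare with a comparable implementation."""
    return abs(sqrt_under_test(x) - math.sqrt(x)) <= tol

print(all(reverse_the_function(x) and parallel_test(x)
          for x in (0.0, 2.0, 144.0, 1e12)))  # True
```

Neither check proves the function correct; each gives an inexpensive way to notice an inconsistency worth investigating.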
23. “Easy Input”
• Fixed Markers. Use distinctive fixed input patterns that are easy to
spot in the output.
• Statistical Markers. Use populations of data that have
distinguishable statistical properties.
• Self‐Referential Data. Use data that embeds metadata about itself.
(e.g. counterstrings)
• Easy Input Regions. For specific inputs, the correct output may be
easy to calculate.
• Outrageous Values. For some inputs, we expect error handling.
• Idempotent Input. Try a case where the output will be the same as
the input.
• Match. Do the “same thing” twice and look for a match.
• Progressive Mismatch. Do progressively differing things over time
and account for each difference. (code‐breaking technique)
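The "self-referential data" idea can be sketched with a minimal counterstring generator, in the spirit of James Bach's counterstring tools; this particular implementation is an illustrative sketch, not the course's own code:

```python
def counterstring(length, marker="*"):
    """Build a string in which the digits before each marker give that
    marker's 1-based position, so any truncation point is self-evident."""
    pieces = []
    remaining = length
    while remaining > 0:
        piece = f"{remaining}{marker}"
        if len(piece) > remaining:
            piece = piece[-remaining:]  # trim from the left to fit exactly
        pieces.append(piece)
        remaining -= len(piece)
    return "".join(reversed(pieces))

print(counterstring(10))         # *3*5*7*10*
print(len(counterstring(2000)))  # 2000
```

Paste such a string into a length-limited field: the digits just before the last visible marker tell you exactly how many characters survived.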
Oracles Are Linked To Threats
To Quality Criteria
Capability
Scalability
Reliability
Compatibility
Usability
Performance
Charisma
Installability
Security
Development
Any inconsistency may represent diminished value.
Many test approaches focus on capability (functionality)
and underemphasize the other criteria.
28. To test a very simple product meticulously,
part of a complex product meticulously,
or to maximize test integrity…
1. Start the test from a known (clean) state.
2. Prefer simple, deterministic actions.
3. Trace test steps to a specified model.
4. Follow established and consistent lab procedures.
5. Make specific predictions, observations and records.
6. Make it easy to reproduce (automation may help).
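The six steps above can be sketched as a scripted check, assuming a trivial, hypothetical shopping-cart class as the product under test:

```python
class Cart:
    """Hypothetical product under test."""
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

def test_add_single_item():
    cart = Cart()                   # 1. start from a known (clean) state
    cart.add("apple")               # 2. a simple, deterministic action
    assert cart.items == ["apple"]  # 5. a specific prediction and observation

test_add_single_item()  # deterministic, so trivially reproducible (step 6)
print("passed")
```

Steps 3 and 4 (tracing to a model, consistent lab procedure) live outside the code, in how the check is chosen, named, and run.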
General Focusing Heuristics
• use test‐first approach or unit testing for better code
coverage
• work from prepared test coverage outlines and risk lists
• use diagrams, state models, and the like, and cover them
• apply specific test techniques to address particular coverage
areas
• make careful observations and match to expectations
To do this more rapidly, make preparation and artifacts fast and frugal:
leverage existing materials and avoid repeating yourself.
Emphasize doing; relax planning. You’ll make discoveries along the way!
29. To find unexpected problems,
elusive problems that occur in sustained field use,
or more problems quickly in a complex product…
That’s a PowerPoint bug!
1. Start from different states (not necessarily clean).
2. Prefer complex, challenging actions.
3. Generate tests from a variety of models.
4. Question your lab procedures and tools.
5. Try to see everything with open expectations.
6. Make the test hard to pass, instead of easy to reproduce.
General Defocusing Heuristics
• diversify your models; intentional coverage in one area can lead
to unintentional coverage in other areas—this is a Good Thing
• diversify your test techniques
• be alert to problems other than the ones that you’re actively
looking for
• welcome and embrace productive distraction
• do some testing that is not oriented towards a specific risk
• use high‐volume, randomized automated tests
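The last heuristic can be sketched as a small high-volume randomized test harness; the function and oracle fed to it here are illustrative placeholders:

```python
import random

def run_high_volume_checks(fn, oracle, trials=10_000, seed=1):
    """Feed many randomized inputs to fn; collect inputs the oracle flags."""
    rng = random.Random(seed)  # a fixed seed keeps any failures reproducible
    failures = []
    for _ in range(trials):
        x = rng.uniform(-1e6, 1e6)
        if not oracle(x, fn(x)):
            failures.append(x)
    return failures

# Example: abs() should never return a negative, and should preserve magnitude.
failures = run_high_volume_checks(abs, lambda x, y: y >= 0 and y == abs(x))
print(len(failures))  # 0
```

The point is breadth, not depth: thousands of cheap, varied inputs against a weak oracle can surface problems no handpicked case would.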
30. DISCUSSION
How Many Test Cases?
What About Quantifying Coverage Overall?
• A nice idea, but we don’t know how to do it in a way that is consistent with basic measurement theory.
• If we describe coverage by counting test cases, we’re committing reification error.
• If we use percentages to quantify coverage, we need to establish what 100% looks like.
• But we might do that with respect to some specific models.
• Complex systems may display emergent behaviour.
31. Extent of Coverage
• Smoke and sanity
• Can this thing even be tested at all?
• Common, core, and critical
• Can this thing do the things it must do?
• Does it handle happy paths and regular input?
• Can it work?
• Complex, harsh, extreme and exceptional
• Will this thing handle challenging tests, complex data flows,
and malformed input, etc.?
• Will it work?
How Might We Organize,
Record, and Report Coverage?
• automated tools (e.g. profilers, coverage tools)
• annotated diagrams and mind maps
• coverage matrices
• bug taxonomies
• Michael Hunter’s You Are Not Done Yet list
• James Bach’s Heuristic Test Strategy Model
• described at www.satisfice.com
• articles about it at www.developsense.com
• Mike Kelly’s MCOASTER model
• product coverage outlines and risk lists
• session-based test management
• http://www.satisfice.com/sbtm
See three articles here:
http://www.developsense.com/publications.html#coverage
34. Visualizing Test Progress
See “A Sticky Situation”, Better Software, February 2012
What IS Exploratory Testing?
• Simultaneous test design, test
execution, and learning.
• James Bach, 1995
But maybe it would be a good idea to underscore
why that’s important…
38. Questions About Exploration…
[Diagram: arrows and cycles connecting these questions]
• Where does exploration come from?
• What happens when the unexpected happens during exploration?
• What do we do with what we learn?
• Will everyone explore the same way? (value seeking)
Exploration is Not Just Action
40. What Exploratory Testing Is Not
• Touring
• http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-1-touring/
• After-Everything-Else Testing
• http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-2-after-everything-else-testing/
• Tool-Free Testing
• http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-3-tool-free-testing/
• Quick Tests
• http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-4-quick-tests/
• Undocumented Testing
• http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-5-undocumented-testing/
• "Experienced-Based" Testing
• http://www.satisfice.com/blog/archives/664
• defined by any specific example of exploratory testing
• http://www.satisfice.com/blog/archives/678
Exploratory Testing
The way we practice and teach it, exploratory testing…
• IS NOT “random testing” (or sloppy, or slapdash testing)
• IS NOT “unstructured testing”
• IS NOT procedurally structured
• IS NOT unteachable
• IS NOT unmanageable
• IS NOT scripted
• IS NOT a technique
• IS “ad hoc”, in the dictionary sense, “to the purpose”
• IS structured and rigorous
• IS cognitively structured
• IS highly teachable
• IS highly manageable
• IS chartered
• IS an approach
43. What does “taking advantage of resources” mean?
• Mission
• The problem we are here to solve for our customer.
• Information
• Information about the product or project that is needed for testing.
• Developer relations
• How you get along with the programmers.
• Team
• Anyone who will perform or support testing.
• Equipment & tools
• Hardware, software, or documents required to administer testing.
• Schedule
• The sequence, duration, and synchronization of project events.
• Test Items
• The product to be tested.
• Deliverables
• The observable products of the test project.
“Ways to test…”?
General Test Techniques
• Function testing
• Domain testing
• Stress testing
• Flow testing
• Scenario testing
• Claims testing
• User testing
• Risk testing
• Automatic checking