One of my current responsibilities is to find ways to automate as much as is practical of the ‘testing’ of the user experience (UX) of complex web-based applications. In my view, full test automation of UX is impractical and probably unwise; however, we can use automation to find potential UX problems even in rich, complex applications.
I, and others, are working on ways to use automation to discover various types of these potential problems; here is an overview of some of the points I made. The work is ongoing and is likely to have demonstrated significant business benefits by the time of the EuroStar conference. In my experience, heuristics are useful in helping to identify potential issues, and various people have managed to create test automation that essentially automates those heuristics. All the examples I have provided are available as free open-source software (FOSS). I have learnt to value open source because it reduces the cost of experimentation and allows us to extend and modify the code, e.g. to add new heuristics, relatively easily (you still need to be able to write code; however, the code is freely and immediately available).
Automation is (often) necessary, but not sufficient
Automation and automated tests can be beguiling, and can paradoxically increase the chances of missing critical problems if we choose to rely mainly, or even solely, on the automated tests. Even with state-of-the-art automated tests (the best we can do across the industry), I still believe we need to ask additional questions about the software being tested. Sadly, in my experience, most automated tests are poorly designed and implemented, which increases the likelihood of problems eluding them.
'Pushing The Boundaries Of User Experience Test Automation' by Julian Harty
1. Pushing the Boundaries of User Experience Test Automation
@EuroStar 2011
JULIAN HARTY
15th October 2011
2. EuroStar 2011
What we want to achieve
To automatically detect (some) issues that may adversely affect the User Experience
3. EuroStar 2011
So what are the business problems?
• Quality-In-Use:
– UX, Cross-browser & Accessibility issues on live site
• Engineering Productivity:
– Time taken from feature-complete to live site
• Quality Engineering:
– Over-reliance on manual testing
– Existing test-automation under-performing
4. EuroStar 2011
And some UX Test Automation problems…
• Minimal existing work in Industry
– Existing tools tend to test static aspects
• Academic work appears to be unsuitable
• How to automatically measure and test UX at all!
5. EuroStar 2011
UX = User Experience
• Dynamic, based on using the system
• We focus on the Human + Computer Interactions
6. EuroStar 2011
Using Heuristics*
1. When I navigate by tabbing I should return to where I started in a reasonable number of key-presses (sketched below)
2. Flow should be consistent through web-forms
3. Look-alike & work-alike across web browsers
4. I should not be able to bypass the security of the site from links in the UI
* Fallible, but useful, guide/approach/practice
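To make the first heuristic concrete, here is a minimal sketch of how it might be automated with Selenium WebDriver. This is my illustration, not code from the talk; the URL and the limit of 100 key-presses are assumptions.

    import org.openqa.selenium.Keys;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class TabCycleCheck {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://www.example.com/"); // hypothetical page under test
                WebElement start = driver.switchTo().activeElement();
                final int limit = 100; // what counts as 'reasonable' is context-dependent
                int presses = 0;
                do {
                    // Send TAB to whatever currently has focus, moving focus onwards
                    driver.switchTo().activeElement().sendKeys(Keys.TAB);
                    presses++;
                } while (presses < limit && !start.equals(driver.switchTo().activeElement()));
                if (presses >= limit) {
                    System.out.println("Potential UX problem: focus did not cycle within " + limit + " TABs");
                } else {
                    System.out.println("Focus returned to the start after " + presses + " TABs");
                }
            } finally {
                driver.quit();
            }
        }
    }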
7. EuroStar 2011
Equivalence of input paths
• Mouse (clicks): New; Dropdown; Select; Send
• Keyboard (keystrokes): Tab, Space, Down-arrow, Enter, Down-arrow, Down-arrow, Down-arrow, Down-arrow, Tab, Enter
[Diagram: the same form completed via mouse clicks and via keystrokes; credit: Designing and Engineering Time by Steven Seow]
8. EuroStar 2011
A real example: Information for Speakers
Mouse: mouseover + click = 2 steps. Keyboard: 13 Tabs + 22 Tabs = 35 steps.
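This kind of measurement can itself be automated: count the key-presses needed to reach a given link. A hedged sketch follows; the link text comes from the slide's example, and the URL and the limit of 200 presses are assumptions.

    import org.openqa.selenium.Keys;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class TabDistanceCheck {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://www.eurostarconferences.com/"); // assumed URL
                final String target = "Information for Speakers"; // link text from the slide
                final int limit = 200; // give up after this many presses
                int presses = 0;
                WebElement focused;
                do {
                    driver.switchTo().activeElement().sendKeys(Keys.TAB);
                    presses++;
                    focused = driver.switchTo().activeElement();
                } while (presses < limit && !target.equals(focused.getText().trim()));
                System.out.println(target + " reached (or given up) after " + presses + " key-presses");
            } finally {
                driver.quit();
            }
        }
    }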
9. EuroStar 2011
Automated exploration and discovery*
• Automating heuristic tests
– Web accessibility testing
– Fighting Layout Bugs
– BiDi Checker
• Using crawling tools
– Crawljax
* We are also working on complementary interactive exploration and discovery
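As a taste of these tools, here is roughly how Fighting Layout Bugs is driven from a WebDriver session. This sketch is from memory of the project's README; class names and signatures may differ between versions, so check the project page for the current API.

    import com.googlecode.fightinglayoutbugs.FightingLayoutBugs;
    import com.googlecode.fightinglayoutbugs.LayoutBug;
    import com.googlecode.fightinglayoutbugs.WebPage;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    import java.util.Collection;

    public class LayoutBugCheck {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://www.example.com/"); // hypothetical page under test
                FightingLayoutBugs flb = new FightingLayoutBugs();
                // WebPage wraps the driver; some versions accept the driver directly
                Collection<LayoutBug> bugs = flb.findLayoutBugsIn(new WebPage(driver));
                for (LayoutBug bug : bugs) {
                    System.out.println(bug); // each finding includes a screenshot of the problem
                }
            } finally {
                driver.quit();
            }
        }
    }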
10. EuroStar 2011
Man & Machine
• Fully automated execution, human inspection
• Interactive testing, aided by tools
14. EuroStar 2011
And what about Static Analysis?
• Don't overlook the value of static analysis (but don't rely on it either…)
• Find ways to filter the reported results
• We've created automated Accessibility Tests, which check the contents of pages using WCAG guidelines.
15. EuroStar 2011
Problems detected:
    {"testName": "checkMouseEventHaveKeyboardEvent",
     "description": "Check that elements that have mouse actions also have keyboard actions.",
     "severity": "error",
     "elementCode": "<a href=\"#\" onclick=\"window.open('/Content/Terms-and-conditions.aspx', 'termsAndConditions', 'width=1000,height=600,resizable=yes,toolbars=0,menubar=no,scrollbars=yes,status=no'); return false;\">Click here to read terms and conditions</a>"}

    "checkTitleIsNotEmpty" (reported for the entire page)

    {"testName": "checkAltTextOnImage",
     "description": "Check that visible images have alt text",
     "severity": "error",
     "elementCode": "<img src=\"/themes/Eurostar/images/ico-linkedin.png\" />"}
16. EuroStar 2011
Now what?
“Sure, Accessibility is important: file it with the rest of the things we should do...”
What should be done seldom gets done...
17. EuroStar 2011
3 Improvements for the price of 1
• Accessibility + Testability
– Both need to interpret the contents in the browser
– Improving one often improves the other
• Accessibility + SEO*
– Both find value in descriptive labels
[*] SEO: Search Engine Optimization
[Example: the same product photo with three alternatives: alt="" (empty), alt="picture" (generic), and alt="Mens Black Levi 501 Jeans 32" Waist / 32" Leg Excellent Condition" (descriptive)]
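A hedged sketch of a check along these lines, flagging both missing and unhelpfully generic alt text. This is my illustration; the list of generic words is an assumption, and empty alt text is legitimate for purely decorative images, so these findings need human review.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;

    import java.util.Arrays;
    import java.util.List;

    public class AltTextCheck {
        // Illustrative list of alt texts that help neither users nor search engines
        private static final List<String> GENERIC = Arrays.asList("", "picture", "image", "photo");

        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://www.example.com/"); // hypothetical page under test
                for (WebElement img : driver.findElements(By.tagName("img"))) {
                    String alt = img.getAttribute("alt");
                    if (alt == null) {
                        System.out.println("error: missing alt attribute: " + img.getAttribute("src"));
                    } else if (GENERIC.contains(alt.trim().toLowerCase())) {
                        // alt="" may be deliberate for decorative images: warn, don't fail
                        System.out.println("warning: unhelpful alt text '" + alt + "': "
                                + img.getAttribute("src"));
                    }
                }
            } finally {
                driver.quit();
            }
        }
    }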
18. EuroStar 2011
Happiness
• Improved Accessibility: Users are happier
• Improved Testability: We're happier
• Improved SEO: Business is happier
19. EuroStar 2011
Beware of (over-)trusting test automation
• Automation as servant, not master, of our
software delivery
– Inaccurate results
– “Beware of Automation Bias” by M.L. Cummings [1]
• Automated tests and checks miss more than
they find
• Make behavior easier to assess
– Screenshots and contents of DOM to verify after tests ran
– Automated video recordings
[1] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.2634&rep=rep1&type=pdf
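One way to make behaviour easier to assess is to save evidence after each test for later human review. A minimal sketch, assuming Apache Commons IO is on the classpath; the evidence/ directory name is an assumption.

    import org.apache.commons.io.FileUtils;
    import org.openqa.selenium.OutputType;
    import org.openqa.selenium.TakesScreenshot;
    import org.openqa.selenium.WebDriver;

    import java.io.File;
    import java.io.IOException;

    public class EvidenceCapture {
        // Save a screenshot and the page's DOM so a human can inspect them after the run
        public static void capture(WebDriver driver, String testName) throws IOException {
            File screenshot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            FileUtils.copyFile(screenshot, new File("evidence", testName + ".png"));
            FileUtils.writeStringToFile(new File("evidence", testName + ".html"),
                    driver.getPageSource(), "UTF-8");
        }
    }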
22. EuroStar 2011
(Some of the) things our automated tests may miss…
• Promotions
• Navigation history
• Layout, rendering & formatting problems
• JavaScript
• CSS and HTML
23. EuroStar 2011
Beware of poor-quality automated 'tests'
• AssertTrue(true);
• No/Inadequate/Too many Asserts
• Poor code design
• Falsehoods (False positives / negatives)
• Focus on:
– Improving the quality of your automated tests
– Finding ways to improve the quality, & testability, of
the code being tested
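To illustrate the first two bullets, compare a vacuous check with one that can actually fail. This is my example, written as a JUnit test:

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class AssertQualityExample {
        @Test
        public void vacuous() {
            assertTrue(true); // always passes, so it can never find a problem
        }

        @Test
        public void meaningful() {
            String title = "EuroStar 2011"; // in a real test: driver.getTitle()
            assertEquals("EuroStar 2011", title); // fails when the behaviour changes
        }
    }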
24. EuroStar 2011
Increasing the signal-to-noise of test results
• Suppress unwanted 'warnings'
– cf. static analysis tools
• Increase potency of tests
• Consider dumping ineffective tests
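For example, suppressions can live in one explicit, reviewed list rather than being scattered through test code. A minimal sketch; the key format and the entry below are hypothetical.

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    public class ResultFilter {
        // Findings we have reviewed and accepted; kept in one place so that
        // suppressions are visible and deliberate, not buried in test code
        private static final Set<String> SUPPRESSED = new HashSet<String>(Arrays.asList(
                "checkTitleIsNotEmpty:/legacy-page.aspx")); // hypothetical entry

        public static boolean shouldReport(String testName, String page) {
            return !SUPPRESSED.contains(testName + ":" + page);
        }
    }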
25. EuroStar 2011
Further reading and research
The open-source project
http://code.google.com/p/web-accessibility-testing
Finding Usability Bugs with Automated Tests
http://queue.acm.org/detail.cfm?id=1925091
Fighting Layout Bugs
http://code.google.com/p/fighting-layout-bugs/
Experiences Using Static Analysis to Find Bugs
http://www.google.com/research/pubs/pub34339.html
My blog
http://blog.bettersoftwaretesting.com/
“Beware of Automation Bias” by M.L. Cummings
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.2634&rep=rep1&type=pdf
Designing and Engineering Time by Steven Seow
ISBN 978-0-321-50918-5