This document summarizes tips for conducting user experience surveys. It discusses balancing organizational goals with user needs when defining survey aims. Interviews should be conducted before surveys to understand user goals and experiences. Questions should focus on recent, vivid experiences that users can easily answer rather than trying to ask about every topic. Testing should be built into the survey process from development to deployment and analysis. The ideal survey involves both a well-designed questionnaire and a process that gets useful insights through various stages of testing.
2. Many thanks to the organisers, volunteers, and sponsors of UCD 2012, London
Supporters, sponsors, and organiser
3. Surveys: your views on these statements
A. "It's when someone says, 'Can't I just send out a survey and collect the data?' that I start to shake."
– Indi Young
http://rosenfeldmedia.com/books/mental-models/blog/oxymoron_scientific_survey/
B. "Online surveys are a great option for business owners who would like to conduct their own research."
– Smart Survey
http://www.smart-survey.co.uk/articles/10-advantages-of-online-surveys/
5. Three ways UX people encounter surveys
1. Post-test / post-task surveys, e.g. SUS
2. Someone is going to do a survey anyway
3. Triangulating between survey data and data from elsewhere
Image credit: infodesign.com.au
6. Agenda
1. Post-test / post-task surveys
2. Someone is going to do a survey anyway
3. Triangulating between survey data and data from elsewhere
7. Presser et al 2004: pretesting focuses on a "broader concern for improving data quality so that measurements meet a survey's objective"
• Cognitive interviewing focuses on the questions
• Usability testing focuses on the interaction
• Field testing focuses on the mechanics and procedures
http://www.slideshare.net/cjforms/introduction-to-usability-testing-for-survey-research
9. Try some cognitive interviewing
• Pair up. One person gets to be the interviewer.
• Non-interviewer: wait for your instruction.
• Interviewer: ask your pair to think aloud while answering this question. Take notes.
'How many windows are there in your house?'
(Dillman et al, 2009)
11. OK, now swap and try this question
• Please think about a computer system or web site that you used recently. Now think aloud as you answer this question:
(from the SUS, the System Usability Scale, Brooke 1986)
15. We use post-test questionnaires for comparisons
• One iteration with another
• Products with each other
• This product with an ideal
(Our aims in doing a survey)
16. Tullis and Stetson found that SUS was the best questionnaire for comparisons
Tullis, T. S. and Stetson, J. N. (2004). A Comparison of Questionnaires for Assessing Website Usability. UPA 2004 Conference.
http://www.upassoc.org/usability_resources/conference/2004/UPA-2004-TullisStetson.pdf
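For teams using SUS scores to compare one iteration with another, the standard Brooke (1986) scoring can be sketched as below. The function name and list-of-ratings input format are illustrative assumptions; the scoring rule itself (odd items contribute rating − 1, even items 5 − rating, sum scaled by 2.5) is the published one.

```python
# Sketch of SUS scoring (Brooke, 1986). Ten items rated 1-5;
# odd-numbered items contribute (rating - 1), even-numbered items
# contribute (5 - rating); the total is scaled by 2.5 to 0-100.
def sus_score(responses):
    """responses: list of ten ratings (1-5), in item order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly ten ratings between 1 and 5")
    total = 0
    for i, rating in enumerate(responses):
        # i is 0-based, so even i corresponds to odd-numbered items
        total += (rating - 1) if i % 2 == 0 else (5 - rating)
    return total * 2.5
```

A uniform "neutral" response of all 3s comes out at 50, which is one reason SUS scores are best read comparatively rather than as absolute grades.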
20. A sadly uninformative "survey" process
"Voice of the customer": Some questions → Send after each transaction → Reward or punish staff → (notice the big gap) → Insight
21. Tip: Ask a sample, not everyone
("Make me feel special")
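One way to act on "ask a sample, not everyone" is a simple random draw from the customer list instead of surveying every transaction. This is a minimal sketch; the population, sample size, and function name are made-up illustration values.

```python
import random

# Illustrative sketch: invite a simple random sample of customers
# rather than surveying everyone after every transaction.
def pick_invitees(customer_ids, sample_size, seed=None):
    rng = random.Random(seed)  # fixed seed gives a reproducible draw
    return rng.sample(customer_ids, sample_size)

# e.g. invite 200 of 10,000 customers
invitees = pick_invitees(list(range(10000)), 200, seed=42)
```

Sampling without replacement (`rng.sample`) guarantees nobody is invited twice, which helps with the "make me feel special" point: each invitee is one of a few, not one of everyone.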
22. A typical survey process, somewhat better
"Let's do a survey" → Some questions → Send and hope → Add luck → Insight
23. Probably best to be realistic and bring in the boss here
Our aims in doing a survey sit between what the organisation wants to achieve and what the user wants to do.
24. A better survey process
• Goals: "Let's do a survey" → establish your goals for the survey → some questions become questions you need answers to
• Users: interview users about the topics in your survey → questions users can answer
• Build: final version of questions; build the questionnaire → questionnaire
• Deploy: run the survey, from approach to follow-up → data
• Analyse: extract useful ideas; share with others → insight
25. We've seen this bit a few moments ago
(process diagram repeated, highlighting the Goals stage)
26. The questions you need depend on your organisational and UX goals
(process diagram repeated)
27. Goals come into the definition of usability
"The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO 9241-11:1998)
This assumes that we agree on the goals.
28. We have lots of ways of defining user experience
29. But let's carry on with the standards theme
2.15 user experience
person's perceptions and responses resulting from the use and/or anticipated use of a product, system or service
NOTE 1 User experience includes all the users' emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours and accomplishments that occur before, during and after use.
NOTE 2 User experience is a consequence of brand image, presentation, functionality, system performance, interactive behaviour and assistive capabilities of the interactive system, the user's internal and physical state resulting from prior experiences, attitudes, skills and personality, and the context of use.
NOTE 3 Usability, when interpreted from the perspective of the users' personal goals, can include the kind of perceptual and emotional aspects typically associated with user experience. Usability criteria can be used to assess aspects of user experience.
(ISO 9241-210)
30. Before ISO 9241-210 came along, user experience was the satisfaction bit of usability
"The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO 9241-11:1998)
Can I choose the goals?
32. Agenda
1. Post-test / post-task surveys
2. Someone is going to do a survey anyway
3. Triangulating between survey data and data from elsewhere
33. I think this question is trying to ask about satisfaction
34. If you're going to ask about satisfaction, what endpoints would you use on a scale?
A __________ Z
35. Workshop participants came up with these, mostly about recalled emotion
Satisfied – Dissatisfied
Extremely satisfied – Extremely dissatisfied
Love – Hate
Interesting – Boring
Fun – Dull
Enjoyable – Unpleasant
Made me feel good – Made me feel bad
Easy – Confusing
Enjoy – Not enjoy
Delighted – Disappointed
Friendly – Scary
(Workshop results)
36. And these, mostly about whether the experience was successful or not
Fast – Slow
Effortless – Painful
I could do what I came to do – I couldn't do what I came to do
Success – Failure
Rewarding – Frustrating
(Workshop results)
37. And these, about predictions of future behaviour
Would come back – Wouldn't come back
Would post kudos – Would post complaints
(Workshop results)
38. Here are some scales I thought of, ahead of the workshop
Disappointed – Thrilled
Something missing – Something extra
Miserable – Happy
Below par – Above par
Unfair – Privilege
39. But maybe the same level of satisfaction generates different points on each scale
Disappointed – Thrilled
Something missing – Something extra
Miserable – Happy
Below par – Above par
Unfair – Privilege
40. Satisfaction reflects different emotions depending on level of engagement
On an axis from negative to positive emotion:
• Engaged: satisfaction here = "delight"
• Indifferent: satisfaction here = "pleasant"
Adapted from Oliver, R. L. (1996), "Satisfaction: A Behavioral Perspective on the Consumer"
42. Satisfaction requires comparison of an experience to something else
Compared experience to what?
• (nothing)
• Expectations
• Needs
• Excellence (the ideal product)
• Fairness
• Events that might have been
Adapted from Oliver, R. L. (1996), "Satisfaction: A Behavioral Perspective on the Consumer"
43. And the resulting thoughts differ accordingly
Compared experience to what?     Resulting thoughts
(nothing)                        Indifference
Expectations                     Better / worse / different
Needs                            Met / not met / mixture
Excellence (the ideal product)   Good / poor quality (or 'good enough')
Fairness                         Treated equitably / inequitably
Events that might have been      Vindication / regret
Adapted from Oliver, R. L. (1996), "Satisfaction: A Behavioral Perspective on the Consumer"
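A survey team could use Oliver's taxonomy as a coding frame when tagging open-ended answers. A minimal sketch of the table above as a lookup (the dictionary name and key wording are illustrative, not from Oliver):

```python
# Oliver's (1996) comparison standards, expressed as a coding frame:
# which standard a respondent compared against -> the kind of thought
# their answer is likely to express.
COMPARISON_STANDARDS = {
    "nothing": "indifference",
    "expectations": "better / worse / different",
    "needs": "met / not met / mixture",
    "excellence": "good / poor quality (or 'good enough')",
    "fairness": "treated equitably / inequitably",
    "events that might have been": "vindication / regret",
}

def resulting_thoughts(standard):
    """Look up the expected kind of thought for a comparison standard."""
    return COMPARISON_STANDARDS[standard.lower()]
```

Making the frame explicit like this forces the team to decide, before writing questions, which comparison standard the survey is actually asking about.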
44. Example: bronze medal winners tend to be happier than silver medal winners
Nathan Twaddle, Olympic Bronze Medal Winner in Beijing
Matsumoto, D. & Willingham, B. (2006). The thrill of victory and the agony of defeat: spontaneous expressions of medal winners of the 2004 Athens Olympic Games.
Photo credit: peter.cipollone, Flickr
45. Not all experiences are equal
• Winning an Olympic medal – major life event
• Watching an event from the 2012 Olympics on TV – occasional, salient
• Watching the TV news on a slow day – unremarkable, repetitive
News images from cnn.com
46. The approximate curve of forgetting
Quality of data (high to low) against time since event (recent to long ago): memories of a major life event stay usable for a long time; occasional, salient events fade more slowly than unremarkable, repetitive ones.
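The curve on the slide can be approximated with the classic exponential forgetting model, R = exp(−t / S), where recall quality R falls with time t and falls more slowly when memory strength S is high. The strength values below are illustrative assumptions, not figures from the talk.

```python
import math

# Rough sketch of the "curve of forgetting": exponential decay of
# recall quality, with memory strength S depending on how salient
# the event was. Strength values are made-up illustrations.
def recall_quality(days_since_event, strength):
    """R = exp(-t / S): 1.0 right after the event, decaying toward 0."""
    return math.exp(-days_since_event / strength)

STRENGTH = {
    "major life event": 365.0,        # stays vivid for a long time
    "occasional, salient": 30.0,      # fades over weeks
    "unremarkable, repetitive": 2.0,  # gone within days
}
```

The practical reading matches the next slide's tip: a question about last week's unremarkable transaction is already near the bottom of the curve, while a question about a recent, vivid experience is still near the top.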
47. Tip: Ask about recent, vivid experience
Image credit: Fraser Smith
48. Agenda
1. Post-test / post-task surveys
2. Someone is going to do a survey anyway
3. Triangulating between survey data and data from elsewhere
49. Memorable experiences are also complex
• Think about the experience of attending this conference
– What did you expect to happen?
– What did you need to happen?
– What would the ideal experience have been?
– How did you expect to be treated compared to other people at the event?
– If you hadn't come here, what else might have happened?
50. The exercise revealed quite a few different perspectives on the conference
• These questions were quite easy: participants had thought about these topics
– What did you expect to happen?
– What did you need to happen?
– What would the ideal experience have been?
• These questions were harder, but gave fresh perspectives
– How did you expect to be treated compared to other people at the event?
– If you hadn't come here, what else might have happened?
(Workshop results)
51. The challenge of UX and surveys: which bit to measure?
"The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO 9241-11:1998)
?
52. The challenge of satisfaction surveys: which bit to measure?
Compared experience to what?     Resulting thoughts
(nothing)                        Indifference
Expectations                     Better / worse / different
Needs                            Met / not met / mixture
Excellence (the ideal product)   Good / poor quality (or 'good enough')
Fairness                         Treated equitably / inequitably
Events that might have been      Vindication / regret
??
53. The challenge of experience surveys: which bit to measure?
• Think about an experience…
– What did you expect to happen?
– What did you need to happen?
– What would the ideal experience have been?
– How did you expect to be treated compared to other people at the event?
– If you hadn't come here, what else might have happened?
???
54. Tip: Don't try to ask everything
http://www.census.gov/history/www/genealogy/decennial_census_records/
55. A quick, interesting question is fine
"Why did you come to this web site today?"
Suggestion from Suzanne Boyd, Anthro-Tech
57. Bonus tip: Successful survey = questionnaire + process, with lots of testing
58. Tips
1. Ask questions that people can answer
2. Ask a sample, not everyone
3. Find out about users' goals
4. Interview first
5. Ask about recent, vivid experience
6. Don't try to ask everything
7. Build lots of testing into your survey process