Steps to Design a Better Survey

Jean E. Fox and Scott S. Fricker
Office of Survey Methods Research
Bureau of Labor Statistics

October 19, 2012
Introduction
- Our backgrounds
  - Usability
  - Survey methodology
- Goal of the presentation
  - Combine what we know from our fields to improve usability surveys
Types of Usability Surveys
- Usability tests
  - Post-task
  - Post-test (e.g., the System Usability Scale, or SUS; a scoring sketch follows this list)
- Ethnographic work
  - Learn how people do their work
- Solicit input from users
- Administration
  - Self-administered (online, paper)
  - By interviewer (oral)
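Since the SUS comes up often as a post-test survey, here is a minimal scoring sketch in Python. It follows the standard SUS convention (ten items rated 1-5, odd items positively worded, even items negatively worded); the function name and the example ratings are illustrative, not from the slides.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 ratings.

    Standard SUS scoring: odd-numbered (positive) items contribute
    (rating - 1), even-numbered (negative) items contribute (5 - rating);
    the sum is scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings between 1 and 5")
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

# Example: one participant's ratings for items 1-10 (made-up values)
print(sus_score([4, 2, 5, 1, 4, 2, 4, 1, 5, 2]))  # 85.0
```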
Introduction
- Three steps we'll discuss:
  1. Decide what you really need to know
  2. Write the questions following best practices
  3. Test the survey
Step 1: Decide what you really need to know
Decide What You Really Need to Know
- Are you asking for data you really need?
  - Will you really use it?
  - Can you get the data somewhere else?
Decide What You Really Need to Know
- Are you asking questions respondents can answer?
  - Can you include "screeners"?
    – Questions that allow respondents to skip irrelevant questions (a skip-logic sketch follows this list)
  - Do you need separate surveys?
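As a rough illustration of how a screener can route respondents past irrelevant questions, here is a minimal skip-logic sketch in Python. The question wording, field names, and the list-of-dicts survey format are hypothetical, not from the slides; most survey tools provide equivalent branching built in.

```python
# Minimal skip-logic sketch: a screener answer controls which
# follow-up questions are asked. All questions here are hypothetical.
SURVEY = [
    {"id": "used_search",
     "text": "Did you use the site's search feature?",
     "options": ["Yes", "No"]},
    {"id": "search_ease",
     "text": "How easy or difficult was it to find what you needed with search?",
     "options": ["Very difficult", "Difficult", "Neither", "Easy", "Very easy"],
     "ask_if": lambda answers: answers.get("used_search") == "Yes"},
]

def administer(survey, get_answer):
    """Ask each question whose condition (if any) is met; return answers."""
    answers = {}
    for q in survey:
        condition = q.get("ask_if")
        if condition is None or condition(answers):
            answers[q["id"]] = get_answer(q)
    return answers

# Example run with canned answers standing in for real respondent input
canned = {"used_search": "No", "search_ease": "Easy"}
print(administer(SURVEY, lambda q: canned[q["id"]]))  # {'used_search': 'No'}
```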
Decide What You Really Need to Know
- Are you asking for data in a format you can analyze?
  - Open-ended vs. multiple choice
- Are you really going to analyze it?
Step 2: Write the questions following best practices
Best Practices
- Rating scales
- Rankings
- Double-barreled questions
- Agree/disagree items
- Satisficing
Types of Scales
- Likert-type item
- Semantic differential
Types of Scales
- Bi-polar (the previous examples)
- Uni-polar (e.g., a scale running from "Not at all" to "Extremely")
Rating Scales
- How many response options do you usually use in a rating scale? 3, 5, 7, 10, or something else?
- Number of options
  - Generally, scales with 5-7 options are the most reliable.
  - The optimum size depends on the issue being rated (Alwin, 1997; Garner, 1960).
    – Use more options for bi-polar scales.
Scales
- Do you usually have a neutral midpoint?
- Odd or even number of options
  - Without a midpoint, respondents tend to choose randomly between the two middle options.
  - For usability, generally include a midpoint.
Rating Scales
- Do you label the endpoints, a few options, or all of them?
- Labels
  - Use text labels for each option.
  - Avoid numbers unless they are meaningful.
    – Especially avoid negative numbers; respondents do not like to select negative options.
Rating Scales
- Be sure the scale is balanced (a sketch of a balanced scale follows).
  - The example scale shown on the slide had three "satisfied" options but only one "dissatisfied" option.
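To make the point concrete, here is a minimal sketch of a balanced, fully labeled, bi-polar satisfaction item defined in Python. The label wording is illustrative, not taken from the slides; the key properties are equal numbers of positive and negative options, a neutral midpoint, and a text label on every option.

```python
# A balanced 5-point bi-polar satisfaction scale: two negative options,
# a neutral midpoint, two positive options, every option labeled.
# Label wording is illustrative.
SATISFACTION_SCALE = [
    "Very dissatisfied",
    "Somewhat dissatisfied",
    "Neither satisfied nor dissatisfied",
    "Somewhat satisfied",
    "Very satisfied",
]

# Contrast with the unbalanced scale described on the slide
# (three "satisfied" options but only one "dissatisfied" option):
UNBALANCED_SCALE = [
    "Dissatisfied",
    "Somewhat satisfied",
    "Satisfied",
    "Very satisfied",
]
```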
Ranking
- Definitions
  - Rating: select a value for individual items from a scale.
  - Ranking: select an order for the items, comparing each against all the others.
Ranking
- Consider other options before using ranking.
  - Ranking is more difficult and less enjoyable than other evaluation methods (Elig and Frieze, 1979).
  - You don't get any interval-level data.
Ranking
- Recommendations
  - Use ratings instead if you can.
    – Determine ranks from the average ratings (see the sketch after this list).
  - Use rankings if you need respondents to prioritize options.
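A minimal sketch of that recommendation in Python: derive a rank order from mean ratings rather than asking respondents to rank directly. The item names and ratings are made up for illustration.

```python
from statistics import mean

# Hypothetical ratings (1-5) that several respondents gave each feature
ratings = {
    "Search": [5, 4, 5, 4],
    "Navigation": [3, 3, 4, 2],
    "Help pages": [2, 3, 2, 3],
}

# Rank items by mean rating, highest first
means = {item: mean(scores) for item, scores in ratings.items()}
ranked = sorted(means, key=means.get, reverse=True)

for rank, item in enumerate(ranked, start=1):
    print(f"{rank}. {item} (mean rating {means[item]:.2f})")
# 1. Search (mean rating 4.50)
# 2. Navigation (mean rating 3.00)
# 3. Help pages (mean rating 2.50)
```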
Question Wording
Double-Barreled Questions
- Avoid double-barreled questions.
  - They force respondents to give a single response to multiple questions.
  - They assume that respondents logically group the topics together, which may or may not be true.
  - Recommendations
    – Watch for the use of "and" in questions.
    – Eliminate all double-barreled questions.
    – Divide them into multiple questions (for example, "Was the site attractive and easy to navigate?" becomes one question about attractiveness and one about navigation).
Agree / Disagree Items
- Who uses agree/disagree items? Why?
  - They are fairly easy to write.
  - You can cover lots of topics with one scale.
  - It's a fairly standard scale.
  - It's familiar to respondents.
Agree / Disagree Items
- Unfortunately, they can be problematic.
  - They are prone to acquiescence bias.
    – The tendency to agree with a statement
  - They require an additional level of processing for the respondent.
    – Respondents need to translate their response to the agree/disagree scale.
Agree / Disagree Items
- Recommendation
  - Avoid agree/disagree items if possible.
  - Use "construct-specific" responses instead: rather than "The site was easy to use" with an agree/disagree scale, ask "How easy or difficult was the site to use?" with options running from very difficult to very easy.
Other Issues
- Be sure the responses match the question.
- Speak the respondent's language.
  - Avoid jargon unless it is appropriate.
- Remember that responses can be affected by:
  - Question order
  - The size of the text field
  - Graphics, even seemingly innocuous ones
Broader Issue – Satisficing
- Responding to surveys often requires considerable effort.
- Rather than finding the "optimal" answer, people may take shortcuts and choose the first minimally acceptable answer.
- This is "satisficing" (Krosnick, 1991). It depends on:
  - Task difficulty, respondent ability, and respondent motivation
Satisficing – Remedies
- Minimize task difficulty
  - Minimize the number of words in questions.
  - Avoid double-barreled questions.
  - Decompose questions when needed.
    – Instead of asking how much someone spent on clothing, ask about different types of clothing separately.
  - Use ratings, not rankings.
  - Label response options.
Satisficing – Remedies, cont.
- Maximize motivation
  - Describe the purpose and value of the study.
  - Provide instructions to think carefully.
  - Include random probes ("Why do you say that?").
  - Keep surveys short.
  - Put important questions early.
Satisficing – Remedies, cont.
- Minimize "response effects"
  - Avoid blocks of ratings on the same scale; this prevents "straight-lining" (a sketch for flagging straight-lining in collected data follows this list).
  - Do not offer "no opinion" response options.
  - Avoid agree/disagree, yes/no, and true/false questions.
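Although the remedy on the slide is preventive (avoid long blocks of items on one scale), it can also help to check collected responses for straight-lining. Here is a minimal sketch in Python under a simple assumption: it flags respondents who gave the identical rating to every item in a grid, which is only a rough heuristic, not a method from the slides.

```python
# Flag respondents who gave the same rating to every item in a grid.
# The data and the "all identical answers" heuristic are illustrative;
# identical answers can also be legitimate.
grid_responses = {
    "r001": [4, 4, 4, 4, 4, 4],
    "r002": [5, 3, 4, 2, 4, 3],
    "r003": [3, 3, 3, 3, 3, 3],
}

def straight_liners(responses):
    """Return IDs of respondents whose grid answers are all identical."""
    return [rid for rid, answers in responses.items()
            if len(set(answers)) == 1]

print(straight_liners(grid_responses))  # ['r001', 'r003']
```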
Step 3: Test the survey
Testing Surveys
- Be sure your questions work.
- Consider an expert review.
  - This requires access to an expert.
- For usability testing, be sure to include the survey in your pilot test.
- A common technique for evaluating surveys is cognitive interviewing (see Willis, 2005).
Cognitive Interviewing
- Cognitive interviewing basics
  - Have participants complete the survey.
  - Afterwards, ask them questions such as:
    – In your own words, what was the question asking?
    – What did you consider in determining your response?
    – Was there anything difficult about this question?
Cognitive Interviewing
- Cognitive interviewing basics (cont.)
  - Review the qualitative data you get to identify potential problems and solutions.
  - As with usability testing, there are different approaches (e.g., think-aloud).
Summary
- Decide what you really need to know.
- Write the questions following best practices.
- Test the survey.
Contact Information

Jean E. Fox
Fox.Jean@bls.gov
202-691-7370

Scott S. Fricker
Fricker.Scott@bls.gov
202-691-7390
References

Alwin, D.F. (1997). Feeling Thermometers Versus 7-Point Scales: Which Are Better? Sociological Methods and Research, 25(3), 318-340.

Elig, T.W., & Frieze, I.H. (1979). Measuring causal attributions for success and failure. Journal of Personality and Social Psychology, 37(4), 621-634.

Garner, W.R. (1960). Rating scales, discriminability, and information transmission. The Psychological Review, 67(6), 343-352.

Krosnick, J.A. (1991). Response strategies for coping with the cognitive demands of attitude strength in surveys. In J.M. Tanur (Ed.), Questions About Questions: Inquiries into the Cognitive Bases of Surveys (pp. 177-203). New York: Russell Sage Foundation.

Krosnick, J.A., & Presser, S. (2010). Question and questionnaire design. In P.V. Marsden & J.D. Wright (Eds.), Handbook of Survey Research (2nd ed.). Bingley, UK: Emerald Group Publishing Ltd.

Willis, G. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications.
