3. Surveys: What Are They Good For?
• A survey is a collection of questions asked
of a sample of a population to
mathematically derive characteristics of the
total population.
5. Why is This Cycle Important?
• It’s a framework that provides guidelines
when you work with clients and stakeholders
• You’re likely doing parts of it already
• Those are likely the parts of your process that
work!
10. Needs: We All Have Them
• Questions to ask:
– What are we trying to figure out?
– What kinds of reports or data do we want or
expect?
– What will we do with this data when we’re done?
– Who is our intended audience or population?
– How will we access the target
audience?
11. Examples of Need
– How well known is my brand?
– Will customers buy this product?
– If we offer X benefit, will our employee happiness
go up?
– Why are my customers not converting?
– Will my product do well in a new market?
12. Set a Survey Goal
• A goal is not a single learning point – a goal is
what you plan to do with this data, and why.
– Good goal: grow your company into new
markets.
“A survey will determine which markets are
good for our existing products, so that we
may expand our customer base.”
– Bad goal: make more money for your
business.
“A survey will help us make more money.”
13. Learning Objectives
• Determine your learning objectives
– These should all support your overall need and goal
– A good number of learning objectives: three
– You should have no more than five!
16. Eye on the Prize: ROI
• If the cost of the survey is greater than the
possible ROI, it’s a waste of time and money.
• Without an ROI measurement, there is no
encouragement to take action.
• Without clearly defined actions, survey
results may not have an ROI.
21. Guide: Writing Questions
• Multiple choice versus open-text questions
– Quantitative versus qualitative
• Phrasing and language use
– Unclear language
– Grammar
– Ambiguity
– Too technical
• Language can differ between demographic groups
• Keep your questions:
– Brief
– Simple
– Relevant
– Specific and direct
22. Qualitative Versus Quantitative
• Quantitative – Numeric in nature;
extrapolated to the whole population
• Qualitative – Touchy-feely; gives context to
quantitative questions
24. Emotional Bias
• Asking loaded questions
• Asking neutral-seeming questions on a
loaded topic
25. Identity Bias
• Asking “Do you like SurveyGizmo?” with a
SurveyGizmo logo in the corner of the
survey
(Example speech bubble: “Isn’t Mel a great trainer?”)
26. Option Bias
• Required, non-applicable questions
• Leading or restrictive options
• Different types of scales
• Option lists of death
27. Conversational Bias
• Surveys as a conversation
• Respondents giving the answer they think
you want to hear
(Example: a “Mr. Black Job Interviews Today” cartoon, with the
interviewer asking “Were you fired from your last job?”)
28. Lack of Focus
• Covering too many diverse topics
• Additional questions that do not meet the
survey goal
• Questions that are not in line with the
learning objectives
• Questions that do not derive actionable
results
29. Miscommunication
• Know your audience and the language that
they use and understand
– Avoid technical terms unless it is
appropriate
– Define terms if necessary
• Remember to speak in your company’s voice
• Have a peer review for clarity
30. Survey fatigue as a cultural trend
• Cultural survey fatigue
– The average respondent is fatigued
already, just by nature of:
• Receiving emails from organizations
• Suggestions on receipts and from
cashiers
31. The Wrap-Up: Question Mistakes to Avoid
• Try to avoid…
– Leading questions
– Loaded or suggestive questions
– Fatiguing question types – large tables,
lots of open-text or essay questions
– Sensitive questions
– Highly technical language
34. How are Design and Build different?
• Design: Involves
thinking about
psychology, emotions,
and words. It is the
more abstract phase.
• Build: Involves taking
into account security
walls, logic, combating
fatigue, bias, and poor
data collection. It is the
more active phase.
41. Multi-Text Questions
• Qualitative
• Explorative or
unaided response;
used for lists
Example: “Please list the names of phone providers that you have seen or heard advertised.”
42. Essay Questions
• Qualitative and
explorative
• This is a way to
gather unaided
responses for
your survey
Example: “What is your favorite thing about SurveyGizmo?”
46. Validate
• Number, Email, Percent, Date
• RegEx – Validate patterns like phone
numbers, zip code, etc.
• Capitalize each word
• Autosuggest answers
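As a minimal sketch of RegEx validation, the Python snippet below checks answers against patterns like the ones named above. The patterns and the `validate` helper are illustrative assumptions, not SurveyGizmo's actual rules:

```python
import re

# Illustrative validation patterns (assumed, not the tool's built-ins).
PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_zip": re.compile(r"^\d{5}(-\d{4})?$"),
    "us_phone": re.compile(r"^\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}$"),
}

def validate(kind, value):
    """Return True if the answer matches the expected pattern."""
    return bool(PATTERNS[kind].match(value.strip()))
```

For example, `validate("us_zip", "80301")` accepts the answer while `validate("us_zip", "8030")` rejects it, so malformed entries can be caught at submission time instead of during cleaning.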
47. Test Reports
• Are your questions reporting the way you
expect?
• Are you able to create the reports you
need using the data you are collecting?
• Is the data in the format you need?
55. What is Sample? Why is it Important?
• Your options are: survey everyone, or
survey a percentage
• Why?
– Cost
– Survey fatigue
– You will miss certain sections of the
population
– Using a statistically valid sample is just
as effective as (or more effective than)
trying to survey your entire population
56. More on Sample
A sample is statistically valid when every
single person in that population has an equal
chance (probability) of being in the sample that
you select.
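A minimal sketch of drawing such an equal-probability sample in Python; the `customers` list is a made-up stand-in for a real contact list:

```python
import random

# Assumed stand-in for a customer contact list.
customers = [f"customer_{i}@example.com" for i in range(10_000)]

random.seed(42)  # seeded only so the draw is repeatable in a demo
# random.sample draws without replacement; every customer is
# equally likely to be selected, which is what makes the sample
# statistically valid.
sample = random.sample(customers, k=400)
```

The key point is the uniform draw: any selection scheme that favors some customers over others (for example, only the most recent purchasers) breaks the equal-probability property.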
57. What is the Ideal Sample Size?
How many responses do you need for your
survey to be statistically accurate?
– It depends.
• How accurate do you want the data to
be? (margin of error or confidence
interval)
• How repeatable do you want the
results to be?
• How large is your total population?
58. How to Determine Sample Size
• Estimate 400 responses
• Use a sample calculator!
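The 400-response rule of thumb falls out of a standard sample-size formula (Cochran's formula with a finite-population correction) at 95% confidence and a ±5% margin of error. The sketch below is illustrative, not a replacement for a proper sample calculator:

```python
import math

def sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Estimate required responses via Cochran's formula with a
    finite-population correction. p=0.5 is the most conservative
    assumption about response variability; z=1.96 is 95% confidence."""
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))
```

For a very large population this gives about 385 responses at 95% / ±5% (hence "estimate 400"); for a small population of 500 the correction brings it down to around 218.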
60. Caveats
#1: If you are segmenting data for comparison,
the segment proportions should match the
proportions in the represented population.
#2: If you are cross-tabbing data, ensure that
the data collected (per question that you are
cross-tabbing) is statistically valid when
representing the larger population.
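Caveat #1 can be checked mechanically. The sketch below compares each segment's share of the sample to its known share of the population; the shares and counts are made-up illustration values:

```python
def segment_drift(sample_counts, population_share):
    """Return each segment's |sample share - population share|."""
    total = sum(sample_counts.values())
    return {seg: abs(sample_counts[seg] / total - share)
            for seg, share in population_share.items()}

# Illustrative numbers: population is 49% men / 51% women,
# and the sample collected 190 men and 210 women.
population_share = {"men": 0.49, "women": 0.51}
sample_counts = {"men": 190, "women": 210}

drift = segment_drift(sample_counts, population_share)
# Here both segments drift by 1.5 percentage points from the
# population, which is probably acceptable; large drifts mean the
# segment comparison misrepresents the population.
```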
61. Where Do You Get Sample?
• Pull a population from your customer list.
– Warning: Do NOT use your entire
customer base.
– If everyone has the same chance of being
randomly selected, you are not biasing
your results in any way.
62. • Panel Companies: A panel company is an
organization that exists to sell anonymous
survey responses to marketers and market
researchers.
Engage Panel (opt.)
63. Panel Companies: The Issues
Drawbacks:
– Using incentives
– Cannot access market researchers
– Some panel companies will buy from each
other when they cannot provide the
sample needed
– Hard to determine level of bias in sample
• Panel companies that award generic
“points” redeemable at sites like Amazon
help reduce sample bias based on the
incentive
64. Incentives
• Biases your sample (ex. Toys R Us
gift card as incentive)
• Incentives can jeopardize your data
(because respondents just want to
get to the end)
• Safeguards:
– Survey page timer with
disqualification
– Shorter surveys
– Red herring questions
– Clean data (eliminating
straight-liners, “Christmas tree” patterns, etc.)
68. How to Clean Data – Step 1
• Look out for:
– Unusually quick responses
– Unusually long responses
69. How to clean data - Step 1
1. Patterns/straightlining
2. Red herring/logically inconsistent answers
3. Gibberish, one-word, or fake text
4. Checking all options, or checking only one
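Straightlining, at least, is easy to flag programmatically. A minimal sketch, assuming responses to a four-item rating grid with made-up column names:

```python
# Illustrative responses to a rating grid; column names are assumed.
responses = [
    {"id": 1, "q1": 3, "q2": 3, "q3": 3, "q4": 3},  # straight-liner
    {"id": 2, "q1": 4, "q2": 2, "q3": 5, "q4": 3},  # varied answers
]

def is_straightliner(row, items=("q1", "q2", "q3", "q4")):
    """Flag rows with zero variation across the grid items."""
    return len({row[i] for i in items}) == 1

flagged = [r["id"] for r in responses if is_straightliner(r)]
# flagged -> [1]
```

Flagged respondents should be reviewed (and usually excluded) rather than silently deleted, so the cleaning step stays auditable.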
70. How to clean data - Step 2
• Prepare your data for analysis
– Beware of:
• Inconsistent numeric values (“How old
are you?”, etc.)
• Breaks in validation
• Do not introduce new bias!
– Changing question text
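A minimal sketch of normalizing inconsistent numeric answers to a question like "How old are you?" when the question lacked numeric validation; the sample answers are made up:

```python
import re

raw_ages = ["34", "34 years", "thirty-four", " 34 ", "n/a"]

def parse_age(value):
    """Extract a plausible age, or None if the answer is unusable."""
    match = re.search(r"\d{1,3}", value)
    if match:
        age = int(match.group())
        if 0 < age < 120:
            return age
    # Leave unparseable answers as missing rather than guessing a
    # value -- guessing would introduce exactly the new bias the
    # slide warns against.
    return None

ages = [parse_age(v) for v in raw_ages]
# ages -> [34, 34, None, 34, None]
```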
71. Run Initial Reports
• Run individual reports for each learning objective.
– Use this process to determine the “highlights” of
data collected as they relate to ultimate actions
so that you can truly understand the most
significant findings of your research.
72. Run Preliminary Reports
• Your preliminary reports should be focused on your
original learning objectives.
– Did you get your questions answered?
– Is the data in the format you expected?
– Are you seeing the trends that you anticipated?
73. Running reports: Key factors
• Make sure your data makes sense
• For any overt trends you are finding in the
data, make note of them and ensure that
they are relevant to the objectives
that you set for your survey
77. Segmenting data for analysis
• Often, your survey will contain demographic
and firmographic questions to create
segments in your survey.
• These segments should remain the same
from start to finish of the survey process.
78. Good indicators of a trend:
• When you have data that isn’t statistically
sound but is still interesting, you can call it
“directional data”.
– This data gives you an idea of what your
population is saying, thinking or feeling,
but you cannot use statistics to back it up.
80. Suggestions for effective reports
Stage 1: Write a summary
– What was the ultimate goal of this survey?
– Who was surveyed?
– Who was the population?
– Who responded?
– Include basic highlights of the survey
audience and your data to introduce the
findings
81. Stage 2: Write a mini-report for each
individual learning objective (ex: 401K
changes).
• The last section for every learning objective
report will include the recommended actions
to take based on the results of the survey
(these should not be a surprise!)
82. Stage 3 (optional): Interesting and
unexpected trends found
– Good to know, not need-to-know
– Ex. Perhaps you found a new, unintended
segment of your population that could
help you to make good business
decisions moving forward
• This is going the extra mile for your clients!
83. Stage 4: Conclusion
– Recap what actions are going to be taken (if
any) based on your findings.
– Get all stakeholders to agree to those
actions.
– Create a survey to be sent to stakeholders in
order to gain feedback for the project and put
actions in motion.
– Important for the next stage, Act: ask
stakeholders to provide metrics that can be
used to measure the success of the actions
that will be taken.
84. Tips for communicating data
• Try to anticipate questions about the
report
• Know the details
• Be honest
87. Actions: The key to success!
• Reiterate project goals and objectives
• Motivate the stakeholders to take action
based on the data collected
• Establish a reasonable timeframe in which
actionable results (positive or negative)
can be expected.
Empower Action Taking
89. How to get feedback
• Send a short survey to all stakeholders
• Ask for any suggestions
• Allows you to work better together in the
next study and improve the process.
Get Feedback on Survey
- Designed for surveys
- Can be applied to forms/quizzes
- Key here is to use a sample, not perform a census
- Forms – collect responses from everyone
Why a circle?
Learn from past projects
Develop and create new projects better
Empowers you when creating a survey/project on a team, or on behalf of someone else
The cycle is split into parts that occur in your office, and parts that occur in the software.
Reason to collect data is to act!
Goals and objectives outline how you will act.
- Good goals have actions associated with them
- Bad goals leave you asking how? When? Where? What?
3-5 - # of things you can easily remember
Aleta – What should we do if there are more than 5 learning objectives?
Set rules
Time limit
# of ideas
No judgment
Track who had what idea/Q
Label objective (ABC, 123)
Match Qs to objective
Aleta: What do we do with the brainstorm questions that do not fit with one of the learning objectives?
Aleta – What if the brainstorm questions do not help collect data for the learning objectives?
Most important question first/last
Demographics first and last
Aleta – What are the different ways that you can organize the questions in the survey?
By objective
As a conversation
Easier to do on paper or in a word doc than in the app
Wording/language is important
“Has anyone in your family suffered from an acute myocardial infarction?” vs. “heart attack”
Open ended questions can open a can of worms
Respondent is offended
Respondent leaves survey
Follow up questions are affected by the emotion
Respondent provides a more positive than honest answer
- Respondent can’t truthfully answer question
Leave, partial response – Bad
Pick an answer, bad data - Worse
Not honest answer
Respondent wants to be liked
Respondent leaves
Bad Data
Aleta – Do you have any suggestions about what to do with the extra survey questions that do not meet the survey goal?
- Respondent selects I don’t know more often
Respondents stop engaging
Answer I don’t know more
Other bad behaviors
Review survey questions/options
Assure that they address the learning objective
Can the proposed actions be performed with these survey questions?
How is this going to be laid out?
What are the psychological impacts of laying out the survey in a specific way?
When you’re branding a survey, how can that bias your results?
All questions break down into these basic question types
Low fatigue, find answer and select
mid point vs. no midpoint
Aleta – Is it ok to use different scales in my survey? Like a 1-5 scale and a 1-10 scale?
Medium fatigue – need to read all answers in the list
Need to type in answer to each textbox
- You as survey admin need to spend more time reviewing responses/reporting
How will you handle data?
Open text analysis
Read and reply
Make a list for a report
Quote respondent directly
Tables go wrong when
Too many rows
Overwhelming to respondent
Not a single topic
Respondent starts to compare rows
Best to have as only question on a page
Important to do this early
Question changes, layout changes
Aleta – What if the response data isn’t in the format that I was expecting?
Aleta – I add logic to my surveys when I am building them. Why do you suggest waiting to do this until later in the Build process?
- Each type of logic serves a purpose
- Not usually a need for each type in a single survey
Already selected mode in Need
Recognize that mode can create sample bias
Aleta – Can you give us an example of how the survey mode can introduce bias?
Example: Percentage of households with internet capability in the US versus households with no internet
- Choosing to email (versus telephone) this survey will create a highly biased sample
Who are you going to survey?
Very important for surveys/feedback systems
You don’t need to, and should not, survey everyone
Example: When comparing men and women in the US, the ratio within your survey should be the same as the ratio within the larger US population.
Do not email entire customer base
Collect more responses than you need, since some responses may be removed during cleaning.
Aleta – What about partial responses. Should partial responses be included in the reports?
Identify responses that are not engaged and delete or exclude from the data set
Responses that did not follow the trend
Run one report for each learning objective
When reviewing the report, data should make sense and provide a logical path to actions.
- Analyzing text responses takes time and effort
- Don’t ask questions that you will not read or act on
Collect enough responses to segment data and be statistically accurate
This step is not a surprise
You know from Need how data/reports will be shared and discussed
It’s ok to be a pest
Aleta – What if I am building a survey for someone else? How can I empower action taking?
Follow up with stake holders
Ask questions about actions
Was there a financial benefit to the company?
Great way to get involved and encourage next project
- Can easily be an email asking for feedback on specific parts of study/report/actions
Communicating changes to customers
Communicating changes with business execs
Communicating changes with employees