Research Traps: 7 ways of thinking that keep you from doing great customer research
Wendy Castleman
Principal User Research Scientist
To be presented at the UPA Conference in Portland, OR in June 2009
22. Avoiding the False Consensus Trap
• Focus on the customer
–spend time watching and talking
• Test with people who aren’t like you
23. What is the rule?
2, 4, 6, 8, ___
Hypotheses:
Each value must be 2 higher than the one before.
How do you test this hypothesis?
Actual Rule:
Each value must be any number bigger than the one before.
24. Congruence Bias
We jump to conclusions by only looking at one approach
25. Try out our new idea
for an iPhone application!
26. Congruence Trap
Only trying one solution may miss a better one and insufficiently inform the design
34. Avoiding the Confirmation Trap
• Look for surprises, instead of what you expect
• Test out what shouldn’t work
• Consider independent evaluation
35. In American English are there more:
Words that begin with the letter “K”?
Or
Words where “K” is the third letter?
There are 3x as many words
with “K” in the 3rd position
36. Availability Heuristic
We have a tendency to put too much weight on what comes easily to mind
40. Avoiding the Availability Trap
• Gather key usability metrics
–(task success, specific error counts, time)
• Don’t rely on your memory
• Look at all of the data
–Encourage others to do the same
46. How to Avoid the Recency Trap
• Gather key usability metrics
–(task success, specific error counts, time)
• Don’t rely on your memory
• Look at all of the data
51. That’s the fourth man
who has bought this version.
I need to find out how
many men buy this…
52. Avoiding the Illusory Correlation Trap
• Recognize the limitations of your research
methods
• Verify magnitude estimations and
correlations with large-scale quantitative
studies
55. Ways to avoid the traps…
Planning
1. Look at every project as unique
2. Consider what you need to learn
3. Identify the best method
4. Test with people who aren’t like you
5. Test several different solutions
6. Test out what shouldn’t work
56. Ways to avoid the traps…
Conducting
1. Look for surprises, instead of what you
expect
2. Gather key usability metrics
3. Consider independent evaluation
57. Ways to avoid the traps…
Analyzing
1. Don’t rely on your memory
2. Look at all of the data
3. Recognize the limitations of your research
methods
4. Verify magnitude estimations and
correlations with large-scale quantitative
studies
My name is Wendy Castleman. I am a wife, and a mother. My background is Psychology, but now I work as an experience design researcher at a company called Intuit, where I do research to help us design solutions that empower individuals and businesses to achieve their dreams.
In my time as a researcher, I’ve made lots of mistakes that have limited the impact of the research I’ve done. I’m here to talk with you about some common research traps, how to recognize them and avoid them.
It ends up that YOU and I and everyone else around here all have some built in mental shortcuts that help us think efficiently.
This efficient thinking is really useful for our survival. Like, if you are being chased by a bear, you don’t have all the time in the world to try out different escape options to find the best one. You just need to try something right away…
But, these same shortcuts become barriers for research. In order to do great research to inform the design of delightful solutions, we need to break away from our natural biases and get beyond the obvious to the underlying truths. This talk will focus on some of the most common mental barriers, or cognitive biases, that can interfere with doing great customer research.
It doesn’t matter whether you are new to the field or very experienced; we are all at risk of falling for these traps. Fortunately, a raised awareness of the traps can help us successfully avoid them.
Today we are going to go over 7 research traps, understand what they are, why they happen and how to avoid them.
These are the 7 traps, but don’t worry… I’ll go through them one at a time.
Now, I’m not going to admit that I’ve let habit come into play when doing research… nor would I say that it hasn’t… So, drawing from Child Psychology Therapy Sessions, I’m going to use a puppet to tell the story of how some usability researchers may handle these biases. So, Elsie will show us examples throughout this talk.
Let me tell you a story. This is a picture of the street that I live on. For the past year, I lived here. I just bought the house across the street and moved here. Now, driving home after a late night at work… what do you think happened? YES! I drove to the wrong house. Why? Because I was going on auto-pilot, force of habit. I wasn’t thinking, I was just doing what I always do.
Habit comes into play with research. We often don’t really think about the type of research we should do, but instead do the same type of research that we usually do. So, Elsie spends a lot of time in the lab doing usability tests… and when someone comes to her with a new project, she just jumps straight to doing another usability test.
The problem is that all research methods are not the same. By doing what Elsie always does, a usability test, she might end up not learning the key type of information that the project team needs to know. Have you ever done a test that ended up not being that helpful? Maybe it was because you fell for the Habit trap.
With her new raised awareness, now Elsie asks the team a number of questions and figures out which method is most appropriate.
Ross and colleagues asked students to walk around campus with a sign saying ‘Eat at Joe’s’. Those who agreed said that 62% of other people would agree to carry the sign. Those who disagreed said that 67% would not carry the sign. Ross, L., Amabile, T. M., & Steinmetz, J. L. (1977). Social roles, social control and biases in social perception. Journal of Personality and Social Psychology, 35, 485-494.
An example of “False Consensus” is pretty much any product team I have worked on: the engineers and tax dev folks have a tendency to think they are just like the customer and know the problem. They get all wrapped up in the UI design, when they are clearly not accountants! Not sure how to turn that into a story without incriminating people, though.
There are lots of ways False Consensus can come into play with usability… for example, conducting a cognitive walkthrough and assuming that your insights are the same as a target user. However, another way we fall for this trap is by recruiting participants who are not quite the target audience. Ever invite friends and family to participate in a study? Do your friends and family think like you do? You might have fallen for the False Consensus trap.
Quicken usability testing was conducted with first-time PC users, including the Palo Alto Junior League in 1984. Intuit’s ambitious goal: to have complete PC novices up and running on Quicken within 15 minutes. (Courtesy of Virginia Boyd.)
Because of the confirmation bias, we tend to fall victim to another bias: the congruence bias. Take this example. You see a sequence of numbers and are tasked with identifying the rule. Your hypothesis is that these are a sequence of even numbers, increasing by 2. If you fill in the next value to check your hypothesis, you are likely to put in the number 10. But this is a direct test: 10 might be accepted even though your hypothesis is not the actual rule. An indirect test would be to try 9, because if your hypothesis is true, 9 should be rejected.
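The direct-versus-indirect distinction above can be sketched in a few lines of Python. The two predicates below simply restate the slide's "actual rule" and the tester's hypothesis; the sequence values are taken from slide 23.

```python
# A sketch of the 2-4-6 task from slide 23.

def actual_rule(seq):
    # The real rule: each value is any number bigger than the one before.
    return all(b > a for a, b in zip(seq, seq[1:]))

def hypothesis(seq):
    # The tester's hypothesis: each value is exactly 2 higher than the last.
    return all(b == a + 2 for a, b in zip(seq, seq[1:]))

base = [2, 4, 6, 8]

# Direct (confirming) test: 10 fits the hypothesis AND the actual rule
# accepts it, so the tester learns nothing that separates the two.
print(actual_rule(base + [10]), hypothesis(base + [10]))  # True True

# Indirect (disconfirming) test: 9 violates the hypothesis, yet the actual
# rule still accepts it -- proof that the hypothesis is too narrow.
print(actual_rule(base + [9]), hypothesis(base + [9]))  # True False
```

The payoff is the second probe: only a test that your hypothesis predicts should fail can reveal that the real rule is broader than you think.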
The team is coming up with a new iPhone app and asked Elsie to find out what people think about the idea and how to make it better. She tests the application… and suggests usability improvements… what’s wrong with that? Well…
Ideally, Elsie will test multiple ideas in a pseudo-usability test to gauge direction. Some might not turn out to solve a big enough problem for her customers (X); some might be best combined into one (O and arrow).
Turn the T and the 6, not the 6 and the L. If there is a 6 on the other side of the L, that doesn’t tell you anything. If there is an L on the other side of the 6, that is consistent with your hypothesis, but doesn’t prove you are right. If there is an L on the other side of the 4, that doesn’t mean anything either (maybe all even numbers have L’s, which wouldn’t make your hypothesis wrong; or maybe some 6’s have L’s and some have R’s). Only by turning over the T and finding a 6 could you determine that your hypothesis definitely isn’t right.
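The card-selection logic above can be made mechanical. This is a hedged sketch, assuming the rule under test is "if a card has a 6 on one side, it has an L on the other" and the four visible faces are T, 6, L, and 4 as in the note: a flip is worth making only if some hidden face could falsify the rule.

```python
# Which card flips can falsify the rule "a 6 always has an L on its back"?

def flip_can_falsify(visible):
    """Could turning this card over ever disprove the rule?"""
    if visible == "6":
        return True   # the back might not be an L
    if visible.isalpha() and visible != "L":
        return True   # the back might be a 6, pairing a 6 with a non-L
    return False      # an L, or a number other than 6, can never break the rule

cards = ["T", "6", "L", "4"]
print([c for c in cards if flip_can_falsify(c)])  # ['T', '6']
```

The informative flips are exactly the ones people tend to skip: the card that looks irrelevant (T) matters, while the confirming-looking card (L) teaches you nothing.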
Elsie goes looking for the data that supports her hypotheses…
Soon after the Quicken launch, Scott and his team started doing surveys where they asked a number of demographic questions. One of the things he learned was that 48% of customers said they used Quicken in an office or both at home and in an office. This made no sense to anyone on the team, as it was clearly intended for personal home users, so they ignored it. They repeated the study 18 months later. That time they found that 49% said they used it in an office. So they ignored it again. But the anomaly kept gnawing at him, and a year later Scott started calling and visiting these customers to figure out why…
Someone is asked to estimate the proportion of words that begin with the letter “R” or “K” versus those words that have the letter “R” or “K” in the third position. Most English-speaking people could immediately think of many words that begin with the letters “R” (roar, rusty, ribald) or “K” (kangaroo, kitchen, kale), but it would take a more concentrated effort to think of any words where “R” or “K” is the third letter (street, care, borrow, acknowledge); the immediate answer would probably be that words that begin with “R” or “K” are more common. The reality is that words that have the letter “R” or “K” in the third position are more common. In fact, there are three times as many words that have the letter “K” in the third position. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.
Have you ever conducted a study with multiple people, then been asked to summarize what you found without having the time to go back and analyze all of your data? If so, you too have probably fallen victim to the availability trap.
Tracking and counting can help make this more objective. Even if it’s just taking a couple of minutes to tally up what you recall finding…
List the following:
- the last 3 movies you saw in 2008
- the 3 movies you saw before those
Which was easier?
Elsie does a usability test… and sees which button people click on a particular screen. Immediately after the last participant, the product manager asks her how the study went and what they saw on that screen. Elsie says, “Most people clicked Continue”. However, as you can see… that just isn’t true. She’s fallen victim to the recency trap. This is especially likely when you have more participants. Common on site visits too, where time elapses between sessions and you have a greater chance of forgetting.
Again, the key to avoiding this trap is to take a moment and review your counts…
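Reviewing your counts can be as lightweight as a running tally. Here is a minimal sketch; the session data is invented for illustration, but it mirrors Elsie's situation, where the last few participants clicked Continue even though most participants overall did not.

```python
# Tally observed clicks instead of trusting memory of the sessions.
from collections import Counter

# Hypothetical per-participant clicks on one screen, in session order.
clicks = ["Back", "Continue", "Back", "Help", "Back",
          "Continue", "Back", "Back"]

tally = Counter(clicks)
print(tally.most_common())  # [('Back', 5), ('Continue', 2), ('Help', 1)]
```

Memory favors the most recent sessions; a two-minute tally shows what “most people” actually did across all participants.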
A study was done with teenaged girls. They were asked to describe the relationship between weight and happiness. What do you think they found? The teenagers thought there was a negative correlation between weight and happiness. In truth, there is no correlation at all.
Elsie is noticing patterns… that’s fine, but if she actually claims the relationship to be true, she can lose credibility if it ends up that it isn’t.
Elsie’s instincts are right on here… notice a pattern and seek to discover if other data supports it.
So those are the seven traps. Now that you know about them…