Guerrilla
Research
OR:
HOW I LEARNED TO
STOP WORRYING AND
LOVE UX
1
I
One
Slap some UX on it.
What’s the solution?
Guerrilla Usability
Guerrilla Warfare
“The strategy and tactics of guerrilla warfare
tend to focus around the use of a small,
mobile force competing against a larger, more
unwieldy one.”
- Wikipedia, “Guerrilla Warfare”
Usability
“All aspects of the end-user’s interaction with
the company, its services and its products.”
- Nielsen Norman Group
Usability Criteria
1. Learnability
2. Efficiency
3. Memorability
4. Errors
5. Satisfaction
UX is everything.
Stand back!

I’m going to attempt science!
The Scientific Method of UX
1. Define a question → Identify the problem
2. Gather information and resources → External research
3. Form an explanatory hypothesis → First testable version
4. Test the hypothesis → User test
5. Analyze the data → Analyze the data
6. Interpret the data, form conclusions and create new hypotheses → Iterate, iterate, iterate
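The pairing above is a loop, not a checklist: conclusions from one round become the hypothesis for the next. A minimal sketch of that cycle, where `build`, `user_test` and `analyze` are hypothetical stand-ins for your own prototyping, testing and analysis steps (none of these names come from the talk):

```python
def ux_cycle(problem, build, user_test, analyze, rounds=3):
    """Define -> build -> test -> analyze, feeding conclusions back in."""
    findings = []
    hypothesis = problem                   # 1. define the question
    for _ in range(rounds):
        prototype = build(hypothesis)      # 3. first testable version
        data = user_test(prototype)        # 4. run participants through tasks
        hypothesis, notes = analyze(data)  # 5-6. conclusions seed the next round
        findings.append(notes)
    return findings
```

The point of the shape is simply that `analyze` returns a *new* hypothesis, so stopping after one pass means throwing the loop away.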
The “lab” test
Today’s lab
Where do I get my users?
The Intro
The Intro: Explained
Background Questions
Tasks
Framing the question
http://tsaoshin.deviantart.com/gallery/45473299

Questions to avoid

Leading questions

Yes/No questions

Questions about future actions
Echo. Boomerang. Columbo.

* http://www.nngroup.com/articles/talking-to-users/
What to do after you test
No, really, I don’t have the time.
Remote studies
Usabilla.com, UsabilityHub.com and OptimalWorkshop.com
Remote Studies 2: Click Test Boogaloo

Where would you click to start using Square?
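A click test ultimately reduces each task to two numbers: how many participants hit the right target, and how long they took to click. A rough sketch of that arithmetic with made-up results (the data here is hypothetical, not output from any of the services named above):

```python
from statistics import median

# Hypothetical results: (clicked the right target?, seconds to first click)
results = [(True, 4.2), (True, 6.1), (False, 11.0), (True, 3.8), (False, 9.5)]

success_rate = sum(ok for ok, _ in results) / len(results)
median_time = median(seconds for _, seconds in results)

print(f"success rate {success_rate:.0%}, median time to click {median_time:.1f}s")
```

Median beats mean here because one lost participant wandering for a minute shouldn't drag the whole task's number with them.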
Other techniques of note
• Surveys
• 5-second surveys
• A/B testing
• Analytics
• Card sorts
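Of these, A/B testing is the most directly quantitative: split traffic between two variants and check whether the difference in conversion rate is bigger than chance. A sketch of the standard two-proportion z-test, with made-up numbers:

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test with a pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converted 60/1000 visitors vs. A's 45/1000.
z = ab_z_score(45, 1000, 60, 1000)
# |z| > 1.96 would be significant at the 95% level; here z is about 1.5,
# so you'd keep the test running rather than declare a winner.
```

The guerrilla lesson transfers: a "better" variant on small numbers is a hypothesis, not a conclusion.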
Bryan Robinson
Twitter: @brob
LinkedIn: http://linkedin.com/in/bryanlrobinson
Presentation Slides: http://bit.ly/guerrilla-ux

Questions?

Resources
usability.gov
silverbackapp.com
xsplit.com
ethn.io
usabilla.com
usabilityhub.com
optimalworkshop.com
uxbooth.com

Guerrilla Usability, or How I Learned to Stop Worrying and Love UX

Editor's Notes

Slide 1: Good morning! I hope everyone's doing well. My name is Bryan Robinson and I'm the Director of Usability and Interactive at RocketFuel, a marketing and interactive agency here in Memphis. I'm glad I can share a little of my experience with you this morning. I've actually been dying to use a variation on the Dr. Strangelove title in a presentation for a long time; glad it's finally worked out. And if you don't get the reference, do yourself a favor: find a copy and watch it this weekend. So, I've been doing this design and development thing for 7 or 8 years. For two of those years, I spent my time as a UX specialist on a cross-functional UX team focused on news websites. I learned a LOT of lessons in those two years. I basically went from a novice front-end developer and designer to a seasoned pro. My interaction design skills blossomed. I learned a lot of product development techniques. And most importantly, I learned...

Slide 2: "Never make eye contact while eating a banana." Or I suppose don't make eye contact while on a banana phone? If you look closely enough you can see Jakob Nielsen in the entryway to the UX area. We were a small team, so jokes about bananas and random pictures of Nielsen were common. But in a company with hundreds of thousands of monthly users and millions of monthly page views, our small five-person team was the bastion of user experience. We were the protectors of the user. In my time with the company, do you know how many moderated UX tests we ran?

Slide 5: One. In two years, we ran a total of three studies. This isn't a statistic I'm proud of. We made plenty of excuses. We never lacked for work. A lot of our stakeholders were executives who were horribly out of touch with the industry. We had no real budget. We had a bug list as long as my leg. So we coined a phrase to help us get through the day:

Slide 6: Slap some UX on it. It was one of those things that's funny because it's true. If you can take one thing away from my presentation today, it's this: UX. Is. Not. A. Verb. We weren't being asked to be user experience staff; we were designers and developers who were CALLED a UX staff.

Slide 7: So, what should we have done instead of bemoaning our issues?

Slide 8: We should have looked to a growing trend in the usability world and started engaging in guerrilla usability. And no, I'm not talking about how I test my websites using monkeys and gorillas; I've heard that joke about 150 times since I started telling people about this presentation. So if guerrilla usability isn't testing on gorillas, what is it? Let's start by defining some terms:

Slide 11: With that definition of UX in hand, what does it mean to do usability testing?

Slide 12: Basically, it's evaluating a product or service by putting it in front of real users. That's a quote from the usability.gov website. I used to be embarrassed to send folks to that site, but they've very recently redesigned and it's no longer an embarrassment to the industry. UX testing often manifests in lab-based tests where participants sit with a moderator behind a two-way mirror running through a series of tasks and questions while stakeholders sit in another room observing and taking notes. This is actually an awesome process, but it can take days or weeks depending on what's being tested, it's expensive, and it culminates in a huge document that most stakeholders don't end up reading. This is kind of traditionally where UX lives, but we're not traditionalists; we're engaging in guerrilla warfare.

Slide 13: So what does it mean to fight as a guerrilla force?

Slide 17: To keep our methodology clean, let's talk about the scientific method and what it means for your product and the UX process. Here are the basics:

Slide 18: First, define a question (what's the problem we're solving?). This is the problem our product or website is solving: long lines at the grocery store, not enough leads being generated, explaining what guerrilla usability is to a large group of people, etc. The full method: gather information and resources (observe: competitive analysis, market research, user surveys/stories); form an explanatory hypothesis (build a testable version: product, wireframe, design, prototype); test the hypothesis by performing an experiment and collecting data in a reproducible manner (the product research); analyze the data; interpret the data and draw conclusions that serve as a starting point for new hypotheses; publish results; iterate.

Slide 19: Step 2: Gather information and resources. In our case, this would often mean doing competitive analysis, market research and audience segmentation.

Slide 22: Step 5: Analyze the data. Crunch all the numbers!

Slide 23: Finally, steps 6 and 7 can really be summed up by the battle cry of startups and Agile and Lean developers the world over: "Iterate, iterate, iterate." Interpret your results, fix potential issues, launch a new version and test all over again. Product development and scientific research are highly iterative processes.

Slide 24: Now that we've got the science lesson out of the way, let's talk about the cornerstone of usability testing: the lab-based test. This is where you bring users in, sit them in front of your product, run them through tasks, ask questions and get feedback. Doing this in person allows you to ask important follow-ups, see the participant's body language and facial expressions, and hear their tone. We don't need fancy equipment and a lab with two-way mirrors to get great data; we're guerrillas, and we'll improvise and get the data we need. The first question we need to answer is...

Slide 25: In general, all you need is a product, a location, questions and some users, but here are a few things that can make your life better. Recording software: full-featured programs like Morae (an industry standard) are absolutely ridiculously priced for our purposes, plus they take a long time to master. My software of choice is a Mac program called Silverback. It's geared specifically for our purposes: it records the screen AND the participant's face, allows you to create "chapters" in your test, and does some easy exporting so you can share with stakeholders. It's only going to run you about $80. On Windows, things get a little harder. There are programs like Snagit and QuickTime Pro, but those won't capture picture-in-picture for participant viewing. There's also a web service called XSplit Pro, which does picture-in-picture and is about $40 per year. If possible, have some observers ready as well. If you're doing the testing, you can't take notes as easily. Having 2-3 observers can make things drastically easier and help create buy-in in your organization.

Slide 27: What are we testing exactly? You can test at just about any stage of the process. If you've just had an amazing idea and can do some sketching to illustrate it, you can test various flows with those. If you'll notice, I've got some mobile app sketches up there; you can cut those out with a frame, or even use a real phone to make it a more accurate representation. If you want to formalize your design a bit more, you can build out wireframes. You can use any design program to do wireframes, really, but if you've got some budget, I'd suggest picking up either Axure RP or a subscription to UXPin. They're both wireframing programs that build interactive HTML to make testing at this stage much easier. Axure can be a bit of an investment and a bit of a beast to learn. UXPin is a cloud-based tool that is $15/month and can get you up and running quickly. These tools allow you to build a functional prototype so that you can run users through a "working" version instead of static images. If all you've got is a visual design, you can use the prototyping tools I mentioned before to add interactivity, or show static images to your participants to get feedback. A live product works really well too, but I really want to stress that you should test early and often. If you've got a legacy design or product, test that before you even begin working on new designs and functionality, both to get a baseline and to make sure you keep features and design elements that DO work.

Slide 28: Once you nail down what you're testing, you need to figure out how to ask the questions you need answered. Remember, just because we're doing this quick and cheap doesn't mean we want to compromise our results. It's important to focus on the main hypothesis and the problem the product is attempting to solve. That focus will help you frame your test so you're not asking questions or observing tasks that have a low return on investment. Let's take a look at two related but different websites: TechCamp's website and GEEKmemphis's website. The questions we would ask for the TechCamp website would be focused on converting an idle reader into an attendee. For the TechCamp site, we might ask something like this:

Slide 29: "A friend has told you about a technology conference coming up in Memphis; find out if it's something you would want to attend." A couple of things to note in this question. First, we're setting the scene for the user, giving them a scenario they can wrap their head around. We're also making it fairly open-ended and, in doing so, hopefully encouraging them to explore as they would at home. We've avoided keywords that might give them clues about what we WANT them to do rather than what makes the most sense to them. They're not looking to identify speakers or topics being discussed; they're looking for information that entices them specifically, instead of just searching the page for keywords. If we were testing the GEEKmemphis website, our focus would be different. geekmemphis.com has more of a focus on community building, so a sample question might look like this:

Slide 30: "You're new to Memphis and are looking for technology networking opportunities. What types of opportunities does GEEKmemphis offer?" Again, we frame the question in a scenario, but instead of the end goal being one conversion to sign up to attend, the end goal is more one of exploration and discovery, the hope being that a user finds multiple things that attract their attention instead of ending at just one conversion point. We still try to keep it open-ended: instead of asking a yes-or-no question like whether GEEKmemphis offers networking opportunities, we ask our participant what types. This not only lets us see if they can find the events, but may also offer insight into how the participant feels things should be organized. It's easy to go astray when framing your questions, though. We're all human, and we're preconditioned to word questions in a way that gets us the responses we want. Humans also tend to want to avoid uncomfortable truths.

Slide 33: Avoid yes-or-no questions. It's hard enough to get participants to talk without giving them the opportunity to give a one-word answer.

Slide 34: Avoid asking questions about future actions. As a human, two things can get in the way of answering: the desire to give the "right" answer, and the user's high expectations of their future actions. If someone asked whether you would look both ways before crossing the street in the future, you'd probably say yes. In reality, there are a lot of factors, and your expectations of yourself may not always live up to the situation.

Slide 35: As for where to test: you don't need anything fancy to test your users. If you have a conference room in your office, you can very easily set up in there. At RocketFuel, we've got a pretty good guerrilla setup. It involves a couple of computers and conference rooms, and I just use Skype and screen sharing to simulate a "viewing room" for stakeholders and a "lab" for participants. If your product has a specific demographic that tends to be in certain areas, go out to those areas. If it's an app for a church, go to the church; for grocery shopping, go to a store, etc. And if all else fails, hit up a Starbucks. They're actually oddly accommodating. You'd be surprised at what a grande no-whip soy double-pump latte will get you.

Slide 36: So, you've successfully completed your first guerrilla usability test! Excellent. Aside from me giving you a high five, what comes next? If you're working with observers, make sure to schedule 15 minutes after the test to debrief. The observers have been paying attention and taking notes; see if there was anything unexpected. Maybe a user stumbled in an area you thought was rock solid, or took the test in a random direction. List out trouble areas, areas working as intended, and any new questions that have come to light. Make sure everyone is on the same page and keep testing. Once you get through your full round, set up time for all observers to get together for a full debrief, find the commonalities, and make a list of actionable items. Then go work on those items and get ready to test again. This is an iterative process; the best products and sites out there are constantly testing. If you're a team of one and couldn't get buy-in to have folks observing, or just didn't have the time or observers, schedule those same times for yourself, and then present findings to your team. A bulleted list works just fine, but if you've recorded video, it's oftentimes what can swing the group's or your boss's opinion.

Slide 38: This is a great article from UX Booth: five useful lies to tell your study participants. First, tell them you had nothing to do with the project. We've talked about human nature a few times so far; if a participant thinks you're only here for testing and had nothing to do with the design of the product, they're more likely to be candid. Only psychopaths enjoy hurting someone's feelings, so most participants will be overly positive if they're worried about hurting yours. Second, play dumb. If a user is struggling and asks for guidance, it's easy to nudge them in the right direction, but that can have a negative impact on the study. Instead of the nudge, ask them what they think or what they expect. If they ask whether a specific button is the "right" answer, flip the question back on them: "What would you expect to happen if you pressed it?" Next, lie about the purpose of the study. This is slightly more ethically grey, though if you tell the user the true purpose at the end, all is well. By saying something like "We're just making sure things are functioning as you would expect them to" instead of going into great detail, you avoid contaminating your results by clueing the participant into what to expect. If you have people observing the test, lie about the number watching. If the participant knows they have an active crowd listening in, it can give them performance anxiety and make them less responsive. Letting them know they're being observed is important, but try to lessen the impact. Finally, participants need to be reassured that they're doing well. Even if their feedback has been less than stellar, telling them how well they're doing can often open them up, since they know their feedback is helpful.

Slides 39-40: In-person studies are definitely great for qualified answers and getting a user actually using the product, but for quick and quantifiable studies, remote or unmoderated tests are amazingly efficient. The basic idea is that you set a series of simple tasks for a user to attempt to complete, often on a screenshot, wireframe or design comp, and via a software-as-a-service solution you check failure and success rates as well as time on task. This is sometimes referred to as a click test. My services of choice are usabilla.com, UsabilityHub.com and OptimalWorkshop.com.

Slide 41: Other techniques of note: card sorts, surveys, 5-second visual surveys, A/B testing, analytics.
\n","14":"Mainly, it’s a small group of insurgents fighting against a large standing army using their speed and mobility to stage creative and sometimes unprecedented military campaigns.\nFor usability, when you’ve got a large corporation and tens of thousands of dollars for a dedicated research team, the feedback you can get is incredibly valuable, but how can those of us with no budget and a UX team of 1 or none expect to compete?\nYou fight like an insurgent and you get the job done.\nYou use tried and true UX techniques in new and modified ways to turn weeks of testing and 30 pages of documentation into a day of testing and a list of actionable items.\n","3":"1\nHow many A/B tests did we run?\n","31":"So, here are some things to avoid when working on your question\n","20":"Step 3: Form an explanatory Hypothesis.\nWhen we’re dealing with usability testing, this is often the first testable version. Maybe it’s a prototype or a wireframe. It could be a sketch on a napkin or the current version of your product to establish a baseline.\nThe important thing to remember at this point is that you don’t have a solution. You merely have a version to test and iterate upon.\nDefine a question (What's the problem we're solving?)\nThis is the problem our product or website is solving. Long lines at the grocery store, not enough leads being generated, explaining what guerrilla usability is... 
etc.\nGather information and resources (observe) \n(competitive analysis, market research, user surveys/stories)\nForm an explanatory hypothesis (Build a testable version: Product, wireframe, design, prototype)\nTest the hypothesis by performing an experiment and collecting data in a reproducible manner (The product research)\nAnalyze the data\nInterpret the data and draw conclusions that serve as a starting point for new hypothesis\nPublish results\nIterate\n","9":"What is User Experience beside a buzz word and NOT a verb?\n","37":"Other factors that can help you produce a quality test:\nWork from a script. Not only will this help keep each test consistent, it’ll help give you confidence. Don’t just read from the script, though, make sure to make the participant feel at ease by talking to them, not at them.\nKeep to the same skeleton of questions. If a user diverges during a task, different follow-ups are fine.\nAnd remember, like a good website or product, good usability testing takes practice and iteration. Be sure to take time after performing a test to figure out what could work better and always be improving.\n","26":"Who do we test?\nThere are a lot of great answers to this. In a pinch, the main answer is anyone you can get your hands on. If the choice is no testing or testing on a random person on the street, I’ll always take random person on the street.\nThat being said, the closer to your actual market the better. If you run a retirement community, don’t test on a college student. If you a skate shop, don’t test IN the retirement community.\nIf you have a site or product already and are testing new functionality, it’s easy to grab a cross-section of actual users. 
You can put a screener form on your website easily enough and ask a few questions to allow you to get the best group of users for the specific test.\nThere are services out there that can help you build screeners and recruit participants, but those can be costly on top of a cost of paying those participants decently for their time. Most of the time a standard web form will net you the people you need.\nIf all else fails, mine your family, your friends and your social media acquaintances for participants and convince them to spend 10-15 minutes helping you.\nOnce we’ve got an idea of where we can get our participants, it’s time to consider\n","15":"Often times the first step is the hardest. I’ve always been a proponent of jumping right into the deep end.\nI’d encourage you to get into the same room as a user and run them through your product and get real-time feedback.\nHow can you do that?\nBefore we sit you down with your first user, however, let’s make sure we’ve got our brains geared up for the splash. Let’s take trip down memory lane… All the way to high school science!\n","4":"1\nHow many remote usability studies?\n","32":"Try to avoid asking leading questions\nInstead of “What do you like about X?” Try asking "What did you think about X?”\nInstead of "How would you contact this company?” Try "You've decided this company may be a good fit, what is your next step?”\n","21":"Step 4: Testing the hypothesis. Ahhhh the test. This is where we put the idea in front of a user and gather data on how they perceive and use it. This is the meat and the potatoes. We’ll get into the actual testing in a few slides.\nDefine a question (What's the problem we're solving?)\nThis is the problem our product or website is solving. Long lines at the grocery store, not enough leads being generated, explaining what guerrilla usability is... 
etc.\nGather information and resources (observe) \n(competitive analysis, market research, user surveys/stories)\nForm an explanatory hypothesis (Build a testable version: Product, wireframe, design, prototype)\nTest the hypothesis by performing an experiment and collecting data in a reproducible manner (The product research)\nAnalyze the data\nInterpret the data and draw conclusions that serve as a starting point for new hypothesis\nPublish results\nIterate\n","10":"There are a lot definitions of what User Experience is, I think the Nielson Norman Group sums it up well.\n“All aspects of the end-user’s interaction with the company, its services and its products”\nIf you don’t know the Nielson Norman group, they’re one of the pioneering usability companies. They’ve been around forever.\nAs for the definition, I think it’s great, but if you’ll allow me a moment of grandeur:\nUX is not just design; it’s not just development or information architecture or interaction design or testing. It’s the space between the user and your brand. It IS the user and it IS your brand. UX is everything.\nOK, delusions of grandeur over.\n"}
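The slides list A/B testing among the lightweight guerrilla techniques. As a minimal sketch of how you might check whether an A/B result is more than noise, here is a stdlib-only two-proportion z-test; the function name and the traffic numbers are illustrative, not from the talk:

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test for a simple A/B conversion comparison."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two proportions
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical numbers: variant A converts 5.0%, variant B converts 6.5%
z, p = ab_test_z(conversions_a=120, visitors_a=2400,
                 conversions_b=156, visitors_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With a p-value below 0.05 you would typically call the difference significant; with small guerrilla-sized samples, expect the test to demand a fairly large gap before it clears that bar.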