Qualifications
User Testing &
Design Research

January 24, 2013
Who we are

Analytic Design Group Inc (ADGi) is a visionary user experience
strategy and design firm that specializes in innovating in digital
environments, leveraging in-depth primary research to expose
unexamined assumptions. Our work withstands not only the
complexity of multiple agendas and intricate implementation but
also the scrutiny of the public.

Founded in 2005 on the principle that evidence-based design will
always be more powerful than design driven by best practices, we
have grown from a single practitioner to a vibrant, collaborative
team.

Some of our clients include:
Samsung, Sony, AT&T, Adobe, Nokia, LG, Motorola
Our Services
Key service areas include: design research, user
experience strategy development, interaction design,
communication design, and usability testing. Lately our work has
also included service design considerations. Our projects can
include the full sweep of user
experience services (i.e. user research through strategy and
design) or just one element. Our aim is to always fit the work
required to the need, and we’ll work with you to ensure you
are getting the best value from our efforts.


This presentation focuses on our design research and user
testing services.




/ Design Research / / Usability Testing / / Communication Design / / Interaction Design / / User Experience Strategy Development /
Design Research
We use a diverse set of design research methodologies:


Surveys — we have used surveys to establish baseline data
(largely attitudinal), help to segment audiences, and in
some cases, help to identify core issues that can be further
explored by other research.


Context-rich group interviews (like marketing focus groups
but much richer) — the focus groups we do are typically
very rich and usually drive out a great deal of contextual as
well as attitudinal data. We usually ask participants to
complete homework prior to the session (this aids in grounding
the user and supports contextual data gathering), and we
include some form of participatory design exercise to allow
participants to tap into their feelings and attitudes quickly.
On-site observation (without interviews) — this is useful when we are
looking for issues that are process related.


Task analysis — this is usually both an expert review and then a
walkthrough with participants to identify particular pain points with
certain tasks. This often involves both offline and online elements.

[Chart: sample task completion rates — Task 1: 50%, 70%, 80%]
Expert review/Heuristic analysis — this can be a quick and cost-
effective means of identifying user experience and usability issues.
We typically rank severity of issues identified and can include an
accessibility review in this process.


Card sorting — we have done card sorting exercises in both
one-on-one and group sessions. We’ve used both open and closed
card sorts and typically use the findings to develop information
architectures.
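Open-sort results like these are commonly rolled up into a pairwise co-occurrence count (how often two cards land in the same group) before an information architecture is drafted. A minimal sketch of that aggregation, with entirely hypothetical card names and sessions:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(sorts):
    """Count how often each pair of cards was placed in the same group.

    `sorts` is a list of sessions; each session is a list of groups
    (sets of card labels) produced by one participant.
    """
    pairs = Counter()
    for session in sorts:
        for group in session:
            # sorted() gives a canonical (a, b) key for each pair
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Two hypothetical participants sorting four cards:
sessions = [
    [{"plans", "pricing"}, {"support", "contact"}],
    [{"plans", "pricing", "support"}, {"contact"}],
]
counts = cooccurrence(sessions)
print(counts[("plans", "pricing")])  # → 2 (grouped together by both)
```

High co-occurrence counts suggest cards that belong under the same category in the final structure.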


Diary studies — these are useful when we are looking at processes that
occur over a longer period of time, or at the impact of certain
things over time.
In-Situ & Ethnographic

We have conducted numerous ethnographic and in-situ studies on
a wide range of physical and digital products. These are typically
very data rich and yield in-depth, tactical, near-term findings
as well as robust, strategic, longer-term insights. Our clients
report that, along with solving nagging problems, these studies
help them focus their product management for a year or more.

For example, last year ADGi conducted an ethnographic study for
a mobile carrier on a device experiencing high returns. We were
able to identify key usability and service design issues, and
deliver insights about how their customers currently perceived
these devices and how they were likely to perceive them for the
foreseeable future.
Sample Report
User Testing
The range of user testing methods we use include:


Metrics-based usability studies — the usability studies we do
are rich with quantitative (metrics) data as well as
qualitative data. We typically collect task time, performance,
SUS, satisfaction, and hedonic scores.
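The SUS figure mentioned here is computed with the standard published System Usability Scale scoring rule (odd-numbered items contribute response − 1, even-numbered items contribute 5 − response, and the sum is scaled by 2.5). A quick sketch of that arithmetic:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1–5 Likert responses.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is multiplied
    by 2.5 to yield a 0–100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i=0 is item 1 (odd)
    return total * 2.5

# A participant answering 4 to every odd item and 2 to every even item:
print(sus_score([4, 2] * 5))  # → 75.0
```

Scores are usually reported as a mean across participants; 68 is the commonly cited average benchmark.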


Remote-moderated usability studies — using tools such as
WebEx (or other screen-sharing tools), we have successfully
conducted remote moderated testing, collecting similar (or the
same) metrics as we do for in-person tests. This is particularly
useful when participants are geographically dispersed, or when
the user’s context heavily influences their interaction and
on-site observation is not feasible.
‘Listening-lab’ style user testing — this is essentially user testing
without a set task list. There is some hard data we draw out of
these sessions, but mainly this is focused on qualitative data.


Un-moderated usability testing — this is user testing where the
user is in the lab and observed and recorded but completing the
tasks on their own.


ADGi Field Test — this is a web-based tool we developed in house
that automates a field test: participants are asked via email
whether they wish to participate. If they accept, they are sent
a set of instructions or tasks to complete, along with an NDA
reminder. After a set number of days, participants are sent a
survey to fill out. From a test administration point of view, we
can track all the participants and where they are in the study,
get a graphical view of how they responded to each question, and
download a CSV of the results for additional analysis.
We’ve used this tool to test devices and apps.
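The administration side of a field test like this boils down to tracking each participant's stage in the pipeline and exporting responses for analysis. A hypothetical sketch (stage names, field names, and addresses are all invented for illustration, not ADGi's actual schema):

```python
import csv
import io

# Hypothetical stages a participant moves through in an automated field test.
STAGES = ["invited", "accepted", "tasks_sent", "survey_sent", "completed"]

participants = [
    {"email": "p1@example.com", "stage": "completed", "q1": "4", "q2": "5"},
    {"email": "p2@example.com", "stage": "survey_sent", "q1": "", "q2": ""},
]

def progress_report(people):
    """Summarize how many participants sit at each stage of the study."""
    return {s: sum(1 for p in people if p["stage"] == s) for s in STAGES}

def to_csv(people):
    """Export responses as CSV text for downstream manipulation."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["email", "stage", "q1", "q2"])
    writer.writeheader()
    writer.writerows(people)
    return buf.getvalue()

print(progress_report(participants))
```

A real tool would add the email scheduling and the per-question charts; the point here is only the state tracking and export.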
Navigation testing — this is another tool we developed in
house to test navigation structures. Users are asked a series of
questions about the categories and labels under which they would
expect to find certain pieces of information. They are shown
the tree structure for the site and can navigate through it to the
spot where they would expect to find the content. This testing
has been very effective in establishing how findable
content on very large sites will be, and in determining the
effectiveness of categorization and labeling schemes.
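A tree test of this kind reduces to comparing each participant's chosen path through the tree against the expected node. A minimal sketch, with an invented site tree and answers (none of this is the actual tool):

```python
# The site tree as nested dicts: each key is a label, each value its children.
tree = {
    "Home": {
        "Products": {"Phones": {}, "Tablets": {}},
        "Support": {"Warranty": {}, "Repairs": {}},
    }
}

def path_exists(tree, path):
    """Check that a clicked path is a valid route through the tree."""
    node = tree
    for step in path:
        if step not in node:
            return False
        node = node[step]
    return True

def success_rate(answers, correct):
    """Fraction of participants who ended at the expected node."""
    return sum(1 for a in answers if a == correct) / len(answers)

answers = [
    ["Home", "Support", "Warranty"],
    ["Home", "Products", "Phones"],   # a miss
    ["Home", "Support", "Warranty"],
]
print(round(success_rate(answers, ["Home", "Support", "Warranty"]), 2))  # → 0.67
```

Per-question success rates like this are what drive conclusions about findability and labeling.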


Concept acceptance testing — this is useful for trying out a
new concept, typically while comparing it to other, more familiar
ones. We’ve used this on devices when a client wants to
evaluate a new way of navigating or a different form factor.
Competitive benchmark testing — this is useful when
comparing a product (interface, device, site) against one or
more others. We have used this to set benchmarks for future
comparison as well as for one-off comparisons.


Iterative testing — we test one or at most two discrete
elements with a very small set of users (two or three), make
recommendations based on that testing, the development team
makes the changes, and we test again until we no longer see
the need for changes. We use this method primarily in games
research, looking at a particular interaction. While other
clients have asked about this approach, after discussing it we
have so far determined that its value does not warrant the
effort and cost for the project at hand.
Remote user testing — we have the ability and experience to
execute remote usability testing, inclusive of screen sharing
and audio and video recording.


We have experience conducting remote-moderated usability
sessions as well as focus groups. We screen share and
capture (audio and video record) the sessions. We have
found that this type of research can be very cost effective,
and it is especially useful when we are asking participants to
log in to their own accounts, or when they are geographically
dispersed. On occasion we’ve also found that with the
user located in their own environment, we are able
to glean more contextual information than we typically
can in the lab.
Sample Report
Mobile Test Lab

Mobile test lab — we conduct a great deal of testing
on mobile devices, and our lab setup is both flexible
and powerful:


Our testing equipment is deliberately flexible so that
we can set up in a lab environment, a coffee shop, or a
person’s home or office. We have designed a very
stable yet flexible camera mount that allows us to
capture a variety of interactions. Assuming we can
connect to stable WiFi, we can also live stream (to
allow remote viewing) outside of a lab environment.
For more information view this presentation:
Mobile Usability: What's Your Strategy
Karyn Zuidinga
Principal & Director of User Experience

604.669.7655
karyn@analyticdesigngroup.com

www.analyticdesigngroup.com
@analytic_design
