McKinley NDF2013 crowdsourcing
1. Evaluating crowdsourcing websites
(Why evaluation isn’t a party at the end)
Donelle McKinley
PhD candidate, Victoria University of Wellington
Supervisors: Dr Sydney Shep and Dr Brenda Chawner
www.digitalglam.org
National Digital Forum conference, 27 November 2013, Wellington, New Zealand
7. Heuristics for user interface design
• Heuristics can resemble high-level (conceptual) design principles (Rogers et al., 2011)
• Heuristics serve as both criteria to guide the design process and a basis for evaluation (Cockton et al., 2012; Hartson & Pyla, 2012)
• Heuristic evaluation is inexpensive, informal, and relatively intuitive; people are easily motivated to participate, it requires no advance planning, and it can be used early in the development process (Nielsen & Molich, 1990) (see the sketch below)
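
By way of illustration, the sketch below shows one minimal way an evaluator might record heuristic evaluation findings, assuming Nielsen's 0–4 severity ratings. The heuristic names, fields, and example finding are hypothetical and are not drawn from the NZ-RED evaluation itself.

# Minimal sketch: recording heuristic evaluation findings with severity ratings.
# Assumes Nielsen's 0-4 severity scale; all names and the example are hypothetical.
from dataclasses import dataclass

SEVERITY = {
    0: "not a usability problem",
    1: "cosmetic problem only",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}

@dataclass
class Finding:
    heuristic: str    # which heuristic the problem relates to, e.g. "Error prevention"
    location: str     # where in the interface the problem was observed
    description: str  # what the evaluator saw
    severity: int     # 0-4 rating per Nielsen's severity scale

    def summary(self) -> str:
        return f"[{SEVERITY[self.severity]}] {self.heuristic}: {self.description} ({self.location})"

# Hypothetical example finding:
finding = Finding(
    heuristic="Error prevention",
    location="Contribution form",
    description="Free-text date field accepts malformed dates",
    severity=3,
)
print(finding.summary())

A record like this makes it straightforward to sort findings by severity and decide which interface problems to address first.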
10. Requirements for a NZ-RED task interface
1. Minimize user effort
2. Support integration of the task with research processes
3. Enable new visitors and contributors to understand what the task involves quickly and easily
4. Support accurate and controlled data entry
5. Be easy to use for people reasonably confident with the Web
6. Support flexible, structured data entry
7. Support bilingual data entry (see the sketch after this list)
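
To make requirements 4, 6 and 7 more concrete, the sketch below shows one way a task interface could validate a submitted entry: a controlled vocabulary for one field, a flexible but structured date format, and a free-text note accepted in either language. The field names, the vocabulary, and the assumption that "bilingual" means English and te reo Māori are illustrative guesses, not taken from the actual NZ-RED or UK-RED interfaces.

import re

# Minimal sketch of "accurate and controlled", "flexible, structured" and
# "bilingual" data entry (requirements 4, 6 and 7). All field names and the
# controlled vocabulary are hypothetical.
EVIDENCE_TYPES = {"diary", "letter", "memoir", "newspaper", "other"}  # controlled vocabulary
DATE_PATTERN = re.compile(r"^\d{4}(-\d{2}-\d{2})?$")  # a year alone, or a full ISO date

def validate_entry(entry: dict) -> list[str]:
    """Return validation errors for one submitted entry (empty list = accepted)."""
    errors = []
    if entry.get("evidence_type") not in EVIDENCE_TYPES:
        errors.append("evidence_type must be one of the controlled values")
    if not DATE_PATTERN.match(entry.get("date", "")):
        errors.append("date must be YYYY or YYYY-MM-DD")
    # Bilingual entry: accept the note in either assumed language.
    if not (entry.get("note_en") or entry.get("note_mi")):
        errors.append("provide a note in English (note_en) or te reo Maori (note_mi)")
    return errors

# Hypothetical submission:
print(validate_entry({"evidence_type": "diary", "date": "1893", "note_en": "Read aloud to the family."}))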
11. What did I learn?
• The UK-RED task interface only partially meets four of the seven NZ-RED requirements
• Using an existing project ‘template’ may not be the most effective way to serve the needs of your volunteers or your project objectives; the only way to determine this is to evaluate it
• Heuristic evaluation can be an efficient and effective method of website evaluation