Like many libraries, the University of Michigan Library for a long time employed no one for the purpose of website usability. To address the gap, a Usability Group was formed. The structure and methodologies of the group have evolved over the last four years, producing an efficient organization with innovative and highly effective techniques. Twenty-eight staff members have contributed to this group. Six different systems have been evaluated, resulting in over 30 reports and hundreds of recommendations.
Although resources are evaluated using a wide range of traditional techniques (formal testing, focus groups, surveys, heuristic evaluations, prototype testing, etc.), the group strongly believes that usability doesn't have to be complicated and time-consuming, favoring more straightforward "budget" techniques as the means to the most interesting and useful results. The group also often takes an iterative approach, repeating and refining tests to evaluate the effectiveness of changes and to fine-tune techniques.
This presentation will describe the Usability Group's techniques and findings from our most recent projects to evaluate the library's recently launched website. Specifically, we will describe methodologies and present testing materials and results from "guerrilla" testing, group card sorting, and participatory design sessions with undergraduates, graduate students, faculty, and staff. Participants will be able to apply these methods in their own libraries.
Usability for Tough Times
1. Usability for Tough Times Suzanne Chapman User Experience Department Ken Varnum Web Systems Department Image by flickr user alancleaver_2000 Budget Usability
29. Questions? All past reports: www.lib.umich.edu/usability Suzanne Chapman (suzchap@umich.edu) Ken Varnum (varnum@umich.edu)
Editor's Notes
Traditionally, most usability work was done through this committee, but the recently created UX department will now also focus on this type of research. Nice to mention that UTF members volunteer and rarely have prior experience, so budget techniques are easy to learn and easy to take back to their departments to use for other purposes! I'm a member of the core team, and also one of its biggest 'customers'.
This is OUR definition and it’s pretty loose.
Ken – maybe you want the slide to just say "Faster" and fill in the bits in parens?
Faster (less time investment for prep)
Easier (less time designing evaluations; cut out time-consuming things like recruiting participants)
Cheaper (don't need fancy software or facilities)
More targeted (have a question, answer it directly)
More staff (with less expertise) can take it on (and maybe just as reliable results)
Doing a couple of budget tests is better than doing nothing. The ramp-up required for formal testing can be prohibitive enough to keep it from getting done at all.
Like voting in Chicago
[This seems better at the end, but I like the idea of the last slide being the online guerrilla.] A good example is our first attempt at guerrilla testing: we were looking to relabel the link to our various delivery services. We asked 9 people what they'd call it and got 9 different answers. It was still interesting and useful, but we had to redesign and redo the test to get a solid answer.
Turning to our site, its goals, and our evaluation of them.
We launched a new website in August 2009, based around two main features: Search and Browse.
Through user studies (both guerrilla and longer-term) we generated data that helped define/confirm the scope of our problem: catalog, federated search, research guides x 3, 300+ database providers, thousands of online journals, dozens of libraries, one-click survey.
Broad themes emerged:
"I can't find anything" (our Dean)
A vehicle for a single identity – the MLibrary brand
Provide a unified user experience
Platonic ideal – eliminate silos: information, not location
The site we have now is a response to these needs. How'd we do, we wondered?
Project priorities:
Gain a better understanding of users' perception and use of the "new" library website (it's now 1 year old!)
Pinpoint & evaluate problem areas
Completed 4 evaluations using 3 different methods in 8 months
Suz intro: ask how many people have done "formal" vs. "informal"
Search language
Talk about how we analyzed the data (by user group, separating out areas) to identify trends
Like most larger libraries, we have a bewildering array of service points that are hard to organize in a way meaningful to the public. We needed to break out of our librarian-centric way of thinking about the org chart. Card sorting.
Group Paper Card Sort with Students
18 participants: undergrads and grad students (divided into 4 groups)
Organized 84 cards representing half of this content
Allowed us to see interaction among students, hear thought processes, and better understand confusing labels
Individual Online Card Sort with Staff
Purchased a license to OptimalSort, allowing us to place the exercise in front of many individuals
140 staff completed the exercise
Provided more data, but didn't expose the thought process
Exploring the results can be tricky. The Task Force also came up with "unified" categories, based on the categories the participants created and the comments they made during the card sort. Several similarities between categories surfaced across the various participant groups, whether they performed a paper sort or used the online tool. Both the similar groupings across participant groups and the Task Force's "unified" categories were suggested as bases for further tests. Implementing them would be a large-scale change that would add significant complexity for users and staff.
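The standard first step in analyzing card-sort data like this is a co-occurrence tally: count how often each pair of cards landed in the same group across participants, so that frequently paired cards suggest candidate categories. A minimal sketch (the card labels and data structure here are illustrative, not from the actual study):

```python
from itertools import combinations
from collections import Counter

def cooccurrence(sorts):
    """Count how often each pair of cards was placed in the same group.

    `sorts` is one card sort per participant; each sort is a list of
    groups, and each group is a list of card labels.
    """
    pairs = Counter()
    for sort in sorts:
        for group in sort:
            # Sort labels so each pair is counted under one canonical key.
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Toy example: two participants sorting four hypothetical cards.
sorts = [
    [["Ask a Librarian", "Contact Us"], ["Hours", "Maps"]],
    [["Ask a Librarian", "Contact Us", "Hours"], ["Maps"]],
]
counts = cooccurrence(sorts)
print(counts[("Ask a Librarian", "Contact Us")])  # grouped by both -> 2
print(counts[("Hours", "Maps")])                  # grouped by one  -> 1
```

Tools like OptimalSort compute a similar similarity matrix automatically for the online sorts; a tally like this is mainly useful for combining paper-sort results with the online data.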
Goal: fine-tune the contents & labels for Quick Links.
Give brief history: we used to have Quick Links, killed them in the new site, heard from lots of faculty that eliminating them was a poor idea, and brought them back.
The Test:
20 participants: undergrads and grad students
Participants were shown the current Quick Links section without its title and asked to name the section and describe where each link went
Then asked what links they would most like to see in a grouping of links like this one
Findings:
"Outages" not understood or considered useful
More than half of users requested the addition of a Webmail link
The Quick Links label works well
Changes: removed/added links, rearranged links, retitled 'Ejournals' -> 'Online Journals' (throughout the site)
URL of survey is at http://umichlib.qualtrics.com/SE/?SID=SV_3rZvKvGPvIkS1ms Still need to set up a TinyURL. And add URL to this slide.