Approaching usability testing at your level (Sit anywhere) Rice Majors, Faculty Director of Libraries Information Technology, University of Colorado Boulder AMICAL Conference – April 2012 – American University of Sharjah
Today’s agenda • Why do usability testing? • Looking at some existing models & methodologies • How to get started and scale upward • Preparing for results
Why would a library do usability testing? Talk in small groups & we will share
Your website • For your website, usability is a necessary condition for survival • If a website is difficult to use, people leave • Users do not read website manuals • A paradigm shift from publisher focus to user focus • Why wouldn’t you want your web presence to be easy to use?
Your intranet • For intranets, usability = employee productivity • Spending 10% of your time on usability will more than double your (Web) quality metrics • Somewhat lower return for software & physical items
Purchased / subscription products • Use the data to validate purchase & subscription decisions • Use the data to help the vendor(s) make data-driven decisions about interface and experience enhancements
Why I am doing user testing • Research goals – generate data to compare the user experiences of discovery tools (to help libraries and vendors) • Business goals – untangle some known challenges with our web “experience” • How do we pass a user back and forth between our website and our web-based platforms (Encore, Research Pro, various databases)? • How do we guide a user toward different paths of article discovery based on their needs (“I have a citation” vs. “I have a topic”)?
Some existing models • Many studies, both more and less formal, published on the web and in articles • University of Minnesota: formal testing in a usability lab • Explore issues of discovery • Assess the effectiveness of Primo (across many metrics) • University of Michigan: guerrilla testing • Compare a single set of design options • Validate a single design decision
Common Methodologies • Card sorts • Open sort – design (“Sort using your own categories”) • Closed sort – validate (“Sort using the provided categories”) • Task completion – assess or validate • Find representative users • Ask the users to perform representative tasks • Observe what the users do, where they succeed, and where they have difficulties with the user interface • Shut up and let the users do the talking
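As a concrete illustration of what open card-sort data can become, here is a minimal Python sketch (not part of the talk) that tallies how often each pair of cards was grouped together across participants; the card names and sorts below are hypothetical.

```python
# A minimal sketch of tallying open card-sort results into pair co-occurrence
# counts: how often participants placed two cards in the same group.
from itertools import combinations
from collections import Counter

# Each participant's sort: {category_label: [cards]} -- hypothetical data
sorts = [
    {"Find stuff": ["Catalog", "Databases", "E-journals"],
     "Get help": ["Ask a Librarian", "Research guides"]},
    {"Search": ["Catalog", "Databases"],
     "Help": ["Ask a Librarian", "Research guides", "E-journals"]},
]

pair_counts = Counter()
for sort in sorts:
    for cards in sort.values():
        for a, b in combinations(sorted(cards), 2):
            pair_counts[(a, b)] += 1

# Pairs grouped together by more participants are stronger candidates
# for sitting under the same heading in the site's navigation.
for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {n} of {len(sorts)} participants")
```

Pairs with high co-occurrence suggest labels that belong together; split or low counts flag the places where task-completion testing is most worth the effort.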
A “real” task I used • encore.colorado.edu • Find all recordings the library owns by The Beatles. Somehow remind yourself to look at these again later. (10 minutes) • If you need to log in, use these provided login credentials, using the Public Patron login prompt: • Last Name Testpatron • ID Number 1234567890 • PIN 0wh1ne (first digit is a zero)
What was good/bad about the design of that task? Talk in small groups & we will share
The good … and the bad • Beatles is a non-unique term in bibliographic data • Task probably involved using facets / refinement features • Involved understanding how formats are represented in results • (User would realize the library has recordings) • “Recordings” • FRBR • Library may have licensed content as well • More than one way to complete the task • Hard to assess whether the user would actually be reminded
Starting can be the hardest part • Identifying goals can make this seem daunting • You have a robust web presence • You have a sense that there are many areas for improvement • Anything you do is better than not doing anything • Cf. shelf-reading
Framing a single question • Possible starting point: Ask users to validate a single decision • What are the actual questions you get? • Printout(s) of possible webpage(s) – possibly just mocked up • “If you wanted to find an article, where would you start?” • “When does Norlin library close today?”
What question would be your first question? Talk in small groups & we will share
Starting simple • Administrative support for this (more on this later!) • Quiet place for doing the test • Mostly time and interest • Designing a test • Recruiting participants • Conducting the tests • Reviewing your data • Laptops have built-in cameras/mics • One license of Morae (or equivalent)
Scaling upward • Work toward making testing normal • Dedicated space? • Ours is shared with other functions (the IT dept. testing new OS bundles; troubleshooting off-campus access) • Proper usability lab is probably out of reach, and that’s ok • Does a library need to do eye-tracking? • Does a library need to have a note-taker AND a video recording?
Finding participants • I started with posters, Twitter/Facebook • But my real success came when a student forwarded the opportunity to a listserv • There is no substitute for real users • Use library employees only for intranet testing
Finding participants • Do you have time for one question to help us improve the library website? • Incentives for longer participation – food for a little time, gift cards for longer amounts of time? • If incentives are not possible, then frame things as single decisions as much as you can • It doesn’t take many participants • Reach consensus (or not) in 6-10 participants • An absence of consensus is informative
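To put a number behind “it doesn’t take many participants,” here is a rough sketch (not from the talk) using the commonly cited problem-discovery model, proportion found = 1 − (1 − L)^n. The per-participant detection rate L is an assumption; roughly 0.31 is the often-quoted average, but it varies by site and task.

```python
# Back-of-the-envelope estimate of how many usability problems a small
# sample surfaces, under the assumed per-participant detection rate L.
L = 0.31  # assumption: average chance one participant hits a given problem

for n in range(1, 11):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} participants: ~{found:.0%} of problems surfaced")
```

Under that assumption, six participants already surface most problems, which is consistent with the 6-10 range above.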
What happens once we have results from our testing? (A few thoughts on preparation)
Those darn results • Be prepared for results that may surprise you • The problems may not be where you thought they were • Be prepared for inconclusive results
Accepting the results • Be prepared for a new power structure • What is your existing power structure? (Design by Committee?) • Emotional decisions vs. data-driven decisions • “I spent a lot of time on this” and/or “those users are dumb” vs. “the design needs improving” • Craigslist is very easy to use
What are the first three things you will do when you return to your library? Talk in small groups & we will share
Further reading • Dumas, J.S., & Redish, J.C. (1993). A practical guide to usability testing. Norwood, NJ: Ablex Publishing. • Dumas, J.S., & Loring, B.A. (2008). Moderating usability tests: Principles and practices for interacting. Burlington, MA: Morgan Kaufmann Publishers. • Rubin, J. (1994). Handbook of usability testing: How to plan, design, and conduct effective tests. New York: Wiley.
Thank you! Rice.majors@colorado.edu Rice Majors, Faculty Director of Libraries Information Technology, University of Colorado Boulder AMICAL Conference – April 2012 – American University of Sharjah