Usability testing for library catalogs October 25, 2001 Nicole Hennig, Web Manager libraries.mit.edu libraries.mit.edu/barton
Thank you: Tracy Gabridge, Librarian for Civil & Environmental Engineering • led the HTML customization team
Details available ... http://macfadden.mit.edu:9500/webgroup/usability2001/barton/test1/overview.html
Outline 1. background 2. the tests 3. problems & solutions 4. future directions
6-month process: January - June 2001 • old system: GEAC Advance • new system: ExLibris ALEPH
Web OPAC project teams • web OPAC team - public service librarians - circulation staff - processing staff - cataloger - web manager
Web OPAC project teams • HTML customization team same as previous, plus - systems office staff - programmer
Bibliography • on handout • includes background on display and interface design of library catalogs
Background research • a lot of research on OPAC design is available • but it is not based on observing users or usability testing • library system vendors are not following basic good design principles
Who makes design decisions? • we have more control now that we can customize HTML screens • the vendors need to practice good design in building the system
A work in progress • libraries.mit.edu/barton • more rounds of testing and improvements are coming later in the spring
Usable design goals • every page is self-explanatory • “self-teaching” interfaces
Will it apply? • some things are specific to ExLibris systems • many things are general - could apply to any OPAC
General principles • success summary http://macfadden.mit.edu:9500/webgroup/usability2001/barton/test2/success.html
The test • we had already done extensive usability testing while redesigning our web site
Latest thinking has changed • 1999: Large test, 30 users, timed people - quantitative • 2001: - More frequent, smaller tests, 5-6 people at a time - qualitative
The test • 1/2 hour long • 10 questions • think out loud
The test • observer takes detailed notes • train observers not to explain how things were supposed to work until the end of the test • each observer tests 2 people (2-week time frame)
Designing questions • easy, basic tasks that a first-time user should be able to accomplish • real-world tasks (give them a real article citation)
Designing the questions • no need to obsess over perfect, “scientific” questions • you will learn plenty from watching people use the catalog
The questions • 1 - 5: known items • 6 - 10: general research • complete list: http://macfadden.mit.edu:9500/webgroup/usability2001/barton/test1/questions.html
The questions • test the questions • get the bugs out • print out the questions in large type
Who we learned from • Washington State University Janet Chisman, et al. “Usability Testing: A Case Study” College & Research Libraries Nov. 1999
What we learned • multi-part questions - if the user can't complete the first part, the observer completes it so they can try the second part
What we looked for • features that were confusing or unclear • aspects of the system that worked well
The tests
              test 1                            test 2
  Who         7 students                        3 students
              3 library staff                   4 library staff
                                                4 disabled users
  Catalogs    our old web catalog: Barton (6)   1st draft of new Barton screens
              McGill: MUSE (2)
              Boston College: QUEST (2)
  Dates       Jan. 22 - Feb. 1, 2001            May 21 - June 1, 2001
  Successes   4 of 10 tasks                     7 of 10 tasks
Problem 1 • people usually picked the default choices or the first choices without thinking much about it (not always the best strategy for their search)
Example (screenshot): people used the first box and ignored the second
Solution Default choice is keyword. This casts a broad net for those who forget to make a choice.
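As a rough sketch of how such a default can be expressed in the customized HTML search form (the field names and option labels here are illustrative, not the actual ALEPH parameters):

    <!-- "Keyword" carries the selected attribute, so users who never touch
         the menu still get a broad keyword search rather than a narrow one -->
    <select name="search_type">
      <option value="keyword" selected>Keyword anywhere</option>
      <option value="title">Title begins with ...</option>
      <option value="author">Author (last name first)</option>
    </select>
    <input type="text" name="search_terms" size="40">

Making the broadest search the pre-selected option means that skipping the menu still returns something useful.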
Problem 2 • Difference between browse & keyword search not clear
Example (screenshot of the separate browse and keyword search options)
Solution No need to know difference between keyword and browse search. Combined in one menu.
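One way to hide the browse/keyword distinction is to let each menu option carry both the index and the search mode, so the form decides which operation to run. A hypothetical sketch (the value encodings and labels are invented for illustration):

    <!-- Each option encodes mode + index; the user only sees plain-language labels -->
    <select name="search_choice">
      <option value="find:keyword" selected>Keyword anywhere</option>
      <option value="scan:title">Title begins with ...</option>
      <option value="find:title">Title keyword</option>
      <option value="scan:author">Author (last name first)</option>
      <option value="find:subject">Subject keyword</option>
    </select>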
Problem 3 • it wasn't clear how to input a search string (people typed initial articles, entered the author's first name first, or thought they had to type the entire title)
Example • carefully typed complete title, with article: The Journal of the American Chemical Society
Solution • include examples and instructions on how to input data near the search box and in the search menu
Examples for each type Example changes when menu changes.
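A minimal sketch of how a hint below the search box can change with the menu, assuming simple client-side JavaScript (the element names and hint texts are illustrative, not the actual catalog markup):

    <select name="search_choice" onchange="showHint(this)">
      <option value="keyword" selected>Keyword anywhere</option>
      <option value="title">Title begins with ...</option>
      <option value="author">Author (last name first)</option>
    </select>
    <input type="text" name="search_terms" size="40">
    <div id="hint">Example: aluminum AND corrosion</div>

    <script type="text/javascript">
      // Swap the example shown under the box whenever the menu changes
      var hints = {
        keyword: "Example: aluminum AND corrosion",
        title:   "Example: journal of the american chemical society (omit initial articles)",
        author:  "Example: hawking, stephen"
      };
      function showHint(menu) {
        document.getElementById("hint").innerHTML = hints[menu.value];
      }
    </script>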
Grouping • group different title searches, author searches, and subject searches together
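One way this kind of grouping could be expressed in the search menu is with HTML optgroup labels, which keep related searches visually together (the labels and values below are illustrative):

    <select name="search_choice">
      <optgroup label="Title">
        <option value="title_begins">Title begins with ...</option>
        <option value="title_keyword">Title keyword</option>
      </optgroup>
      <optgroup label="Author">
        <option value="author_browse">Author (last name first)</option>
        <option value="author_keyword">Author keyword</option>
      </optgroup>
      <optgroup label="Subject">
        <option value="subject_browse">Subject begins with ...</option>
        <option value="subject_keyword">Subject keyword</option>
      </optgroup>
    </select>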
Problem 4 • very busy screens with many buttons were overwhelming for people
Solution • Present choices only where needed • Group navigation links in ways that make sense
Problem 5 • it was difficult to find clickable URLs for electronic titles