Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites Susan [Gardner] Archambault Kenneth Simon
Loyola Marymount University • Private Catholic University in Los Angeles, California • 5900+ undergraduates and 1900+ graduates • William H. Hannon Library Information Desk open 24/5
Research Question • What is the most effective way to provide access to our Library FAQs? • A comparison of two products: How Do I? and LibAnswers. Which features do students prefer, and which features lead to better performance?
Methodology • Conducted usability testing on 20 undergraduate students at LMU • Sample was evenly distributed across class levels (freshmen through seniors), with a 60:40 ratio of females to males
Methodology • Used a combination of the Performance Test methodology and the Think-Aloud methodology
Methodology • Students given 10 performance tasks to complete at a computer twice - once using LibAnswers as starting point, and once using How Do I? • After each performance task, students given questionnaire measuring satisfaction with site
Methodology • Audio recorded and computer screen activity captured via “ScreenFlow” screencasting software
Additional Questions • How likely would you be to use each page again? • What was your favorite aspect of each site? • What was your least favorite aspect? • Overall, do you prefer LibAnswers or How Do I?
Performance Scoring: Speed • Start the clock when the person begins searching for the answer to a new question on the home page of the site they are testing • Stop the clock when they copy the URL with the answer
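As a rough illustration of this scoring rule, here is a minimal sketch (not part of the study’s actual tooling) that computes time-on-task from hypothetical start/stop timestamps noted while reviewing the screencast recordings; the function name and timestamp format are assumptions.

```python
# Hypothetical sketch of the speed scoring described above (not the study's tooling).
# Timestamps are assumed to be HH:MM:SS offsets read from the ScreenFlow recording.
from datetime import datetime

def time_on_task(start: str, stop: str) -> float:
    """Seconds from starting the search on the home page to copying the answer URL."""
    fmt = "%H:%M:%S"
    return (datetime.strptime(stop, fmt) - datetime.strptime(start, fmt)).total_seconds()

# One participant, one question: search began at 0:01:05, answer URL copied at 0:01:42
print(time_on_task("00:01:05", "00:01:42"))  # 37.0 seconds
```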
Performance Scoring: Efficiency • Count the number of times the person made a new attempt, or started down a new path, by returning to the home page *after* a previous attempt away from or on the homepage failed
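To make the efficiency rule concrete, here is a hedged sketch (again, not the study’s actual tooling) that counts “new attempts” from a hypothetical page-visit log; the page labels and log format are illustrative assumptions.

```python
# Hypothetical sketch of the efficiency scoring described above (not the study's tooling).
# A "new attempt" is counted each time the participant returns to the home page
# after a previous path away from it failed.
HOME = "home"

def count_new_attempts(visits: list[str]) -> int:
    """Count returns to the home page that follow at least one non-home page view."""
    attempts = 0
    left_home = False
    for page in visits:
        if page == HOME:
            if left_home:          # came back after an unsuccessful path
                attempts += 1
                left_home = False
        else:
            left_home = True
    return attempts

# One failed path (the "hours" page) before the answer was found on a second try
print(count_new_attempts(["home", "hours", "home", "faq-answer"]))  # 1
```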
Patterns • Overall, 9 of 20 performed worse with the site they said they preferred. • 4 of 5 freshmen performed worse with the site they said they preferred. Upperclassmen were more consistent. • Females tended to perform better with their preferred site; males did not. • 75% of the males preferred How Do I? over LibAnswers, while females were evenly divided.
LibAnswers Likes • Keyword search “like a search engine” • Autosuggest in search bar • Popular topics list • Friendly / pleasant to use • Don’t have to read through categories Dislikes • Overwhelming interface / cluttered • Long list of specific questions but hard to find the info you want • Less efficient than the “How Do I” page • Once you do a search, you lose your original question • Autosuggestions are ambiguous or too broad, and sometimes don’t function properly
How Do I? Likes • Fast / efficient to use • Everything is right there in front of you: “I don’t have to type, just click” • Simple, clearly laid out categories • Organized and clean looking Dislikes • Less efficient than the LibAnswers page: have to read a lot • Too restricted: needs a search box • Have to guess a category to decide where to look • Limited number of too-broad questions • Boring / basic appearance
Sharing results with Springshare • Retain question asked in search results screen. • Add stopwords to search, so typing “How do I” doesn’t drop down a long list of irrelevant stuff, and “Where is” and “where are” aren’t mutually exclusive. • Remove “related LibGuides” content to reduce clutter. • Control the list of “related questions” below an answer: they seem to be based only on the first topic assigned to a given question.
But wait… There is another.
Conclusions • Ended up with a balance between two extremes rather than one or the other • Think-aloud method: gave up control; no preconceived ideas could influence outcome • Sitting in silence watching the participants made them nervous. Next time maybe leave the room and have a self-guided test • Efficiency is difficult to measure: moved away from counting clicks
Acknowledgements Thank you: • Shannon Billimore • Jennifer Masunaga • LMU Office of Assessment/Christine Chavez • Springshare
Bibliography • Ericsson, K. A., & Simon, H. A. (1980). Verbal reports as data. Psychological Review, 87(3), 215-251. • Smith, A., Magner, B., & Phelan, P. (2008, November 20). Think aloud protocol, part 2 [Video]. Retrieved May 3, 2012, from http://www.youtube.com/watch?v=dyQ_rtylJ3c&feature=related • Norlin, E. (2002). Usability testing for library web sites: A hands-on guide. Chicago: American Library Association. • Porter, J. (2003). Testing the three-click rule. Retrieved from http://www.uie.com/articles/three_click_rule/ • Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage Publications.
Additional Information • Presentation slides: bit.ly/gardnersimon • Contact us: Ken Simon, Reference & Instruction Technologies Librarian, Loyola Marymount University (Twitter: @ksimon, Email: ksimon@me.com) • Susan [Gardner] Archambault, Head of Reference & Instruction, Loyola Marymount University (Twitter: @susanLMU, Email: susan.gardner@lmu.edu)