Museums and the Web 2001 Workshop: Conducting a Heuristic Evaluation of a Museum Web Site to Improve its Usability. David Farkas, University of Washington, College of Engineering; Werner Schweibenz, University of Saarland, Dept. of Information Science
Overview 1 Usability - Definitions 2 Usability Engineering for the Web 3 Methods for Usability Engineering 4 Heuristics for Evaluation 5 Application of the Findings 6 Conclusion
Usability - Definitions "Usability is the measure of the quality of the user experience when interacting with something – whether a web site, a traditional software application, or any other device the user can operate in some way or another." (Jakob Nielsen) See also ISO 9241, Ergonomic requirements for office work with visual display terminals (VDTs)
Usability and functionality • Usability and functionality are attributes of a product. • Functionality refers to what you can do with the product. Usability refers to how users actually use the product. • "The functionality exists. But building functionality into a product, however, doesn't guarantee that people will be able to use it." (Dumas & Redish) • "A product by itself has no value; it has value only insofar as it is used. Use implies users." (Dumas & Redish)
2 Usability Engineering for the Web • The Web is a complex graphical user interface but has to be instantly usable. • "Usability rules the Web. Simply stated, if the customer can’t find a product, then he or she will not buy it." (Nielsen 2000) • In the context of the World Wide Web, "usability refers to how easy it is to find, understand and use the information displayed on a Web site." (Keevil 1998)
The Situation on the Web • Nielsen estimated that there would be about 10 billion Web pages on the Internet by the year 2001. • Traditional user testing ("deluxe usability") in the laboratory is too expensive to do. • Discount usability engineering is our only hope: we need methods simple enough that people can do usability work, fast enough that people will take the time, and cheap enough that it's still worth doing. • Heuristic evaluation is the right tool for discount usability engineering on the Web.
3 Methods for Usability Engineering • Usability engineering is a set of methods for designing user-friendly products. • There is a wide variety of methods: • Expert-focused (analytical) methods, e.g. heuristic evaluation with experts as "surrogate users." • User-focused (empirical) methods, e.g. laboratory tests with actual users and the thinking-aloud method. • A combination of heuristic evaluation and laboratory testing provides the greatest value from each method.
Heuristic Evaluation • "Heuristic evaluation is a way of finding usability problems in a design by contrasting it with a list of established usability principles." (Nielsen 1997) • Advantage: five evaluators can find some 75 per cent of the usability problems (see the sketch below) • Disadvantage: experts cannot set aside what they already know; they remain "surrogate users" rather than real users
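The 75 per cent figure can be related to the problem-discovery model of Nielsen and Landauer (1993), in which the share of problems found by i independent evaluators is 1 - (1 - λ)^i, where λ is the probability that a single evaluator detects any given problem. A minimal sketch of that arithmetic (not part of the workshop material; λ = 0.24 is chosen here purely so that five evaluators land near the quoted 75 per cent):

```python
# Problem-discovery model (Nielsen & Landauer 1993): the expected share of
# usability problems found by i independent evaluators is 1 - (1 - lam)**i.
# lam = 0.24 is an illustrative value chosen to match the 75% figure above;
# in practice it varies from project to project.
def share_found(evaluators: int, lam: float = 0.24) -> float:
    """Expected proportion of usability problems uncovered."""
    return 1 - (1 - lam) ** evaluators

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators -> {share_found(i):.0%}")
# 1 -> 24%, 3 -> 56%, 5 -> 75%, 10 -> 94%: returns diminish quickly,
# which is why small teams of evaluators are considered sufficient.
```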
Laboratory Testing with Thinking Aloud • Thinking aloud "is the most fundamental usability method and is in some sense irreplaceable, since it provides direct information about how people use computers and what their exact problems are with the concrete interface being tested." (Nielsen) • Advantage: tests supply a huge amount of qualitative data that show how actual users handle the product. • Disadvantages: tests take place in a laboratory situation and a lot of equipment and coordination is necessary to conduct the test.
Usability Lab [diagram] The participant's workplace with computer screen, keyboard, and a microphone for thinking aloud; one video camera records the actions on the screen, a second records the participant; a mixing console combines the camera signals, which are captured by a video recorder and TV; the test manager and a technical assistant observe and take notes.
Usability Engineering Lifecycle "Usability engineering is not a one-shot affair where the user interface is fixed up before the release of the product. Rather, usability engineering is a set of activities that ideally take place throughout the lifecycle of the product, ..." (Nielsen)
The Web usability engineering lifecycle [diagram] The Web site undergoes a heuristic evaluation, which yields detected usability problems with severity ratings; these inform the development of test material (questionnaires and test tasks) for user tests with thinking aloud; the data from the user tests and from the heuristic evaluation are analysed together, leading to suggestions for the redesign.
4 Heuristics for Evaluation • Simple heuristics, e.g. the 10 basic heuristics of Molich & Nielsen (1990) • Complex checklists, e.g. Keevil's Usability Index (1998) • Complex heuristics, e.g. the Heuristics for Web Communication (2000)
Heuristics of Molich & Nielsen • Simple and natural dialogue • Speak the user's language • Minimize the user's memory load • Consistency • Feedback • Clearly marked exits • Shortcuts • Precise and constructive error messages • Prevent errors • Help and documentation
The Keevil Usability Index The Usability Index consists of 203 questions arranged in five categories: • Finding the information: Can you find what you want? • Understanding the information: After you find the information, can you understand it? • Supporting user tasks: Does the information help you perform a task? • Evaluating the technical accuracy: Is the technical information complete? • Presenting the information: Does the information look like a quality product?
Findings of the Usability Index • The usability index is calculated as the total number of 'yes' answers divided by the total number of 'yes' and 'no' answers. • The 203 questions leave room for interpretation. • In the Saarland Museum study, the Keevil Usability Index for 15 evaluators ranged from 29% to 55%, with an arithmetic mean of 47%. The spread reflects the evaluators' differing interpretations of the questions.
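A minimal sketch of that arithmetic (not part of the original study; the answer labels are invented, and leaving 'not applicable' items out of the count is an assumption made here for illustration):

```python
# Keevil Usability Index: number of 'yes' answers divided by the number of
# 'yes' and 'no' answers.  Labels other than 'yes'/'no' (e.g. 'n/a') are
# simply ignored here - an assumption for illustration.
def usability_index(answers):
    yes = sum(1 for a in answers if a == "yes")
    no = sum(1 for a in answers if a == "no")
    return yes / (yes + no) if (yes + no) else 0.0

one_evaluator = ["yes", "no", "yes", "yes", "n/a", "no"]   # invented answers
print(f"Usability index: {usability_index(one_evaluator):.0%}")   # -> 60%
```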
The Heuristics for Web Communication • The Heuristics for Web Communication are a set of five heuristics focused on information-oriented Web sites. • They were developed for the International Summer Workshop Exploring a Communication Model for Web Design, Seattle, WA, July 10-17, 1999. • They cover all important aspects of Web communication: displaying information, navigation, text comprehension, role playing (i.e. the author-reader relationship), and data collection for analyzing interaction.
The Heuristics for Web Communication • Advantages: they are both design-oriented, i.e. they can be used as guidelines for designing a prototype, and evaluation-oriented, i.e. they can be used for evaluating an existing Web site. Their function in the design process is both idea-generating and troubleshooting. • Disadvantages: they are applicable to information-oriented Web sites only, and they are complex, so it takes some time and effort to learn how to apply them.
The Heuristics for Web Communication • Special issue of the journal Technical Communication, August 2000, Vol. 47, No. 3, available in print and online. • Quicklists for Web Communication. In: Technical Communication Online, August 2000, Vol. 47, No. 3. URL: http://www.techcomm-online.org/shared/special_col/quicklists/menu.htm
Heuristic Evaluation of Web Sites A heuristic evaluation can be conducted in several steps: • Forming teams of 4 or 5 experts as evaluators • Getting to know the Web site (ca. 15 min of browsing) • Individual evaluation of the Web site by each expert using one of the heuristics (ca. 60 to 90 min) • Severity rating of the uncovered problems in teams (ca. 60 to 90 min) • Presentation of the findings of all the involved teams • Collecting the findings and writing an evaluation report
Severity Rating of the Findings Rating categories according to Nielsen: • 0 I don't agree that this is a usability problem at all • 1 Cosmetic problem only: need not be fixed unless extra time is available on project • 2 Minor usability problem: fixing this should be given low priority • 3 Major usability problem: important to fix, so should be given high priority • 4 Usability catastrophe: imperative to fix this before product can be released
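Nielsen also recommends averaging the severity ratings of several evaluators, because ratings from a single evaluator are too unreliable. A minimal sketch of how a team might aggregate and prioritize its findings (the problems and ratings below are invented for illustration):

```python
# Aggregate the 0-4 severity ratings that several evaluators assign to each
# finding and list the worst problems first.  All data here are invented.
from statistics import mean

ratings = {
    "inconsistent link colors":       [2, 1, 2, 3],
    "graphic links without alt text": [3, 4, 4, 3],
    "meaningless animation":          [1, 2, 1, 0],
}

for problem, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    print(f"{mean(scores):.1f}  {problem}")
# 3.5  graphic links without alt text
# 2.0  inconsistent link colors
# 1.0  meaningless animation
```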
Findings of the Heuristics • The heuristics detect a great number of usability problems: inconsistent use of link colors, no text messages for graphic links, complicated sentences, deficits in page structure and organization, lack of informative titles, overuse of bold and italics, meaningless animation, flaws in the author-reader relationship, etc. • These are real usability problems, but they might not be observable in user testing, because average users lack the background knowledge in Web design to recognize them as the cause of their difficulties.
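Some of these checks can be partly automated. A minimal sketch (not from the workshop material; it uses only Python's standard library) that flags one of the problems listed above, graphic links without a text message, by looking for linked images that lack alt text:

```python
# Flag <img> tags inside <a> links that have no (or empty) alt text,
# i.e. graphic links without a text message.
from html.parser import HTMLParser

class GraphicLinkChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.problems = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            self.in_link = True
        elif tag == "img" and self.in_link and not (attrs.get("alt") or "").strip():
            self.problems.append(attrs.get("src", "<unknown image>"))

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

checker = GraphicLinkChecker()
checker.feed('<a href="/map"><img src="map.gif"></a>')
print(checker.problems)   # ['map.gif'] - a graphic link without a text message
```

Such automated checks only catch the mechanical cases; judging whether a text message is actually meaningful still requires a human evaluator.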
5 Application of the Findings The findings of the heuristic evaluation can be used • for a redesign of the Web site. • for generating ideas for the further development of the Web site. • for developing test tasks for laboratory tests with actual users.
6 Conclusion • The Heuristics for Web Communication proved to be applicable tools for heuristic evaluation. • The heuristics support a structured evaluation and help both to find and to solve usability problems. • In contrast to simple checklists, they give the evaluators some scope for interpretation while offering guidance at the same time. • A heuristic evaluation can be completed by a few evaluators in a few days.
Thanks for listening. Any questions or comments? For information on the heuristics please contact David Farkas, farkas@u.washington.edu, Phone: 206-685-8659. For information on the evaluation please contact Werner Schweibenz, w.schweibenz@rz.uni-sb.de, Phone: +49-681-302-3542.