This guide provides an overview of heuristic evaluation, its benefits, and the process involved. It also explores common usability principles and how to apply them to identify and fix interface problems.
SEG3120 User Interfaces Design and Implementation LAB 3: Heuristic Evaluation
Outline • Assignment 1 revisit • Assignment 1 discussion • Heuristic evaluation overview • Heuristic principles • How to perform an HE • Assignment 2 discussion
Assignment 1 Revisit • General notes • Pseudo-code description • The steps taken by the user to do the task • Short, direct, descriptive statements • Not the ideal steps to solve the task • User-dependent (i.e., different users produce different scenarios)
Assignment 1 Revisit • Example • User 1, changing the slide design: • Format → Slide Design • Search for the "Maple" design • Select the design • Choose the "Apply to all slides" option • Start the next task
Assignment 1 Revisit • Malfunctions should target the software, not the user • "The user had difficulty dealing with the mouse" or "The user lost time finding the picture" • These are not software malfunctions in themselves • They are evidence that a malfunction may exist
Assignment 1 Revisit • Malfunction analysis was often poor; it is important for Assignment 2. • Identify the malfunction • Limited aid in inserting pictures • Answer four distinct questions • Q1: How is the malfunction manifested? What do you notice, and who noticed it? • Detected by the user • It occurred when the user tried to insert a specific picture. • Q2: At what stage in the interaction is it occurring? • When the user decides on the next goal.
Assignment 1 Revisit • Q3: At what level of the user interface is it occurring? • Interaction style level • Q4: Why is it occurring? • Poor job satisfaction • List and prioritize possible cures • For example, inserting pictures should be supported through a wizard.
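Not part of the original lab material: a minimal sketch of how the malfunction analysis above could be recorded as structured data, with one field per question and a prioritized list of cures. The class and field names are illustrative assumptions, not a format prescribed by the assignment.

```python
# Hypothetical sketch: structuring one malfunction analysis record.
# Field names are illustrative, not prescribed by the assignment.
from dataclasses import dataclass, field
from typing import List


@dataclass
class MalfunctionAnalysis:
    description: str        # e.g. "Limited aid in inserting pictures"
    manifestation: str      # Q1: how is it manifested, and who noticed it?
    interaction_stage: str  # Q2: at what stage in the interaction does it occur?
    ui_level: str           # Q3: at what level of the user interface?
    cause: str              # Q4: why is it occurring?
    cures: List[str] = field(default_factory=list)  # prioritized, most important first


example = MalfunctionAnalysis(
    description="Limited aid in inserting pictures",
    manifestation="Detected by the user while trying to insert a specific picture",
    interaction_stage="When the user decides on the next goal",
    ui_level="Interaction style level",
    cause="Poor job satisfaction",
    cures=["Provide a wizard for inserting pictures"],
)
```

Keeping one record per malfunction in this form makes it easy to check that all four questions have been answered before the report is written.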
Assignment 1 Revisit • Assignment 1 Discussion
Interface Evaluation • The problem: • Test the suitability/goodness of an interface • Various approaches • Design evaluation: • cognitive walk-through, • review-based evaluation, • and use of models • Implementation evaluation: • controlled experiments, • heuristic evaluation, • questionnaires, • and interviews
Heuristic Evaluation • A type of predictive evaluation • Uses experts as reviewers instead of users • Benefits of predictive evaluation: • Experts know what problems to look for • Can be done before the system is built • Experts give prescriptive feedback
Heuristic Evaluation • To consider: • Reviewers should be independent of the designers • Reviewers should have experience in both the application domain and HCI • Include several experts to avoid bias • Experts must know the classes of users • Beware: novices can do some very bizarre things that experts may not anticipate
Heuristic Evaluation Process • Evaluators go through the UI several times • Inspect the various dialogue elements • Compare them with a list of usability principles (the heuristics) • Consider other principles or results that come to mind • Use violations to redesign and fix problems
Heuristics (original) • H1: Simple & natural dialog • H2: Speak the users' language • H3: Minimize users' memory load • H4: Consistency • H5: Feedback • H6: Clearly marked exits • H7: Shortcuts • H8: Precise & constructive error messages • H9: Prevent errors • H10: Help and documentation
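Purely as an illustration of the bookkeeping behind this process (not part of the course material), the sketch below records which heuristic each observed violation maps to; the UI elements, findings, and report format are assumptions.

```python
# Illustrative bookkeeping only: mapping each observed violation to a heuristic.
heuristics = {
    "H1": "Simple & natural dialog",
    "H2": "Speak the users' language",
    "H3": "Minimize users' memory load",
    "H4": "Consistency",
    "H5": "Feedback",
    "H6": "Clearly marked exits",
    "H7": "Shortcuts",
    "H8": "Precise & constructive error messages",
    "H9": "Prevent errors",
    "H10": "Help and documentation",
}

# Hypothetical findings from one evaluator's pass over the UI.
violations = [
    {"element": "Save dialog", "heuristic": "H4",
     "note": "Uses 'Save' on one screen and 'Write file' on another"},
    {"element": "Search screen", "heuristic": "H5",
     "note": "No progress indicator during a long search"},
]

for v in violations:
    print(f"[{v['heuristic']}] {heuristics[v['heuristic']]}: "
          f"{v['element']} - {v['note']}")
```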
Heuristics (revised set) • H2: Visibility of system status • Inform users about system behavior • For example, pay attention to response time: • 0.1 sec: no special indicators needed • 1.0 sec: user tends to lose track of the data • 10 sec: maximum duration for the user to stay focused on the action • For longer delays, use percent-done progress bars • [Slide image: a "searching database for matches" dialog with a progress bar]
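To make these thresholds concrete, here is a small sketch of my own (not from the lab notes) that picks a feedback style from an estimated operation duration; the thresholds follow the slide, while the function name and the chosen indicators are assumptions.

```python
# Illustrative sketch: choosing feedback based on the response-time limits above.
def feedback_for(estimated_seconds: float) -> str:
    if estimated_seconds <= 0.1:
        return "none"                # feels instantaneous, no indicator needed
    elif estimated_seconds <= 1.0:
        return "busy cursor"         # noticeable, but the user keeps their train of thought
    elif estimated_seconds <= 10.0:
        return "spinner"             # near the limit of the user's attention
    else:
        return "percent-done progress bar"  # long delay: show how much work remains


for t in (0.05, 0.5, 5, 60):
    print(t, "->", feedback_for(t))
```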
Heuristics (cont.) • H4: Consistency & standards
Heuristics (cont.) • H8: Aesthetic and minimalist design • Only relevant information in dialogues
Heuristics (cont.) • H9: Help users recognize, diagnose, and recover from errors • Error messages in plain language • Precisely indicate the problem • Constructively suggest a solution
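As a tiny illustration of H9 (my own example, not from the slides), the two strings below report the same failure; only the second names the problem precisely, in plain language, and suggests a fix. The file name and size figure are invented.

```python
# Illustrative only: the same failure reported two ways.
# Vague, cryptic, and suggests nothing:
bad = "Error 0x80004005: operation failed."

# Plain language, precise about the problem, constructive about the fix:
good = ("Could not save 'report.docx': the disk is full. "
        "Free at least 2 MB of space or choose another folder, then try again.")

print(bad)
print(good)
```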
Heuristics (cont.) • H10: Help and documentation • Easy to search • Focused on the user’s task • List concrete steps to carry out • Not too large
Phases of Heuristic Evaluation 1) Pre-evaluation training • Give evaluators the needed domain knowledge and information on the scenario 2) Evaluation • Evaluators inspect the UI individually; results are then aggregated 3) Severity rating • Determine how severe each problem is (priority) 4) Debriefing • Discuss the outcome with the design team
Severity Ratings • 0 - not agreed to be a usability problem • 1 - cosmetic problem only • 2 - minor usability problem • 3 - major usability problem; important to fix • 4 - usability catastrophe; imperative to fix
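Severity ratings are collected independently from each evaluator and then combined; the sketch below is my own illustration of that bookkeeping (not part of the lab), averaging the per-problem ratings and listing the worst problems first. The problem names and ratings are made up.

```python
# Illustrative sketch: averaging independent severity ratings per problem.
from statistics import mean

# Hypothetical ratings: problem description -> one rating (0-4) per evaluator.
ratings = {
    "Inconsistent 'Save' / 'Write file' labels": [3, 4, 3],
    "No progress bar on long search": [2, 2, 3],
    "Tiny font in the help dialog": [1, 1, 0],
}

# Sort by mean severity, highest first, to set fixing priorities.
for problem, avg in sorted(((p, mean(r)) for p, r in ratings.items()),
                           key=lambda item: item[1], reverse=True):
    print(f"{avg:.1f}  {problem}")
```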
Severity Ratings Example 1. [H4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
Debriefing • Conduct with evaluators, observers, and development team members • Discuss general characteristics of UI • Suggest potential improvements to address major usability problems • Development team rates fix difficulty
Assignment 2 • Individual work • Due date: 10 March • Refer to section 5.4.2 in your notes and to chapter 6 in the textbook • Choose a web site • Skim through the textbook and section E of the course notes, looking for design guidelines regarding: • Graphic design, use of color, fonts, icons, etc. • Response time • Form design and windowing • Study the web site and identify the 10 most important malfunctions.
Assignment 2 • For each malfunction, provide the following: • Identification of the malfunction • The design principle on which you based its identification • The results of the malfunction analysis • What you think could be done to fix the malfunction • A picture highlighting the malfunction, if possible
Heuristic Evaluation • Questions?