
Discount Evaluation

This preview of Part 4 of the project discusses the use of Heuristic Evaluation and Cognitive Walkthrough for evaluating prototypes. It includes guidelines for performing evaluations and summarizing usability conclusions based on collected data.


Presentation Transcript


  1. Discount Evaluation Evaluating with experts

  2. Agenda • Part 4 preview • Heuristic Evaluation • Perform HE on each other’s prototypes • Cognitive Walkthrough • Perform CW on each other’s prototypes

  3. Project part 4 • I’ll make comments on your evaluation plans by Tuesday • Perform the evaluations • Clearly inform your users what you are doing and why. • If you are audio or video recording, I prefer you use a consent form. • Pilot at least once – know how long it’s going to take. • If you need equipment, ask me

  4. Part 4 write up • State exactly what you did (task list, how many, questionnaires etc.) • Summarize data collected • Summarize usability conclusions based on your data • Discuss implications for the prototype based on those conclusions

  5. Discount Evaluation Techniques • Basis: • Observing users can be time-consuming and expensive • Try to predict usability rather than observing it directly • Conserve resources (quick & low cost)

  6. Approach - inspections • Expert reviewers used • HCI experts interact with system and try to find potential problems and give prescriptive feedback • Best if • Haven’t used earlier prototype • Familiar with domain or task • Understand user perspectives • Does not require working system

  7. Heuristic Evaluation • Developed by Jakob Nielsen • Several expert usability evaluators assess system based on simple and general heuristics (principles or rules of thumb) (Web site: www.useit.com)

  8. How many experts? • Nielsen found that about 5 evaluators find roughly 75% of the problems • Above that you find more, but at decreasing efficiency
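The “about 5 evaluators” figure comes from Nielsen and Landauer’s problem-discovery model, found(i) = N(1 − (1 − L)^i), where L is the probability that a single evaluator finds any given problem. A minimal sketch, assuming Python; L is a parameter that varies by study (the slide’s 75%-at-5 corresponds to L ≈ 0.24):

```python
# Nielsen & Landauer's problem-discovery curve: the expected fraction of
# usability problems found by i independent evaluators, where each
# evaluator finds any given problem with probability l.  The default l
# here is chosen to match the slide's figure; reported values vary.
def proportion_found(i, l=0.24):
    """Expected fraction of problems found by i evaluators."""
    return 1 - (1 - l) ** i

for n in (1, 3, 5, 10):
    print(f"{n:2d} evaluators: {proportion_found(n):.0%}")
```

Each additional evaluator mostly rediscovers problems earlier evaluators already found, which is why the curve flattens and more than about five experts is rarely cost-effective.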

  9. Evaluate System • Reviewers need a prototype • May vary from mock-ups and storyboards to a working system • Reviewers evaluate system based on high-level heuristics. • Where to get heuristics? • http://www.useit.com/papers/heuristic/ • http://www.asktog.com/basics/firstPrinciples.html

  10. Heuristics • use simple and natural dialog • speak user’s language • minimize memory load • be consistent • provide feedback • provide clearly marked exits • provide shortcuts • provide good error messages • prevent errors

  11. Nielsen’s Heuristics • visibility of system status • aesthetic and minimalist design • user control and freedom • consistency and standards • error prevention • recognition rather than recall • flexibility and efficiency of use • recognition, diagnosis and recovery from errors • help and documentation • match between system and real world

  12. Groupware heuristics • Provide the means for intentional and appropriate verbal communication • Provide the means for intentional and appropriate gestural communication • Provide consequential communication of an individual’s embodiment • Provide consequential communication of shared artifacts (i.e. artifact feedthrough) • Provide Protection • Manage the transitions between tightly and loosely-coupled collaboration • Support people with the coordination of their actions • Facilitate finding collaborators and establishing contact Baker, Greenberg, and Gutwin, CSCW 2002

  13. Ambient heuristics • Useful and relevant information • “Peripherality” of display • Match between design of ambient display and environments • Sufficient information design • Consistent and intuitive mapping • Easy transition to more in-depth information • Visibility of state • Aesthetic and pleasing design Mankoff et al., CHI 2003

  14. Process • Perform two or more passes through system inspecting • Flow from screen to screen • Each screen • Evaluate against heuristics • Find “problems” • Subjective (if you think it is, it is) • Don’t dwell on whether it is or isn’t

  15. Debriefing • Organize all problems found by different reviewers • At this point, decide what are and aren’t problems • Group, structure • Document and record them

  16. Severity Rating • Based on • frequency • impact • persistence • market impact • Rating scale • 0: not a problem • 1: cosmetic issue, only fixed if extra time • 2: minor usability problem, low priority • 3: major usability problem, high priority • 4: usability catastrophe, must be fixed
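During debriefing, the reviewers’ pooled problems can be sorted by this 0–4 scale to set a fix order. A minimal sketch, assuming Python; the `Problem` class and the example problems are invented for illustration:

```python
from dataclasses import dataclass

# The 0-4 severity scale from the slide.
SEVERITY = {
    0: "not a problem",
    1: "cosmetic issue",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}

@dataclass
class Problem:
    description: str
    heuristic: str  # heuristic the problem violates
    severity: int   # 0-4, per the scale above

# Hypothetical problems pooled from several reviewers at debriefing.
found = [
    Problem("Inconsistent capitalization in menus", "be consistent", 1),
    Problem("Save silently overwrites the existing file", "prevent errors", 4),
    Problem("Error dialog offers no way to cancel", "provide clearly marked exits", 3),
]

# Fix order: highest severity first.
for p in sorted(found, key=lambda p: p.severity, reverse=True):
    print(f"[{p.severity}: {SEVERITY[p.severity]}] {p.description} ({p.heuristic})")
```

In practice each reviewer rates each problem independently and the ratings are averaged, since individual severity judgments are noisy.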

  17. Advantages • Few ethical issues to consider • Inexpensive, quick • Getting someone practiced in method and knowledgeable of domain is valuable

  18. Challenges • Very subjective assessment of problems • Depends on the expertise of reviewers • Why are these the right heuristics? • Others have been suggested • How to determine what is a true usability problem • Some recent papers suggest that many identified “problems” really aren’t

  19. Let’s practice: PAL

  20. Heuristics • use simple and natural dialog • speak user’s language • minimize memory load • be consistent • provide feedback • provide clearly marked exits • provide shortcuts • provide good error messages • prevent errors

  21. Your turn • Prepare • Determine your heuristics • Demo your prototype to all of us • Evaluate • HA eval Mini-e • Mini-e eval GroupOne • GroupOne eval HA • Debrief • On your own, include in Part 4

  22. Cognitive Walkthrough More evaluation without users

  23. Cognitive Walkthrough • Assess learnability and usability through simulation of the way novice users explore and become familiar with an interactive system • A usability “thought experiment” • Like a code walkthrough (s/w engineering) • From Polson, Lewis, et al. at the University of Colorado Boulder

  24. CW: Process • Construct carefully designed tasks from system spec or screen mock-up • Walk through (cognitive & operational) activities required to go from one screen to another • Review actions needed for task, attempt to predict how users would behave and what problems they’ll encounter

  25. CW: Assumptions • User has a rough plan • User explores the system, looking for actions that contribute to performing the task • User selects the action that seems best for the desired goal • User interprets the response and assesses whether progress has been made toward completing the task

  26. CW: Requirements • Description of users and their backgrounds • Description of task user is to perform • Complete list of the actions required to complete task • Prototype or description of system

  27. CW: Methodology • Step through the action sequence • Action 1 • Response A, B, .. • Action 2 • Response A • ... • For each one, ask four questions and try to construct a believable success (or failure) story
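One way to keep the walkthrough systematic is to record, for each action, an answer and supporting evidence (or a failure story) for each of the four questions. A minimal sketch, assuming Python; the record structure and the example answers for the PAL “earmark” step are invented:

```python
# The four CW questions from the slides.
CW_QUESTIONS = [
    "Will the user try to produce the effect the action has?",
    "Will the user notice the correct action is available?",
    "Will the user know it is the right action for the effect?",
    "Will the user understand the feedback?",
]

def report(action, answers):
    """Print which questions lack a believable success story for one action.

    `answers` is a list of four (ok, evidence_or_failure_story) pairs,
    one per question, in the same order as CW_QUESTIONS.
    """
    failures = [(q, story) for q, (ok, story) in zip(CW_QUESTIONS, answers)
                if not ok]
    status = "OK" if not failures else f"{len(failures)} failure story(ies)"
    print(f"{action}: {status}")
    for q, story in failures:
        print(f"  {q} -> {story}")

# Invented answers for the PAL 'earmark' action used later in the lecture.
report("press earmark button and repeat the name", [
    (True, "part of the original task"),
    (False, "the label 'earmark' may not suggest capturing a name"),
    (True, "all other actions look wrong"),
    (True, "the name is repeated back"),
])
```

The failure stories collected this way become the walkthrough’s output: each one is a concrete, evidence-backed usability problem tied to a specific step in the task.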

  28. CW: Questions • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? (is it visible) • Once found, will they know it’s the right one for desired effect? (is it correct) • Will users understand feedback after action?

  29. CW: Answering the Questions • 1. Will user be trying to produce effect? • Typical supporting evidence • It is part of their original task • They have experience using the system • The system tells them to do it • No evidence? • Construct a failure scenario • Explain, back up opinion

  30. CW: Next Question • 2. Will user notice action is available? • Typical supporting evidence • Experience • Visible device, such as a button • Perceivable representation of an action such as a menu item

  31. CW: Next Question • 3. Will user know it’s the right one for the effect? • Typical supporting evidence • Experience • Interface provides a visual item (such as prompt) to connect action to result effect • All other actions look wrong

  32. CW: Next Question • 4. Will user understand the feedback? • Typical supporting evidence • Experience • Recognize a connection between a system response and what user was trying to do

  33. CW: Questions • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? (is it visible) • Once found, will they know it’s the right one for desired effect? (is it correct) • Will users understand feedback after action?

  34. Let’s practice: PAL

  35. User characteristics • Technology-savvy users • Familiar with computers • Comfortable with basic cell phone operations • Familiar with many features of the cell phone, but not expert

  36. Task: Party helper • Heather goes to a reception with some conference attendees. Heather doesn’t know many of them but wants to make a good impression since she’ll be looking for a job in a few months, so she wants to use PAL to remind her of the names of folks she meets during the evening.

  37. Action list • Assume PAL is running… • When introduced to someone new, press “earmark” button and repeat the name. • Assume no intervening introductions… • When there is a break, press and hold the back navigation button until bar reaches earmark. • Listen to name being repeated

  38. CW Summary • Advantages: • Explores important characteristic of learnability • Novice perspective • Detailed, careful examination • Working prototype not necessary • Disadvantages: • Can be time consuming • May find problems that aren’t really problems • Narrow focus, may not evaluate entire interface

  39. Your turn • Prepare: • Describe users and their experience • Determine task(s) and action lists • Evaluation: • HA evaluate GroupOne • GroupOne evaluate Mini-e • Mini-e evaluate HA • As a group, for each action, answer the 4 questions with evidence

  40. CW: Questions • Will users be trying to produce whatever effect action has? • Will users be able to notice that the correct action is available? (is it visible) • Once found, will they know it’s the right one for desired effect? (is it correct) • Will users understand feedback after action?
