Usability and Human Factors. Usability Evaluation Methods. Lecture b. This material (Comp 15 Unit 5) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
Usability Evaluation Methods, Lecture b – Learning Objectives • Describe the importance of usability in relation to health information technologies (Lecture a) • List and describe usability evaluation methods (Lecture a) • Given a situation and set of goals, determine which usability evaluation method would be most appropriate and effective (Lecture a) • Conduct a cognitive walkthrough (Lecture b) • Design appropriate tasks for a usability test (Lecture b) • Describe the usability testing environment, required equipment, logistics, and materials (Lecture b)
Cognitive Walkthrough (Polson et al., 1992) • A kind of cognitive task analysis • Assesses system usability • Criteria focus on the cognitive processes needed to perform tasks • Identifying sequences of actions and subgoals needed to successfully complete a task • Assigning causes to usability problems • Are the cues provided by the interface sufficient to perform the task?
Cognitive Task Analysis (CTA) • Tools and techniques for describing knowledge & strategies required for task performance • Hierarchical decomposition of goals and component tasks • Objective: • Yield information about the knowledge, thought processes, and goal structures that underlie observable task performance
Why Do a CTA? • Develop theory of competent performance • Understand invariant features of a task • There are invariant performance characteristics of any class of tasks • Understand process of skill acquisition • Training and instructional resources • e.g. manuals and tutorials • Develop methods for usability testing • Design • Coding scheme for data analysis
Cognitive Walkthrough Step 1: Preparations • Choose a set of representative tasks: • Identify population of users • Describe the contexts of use • Identify sequence of actions for completing a task • Complex tasks require a task decomposition • Granularity (e.g., keystroke to complete entry) • Describe user’s initial goal (top-level goal)
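The Step 1 preparations above can be captured as a simple record before the walkthrough begins. This is a hypothetical sketch, not part of any published walkthrough tool; the `WalkthroughPrep` class and its field names are illustrative assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical record of the Step 1 preparations: the user population,
# context of use, top-level goal, and the decomposed action sequence
# for one representative task (granularity down to the keystroke level).
@dataclass
class WalkthroughPrep:
    user_population: str
    context_of_use: str
    top_level_goal: str
    action_sequence: list = field(default_factory=list)  # ordered steps

prep = WalkthroughPrep(
    user_population="first-time ATM users",
    context_of_use="public ATM lobby",
    top_level_goal="Obtain $80 cash from checking account",
    action_sequence=["insert card", "enter PIN", "press enter"],
)
print(len(prep.action_sequence))  # number of decomposed steps: 3
```

Keeping the action sequence explicit makes it easy to check that every step of a complex task has been decomposed before the walkthrough starts.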
Step 2: Walkthrough Process • Hand simulation of user’s cognitive processes for successfully executing an action sequence to complete a task • Step through each action and specify: • Goal structure for each step • Behavior of the interface and its effect on the user • Actions that could be difficult to execute • Source of potential problems • Overall objectives: • Can a user with a certain degree of knowledge perform the tasks that the system is intended to support? • Can a user learn to perform what is unknown?
Step 3: Explain Sources of Potential Problems • Goal problems • Can the user associate specific actions with a goal? • Action problems • Are the cues provided by the interface sufficient to perform the task? • Are there incomplete goals that look accomplished? • Can the user interpret feedback from the system? • Can the user monitor his or her own progress? • Are transitions between subtasks handled?
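The problem sources above amount to a checklist that is asked of every action in the walkthrough. The sketch below is a hypothetical illustration (the question wording and the `evaluate_step` helper are assumptions, not a standard tool): any "no" answer is recorded as a potential usability problem for that step.

```python
# Hypothetical per-action checklist drawn from the walkthrough's
# sources of potential problems; each question is asked of every step.
CHECKLIST = [
    "Can the user associate an action with the current goal?",
    "Are the interface cues sufficient to perform the action?",
    "Can the user interpret the system's feedback?",
    "Can the user monitor progress toward the goal?",
]

def evaluate_step(step_name, answers):
    """answers: booleans, one per checklist question; False = problem."""
    return [(step_name, q) for q, ok in zip(CHECKLIST, answers) if not ok]

# Example: the "Enter PIN" step fails the interface-cue question.
problems = evaluate_step("Enter PIN", [True, False, True, True])
print(problems)  # one flagged cue problem for "Enter PIN"
```

Collecting (step, question) pairs this way gives the evaluator a traceable list of where and why each potential problem was flagged.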
ATMs: Goal Structure • Goals: obtain cash, deposit checks, check balance, pay bills • Task: Enter PIN • Retrieve number if necessary • Enter each digit • Hit enter • Goal-Obtain-Cash • Indicate intention to obtain cash • Action: If unclear on step, follow prompt on screen • Action: Indicate Checking Account • Subgoal: Obtain $40 • Action: Enter amount and hit enter • Goal-Terminate Transaction • Subgoal: Retrieve card
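The ATM goal structure above is a tree: goals contain sub-goals, and the leaves are concrete actions. As a hypothetical sketch (the nesting below is a simplified rendering of the slide, not a complete task model), a small recursive walk can count how many actions the user must execute:

```python
# Simplified rendering of the ATM goal structure: dict keys are goals
# and sub-goals, list leaves are concrete user actions.
atm_goal = {
    "Obtain cash": {
        "Enter PIN": ["retrieve number", "enter each digit", "hit enter"],
        "Indicate intention to obtain cash": [
            "follow prompt on screen", "indicate checking account"],
        "Obtain $40": ["enter amount and hit enter"],
    },
    "Terminate transaction": {
        "Retrieve card": ["take card"],
    },
}

def count_actions(node):
    """Recursively count leaf actions in a goal/sub-goal tree."""
    if isinstance(node, list):
        return len(node)
    return sum(count_actions(child) for child in node.values())

print(count_actions(atm_goal))  # 7 actions in this decomposition
```

The same traversal generalizes to any hierarchical task decomposition, which is why walkthrough results are often reported as counts of sub-goals and actions.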
A Partial Walkthrough: ATM. Goal: Obtain $80 Cash from Checking Account 1. Action: Enter card (screen 1) • System response: Enter PIN > (screen 2) 2. Sub-goal: Interpret prompt & provide input 3 & 4. Actions: Enter PIN on numeric keypad and hit enter (press lower white button next to screen) • System response: “Do you want a printed transaction record?” • Binary option: Yes or No (screen 3) 5. Sub-goal: Decide whether a printed record is necessary
A Partial Walkthrough: ATM. Goal: Obtain $80 Cash from Checking Account - 2 6. Action: Press button next to “No” response • System response: Select transaction - 8 choices (screen 4) 7. Sub-goal: Choose between quick cash & cash withdrawal 8. Action: Press button next to cash withdrawal • System response: Select account (screen 5) 9. Action: Press button next to checking • System response: Enter dollar amount in multiples of $20 (screen 6) 10 & 11. Actions: Enter $80 on numeric keypad and select the correct amount
Cognitive Walkthrough: Measure Glucose • Sub-goal 1: Begin measurement • Action: Press blue power button • System response: Meter displays last blood glucose result • Sub-goal 2: Obtain a blood sample • Sub-goal 3: Use sterile/sharp lancet • Action: Select unused lancet • Sub-goal 4: Draw blood using instrument • Action: Pierce finger with lancet • Sub-goal 5: Apply blood to test strip • Sub-goal 6: Locate pink test area • Action: Dab finger/touch strip • Potential problem: Missing the test area or applying excessive blood
CW: Blood Glucose (Cont’d – 1) • Sub-goal 7: Determine if test worked • Sub-goal 8: Locate confirmation dot • Action: Turn over test strip • Action: Determine if the confirmation dot on the back of the strip is completely blue • Potential problem: Intermediate shades of blue (has the test worked?) • Sub-goal 9: Take measurement with device • Sub-goal 10: Determine readiness of the device • Action: Look for flashing test strip on meter • System response: Code 4 • System response: Flashing test strip
CW: Blood Glucose (Cont’d – 2) • Sub-goal 11: Insert pink test strip • Action: Push test strip in firmly (pink side up) • System response: Flashing clock signals waiting • System response: (short lag) new glucose value and time • Potential problem: Improper insertion • Sub-goal/Action 12: Read test result • Sub-goal 13: Dispose of lancet • Action: Point the lancet forward • Action: Eject and dispose of lancet • Sub-goal 14: Turn off meter • Action: Press blue power button • System response: Meter turns off
CW Glucose Results • 14 Sub-goal/action pairings • 16 Actions • 5 Device/screen transitions • Potential problems • Applying blood: missing the test area or applying excessive blood • Determining if the test worked: intermediate shades of blue • Inserting the test strip: improper insertion
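Summary counts like the glucose results above fall out naturally if the walkthrough is recorded as a list of tagged entries. A minimal sketch, assuming a hypothetical `(kind, description)` record format (the sample records below are an abbreviated illustration, not the full 16-action glucose walkthrough):

```python
from collections import Counter

# Abbreviated sample of walkthrough records: each entry is tagged as a
# sub-goal, an action, or a potential problem.
records = [
    ("subgoal", "Begin measurement"),
    ("action", "Press blue power button"),
    ("action", "Select unused lancet"),
    ("subgoal", "Apply blood to test strip"),
    ("problem", "Missing test area or applying excessive blood"),
    ("problem", "Intermediate shades of blue"),
]

# Tally by kind to produce a results summary like the slide's counts.
tally = Counter(kind for kind, _ in records)
print(tally["problem"])  # 2 potential problems in this sample
```

Tallying from the raw records, rather than counting by hand, keeps the summary consistent with the walkthrough as it is revised.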
Usability Testing • Gold standard for usability evaluation • Set of techniques for collecting empirical data • while observing representative end users using the system under study to perform representative tasks • Sessions are video-recorded • Provides information that can lead to systems that are: • Easy to learn and use • Satisfying to use • Valued by the target population for their utility and functionality • Characterizes task-specific competencies
Think-Aloud Protocol • Method broadly used in cognitive research and usability testing • The user verbalizes his or her thoughts while performing a task • Reports the contents of working memory • The session is audio and/or video recorded • The think-aloud transcript is coordinated with video analysis
Selection of Representative Users • Users differ in age, education, gender, computer experience, etc. • Select subjects based on relevant criteria (e.g., age, education) • A fully representative sample is not always possible • A convenience sample is less desirable
Development of Test Plan • Outline tasks and procedure • Informed by objectives and prior testing • Constrained by time and setting • Ethical and IRB issues • Exploratory study: characterize potential problems • Controlled experiment: compare two interfaces
Role of Researcher • Neutral observer vs. active participant • Researchers may play a more interactive role in field testing • Guide the subject as necessary • A skilled user will need minimal guidance • A novice may need step-by-step instructions • Autonomy rule: no more guidance than necessary
Field Usability Testing • Hybrid method: lab and ethnography/field study • Naturalistic setting • Numerous constraints • Prescribed set of tasks • Quasi-experiment • Video analysis is key • Can be intrusive
Video-Analytic Usability Testing on Location: Old School (image: Microsoft Clipart)
Software-based Video Analysis • Provides a video of all screen activity • Captures the user via a webcam • Logs a wide range of events and system interactions, including mouse clicks, text entries, web-page changes, and window dialogue events (e.g., saving a document, selecting among a set of choices) • Morae is state-of-the-art usability testing software
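The event logging described above can be sketched in a few lines. This is a hypothetical illustration only (Morae is commercial software and this is not its API; the `EventLog` class and its method names are assumptions): each interaction event gets a timestamp, a type, and a detail string, so analysts can later filter by event type.

```python
import time

# Hypothetical minimal interaction-event logger in the spirit of the
# tools described above: timestamped (time, type, detail) tuples.
class EventLog:
    def __init__(self):
        self.events = []

    def record(self, event_type, detail):
        self.events.append((time.time(), event_type, detail))

    def of_type(self, event_type):
        return [e for e in self.events if e[1] == event_type]

log = EventLog()
log.record("mouse_click", "Save button")
log.record("text_entry", "patient name field")
log.record("mouse_click", "OK in dialogue")
print(len(log.of_type("mouse_click")))  # 2 click events logged
```

Filtering the log by event type is the basic operation behind usability metrics such as click counts per task or time between entries.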
Morae Video-Analytic Usability Software (image: Khan, S.A., Ancker, J.S., Li, J., Kaufman, D., Hutchinson, C., Cohall, A., & Kukafka, R., 2009)
Data Analysis: Transcript • Working document for video analysis • Verbatim and time-stamped every 10 to 30 seconds • Iteratively modified as coding categories become refined • Field notes and observations are added to the transcript • A variety of video transcription software is available, such as InqScribe
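A working transcript like the one described above can be represented as time-stamped segments that acquire codes as the analysis proceeds. This is a hypothetical sketch (the segment fields and the `assign_code` helper are illustrative assumptions, not a feature of any transcription tool):

```python
# Hypothetical working transcript: verbatim utterances, time-stamped
# every 10-30 seconds, with coding categories attached iteratively.
transcript = [
    {"t": "0:00:10", "utterance": "I'll press the power button", "code": None},
    {"t": "0:00:25", "utterance": "Not sure where the test area is", "code": None},
]

def assign_code(segment, code):
    """Attach a coding category to one transcript segment."""
    segment["code"] = code  # codes are refined as analysis proceeds
    return segment

assign_code(transcript[1], "cue-insufficient")
coded = [s for s in transcript if s["code"] is not None]
print(len(coded))  # 1 segment coded so far
```

Because segments start uncoded and are tagged in passes, the same structure supports the iterative refinement of coding categories the slide describes.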
Video Analysis: Granularity • Basic (semiformal) video review • Identify and categorize observable problems • Measure latency of task • Macroanalysis • Segmenting session into events or episodes • Analyzing dialogue and observed behavior • Microanalysis • Fine-grained analysis of short stretches of the interaction
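One of the basic video-review measures listed above, task latency, is just the difference between start and end timestamps pulled from the session log. A minimal sketch, assuming "h:mm:ss" timestamps like those in the observation notes (the helper names are hypothetical):

```python
# Compute task latency from "h:mm:ss" timestamps in a session log.
def to_seconds(stamp):
    """Convert an h:mm:ss timestamp to a count of seconds."""
    h, m, s = (int(part) for part in stamp.split(":"))
    return h * 3600 + m * 60 + s

def task_latency(start, end):
    """Elapsed seconds between a task's start and end timestamps."""
    return to_seconds(end) - to_seconds(start)

print(task_latency("4:33:03", "4:35:18"))  # 135 seconds
```

Latencies computed this way can then be compared across tasks, users, or interface versions in a controlled study.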
Macro-Analysis (Table 1.1: Kaufman et al., 2003)
Observations: Sending Results • Time: 4:33:03 • Turns on glucose meter without being told to do so • Using visual cues, realizes it is not working and sends again • Holds her hand on the mouse the whole time • Sits comfortably close to the computer • Explains what is going on as she goes • Recognizes her glucose and blood pressure values without difficulty • When the modem sound stops, immediately recognizes the cue and says, “That’s it.”
Micro-Analysis (Table 1.2: Kaufman et al., 2003)
Triangulation • Strategy for using more than one data-gathering technique. For example: • Interviews to target certain stakeholders • Questionnaires to reach a wider population • Cognitive walkthrough to evaluate task complexity • Usability testing employing novices to evaluate the learnability of a system • Provides different perspectives and corroboration of findings across techniques • Leads to more rigorous and defensible findings
Usability Evaluation Methods Summary • The value of usability evaluations in healthcare contexts is well established • There is a wide range of methods, which vary in their advantages and disadvantages • These lectures illustrated how to use some of these methods, including the cognitive walkthrough and usability testing
Usability Evaluation Methods, References – Lecture b References Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., & Mack, R.L. (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons. Polson, P., Lewis, C., Rieman, J., & Wharton, C. (1992). Cognitive walkthroughs: A method for theory-based evaluation of user interfaces. International Journal of Man-Machine Studies, 36, 741–773. Preece, J., Rogers, Y., & Sharp, H. (2007). Interaction Design: Beyond Human-Computer Interaction (2nd ed.). West Sussex, England: Wiley. Images Slide 21: Microsoft Clipart Slide 23: Khan, S.A., Ancker, J.S., Li, J., Kaufman, D., Hutchinson, C., Cohall, A., & Kukafka, R. (2009). GetHealthyHarlem.org: Developing a web platform for health promotion and wellness driven by and for the Harlem community. AMIA Annu Symp Proc, 317–321. Charts, Tables & Figures 1.1 Table: Kaufman, D.R., Patel, V.L., Hilliman, C., Morin, P.C., Pevzner, J., Weinstock, R.S., Goland, R., Shea, S., & Starren, J. (2003). Usability in the real world: Assessing medical information technologies in patients’ homes. Journal of Biomedical Informatics, 36, 45–60. 1.2 Table: Kaufman, D.R., Patel, V.L., Hilliman, C., Morin, P.C., Pevzner, J., Weinstock, R.S., Goland, R., Shea, S., & Starren, J. (2003). Usability in the real world: Assessing medical information technologies in patients’ homes. Journal of Biomedical Informatics, 36, 45–60.