CCT384 – Universal Design and Access UD Principle: Tolerance for Error Week 3
From previous week: Designing Labs for People with Disabilities • http://www.washington.edu/doit/Video/Wmv/temp/equal.asx
Principle 5: Tolerance for Error The design minimizes hazards and negative consequences of accidental actions.
Principle 5: Tolerance for Error The design minimizes hazards and negative consequences of accidental actions. • “CAUTION: It is not recommended that children or pets regularly drink water from the toilet, even though the bowl water is not harmful to children or pets.” Label on toilet bowl cleaner bottle
About Donald A. Norman • BS in EECS from MIT • Ph.D. in psychology from UPenn • Center for Cognitive Studies at Harvard • professor emeritus of cognitive science at UCSD • VP of the Advanced Technology Group at Apple; later at HP • co-founder of the Nielsen Norman Group (a usability consulting company) • professor of computer science at Northwestern
About the Book • first published in 1988 • original title: “The Psychology of Everyday Things” • User-Centered Design • structure of tasks • making things visible • getting the correct mapping • exploiting the powers of constraint • designing for error • explaining affordances and seven stages of action.
Slips versus Mistakes • Recall: • Human errors can be classified into slips and mistakes • Can be understood using Norman’s gulf of execution • SLIP: If you understand a system well, you may know exactly what to do to satisfy your goals: you’ve formulated the correct action. But you may fail to execute that action correctly (mistype, press the wrong button). • MISTAKE: If you don’t know the system well, you may not even formulate the right goal. (Example: you may pick the magnifying glass icon thinking it is the ‘find’ function, when it actually zooms the text.) • Both may be corrected for, and designed around.
Errors in User-Computer Dialog • Three phases • Read-scan phase -- Perceptual errors • Think phase -- Cognitive errors • Respond phase -- Motor errors • Can generally lead to either slips or mistakes
Perceptual Errors • Result from poor perceptual cues • Display of objects that are visually similar • Invisible or poorly expressed states • Failure to capture user’s attention • Lack of perceivable feedback
Perceptual Errors Are perceptual errors likely here? Tally Ho Uniforms
Cognitive Errors • Caused by taxing memory and thinking • Tax recall memory • Poor mnemonic aids • Inconsistency • Lack of context or status info • e.g., where you came from in a menu • Mental calculations and translations
Cognitive Errors Are cognitive errors likely here?
Motor Errors • Caused by taxing motor skills • Awkward movements • Highly similar motor sequences • e.g., double-click vs. single click • Pressure for speed • Requiring a high degree of hand-eye coordination • Requiring special types of motor skills (e.g., typing)
Motor Errors Lots of errors are likely here!!
Example Studies • 170 experienced UNIX users over 9 days • Individual commands had error rates of 3-50% • 300 security system users over 20 months • 12,117 error messages • The 11 most common errors accounted for 65% of the total • 2,517 involved repeated errors (with no non-errors in between) within 10 minutes • Poor error recovery/help Kraut et al., CHI ’83; Mosteller & Ballas, Human Factors ’89
Slips • Automatic (subconscious) error that occurs without deliberation • Examples?
Types of Slips • 1. Capture error • Continue a frequently done activity instead of the intended one • Type “animation” instead of “animate” • Confirm deletion of a file instead of cancelling • 2. Description error • Intended action has much in common with other possible actions (usually when distracted or when controls are in close proximity) • ctrl key & caps lock key / Sun & Mac
Types of Slips • 3. Data driven error • Triggered by arrival of sensory info which intrudes into normal action • Call to give someone a number, dial that number instead • 4. Associative activation • Internal thoughts and associations trigger action • Phone rings, yell “come in”
Types of Slips • 5. Loss of activation • Forgetting goal in middle of sequence of actions • Start going into room, then forget why you’re going there • 6. Mode errors • Do action in one mode thinking you’re in another • Delete file, but you’re in wrong directory
Error-handling Strategies • Avoid and prevent • Identify and understand • Handle and recover
Preventing Errors • Rules of thumb: • Preventing slips can be done by analyzing users’ interaction with the application, then tweaking screen design, button spacing, etc. • Preventing many mistakes requires that users understand the system better; this may require a more radical redesign, or perhaps a totally different metaphor
Error Prevention Guidelines • Eliminate modes or provide visible cues for modes • Use good coding techniques (color, style) • Maximize recognition, minimize recall • Design non-similar motor sequences or commands • Minimize need for typing
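A minimal sketch of the first guideline, providing a visible cue for the current mode: a status line that always shows which mode the interface is in, so the user relies on recognition rather than recall. The `Mode`, `Editor`, `statusLine`, and `handleKey` names are hypothetical, chosen for illustration only.

```typescript
// Minimal sketch: making an interface mode visible at all times.
type Mode = "INSERT" | "COMMAND";

interface Editor {
  mode: Mode;
  buffer: string;
}

// A persistent status line is one "visible cue" for the current mode:
// the user never has to recall which mode they are in.
function statusLine(editor: Editor): string {
  return `-- ${editor.mode} MODE -- | ${editor.buffer.length} chars`;
}

// Keystrokes are interpreted per mode; the cue above keeps that
// interpretation predictable instead of surprising.
function handleKey(editor: Editor, key: string): Editor {
  if (editor.mode === "COMMAND") {
    if (key === "i") return { ...editor, mode: "INSERT" };
    return editor; // other command keys omitted in this sketch
  }
  if (key === "Escape") return { ...editor, mode: "COMMAND" };
  return { ...editor, buffer: editor.buffer + key };
}

let editor: Editor = { mode: "COMMAND", buffer: "" };
editor = handleKey(editor, "i");
console.log(statusLine(editor)); // "-- INSERT MODE -- | 0 chars"
```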
Error Prevention Guidelines • Test and monitor for errors and engineer them out • Allow reconsideration of action by user (e.g., removing file from trash)
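One way to read "allow reconsideration of action": a sketch of a soft-delete pattern, where removal only moves an item to a trash that can still be restored, and only emptying the trash is irreversible. The `FileStore` class and its methods are hypothetical names for illustration.

```typescript
// Minimal sketch of "allow reconsideration": deleting moves items to a
// trash instead of destroying them, so the action can be reversed later.
class FileStore {
  private files = new Map<string, string>(); // name -> contents
  private trash = new Map<string, string>();

  save(name: string, contents: string): void {
    this.files.set(name, contents);
  }

  // Deleting only moves the file to trash; nothing is lost yet.
  remove(name: string): void {
    const contents = this.files.get(name);
    if (contents === undefined) return;
    this.files.delete(name);
    this.trash.set(name, contents);
  }

  // The user can reconsider and pull the file back out of the trash.
  restore(name: string): boolean {
    const contents = this.trash.get(name);
    if (contents === undefined) return false;
    this.trash.delete(name);
    this.files.set(name, contents);
    return true;
  }

  // Only this step is irreversible, and it is a separate, explicit action.
  emptyTrash(): void {
    this.trash.clear();
  }
}

const store = new FileStore();
store.save("notes.txt", "draft");
store.remove("notes.txt");  // slip: wrong file
store.restore("notes.txt"); // reconsideration is still possible
```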
Error Prevention Guidelines • Provide appropriate type of feedback • Gag - Prevent user from continuing • Erroneous login
Error Prevention Guidelines • Warn - Tell the user that an unusual situation is occurring • Bell or alert box
Error Prevention Guidelines • Nothing - Just don’t do anything (careful: the user must determine the problem themselves) • Mac: move file to a bad place • The three feedback responses are sketched below.
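A minimal sketch of the three feedback responses just listed (gag, warn, nothing), applied to a hypothetical file-move operation. The `Feedback` type, `moveFile` function, and the particular conditions are illustrative assumptions, not a prescribed implementation.

```typescript
// Sketch of the three feedback responses: gag, warn, nothing.
type Feedback =
  | { kind: "gag"; message: string }  // block the user from continuing
  | { kind: "warn"; message: string } // alert, but allow them to proceed
  | { kind: "nothing" }               // silently refuse the action
  | { kind: "ok" };

function moveFile(destination: { exists: boolean; readOnly: boolean; lowSpace: boolean }): Feedback {
  if (!destination.exists) {
    // "Nothing": the drop is silently ignored. Cheap, but the user must
    // work out for themselves why the file never moved.
    return { kind: "nothing" };
  }
  if (destination.readOnly) {
    // "Gag": refuse and prevent continuing until a valid target is chosen.
    return { kind: "gag", message: "That folder is read-only; choose another destination." };
  }
  if (destination.lowSpace) {
    // "Warn": an unusual situation; alert (bell or alert box) but allow it.
    return { kind: "warn", message: "The destination disk is nearly full." };
  }
  return { kind: "ok" };
}

console.log(moveFile({ exists: false, readOnly: false, lowSpace: false })); // nothing
console.log(moveFile({ exists: true, readOnly: true, lowSpace: false }));   // gag
```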
Error Recovery Guidelines • Provide undo function! • Provide cancel function for operations in progress • Require confirmation for drastic, destructive commands • Provide reasonableness checks on input data • Did you really mean to order 5000?
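A minimal sketch of a reasonableness check on input data, in the spirit of the "Did you really mean to order 5000?" example above. The threshold and names (`checkOrderQuantity`, `typicalMax`) are hypothetical.

```typescript
// Sketch: flag unusual but legal input for confirmation instead of
// silently accepting or rejecting it.
interface OrderCheck {
  ok: boolean;
  question?: string; // shown to the user for confirmation
}

function checkOrderQuantity(quantity: number, typicalMax = 100): OrderCheck {
  if (!Number.isInteger(quantity) || quantity <= 0) {
    return { ok: false, question: "Quantity must be a positive whole number." };
  }
  if (quantity > typicalMax) {
    // The value is legal but unusual: ask rather than silently accept it.
    return { ok: false, question: `Did you really mean to order ${quantity}?` };
  }
  return { ok: true };
}

console.log(checkOrderQuantity(5));    // { ok: true }
console.log(checkOrderQuantity(5000)); // asks "Did you really mean to order 5000?"
```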
Error Recovery Guidelines • However, before a user can recover, must be able to detect that an error has occurred • Detection: provided by easy visibility, feedback • Other options? • Self-correct - Guess correct action & do it • Spell-check correction • Dialog - System opens dialog with user • Go into debugger on run-time crash • Query - Ask user what should’ve been done, then allow error action as legal one (“did you mean…?”) • Command language naming error
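A minimal sketch of the "query" option for a command language naming error: when a command is unrecognized, compute the closest known command by edit distance and ask "did you mean…?". The command list and helper names are assumptions for illustration.

```typescript
// Sketch: suggest the nearest known command instead of failing silently.
const KNOWN_COMMANDS = ["copy", "move", "remove", "rename", "list"];

// Levenshtein edit distance between two strings.
function editDistance(a: string, b: string): number {
  const d: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      d[i][j] = Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost);
    }
  }
  return d[a.length][b.length];
}

// Ask rather than silently failing: offer the nearest command as a legal action.
function didYouMean(input: string): string | undefined {
  const best = KNOWN_COMMANDS
    .map((cmd) => ({ cmd, dist: editDistance(input, cmd) }))
    .sort((x, y) => x.dist - y.dist)[0];
  return best && best.dist <= 2
    ? `Unknown command "${input}". Did you mean "${best.cmd}"?`
    : undefined;
}

console.log(didYouMean("remvoe")); // Unknown command "remvoe". Did you mean "remove"?
```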
Error Recovery Guidelines • Return cursor to the error field and allow a fix • Tell the user what to fix and how to fix it • Provide some intelligence • Guess what they wanted to do • Provide quick access to context-sensitive help
Error Handling Example (Web) • Form fill-in is the most common error case
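A minimal sketch of form fill-in error handling in the browser, combining the recovery guidelines above: explain what is wrong and how to fix it, and return the cursor to the field that needs fixing. The form id, field id, and messages are hypothetical.

```typescript
// Sketch: validate on submit, explain the fix, and refocus the bad field.
function validateEmailField(form: HTMLFormElement): boolean {
  const email = form.querySelector<HTMLInputElement>("#email");
  if (!email) return false;

  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email.value)) {
    // Tell the user what is wrong and how to fix it...
    email.setCustomValidity("Please enter an email address such as name@example.com.");
    email.reportValidity();
    // ...and put the cursor back in the field that needs fixing.
    email.focus();
    return false;
  }
  email.setCustomValidity("");
  return true;
}

document.querySelector<HTMLFormElement>("#signup")?.addEventListener("submit", (event) => {
  const form = event.currentTarget as HTMLFormElement;
  if (!validateEmailField(form)) {
    event.preventDefault(); // keep the user's other input; nothing is lost
  }
});
```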
User Support (aka “Help”) • Line between error recovery and help can be fuzzy • Overarching design principle: must be as unobtrusive as possible
Command Assistance • E.g., on-screen manuals, help commands (“man” on Unix), etc. • Simple and efficient if the user knows what he/she is looking for and is seeking either a reminder or more detailed information • But… • What if people don’t know what they’re looking for? • What about commands that the user does not know about but needs? • What about commands the user thinks exist but do not? • Command assistance is little help here.
Context-sensitive Help • Move away from placing onus on user to remember the command • Often not very sophisticated • Common examples: • Microsoft’s “What’s This?” option • Tooltips • Clippy (arrrgh…) • What’s the “context”? • Just the control itself? (Simple) • User’s past history and application state? (More sophisticated)
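A minimal sketch of simple context-sensitive help, where the help text depends on both the control and the current application state (the more sophisticated sense of "context" above). The state shape, control ids, and messages are assumptions for illustration.

```typescript
// Sketch: help text keyed by control, refined by application state.
interface AppState {
  documentSaved: boolean;
  selectionEmpty: boolean;
}

function helpFor(controlId: string, state: AppState): string {
  switch (controlId) {
    case "save-button":
      return state.documentSaved
        ? "Save: your document has no unsaved changes."
        : "Save: writes your unsaved changes to disk.";
    case "copy-button":
      return state.selectionEmpty
        ? "Copy: select some text first, then copy it to the clipboard."
        : "Copy: copies the selected text to the clipboard.";
    default:
      return "No help is available for this control.";
  }
}

const state: AppState = { documentSaved: false, selectionEmpty: true };
console.log(helpFor("copy-button", state));
// "Copy: select some text first, then copy it to the clipboard."
```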
Wizards and Assistants • Attempt to prevent errors by providing “common paths” through software • Safety, efficiency, and accuracy (as long as it’s a supported task) • May be unnecessarily constraining • Guidelines: allow backward movement, show progress indicator • Assistant: Clippy is actually an example of this. A context-sensitive trigger to launch a wizard style interaction • Q: What went wrong with Clippy?
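A minimal sketch of the wizard guidelines above: a fixed sequence of steps with a progress indicator and backward movement always allowed. The `Wizard` class and step names are hypothetical.

```typescript
// Sketch of a wizard: a supported path, a progress indicator, and "back".
class Wizard {
  private index = 0;

  constructor(private steps: string[]) {}

  progress(): string {
    return `Step ${this.index + 1} of ${this.steps.length}: ${this.steps[this.index]}`;
  }

  next(): void {
    if (this.index < this.steps.length - 1) this.index++;
  }

  // Backward movement is always allowed, so a wrong choice is recoverable.
  back(): void {
    if (this.index > 0) this.index--;
  }

  done(): boolean {
    return this.index === this.steps.length - 1;
  }
}

const setup = new Wizard(["Choose printer", "Select paper size", "Confirm"]);
console.log(setup.progress()); // "Step 1 of 3: Choose printer"
setup.next();
setup.back();                  // the user changes their mind; no error results
console.log(setup.progress()); // "Step 1 of 3: Choose printer"
```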
Mistake-proofing: a preliminary definition • Mistake-proofing is the use of process design features* to facilitate correct actions, prevent simple errors, or mitigate the negative impact of errors. • Mistake-proofing tends to be inexpensive, very effective, and based on simplicity and ingenuity. • It will not make processes free of all errors, nor is it a stand-alone technique that will eliminate the need for any other responses to error. *these process design features will be referred to as “devices” or “counter-measures”
Design: If the design itself is not changed, then the methods of reducing risks and hazards are limited to: • What can be put on paper and subsequently… • What can be embedded in the human brain. “Knowledge in the head”* *Source: Donald Norman, The Design of Everyday Things
Design • “human errors can be made irrelevant to outcome, continually found, and skillfully mitigated.” • Can human errors become irrelevant by only changing knowledge in the head? “Knowledge in the World”* *Source: Donald Norman, The Design of Everyday Things
To err is human • Have you ever gone somewhere and not remembered why you went there? • Have you ever gone home when you meant to stop at a store? Why does that happen? How would you prevent it if your life depended on it?
“Be more careful” not effective • “The old way of dealing with human error was to scold people, retrain them, and tell them to be more careful … My view is that you can’t do much to change human nature, and people are going to make mistakes. If you can’t tolerate them ... you should remove the opportunities for error.” • “Training and motivation work best when the physical part of the system is well-designed. If you train people to use poorly designed systems, they’ll be OK for awhile. Eventually, they’ll go back to what they’re used to or what’s easy, instead of what’s safe.” • “You’re not going to become world class through just training, you have to improve the system so that the easy way to do a job is also the safe, right way. The potential for human error can be dramatically reduced.” Chappell, L. 1996. The Pokayoke Solution. Automotive News Insights, (August 5): 24i. LaBar, G. 1996. Can Ergonomics Cure ‘Human Error’? Occupational Hazards 58(4): 48-51.
A new attitude toward preventing errors: “Think of an object’s user as attempting to do a task, getting there by imperfect approximations. Don’t think of the user as making errors; think of the actions as approximations of what is desired.”* These approximations are part of Norman’s concept of “knowledge in the head” *Source: Norman, The design of everyday things. Doubleday 1988.
A New Attitude toward Preventing Errors • Make wrong actions more difficult • Make it possible to reverse actions (to “undo” them), or make it harder to do what cannot be reversed • Make it easier to discover the errors that occur • Make incorrect actions correct. These outcomes do not occur without design changes
Precise outcomes without precise knowledge or action? Provide clues about what to do: • natural mappings • affordances • visibility • feedback • constraints
Natural Mappings: Which dial turns on the burner? Stove A Stove B
Affordances: How would you operate these doors (A, B, C)? Push or pull? Left side or right? How did you know?
Affordances: How would you lift this pan? (It is labeled “SUPPORT THE BOTTOM.”)
Visibility and Feedback • Visibility means making relevant parts visible, and effectively displaying system status • Feedback means providing an immediate and obvious effect for each action taken.