Towards a Work-centered Theory of Usability February 11, 2005 University of California at Santa Barbara Keith Butler, Chris Esposito, & Stephen Jones Boeing Math & Computing Technology Bellevue, WA Jiajie Zhang University of Texas Health Informatics Houston, TX Robert Eggleston Air Force Research Labs Dayton, OH
Outline of Today’s Talk • Conventional definitions of usability & their limitations • Previous theories for HCI • Requirements for a definition of usability • Distributed Cognition and the representation effect • A Work-Centered Framework for usability • The design of our experiment • Results summary • Step-wise regression analysis • Discussion of results • Implications for HCI design
Conventional Measures of Usability Measures of usability are defined in ANSI standard 354-2001 [7; 18] • in terms of the effectiveness and efficiency of user performance • they have good face validity User performance-based measures of usability have proven valuable • Clear relation to the business case for many applications [1] • Key role in the development of usability engineering [11; 22]
Limitations of User Performance-based Measures Empirical user performance data are expensive and often untimely to gather Dependence on face validity makes deeper analysis difficult • Time-on-task can tell us “when” but little about “why” of usability problems • Task-completion rate can’t easily be decomposed User performance is holistic and confounds usability with • application functionality • work difficulty • making it difficult to apply to key concept decisions These limitations pose obstacles to the potential impact of usability engineering
Some Previous HCI Theory • Model Human Processor & GOMS (Card, Moran, & Newell, 1983) • Cognitive Complexity Theory (Kieras & Polson, 1985) • Artifact Theory (Carroll & Campbell, 1986) • Human Factors Engineering (Dowell & Long, 1989) • Cognitive Systems (Rasmussen, et al. 1994; Vicente, 1999) • Activity Theory (Nardi, 1996)
Key Requirements for a Definition of Usability Can be applied to predict user task performance Maps usability problems to features of the software’s design Relates performance to psychological theory • Determines how artifacts constrain cognition • Separates effects of usability from work difficulty or functionality
Psychological Theory of Distributed Cognition Based on the research of Zhang & Norman (1994, 1995), Zhang (1996, 1997), and Wright et al. (2000) Cognition can be distributed among people interacting with artifacts • Written languages • Number systems • Information displays The way the artifact represents information constrains people’s cognitive strategies and work procedures
The Representation Effect in Distributed Cognition Different isomorphic representations of the same abstract work problem can produce drastically different behavior (Simon & Hayes, 1976; Marr, 1982) • E.g., XVI times CIII and 16 times 103 denote the same quantities, but the Arabic numerals are far easier to multiply
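A minimal sketch (my illustration, not material from the talk) makes the isomorphism concrete: both notations decode to the same quantities, so the product is 1648 either way; only the procedure a person must follow differs.

```python
# A minimal sketch (mine, not the talk's) of the isomorphism: the two
# expressions denote the same computation, so any performance difference
# comes from the representation, not the underlying arithmetic.
ROMAN = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

def roman_to_int(numeral: str) -> int:
    """Decode a Roman numeral, handling subtractive pairs like IV or XL."""
    total = 0
    values = [ROMAN[ch] for ch in numeral]
    for v, nxt in zip(values, values[1:] + [0]):
        total += -v if nxt > v else v
    return total

assert roman_to_int('XVI') == 16 and roman_to_int('CIII') == 103
print(roman_to_int('XVI') * roman_to_int('CIII'), 16 * 103)  # 1648 1648
```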
A Tale of 2 GUIs: the representation effect in HCI Portable Maintenance Aid application (PMA) • support for fix-or-fly decision about airliner “squawks” • hyperlinked all published tech data • brought tech data access to the parking ramp via specialized laptop Two versions of the PMA user interface • used direct manipulation methods • represented the problem differently • produced drastically different problem strategies
Isomorphic GUI #1 Technology-centered Airline mechanics tried to solve a “squawk” using a Portable Maintenance Aid that had a technology-centric user interface Mechanics got lost in the layers of data The interaction contradicts the effective work strategy for solving “squawks” Task performance was worse than with paper docs: 86% failures
Isomorphic GUI #2 Supports Effective Work Strategy The 2nd PMA user interface explicitly represents the expert cognitive strategy The GUI maps the data to the steps of problem-solving, including back-tracking Improved the mechanics’ squawk solution rate to 100% success
Bad Representation -> Inefficient Work Both PMA GUIs are isomorphic representations of the same underlying • problem • data • functions But there was an 8-fold difference in user performance
Why? Hypothesis: • The PMA files-in-folders GUI distorts the job with overhead tasks for • managing windows • integrating information among windows • remembering contents of files and folders • the meta-task of tracking progress • The palette GUI distributes cognition better • strongly constrains procedure possibilities to a proven strategy • imposes little overhead to follow the strategy Zhang (1996) calls this Representational Determinism
The Usability Experiment Independent variable to manipulate: extrinsic overhead Control variable to hold constant: intrinsic difficulty Within-subjects design, counter-balanced for order, n = 8 Using two isomorphic GUIs that differ in overhead
Tower of Hanoi Game – Representation #1 – disks on posts Goal: • Move the disks to another post so smallest disk goes on 1st, middle goes on 2nd, and biggest goes on last • Rules: • 1. Only one disk can be moved at a time; • 2. Biggest out first; • 3. A disk can only go onto a post where it becomes the biggest on that post
Tower of Hanoi Game – Representation #2 – files-in-folders Goal: • Move the files to another folder so smallest goes in 1st, middle goes in 2nd, and biggest goes in last • Rules: • 1. Only one file can be moved at a time; • 2. Biggest out first; • 3. A file can only go into a folder where it becomes the biggest in that folder
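Both representations share the same three rules, so a single checker covers either GUI. The sketch below is my own reading of those rules, not the experiment's software; container contents are plain lists of item sizes.

```python
# A hedged sketch (not the experiment's software) of the rules the two GUIs
# share: one item moves at a time, only the biggest item may leave its
# container, and the moved item must become the biggest in its destination.
from typing import List

State = List[List[int]]   # three containers, each a list of item sizes

def is_legal_move(state: State, src: int, dst: int) -> bool:
    """Rule check for the disks-on-posts / files-in-folders isomorphs.
    A move from src always takes its biggest item, since nothing else may leave."""
    if src == dst or not state[src]:
        return False
    moving = max(state[src])                           # "biggest out first"
    return all(moving > item for item in state[dst])   # becomes biggest at dst

def apply_move(state: State, src: int, dst: int) -> State:
    assert is_legal_move(state, src, dst)
    new = [c[:] for c in state]
    new[dst].append(max(new[src]))
    new[src].remove(max(new[src]))
    return new

start: State = [[1, 2, 3], [], []]        # sizes 1 < 2 < 3, all in container 0
print(is_legal_move(start, 0, 1))         # True: only item 3 may leave
print(apply_move(start, 0, 1))            # [[1, 2], [3], []]
```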
Scoring the User Performance Data • Time to solution p < 0.0048 • Errors (illegal moves) p < 0.0313 • Legal moves to solution p < 0.026 • Overhead actions p < 0.0002
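The slides report p-values but do not name the statistical test, so the following is only a hedged sketch of one plausible within-subjects analysis for n = 8 paired observations; the numbers are placeholders, not the study's data.

```python
# A hedged sketch of a within-subjects comparison like the one these p-values
# summarize. The slide does not name the test, and the numbers below are
# placeholders, not the study's data; n = 8, each person used both GUIs.
from scipy import stats

# hypothetical per-participant solution times (seconds)
disks_on_posts   = [95, 110, 88, 102, 97, 120, 105, 91]
files_in_folders = [410, 560, 375, 520, 448, 610, 495, 430]

t, p = stats.ttest_rel(files_in_folders, disks_on_posts)    # paired t-test
w, p_w = stats.wilcoxon(files_in_folders, disks_on_posts)   # distribution-free check
print(f"paired t = {t:.2f} (p = {p:.4f}); Wilcoxon W = {w} (p = {p_w:.4f})")
```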
Results Summary – solution time p < 0.01
Scoring errors: Problem Space of the Tower of Hanoi Puzzle Each box in the space shows a state that the game-entity can be in The line-paths to each box are the sequence of legal moves that can change the game-entity to that state
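As a rough reconstruction (not the diagram's source), the space can be enumerated directly: with 3 items and 3 containers there are 27 states, and the directed edges are the legal moves under the rule set given two slides back.

```python
# A hedged sketch of the problem space the diagram depicts: with 3 items and
# 3 containers there are 27 states; directed edges are the legal moves under
# the rule set given earlier. This is my reconstruction, not the slide's code.
from itertools import product

def legal_moves(state):
    """state[i] = container (0-2) of item i, where item 2 is the biggest."""
    for item, c in enumerate(state):
        same_container = [j for j, cj in enumerate(state) if cj == c]
        if max(same_container) != item:
            continue                              # only the biggest may leave
        for dest in range(3):
            occupants = [j for j, cj in enumerate(state) if cj == dest]
            if dest != c and all(item > j for j in occupants):
                yield (item, dest)                # it becomes biggest at dest

states = list(product(range(3), repeat=3))        # all 27 assignments
edges = sum(len(list(legal_moves(s))) for s in states)
print(len(states), "states,", edges, "legal moves (directed edges)")
```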
Results Summary – errors p < 0.01
Scoring overhead: Ontology of the Tower of Hanoi Problem • Two property dimensions: an ordinal dimension with 3 levels, O1 > O2 > O3, and a nominal dimension with 3 levels, N1, N2, N3 • Object: OBJi = (Oi, Nl), i = 1, 2, 3; l = 1, 2, 3 • Problem state: S(l, m, n) = ((O1, Nl), (O2, Nm), (O3, Nn)), l, m, n = 1, 2, 3 • Operation: OP(Oi, Nl) = (Oi, Nm), l ≠ m • Rules: 1. OP is a unary operator; 2. When OBJj = (Oj, Nm), OP(Oi, Nl) = (Oi, Nm) is true if Oi > Oj; 3. When OBJi = (Oi, Nl) and OBJj = (Oj, Nl), OP(Oi, Nl) is true if Oi > Oj • Goal: S(l’, m’, n’) -> S(l”, m”, n”) • Optimal sequence: one or more shortest sequences of operations for S(l’, m’, n’) -> S(l”, m”, n”) In this ontology (the abstract structure of the TOH), O1, O2, and O3 are the three levels of the ordinal dimension, and N1, N2, and N3 are the three levels of the nominal dimension. An object OBJi = (Oi, Nl) can be at any of the three levels of the nominal dimension: (Oi, N1), (Oi, N2), or (Oi, N3).
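Read this way, the ontology is small enough to execute. The sketch below is my encoding of it (not the authors' code); `legal_operations` applies rules 2 and 3 to a state S(l, m, n).

```python
# A minimal sketch (my reading of the ontology above, not the authors' code):
# each object pairs an ordinal level with a nominal level, a state S(l, m, n)
# assigns O1, O2, O3 to nominal levels, and OP relocates one object subject
# to rules 2 and 3.
ORD = {1: 3, 2: 2, 3: 1}        # ordinal levels: O1 > O2 > O3

def legal_operations(state):
    """state = (l, m, n): Oi sits at nominal level state[i-1]. Yield (i, new_l)."""
    for i in (1, 2, 3):
        here = state[i - 1]
        # rule 3: Oi may leave its nominal level only if it out-ranks
        # every other object sharing that level
        if any(ORD[j] > ORD[i] and state[j - 1] == here
               for j in (1, 2, 3) if j != i):
            continue
        for new_l in (1, 2, 3):
            if new_l == here:
                continue                          # OP must change the level
            # rule 2: Oi must out-rank every object already at new_l
            if all(ORD[i] > ORD[j]
                   for j in (1, 2, 3) if j != i and state[j - 1] == new_l):
                yield (i, new_l)

print(list(legal_operations((1, 1, 1))))  # only O1 may move: [(1, 2), (1, 3)]
```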
Experiment Conclusions The extrinsic overhead of a GUI is statistically independent of the intrinsic difficulty of the work This extends Kieras & Polson’s (1985) prediction of solution time from procedure count • Overhead actions were 31% of all actions but accounted for 93% of the variance in solution time • Each overhead procedure increased solution time by ~4.2 secs.
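A hedged sketch of the step-wise idea behind these figures, with placeholder data rather than the study's: fit solution time on intrinsic moves first, then add overhead actions and inspect the incremental R² and the overhead coefficient (about 4.2 s per action in the study).

```python
# A hedged sketch of a step-wise regression of this kind (placeholder data,
# not the study's): regress solution time on intrinsic moves, then add
# overhead actions and check the incremental R^2 and overhead coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
moves    = rng.integers(7, 20, size=16).astype(float)    # hypothetical legal moves
overhead = rng.integers(10, 80, size=16).astype(float)   # hypothetical overhead actions
time_s   = 3.0 * moves + 4.2 * overhead + rng.normal(0, 12, size=16)

step1 = sm.OLS(time_s, sm.add_constant(moves)).fit()
step2 = sm.OLS(time_s, sm.add_constant(np.column_stack([moves, overhead]))).fit()

print(f"R^2 moves only:      {step1.rsquared:.2f}")
print(f"R^2 with overhead:   {step2.rsquared:.2f}")
print(f"overhead coefficient: {step2.params[-1]:.2f} s per action")
```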
General Conclusion:2 New Principles of Usability An application will be usable to the extent that it: • Represents work-problem information in a manner that conforms to strategies or work procedures that are recognizable, effective, and efficient • Provides application operating procedures that do not impose overhead tasks in addition to the work procedures
Factors Affecting HCI Work System Performance • Skill level of the user to perform • Work knowledge and procedures • Interface procedures • Intrinsic difficulty of the work problem • Number of operators and their resource requirements • Complexity of problem space and number of state changes • Functionality of the supporting application that performs operators • Automated operators • Continuity of information • Support for effective problem representation • Extrinsic overhead of user operations that are only required due to the way the application was implemented • Can induce major deviation from good procedures for intrinsic work
Application to Software Development • A major part of an IT application’s job is to manage the content, access, and format of information – • The default settings will have a constraining effect on user cognition • Users either have to follow sub-optimal constraints or work harder to overcome ineffective problem-solving strategies • Many users may not know better or may not make the effort • There is not much neutral ground – • Developers who try to over-supply features or data simply create more overhead, plus the strategies needed to deal with it • Systems that do not promote a good work strategy will still constrain user work • The only question is whether the user’s work procedure and strategy will be selected deliberately or by accident
Next Research Questions • Need to clarify the relationship between overhead and distributed cognition • How can designers determine preferred strategy for work that has never existed before? • How can we engineer the needed representation? • Can we validate and calibrate a measurement model: • Ta = f(1/OHa) + Da
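For readability, here is the proposed measurement model restated in LaTeX, with the symbols read from the surrounding slides (the functional form f is left open by the authors):

```latex
% A hedged restatement of the proposed measurement model; symbol readings
% are taken from the surrounding slides, not stated on this one:
%   T_a  : observed task time with application a
%   OH_a : extrinsic overhead imposed by application a
%   D_a  : intrinsic difficulty of the work problem
\[
  T_a = f\!\left(\frac{1}{OH_a}\right) + D_a
\]
```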
Confounding? • The version of TOH with the files-in-folders GUI allocates cognition differently than disks-on-posts: • 1. Users must track the states of 3 containers • 2. Users must integrate states across 3 containers to determine the state of the game • 3. Users must remember the rule for biggest-out-first • 4. Users must remember the rule for biggest-only-in • 5. Users must keep track of size • Responsibilities #3 and #4 produced errors • Responsibilities #1 and #2, in turn, interacted with the constraints of the application to produce overhead actions for • managing windows • moving files • (Overhead is not intrinsic to the task, and competes for working memory) • Therefore, responsibilities #3 and #4 confounded the difficulty of files-in-folders and could have contributed to the longer task-time. However, I scored errors separately from overhead, and overhead alone accounted for 93% of the variance in task-time. • So, I think we are still on the right track, but the 2 versions of TOH varied more than just overhead.
References • Bias, R. & Mayhew, D. (Eds.) (1994) Cost-Justifying Usability. Academic Press. • Card, S.; Moran, T.; & Newell, A. (1983) The Psychology of Human-Computer Interaction. Erlbaum. • Carroll, J. & Campbell, R. (1986). Softening up hard science: Reply to Newell and Card. Human-Computer Interaction, 2, 227-249. • Cooper, A. (1999) The Inmates Are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity. Sams. • Cooper, W. (1996) Cost Analysis. In: Encyclopedia of Operations Research and Management Science. S. Gass & C. Harris (Eds.) Kluwer Academic. • Diaper, D. & Stanton, N. (Eds.) (2004) The Handbook of Task Analysis for Human-Computer Interaction. Lawrence Erlbaum. • Dowell, J. & Long, J. (1989) Towards a conception for an engineering discipline of human factors. Ergonomics, 32, 1513-1535. • Dumas, J. S. & Redish, J. C. (1993). A Practical Guide to Usability Testing. Greenwich, CT: Ablex. • International Committee for Information Technology Standards (2001) Common Industry Format for Usability Test Reports. Doc. #: ANSI/INCITS 354-2001. American National Standards Institute. • Gruber, T. (1993) A translation approach to portable ontologies. Knowledge Acquisition, 5(2): 199-220. • Hyperionics Technology, LLC. (2004) HyperCam 2. • Jacobson, I.; Christerson, M.; Jonsson, P.; & Overgaard, G. (1992) Object-Oriented Software Engineering: A Use Case Driven Approach. Reading, MA: Addison-Wesley. • Karat, J. (1997) Evolving the scope of user-centered design. CACM, 40(7), pp. 33-38. • Kieras, D. & Polson, P. (1985) An approach to the formal analysis of user complexity. International Journal of Man-Machine Studies, 22, pp. 365-394. • Kirschenbaum, S. S.; Gray, W. D.; Ehret, B. D.; & Miller, S. L. (1996). When using the tool interferes with doing the task. In M. J. Tauber (Ed.), Conference Companion of the ACM CHI'96 Conference on Human Factors in Computing Systems (pp. 203-204). New York: ACM Press. • Kotovsky, K.; Hayes, J. R.; & Simon, H. A. (1985). Why are some problems hard? Evidence from Tower of Hanoi. Cognitive Psychology, 17, 248-294. • Long, J. (1997) Research and the design of human-computer interactions or “What Happened to Validation?” In H. Thimbleby, B. O’Conaill, & P. Thomas (Eds.), People and Computers XII (pp. 223-243). NY: Springer. • Nardi, B. (Ed.) (1996) Context and Consciousness: Activity Theory and Human-Computer Interaction. MIT Press. • Nielsen, J. (1993) Usability Engineering. Academic Press. • Rasmussen, J.; Pejtersen, A. M.; & Goodstein, L. P. (1994). Cognitive Systems Engineering. New York: Wiley & Sons. • Scholtz, J.; Wichansky, A.; Butler, K.; Morse, E.; & Laskowski, S. (2002) Quantifying Usability: The Industry Usability Reporting Project. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Baltimore, Sept 30-Oct 4, 2002. • Simon, H. A. & Hayes, J. R. (1976). The understanding process: Problem isomorphs. Cognitive Psychology, 8, 165-190. • van der Veer, G. & van Welie, M. (2004) DUTCH: Designing for Users and Tasks from Concepts to Handles. In: Diaper, D. & Stanton, N. (Eds.) The Handbook of Task Analysis for Human-Computer Interaction. Lawrence Erlbaum. • Vicente, K. J. (1999). Cognitive Work Analysis. Mahwah, NJ: Erlbaum. • Whiteside, J.; Bennett, J.; & Holtzblatt, K. (1988) Usability Engineering: Our Experience and Evolution. In: M. Helander (Ed.) Handbook of Human-Computer Interaction. North-Holland. pp. 791-817. • Wright, P.; Fields, R.; & Harrison, M. (2000) Analyzing Human-Computer Interaction as Distributed Cognition: The Resources Model. Human-Computer Interaction, 15(1), pp. 1-42. • Zhang, J. & Norman, D. A. (1994). Representations in distributed cognitive tasks. Cognitive Science, 18, 87-122. • Zhang, J. & Norman, D. A. (1995). A representational analysis of numeration systems. Cognition, 57, 271-295. • Zhang, J. (1996). A representational analysis of relational information displays. International Journal of Human-Computer Studies, 45, 59-74. • Zhang, J. (1997). Distributed representation as a principle for the analysis of cockpit information displays. International Journal of Aviation Psychology, 7, 105-121.
A Work-centered Framework for Usability A model of work adapted from operations research [5], human factors [7], and cognitive systems [20; 24] has: • An Entity, the object of work whose state is to be changed • Operations that consume resources to change the entity’s state • Constraints on the way state changes can be achieved • Resources, such as time, energy, labor, etc., that are used by operators • Overhead, activity that does not change the state of the entity towards the goal A Work Procedure is a set of operations that accomplishes a goal by changing the state of an entity to satisfy the criteria of the goal An HCI Work System is made up of procedures that are distributed over human and machine resources, including user interfaces
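As one way to see how these pieces fit together, here is a minimal sketch (my own framing, not the authors') of the framework's vocabulary as types; the overhead share of a procedure falls directly out of the operation labels.

```python
# A minimal sketch (my own framing, not the authors') of the work model's
# vocabulary as types: an entity whose state is changed, operations that
# consume resources, and a procedure whose overhead share can be scored.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Entity:
    state: Dict[str, object]                 # the object of work

@dataclass
class Operation:
    name: str
    apply: Callable[[Dict[str, object]], Dict[str, object]]
    cost: float                              # resources consumed, e.g. seconds
    advances_goal: bool                      # False marks extrinsic overhead

@dataclass
class WorkProcedure:
    operations: List[Operation] = field(default_factory=list)

    def run(self, entity: Entity) -> Entity:
        for op in self.operations:
            entity = Entity(op.apply(entity.state))
        return entity

    def overhead_share(self) -> float:
        total = sum(op.cost for op in self.operations)
        overhead = sum(op.cost for op in self.operations if not op.advances_goal)
        return overhead / total if total else 0.0

# e.g. an overhead-only step such as rearranging windows
resize = Operation("resize window", lambda s: s, cost=2.5, advances_goal=False)
print(WorkProcedure([resize]).overhead_share())   # 1.0
```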
Work-centered Design Paradox • The goal of HCI design is effective and efficient work performance, not merely good-looking representations (e.g., Schultz et al., 2003) • The quality of a representation is defined by the effectiveness and efficiency of the work procedures it induces • But... information technology’s greatest value lies in changing the nature of work, sometimes to a type of work that has never existed before • So... we need to design work procedures for a type of work that has never existed before (similar to Carroll & Rosson’s task-artifact framework, 1992)