Mobile Human Computer Interaction COMP40300: Context-Sensitive Service Delivery
Lectures 7 & 8: Agents, Mobility, Ubiquity & Virtuality
Professor Gregory O’Hare, School of Computer Science & Informatics, University College Dublin (UCD)
Objectives • Human Computer Interaction (HCI) • Human • Machine • Paradigms • Models of interaction • Mobile HCI
Background • World War II • Weapons research • Ergonomics • Physical characteristics • Effects on performance • Human factors • Ergonomics • Cognitive issues • Various systems • Computer, mechanical, manuals
HCI Today • 1980s • Interaction between people and computers • Physical • Psychological • Theoretical • Multidisciplinary by nature “HCI involves the design, implementation and evaluation of interactive systems in the context of a user’s task and work”
Key Players • Human • Individual • Group (together/in sequence) • Computer • PC • Cluster • Process control • Embedded systems • Interaction • Direct (dialog box) • Indirect (batch processing/Intelligent sensors)
The Human Component • Input • Senses • Sight, hearing, touch, taste, smell • Output • Effectors • Limbs, eyes, fingers, head, vocal system • Primary Senses • Sight, hearing, touch • Primary Effectors • fingers
Design - Vision • Consider: “Ability to read or distinguish falls off inversely as the distance from our point of focus increases” • Yet: “peripheral vision very sensitive to movement” • What might the implications of these facts be for error messages, for example?
Design - Hearing • Consider how much information you hear • Consider • Loudness • Pitch • Location • Filtering • Cocktail party effect • In short • Sound is rarely used to its full potential in interface design
Design - Touch • Touch = haptic perception • Consider e-commerce • Successful areas include • CDs, Books, Travel services • Less successful areas include: • Clothes • Why? • Consider Virtual Reality limitations.
Haptics - Terminology • Three types of sensory receptor in the skin: • Thermoreceptors • Heat & cold • Nociceptors • Intense pressure, heat or cold • Mechanoreceptors • Pressure • Kinesthesis • Awareness of the position of body & limbs • Consider touch typing
The Computer Component • Input Devices • Text entry • Keyboard e.g. qwerty • technological inertia! • Numeric keypads • Calculator • Phone pad • Notice how one hand and the thumb are used! • Handwriting recognition • Stroke before shape • Consider signature authentication • 25 words a minute at most • Gesture recognition - e.g. drawing a line through a word so as to delete it
Computer Input II • Speech recognition • Accent, emotion, illness • Confidentiality? • Positioning & Pointing • Mouse • Touchpad • Trackball/Thumbwheel • Joystick • Touch-sensitive screens • Stylus/light pen • Digitizing Tablet • Eyegaze
Computer Output • Bitmap displays • CRT • LCD • Digital paper • 3D Space • Cockpit/virtual Controls • 3D mouse • Dataglove • VR helmet • Whole-body tracking
Computer Output II • Specialised displays • Dials • Gauges • Flashing diodes • Sound • Touch, feel & smell • Physical controls e.g. microwave • Environment/bio-sensing
The Interaction Component • What does the user want? • What does the system do? • How do we translate? • Interaction = communication between user and system • What is the role of the interface? • How can we understand what is happening?
Why a Model of Interaction? • What is going on in the interaction? • Where is the likely source of problems that occur during the interaction? • How do we compare and contrast different interaction styles? • Where do we begin when considering interaction problems?
Norman’s Interaction Model Norman (1988) – two stages to interaction: • Execution • Establish the goal • Form the intention • Specify the action sequence • Execute the action • Evaluation • Perceive system state • Interpret system state • Evaluate with respect to original goals and intentions • Repeat process
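A minimal sketch of how the execution and evaluation stages form a loop. The stage names follow Norman (1988); the `Thermostat` system and the numeric goal are invented purely for illustration, not part of the model itself.

```python
# Illustrative sketch of Norman's execution-evaluation cycle (Norman, 1988).
# The Thermostat "system" and the temperature goal are invented for the example.

class Thermostat:
    """A trivially simple system the user interacts with."""
    def __init__(self, temperature=18):
        self.temperature = temperature

    def press_up(self):          # the only action the system affords
        self.temperature += 1

    def display(self):           # what the user can perceive
        return self.temperature


def interaction_cycle(system, goal_temperature):
    while True:
        # --- Execution ---
        intention = "raise temperature"                 # form the intention
        action_sequence = [system.press_up]             # specify the action sequence
        for action in action_sequence:                  # execute the actions
            action()

        # --- Evaluation ---
        perceived = system.display()                    # perceive system state
        interpreted = f"display reads {perceived} degrees"   # interpret it
        goal_met = perceived >= goal_temperature        # evaluate against the goal
        print(f"{intention}: {interpreted}, goal met: {goal_met}")
        if goal_met:
            break                                       # otherwise repeat the cycle


interaction_cycle(Thermostat(), goal_temperature=21)
```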
Gulf of Execution • Why do some interfaces cause problems? • User and system do not use the same language! • Difference between users’ formulation of the actions to achieve a goal and the actions allowed by the system. • Interface should aim to minimise this gulf! • The smaller the gulf, the more effective the interface!
Gulf of Evaluation • “distance” between the physical presentation of the system's state and the expectation of the user • The more effort required to interpret the presentation, the less effective the interaction
Human Error Two classifications of error • Slips • Understand goal (formulate correct action) • Mistype (execute incorrectly) • Mistakes • May not formulate the correct action • Does a magnifying glass icon indicate find or magnify? Corrective actions • Better screen design, more space between buttons (slips) • Radical redesign, improved training (mistakes)
Interaction Framework • Abowd & Beale • Norman’s model neglects system communication so …. • Four Components proposed • System • User • Input • Output • Each Component has its own language.
Interaction Framework Cycle • USER • Formulate Goal and Task • Articulate this to the INPUT • INPUT • Translate into operations to be performed by the SYSTEM • SYSTEM • Perform the necessary operations • OUTPUT • Present current state of SYSTEM • USER • Observe results
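The cycle can be sketched as four components, each with its own "language", and a translation at every step. The component names follow Abowd & Beale; the "save file" task and every function below are invented for the example.

```python
# Illustrative sketch of the Abowd & Beale interaction framework:
# USER -> INPUT -> SYSTEM -> OUTPUT -> USER, each arrow a translation
# between languages. The "save my work" task is invented for the example.

def user_articulates(goal):
    """USER: formulate a goal and articulate it in the input language."""
    return {"key_chord": "Ctrl+S"} if goal == "save my work" else {}

def input_translates(articulation):
    """INPUT: translate the articulation into a system operation."""
    return "write_buffer_to_disk" if articulation.get("key_chord") == "Ctrl+S" else None

def system_performs(operation):
    """SYSTEM: perform the operation and expose its new state."""
    return {"dirty": False} if operation == "write_buffer_to_disk" else {"dirty": True}

def output_presents(state):
    """OUTPUT: render the system state in the output language."""
    return "title bar shows 'saved'" if not state["dirty"] else "title bar shows '*unsaved*'"

# The USER then observes the presentation and judges whether the goal was met.
presentation = output_presents(system_performs(input_translates(user_articulates("save my work"))))
print(presentation)
```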
INTERACTION • Taxonomy of Interaction • Explicit Interaction • Implicit Interaction • Multimodal Interaction
Explicit Interaction • Human Model • Initiate an action • Anticipate a reaction • Computer Model • Discrete event generation • Discrete response generated • Event-based • Recall Light switch example • Explicit input • Explicit output
Implicit Interaction • Human Model • No consciously explicit behavior • No expectation of response • Task performed in “normal” way • Computer Model • User Behavior treated as an implicit input (but not interaction per se) • Observation • Interpretation • Reaction may be Explicit or Implicit • Warning • Map update in response to movement
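The two styles above can be contrasted in a small sketch. The light switch and the location-triggered map update are just illustrative stand-ins; the 50 m threshold and 1-D "position" are assumptions made to keep the example self-contained.

```python
# Explicit interaction: the user initiates a discrete event and expects a response.
def on_switch_pressed(light_on):
    return not light_on            # explicit input -> explicit output

# Implicit interaction: the system observes behaviour the user performs anyway
# (here, movement) and reacts without an explicit request.
def on_location_changed(map_centre, new_position, threshold_m=50):
    distance = abs(new_position - map_centre)      # 1-D stand-in for real geometry
    if distance > threshold_m:                     # interpretation of the observation
        return new_position                        # reaction: re-centre the map
    return map_centre                              # no visible reaction

print(on_switch_pressed(light_on=False))                      # True: the light comes on
print(on_location_changed(map_centre=0, new_position=120))    # 120: the map follows the user
```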
Multimodal Interaction • Several Input Channels • Several Output Channels • Examples • Voice • Gesture • Handwriting • Haptic • May be used in an “as-needed” fashion • May be used simultaneously
MM Interaction - Architectural Considerations • Input • Parallel recognition for each input channel • Methodology to interpret the individual input modalities • Analysis must be time sensitive • Output • Module for dialogue management • Criteria for adapting I/O to needs and environment
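One way to make the "time sensitive" requirement concrete is a fusion step that only combines events from different channels when they arrive close together. A hedged sketch follows; the 0.5 s window, the event fields and the "put that there" command are assumptions for illustration, not a standard architecture.

```python
# Illustrative late-fusion step for multimodal input ("put that there"):
# a speech event and a pointing gesture are merged only if their timestamps
# fall within a short window. Window size and event format are assumptions.

def fuse(speech_event, gesture_event, window_s=0.5):
    if abs(speech_event["t"] - gesture_event["t"]) > window_s:
        return None                                   # too far apart in time to combine
    if speech_event["text"] == "put that there":
        return {"command": "move", "target": gesture_event["pointed_at"]}
    return None

speech  = {"t": 12.10, "text": "put that there"}      # recognised on the voice channel
gesture = {"t": 12.35, "pointed_at": (240, 310)}      # recognised on the gesture channel
print(fuse(speech, gesture))                          # {'command': 'move', 'target': (240, 310)}
```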
Interaction Paradigms • Time sharing (1950s) • Single computer, multiple users • Video Display Units • SketchPad • Ivan Sutherland • Personal Computing • Alan Kay • Visual programming (Smalltalk)
Paradigms II • Windows • WIMP Interfaces • Hypertext • 1960s! • Multi-Modality • Simultaneous use of multiple communication channels for input & output
Paradigms III • Computer Supported Cooperative Work (CSCW) • World Wide Web
Paradigms IV • Agent-based Interfaces • Simple email filter program • Ubiquitous Computing • Ratio of devices to people • Context/sensor based computing • Automatic doors etc
Usability • Dependent on technological advances AND on their creative application to augment and enhance the power of the human • “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” – ISO 9241-11
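The three terms in the ISO definition are often operationalised per task: effectiveness as completion rate, efficiency as time on successful tasks, satisfaction as a questionnaire score. A minimal sketch, with the session data and exact formulas invented for the example (ISO 9241-11 defines the concepts, not these computations):

```python
# Illustrative usability measures in the spirit of ISO 9241-11.
# Session data and formulas are invented for the example.

sessions = [
    {"completed": True,  "time_s": 42, "satisfaction_1_to_5": 4},
    {"completed": True,  "time_s": 55, "satisfaction_1_to_5": 3},
    {"completed": False, "time_s": 90, "satisfaction_1_to_5": 2},
]

completed = [s for s in sessions if s["completed"]]
effectiveness = len(completed) / len(sessions)                      # task completion rate
efficiency = sum(s["time_s"] for s in completed) / len(completed)   # mean time per successful task
satisfaction = sum(s["satisfaction_1_to_5"] for s in sessions) / len(sessions)

print(f"effectiveness: {effectiveness:.0%}, "
      f"mean task time: {efficiency:.0f} s, "
      f"mean satisfaction: {satisfaction:.1f}/5")
```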
User Centered Design • Requirements of user are factored into all stages of the product lifecycle. • Recall Waterfall approach…. • ISO 13407 • Specify the Context of use • Who, what, why …. • Specify requirements • Specific business or user objectives • Create Design Solution • Using standard approaches … • Evaluate Design • Usability testing, for example, SUMI
Norman’s Seven Principles To transform difficult tasks into simple ones…… • Use both knowledge of the world and knowledge in the head • Simplify the structure of tasks • Make things visible • Get the mappings right • Exploit the power of constraints, both natural & artificial • Design for error • When all else fails - standardise
Shneiderman’s Eight Golden Rules of User Interface Design • Strive for consistency • Enable frequent users to use shortcuts • Offer informative feedback • Design dialogs to yield closure • Offer error prevention and simple error handling • Permit easy reversal of actions • Support internal locus of control • Reduce short-term memory load
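"Permit easy reversal of actions" is the rule most directly visible in code: keep a history of states so any step can be undone. A minimal sketch, with the tiny text "editor" invented for the example; it also shows how cheap reversal limits the cost of a slip (right intention, wrong execution).

```python
# Minimal undo stack illustrating "permit easy reversal of actions".
# The tiny text editor is invented for the example.

class Editor:
    def __init__(self):
        self.text = ""
        self._history = []                # snapshots of previous states

    def type(self, s):
        self._history.append(self.text)   # record state before changing it
        self.text += s

    def undo(self):
        if self._history:                 # easy reversal: one call restores the last state
            self.text = self._history.pop()

e = Editor()
e.type("Hello, ")
e.type("wrold")        # a slip: correct intention, incorrect execution
e.undo()               # reversal recovers from the slip cheaply
e.type("world")
print(e.text)          # Hello, world
```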
Mobile HCI • Do “classic” HCI principles carry over to mobile computing? • Is HCI even important for mobile users? • How can mobile users and their contexts be better understood?
Mobile Computing Paradigms Consider • Classic mobile computing usage • Ubiquitous computing • Wearable computing • Context-aware computing As an aside, consider • Autonomic computing • Proactive computing What is the key objective of ambient intelligence?
The Mobile User • Limited attention span – interactions with the real world are more important than with the device • User’s hands may be occupied • Tasks may require a high degree of attention so as to avoid danger • User may adopt a variety of postures and positions • Interactions with the environment are context dependent • Interaction with mobile device is high speed, driven by external circumstances
Some Design Challenges • Designing for mobile users, their tasks and contexts • Accommodating the diversity of devices, services and applications • Current inadequacy of HCI models to address the varied demands of mobile systems • The demands of evaluating mobile systems
Sources • Shneiderman, B. (1998). Designing the User Interface: Strategies for Effective Human-Computer Interaction (3rd ed.). Menlo Park, CA: Addison Wesley. • Norman, D. A. (1988). The Design of Everyday Things. Basic Books.
Required Reading • Dix, A., Finlay, J., Abowd, G. & Beale, R. Human-Computer Interaction (3rd ed.), Chapter 3.