
Agents, Mobility, Ubiquity & Virtuality



  1. Mobile Human Computer Interaction COMP40300: Context-Sensitive Service Delivery. Lectures 7 & 8: Agents, Mobility, Ubiquity & Virtuality. Professor Gregory O’Hare, School of Computer Science & Informatics, University College Dublin (UCD)

  2. Objectives • Human Computer Interaction (HCI) • Human • Machine • Paradigms • Models of interaction • Mobile HCI

  3. Background • World War II • Weapons research • Ergonomics • Physical characteristics • Effects on performance • Human factors • Ergonomics • Cognitive issues • Various systems • Computer, mechanical, manuals

  4. HCI Today • 1980s • Interaction between people and computers • Physical • Psychological • Theoretical • Multidisciplinary by nature “HCI involves the design, implementation and evaluation of interactive systems in the context of a user’s task and work”

  5. Key Players • Human • Individual • Group (together/in sequence) • Computer • PC • Cluster • Process control • Embedded systems • Interaction • Direct (dialog box) • Indirect (batch processing/Intelligent sensors)

  6. The Human Component • Input • Senses • Sight, hearing, touch, taste, smell • Output • Effectors • Limbs, eyes, fingers, head, vocal system • Primary Senses • Sight, hearing, touch • Primary Effectors • fingers

  7. Design - Vision • Consider: “Ability to read or distinguish falls off inversely as the distance from our point of focus increases” • Yet: “peripheral vision very sensitive to movement” • What might the implications of these facts be for error messages, for example?

  8. Design - Hearing • Consider how much information you hear • Consider • Loudness • Pitch • Location • Filtering • Cocktail party effect • In short • Sound is rarely used to its potential in interface design

  9. Design - Touch • Touch = haptic perception • Consider e-commerce • Successful areas include • CDs, Books, Travel services • Less successful areas include: • Clothes • Why? • Consider Virtual Reality limitations.

  10. Haptics - Terminology • Three types of sensory receptor in the skin: • Thermoreceptors • Heat & cold • Nociceptors • Intense pressure, heat or cold • Mechanoreceptors • Pressure • Kinesthesis • Awareness of the position of body & limbs • Consider touch typing

  11. The Computer Component • Input Devices • Text entry • Keyboard e.g. qwerty • technological inertia! • Numeric keypads • Calculator • Phone pad • Notice how one hand and the thumb are used! • Handwriting recognition • Stroke before shape • Consider signature authentication • 25 words a minute at most • Gesture recognition - e.g. drawing a line through a word so as to delete it (see the sketch below)
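
As an aside on gesture recognition: a minimal sketch of classifying a pen stroke, given as a list of (x, y) points, as a horizontal strike-through that could trigger a delete. The thresholds and function name are assumptions for illustration, not from any real toolkit.

    def is_strikethrough(points, min_width=40, max_height=15):
        """Return True if the stroke is roughly a long, flat horizontal line."""
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        width = max(xs) - min(xs)
        height = max(ys) - min(ys)
        # We only test the overall shape: a wide, flat sweep across a word.
        return width >= min_width and height <= max_height

    stroke = [(10, 100), (30, 102), (55, 99), (80, 101)]
    if is_strikethrough(stroke):
        print("delete the word under the stroke")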

  12. Computer Input II • Speech recognition • Accent, emotion, illness • Confidentiality? • Positioning & Pointing • Mouse • Touchpad • Trackball/Thumbwheel • Joystick • Touch-sensitive screens • Stylus/light pen • Digitizing Tablet • Eyegaze

  13. Computer Output • Bitmap displays • CRT • LCD • Digital paper • 3D Space • Cockpit/virtual Controls • 3D mouse • Dataglove • VR helmet • Whole-body tracking

  14. Computer Output II • Specialised displays • Dials • Gauges • Flashing diodes • Sound • Touch, feel & smell • Physical controls e.g. microwave • Environment/bio-sensing

  15. The Interaction Component • What does the user want? • What does the system do? • How do we translate? • Interaction = communication between user and system • What is the role of the interface? • How can we understand what is happening?

  16. Why a Model of Interaction? • What is going on in the interaction? • Where is the likely source of problems that occur during the interaction? • How do we compare and contrast different interaction styles? • Where do we begin when considering interaction problems?

  17. Terminology Check

  18. Norman’s Interaction Model Norman (1988) identifies two stages of interaction: • Execution • Establishing the goal • Forming the intention • Specifying the action sequence • Executing the action • Evaluation • Perceiving the system state • Interpreting the system state • Evaluating with respect to the original goals and intentions • Repeat the process (see the sketch below)
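
One way to see the two stages is as a loop. Below is a minimal Python sketch of the execution-evaluation cycle; the method names are illustrative stand-ins, not part of Norman's text.

    def interaction_cycle(user, system, goal):
        while True:
            # Execution: goal -> intention -> action sequence -> execution
            intention = user.form_intention(goal)
            actions = user.specify_actions(intention)
            system.execute(actions)
            # Evaluation: perceive -> interpret -> evaluate against the goal
            state = user.perceive(system)
            if user.evaluate(user.interpret(state), goal):
                break  # goal achieved; otherwise the cycle repeats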

  19. Gulf of Execution • Why do some interfaces cause problems? • User and system do not use the same language! • The difference between the user’s formulation of the actions to achieve a goal and the actions allowed by the system • The interface should aim to minimise this gulf • The smaller the gulf, the more effective the interface

  20. Gulf of Evaluation • The “distance” between the physical presentation of the system’s state and the expectation of the user • The more effort required to interpret the presentation, the less effective the interaction

  21. Human Error Two classifications of error • Slips • Understand the goal (formulate the correct action) • Mistype (execute incorrectly) • Mistakes • May not formulate the correct action • Does a magnifying-glass icon indicate find or magnify? Corrective actions • Better screen design, more space between buttons (slips) • Radical redesign, improved training (mistakes)

  22. Interaction Framework • Abowd & Beale • Norman’s model neglects the system’s side of the communication, so … • Four components are proposed • System • User • Input • Output • Each component has its own language.

  23. Interaction Framework Components

  24. Interaction Framework - Translations

  25. Interaction Framework Cycle • USER • Formulate goal and task • Articulate this to the INPUT • INPUT • Translate into operations to be performed by the SYSTEM • SYSTEM • Perform the necessary operations • OUTPUT • Present the current state of the SYSTEM • USER • Observe the results (see the sketch below)
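
A hedged sketch of the cycle above, with the four components and their translations modelled as assumed Python classes; the "languages" here are stand-in data types chosen for illustration.

    class User:
        def formulate_task(self):
            return "open file"               # task language

    class Input:
        def articulate(self, task):
            return {"op": "open"}            # input language -> core language

    class System:
        def perform(self, operation):
            return {"state": "file open"}    # core language

    class Output:
        def present(self, state):
            return f"Status: {state['state']}"  # output language

    user, inp, system, out = User(), Input(), System(), Output()
    task = user.formulate_task()        # USER formulates the goal and task
    operation = inp.articulate(task)    # INPUT translates for the SYSTEM
    state = system.perform(operation)   # SYSTEM performs the operations
    display = out.present(state)        # OUTPUT presents the new state
    print(display)                      # USER observes the results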

  26. INTERACTION • Taxonomy of Interaction • Explicit Interaction • Implicit Interaction • Multimodal Interaction

  27. Explicit Interaction • Human Model • Initiate an action • Anticipate a reaction • Computer Model • Discrete event generation • Discrete response generated • Event-based • Recall the light-switch example • Explicit input • Explicit output

  28. Implicit Interaction • Human Model • No consciously explicit behaviour • No expectation of a response • Task performed in the “normal” way • Computer Model • User behaviour treated as an implicit input (but not interaction per se) • Observation • Interpretation • Reaction may be explicit or implicit • Warning • Map update in response to movement (see the sketch below)
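
A minimal sketch of the map-update example, with a stand-in GPS reader in place of a real sensor: the user's movement is observed and interpreted as implicit input, and the map update is the (explicit) reaction. All names here are assumptions for illustration.

    import random
    import time

    def read_gps():
        # Stand-in for a real sensor: this is observation, not explicit input.
        return (53.3 + random.uniform(-0.01, 0.01),
                -6.2 + random.uniform(-0.01, 0.01))

    def update_map(position):
        print(f"Map recentred on {position}")  # explicit output (reaction)

    last = None
    for _ in range(3):
        position = read_gps()                  # observe
        if last is None or position != last:   # interpret: the user has moved
            update_map(position)               # react
            last = position
        time.sleep(0.1)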

  29. Multimodal Interaction • Several Input Channels • Several Output Channels • Examples • Voice • Gesture • Handwriting • Haptic • May be used in an “as-needed” fashion • May be used simultaneously

  30. Multimodal Interaction - Architectural Considerations • Input • Parallel recognition for each input channel • A methodology to interpret the individual input modalities • Analysis must be time-sensitive (see the fusion sketch below) • Output • A module for dialogue management • Criteria for adapting I/O to needs and environment
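
A minimal sketch of time-sensitive fusion: timestamped events from two parallel recognisers (speech and gesture) are combined only if they fall within a fusion window. The window size, event format and names are all assumptions for illustration.

    FUSION_WINDOW = 1.5  # seconds; an assumed threshold

    def fuse(speech_events, gesture_events):
        """Pair each utterance with any gesture close enough in time."""
        commands = []
        for s_time, utterance in speech_events:
            for g_time, target in gesture_events:
                if abs(s_time - g_time) <= FUSION_WINDOW:
                    commands.append((utterance, target))
        return commands

    speech = [(10.2, "delete that")]
    gesture = [(10.6, "icon_42"), (25.0, "icon_7")]
    print(fuse(speech, gesture))  # -> [('delete that', 'icon_42')]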

  31. Interaction Paradigms • Time sharing (1950s) • Single computer, multiple users • Video Display Units • SketchPad • Ivan Sutherland • Personal Computing • Alan Kay • Visual programming (Smalltalk)

  32. Paradigms II • Windows • WIMP Interfaces • Hypertext • 1960s! • Multi-Modality • Simultaneous use of multiple communication channels for input & output

  33. Paradigms III • Computer Supported Cooperative Work (CSCW) • World Wide Web

  34. Paradigms IV • Agent-based Interfaces • Simple email filter program • Ubiquitous Computing • Ratio of devices to people • Context/sensor based computing • Automatic doors etc
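
A hedged sketch of the "simple email filter program" mentioned above, written as an agent applying condition-action rules autonomously to incoming mail; the message fields and rules are assumptions for illustration.

    RULES = [
        (lambda m: "unsubscribe" in m["body"].lower(), "junk"),
        (lambda m: m["sender"].endswith("@ucd.ie"), "work"),
    ]

    def filter_agent(message):
        """Route a message to the folder of the first rule that fires."""
        for condition, folder in RULES:
            if condition(message):
                return folder
        return "inbox"  # default when no rule fires

    msg = {"sender": "lecturer@ucd.ie", "body": "Lecture 7 notes attached"}
    print(filter_agent(msg))  # -> "work"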

  35. Usability • Dependent on technological advances AND their creative application to augment and enhance the power of the human • “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” – ISO 9241-11

  36. User Centered Design • Requirements of the user are factored into all stages of the product lifecycle. • Recall the Waterfall approach … • ISO 13407 • Specify the context of use • Who, what, why … • Specify requirements • Specific business or user objectives • Create the design solution • Using standard approaches … • Evaluate the design • Usability testing, for example SUMI

  37. Norman’s Seven Principles To transform difficult tasks into simple ones…… • Use both knowledge of the world and knowledge in the head • Simplify the structure of tasks • Make things visible • Get the mappings right • Exploit the power of constraints, both natural & artificial • Design for error • When all else fails - standardise

  38. Shneiderman’s Eight Golden Rules of Interface Design • Strive for consistency • Enable frequent users to use shortcuts • Offer informative feedback • Design dialogs to yield closure • Offer error prevention and simple error handling • Permit easy reversal of actions (see the undo sketch below) • Support internal locus of control • Reduce short-term memory load
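
A minimal sketch of "permit easy reversal of actions" as an undo stack; the Editor class is an illustration of the rule, not taken from Shneiderman.

    class Editor:
        def __init__(self):
            self.text = ""
            self.history = []               # undo stack of previous states

        def type_text(self, s):
            self.history.append(self.text)  # save state before the action
            self.text += s

        def undo(self):
            if self.history:                # easy reversal: one call, no penalty
                self.text = self.history.pop()

    e = Editor()
    e.type_text("Hello ")
    e.type_text("world")
    e.undo()
    print(e.text)  # -> "Hello "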

  39. Mobile HCI • Do “classic” HCI principles carry over to mobile computing? • Is HCI even important for mobile users? • How can mobile users and their contexts be better understood?

  40. Mobile Computing Paradigms Consider • Classic mobile computing usage • Ubiquitous computing • Wearable computing • Context-aware computing As an aside, consider • Autonomic computing • Proactive computing What is the key objective of ambient intelligence?

  41. The Mobile User • Limited attention span – interactions with the real world are more important than those with the device • The user’s hands may be occupied • Tasks may require a high degree of attention so as to avoid danger • The user may adopt a variety of postures and positions • Interactions with the environment are context-dependent • Interaction with the mobile device is high-speed, driven by external circumstances

  42. Some Design Challenges • Designing for mobile users, their tasks and contexts • Accommodating the diversity of devices, services and applications • The current inadequacy of HCI models to address the varied demands of mobile systems • The demands of evaluating mobile systems

  43. Sources • Shneiderman, B. (1998). Designing the User Interface: Strategies for Effective Human-Computer Interaction (3rd ed.). Menlo Park, CA: Addison Wesley. • Norman, D. (1988). The Design of Everyday Things. New York: Basic Books.

  44. Required Reading • Dix, A., Finlay, J., Abowd, G., & Beale, R. Human-Computer Interaction (3rd ed.), Chapter 3.
