Explore the psychology behind everyday actions and how they relate to human-computer interaction. Learn about conscious vs unconscious thought, emotional design, and designing for human activity. Discover how to bridge the gaps between users and interfaces to reduce cognitive effort.
Chapter 2 The Psychology of Everyday Actions
Chapter 2
• Gulfs of execution/evaluation
• Conscious vs. subconscious thought
• Declarative vs. procedural knowledge
• Emotional design
• Visceral/behavioral/reflective activity/responses
• People as storytellers/causation/blame
• Failure and blame vs. breakdowns and recovery
• Designing for human activity and variation
Abstraction of User Interaction
• What is going on in computer-human interaction?
• How do users decide what to do to achieve their goals?
The gulfs
• The ‘gulfs’ make explicit the gaps that exist between the user and the interface
• The gulf of execution: the distance from the user to the physical system
• The gulf of evaluation: the distance from the physical system to the user
• Both gulfs need to be bridged in order to reduce the cognitive effort required to perform a task
Norman’s (1986) Theory of Action
• Proposes seven stages of an activity:
1. Establish a goal
2. Form an intention
3. Specify an action sequence
4. Execute the action sequence
5. Perceive the system state
6. Interpret the state
7. Evaluate the system state with respect to the goals and intentions
An example: reading breaking news on the web
1. Establish a goal → find out about breaking news; decide on a news website
2. Form an intention → check out the BBC website
3. Specify an action sequence → move cursor to the link in the browser
4. Execute the action sequence → click on the mouse button
5. Perceive the system state → see a new page pop up on the screen
6. Interpret the state → read that it is the BBC website
7. Evaluate it with respect to the goal → read the breaking news
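For readers with a programming background, here is a minimal sketch (my own illustration, not from Norman or these slides) that walks the BBC-news example through one pass of the seven-stage cycle; the `act` and `perceive` callables are hypothetical stand-ins for the browser.

```python
# A minimal sketch of Norman's seven-stage cycle applied to the BBC-news
# example above. `act` and `perceive` are hypothetical stand-ins for the
# browser; this is an illustration, not a real UI.

def action_cycle(goal: str, act, perceive) -> bool:
    """Run one pass of execution and evaluation for a single goal."""
    intention = f"satisfy: {goal}"                   # 2. form an intention
    plan = ["move cursor to link", "click mouse"]    # 3. specify an action sequence
    for step in plan:                                # 4. execute the action sequence
        act(step)
    state = perceive()                               # 5. perceive the system state
    interpretation = state.lower()                   # 6. interpret the state
    return goal in interpretation                    # 7. evaluate against the goal

# Example use with trivial stand-ins:
log = []
ok = action_cycle(
    "breaking news",                                 # 1. establish a goal
    act=log.append,
    perceive=lambda: "BBC front page showing breaking news",
)
print(log, ok)   # ['move cursor to link', 'click mouse'] True
```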
How realistic?
• Human activity does not proceed in such an orderly and sequential manner
• It is more usual for stages to be missed, repeated, or done out of order
• People do not always have a clear goal in mind; they also react to the world
• The theory is only an approximation of what happens and is greatly simplified
• Still, it helps designers think about how to help users monitor their actions
Conscious vs. Unconscious Thought
• On first use of a new item, almost all activity requires conscious thought
• Over time, people no longer have to think about many actions (e.g. typing)
• This frees the conscious mind to work on higher-level tasks
• But it removes users’ ability to describe their own processes
Alternative Representations of Knowledge
• Stored as data (declarative)
  • Memory of information itself
  • Think of databases or indexed document stores
  • Examples of data you know as a CS expert?
• Stored as process (procedural)
  • Memory of procedures for determining information
  • Think of production rules
  • Examples of processes you know as a CS expert?
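A small illustration of the two representations (my own example, not from the slides): the same fact, the freezing point of water, stored as data and looked up, versus derived by a procedure each time it is needed.

```python
# Declarative: the answer is stored and simply looked up, like a database row.
FACTS = {
    ("freezing_point_C", "water"): 0.0,
    ("boiling_point_C", "water"): 100.0,
}

def lookup(attribute: str, entity: str) -> float:
    return FACTS[(attribute, entity)]

# Procedural: the answer is recomputed by following a rule each time,
# like a production rule "IF unit is F THEN C = (F - 32) * 5/9".
def to_celsius(fahrenheit: float) -> float:
    return (fahrenheit - 32.0) * 5.0 / 9.0

print(lookup("freezing_point_C", "water"))  # 0.0, recalled as data
print(to_celsius(32.0))                     # 0.0, derived by a process
```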
How Much Knowledge?
• A professional’s knowledge is adequate when she knows about as much as other professionals in her domain
• The time available for acquiring and maintaining knowledge limits what one person can cover in large domains
• Describing expertise: roughly 50,000 chunks across disciplines, or about 10 years of learning
• When a domain exceeds this:
  • Use of externalized information stores increases
  • The field divides into subfields (specialization)
• Science proceeds by producing new knowledge and compressing old knowledge into more general theories
Affective (Emotional) Design
• HCI has traditionally been about designing efficient and effective systems
• It is now also about designing interactive systems that make people respond in certain ways
  • e.g. to be happy, to be trusting, to learn, to be motivated
• Color, icons, sounds, graphical elements, and animations are used to make the ‘look and feel’ of an interface appealing
  • This conveys an emotional state
  • In turn, this can affect the usability of an interface
• People are prepared to put up with certain shortcomings of an interface (e.g. a slow download rate) if the end result is appealing and aesthetically pleasing
Example: Friendly interfaces
• Microsoft pioneered friendly interfaces for technophobes with its ‘At home with Bob’ software
• 3D metaphors based on familiar places (e.g. living rooms)
• Agents in the guise of pets (e.g. a bunny, a dog) were included to talk to the user
• The aim was to make users feel more at ease and comfortable
Responses and Actions
• Visceral: unconscious, almost hard-wired response
• Behavioral: the result of unconscious, learned knowledge
• Reflective: conscious reaction
Slight Detour: Anthropomorphism
• Attributing human-like qualities to inanimate objects (e.g. cars, computers)
• A well-known phenomenon in advertising
  • Dancing butter, drinks, breakfast cereals
• Much exploited in human-computer interaction
  • To make the user experience more enjoyable and motivating, make people feel at ease, and reduce anxiety
Which do you prefer? 1. As a welcome message:
• “Hello Chris! Nice to see you again. Welcome back. Now what were we doing last time? Oh yes, exercise 5. Let’s start again.”
• “User 24, commence exercise 5.”
Which do you prefer? 2. Feedback when you get something wrong:
• “Now Chris, that’s not right. You can do better than that. Try again.”
• “Incorrect. Try again.”
Does your preference differ depending on the type of message? Why?
Evidence to support anthropomorphism
• Reeves and Nass (1996) found that computers that flatter and praise users in educational software have a positive impact on them
  • “Your question makes an important and useful distinction. Great job!”
• Students were more willing to continue with exercises after this kind of feedback
Criticism of anthropomorphism
• Can be deceptive, and can make people feel anxious, inferior, or stupid
• People tend not to like screen characters that wave their fingers at the user and say:
  • “Now Chris, that’s not right. You can do better than that. Try again.”
• Many prefer the more impersonal:
  • “Incorrect. Try again.”
• Studies have shown that personalized feedback is considered to be less honest and makes users feel less responsible for their actions (e.g. Quintanar, 1982)
Virtual characters
• Increasingly appearing on our screens
  • Web agents, characters in video games, learning companions, wizards, pets, newsreaders, pop stars
• Provide a persona that is welcoming, has personality, and makes users feel involved with them
Clippy
• Why was Clippy disliked by so many?
• Was it annoying, distracting, patronising, or something else?
• What sort of user liked Clippy?
Disadvantages of Interface Agents
• Can lead people into a false sense of belief, enticing them to confide personal secrets in chatterbots (e.g. Alice)
• Can be annoying and frustrating (e.g. Clippy)
• May not be trustworthy (e.g. virtual shop assistants?)
Error, Failure, Blame
• Causation: people are used to looking for causes in the world around them
  • They assign causes even when none exist
  • They wrongly accept blame for similar reasons
• Interfaces need to be designed for unexpected behaviors
• Human-human interaction proceeds through a cycle of:
  • action -> breakdown -> recovery -> action …
Error messages
• “The application Word Wonder has unexpectedly quit due to a type 2 error.”
• Why not instead: “The application has expectedly quit due to poor coding in the operating system”?
• Shneiderman’s guidelines for error messages include:
  • Avoid using terms like FATAL, INVALID, BAD
  • Avoid audio warnings
  • Avoid UPPERCASE and long code numbers
  • Messages should be precise rather than vague
  • Provide context-sensitive help
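As a rough sketch of how these guidelines might translate into code, the function below builds a precise, constructive message instead of an alarming one. The function name and wording are illustrative assumptions, not taken from Shneiderman or the slides.

```python
def save_error_message(filename: str, reason: str) -> str:
    """Build a precise, constructive error message.

    Avoids alarming words (FATAL, INVALID, BAD), shouting uppercase,
    and opaque code numbers; says what happened and what to do next.
    """
    return (f"Could not save '{filename}': {reason}. "
            "Your work is still open. Try saving to a different folder, "
            "or open the help topic on saving files for more options.")

# Compare:
#   Bad:  "FATAL ERROR 0x80070005: INVALID OPERATION."
#   Good: the message produced below.
print(save_error_message("report.docx", "the disk is full"))
```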
Should computers say they’re sorry?
• Reeves and Nass (1996) argue that computers should be made to apologize
  • They should emulate human etiquette
• Would users be as forgiving of computers saying sorry as people are of each other?
• How sincere would they think the computer was being? For example, after a system crash:
  • “I’m really sorry I crashed. I’ll try not to do it again”
• How else should computers communicate with users?
Breakdowns and Recovery
• Donald Schön says design (and other human activity) proceeds through a series of:
  • Unselfconscious action
  • Breakdown
  • Reflection in action
• Is this how we let our unconscious perform activities, freeing our conscious self for other efforts?
• The expectation is that there will be breakdowns
  • Breakdowns are caused by unexpected circumstances or unexpected outcomes
• Learning through breakdowns: “To Engineer is Human”
Universal Design &amp; Accessibility
• Not all users are the same. Some examples:
  • Visually impaired or blind (~2% in the US)
  • Color blindness (7% of men, 0.4% of women in the US)
  • Moderate to severe hearing loss (10% of the population, possibly over 40% among seniors)
  • Essential tremor (1-5%)
• All of these have implications for computer use
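One concrete implication, as a hedged sketch of my own rather than material from the slides: checking that foreground and background colors have enough luminance contrast, which helps low-vision and color-blind users. The sketch below uses the standard WCAG 2 relative-luminance formula and the 4.5:1 threshold for normal text.

```python
def _linear(c: float) -> float:
    """Convert an sRGB channel in [0, 1] to linear light (WCAG 2 formula)."""
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_linear(v / 255.0) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white passes easily; light grey on white does not.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))     # 21.0
print(contrast_ratio((200, 200, 200), (255, 255, 255)) >= 4.5)  # False
```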
Chapter 2 (recap)
• Gulfs of execution/evaluation
• Conscious vs. subconscious thought
• Declarative vs. procedural knowledge
• Emotional design
• Visceral/behavioral/reflective activity/responses
• People as storytellers/causation/blame
• Failure and blame vs. breakdowns and recovery
• Designing for human activity and variation
Compare these Visions
• Apple’s “Knowledge Navigator”: https://www.youtube.com/watch?v=9bjve67p33E
• Sun’s “Starfire: A Vision of Future Computing”: https://www.youtube.com/watch?v=NKJNxgZyVo0