Perspectives of Social Computing: life-like characters as social actors
Helmut Prendinger and Mitsuru Ishizuka, Dept. of Information and Communication Eng., Graduate School of Information Science and Technology, University of Tokyo
Social Computing: objective
Social Computing aims to support the tendency of humans to interact with computers as social actors. Technology that reinforces this bias towards social interaction by responding appropriately may improve communication between humans and computational devices.
Social Computing (cont.): realization
Most naturally, social computing can be realized by using life-like characters.
Life-like Characters: requirements for their believability
Terms
• Life-likeness: providing the "illusion of life"
• Believability: allowing "suspension of disbelief"
Features of life-like characters
Embodiment
• Synthetic bodies
• 2D or 3D animations [realism not required]
• Affective voice
• Emotional display
• Gestures
• Posture
Artificial Mind
• Emotional response
• Personality
• Context- and situation-dependent response [social role awareness]
• Adaptive behavior [social intelligence]
Outline: social computing
• Background: The Media Equation, Affective Computing, the Persona Effect
• Artificial mind: an architecture for emotion-based agents
• Embodied behavior: gestures, affective speech
• Implementation: coffee shop demo, casino demo
• Emotion recognition (sketch only): stereotypes, biosensors
• Environments with narrative intelligence (sketch only): character and story
Background: computers as social actors
• Psychological studies show that people are strongly biased to treat computers as social actors
• For a series of classical tests of human-human social interaction, the results still hold if "human" is replaced by "computer"
• Computers with language output (human-sounding voice) and a role (companion, opponent, …)
• Tendency to be nicer in "face-to-face" interactions, ...
• We hypothesize that life-like characters support the human tendency to interact socially and naturally with computers
Ref.: B. Reeves and C. Nass, 1998. The Media Equation. Cambridge University Press, Cambridge.
Background (cont.): computers that express and recognize emotions
• Affective Computing (R. Picard)
• "[…] computing that relates to, arises from, or deliberately influences emotions."
• "[…] if we want computers to be genuinely intelligent, to adapt to us, and to interact naturally with us, then they will need to recognize and express emotions […]"
• We hypothesize that life-like characters constitute an effective technology to realize affect-based interactions with humans
Ref.: R. Picard, 1997. Affective Computing. The MIT Press.
Background (cont.): the persona effect
• Experiment by J. Lester et al. on the 'persona effect'
• "[...] which is that the presence of a lifelike character in an interactive learning environment - even one that is not expressive - can have a strong positive effect on student's perception of their learning experience."
• Dimensions: motivation, entertainment, helpfulness, …
[Figure: Herman the Bug watches as a student chooses roots for a plant in an alpine meadow]
Ref.: J. Lester et al., 1997. The persona effect: Affective impact of animated pedagogical agents. Proc. of CHI'97, 359-366.
J. Lester et al., 1999. Animated agents and problem-solving effectiveness: A large-scale empirical evaluation. Artificial Intelligence in Education, 23-30.
Life-like Characters: designing their mind
• Architecture for emotion-based behavior
• Affect processing
• Personality
• Awareness of social and contextual factors
• Adaptive to the interlocutor's emotional responses
• SCREAM: SCRipting Emotion-based Agent Minds
• Scripting tool to specify character behavior
• Encodes affect-related processes
• Allows the author to define a character profile for the agent
SCREAM System Architecture: SCRipting Emotion-based Agent Minds
Ref.: H. Prendinger, S. Descamps, M. Ishizuka, 2002. Scripting affective communication with life-like characters. Artificial Intelligence Journal. To appear.
H. Prendinger, M. Ishizuka, 2002. SCREAM: SCRipting Emotion-based Agent Minds. Proceedings 1st International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS'02). To appear.
Emotion Generation Component: elicitation and management of emotions
• Appraisal Module
• Process that qualitatively evaluates events according to their emotional significance for the character
• Outputs emotion types: joy, distress, angry at, happy for, resent, Schadenfreude, …
• Resolution Module
• Since several emotions may be active at the same time, the most dominant emotion must be extracted [sketched below]
• Maintenance Module
• Emotions are short-lived; they decay over time
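A minimal sketch, assuming active emotions are kept as simple (type, intensity) pairs, of what the Resolution Module's dominant-emotion step could look like; the names and data layout are illustrative, not the actual SCREAM code.

    # Illustrative sketch of the Resolution Module: given several active emotions,
    # extract the dominant one (here simply the highest-intensity entry).
    def resolve_dominant(active_emotions):
        """active_emotions: list of (emotion_type, intensity) tuples."""
        live = [(t, i) for (t, i) in active_emotions if i > 0]
        if not live:
            return None  # no emotion wins this state
        return max(live, key=lambda pair: pair[1])

    print(resolve_dominant([("happy for", 5), ("distress", 2)]))  # -> ('happy for', 5)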
Appraisal Module: the cognitive structure of emotions
Ref.: A. Ortony, G. Clore, A. Collins, 1988. The Cognitive Structure of Emotions. Cambridge University Press, Cambridge.
Appraisal Rules: examples

joy(L,F,I,S) if                  % emotion type
    wants(L,F,Des,S) and         % goal
    holds(F,S) and               % belief
    I = Des.                     % intensity

happy-for(L1,L2,F,I,S) if        % emotion type
    likes(L1,L2,App,S) and       % attitude
    joy(L2,F,Des,S) and          % belief (hypothesized emotion of L2)
    log-combination(App,Des,I).  % intensity
Appraisal Rules (cont.): examples

angry-at(L1,L2,A,I,S) if             % emotion type
    holds(did(A,L2),S) and           % belief
    causes(A,F,S0) and               % belief
    precedes(S0,S) and               % formal condition
    blameworthy(A,Praise,L1) and     % standard
    wants(L1,Non-F,Des,S) and        % goal
    log-combination(Praise,Des,I).   % intensity
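For readers who do not follow the Prolog-style notation, the following rough Python rendering of the joy and angry-at rules uses a toy world-state representation; the predicate encodings and the stand-in for log-combination are assumptions, not the authors' implementation (the causal/temporal conditions are omitted for brevity).

    import math

    # Toy world state: dicts/sets standing in for the wants/holds/did/blameworthy predicates.
    def appraise_joy(agent, fact, state):
        """joy(L,F,I,S): L wants F with desirability Des and F holds -> intensity Des."""
        des = state["wants"].get((agent, fact))
        if des is not None and fact in state["holds"]:
            return ("joy", des)
        return None

    def appraise_angry_at(agent, other, action, blocked_fact, state):
        """angry-at(L1,L2,A,I,S): L2 did A, A is blameworthy to L1, and L1 wants not-F."""
        if (other, action) not in state["did"]:
            return None
        praise = state["blameworthy"].get((action, agent))
        des = state["wants"].get((agent, ("not", blocked_fact)))
        if praise is None or des is None:
            return None
        intensity = round(math.log2(2 ** praise + 2 ** des))  # crude log-combination stand-in
        return ("angry at", other, intensity)

    state = {
        "wants": {("waiter", "customer_pleased"): 3, ("waiter", ("not", "order_cancelled")): 4},
        "holds": {"customer_pleased"},
        "did": {("customer", "cancel_order")},
        "blameworthy": {("cancel_order", "waiter"): 2},
    }
    print(appraise_joy("waiter", "customer_pleased", state))    # -> ('joy', 3)
    print(appraise_angry_at("waiter", "customer", "cancel_order", "order_cancelled", state))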
Emotion Resolution/Maintenance: emotion dynamics
Example trace of a disagreeable character [the agreeableness dimension of personality decides the decay rate of positive/negative emotions; a sketch follows below]:
state | active emotions (valence positive or negative) | winning
0 | happy for (5), distress (2) | happy for (5)
1 | distress (3), happy for (3), distress (1) | bad mood (4)
2 | hope (4), distress (2), happy for (1), distress (0) | hope (4)
3 | angry at (3), hope (0), distress (1), happy for (-1) | angry at (3)
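A minimal sketch of the maintenance (decay) step, under the assumption suggested by the trace above that a disagreeable character lets positive emotions fade faster than negative ones; the concrete rates are illustrative, not SCREAM's values.

    # Illustrative decay step: agreeableness decides how fast positive vs. negative
    # emotions fade. Rates roughly reproduce the trace above (happy for: 5 -> 3 -> 1,
    # distress: 2 -> 1 -> 0); they are assumptions, not the system's parameters.
    POSITIVE = {"joy", "happy for", "hope", "good mood", "gloat"}

    def decay(active, agreeableness):
        """active: dict emotion -> intensity; disagreeable characters (agreeableness < 0)
        let negative emotions linger and positive emotions fade quickly."""
        pos_rate, neg_rate = (1, 2) if agreeableness >= 0 else (2, 1)
        decayed = {}
        for emotion, intensity in active.items():
            rate = pos_rate if emotion in POSITIVE else neg_rate
            if intensity - rate > 0:
                decayed[emotion] = intensity - rate
        return decayed

    print(decay({"happy for": 5, "distress": 2}, agreeableness=-1))
    # -> {'happy for': 3, 'distress': 1}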
Emotion Regulation Component: interface between emotional state and expression
• "Display rules"
• Ekman and Friesen ('69): the expression and intensity of emotions are governed by social and cultural norms
• Linguistic style variations
• Brown and Levinson ('87): linguistic style is determined by an assessment of the seriousness of Face Threatening Acts (FTAs)
• Social variables (universal): distance, power, imposition of speech acts
• Emotion regulation studies
• J. Gross in psychology
• De Carolis, de Rosis in HCI
Social Filter Module: emotion expression modulating factors
Linear combination of parameters (a sketch follows below)
Ref.: H. Prendinger, M. Ishizuka, 2001. Social role awareness in animated agents. Proceedings 5th International Conference on Autonomous Agents (Agents'01), 270-277.
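A minimal sketch of a linear-combination social filter; the chosen parameters (power and social distance of the interlocutor, the character's agreeableness) follow the surrounding slides, but the weights and the clamping are illustrative assumptions rather than the published formula.

    # Hypothetical social filter: the externally expressed intensity is a linear
    # combination of the internal emotion intensity and social/personality factors.
    def filter_expression(internal_intensity, valence, power, distance, agreeableness,
                          w_emotion=1.0, w_social=1.0, w_personality=0.5):
        """Negative emotions toward a powerful or socially distant interlocutor are
        suppressed; an agreeable personality further dampens negative display."""
        suppression = 0.0
        if valence < 0:
            suppression = w_social * (power + distance) + w_personality * max(agreeableness, 0)
        return max(w_emotion * internal_intensity - suppression, 0.0)

    # Example: strong anger (5) toward the shop manager (high power, some distance)
    print(filter_expression(5, valence=-1, power=2, distance=1, agreeableness=1))  # -> 1.5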
Social Filter Module (cont.): alternative combination using a decision network
Agent Model Component: affective state management
• Character Profile (a sketch follows below)
• Static and dynamic features
• Values of dynamic features are initialized
• Static features: personality traits, standards
• Dynamic features: goals, beliefs (updated by a surface consistency check)
• Attitude, social distance: simple update mechanisms
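A minimal sketch of how such a character profile could be scripted, using the casino advisor of the later slides as an example; the field names and values are illustrative, not the SCREAM profile format.

    # Hypothetical character profile: static features are fixed by the author,
    # dynamic features are initialized here and updated during the interaction.
    casino_advisor_profile = {
        "static": {
            "personality": {"agreeableness": 1, "extraversion": 1},   # trait values
            "standards": {"reject_advice": -2},                       # blameworthiness
        },
        "dynamic": {
            "goals": {"user_follows_advice": 5, "user_wins": 2},      # goal: desirability
            "beliefs": set(),                # updated by a surface consistency check
            "attitude": {"user": 1},         # initially slightly likes the user
            "social_distance": {"user": 3},  # familiarity, may decrease over time
        },
    }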
Affect Dynamics: attitude and familiarity change
• Attitudes (liking, disliking)
• Attitudes are an important source of emotions
• Decisive for 'happy for' vs. resent, 'sorry for' vs. gloat
• Conversely, an agent's attitude changes as a result of the 'affective interaction history' (elicited emotions) with the interlocutor
• Implementation of the Signed Summary Record (Ortony '91)
• Familiarity (social distance)
• Source for some emotions: attraction, aversion
• Positive emotions elicited with the interlocutor improve the social relationship and may increase familiarity
• Simplified implementation of Social Perlocutions (Pautler and Quilici '98)
• [A more sophisticated model is implemented by Cassell and Bickmore '01, considering variety and depth of topics]
Signed Summary Record: computing attitude
[Figure: the winning emotional states over time (joy (2), distress (1), distress (3), hope (2), angry at (2), good mood (1), happy for (2), gloat (1)) are sorted into positive emotions (joy, hope, good mood, happy for, gloat) and negative emotions (distress, distress, angry at); adding both columns yields the attitude summary value: liking if positive, disliking if negative]
Ref.: A. Ortony, 1991. Value and emotion. In: W. Kessen, A. Ortony, and F. Craik (eds.), Memories, Thoughts, and Emotions: Essays in Honor of George Mandler. Hillsdale, NJ: Erlbaum, 337-353.
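A minimal sketch of the summary computation pictured above: the signed intensities of the winning emotional states are accumulated, and the sign of the total is read as liking or disliking. The valence assignment per emotion type is an assumption.

    # Illustrative signed summary record: sum the signed intensities of the winning
    # emotional states; a positive total means liking, a negative total disliking.
    POSITIVE = {"joy", "hope", "good mood", "happy for", "gloat"}

    def attitude_summary(winning_states):
        """winning_states: list of (emotion, intensity) pairs in temporal order."""
        total = sum(i if e in POSITIVE else -i for e, i in winning_states)
        label = "liking" if total > 0 else "disliking" if total < 0 else "neutral"
        return total, label

    history = [("joy", 2), ("distress", 1), ("distress", 3), ("hope", 2),
               ("angry at", 2), ("good mood", 1), ("happy for", 2), ("gloat", 1)]
    print(attitude_summary(history))  # -> (2, 'liking')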
Updating Attitude: weighted update rule
• What if a high-intensity emotion of the opposite sign occurs? (a liked agent makes the character very angry)
• The character ignores the 'inconsistent' new information, or
• The character updates the summary value by giving greater weight to the 'inconsistent' information (primacy of recency, Anderson '65)
• Consequence for future interaction with the interlocutor
• Momentary disliking: the new value is active only for the current situation
• Essential disliking: the new value replaces the summary record
• Example: liking (3) × history-weight (0.25) − angry (5) × recency-weight (0.75) = −3, i.e. disliking (see the sketch below)
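A minimal sketch of the weighted update rule behind the example above; treating the history weight and the recency weight as free parameters is an assumption about how the rule generalizes.

    # Hypothetical weighted attitude update: blend the stored summary value with a new,
    # oppositely signed emotion, giving more weight to the recent 'inconsistent' event.
    def update_attitude(summary_value, new_signed_intensity, h_weight=0.25, r_weight=0.75):
        """Returns the new attitude value; sign decides liking (+) vs. disliking (-)."""
        return h_weight * summary_value + r_weight * new_signed_intensity

    # Example from the slide: liking 3, new high-intensity anger -5.
    print(update_attitude(3, -5))  # -> -3.0  (momentary or essential disliking)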
Input and Output Components: receiving utterances and expressing emotions
• Input: formulas encoding
• speaker, hearer
• conveyed information
• modalities (facial display, linguistic style)
• hypothesized interlocutor goals, attitudes, …
• Output
• 2D animation sequences displaying the character
• Synthesized speech
Embodiment: characters that act and speak
• Realization of embodiment
• 2D animation sequences visually display the character
• Synthetic speech
• Technology
• Microsoft Agent package (installed client-side)
• JavaScript-based interface in Internet Explorer
• Microsoft Agent package
• Controls to trigger character actions and speech
• Text-to-Speech (TTS) engine
• Voice recognition
• Multi-modal Presentation Markup Language (MPML)
• Easy-to-use XML-style authoring tool
• Supports multiple-character synchronization and simple synchronization of action and speech
• Interface with the SCREAM system
Gestures: non-verbal behaviors supporting speech
Propositional gestures I: "there is a small difference" / "there is a big difference"
Ref.: J. Cassell, 2000. Nudge nudge wink wink: Elements of face-to-face conversation for embodied conversational agents. In: J. Cassell, S. Prevost, J. Sullivan, and E. Churchill (eds.), Embodied Conversational Agents. The MIT Press, 1-27.
Gestures (cont.): non-verbal behaviors supporting speech
Propositional gestures II: "do you mean [this]" / "or do you mean [that]"
Gestures (cont.): non-verbal behaviors supporting speech
Gestures and posture for emotion expression: "happy" / "sad"
Gestures (cont.): non-verbal behaviors supporting speech
Communicative behavior I: communicative functions "greet" / "want turn"
Gestures (cont.): non-verbal behaviors supporting speech
Communicative behavior II: communicative functions "take turn" / "give feedback"
Affective Speech: vocal effects associated with five emotions (a sketch follows below)
Ref.: I. R. Murray, J. L. Arnott, 1995. Implementation and testing of a system for producing emotion-by-rule in synthetic speech. Speech Communication, 16, 369-390.
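Since the original table is not reproduced here, the following illustrative mapping only indicates the general direction of such vocal effects as TTS prosody settings; the numbers are invented defaults, not the Murray and Arnott parameter values.

    # Illustrative emotion-to-prosody mapping (rate, pitch, volume relative to neutral = 1.0).
    # Directions are qualitative; concrete values are assumptions for the sketch.
    VOCAL_EFFECTS = {
        "anger":     (1.1, 1.3, 1.2),   # somewhat faster, higher, louder
        "happiness": (1.1, 1.2, 1.1),   # brighter, wider pitch
        "sadness":   (0.9, 0.9, 0.8),   # slower, lower, softer
        "fear":      (1.3, 1.4, 1.0),   # much faster, much higher
        "disgust":   (0.8, 0.8, 0.9),   # slower, lower
    }

    def speech_settings(emotion, base_rate=1.0, base_pitch=1.0, base_volume=1.0):
        r, p, v = VOCAL_EFFECTS.get(emotion, (1.0, 1.0, 1.0))
        return {"rate": base_rate * r, "pitch": base_pitch * p, "volume": base_volume * v}

    print(speech_settings("sadness"))  # -> {'rate': 0.9, 'pitch': 0.9, 'volume': 0.8}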
Implementation (cont.): simple MPML script

<!-- Example MPML script -->
<mpml>
  ...
  <scene id="introduction" agents="james,al,spaceboy">
    <seq>
      <speak agent="james">Do you guys want to play Black Jack?</speak>
      <speak agent="al">Sure.</speak>
      <speak agent="spaceboy">I will join too.</speak>
      <par>
        <speak agent="al">Ready? You got enough coupons?</speak>
        <act agent="spaceboy" act="applause"/>
      </par>
    </seq>
  </scene>
  ...
</mpml>
Implementation (cont.): interface between MPML and SCREAM

<!-- MPML script illustrating the interface with SCREAM -->
<mpml>
  ...
  <consult target="[...].jamesApplet.askResponseComAct('james','al','5')">
    <test value="response25">
      <act agent="james" act="pleased"/>
      <speak agent="james">I am so happy to hear that.</speak>
    </test>
    <test value="response26">
      <act agent="james" act="decline"/>
      <speak agent="james">We can talk about that another time.</speak>
    </test>
    ...
  </consult>
  ...
</mpml>
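A rough sketch of what the SCREAM-side handler behind askResponseComAct conceptually does: appraise the incoming communicative act, resolve the dominant emotion, and return a response identifier that the MPML <test> branches select on. The DummyMind class and the response ids are hypothetical stand-ins.

    class DummyMind:
        """Toy stand-in for a SCREAM agent mind (appraise incoming acts, resolve winner)."""
        def __init__(self):
            self.active = []
        def appraise(self, speaker, hearer, intensity):
            emotion = "joy" if intensity > 0 else "distress"   # toy appraisal only
            self.active.append((emotion, abs(intensity)))
        def resolve(self):
            return max(self.active, key=lambda e: e[1]) if self.active else None

    def ask_response_com_act(speaker, hearer, act_intensity, mind):
        """Return a response id for the MPML <test value="..."> branches to select on."""
        mind.appraise(speaker, hearer, int(act_intensity))
        dominant = mind.resolve()
        return "response25" if dominant and dominant[0] == "joy" else "response26"

    print(ask_response_com_act("james", "al", "5", DummyMind()))  # -> response25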
Life-like Characters in Inter-Action: three demonstrations
• Coffee Shop Scenario: animated agents with personality and social role awareness
• Casino Scenario: life-like characters that change their attitude during the interaction
• Japanese Comics Scenario: animated comics actors engaging in developing social relationships
Coffee Shop Scenario: life-like characters with social role awareness
• User in the role of a customer
• Animated waiter features
• Emotion, personality
• Social role awareness: respecting conventional practices depending on the interlocutor
• Aim of the implementation
• Entertaining environment for language conversation training
• Aim of the study
• Does social role awareness have an effect on the character's believability?
Ref.: H. Prendinger, M. Ishizuka, 2001. Let's talk! Socially intelligent agents for language conversation training. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 31(5), 465-471.
Experimental Study: user-agent and agent-agent interaction
Cast: James (waiter), Genie (manager), Al (waiter's friend)
Example Conversation: unfriendly waiter version (excerpt only)
Results: social role awareness and believability
Support for the effect of social role awareness
• Behavior more natural to the user in C2 [respects role]
• Behavior more agreeable in C2 [friendly behavior even though low threat from the user]
Unexpected results
• James' behavior slightly more natural to others in C2
• Personality and mood rated differently (despite the short interaction time)
Mean scores for participants' attitudes (8 subjects for each version); ratings range from 1 (disagreement) to 7 (agreement)
Casino Scenario: life-like characters with changing attitude
• User in the role of a Black Jack player
• Animated advisor features
• Emotion, personality
• Changes attitude depending on the interaction history with the user
• Advisor's agent profile
• Agreeable, extrovert, initially slightly likes the user
• Wants the user to follow his advice (high intensity)
• Wants the user to win (low intensity)
Implemented with MPML and SCREAM
Casino Demo Produced in cooperation with Sylvain Descamps
Emotional Arc: advisor's winning emotions depending on attitude
• The figure shows the agent's internal intensity values for the dominant emotions
• Highly abstract description (influences of personality, context, … are left out)
• Values of expressed emotions differ depending on the agent's personality and contextual features
• Since the character's personality is agreeable, e.g., negative emotions are de-intensified
[Figure: emotional arc over five games, plotted between positive and negative attitude. Game 1: user rejects advice, loses the game; Game 2: rejects advice, loses; Game 3: rejects advice, loses; Game 4: follows advice, loses; Game 5: rejects advice, wins. Dominant emotions shown: distress (4), sorry for (4), sorry for (5), good mood (5), gloat (5)]
Japanese Comics Scenario: Japanese manga for children, "Little Akko's Got a Secret"
• User controls an avatar ("Kankichi")
• Goal is to elicit Little Akko's attraction emotion by guessing her wishes
• Correct guesses increase her liking and familiarity values
• Animated character features
• Emotion (joy, distress, attraction, aversion)
• Aim of the game
• Develop a social relationship
• Entertainment
[Image: user makes a wrong guess …]
Emotion Recognition: limitations of our characters as social actors
• Human social actors can recognize their interlocutors' emotions
• Humans recognize frustration (confusion, …) when interacting with others and typically react appropriately
• Our characters' emotion recognition ability is very limited
• Characters make assumptions about other agents (incl. the user) and use emotion generation rules to detect their emotional state
• Stereotypes are used to reason about the emotions of others (see the sketch below)
• A typical visitor in a coffee shop wants to be served a certain beverage and is assumed to be distressed upon failure to receive it (the goal "get a certain beverage" is not satisfied)
• A typical visitor in a casino wants to win, …
• The very same appraisal rules are used to reason about the emotional state of the interlocutor
• Emotion recognition via physiological data from the user
• We have started to use bio-signals to detect users' emotional states
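A minimal sketch of the stereotype idea described above: the character assumes default goals for the interlocutor's role and reuses its own appraisal logic on the observed outcome. The role/goal table and the intensities are illustrative.

    # Illustrative stereotype-based emotion 'recognition': assume default goals for the
    # interlocutor's role and apply the character's own appraisal logic to the outcome.
    STEREOTYPE_GOALS = {
        "coffee_shop_customer": {"get_certain_beverage": 4},
        "casino_player": {"win_game": 4},
    }

    def infer_user_emotions(role, satisfied_goals):
        """Return a list of hypothesized (emotion, intensity) pairs for the given role."""
        hypothesized = []
        for goal, desirability in STEREOTYPE_GOALS.get(role, {}).items():
            if goal in satisfied_goals:
                hypothesized.append(("joy", desirability))       # goal holds -> joy
            else:
                hypothesized.append(("distress", desirability))  # goal fails -> distress
        return hypothesized

    print(infer_user_emotions("coffee_shop_customer", satisfied_goals=set()))
    # -> [('distress', 4)]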
Physiological Data Assessment: ProComp+
• EMG: Electromyography
• EEG: Electroencephalography
• EKG: Electrocardiography
• BVP: Blood Volume Pulse
• SC: Skin Conductance
• Respiration
• Temperature
Emotion Model: Lang's ('95) 2-dimensional model
• Valence: positive or negative dimension of feeling
• Arousal: degree of intensity of the emotional response
[Figure: valence-arousal plane with example emotions; toward high arousal: enraged, excited, joyful; toward low arousal: depressed, sad, relaxed; valence runs from negative (enraged, depressed) to positive (joyful, relaxed)]
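A minimal sketch connecting the ProComp+ sensors to this two-dimensional model, under the common simplification that skin conductance indexes arousal and muscle tension (EMG) indexes negative valence; the thresholds and emotion labels are illustrative assumptions.

    # Illustrative mapping from normalized bio-signal readings to a quadrant of the
    # valence-arousal plane; thresholds and labels are invented for the sketch.
    def classify_emotion(skin_conductance, emg):
        """Inputs are normalized to a resting baseline (1.0 = baseline)."""
        arousal = "high" if skin_conductance > 1.2 else "low"
        valence = "negative" if emg > 1.2 else "positive"
        label = {
            ("high", "negative"): "enraged/frustrated",
            ("high", "positive"): "excited/joyful",
            ("low", "negative"): "sad/depressed",
            ("low", "positive"): "relaxed",
        }[(arousal, valence)]
        return {"arousal": arousal, "valence": valence, "label": label}

    print(classify_emotion(skin_conductance=1.5, emg=1.4))
    # -> {'arousal': 'high', 'valence': 'negative', 'label': 'enraged/frustrated'}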
Educational Games: recognizing students' emotions (C. Conati)
• Computer games have high potential as educational tools
• May generate a high level of engagement and motivation
• Detect students' emotions to improve the learning experience
Prime Climb: a game to teach number factorization (UBC)
Example Session
[Diagram of an example session over time slices ti and ti+1: the user's traits (self-esteem, extraversion) and the agent's action (do nothing, provide help) influence the user's emotional state at ti and ti+1 (reproach, shame, relief, with negative/positive valence and arousal); the emotional state drives bodily expressions (eyebrows position: down/frowning, skin conductivity: high, heart rate: high), which are observed through sensors (vision-based recognizer, EMG/GSR, HR monitor)]
Narrative Intelligence (sketch only): limitations of our characters as social actors
• Our characters are embedded in quite simplistic scenarios
• Knowledge gain might be limited even if the characters are life-like
• "Knowledge is Stories" (R. Schank '95)
• Schank argues that knowledge is essentially encoded as stories
• This suggests designing 'story-like' interaction scenarios
• Narrative Intelligence (P. Sengers '00)
• Humans have a tendency to interpret events in terms of narratives
• This suggests that characters should be designed to produce narratively comprehensible behavior, so that humans can easily create narrative explanations of them
• Applications
• Learning environments (users as co-constructors of narratives)
• Virtual sales agents (stories serve rapport building and credibility)
• Corporate memory (story-telling to enhance knowledge exchange in organizations, learning from mistakes, …)