Artificial Emotion
Characters that display emotion are critical to a rich and believable simulated environment. Emotion is the essential ingredient that makes the difference between robotic behavior and lifelike, engaging behavior.
Traditionally, animators painstakingly created these behaviors for pre-rendered animations. Truly interactive characters, however, must generate their behavior autonomously through techniques based upon artificial emotions, and must possess their own personalities and moods.
Emotion AI Engine
Ian Wilson, www.artificial-emotion.com
The engine simulates personality, age, gender and the low-level emotional behavior that drives most of our actions. From this simulation, gestures and low-level actions are generated. Behaviors can be mapped to application-specific elements, e.g. characters, robots, cell phone agents, etc.
The behavior of real users can also be simulated to predict:
» what type of action that user might take
» that user's preferences
» how a unique individual's mood changes while using your product
» how to adjust that behavior to ideally reward each individual
Engine Features
» Simulates millions of unique personalities
» Simulates ages from 5 to 105
» Simulates gender from very feminine to very masculine
» Simulates core brain systems responsible for emotion-level processing
» Generates facial gestures using muscle simulation, MPEG-4 FAPs and FACS action units
» Generates eye saccades, eye movement speed/range/frequency control, and blinking
» Generates head position and movement, pitch and yaw
» Generates upper body gestures (spine, shoulders, neck), with a walk cycle to follow soon
» Generates low-level actions such as movement speed/range/frequency, search patterns, approach, avoid
» Input can be as simple as a single integer to drive the whole system
» Output is a continuous stream of integer values
» Output is not tied to a specific application, for maximum flexibility
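The last three features describe the interface: a single integer in, a stream of integers out. Below is a minimal sketch of what such an interface could look like. The class name, method and output channels are invented for illustration and are not the engine's actual API.

```python
# Hypothetical sketch of the integer-in / integer-stream-out interface
# described above. Names are illustrative, not the engine's real API.
from typing import Iterator, List


class EmotionEngineStub:
    """Minimal stand-in: one integer of sensed input per tick,
    a list of integer behavior values out per tick."""

    def __init__(self, personality_seed: int) -> None:
        self.personality_seed = personality_seed
        self.mood = 0  # running emotional state, arbitrary integer scale

    def update(self, sensed_input: int) -> List[int]:
        # Accumulate input into a crude mood value, then emit a few
        # output channels (e.g. gesture intensity, movement speed).
        self.mood += sensed_input
        gesture_intensity = max(-100, min(100, self.mood))
        movement_speed = 50 + gesture_intensity // 4
        return [gesture_intensity, movement_speed]


def run(inputs: Iterator[int]) -> None:
    engine = EmotionEngineStub(personality_seed=42)
    for tick, value in enumerate(inputs):
        print(tick, engine.update(value))


if __name__ == "__main__":
    run(iter([10, 10, -30, 5, 0]))
```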
Facial Action Coding System (FACS)
Ekman, P., Friesen, W. V., & Hager, J. C. (2002).
An anatomically oriented coding system based on the definition of "action units" (AUs) of the face that cause facial movements. Each AU may correspond to several muscles that together generate a certain facial action. As some muscles give rise to more than one action unit, the correspondence between action units and muscle units is only approximate. 46 AUs were considered responsible for expression control and 12 for gaze direction and orientation.
The FACS model has been used to synthesize images of facial expressions; exploration of its use in analysis problems has been a topic of continuous research.
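To make the AU idea concrete, the sketch below maps a few basic expressions to commonly cited AU combinations (e.g. happiness as AU6 plus AU12). The exact AU sets vary between sources, so treat these as assumed example data rather than a definitive FACS table.

```python
# Illustrative only: commonly cited AU prototypes for a few basic
# expressions. Exact AU sets vary between sources; these are assumed
# example data, not a definitive FACS table.
AU_NAMES = {
    1: "inner brow raiser",
    2: "outer brow raiser",
    4: "brow lowerer",
    6: "cheek raiser",
    12: "lip corner puller",
    15: "lip corner depressor",
    26: "jaw drop",
}

EXPRESSION_PROTOTYPES = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "surprise": {1, 2, 26},
}


def describe(expression: str) -> str:
    aus = EXPRESSION_PROTOTYPES[expression]
    parts = [f"AU{n} ({AU_NAMES[n]})" for n in sorted(aus)]
    return f"{expression}: " + " + ".join(parts)


if __name__ == "__main__":
    for name in EXPRESSION_PROTOTYPES:
        print(describe(name))
```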
MPEG-4
MPEG-4 is a compression standard developed by the Moving Picture Experts Group (MPEG) of the ISO, the same group that brought us MPEG-1 and MPEG-2. MPEG-4 builds on the proven success of three fields:
» digital television
» interactive graphics applications (synthetic content)
» interactive multimedia (World Wide Web, distribution of and access to content)
MPEG-4 Facial Animation and Definition Parameters
Inspired by FACS, the Facial Definition Parameter set (FDP) and the Facial Animation Parameter set (FAP) were designed to allow the definition of a facial shape and texture, as well as the animation of faces reproducing expressions, emotions and speech pronunciation.
Facial Animation Parameters
The FAPs are based on the study of minimal facial actions and are closely related to muscle actions. They represent a complete set of basic facial actions, such as squeezing or raising the eyebrows and opening or closing the eyelids, and therefore allow the representation of most natural facial expressions.
All FAPs involving translational movement are expressed in terms of Facial Animation Parameter Units (FAPU). FAPUs allow FAPs to be interpreted on any facial model in a consistent way, producing reasonable results in terms of expression and speech pronunciation.
JOY a la FAP
open_jaw (F3), lower_t_midlip (F4), raise_b_midlip (F5), stretch_l_cornerlip (F6), stretch_r_cornerlip (F7), raise_l_cornerlip (F12), raise_r_cornerlip (F13), close_t_l_eyelid (F19), close_t_r_eyelid (F20), close_b_l_eyelid (F21), close_b_r_eyelid (F22), raise_l_m_eyebrow (F33), raise_r_m_eyebrow (F34), lift_l_cheek (F41), lift_r_cheek (F42), stretch_l_cornerlip_o (F53), stretch_r_cornerlip_o (F54)
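As a sketch of how a subset of the joy FAPs listed above might be assembled into a single animation frame, the snippet below builds a FAP index to amplitude mapping. The amplitude values and the uniform scaling are placeholder assumptions; real values would come from the animation engine, not from the MPEG-4 standard.

```python
# Sketch: packing some of the joy-related FAPs listed above into one
# animation frame. FAP indices come from the list above; the FAPU
# amplitudes are invented placeholders.
JOY_FAPS = {
    3: "open_jaw",
    4: "lower_t_midlip",
    5: "raise_b_midlip",
    6: "stretch_l_cornerlip",
    7: "stretch_r_cornerlip",
    12: "raise_l_cornerlip",
    13: "raise_r_cornerlip",
    41: "lift_l_cheek",
    42: "lift_r_cheek",
}


def joy_frame(intensity: float) -> dict:
    """Scale a nominal joy pose by an intensity in [0, 1]."""
    base_amplitude = 100  # placeholder FAPU amplitude
    return {fap: int(base_amplitude * intensity) for fap in JOY_FAPS}


if __name__ == "__main__":
    print(joy_frame(0.5))  # half-intensity smile: {3: 50, 4: 50, ...}
```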
Facial Definition Parameters
FDPs, on the other hand, are used to customise a given face model to a particular face. The FDP set contains a 3D mesh (with texture coordinates if texture is used), 3D feature points, and optionally texture and other characteristics such as hair, glasses, age and gender.
FAP Engines
» IBM Java Toolkit for MPEG-4
» Facial Animation Engine (FAE) (University of Genova, Digital and Signal Processing Lab.)
» Miraface: MPEG-4 FAP Player (MIRALab, University of Geneva)
» XFace (Cognitive and Communicative Technologies, ITC-irst)
» Visage Technologies
AE Approach
Emotions comprise three layers of behavior:
» Top level: momentary emotions, the behaviors that we display briefly in reaction to events
» Next level: moods, prolonged emotional states caused by the cumulative effect of momentary emotions
» Underlying level: personality, the behavior that we generally display when no momentary emotion or mood overrides it
The levels have an order of priority: momentary emotions over mood, and mood over personality.
Layer Prominence
Momentary emotions are brief reactions to events that assume the highest priority when we select our behavior. These behaviors are short-lived and decay quickly.
Moods are produced by momentary emotions, usually by the cumulative effect of a series of momentary emotions. A mood can gradually increase in prominence even after the momentary emotions that produced it have subsided. Its development depends on whether those momentary emotions are positive or negative: if a character were to receive a stream of negative momentary emotions, the resulting mood would obviously be bad and would decay slowly.
The personality level is always present and has a consistent level of prominence.
Behavior Selection
The behavior that a character displays depends upon each emotional layer's prominence: the more prominent the layer, the higher the probability of that layer's behavior being selected.
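A minimal sketch of prominence-weighted selection follows, assuming each layer exposes a numeric prominence value. The weighting scheme is an assumption for illustration, not the engine's documented algorithm.

```python
# Sketch of prominence-weighted layer selection: each layer (momentary
# emotion, mood, personality) is chosen with probability proportional
# to its current prominence.
import random


def select_layer(prominence: dict) -> str:
    """Pick a layer with probability proportional to its prominence."""
    layers = list(prominence.keys())
    weights = [max(prominence[layer], 0.0) for layer in layers]
    return random.choices(layers, weights=weights, k=1)[0]


if __name__ == "__main__":
    # A strong momentary emotion usually wins, but mood and personality
    # can still surface occasionally.
    state = {"momentary_emotion": 0.7, "mood": 0.2, "personality": 0.1}
    picks = [select_layer(state) for _ in range(1000)]
    print({layer: picks.count(layer) for layer in state})
```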
Uses of AE
Autonomous AE of any depth is rarely seen in commercial interactive entertainment. Some exceptions are P.F. Magic's Catz and Dogz series, Fujitsu's fin fin, and Cyberlife's Creatures series.
Uses of AE
Interactive entertainment (IE) is currently dominated by genres that require the user to conquer and/or kill everything in his or her path, where little emotion is required. Emotion primarily serves a social function in IE: emotional responses are used to make characters believable and engaging.
If we walked into a virtual bar and the characters had distinct personalities, the scene would be an immersive and believable simulation. If the characters showed no emotion, our suspension of disbelief would be immediately broken and we would be reminded that we were in a computer-generated simulation.
Uses of AE
Using human characters necessarily implies that their behavior is deep and complex. Unfortunately, we are most attuned to recognizing human emotion, and therefore to recognizing flawed human emotion, which can easily break the illusion of an otherwise well constructed simulated environment.
One way to attack this problem is to use nonhuman characters. Cats, dogs and Norns all show engaging levels of interactive emotional behavior that maintain the illusion of life without having, or needing, the complexity of human emotional response.
AE Output
AE produces two fundamental components as output: actions and gestures.
Actions are a general category dependent on the context of the situation in which the character exists. A simulation's movement system uses AE to select and/or modify an action: AE indicates what actions are appropriate to the character's personality and current mood. A timid character is unlikely to do anything aggressive, and an outgoing, extroverted character might perform an action enthusiastically, which would probably not be the case for an extreme introvert.
AE Output
Gestures, whether hand, body or facial, are the way we communicate our emotions to the outside world. AE-driven gestures are tied directly to our characters' personalities and moods and follow definite patterns. For example, a sad-looking fellow, shoulders hunched over, arms hanging limply, walking slowly as he makes his way through our environment, might compel a player to ask: "Why does he look so sad? What is his story? Should I go and ask him?" These are the kinds of questions that occur to the viewer of a truly interactive experience, and they would be irrelevant without AE.
What is Personality?
Personality arises from the genetic and environmental differences in the brain structures of individuals. As a species, our brains are almost identical, which gives rise to our common sets of behaviors. But we are all genetically and environmentally unique, and it is these differences that give us our unique behavioral variations on common behavior patterns, i.e. our personality.
Personality Model
Personality is modeled as an area in a 3-D space whose axes are Extroversion, Fear and Aggression (EFA space). Personality traits are represented by points within this space, positioned according to the amount by which they correlate with each axis. For example, the trait of anxiety is positioned at (E -30, F +70, A -10): associated -30% with Extroversion, +70% with Fear, and -10% with Aggression.
EFA Space
The position of the center of the personality area (P) represents by how much each of those axes (the central traits) defines the overall personality. P is the center of a sphere that contains the set of personality traits making up the aggregate personality, i.e. the set of traits available to the character.
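The sketch below models EFA space as described above: traits are points, and a personality is a sphere whose contents are the traits available to the character. The anxiety coordinates come from the text; the other trait, the sphere center and the radius are assumptions for illustration.

```python
# Sketch of the EFA-space model: traits as points in
# (Extroversion, Fear, Aggression) space, personality as a sphere.
from dataclasses import dataclass
import math


@dataclass(frozen=True)
class Trait:
    name: str
    e: float  # correlation with Extroversion, in percent
    f: float  # correlation with Fear
    a: float  # correlation with Aggression


@dataclass
class Personality:
    center: tuple  # (E, F, A) position of P
    radius: float  # size of the personality sphere

    def contains(self, trait: Trait) -> bool:
        dx = trait.e - self.center[0]
        dy = trait.f - self.center[1]
        dz = trait.a - self.center[2]
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= self.radius


if __name__ == "__main__":
    anxiety = Trait("anxiety", e=-30, f=70, a=-10)          # from the text
    sociability = Trait("sociability", e=80, f=-20, a=10)   # invented
    timid = Personality(center=(-20, 60, -10), radius=40)   # invented
    print(timid.contains(anxiety))      # True: anxiety is available
    print(timid.contains(sociability))  # False
```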
Three Dimensions of Personality
The model is based on the idea that the brain might have three central systems that mediate behavior:
» the Approach System, associated with Extroversion
» the Behavioral Inhibition System, associated with Fear
» the Flight/Fight System, associated with Aggression
Three Dimensions of Personality
The behavior that we display at any time is controlled by these three systems and by the genetic prominence of each system. For example, for an anxious personality the Behavioral Inhibition System is very prominent, so the personality has a very high Fear component and the person is generally fearful and cautious.
Mood
Mood is determined by the perceived signals of punishment and reward, which are inputs for most elements of the system, and is modulated by the character's personality. The position of the personality in EFA space affects:
» the range of positive moods: the maximum level of positive moods increases with E
» the range of negative moods: the maximum level of negative moods increases with F
» the rate of change of moods: the speed at which moods build up and decay increases with A
Mood Example
For a personality with high E, high F and high A, moods would have large negative and large positive values and would build up and decay rapidly: the character would be very moody, with large mood swings.
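A small sketch of these mood dynamics follows: E widens the positive range, F widens the negative range, and A speeds up build-up and decay. The scaling constants and the decay term are assumptions for illustration.

```python
# Sketch of personality-modulated mood dynamics as described above.
from dataclasses import dataclass


@dataclass
class Mood:
    e: float  # Extroversion, 0..1
    f: float  # Fear, 0..1
    a: float  # Aggression, 0..1
    level: float = 0.0

    def update(self, perceived_reward: float, perceived_punishment: float) -> float:
        rate = 0.1 + 0.9 * self.a                 # A: faster build-up/decay
        positive_cap = 1.0 * self.e               # E: max positive mood
        negative_cap = -1.0 * self.f              # F: max negative mood
        delta = perceived_reward - perceived_punishment
        self.level += rate * delta
        self.level -= rate * 0.1 * self.level     # decay toward neutral
        self.level = max(negative_cap, min(positive_cap, self.level))
        return self.level


if __name__ == "__main__":
    moody = Mood(e=0.9, f=0.9, a=0.9)  # the "mood example" character
    for _ in range(5):
        print(round(moody.update(perceived_reward=0.0, perceived_punishment=0.5), 3))
```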
Engine Architecture
The engine has nine modules. Six of these represent conceptual neural systems:
» emotional reactions
» personality
» punishment and reward
» mood
» memory
» motivations
Two of these are the engine interfaces (API): input and output.
The Self State module is the central data repository for the engine and represents the character's general emotional state at any time.
Punishment and Reward
The punishment and reward module takes incoming signals of raw, sensed punishment and reward (p/r) and translates them into perceived signals of p/r. The perceived p/r depends on the character's previous history of received p/r and on its personality.
Punishment and Reward
The module uses habituation: the more the character receives, in succession, a signal of one type, the lower the effect that signal has. It also uses novelty: the longer the character goes without receiving one type of signal, the greater the effect of that signal when it is received.
Punishment and Reward
The character's personality determines how susceptible it is to punishment or reward. For example, a psychopath is highly susceptible to reward and highly unsusceptible to punishment, which makes them go after thrills without regard for the consequences.
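The sketch below combines the three ideas from these slides, habituation, novelty and personality susceptibility, into a single perceived-signal step. The specific formulas and constants are assumptions made for illustration.

```python
# Sketch of the perceived punishment/reward step: repeated signals
# habituate (lose effect), rare signals gain a novelty boost, and
# personality scales susceptibility.
class PerceivedSignal:
    def __init__(self, susceptibility: float) -> None:
        self.susceptibility = susceptibility  # from personality, 0..2
        self.run_length = 0     # how many ticks in a row we received it
        self.ticks_without = 0  # how long since we last received it

    def perceive(self, raw: float) -> float:
        if raw > 0:
            habituation = 1.0 / (1.0 + 0.5 * self.run_length)  # repeated -> weaker
            novelty = 1.0 + 0.1 * min(self.ticks_without, 20)   # rare -> stronger
            self.run_length += 1
            self.ticks_without = 0
            return raw * habituation * novelty * self.susceptibility
        self.run_length = 0
        self.ticks_without += 1
        return 0.0


if __name__ == "__main__":
    # A psychopath-like profile: very susceptible to reward,
    # barely susceptible to punishment.
    reward = PerceivedSignal(susceptibility=1.8)
    punishment = PerceivedSignal(susceptibility=0.2)
    print([round(reward.perceive(1.0), 2) for _ in range(4)])      # habituates
    print([round(punishment.perceive(1.0), 2) for _ in range(4)])  # weak anyway
```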
Motivations
Motivations are arranged into a four-layer hierarchy of needs. The physiological layer is always at the bottom; the relative positions of the remaining three layers, Safety, Affiliation and Esteem, are determined by personality. For the psychopath, for example, Esteem is prioritized higher than Affiliation (friendship and kinship).
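As a small sketch, the snippet below fixes the physiological layer at the bottom and orders the other three layers by per-personality weights. The ordering rule and the weight values are assumptions for illustration.

```python
# Sketch of the four-layer need hierarchy: physiological is fixed at
# the bottom, the other layers are ordered by personality weights.
def need_hierarchy(weights: dict) -> list:
    """Return needs from the most basic layer to the highest layer."""
    upper = sorted(["safety", "affiliation", "esteem"],
                   key=lambda need: weights.get(need, 0.0))
    return ["physiological"] + upper


if __name__ == "__main__":
    # Psychopath-like profile: esteem outranks affiliation.
    psychopath = {"safety": 0.3, "affiliation": 0.2, "esteem": 0.9}
    print(need_hierarchy(psychopath))
    # ['physiological', 'affiliation', 'safety', 'esteem']
```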
Memory
The memory module is used to facilitate non-cognitive social processing. It is used in conjunction with the affiliation and esteem needs to provide enhanced social behavior: it keeps track of how much a particular character is liked, based on several factors including the social preferences of our family and friends. It is also used by the reactive emotion module.
Reactive Emotions
Reactive emotions are innate emotional reactions that have not involved any deep cognitive processing; they represent stimulus-response reactions of the kind hard-wired into our neural circuits: joy, sadness, fear, anger, surprise and disgust.
Reactions are modulated by personality: our psychopath would show very little fear in his reactions but may show a great deal of anger. They are triggered by p/r signals when those signals exceed thresholds derived from the levels of the motivational needs.
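A sketch of this trigger follows: a reaction fires when a perceived p/r signal exceeds a threshold derived from a need level, and its strength is scaled by a personality gain. The threshold derivation and the gain values are assumptions for illustration.

```python
# Sketch of the reactive-emotion trigger described above.
BASIC_EMOTIONS = ("joy", "sadness", "fear", "anger", "surprise", "disgust")


def reactive_emotion(signal: float, need_level: float,
                     personality_gain: dict, emotion: str):
    """Return (emotion, strength) if triggered, otherwise None."""
    threshold = 1.0 - need_level        # urgent needs lower the threshold
    if abs(signal) <= threshold:
        return None
    strength = (abs(signal) - threshold) * personality_gain.get(emotion, 1.0)
    return emotion, round(strength, 2)


if __name__ == "__main__":
    psychopath_gain = {"fear": 0.1, "anger": 1.8}  # low fear, high anger
    print(reactive_emotion(signal=-0.9, need_level=0.5,
                           personality_gain=psychopath_gain, emotion="fear"))
    print(reactive_emotion(signal=-0.9, need_level=0.5,
                           personality_gain=psychopath_gain, emotion="anger"))
```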
API Input
All input is in the form of integer streams:
» p/r signals for each of the physiological and safety needs (8 in total)
» input representing signals received by the five senses, for emotional reactions
Some needs remain constant if not changed: we assume we are sheltered unless we are informed otherwise. Other needs change unless information is received to the contrary: hunger will increase unless we receive a reward signal for the hunger need.
The engine is updated with input at each developer-determined time step.
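The sketch below shows one possible shape for that input: eight need p/r signals plus sense signals, pushed to the engine once per developer-chosen time step, with hunger drifting upward unless a reward arrives. The need names, field names and update call are assumptions, not the real API.

```python
# Sketch of the integer-stream input described above.
from dataclasses import dataclass, field
from typing import Dict, List

NEEDS = ["hunger", "thirst", "warmth", "rest",       # physiological
         "shelter", "health", "safety", "security"]  # safety (assumed names)


@dataclass
class EngineInput:
    need_pr: Dict[str, int] = field(default_factory=lambda: {n: 0 for n in NEEDS})
    sense_signals: List[int] = field(default_factory=lambda: [0, 0, 0, 0, 0])


def tick(engine_state: dict, payload: EngineInput) -> dict:
    """Apply one time step of input. Drifting needs (e.g. hunger) rise
    unless a reward arrives; constant needs only change on new input."""
    engine_state["hunger"] = max(0, engine_state.get("hunger", 0)
                                 + 1 - payload.need_pr["hunger"])
    return engine_state


if __name__ == "__main__":
    state = {"hunger": 3}
    print(tick(state, EngineInput()))   # no reward: hunger drifts up
    fed = EngineInput()
    fed.need_pr["hunger"] = 2
    print(tick(state, fed))             # reward signal lowers hunger
```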
3 API Output Formats
1. Positional information, required to produce emotional body and facial gestures. For a character whose body joints are driven by an inverse kinematics system, the output is the set of joint deviations from a plain "vanilla" movement that make the movement emotional. For a timid character, the spine might be curved backwards, the shoulders hunched forwards, and the head down. This output also determines the character's movement speed, style and smoothness: a neurotic character would move quickly, in short bursts, and in a very "staccato", jerky fashion.
3 API Output Formats
2. A semantic action plan, determined by the character's motivational needs, which specifies both what to do and how to do it. The "how" is taken from the current mood and the selected personality trait. For example, a character has been out in the cold rain for a while, so the warmth need has been receiving a constant stream of punishment, which makes the mood level highly negative and the warmth need a high priority. For a character with a timid personality the resulting semantic output would be "increase warmth very anxiously": "increase warmth" comes from the motivational need for warmth, "very" comes from the importance of the need and the mood level, and "anxiously" is a trait that this particular character possesses and that is appropriate to a negative mood.
3 API Output Formats
3. The raw emotional state, which gives the developer the flexibility to use the emotional state in ways not handled by the first two modes.
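To make the three formats concrete, the sketch below models each as a plain data structure: joint deviations for an IK rig, a semantic action plan, and the raw state. All field names are assumptions made for illustration.

```python
# Sketch of the three output formats described above as plain data.
from dataclasses import dataclass
from typing import Dict


@dataclass
class PositionalOutput:
    joint_deviations: Dict[str, float]  # offsets from the "vanilla" pose
    movement_speed: float
    smoothness: float


@dataclass
class SemanticOutput:
    action: str      # from the motivational need, e.g. "increase warmth"
    intensity: str   # from need importance and mood level, e.g. "very"
    manner: str      # from a mood-appropriate personality trait


@dataclass
class RawStateOutput:
    mood_level: float
    need_levels: Dict[str, float]
    active_emotion: str


if __name__ == "__main__":
    cold_timid = SemanticOutput(action="increase warmth",
                                intensity="very", manner="anxiously")
    print(cold_timid)
```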
Example Scenario
Jane is a 38-year-old mother of two playing an office simulation. She is the boss, and she has to manage the well-being of her office co-workers. In this game, the tasks that the team has to perform are secondary to their interactions, and different team members respond differently to the same situation. She has to make them all work well together and keep them happy (the goal of the game).
Example Scenario
Jane begins by taking a personality test and passes the result to the engine, so that the engine displays behaviors similar to hers. She then decides the personalities of her five co-workers:
» an extrovert
» an introvert
» a neurotic
» a psychopath
» a character with a personality similar to hers
Example Scenario
Day One: Jane's alter ego arrives at work.
Her character is greeted warmly by the extrovert. The introvert sits at his desk and continues typing, hardly acknowledging her presence. The neurotic looks mildly panicked; she does not like change, her shoulders slump forward and she curls up, looking submissive. The psychopath sticks out his chest, shoulders back, and fixes Jane's character with a steely gaze as he marches, quickly and firmly, over to her to make his presence known. The character similar to Jane's looks unsure; he is unable to decide if the new boss is good or bad for him. Jane, knowing his personality, can empathize with him, so she makes the first move to smile and greet him.