Eyes Alive
Sooha Park Lee, Jeremy B. Badler, Norman I. Badler
University of Pennsylvania / The Smith-Kettlewell Eye Research Institute
Presentation Prepared By: Chris Widmer, CSE 4280
Outline • Introduction • Motivation • Background • Overview of System • Descriptions • Results • Conclusions
Introduction • Eye movement is an important expressive technique • A statistical eye movement model is built from empirical data
Motivation • Natural-looking eye movement is needed for animations of close-up face views • Accurate eye movement has traditionally been difficult to attain in animation • No previous proposals offer saccadic eye movement models that are easy to apply to speaking/expressive faces • Recent interest in the construction of human facial models
Background • To build a realistic face model: • Geometry modeling • Muscle behavior • Lip synchronization • Text synthesis • Research has traditionally not focused on eye movement.
Background • Eyes are essential for non-verbal communication • Regulate the flow of conversation • Search for feedback • Express emotion • Influence behavior • A new approach is proposed, based on statistical data and empirical studies
Saccades • Rapid movements of both eyes from one gaze position to another • The only eye movement that can be executed consciously • Balance conflicting demands of speed and accuracy • Magnitude – angle the eyeball rotates to change position • Direction – angle of the 2D rotation axis, with 0 degrees to the right • Duration – time of the movement • Inter-saccadic interval – time between saccades
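These four parameters describe one saccade; a minimal sketch of a data structure holding them (the field names and units are assumptions for illustration, not the paper's code):

```python
from dataclasses import dataclass

@dataclass
class Saccade:
    """One saccadic eye movement, parameterized as on this slide."""
    magnitude_deg: float   # angle the eyeball rotates (degrees)
    direction_deg: float   # 2D direction of rotation, 0 = rightward
    duration_ms: float     # time the movement takes
    interval_ms: float     # inter-saccadic interval before the next saccade
```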
Saccades • Example: magnitude 10 degrees, direction 45 degrees • The eye rotates 10 degrees, up and to the right • Initial/final acceleration: ~30,000 deg/sec² • Peak velocity: 400–600 deg/sec • Reaction time: 180–220 msec • Duration and velocity are functions of magnitude • Duration approximation: D = D0 + d·A, where D = duration, A = amplitude (degrees), d = increment in duration per degree (2–2.7 msec/deg), and D0 = intercept (20–30 msec) • Often accompanied by head rotation
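As a concrete check of the linear duration model, a short sketch using the midpoints of the ranges quoted above (the specific constants are a choice, not the paper's fitted values):

```python
def saccade_duration_ms(amplitude_deg: float,
                        d_ms_per_deg: float = 2.35,    # within the 2-2.7 ms/deg range
                        d0_ms: float = 25.0) -> float:  # within the 20-30 ms intercept range
    """Linear duration model D = D0 + d * A from the slide."""
    return d0_ms + d_ms_per_deg * amplitude_deg

# A 10-degree saccade lasts roughly 25 + 2.35 * 10 = 48.5 ms.
print(saccade_duration_ms(10.0))
```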
Background • Three Functions of Gaze • Sending Social Signals • Open Channel to Receive Information • Regulate Flow of Conversation
Overview of System • Eye-tracking images are analyzed, and a statistically based model is generated using Matlab • Lip movements, eye blinks, and head rotation are analyzed by the alterEGO face motion analysis system
Overview of System • Face Animation Parameter (FAP) File • Eye Movement Synthesis System (EMSS) • Adds eye movement data to FAP file • Modified from face2face’s animator plug-in for 3D Studio Max
Analysis of Data • Eye movements were recorded with an eye-tracking visor (ISCAN) consisting of a monocle and two miniature cameras • One camera views the environment from the left eye's perspective; the other records a close-up of the left eye • The device tracks gaze by comparing the corneal reflection of the light source against the location of the pupil center • The reflection acts as a fixed reference point while the pupil moves during eye movement
Analysis of Data • Pupil position is found using pattern matching • A default threshold grey level is set using the Canny edge-detection operator • Positional histograms along the X and Y axes are calculated • The two center coordinates with maximum correlation are chosen (see the sketch below)
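A minimal sketch of this pupil-localization idea in OpenCV; the Canny thresholds are illustrative, and taking the histogram peaks stands in for the paper's correlation-based matching:

```python
import cv2
import numpy as np

def find_pupil_center(eye_image_gray: np.ndarray) -> tuple[int, int]:
    """Estimate the pupil center from edge-density histograms along X and Y."""
    # Edge map of the close-up eye image (threshold values are illustrative).
    edges = cv2.Canny(eye_image_gray, 50, 150)

    # Positional histograms: total edge strength per column and per row.
    hist_x = edges.sum(axis=0)
    hist_y = edges.sum(axis=1)

    # Take the peak of each histogram as the pupil center estimate.
    return int(hist_x.argmax()), int(hist_y.argmax())
```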
Analysis of Data • Saccade magnitude • The frequency of each magnitude is fit with a least-mean-squares distribution • d = distance traversed by the pupil center • r = radius of the eyeball (1/2 of x_max) • P = percentage chance of occurrence • A = saccade magnitude (degrees)
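A sketch of how those quantities might relate; both the arcsine conversion from pupil displacement to rotation angle and the exponential form of the frequency fit are illustrative assumptions, not the paper's actual fitting function:

```python
import numpy as np
from scipy.optimize import curve_fit

def displacement_to_magnitude(d: np.ndarray, r: float) -> np.ndarray:
    """Convert pupil-center displacement d into a rotation angle A (degrees),
    treating the pupil as a point on a sphere of radius r."""
    return np.degrees(np.arcsin(np.clip(d / r, -1.0, 1.0)))

def fit_magnitude_frequency(A: np.ndarray, P: np.ndarray) -> tuple[float, float]:
    """Least-squares fit of percentage frequency P against magnitude A,
    using an illustrative exponential-decay model P(A) = c * exp(-A / k)."""
    model = lambda a, c, k: c * np.exp(-a / k)
    (c, k), _ = curve_fit(model, A, P, p0=(10.0, 10.0))
    return c, k
```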
Analysis of Data • Saccade duration is measured using a 40 deg/sec velocity threshold • The samples are used to derive an instantaneous velocity curve for every saccade • The duration of each movement is normalized to 6 frames • Two classes of gaze are distinguished: mutual and away
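A minimal sketch of that thresholding and normalization step; treating the span between the first and last suprathreshold samples as a single saccade is a simplification, and the sampling details are assumptions:

```python
import numpy as np

def normalized_velocity_curve(angles_deg: np.ndarray, fps: float,
                              threshold: float = 40.0) -> np.ndarray:
    """Segment a saccade with a 40 deg/sec velocity threshold and resample
    its instantaneous velocity curve to a normalized 6-frame profile."""
    velocity = np.abs(np.diff(angles_deg)) * fps  # instantaneous deg/sec
    moving = np.flatnonzero(velocity > threshold)
    if moving.size == 0:
        raise ValueError("no saccade above the velocity threshold")
    curve = velocity[moving[0]:moving[-1] + 1]
    # Normalize duration to 6 frames by linear interpolation.
    xs = np.linspace(0, curve.size - 1, 6)
    return np.interp(xs, np.arange(curve.size), curve)
```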
Synthesis of Eye Movement • Attention Monitor (AttMon) • Parameter Generator (ParGen) • Saccade Synthesizer (SacSyn)
Synthesis of Natural Eye Movement • AttMon determines the mode, changes in head rotation, and the gaze state • ParGen determines saccade magnitude, direction, duration, and instantaneous velocity • SacSyn synthesizes the movements and encodes them as FAP values (see the sketch below)
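A rough sketch of how the three modules might chain together per frame; the object interfaces and method names are assumptions, not the paper's code:

```python
def synthesize_eye_movement(frames, attmon, pargen, sacsyn) -> list:
    """Drive the AttMon -> ParGen -> SacSyn pipeline over a frame sequence."""
    fap_stream = []
    for frame in frames:
        # AttMon: current mode (talking/listening), head rotation, gaze state.
        mode, head_delta, gaze_state = attmon.state(frame)
        # ParGen: sample the next saccade's parameters (or None if none is due).
        saccade = pargen.saccade(mode, head_delta, gaze_state)
        if saccade is not None:
            # SacSyn: encode the movement as per-frame FAP values.
            fap_stream.extend(sacsyn.to_fap(saccade))
    return fap_stream
```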
Synthesis of Natural Eye Movement • Magnitude is determined by the inverse of the fitting function shown earlier • This inverse mapping guarantees the same probability distribution as the empirical data • Direction is determined by head rotation (when it exceeds a threshold) or by a distribution table • A uniform draw from 0 to 100 falls into one of 8 non-uniform intervals, each assigned to a direction (see the sketch below)
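A minimal sketch of both sampling steps; the interval widths in the direction table and the exponential inverse CDF are illustrative stand-ins for the paper's fitted values:

```python
import math
import random

# 8 directions (degrees, 0 = right) with illustrative non-uniform weights
# summing to 100, standing in for the paper's distribution table.
DIRECTION_TABLE = [(0, 15), (45, 10), (90, 20), (135, 10),
                   (180, 15), (225, 10), (270, 15), (315, 5)]

def sample_direction() -> float:
    """Map a uniform draw in [0, 100) onto 8 non-uniform intervals."""
    u = random.uniform(0, 100)
    cumulative = 0.0
    for direction, width in DIRECTION_TABLE:
        cumulative += width
        if u < cumulative:
            return float(direction)
    return float(DIRECTION_TABLE[-1][0])

def sample_magnitude(k: float = 10.0) -> float:
    """Inverse-transform sampling: for an exponential frequency fit, the
    inverse CDF is -k * ln(1 - u), so uniform draws reproduce the
    empirical magnitude distribution."""
    return -k * math.log(1.0 - random.random())
```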
Synthesis of Natural Eye Movement • Duration is determined by the first equation, with the respective values of d and D0 • Velocity is determined from the fitted instantaneous velocity curve • The SacSyn system calculates the sequence of coordinates for the eye centers • These are translated into FAP values and rendered in 3D Studio MAX • The face2face animation plug-in renders the animations with the correct parameters
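A sketch of that final encoding step, assuming the MPEG-4 convention that eyeball rotation FAPs are expressed in angle units of 1e-5 radians; the yaw/pitch decomposition and velocity-profile handling are illustrative:

```python
import math

ANGLE_UNIT = 1e-5  # MPEG-4 FAP angle unit, in radians

def saccade_to_fap_frames(magnitude_deg: float, direction_deg: float,
                          velocity_profile: list[float]) -> list[tuple[int, int]]:
    """Convert one saccade into per-frame (yaw, pitch) FAP values.
    The cumulative sum of the normalized 6-frame velocity curve, scaled to
    the saccade magnitude, gives the eyeball angle at each frame."""
    total = sum(velocity_profile) or 1.0
    frames, progress = [], 0.0
    for v in velocity_profile:
        progress += v / total  # fraction of the saccade completed
        angle = magnitude_deg * progress
        yaw = angle * math.cos(math.radians(direction_deg))
        pitch = angle * math.sin(math.radians(direction_deg))
        frames.append((round(math.radians(yaw) / ANGLE_UNIT),
                       round(math.radians(pitch) / ANGLE_UNIT)))
    return frames
```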
Results • 3 Different Methods Tested • Type 1 -> No Saccadic Movements • Type 2 -> Random Eye Movement • Type 3 -> Sampled from Estimated Distributions (synchronized with head movements) • Tests were subjective
Results • Q1: Did the character on the screen appear interested in (5) or indifferent to (1) you? • Q2: Did the character appear engaged (5) or distracted (1) during the conversation? • Q3: Did the personality of the character look friendly (5) or not (1)? • Q4: Did the face of the character look lively (5) or deadpan (1)? • Q5: In general, how would you describe the character?
Conclusions • A saccade model for talking and listening modes • 3 different eye movements compared: stationary, random, and model-based • The model-based version scored significantly higher • Eye-tracking data was recorded from a subject • Rather than requiring newly recorded data for every character, the model can generate any number of unique eye movement sequences
Drawbacks and Improvements • Aliasing with small movements • Difficulty separating eye movement from head movement during data gathering • Future enhancements: • Eye/eyelid data • More modeled gaze patterns • More subjects for data • A scan-path model for close-up images
Developments • N. Badler, Director, Center for Human Modeling and Simulation • Digital human modeling/behavior • "Jack" software • Simulation of workflow using virtual people
References • Lee, Sooha Park, Jeremy B. Badler, and Norman I. Badler. "Eyes Alive." ACM Transactions on Graphics 21(3), 2002. • http://www.cis.upenn.edu/~sooha/home.html • http://www.cis.upenn.edu/~badler/ • http://cg.cis.upenn.edu/hms/research.html