Expressive Gestures for NAO Le Quoc Anh - Catherine Pelachaud CNRS, LTCI, Telecom-ParisTech, France NAO TechDay, 13/06/2012, Paris
Objectives • Generate communicative gestures for the Nao robot • Integrated within an existing platform for virtual agents • Nonverbal behaviors described symbolically • Synchronization of gestures and speech • Expressivity of gestures • GVLEX project (Gesture & Voice for Expressive Reading) • The robot tells a story expressively • Partners: LIMSI (linguistic aspects), Aldebaran (robotics), Acapela (speech synthesis), Telecom ParisTech (expressive gestures) NAO TechDay 2012
State of the art • Several recent initiatives: • Salem and Kopp (2012): robot ASIMO, the virtual framework MAX, gesture description with MURML • Holroyd and Rich (2011): robot Melvin, motion scripts with BML, simple gestures, feedback to synchronize gestures and speech • Ng-Thow-Hing et al. (2010): robot ASIMO, gesture selection, synchronization between gestures and speech • Nozawa et al. (2006): robot HOAP-1, motion scripts with MPML-HP • Our system: focus on expressivity and on synchronizing gestures with speech, using a common platform for Greta and for Nao
Steps GRETA system pipeline: Text → Intent Planning → FML → Behavior Planning → BML → Behavior Realizer • Build a library of gestures from a corpus of storytelling videos: the gesture shapes need not be identical (between the human, the virtual agent, and the robot), but they have to convey the same meaning • Use the GRETA system to generate gestures for Nao • Following the SAIBA framework • Two representation languages: FML (Function Markup Language) and BML (Behavior Markup Language) • Three separate modules: plan communicative intents, select and plan gestures, and realize gestures
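The three SAIBA stages above can be sketched as a chain of stage functions. This is a minimal illustrative sketch with hypothetical function names and simplified dict-based stand-ins for FML and BML, not the actual GRETA API:

```python
# Illustrative sketch of the SAIBA three-stage pipeline.
# All names and data shapes here are hypothetical, not the real GRETA code.

def plan_intents(text):
    """Intent Planning: annotate the text with communicative intents (FML-like)."""
    return {"text": text, "intents": [{"type": "emphasis", "word": "faim"}]}

def plan_behaviors(fml):
    """Behavior Planning: select a gesture for each intent (BML-like)."""
    gestures = [{"id": "beat_hungry", "sync_word": i["word"]}
                for i in fml["intents"]]
    return {"speech": fml["text"], "gestures": gestures}

def realize_behaviors(bml):
    """Behavior Realization: expand gesture specs into phased keyframes."""
    return [{"gesture": g["id"], "phase": p}
            for g in bml["gestures"]
            for p in ("preparation", "stroke", "retraction")]

keyframes = realize_behaviors(plan_behaviors(plan_intents("J'ai très faim !")))
```

Each stage only consumes the previous stage's output, which is what lets the same FML/BML stream drive either the virtual agent or the robot.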
Global diagram FML/BML input → Gesture Selection (from LEXICON) → Synchronization with speech → Planning of gesture durations → Modification of gesture expressivity → KEYFRAMES output
Gesture Animation Planning • Synchronization with speech • The stroke phase coincides with or precedes the emphasized words of the speech (McNeill, 1992) • Gesture stroke phase timing specified by synch points • Expressivity of gestures • The same prototype but different animations • Parameters: • Spatial Extent (SPC): amplitude of movement • Temporal Extent (TMP): speed of movement • Power (PWR): acceleration of movement • Repetition (REP): number of stroke repetitions • Fluidity (FLD): smoothness and continuity • Stiffness (STF): tension/flexibility
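The expressivity parameters modulate one gesture prototype into different animations. The sketch below shows the idea for SPC and TMP only, assuming the parameters range over [-1, 1] around a neutral prototype; the exact mapping and scaling factors are illustrative assumptions, not the published model:

```python
# Illustrative application of two expressivity parameters to a gesture stroke.
# Assumption: SPC and TMP lie in [-1, 1], 0 being the neutral prototype;
# the 0.5 and 0.4 scaling factors are arbitrary choices for this sketch.

def apply_expressivity(joint_angles, duration, spc=0.0, tmp=0.0):
    """Scale stroke amplitude by SPC and stroke speed by TMP."""
    amp_scale = 1.0 + 0.5 * spc    # SPC > 0: wider, more expansive movement
    time_scale = 1.0 - 0.4 * tmp   # TMP > 0: faster movement, shorter stroke
    scaled = [a * amp_scale for a in joint_angles]
    return scaled, duration * time_scale

# A damped beat, as in the BML example (SPC=-0.3, TMP=-0.2):
angles, dur = apply_expressivity([0.4, -0.2, 0.8], 1.0, spc=-0.3, tmp=-0.2)
```

With negative SPC and TMP the same prototype becomes a smaller, slower gesture, which is how one lexicon entry yields many animations.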
Example

<bml>
  <speech id="s1" start="0.0"
    \vce=speaker=Antoine\ \spd=180\
    Et le troisième dit tristement :
    \vce=speaker=AntoineSad\ \spd=90\ \pau=200\
    <tm id="tm1"/> J'ai très faim !
  </speech>
  <gesture id="beat_hungry" start="s1:tm1" end="start+1.5" stroke="0.5">
    <FLD.value>0</FLD.value>
    <OAC.value>0</OAC.value>
    <PWR.value>-1.0</PWR.value>
    <REP.value>0</REP.value>
    <SPC.value>-0.3</SPC.value>
    <TMP.value>-0.2</TMP.value>
  </gesture>
</bml>

<gesture id="beat_hungry" min_time="1.0">
  <phase type="STROKE-START">
    <hand side="BOTH">
      <verticalLocation>YCC</verticalLocation>
      <horizontalLocation>XCenter</horizontalLocation>
      <distanceLocation>Zmiddle</distanceLocation>
      <handShape>OPENHAND</handShape>
      <palmOrientation>INWARD</palmOrientation>
    </hand>
  </phase>
  <phase type="STROKE-END">
    <hand side="BOTH">
      <verticalLocation>YLowerEP</verticalLocation>
      <horizontalLocation>XCenter</horizontalLocation>
      <distanceLocation>ZNear</distanceLocation>
      <handShape>OPEN</handShape>
      <palmOrientation>INWARD</palmOrientation>
    </hand>
  </phase>
</gesture>

keyframe[1]: <phase="preparation", start-time="Start", end-time="Ready", description of stroke-start's position>
keyframe[2]: <phase="stroke", start-time="Stroke-start", end-time="Stroke-end", description of stroke-end's position>
keyframe[3]: <phase="retraction", start-time="Relax", end-time="End", description of rest position>
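The expansion from gesture phases to the three timed keyframes can be sketched as follows. The stroke is anchored at the speech time-marker (tm1 in the example); the preparation and retraction durations used here are illustrative placeholders, not values from the system:

```python
# Hypothetical phase-to-keyframe expansion matching the example above:
# the stroke is anchored at a speech synch point, preparation precedes it,
# retraction follows. All durations here are illustrative.

def phases_to_keyframes(stroke_time, stroke_dur, prep_dur=0.3, retract_dur=0.4):
    """Expand a gesture into preparation / stroke / retraction keyframes."""
    return [
        {"phase": "preparation",
         "start": stroke_time - prep_dur, "end": stroke_time},
        {"phase": "stroke",
         "start": stroke_time, "end": stroke_time + stroke_dur},
        {"phase": "retraction",
         "start": stroke_time + stroke_dur,
         "end": stroke_time + stroke_dur + retract_dur},
    ]

# Stroke begins 0.5 s into the utterance, as in the BML example:
kfs = phases_to_keyframes(stroke_time=0.5, stroke_dur=1.0)
```

Anchoring only the stroke to the synch point lets the planner stretch or shrink the surrounding phases without breaking the speech alignment.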
Compilation • Send timed key-positions to the robot using available APIs • Animation is obtained by interpolating between joint values with the robot's built-in proprietary procedures: API.AngleInterpolation(joints, values, times)
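The effect of the built-in routine can be approximated in pure Python. The sketch below is a linear stand-in for the robot's proprietary interpolation (which is smoother), shown only to make the keyframe-to-animation step concrete:

```python
# Pure-Python stand-in for the robot's built-in interpolation: given keyframe
# times and joint values for one joint, sample the joint angle at an arbitrary
# time t by linear interpolation. The real on-robot routine is proprietary
# and produces smoother trajectories.

def interpolate(times, values, t):
    """Linearly interpolate a joint angle at time t over timed keyframes."""
    if t <= times[0]:
        return values[0]
    if t >= times[-1]:
        return values[-1]
    for i in range(len(times) - 1):
        t0, t1 = times[i], times[i + 1]
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return values[i] + alpha * (values[i + 1] - values[i])

# Sample an elbow angle halfway between the first two keyframes:
angle = interpolate([0.0, 1.0, 2.0], [0.0, 0.8, 0.2], 0.5)
```

In the real system the per-joint (times, values) arrays are exactly what the compilation step hands to the robot API, and the controller does the sampling on board.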
Demo « Trois petits morceaux de nuit »
Conclusion • Conclusion • A gesture model has been designed and implemented for Nao, taking into account the physical constraints of the robot • Common platform for both the virtual agent and the robot • Expressivity model • Future work • Create gestures with different emotional colours and personal styles • Validate the model through perceptual evaluations
Acknowledgment • This work has been funded by the ANR GVLEX project • It is supported by members of the TSI laboratory, Telecom-ParisTech