Expression Cloning Jun-yong Noh, Ulrich Neumann SIGGRAPH 2001
Introduction (1/2) • What is Expression Cloning? • Allows animations to be easily retargeted to new models • Why Expression Cloning? • Easily create facial animations for character models • Provides an alternative to building every animation from scratch
Introduction (2/2) • How to do Expression Cloning? • Transfer vertex motion vectors from a source face model to a target model 1. Determine surface point correspondences 2. Transfer motion vectors (a minimal sketch follows)
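The two steps can be illustrated with a deliberately crude Python sketch: nearest-neighbour matching stands in for the paper's correspondence computation and the source motion is copied unchanged; the later slides refine both steps.

```python
import numpy as np
from scipy.spatial import cKDTree

def clone_expression_naive(src_verts, src_motion, tgt_verts):
    """Crude two-step illustration: (1) correspondence, (2) motion transfer.

    src_verts  : (Ns, 3) neutral source vertices
    src_motion : (Ns, 3) per-vertex source displacements for one frame
    tgt_verts  : (Nt, 3) neutral target vertices
    """
    # Step 1: map every target vertex to its closest source vertex
    # (the paper uses RBF morphing + cylindrical projection instead).
    _, idx = cKDTree(src_verts).query(tgt_verts)
    # Step 2: displace target vertices by the corresponding source motion
    # (the paper additionally adjusts direction and magnitude).
    return tgt_verts + src_motion[idx]
```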
Related works (1/4) • Two kinds of facial animation approaches 1. Modeling the physical behavior of the bone and muscle structures A Muscle Model for Animating Three-Dimensional Facial Expression, K. Waters [31]
Related works (2/4) • Two kinds of approaches 2. Smooth surface deformation Animations do not simply transfer between models Making Faces, B. Guenter et al.
Related works (3/4) • Reusing data for new models • Vector-based muscle models • Placing heuristic muscles under the surface of the face • Repeat for each new model • A parametric approach • Associating the motion of a group of vertices with a specific parameter • Manual association must be repeated for each new model
Related works (4/4) • The goal of this paper • Reuse motion data to produce facial animations • Same animation quality • Easy transfer to new models • Control varied target models from one generic model • Similar work • Performance-driven facial animation, MPEG-4 • Tracking a live actor; 84 feature points
Expression Cloning (1/11) Two steps: 1. Dense surface correspondences 2. Animation with motion vectors
Expression Cloning (2/11) • Dense surface correspondences • Determine which surface points in the target correspond to vertices in the source model • Different number of vertices or connectivity • Small set of initial correspondences to establish an approximate relationship
Expression Cloning (3/11) • Dense surface correspondences • Radial Basis Functions (RBF) • Roughly project vertices in the source model onto the target model • Cylindrical Projections
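A minimal sketch of the RBF step, assuming a Gaussian kernel and a hand-picked sigma; the paper's exact kernel and the subsequent cylindrical projection of the warped source vertices onto the target surface are not reproduced here.

```python
import numpy as np

def rbf_morph(src_landmarks, tgt_landmarks, src_verts, sigma=1.0):
    """Warp all source vertices toward the target shape using a Gaussian
    RBF fitted to a sparse set of landmark correspondences."""
    def kernel(a, b):
        # Pairwise Gaussian kernel between two point sets.
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    # Fit per-landmark weights so the warp maps each source landmark
    # exactly onto its target landmark (one linear system per coordinate).
    K = kernel(src_landmarks, src_landmarks)
    weights = np.linalg.solve(K, tgt_landmarks - src_landmarks)
    # Apply the fitted warp to every source vertex; projection onto the
    # target surface would follow to get the final correspondences.
    return src_verts + kernel(src_verts, src_landmarks) @ weights
```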
Expression Cloning (4/11) • Dense surface correspondences
Expression Cloning (5/11) • Animation with Motion Vectors • Displace each target vertex to match the motion of a corresponding source surface point • Requires dense source motion vectors, obtained by linear interpolation • The direction and magnitude of each motion vector must be adjusted for the target geometry
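One reading of "dense source motion vectors, linear interpolation": if the correspondence step records, for each target vertex, the enclosing source triangle and its barycentric coordinates, the raw motion at that vertex is the barycentric blend of the three source vertex motions. The data layout below (tri_indices, bary_coords) is an assumption for the sketch, not taken from the paper.

```python
import numpy as np

def interpolate_motion(src_motion, tri_indices, bary_coords):
    """Linearly interpolate source motion at corresponded surface points.

    src_motion  : (Ns, 3) per-vertex source displacements for one frame
    tri_indices : (Nt, 3) source-vertex indices of the enclosing triangle
                  for each target vertex (from the correspondence step)
    bary_coords : (Nt, 3) barycentric coordinates of that surface point

    Returns (Nt, 3) raw motion vectors; direction and magnitude still
    have to be adjusted for the target geometry (next slides).
    """
    # Weighted sum of the three corner motions of each enclosing triangle.
    return np.einsum('ij,ijk->ik', bary_coords, src_motion[tri_indices])
```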
Expression Cloning (6/11) • Animation with Motion Vectors 2.1 Motion Vector Direction Adjustment 2.2 Motion Vector Magnitude Adjustment
Expression Cloning (7/11) • Direction Adjustment
Expression Cloning (8/11) • Magnitude Adjustment
Expression Cloning (9/11) • Direction Adjustment & Magnitude Adjustment • Local bounding box (BB): scale and rotate • Limit by a global threshold • m: motion vector
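A hedged sketch of how the two adjustments on this slide might be combined for a single motion vector m, assuming local 3x3 surface frames and per-axis local bounding-box sizes are available at the corresponded source and target points; max_scale stands in for the global threshold.

```python
import numpy as np

def adjust_motion(m, src_rot, tgt_rot, src_bb, tgt_bb, max_scale=3.0):
    """Adapt one source motion vector m to the target geometry.

    src_rot, tgt_rot : 3x3 local surface coordinate frames at the
                       corresponded source / target points
    src_bb, tgt_bb   : (3,) local bounding-box sizes around those points
    max_scale        : global clamp on the per-axis scale factor
    """
    # Direction adjustment: express m in the source's local frame,
    # then re-express the same local components in the target's frame.
    m_local = src_rot.T @ m
    # Magnitude adjustment: per-axis scale from the local bounding-box
    # ratio, clamped so a tiny source box cannot blow the motion up.
    scale = np.clip(tgt_bb / src_bb, 1.0 / max_scale, max_scale)
    return tgt_rot @ (scale * m_local)
```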
Expression Cloning (10/11) • Lip contact line • Models have lips that touch at a contact line • Lower lip vertices may otherwise be controlled by an upper lip triangle • Solution: • Include all the source-model lip contact line vertices in the RBF morphing step • Completely align the lip contact lines of the two models
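A simplified stand-in for the alignment step: snap the morphed source lip-contact-line vertices onto the target's lip contact line by nearest-vertex search, so both models share the same contact curve and lower-lip vertices cannot be matched to upper-lip triangles. The paper aligns the full curves; this only illustrates the idea.

```python
import numpy as np
from scipy.spatial import cKDTree

def align_lip_lines(morphed_src_lip, tgt_lip):
    """Snap each morphed source lip-contact vertex to the nearest
    target lip-contact vertex (nearest-vertex approximation of the
    curve alignment described on this slide)."""
    _, idx = cKDTree(tgt_lip).query(morphed_src_lip)
    return tgt_lip[idx]
```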
Expression Cloning (11/11) • Automated Correspondence Selection • A small set of correspondences is needed for the RBF morphing • 15 heuristic rules select them automatically on human faces
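The 15 rules themselves are not listed on the slide; the following is one hypothetical example of the kind of rule meant, picking the most protruding vertex along the viewing axis as the nose tip. The paper's actual rules are more involved.

```python
import numpy as np

def nose_tip(verts):
    """Hypothetical heuristic: the nose tip is the vertex that protrudes
    furthest along the +z (viewing) axis of a front-facing head."""
    return verts[np.argmax(verts[:, 2])]
```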
Results • Animations can be created from motion capture data • Works on a wide variety of target models
Conclusion • Expression cloning • Uses high-quality dense 3D data in source model animations • Produces animations of different models with similar expressions • The method is fast and produces real-time animations
Future Work • Stick figures and cartoons • Use sparse source data without loss of expressive animation quality • Control knobs • To amplify or reduce a certain expression • Tongue and teeth models