Animation From Observation: Motion Editing • Dan Kong • CMPS 260 Final Project
Why we need motion editing • Reuse motion capture data for different characters and different actions • Create motions that are infeasible or hard to obtain with motion capture • Accommodate changes of intent that cannot be predicted before the motion capture session
In this project, I will implement a prototype motion editing system with the following features: • Multiresolution motion filtering • Multitarget motion interpolation • Motion path editing
What is Motion Multiresolution Filtering • A technique originally used in image processing, applied here to the motion parameters of an articulated figure. • Intuition: low frequencies carry the gross, smooth motion, while high frequencies carry detail, abrupt changes, and most of the noise.
Motion filtering algorithm • The number of frames m determines the number of frequency bands (fb) for each signal: if 2^n < m < 2^(n+1), then fb = n. • Compute the low-pass sequence by repeatedly convolving the signal with a low-pass filter, obtaining G0, G1, ..., Gfb, where Gfb is the DC component of the signal (0 <= k <= fb) • Obtain the band-pass bands: Lk = Gk − Gk+1
Motion filtering algorithm (continued) • Adjust the gain for each band and multiply each Lk by its current gain value gk. • Reconstruct the motion signal: G0 = Gfb + Σ (k = 0 to fb−1) Lk; with the adjusted gains, the filtered signal is Gfb + Σ gk·Lk. (With all gains equal to 1, this reproduces the original signal exactly.)
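The two slides above can be sketched in a few lines of NumPy. This is a minimal illustration, not the project's actual code: the function names and the (0.25, 0.5, 0.25) binomial kernel are my own assumptions, standing in for whatever low-pass filter the system really uses.

```python
import numpy as np

def lowpass(signal, kernel=(0.25, 0.5, 0.25)):
    """Smooth a 1-D joint-angle signal with a simple low-pass kernel."""
    return np.convolve(signal, kernel, mode="same")

def multiresolution_filter(signal, gains):
    """Decompose a motion curve into frequency bands, scale each band
    by a user-set gain, and reconstruct.  gains[k] multiplies band L_k."""
    fb = len(gains)
    G = [np.asarray(signal, dtype=float)]
    for _ in range(fb):                       # G_0 ... G_fb (G_fb ~ DC)
        G.append(lowpass(G[-1]))
    L = [G[k] - G[k + 1] for k in range(fb)]  # band-pass: L_k = G_k - G_{k+1}
    out = G[fb].copy()                        # start from the DC component
    for k in range(fb):
        out += gains[k] * L[k]                # add back the scaled bands
    return out
```

Because the bands telescope, unit gains reconstruct the input exactly; boosting the high-frequency gains exaggerates detail, lowering them smooths the motion.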
Multitarget motion interpolation • A process widely used in computer animation to blend between different models. • Facial animation: by blending the corresponding parameters of different faces, we can control the expression. • A happy walk + a sad walk = ?
Multitarget motion interpolation (continued) • Combine multitarget interpolation with multiresolution filtering to blend the frequency bands of two or more motions separately. • Need to establish the time correspondence between the two signals first.
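A per-band blend of two time-aligned signals might look like the sketch below. The function names, kernel, and the equal-weight average of the DC components are my assumptions; the slide leaves those details unspecified.

```python
import numpy as np

def bands(signal, fb, kernel=(0.25, 0.5, 0.25)):
    """Return (band-pass bands L_0..L_{fb-1}, DC component G_fb)."""
    G = [np.asarray(signal, dtype=float)]
    for _ in range(fb):
        G.append(np.convolve(G[-1], kernel, mode="same"))
    return [G[k] - G[k + 1] for k in range(fb)], G[fb]

def blend(signal_a, signal_b, weights):
    """Per-band interpolation of two time-aligned motion curves.
    weights[k] = 0 keeps band k of A, 1 takes it entirely from B."""
    fb = len(weights)
    La, dc_a = bands(signal_a, fb)
    Lb, dc_b = bands(signal_b, fb)
    out = 0.5 * (dc_a + dc_b)                 # average the gross motion
    for k, w in enumerate(weights):
        out += (1.0 - w) * La[k] + w * Lb[k]  # mix each band separately
    return out
```

Blending the low bands of a sad walk with the high bands of a happy walk is exactly the "happy walk + sad walk" experiment the previous slide asks about.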
Motion path editing • What is path editing: - Altering previously captured motion data to follow a different path • Why path editing: - Using original motion in a new environment - Dynamic applications: walking to a goal location
Alternative Approach • Capture all desired motions • Disadvantages of this approach: - Cannot possibly predict every required motion - Such a motion library is expensive to create - Unwieldy to control as it expands
Algorithm Overview • Basic idea: represent the motion relative to the path. As the path is altered, the position and orientation of the character update accordingly. • Algorithm: - Automatic path generation - Calculate the initial path coordinate system - Path editing - Compute the changed path coordinate system
Path Description • Why B-splines: - Local control without affecting the global shape - Smooth (C² continuity for cubic B-splines) • Path coordinate representation: - p(t) is the coordinate center; the corresponding translation matrix is P(t) - R(t) is the orientation of the coordinate system - P(t)R(t) is the path's coordinate system - R(t)^-1 P(t)^-1 is the transformation from global to the path's local coordinate system
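The path coordinate representation can be sketched with 4×4 homogeneous matrices, assuming the Y-up, X-Z-plane convention from the next slide; `path_frame` and `to_local` are hypothetical names for illustration.

```python
import numpy as np

def path_frame(p, dp):
    """Build the path coordinate system P(t)R(t) at point p with
    tangent dp, assuming the path lies in the X-Z plane and Y is up."""
    z = dp / np.linalg.norm(dp)      # forward axis along the path tangent
    y = np.array([0.0, 1.0, 0.0])    # world up
    x = np.cross(y, z)               # sideways axis
    T = np.eye(4)
    T[:3, :3] = np.column_stack([x, y, z])  # orientation R(t)
    T[:3, 3] = p                            # translation P(t)
    return T

def to_local(T, point):
    """Apply R(t)^-1 P(t)^-1: map a global point into the path's
    local coordinate system (returns a homogeneous 4-vector)."""
    return np.linalg.inv(T) @ np.append(point, 1.0)
```

Mapping the character's root into this local frame and back is what makes the motion "relative to the path", so that editing the path later moves the motion with it.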
Initial Path Generation • Least-squares fitting, using the root translations as the fitting points. • Compute the average Y translation to constrain the path to lie in the X-Z plane. • Store R0(t)^-1 P0(t)^-1 for later editing
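A prototype of this fit could look like the following; as a simplification I fit low-degree polynomials with `np.polyfit` instead of the B-spline the slide calls for, and the function name and uniform parameterization are my assumptions.

```python
import numpy as np

def fit_planar_path(roots, deg=3):
    """Least-squares fit of the root trajectory, projected to the X-Z
    plane; Y is replaced by its average (polynomial stand-in for the
    B-spline fit described on the slide)."""
    roots = np.asarray(roots, dtype=float)   # (m, 3) root translations
    t = np.linspace(0.0, 1.0, len(roots))    # uniform parameterization
    cx = np.polyfit(t, roots[:, 0], deg)     # fit x(t)
    cz = np.polyfit(t, roots[:, 2], deg)     # fit z(t)
    y = roots[:, 1].mean()                   # constrain path to X-Z plane
    def path(s):
        return np.array([np.polyval(cx, s), y, np.polyval(cz, s)])
    return path
```

A real system would fit B-spline control points instead, to get the local-control property the previous slide motivates.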
Path Editing • After altering the curve • P(t)R(t) R0(t)^-1 P0(t)^-1 is computed as the mapping from the original path's coordinate system to the new one • Once P(t) is updated, the corresponding R(t) must be recomputed accordingly
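Applying the composed mapping P(t)R(t) R0(t)^-1 P0(t)^-1 to one motion sample is a single matrix product, sketched below with 4×4 homogeneous matrices; `retarget_point` is a hypothetical name.

```python
import numpy as np

def retarget_point(T_new, T_old, point):
    """Map a motion sample from the original path frame
    T_old = P0(t)R0(t) to the edited path frame T_new = P(t)R(t):
    x' = P(t) R(t) R0(t)^-1 P0(t)^-1 x."""
    h = np.append(point, 1.0)                 # homogeneous coordinates
    return (T_new @ np.linalg.inv(T_old) @ h)[:3]
```

If the path is unchanged the mapping is the identity, so the original motion is reproduced exactly; only samples near the edited portion of the path move, thanks to the B-spline's local control.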
Future Work • Generate more frames when the path is stretched • Add constraints to avoid geometric violations
References • "Motion Path Editing" (Gleicher, I3D 2001) • "Motion Signal Processing" (Bruderlin and Williams, SIGGRAPH 1995) • "Motion Warping" (Witkin and Popović, SIGGRAPH 1995) • "Multiresolution Curves" (Finkelstein and Salesin, SIGGRAPH 1994) • "Hierarchical B-Spline Refinement" (Forsey and Bartels, SIGGRAPH 1988)