Michael Isard and Andrew Blake, IJCV 1998
CONDENSATION – Conditional Density Propagation for Visual Tracking
Presented by Wen Li, Department of Computer Science & Engineering, Texas A&M University
Outline • Problem Description • Previous Methods • CONDENSATION • Experiment • Conclusion
Problem Description • What's the task? • Track the outlines and features of foreground objects • At video frame-rate • In the presence of visual clutter
Problem Description • Challenges • Elements in background clutter may mimic parts of foreground features • Efficiency
Previous Methods • Directed matching • Geometric model of the object • Geometric model + motion model • Kalman filter
Kalman Filter • Main Idea • Model the object • Prediction – predict where the object would be • Measurement – observe features that imply where the object is • Update – Combine measurement and prediction to update the object model
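To make the predict/measure/update cycle above concrete, here is a minimal one-dimensional Kalman filter sketch. It only illustrates the cycle, it is not the tracker used in the paper, and the transition, observation and noise parameters (a, q, h, r) are made-up defaults.

```python
def kalman_step(x, P, z, a=1.0, q=0.01, h=1.0, r=0.1):
    """One predict/measure/update cycle of a 1-D Kalman filter.

    x, P : previous state estimate and its variance
    z    : new scalar measurement
    a, q : state-transition coefficient and process-noise variance (assumed values)
    h, r : observation coefficient and measurement-noise variance (assumed values)
    """
    # Prediction: push the estimate through the motion model
    x_pred = a * x
    P_pred = a * P * a + q

    # Measurement: compare the observation with the predicted observation
    innovation = z - h * x_pred
    S = h * P_pred * h + r          # innovation variance

    # Update: blend prediction and measurement via the Kalman gain
    K = P_pred * h / S
    x_new = x_pred + K * innovation
    P_new = (1.0 - K * h) * P_pred
    return x_new, P_new
```

Because every density involved stays Gaussian, the whole posterior is summarised by the pair (x, P) – exactly the restriction that CONDENSATION removes.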
Kalman Filter • Assumption • Gaussian prior • Markov assumption
Kalman Filter • Essential Technique • Bayes filter • Limitation • Restricted to Gaussian (unimodal) densities • Does not work well against cluttered backgrounds
CONDENSATION • Stochastic framework + random sampling • Difference from the Kalman filter • Kalman filter – Gaussian densities only • CONDENSATION – general, possibly multi-modal densities
CONDENSATION • Symbols + goal • Assumptions • Modelling • Dynamic model • Observation model • Factored sampling • CONDENSATION algorithm
CONDENSATION • Symbols • x_t – the state of the object at time t • X_t – the history of the state, {x_1, …, x_t} • z_t – the set of image features at time t • Z_t – the history of the features, {z_1, …, z_t} • Goal • Estimate the state density at time t given the measurement history: p(x_t | Z_t)
CONDENSATION • Assumptions • Markov assumption – the new state is conditioned directly only on the immediately preceding state: p(x_t | X_{t-1}) = p(x_t | x_{t-1}) • Observation independence – the z_t are mutually independent and independent of the dynamical process: p(Z_t | X_t) = ∏ p(z_i | x_i), with a stationary observation density p(z_i | x_i) = p(z | x)
CONDENSATION • Modelling • Dynamic model – p(x_t | x_{t-1}) • Observation model – p(z_t | x_t) (a concrete form is sketched below)
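For concreteness, a common parametric choice for the dynamic model (the paper itself learns second-order autoregressive dynamics in a shape space, so the first-order linear form below is only a simplified sketch) is

```latex
x_t = A\,x_{t-1} + B\,w_t, \qquad w_t \sim \mathcal{N}(0, I)
\quad\Longrightarrow\quad
p(x_t \mid x_{t-1}) = \mathcal{N}\!\left(x_t \,;\; A\,x_{t-1},\; B B^{\top}\right),
```

while the observation density p(z_t | x_t) scores how strongly the image features z_t (e.g. edges found along normals to the predicted contour) support the hypothesised state x_t; in clutter it is typically multi-modal, which is why a Gaussian filter is not enough.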
CONDENSATION • Propagation – applying Bayes' rule • The resulting posterior cannot be evaluated in closed form
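Written out, the propagation rule from the paper combines a prediction step (diffusion through the dynamics) with a Bayes-rule measurement update:

```latex
p(x_t \mid Z_t) = k_t\, p(z_t \mid x_t)\, p(x_t \mid Z_{t-1}),
\qquad
p(x_t \mid Z_{t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid Z_{t-1})\, \mathrm{d}x_{t-1},
```

where k_t is a normalisation constant independent of x_t. With a non-Gaussian, multi-modal observation density p(z_t | x_t), the integral and product cannot be carried out analytically, which is what motivates the sampling approach below.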
CONDENSATION • Factored Sampling • Approximates the posterior density p(x | z) for a single image • Step 1: generate a sample set {s^(1), …, s^(N)} from the prior p(x) • Step 2: calculate the weight π_i of each s^(i) from the observation density p(z | s^(i)), then normalise • Step 3: estimate the mean position of x as the weighted average Σ_i π_i s^(i) (see the sketch below)
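A minimal sketch of factored sampling for a single image, assuming the caller supplies a prior sampler and an observation likelihood; the function and parameter names (sample_prior, likelihood, N) are hypothetical placeholders, not from the paper.

```python
import numpy as np

def factored_sampling(sample_prior, likelihood, N=1000):
    """Approximate p(x | z) by factored sampling (single-image case).

    sample_prior : callable returning one state vector drawn from the prior p(x)
    likelihood   : callable evaluating the observation density p(z | x)
    N            : sample-set size (hypothetical default)
    """
    # Step 1: generate a sample set {s^(1), ..., s^(N)} from the prior
    samples = np.stack([sample_prior() for _ in range(N)])   # shape (N, d)

    # Step 2: compute weights pi_i proportional to p(z | s^(i)) and normalise
    weights = np.array([likelihood(s) for s in samples])
    weights = weights / weights.sum()

    # Step 3: estimate the posterior mean as the weighted mean of the samples
    mean_x = weights @ samples
    return samples, weights, mean_x
```

As N grows, the weighted sample set represents the posterior increasingly well; CONDENSATION simply repeats this construction at every frame.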
CONDENSATION • Factored Sampling -- illustration
CONDENSATION • The CONDENSATION algorithm – finally! • Initialise: sample p(x_0) to get the first sample set • For each time step t • Predict: select a sample set {s'_t^(1), …, s'_t^(N)} from the old set {s_{t-1}^(1), …, s_{t-1}^(N)} with probabilities given by the old weights π_{t-1}^(n), then predict the new sample set {s_t^(1), …, s_t^(N)} by passing it through the dynamic model described earlier • Measure: calculate the weights π_t^(n) from the observed features via p(z_t | x_t = s_t^(n)), then estimate the mean position of x_t as in the single-image case (one iteration is sketched below)
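A minimal sketch of one CONDENSATION time step, assuming vector-valued states and caller-supplied propagate (stochastic dynamic model) and likelihood (observation density) functions; all names below are illustrative placeholders, not from the paper.

```python
import numpy as np

def condensation_step(samples, weights, propagate, likelihood, rng=np.random):
    """One select/predict/measure iteration of the CONDENSATION algorithm.

    samples    : (N, d) array, sample set {s_{t-1}^(n)} from the previous step
    weights    : (N,) array of normalised weights {pi_{t-1}^(n)}
    propagate  : callable applying the stochastic dynamic model to one state
    likelihood : callable evaluating the observation density p(z_t | x_t)
    """
    N = len(samples)

    # Select: resample N states with probability proportional to the old weights
    idx = rng.choice(N, size=N, p=weights)
    selected = samples[idx]

    # Predict: push each selected state through the dynamic model p(x_t | x_{t-1}),
    # which both drifts and diffuses the sample set
    predicted = np.stack([propagate(s) for s in selected])

    # Measure: weight each predicted state by the new observation
    new_weights = np.array([likelihood(s) for s in predicted])
    new_weights = new_weights / new_weights.sum()

    # Report the state estimate as the weighted mean of the new sample set
    estimate = new_weights @ predicted
    return predicted, new_weights, estimate
```

Iterating this step, seeded with samples from p(x_0), gives the tracker: resampling concentrates effort on high-probability hypotheses, while the stochastic dynamics re-spreads the samples so competing peaks caused by clutter can still be represented.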
Experiment • On Multi-Modal Distributions • The shape-space for tracking is built from a hand-drawn template of head and shoulders
Experiment • On Rapid Motions Through Clutter
Experiment • On Articulated Object
Experiment • On Camouflaged Object
Conclusion • Good news: • Works with general (non-Gaussian) distributions • Handles multi-modal densities • Robust to background clutter • Computationally efficient • Performance controllable via the sample size N • Not too difficult to implement
Conclusion • Possible problems: • Initialization • Reliance on a “hand-drawn” shape-space