Half Quadratic Analysis for Mean Shift: with Extension to A Sequential Data Mode-Seeking Method Xiaotong Yuan, Stan Z. Li October 17, 2007
Outline • Motivation • Theoretical Exploration • Algorithm Extension • Summary
Motivation • Put Mean Shift on proper grounds • Better understand the essence of the algorithm • Facilitate its numerical study • Fast multiple data mode-seeking • Improve on the exhaustive-initialization-based method
Background of Mean Shift • Kernel density estimate with prior weights $w_i$, a Mahalanobis distance, and kernel profile $k$: $\hat f(x) = \sum_{i=1}^{n} w_i\, k\big(\|x - x_i\|_\Sigma^2\big)$, where $\|x - x_i\|_\Sigma^2 = (x - x_i)^\top \Sigma^{-1} (x - x_i)$ • Mean Shift: a fixed-point iteration algorithm to find the local maxima of $\hat f$, via $x^{(t+1)} = \sum_i w_i\, g\big(\|x^{(t)} - x_i\|_\Sigma^2\big)\, x_i \,\big/\, \sum_i w_i\, g\big(\|x^{(t)} - x_i\|_\Sigma^2\big)$ with $g = -k'$
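To make the update concrete, here is a minimal Python sketch of the weighted fixed-point iteration, assuming a Gaussian profile $k(s) = e^{-s/2}$ (so $g = -k'$ is Gaussian up to a constant); the function name, tolerance, and stopping rule are illustrative choices, not from the slides.

```python
import numpy as np

def mean_shift(x0, X, w, Sigma_inv, tol=1e-6, max_iter=500):
    """Weighted Mean Shift: iterate x <- sum_i w_i g(d_i) x_i / sum_i w_i g(d_i),
    with d_i the squared Mahalanobis distance from x to sample x_i."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        diff = X - x                                          # (n, d) residuals
        d = np.einsum('nd,de,ne->n', diff, Sigma_inv, diff)   # squared Mahalanobis distances
        g = w * np.exp(-d / 2.0)                              # w_i * g(d_i), Gaussian profile assumed
        x_new = g @ X / g.sum()                               # weighted mean of the samples
        if np.linalg.norm(x_new - x) < tol:                   # fixed point reached: a mode
            break
        x = x_new
    return x
```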
Prior Understanding • Gradient ascent algorithm with an adaptive step size[Cheng 1995] • Quadratic bounding optimization [Fashing et al. 2005] • Gaussian Mean shift is an EM algorithm [Carreira-Perpinan 2007]
Half Quadratic Optimization • Theory of convex conjugate functions • A non-quadratic convex objective function is optimized in a quadratic-like way • Its convergence properties have been studied in depth [Mila 2005, Allain 2006]
Preliminary Facts • All the conditions we impose on the kernel profile $k$ are summarized as follows: $k$ is convex, monotonically decreasing, and differentiable
HQ Optimization For KDE • Since the profile $k$ is convex, conjugate-function theory gives $k(s) = \sup_p \{\, p\,s - k^*(p)\,\}$ for $s = \|x - x_i\|_\Sigma^2$, with dual variable $p$, quadratic term $p\,s$, and conjugated term $k^*(p)$ • The supremum is reached at $p = k'(s)$
Alternate Maximization • A new objective function on the extended domain: $F(x, p) = \sum_i w_i \big( p_i\, \|x - x_i\|_\Sigma^2 - k^*(p_i) \big)$ • Maximizing $F$ alternately over $p$ (dual step) and over $x$ (primal step) is equivalent to Mean Shift
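Under the same Gaussian-profile assumption as the earlier sketch, one round of the alternate maximization looks as follows; the dual step sets every $p_i = k'(s_i) < 0$, so the primal quadratic is concave and its maximizer is a weighted mean in which the negative signs cancel, reproducing the Mean Shift update.

```python
import numpy as np

def hq_step(x, X, w, Sigma_inv):
    """One alternate-maximization round of F(x, p), Gaussian profile assumed."""
    diff = X - x
    s = np.einsum('nd,de,ne->n', diff, Sigma_inv, diff)   # s_i = ||x - x_i||^2_Sigma
    # Dual step (x fixed): supremum attained at p_i = k'(s_i) = -exp(-s_i/2)/2 < 0
    p = -0.5 * np.exp(-s / 2.0)
    # Primal step (p fixed): maximize the concave quadratic sum_i w_i p_i s_i(x)
    c = w * p
    return c @ X / c.sum()   # the negative signs cancel: exactly the Mean Shift update
```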
Relation to Bound Optimization • Quadratic bounding optimization [Fashing et al. 2005] • HQ formulation: for a fixed point $x^{(t)}$, denote $p_i^{(t)} = k'\big(\|x^{(t)} - x_i\|_\Sigma^2\big)$; then $F(\cdot, p^{(t)})$ analytically defines a quadratic lower bound for $\hat f$ at each time stamp
Relation to EM • Gaussian Mean shift is an EM algorithm [Carreira-Perpinan 2007] • HQ optimization: the alternate maximization scheme is equivalent to E-Step and M-Step
Convergence Rate • When $\Sigma$ is isotropic, the root-convergence of convex-kernel Mean Shift is at least linear, with an explicitly computable rate
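For reference, the standard numerical-analysis definition of root-convergence being invoked here (terminology, not a formula from the slides): the iterates $x^{(t)} \to x^*$ converge at least linearly in the root sense when $r = \limsup_{t \to \infty} \|x^{(t)} - x^*\|^{1/t} < 1$.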
Multiple Mode-Seeking • Exhaustive initialization: run Mean Shift in parallel, starting from every sample point
Adaptive Mean Shift (Ada-MS) • Basic idea: properly estimate the starting points near the significant modes, and sequentially re-weight the samples to guide the search
Algorithm Description • Initialization: global mode estimation by Annealed MS [Shen 2007] • Repeat: - Starting point estimation - Local mode estimation by traditional Mean Shift - Sample prior re-weighting • Until an already-found mode reappears
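A heavily hedged Python sketch of this loop. The slides do not give the starting-point estimator or the exact re-weighting rule, so the argmax start, the kernel-shaped down-weighting with bandwidth `h`, and the distance-based duplicate test are my own illustrative assumptions; `annealed_ms` and `mean_shift` stand for the routines from the earlier slides (with any covariance argument already bound, e.g. via `functools.partial`).

```python
import numpy as np

def ada_ms(X, h, mean_shift, annealed_ms, tol=1e-3):
    """Sequential mode-seeking sketch: find modes one by one, re-weighting after each."""
    w = np.ones(len(X)) / len(X)            # uniform prior weights
    modes = []
    m = annealed_ms(X, w)                   # Initialization: global mode by Annealed MS
    while True:
        modes.append(m)
        # Sample prior re-weight: suppress samples near the found mode (assumption)
        d = np.sum((X - m) ** 2, axis=1) / h ** 2
        w = w * (1.0 - np.exp(-d / 2.0))
        w = w / w.sum()
        # Starting point estimation: the currently highest-weighted sample (assumption)
        x0 = X[np.argmax(w)]
        # Local mode estimation by traditional (weighted) Mean Shift
        m = mean_shift(x0, X, w)
        if min(np.linalg.norm(m - m0) for m0 in modes) < tol:
            return modes                    # an already-found mode reappeared: stop
```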
Illustration [Figure: prior weight curve; the search finds the modes in only four iterations]
Advantages • Initialization invariant • Highly efficient, with complexity linear in the number of significant modes
Application: Color Constancy • Linear render model [Manduchi 2006]: color surfaces under a “canonical” illuminant; an illumination change acts as a linear color transformation, modeled with a Gaussian distribution • Test image: contains the color surfaces with illumination change • Problem: how to estimate the possibly present render vectors?
Existing Method • MAP formulation [Manduchi 2006]: the possibly present render vectors are estimated with an EM algorithm • Limitations: the number of render vectors must be known a priori, and all render vectors must be properly initialized
Convex Kernel Based Solution • Prior weight of pixel $i$ on training surface $j$, derived from its compensation accuracy • Ada-MS for sequential mode-seeking
Results [Figures: compensated image, ground truth mapping, and prior weight images]
Summary • Put Mean Shift on proper grounds: HQ optimization • Connected it with previous viewpoints: bound optimization and EM • Fast multiple data mode-seeking (linear complexity, initialization invariant) • Very simple to implement • We hope to see more applications