Half Quadratic Analysis for Mean Shift: with Extension to A Sequential Data Mode-Seeking Method


Presentation Transcript


  1. Half Quadratic Analysis for Mean Shift: with Extension to A Sequential Data Mode-Seeking Method Xiaotong Yuan, Stan Z. Li October 17, 2007

  2. Outline • Motivation • Theoretical Exploration • Algorithm Extension • Summary

  3. Motivation • Put Mean Shift on proper theoretical grounds • Better understand its essence • Facilitate the numerical study • Fast multiple data mode-seeking • Improve on exhaustive-initialization-based methods

  4. Mean Shift as Half Quadratic Optimization

  5. Background of Mean Shift The weighted kernel density estimate, with prior weights w_i, Mahalanobis distance D, and kernel profile k: f(x) = Σ_i w_i k( D²(x, x_i, Σ_i) ), where D²(x, x_i, Σ_i) = (x − x_i)ᵀ Σ_i⁻¹ (x − x_i). Mean Shift: a fixed-point iteration algorithm to find a local maximum of f(x)
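The fixed-point iteration on this slide is short enough to write out. A minimal sketch, assuming a Gaussian profile k(t) = exp(−t/2), an isotropic bandwidth h in place of the full Mahalanobis metric, and function/parameter names of my own choosing (not from the paper):

```python
import numpy as np

def mean_shift(x0, data, h=1.0, weights=None, tol=1e-8, max_iter=500):
    """Fixed-point iteration toward a local maximum of the weighted KDE.

    Uses the Gaussian profile k(t) = exp(-t/2); the update is the
    g-weighted sample mean, where g = -k' is again Gaussian-shaped.
    """
    data = np.asarray(data, dtype=float)
    w = np.ones(len(data)) if weights is None else np.asarray(weights, float)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d2 = np.sum((data - x) ** 2, axis=1) / h ** 2  # scaled squared distances
        g = w * np.exp(-d2 / 2.0)                      # mean-shift weights g = -k'
        x_new = g @ data / g.sum()                     # weighted sample mean
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new
```

Started anywhere with nonzero kernel support, the iterate climbs to a nearby mode of the density estimate.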

  6. Prior Understanding • Gradient ascent algorithm with an adaptive step size [Cheng 1995] • Quadratic bounding optimization [Fashing et al. 2005] • Gaussian mean shift is an EM algorithm [Carreira-Perpinan 2007]

  7. Half Quadratic Optimization • Theory of convex conjugate functions • Non-quadratic convex objective functions are optimized in a quadratic-like way • Convergence properties have been studied in depth [Mila 2005, Allain 2006]

  8. Preliminary Facts • All the conditions we impose on the kernel profile are summarized below:

  9. HQ Optimization for KDE Introduce a dual variable p_i for each sample: the kernel value splits into a quadratic term and a conjugate term, k(t) = sup_p ( p·t − k*(p) ). The supremum is reached at p = k'(t)
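The decomposition on this slide can be written out explicitly. A sketch in standard HQ notation (weights w_i, convex decreasing profile k, conjugate k*); the Gaussian example at the end is my own worked check, not taken from the slides:

```latex
% Conjugate decomposition of a convex, decreasing profile k:
%   k(t) = \sup_{p} \bigl( p\,t - k^{*}(p) \bigr),  supremum at p = k'(t).
% Applied term-by-term to the weighted KDE:
F(\mathbf{x}) \;=\; \sum_i w_i\, k\!\left(\lVert \mathbf{x}-\mathbf{x}_i \rVert^2\right)
           \;=\; \sup_{P}\; \hat{F}(\mathbf{x}, P),
\qquad
\hat{F}(\mathbf{x}, P) \;=\; \sum_i w_i \bigl( p_i \lVert \mathbf{x}-\mathbf{x}_i \rVert^2 - k^{*}(p_i) \bigr).
% Gaussian example: k(t) = e^{-t/2} gives k'(t) = -\tfrac12 e^{-t/2}
% and k^{*}(p) = 2p - 2p\ln(-2p) on -\tfrac12 \le p < 0.
```

Since each p_i enters only its own summand, the inner maximization decouples per sample, which is what makes the alternate scheme on the next slide cheap.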

  10. Alternate Maximization A new objective function F̂(x, P) on the extended domain (x, P): alternately maximizing over P and over x is equivalent to Mean Shift
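The claimed equivalence is easy to verify numerically. A sketch assuming the Gaussian profile k(t) = exp(−t/2) (names illustrative): the P-step sets p_i = k'(d_i²) in closed form, and the x-step, a weighted quadratic problem, reproduces exactly one mean-shift update because the constant factor in p_i cancels:

```python
import numpy as np

def hq_step(x, data, h=1.0):
    """One round of alternate maximization of the extended objective.

    P-step: p_i = k'(d_i^2); for k(t) = exp(-t/2) this is
            p_i = -exp(-d_i^2 / 2) / 2 < 0.
    x-step: argmax_x sum_i p_i d_i^2 is the |p|-weighted sample mean.
    """
    d2 = np.sum((data - x) ** 2, axis=1) / h ** 2
    p = -0.5 * np.exp(-d2 / 2.0)        # P-step (all p_i < 0)
    g = -p                              # x-step weights, g = -k'
    return g @ data / g.sum()

def ms_step(x, data, h=1.0):
    """Plain mean-shift update, for comparison."""
    d2 = np.sum((data - x) ** 2, axis=1) / h ** 2
    g = np.exp(-d2 / 2.0)
    return g @ data / g.sum()
```

Both functions return the same point from any start, iteration by iteration.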

  11. Relation to Bound Optimization • Quadratic bounding optimization [Fashing et al. 2005] • HQ formulation: for a fixed point x_t, denote p_i^t = k'(‖x_t − x_i‖²); then F̂(·, P^t) analytically defines a quadratic lower bound for F at each time step t

  12. Relation to EM • Gaussian mean shift is an EM algorithm [Carreira-Perpinan 2007] • HQ optimization: the two steps of the alternate maximization scheme correspond to the E-step and the M-step

  13. Convergence Rate • When the kernel is isotropic, the root-convergence of convex-kernel mean shift is at least linear
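The at-least-linear rate is easy to observe empirically. A sketch with a Gaussian kernel on synthetic data (all settings illustrative): near the mode, the error to the limit point should shrink by a roughly constant factor per iteration:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, (500, 2))   # one cluster -> essentially unimodal KDE

def ms_step(x, h=1.0):
    d2 = np.sum((data - x) ** 2, axis=1) / h ** 2
    g = np.exp(-d2 / 2.0)
    return g @ data / g.sum()

traj = [np.array([2.0, 2.0])]
for _ in range(60):
    traj.append(ms_step(traj[-1]))

x_star = traj[-1]                                    # proxy for the limit point
errs = [np.linalg.norm(x - x_star) for x in traj[:20]]
ratios = [errs[i + 1] / errs[i] for i in range(5, 19)]
# Past the transient, the per-step error ratios settle to a nearly
# constant value below 1: geometric, i.e. linear, convergence.
```

Plotting log(errs) against the iteration index gives a straight line, the usual signature of a linear rate.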

  14. Adaptive Mean Shift for Sequential Data Mode-Seeking

  15. Multiple Mode-Seeking • Exhaustive initialization • Run Mean Shift in parallel from every starting point

  16. Adaptive Mean Shift • Basic idea • Properly estimate the starting points near the significant modes • Sequentially re-weight the samples to guide the search

  17. Algorithm Description • Initialization: global mode estimation by annealed MS [Shen 2007] • Repeat - Starting-point estimation - Local mode estimation by traditional Mean Shift - Sample prior re-weighting Until an ever-found mode reappears
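A simplified sketch of the sequential scheme, not the paper's exact Ada-MS: here the starting point is simply the heaviest remaining sample rather than the annealed-MS estimate [Shen 2007], and the re-weighting rule is an illustrative choice; only the loop structure (seek, record, down-weight, stop on reappearance) follows the slide:

```python
import numpy as np

def weighted_mean_shift(x, data, w, h, iters=100):
    """Plain mean shift driven by per-sample prior weights w."""
    for _ in range(iters):
        d2 = np.sum((data - x) ** 2, axis=1) / h ** 2
        g = w * np.exp(-d2 / 2.0)
        x = g @ data / g.sum()
    return x

def sequential_modes(data, h=0.5, max_modes=5, merge_tol=0.5):
    """Find significant modes one at a time.

    After each mode is found, samples near it are down-weighted so the
    next search is guided toward unexplored regions; the loop stops
    when an already-found mode reappears.
    """
    w = np.ones(len(data))
    modes = []
    for _ in range(max_modes):
        start = data[np.argmax(w)]            # crude starting-point estimate
        m = weighted_mean_shift(start, data, w, h)
        if any(np.linalg.norm(m - m0) < merge_tol for m0 in modes):
            break                             # an ever-found mode reappeared
        modes.append(m)
        d2 = np.sum((data - m) ** 2, axis=1)
        w *= 1.0 - np.exp(-d2 / (2.0 * h ** 2))   # suppress samples near m
    return modes
```

On well-separated clusters this visits each significant mode once instead of running one mean-shift per sample, which is where the claimed linear complexity in the number of modes comes from.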

  18. Illustration • Four iterations only! • [Figure: prior weight curve]

  19. Advantages • Initialization invariant • Highly efficient, with complexity linear in the number of significant modes

  20. Numerical Test: Image Segmentation

  21. Application: Color Constancy • Linear render model [Manduchi 2006]: color surfaces under a “canonical” illuminant; the color transformation follows a Gaussian distribution • Test image contains the color surfaces under illumination change • Problem: how to estimate the possibly existing render vectors?

  22. Existing Method • MAP formulation [Manduchi 2006]: possibly existing render vectors are estimated with an EM algorithm • Limitations: • the number of render vectors should be known a priori • all the render vectors should be properly initialized

  23. Convex Kernel Based Solution • Prior weight: compensation accuracy of pixel i on training surface j • Ada-MS for sequential mode-seeking

  24. Results • Compensated image • Ground-truth mapping • Prior weight images

  25. Summary • Put mean shift on proper grounds: HQ optimization • Connections with previous viewpoints: bounding optimization and EM • Fast multiple data mode-seeking (linear complexity and initialization invariant) • Very simple to implement • We hope to see more applications
