Similarity and Difference Pete Barnum January 25, 2006 Advanced Perception
Visual Similarity • Color • Texture
Uses for Visual Similarity Measures • Classification • Is it a horse? • Image Retrieval • Show me pictures of horses. • Unsupervised segmentation • Which parts of the image are grass?
Histogram Example Slides from Dave Kauchak
Cumulative Histogram • Figure: normal histogram vs. cumulative histogram • Slides from Dave Kauchak
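A cumulative histogram is simply the running sum of the ordinary histogram's bin counts. A minimal NumPy sketch (the random 8-bit image and the 256-bin range are illustrative assumptions, not from the slides):

```python
import numpy as np

# Illustrative input: a small image of 8-bit gray values.
image = np.random.randint(0, 256, size=(64, 64))

# Normal histogram: how many pixels fall into each of 256 gray-value bins.
hist, _ = np.histogram(image, bins=256, range=(0, 256))

# Cumulative histogram: running sum of the normal histogram's counts.
cum_hist = np.cumsum(hist)
```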
Joint vs Marginal Histograms Images from Dave Kauchak
Higher Dimensional Histograms • Histograms generalize to any number of features • Colors • Textures • Gradient • Depth
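As a sketch of how histograms extend to several features at once, a joint color histogram can be built by binning all channels together; marginals are then recovered by summing out the other axes. The RGB data, 8 bins per channel, and value ranges below are illustrative assumptions:

```python
import numpy as np

# Illustrative input: N pixels with three color channels in [0, 256).
pixels = np.random.randint(0, 256, size=(1000, 3))

# Joint 3-D histogram over (R, G, B): an 8 x 8 x 8 array of counts.
joint_hist, _ = np.histogramdd(pixels, bins=(8, 8, 8), range=((0, 256),) * 3)

# Marginal histogram of the first channel: sum the joint counts over the rest.
marginal_r = joint_hist.sum(axis=(1, 2))
```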
Distance Metrics • For points in space: y − x = Euclidean distance of 5 units • For gray values: y − x = gray-value distance of 50 values • For two histograms: y − x = ?
Bin-by-bin • Compares only corresponding bins (figure: example matches marked "Bad!" and "Good!")
Cross-bin • Also compares non-corresponding bins (figure: example matches marked "Bad!" and "Good!")
Distance Measures • Heuristic • Minkowski-form • Weighted-Mean-Variance (WMV) • Nonparametric test statistics • χ² (Chi-square) • Kolmogorov-Smirnov (KS) • Cramér/von Mises (CvM) • Information-theory divergences • Kullback-Leibler (KL) • Jeffrey divergence (JD) • Ground distance measures • Histogram intersection • Quadratic form (QF) • Earth Mover's Distance (EMD)
Heuristic Histogram Distances • Minkowski-form distance Lp • Special cases: • L1: absolute, cityblock, or Manhattan distance • L2: Euclidean distance • L∞: maximum value distance Slides from Dave Kauchak
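A minimal sketch of the Minkowski-form (Lp) distance between two histograms that share the same binning; the helper name is an assumption for illustration:

```python
import numpy as np

def minkowski_distance(h, k, p):
    """L_p distance between two histograms with identical binning."""
    h, k = np.asarray(h, dtype=float), np.asarray(k, dtype=float)
    if np.isinf(p):
        return np.max(np.abs(h - k))            # L-infinity: maximum value distance
    return np.sum(np.abs(h - k) ** p) ** (1.0 / p)

# p = 1 gives the Manhattan distance, p = 2 the Euclidean, p = np.inf the maximum.
```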
More Heuristic Distances • Weighted-Mean-Variance • Uses only minimal information about each distribution (its mean and variance) Slides from Dave Kauchak
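A sketch of one common weighted-mean-variance formulation, in which each sample is reduced to its mean and standard deviation; the normalizers sigma_mu and sigma_sigma (e.g., database-wide standard deviations of those statistics) must be supplied by the caller, and all names here are illustrative assumptions:

```python
import numpy as np

def wmv_distance(x, y, sigma_mu, sigma_sigma):
    """Weighted-Mean-Variance distance between two 1-D feature samples."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    # Only the mean and standard deviation of each sample are compared,
    # each normalized by a caller-supplied scale.
    return (abs(x.mean() - y.mean()) / sigma_mu
            + abs(x.std() - y.std()) / sigma_sigma)
```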
Nonparametric Test Statistics • χ² • Measures the underlying similarity of two samples Images from Kein Folientitel
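A sketch of the χ² distance between two histograms with identical binning; the symmetric (h + k) denominator and the small epsilon guarding empty bins are common implementation conventions, not taken from the slides:

```python
import numpy as np

def chi_square_distance(h, k, eps=1e-10):
    """Chi-square distance between two histograms with identical binning."""
    h, k = np.asarray(h, dtype=float), np.asarray(k, dtype=float)
    return 0.5 * np.sum((h - k) ** 2 / (h + k + eps))
```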
Nonparametric Test Statistics • Kolmogorov-Smirnov distance • Measures the underlying similarity of two samples • Only for 1D data
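A sketch of the Kolmogorov-Smirnov distance for 1-D histograms: the maximum absolute difference between their normalized cumulative histograms (the helper name is an assumption):

```python
import numpy as np

def ks_distance(h, k):
    """Kolmogorov-Smirnov distance between two 1-D histograms (same binning)."""
    h, k = np.asarray(h, dtype=float), np.asarray(k, dtype=float)
    cdf_h = np.cumsum(h) / h.sum()      # normalized cumulative histograms
    cdf_k = np.cumsum(k) / k.sum()
    return np.max(np.abs(cdf_h - cdf_k))
```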
Nonparametric Test Statistics • Cramér/von Mises • Euclidean distance between the cumulative distributions • Only for 1D data
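A sketch in the same spirit for Cramér/von Mises: the squared Euclidean distance between the two cumulative histograms:

```python
import numpy as np

def cvm_distance(h, k):
    """Cramer/von Mises distance between two 1-D histograms (same binning)."""
    h, k = np.asarray(h, dtype=float), np.asarray(k, dtype=float)
    cdf_h = np.cumsum(h) / h.sum()
    cdf_k = np.cumsum(k) / k.sum()
    return np.sum((cdf_h - cdf_k) ** 2)
```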
Information Theory • Kullback-Leibler • Cost of encoding one distribution as another
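A sketch of the Kullback-Leibler divergence, treating each histogram as a probability distribution; the epsilon guarding empty bins is an implementation choice, not part of the definition:

```python
import numpy as np

def kl_divergence(h, k, eps=1e-10):
    """D(h || k): extra cost of encoding samples of h with a code built for k."""
    h = np.asarray(h, dtype=float); h = h / h.sum()
    k = np.asarray(k, dtype=float); k = k / k.sum()
    return np.sum(h * np.log((h + eps) / (k + eps)))
```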
Information Theory • Jeffrey divergence • Like KL, but symmetric and more numerically stable
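A sketch of the Jeffrey divergence, which compares both histograms against their bin-wise mean and is therefore symmetric and better behaved near empty bins than plain KL:

```python
import numpy as np

def jeffrey_divergence(h, k, eps=1e-10):
    """Jeffrey divergence: symmetric, numerically stabler relative of KL."""
    h = np.asarray(h, dtype=float); h = h / h.sum()
    k = np.asarray(k, dtype=float); k = k / k.sum()
    m = 0.5 * (h + k)                   # bin-wise mean distribution
    return (np.sum(h * np.log((h + eps) / (m + eps)))
            + np.sum(k * np.log((k + eps) / (m + eps))))
```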
Ground Distance • Histogram intersection • Good for partial matches
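A sketch of histogram intersection in the Swain-Ballard style, which counts the mass the two histograms have in common and so tolerates partial matches; turning it into a distance via 1 − intersection is a common convention, assumed here:

```python
import numpy as np

def histogram_intersection(h, k):
    """Normalized histogram intersection: shared mass relative to k's total."""
    h, k = np.asarray(h, dtype=float), np.asarray(k, dtype=float)
    return np.sum(np.minimum(h, k)) / np.sum(k)

def intersection_distance(h, k):
    """Convert the similarity score into a distance-like value in [0, 1]."""
    return 1.0 - histogram_intersection(h, k)
```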
Ground Distance • Quadratic form • Heuristic Images from Kein Folientitel
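A sketch of the quadratic-form distance, where a bin-similarity matrix A encodes how perceptually close different bins are; the exponential-of-bin-distance choice of A below is an illustrative assumption:

```python
import numpy as np

def quadratic_form_distance(h, k, A):
    """d(h, k) = sqrt((h - k)^T A (h - k)) for a bin-similarity matrix A."""
    d = np.asarray(h, dtype=float) - np.asarray(k, dtype=float)
    return np.sqrt(d @ A @ d)

# Example similarity matrix: nearby bins are treated as more similar.
n_bins = 8
centers = np.arange(n_bins)
A = np.exp(-np.abs(centers[:, None] - centers[None, :]) / 2.0)
```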
Ground Distance • Earth Movers Distance Images from Kein Folientitel
Summary Images from Kein Folientitel
The Difference? • (amount moved)
The Difference? • (amount moved) × (distance moved)
Linear programming • P has m clusters, Q has n clusters • Minimize the total cost over all movements: Σ (distance moved) × (amount moved)
Constraints • 1. Move "earth" only from P to Q • 2. Cannot send more "earth" than there is • 3. Q cannot receive more "earth" than it can hold • 4. As much "earth" as possible must be moved • (a linear-programming sketch of these constraints follows below)
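Putting the pieces together: the EMD is the minimum total (amount moved) × (distance moved) subject to the four constraints, normalized by the amount actually moved. A sketch using scipy.optimize.linprog on small 1-D signatures; the cluster positions, weights, and function name are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def emd_1d(p_pos, p_wts, q_pos, q_wts):
    """Earth Mover's Distance between two 1-D signatures, solved as an LP."""
    p_pos, p_wts = np.asarray(p_pos, float), np.asarray(p_wts, float)
    q_pos, q_wts = np.asarray(q_pos, float), np.asarray(q_wts, float)
    m, n = len(p_wts), len(q_wts)

    # Ground distance between every pair of cluster centers; one flow
    # variable f_ij per pair, laid out row-major.
    D = np.abs(np.subtract.outer(p_pos, q_pos))            # shape (m, n)
    cost = D.ravel()

    # Constraint 2: each P cluster sends at most its weight (sum_j f_ij <= p_i).
    A_send = np.kron(np.eye(m), np.ones(n))
    # Constraint 3: each Q cluster receives at most its capacity (sum_i f_ij <= q_j).
    A_recv = np.kron(np.ones(m), np.eye(n))
    # Constraint 4: as much earth as possible must be moved.
    total_flow = min(p_wts.sum(), q_wts.sum())

    res = linprog(cost,
                  A_ub=np.vstack([A_send, A_recv]),
                  b_ub=np.concatenate([p_wts, q_wts]),
                  A_eq=np.ones((1, m * n)), b_eq=[total_flow],
                  bounds=(0, None),        # constraint 1: non-negative flow, P to Q only
                  method="highs")
    # Total work divided by total flow actually moved.
    return (cost @ res.x) / res.x.sum()

# Illustrative signatures: (positions, weights) for P and Q.
print(emd_1d([0.0, 1.0], [0.4, 0.6], [0.5, 2.0], [0.5, 0.5]))
```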
Advantages • Uses signatures • Nearness measure without quantization • Partial matching • A true metric
Disadvantage • High computational cost • Not effective for unsupervised segmentation, etc.
Examples • Using • Color (CIE Lab) • Color + XY • Texture (Gabor filter bank)
Image Lookup • L1 distance • Jeffrey divergence • χ² statistic • Quadratic form distance • Earth Mover's Distance
Concluding thought • Which distance measure is best? It depends on the application.