
Learning Image Similarities via Probabilistic Feature Matching



  1. Learning Image Similarities via Probabilistic Feature Matching Ziming Zhang*, Ze-Nian Li, Mark Drew School of Computing Science, Simon Fraser University, Vancouver, B.C., Canada {zza27, li, mark}@cs.sfu.ca *This work was done while the author was at SFU.

  2. Outline • Introduction • Probabilistic-matching based similarity learning • Probabilistic feature matching function • Probabilistic feature matching learning • Experiments • Conclusion

  3. Introduction • Object-based image similarity • Ideally, two images should have a higher similarity if they contain similar objects. • Feature matching • A natural way to measure image similarity • Many different criteria exist • In this talk, we match features based only on their appearance information

  4. (figure-only slide)

  5. Introduction • Several relevant feature matching approaches • Summation kernel • Max-selection kernel • Optimal assignment kernel
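For concreteness, the first two kernels can be sketched directly from the matrix of pairwise feature similarities. The toy similarity values and the unnormalized sums below are illustrative assumptions; exact normalization varies by formulation:

```python
import numpy as np

def summation_kernel(K):
    """Summation kernel: sum all pairwise feature similarities.
    K is the |X| x |Y| matrix of similarities k(x_i, y_j)."""
    return float(K.sum())

def max_selection_kernel(K):
    """Max-selection kernel: each feature is matched only to its most
    similar counterpart in the other image (applied in both directions)."""
    return float(K.max(axis=1).sum() + K.max(axis=0).sum())

# toy pairwise similarities between 2 features of X and 3 features of Y
K = np.array([[0.9, 0.1, 0.2],
              [0.3, 0.8, 0.4]])

print(summation_kernel(K))      # 0.9+0.1+0.2+0.3+0.8+0.4 = 2.7
print(max_selection_kernel(K))  # row maxes (0.9+0.8) + column maxes (0.9+0.8+0.4) = 3.8
```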

  6. Introduction

  7. Introduction • Our approach is a generalization of a family of similarity learning approaches, including the three above. • Similarity matrix ≠ kernel matrix • A kernel matrix can be considered a special similarity matrix (i.e. symmetric positive semi-definite) • Classification with Support Vector Machines (SVMs)

  8. Probabilistic-matching based similarity learning • How to learn these feature matching probabilities?

  9. Feature Matching Function • Given two images X={x1,…,x|X|} and Y={y1,…,y|Y|}, a feature matching function α can be defined over all feature pairs (xi, yj) • The matching processes of the Summation, Max-selection, and Optimal Assignment Kernels can be expressed with feature matching functions: • Ksum: every feature pair is matched • Kmax: each feature is matched only to its most similar counterpart • KOA: features are matched one-to-one so that the total similarity is maximized
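The matching-function view can be sketched as an α-weighted sum of pairwise feature similarities; the toy similarity matrix and the per-row/uniform normalization of α below are illustrative assumptions:

```python
import numpy as np

def similarity(alpha, K):
    """Image similarity as an alpha-weighted sum of pairwise feature
    similarities: sum_ij alpha_ij * k(x_i, y_j)."""
    return float((alpha * K).sum())

# toy pairwise feature similarities k(x_i, y_j) for |X| = |Y| = 2
K = np.array([[0.9, 0.1],
              [0.3, 0.8]])

# Ksum-style matching: every pair matched with equal weight
alpha_sum = np.ones_like(K) / K.size

# Kmax-style matching (one direction): each x_i matches only its best y_j
alpha_max = np.zeros_like(K)
alpha_max[np.arange(K.shape[0]), K.argmax(axis=1)] = 1.0 / K.shape[0]

print(similarity(alpha_sum, K))  # (0.9+0.1+0.3+0.8)/4 = 0.525
print(similarity(alpha_max, K))  # (0.9+0.8)/2 = 0.85
```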

  10. Probabilistic Feature Matching Function • Our probabilistic feature matching function α is defined in the vector space covered by a convex set, with constraints on each matching probability, the total matching probability of one feature, and the total matching probability of all features

  11. Probabilistic Feature Matching Learning • Data-dependent optimization: the objective combines the image similarity with a distribution-sparseness term (regularizer)

  12. Probabilistic Feature Matching Learning • Theorems • Consider max f(x) over x ∈ X, where f(x) is convex and X is a closed convex set. If the optimum exists, a boundary point of X is the optimum. • If a convex function f(x) attains its maximum on a convex polyhedron X with some extreme points, then this maximum is attained at an extreme point of X. • Relation to Ksum, Kmax, and KOA • Ksum: C=+∞ and H={i,j} • Kmax: C=0 and H={i}, or C=0 and H={j} • KOA: C=0 and H={i,j}
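The extreme-point theorem explains why the optimal-assignment kernel's matching is a permutation (an extreme point of the matching polytope). A sketch using SciPy's Hungarian solver; the toy similarity values are assumptions:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def optimal_assignment_kernel(K):
    """K_OA: one-to-one feature matching maximizing total similarity.
    The solution is a (partial) permutation, i.e. an extreme point
    of the polytope of doubly sub-stochastic matchings."""
    rows, cols = linear_sum_assignment(K, maximize=True)
    return float(K[rows, cols].sum())

K = np.array([[0.9, 0.1, 0.2],
              [0.3, 0.8, 0.4]])
print(optimal_assignment_kernel(K))  # matches (x0,y0) and (x1,y1): 0.9+0.8 = 1.7
```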

  13. Probabilistic Feature Matching Learning • Proposition • For two images X and Y, both the sparseness of α and their similarity decrease monotonically as the parameter C increases.
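The proposition's trend can be illustrated with a hypothetical entropy-style regularizer, under which the matching distribution becomes a softmax of the similarities with temperature C. This is only an illustrative stand-in, not the paper's actual optimization, but it reproduces the behavior: as C grows, α spreads out (less sparse) and the resulting similarity drops toward the dense average.

```python
import numpy as np

def soft_matching(K, C):
    """Hypothetical entropy-regularized matching: alpha is a softmax of
    the pairwise similarities with temperature C. As C -> 0 the matching
    concentrates on the best pair (sparse, high similarity); as C -> inf
    it spreads uniformly (dense, lower similarity)."""
    logits = K.ravel() / max(C, 1e-12)
    logits -= logits.max()        # numerical stability
    alpha = np.exp(logits)
    alpha /= alpha.sum()          # total matching probability = 1
    return alpha.reshape(K.shape)

K = np.array([[0.9, 0.1],
              [0.3, 0.8]])
for C in (0.01, 0.1, 10.0):
    alpha = soft_matching(K, C)
    print(C, (alpha * K).sum())   # similarity decreases as C increases
```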

  14. Probabilistic Feature Matching Learning

  15. Experiments • Datasets: Graz-01 and Graz-02 • Descriptor • SIFT + dense sampling • Image representation • 3×3 spatial Bag-of-Words histograms with 200 codewords • Feature similarity: RBF kernel with χ2 distance • 50 runs
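The feature-similarity measure used in the experiments can be sketched as follows; the histogram values, the `gamma` parameter, and the small epsilon guard are illustrative assumptions:

```python
import numpy as np

def chi2_rbf(h1, h2, gamma=1.0):
    """RBF kernel on the chi-squared distance between two
    (L1-normalized) histograms: exp(-gamma * chi2(h1, h2))."""
    eps = 1e-10  # avoid division by zero on empty bins
    chi2 = 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
    return float(np.exp(-gamma * chi2))

# toy 3-bin Bag-of-Words histograms
h1 = np.array([0.5, 0.3, 0.2])
h2 = np.array([0.4, 0.4, 0.2])
print(chi2_rbf(h1, h2))  # close to 1 for similar histograms
```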

  16. Experiments • Graz-01 (a) PFM1 with H={i,j} (b) PFM2 with H={i} or H={j} (c) PFM3 with H=∅

  17. Experiments Table 1. Comparison results between different approaches on Graz-01 (%) [1] Lazebnik et al., “Beyond bags of features: Spatial pyramid matching for recognizing natural scene categories,” in CVPR’06. [2] Ling and Soatto, “Proximity distribution kernels for geometric context in category recognition,” in ICCV’07.

  18. Experiments • Graz-02 (a) PFM1 with H={i,j} (b) PFM2 with H={i} or H={j} (c) PFM3 with H=∅

  19. Experiments Table 2. Comparison results between different approaches on Graz-02 (%) Opelt et al., “Generic object recognition with boosting,” PAMI, 2006.

  20. Conclusion • Probabilistic feature matching scheme • A generalization of a rich family of probabilistic feature matching approaches • Easy to control the sparseness of the matching probability distributions and their corresponding image similarities

  21. Thank you !!!
