Segmentation

  1. Segmentation

  2. General Ideas
  Segmentation breaks an image into groups over space and/or time. Tokens are the things that are grouped (pixels, points, surface elements, etc.).
  • Bottom-up segmentation: tokens belong together because they are locally coherent.
  • Top-down segmentation: tokens are grouped because they lie on the same object.
  • Bottom-up and top-down segmentation need not be mutually exclusive.

  3. Segmentation as Clustering

  4. Segmentation by Clustering
  Cluster together tokens (pixels, points, etc.) that belong together.
  • Agglomerative clustering: attach each token to the cluster it is closest to; repeat.
  • Divisive clustering: split a cluster along its best boundary; repeat.
  • Point-cluster distance options: single-link, complete-link, and group-average clustering.
  • Dendrograms yield a picture of the output as the clustering process continues.
  * From Marc Pollefeys COMP 256 2003
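
SciPy's hierarchical clustering routines implement these agglomerative variants directly; below is a minimal sketch on toy 2-D tokens (the data and all parameter choices are illustrative):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy tokens: 2-D feature vectors (e.g., pixel position or color).
rng = np.random.default_rng(0)
tokens = np.vstack([rng.normal(0.0, 0.2, (20, 2)),
                    rng.normal(2.0, 0.2, (20, 2))])

# Agglomerative clustering: every token starts as its own cluster and the
# two closest clusters are merged repeatedly. `method` picks the
# point-cluster distance: 'single', 'complete', or 'average' link.
Z = linkage(tokens, method='single')        # Z encodes the dendrogram

# Cut the dendrogram into a flat segmentation with 2 groups.
labels = fcluster(Z, t=2, criterion='maxclust')
```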

  5. Simple clustering algorithms * From Marc Pollefeys COMP 256 2003

  6. * From Marc Pollefeys COMP 256 2003

  7. Drawbacks
  • Too many …
  • One of them: simple clustering methods are greedy algorithms; there is no objective function being optimized.
  • Alternative approach: design an objective function that expresses how good the representation is, then build an algorithm that optimizes it.

  8. K-Means
  • Assume we know there are K clusters.
  • Each cluster has a center c_i.
  • The objective function is Φ = Σ_{i ∈ clusters} Σ_{j ∈ cluster i} ||x_j − c_i||².
  • Choose cluster centers and point-cluster allocations to minimize Φ.
  • We can't do this by exhaustive search, because there are too many possible allocations.

  9. K-Means Algorithm
  • Fix cluster centers; allocate each point to the closest cluster.
  • Fix the allocation; compute the best (mean) cluster centers.
  • Alternate the two steps until the allocation stops changing.
  • x could be any set of features for which we can compute a distance (careful about scaling).
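
A minimal NumPy sketch of these two alternating steps (the function name and defaults are illustrative); it also evaluates the objective Φ from slide 8:

```python
import numpy as np

def kmeans(x, k, iters=20, seed=0):
    """Alternate the two steps above: allocate points, then re-center.

    x is an (n, d) array of features; remember to scale features
    comparably before clustering.
    """
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), size=k, replace=False)]  # init from data
    for _ in range(iters):
        # Step 1: fix centers, allocate each point to the closest center.
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        # Step 2: fix allocation, recompute each center as its cluster mean
        # (keep the old center if a cluster went empty).
        centers = np.array([x[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    phi = d2[np.arange(len(x)), labels].sum()  # objective from slide 8
    return labels, centers, phi
```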

  10. K-Means * From Marc Pollefeys COMP 256 2003

  11. Results of K-Means Clustering
  [Figure: original image, clusters on intensity, and clusters on color; K-means clustering using intensity alone and color alone.]
  * From Marc Pollefeys COMP 256 2003

  12. Graph Theoretic Clustering

  13. Graph Theoretic Clustering
  • Represent the tokens (one per pixel) using a weighted graph; the edge weights form an affinity matrix.
  • Cut up this graph to get subgraphs with strong interior links and weaker exterior links.
  • Application to vision originated with Prof. Malik at Berkeley.

  14. Minimum Cut and Clustering * From Khurram Hassan-Shafique CAP5415 Computer Vision 2003

  15. Measuring Affinity
  • Intensity: aff(x, y) = exp(−||I(x) − I(y)||² / (2σ_I²))
  • Distance: aff(x, y) = exp(−||x − y||² / (2σ_d²))
  • Texture: aff(x, y) = exp(−||c(x) − c(y)||² / (2σ_t²)), where c(·) collects filter outputs
  * From Marc Pollefeys COMP 256 2003
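
All three affinities share the same Gaussian form, so one sketch covers them: `features` can hold intensities, colors, or texture filter responses. The function name and σ defaults are illustrative, and the dense n × n matrix is only practical for small images:

```python
import numpy as np

def affinity_matrix(features, positions, sigma_f=0.1, sigma_d=10.0):
    """Gaussian affinities combining feature similarity and proximity.

    features  : (n, d) per-pixel descriptors (intensity, color, or
                texture filter outputs c(x))
    positions : (n, 2) pixel coordinates
    The sigmas set the scale; as slide 22 notes, scale strongly
    affects the affinities and hence the segmentation.
    """
    df2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    dx2 = ((positions[:, None, :] - positions[None, :, :]) ** 2).sum(-1)
    return np.exp(-df2 / (2 * sigma_f**2)) * np.exp(-dx2 / (2 * sigma_d**2))
```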

  16. Extracting a Single Good Cluster
  • Simplest idea: we want a vector a giving the association between each element and a cluster.
  • We want elements within this cluster to, on the whole, have strong affinity with one another.
  • We could maximize aᵀAa, but we need the constraint aᵀa = 1.
  • This is an eigenvalue problem: choose the eigenvector of A with the largest eigenvalue.
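
Because maximizing aᵀAa subject to aᵀa = 1 is a Rayleigh quotient, the maximizer is the eigenvector with the largest eigenvalue. A minimal NumPy sketch (the sign convention is arbitrary):

```python
import numpy as np

def leading_cluster(A):
    """Association weights a maximizing a^T A a subject to a^T a = 1."""
    w, V = np.linalg.eigh(A)    # symmetric A: eigenvalues in ascending order
    a = V[:, -1]                # eigenvector of the largest eigenvalue
    if a.sum() < 0:             # eigenvectors have arbitrary sign
        a = -a
    return a  # elements with large a belong strongly to the cluster
```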

  17. Example
  [Figure: the points, the affinity matrix, and its leading eigenvector.]

  18. Spectral Clustering Overview: Data → Similarities → Block-Detection
  * Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group Stanford University

  19. Eigenvectors and Blocks
  • Block matrices have block eigenvectors: for an exact two-block affinity matrix the eigensolver returns λ₁ = 2, λ₂ = 2, λ₃ = 0, λ₄ = 0.
  • Near-block matrices have near-block eigenvectors: with small cross-block entries the eigensolver returns λ₁ = 2.02, λ₂ = 2.02, λ₃ = −0.02, λ₄ = −0.02.
  * Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group Stanford University
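
This is easy to verify numerically. The sketch below uses an illustrative 0.01 perturbation, so the perturbed eigenvalues land near, rather than exactly at, the values quoted above:

```python
import numpy as np

# Exact two-block affinity matrix: no cross-block affinity.
A = np.array([[1., 1., 0., 0.],
              [1., 1., 0., 0.],
              [0., 0., 1., 1.],
              [0., 0., 1., 1.]])
w, V = np.linalg.eigh(A)
print(np.round(w, 2))            # [0. 0. 2. 2.]
print(np.round(V[:, -2:], 2))    # top eigenvectors are constant per block

# Near-block matrix: weak cross-block affinity barely moves the
# spectrum, so the top eigenvectors still reveal the blocks.
B = A.copy()
B[A == 0.0] = 0.01
print(np.round(np.linalg.eigh(B)[0], 2))   # still close to [0. 0. 2. 2.]
```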

  20. Spectral Space
  • Can put items into blocks by their eigenvector components.
  • Clusters are clear regardless of row ordering.
  [Figure: items plotted by their (e1, e2) coordinates separate into clear clusters.]
  * Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group Stanford University
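
A minimal sketch of the embedding, assuming a symmetric affinity matrix A as before: each item gets coordinates from the top eigenvectors, and reordering the rows of A merely reorders the embedded points:

```python
import numpy as np

def spectral_embed(A, k=2):
    """Embed each item by its components in the top-k eigenvectors of A.

    Items from the same block land close together in this spectral
    space regardless of how the rows of A were ordered; the embedded
    points can then be grouped with k-means.
    """
    w, V = np.linalg.eigh(A)
    return V[:, -k:]       # (n, k): one spectral coordinate row per item
```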

  21. Extracting Weights for a Set of Clusters
  [Figure: the first eigenvector and the three next eigenvectors.]

  22. Scale affects affinity * From Marc Pollefeys COMP 256 2003

  23. * From Marc Pollefeys COMP 256 2003

  24. Drawbacks of Minimum Cut
  • The weight of a cut is directly proportional to the number of edges in the cut, so minimum cut tends to cut off small, isolated sets of nodes rather than the ideal partition.
  [Figure: cuts with less weight than the ideal cut, versus the ideal cut.]
  * Slide from Khurram Hassan-Shafique CAP5415 Computer Vision 2003

  25. Normalized Cuts
  • The minimum-cut criterion evaluates within-cluster similarity, but not across-cluster difference.
  • Instead, we'd like to maximize the within-cluster similarity compared to the across-cluster difference.
  • Write the graph as V, one cluster as A and the other as B.
  • Minimize NCut(A, B) = cut(A, B)/assoc(A, V) + cut(A, B)/assoc(B, V), where cut(A, B) is the sum of weights that straddle A and B, and assoc(A, V) is the sum of all edge weights with one end in A.
  • I.e., construct A and B such that their within-cluster similarity is high compared to their association with the rest of the graph.
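
The criterion is straightforward to score in code. A sketch for a given bipartition (the boolean-mask convention is illustrative):

```python
import numpy as np

def ncut_value(W, in_A):
    """Normalized-cut cost of a bipartition (A, B) of the graph W.

    W    : (n, n) symmetric affinity matrix
    in_A : boolean mask, True for the nodes in A
    """
    in_B = ~in_A
    cut = W[np.ix_(in_A, in_B)].sum()     # weights straddling A and B
    assoc_A = W[in_A, :].sum()            # all edge weights with an end in A
    assoc_B = W[in_B, :].sum()
    return cut / assoc_A + cut / assoc_B  # small value = good partition
```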

  26. Finding the Minimum Normalized Cut
  • Finding the minimum normalized cut is NP-hard.
  • Polynomial-time approximations are generally used for segmentation.
  * Slide from Khurram Hassan-Shafique CAP5415 Computer Vision 2003

  27. Normalized Cuts
  • Write a vector y whose elements are 1 if the item is in A, −b if it is in B.
  • Write the affinity matrix of the graph as W, and the matrix which has the row sums of W on its diagonal as D; 1 is the vector with all ones.
  • The criterion becomes min_y (yᵀ(D − W)y) / (yᵀDy),
  • and we have the constraint yᵀD1 = 0.
  • This is hard to solve directly, because y's values are quantized.

  28. Normalized Cuts
  • Instead, relax y to real values and solve the generalized eigenvalue problem (D − W)y = λDy,
  • which gives the eigenvector with the second-smallest eigenvalue as the solution.
  • Now look for a quantization threshold that optimizes the criterion, i.e. all components of y above that threshold go to 1, all below go to −b.
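
A sketch of this relaxation using SciPy's generalized symmetric eigensolver, reusing ncut_value from the sketch above; the search over thresholds is the quantization step just described:

```python
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(W):
    """Relaxed normalized cut of a symmetric affinity matrix W."""
    D = np.diag(W.sum(axis=1))
    # scipy.linalg.eigh(a, b) solves a y = lambda b y with ascending
    # eigenvalues, so this is exactly (D - W) y = lambda D y.
    vals, vecs = eigh(D - W, D)
    y = vecs[:, 1]                       # second-smallest eigenvector
    # Quantization: pick the threshold on y that minimizes the
    # discrete NCut value (ncut_value is defined in the sketch above).
    best_t = min(np.unique(y)[:-1], key=lambda t: ncut_value(W, y > t))
    return y > best_t                    # boolean mask for cluster A
```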

  29. Algorithm
  • Compute the matrices W and D.
  • Solve (D − W)y = λDy for the eigenvectors with the smallest eigenvalues.
  • Use the eigenvector with the second-smallest eigenvalue to bipartition the graph.
  • Recursively partition the segmented parts if necessary.
  * Slide from Khurram Hassan-Shafique CAP5415 Computer Vision 2003
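
A sketch of the recursion, building on the two sketches above; the stopping thresholds min_size and max_ncut are illustrative choices, not values prescribed by the slides:

```python
import numpy as np

def recursive_ncut(W, idx=None, min_size=5, max_ncut=0.2):
    """Recursively bipartition until segments are small or cuts get poor."""
    idx = np.arange(len(W)) if idx is None else idx
    if len(idx) < 2 * min_size:
        return [idx]                       # too small to split further
    Wsub = W[np.ix_(idx, idx)]
    mask = ncut_bipartition(Wsub)          # from the sketch above
    if mask.all() or not mask.any():       # degenerate split
        return [idx]
    if ncut_value(Wsub, mask) > max_ncut:  # cut too expensive: keep whole
        return [idx]
    return (recursive_ncut(W, idx[mask], min_size, max_ncut) +
            recursive_ncut(W, idx[~mask], min_size, max_ncut))
```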

  30. Figure from “Image and video segmentation: the normalised cut framework”, by Shi and Malik, 1998 * Slide from Khurram Hassan-Shafique CAP5415 Computer Vision 2003

  31. Figure from “Normalized cuts and image segmentation,” Shi and Malik, 2000 * Slide from Khurram Hassan-Shafique CAP5415 Computer Vision 2003

  32. PROJECT 7
  • Implement image segmentation using Normalized Cut*. Use intensity, color, and texture for similarity.
  • Implement K-means using the same tokens.
  • Test both algorithms on the standard collection of images from Berkeley: http://www.cs.berkeley.edu/projects/vision/grouping/segbench/
  * J. Shi and J. Malik, “Normalized Cuts and Image Segmentation,” IEEE Trans. on PAMI, Aug. 2000.
