
Design and Perceptual Validation of Performance Measures for Salient Object Segmentation


Presentation Transcript


  1. Design and Perceptual Validation of Performance Measures for Salient Object Segmentation
  Vida Movahedi, James H. Elder
  Centre for Vision Research, York University, Canada

  2. Evaluation of Salient Object Segmentation
  Source: Berkeley Segmentation Dataset
  Centre for Vision Research, York University

  3. Evaluation of Salient Object Segmentation
  • How do we measure success?

  4. Existing literature
  • Salient object segmentation: [Liu07, Zhang07, Park07, Zhuang09, Achanta09, Pirnog09, …]
  • Evaluation of salient object segmentation algorithms: [Ge06, ?]
  • Evaluation of segmentation algorithms: [Huang95, Zhang96, Martin01, Monteiro06, Goldmann08, Estrada09]

  5. Contributions
  • Analysis of previously suggested measures
  • The Contour Mapping measure (CM), based on order-preserving matching
  • A new dataset of salient objects (SOD)
  • Psychophysics experiments evaluating the above measures
  • The matching paradigm in precision and recall measures

  6. Evaluation measures in the literature
  • Region-based error measures: based on false-positive / false-negative pixels [Young05], [Ge06], [Goldmann08], ...
  • Boundary-based error measures: based on the distance between boundaries [Huttenlocher93], [Monteiro06], ...
  • Mixed measures: based on the distance of misclassified pixels to the boundaries [Young05], [Monteiro06], ...

  7. Region-based error measures [Young05], [Ge06], [Goldmann08], ...
  • A and B are two boundaries
  • R_A is the region corresponding to boundary A, and |R_A| is the area of this region
  • The error is computed from false-positive pixels (in R_A but not R_B) and false-negative pixels (in R_B but not R_A)
  • Limitation: not sensitive to shape differences
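  As a concrete illustration, a region-based error can be sketched as below. The slide does not reproduce its exact formula, so the normalisation here (misclassified area over the union of the two regions) is one common variant, not necessarily the one used in the talk; the function name is illustrative.

  ```python
  import numpy as np

  def region_error(mask_a, mask_b):
      """One common region-based error: misclassified area over union.

      mask_a, mask_b are boolean arrays marking the regions R_A and R_B
      enclosed by boundaries A and B.
      """
      a = np.asarray(mask_a, dtype=bool)
      b = np.asarray(mask_b, dtype=bool)
      false_pos = np.logical_and(a, ~b).sum()   # pixels in R_A but not R_B
      false_neg = np.logical_and(~a, b).sum()   # pixels in R_B but not R_A
      union = np.logical_or(a, b).sum()
      return (false_pos + false_neg) / union if union else 0.0
  ```

  Because the error depends only on pixel counts, two regions of very different shape but similar overlap produce the same score, which is the insensitivity noted above.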

  8. Boundary-based error measures [Huang95], [Huttenlocher93], [Monteiro06], ...
  • A and B are two boundaries
  • The distance of a point a on A from B is d(a, B) = min over b in B of ||a - b||
  • Hausdorff distance: H(A, B) = max( max over a in A of d(a, B), max over b in B of d(b, A) )
  • Mean distance: the average of d(a, B) over all points a on A (and symmetrically of d(b, A))
  • Limitation: not sensitive to shape differences
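  The two boundary-based distances named on the slide can be sketched directly from their standard definitions (function names are mine; boundaries are given as point lists):

  ```python
  import numpy as np

  def _dist_to_set(p, pts):
      # d(p, S) = min over q in S of ||p - q||
      return np.min(np.linalg.norm(pts - p, axis=1))

  def hausdorff(A, B):
      """Symmetric Hausdorff distance between two point sets."""
      A = np.asarray(A, float)
      B = np.asarray(B, float)
      return max(max(_dist_to_set(a, B) for a in A),
                 max(_dist_to_set(b, A) for b in B))

  def mean_distance(A, B):
      """Symmetric mean of nearest-point distances between two point sets."""
      A = np.asarray(A, float)
      B = np.asarray(B, float)
      d_ab = [_dist_to_set(a, B) for a in A]
      d_ba = [_dist_to_set(b, A) for b in B]
      return (sum(d_ab) + sum(d_ba)) / (len(A) + len(B))
  ```

  Both measures map each point to its nearest point on the other boundary without regard to ordering, which is why they too can miss shape differences.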

  9. Mixed error measures [Young05], [Monteiro06], ...
  • Penalize the over-detected (false-positive) and under-detected (false-negative) regions by their distances to the intersection
  • Limitation: not sensitive to shape differences

  10. Another example
  • Different shapes can still yield low errors under these measures

  11. Comparing two boundaries
  • The two boundaries need to follow each other
  • It is therefore not sufficient to map each point to the closest point on the other boundary
  • The ordering of mapped points must be preserved
  (Illustration: a small false-negative region and a small false-positive region)

  12. Order-preserving mapping
  • The order of mapped points on the two boundaries must be monotonically non-decreasing
  • Different levels of detail are allowed: one-to-one, many-to-one, and one-to-many mappings

  13. Contour Mapping measure
  • Given two contours A = a1 a2 … an and B = b1 b2 … bm, find the correct order-preserving mapping
  • Contour mapping error: the average distance between matched pairs of points
  • Related approaches: bimorphisms [Tagare02]; elastic matching [Geiger95, Basri98, Sebastian03, …]

  14. Contour Mapping measure
  • A dynamic programming implementation finds the optimum mapping
  • Closed contours: point indices are assigned cyclically
  • Based on string-correction techniques [Maes90]
  • Complexity: O(nm log m) for m ≤ n, where m and n are the numbers of points on the two boundaries
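  The dynamic-programming idea can be sketched for open contours as below. This is a simplified DTW-style recurrence that enforces monotonic ordering and permits one-to-one, many-to-one, and one-to-many matches; it is not the paper's cyclic, Maes-based implementation (which additionally optimises over cyclic start alignments), and the function name is mine.

  ```python
  import numpy as np

  def order_preserving_match_cost(A, B):
      """Total matched-pair distance under a monotone (order-preserving) mapping.

      Dividing by the number of matched pairs on the optimal path would give
      an average distance in the spirit of the CM measure.
      """
      A = np.asarray(A, float)
      B = np.asarray(B, float)
      n, m = len(A), len(B)
      # pairwise distances d[i, j] = ||a_i - b_j||
      d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
      cost = np.full((n, m), np.inf)
      cost[0, 0] = d[0, 0]
      for i in range(n):
          for j in range(m):
              if i == 0 and j == 0:
                  continue
              prev = min(
                  cost[i - 1, j] if i > 0 else np.inf,                 # many-to-one
                  cost[i, j - 1] if j > 0 else np.inf,                 # one-to-many
                  cost[i - 1, j - 1] if i > 0 and j > 0 else np.inf,   # one-to-one
              )
              cost[i, j] = d[i, j] + prev
      return cost[-1, -1]
  ```

  Because every step moves forward along both contours, crossings of the kind that defeat nearest-point measures are ruled out by construction.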

  15. Contour mapping example
  • Ground-truth boundary vs. algorithm boundary; matched pairs shown as line segments
  • CM = average length of the line segments connecting matched pairs

  16. Contour Mapping measure
  • Order-preserving mapping avoids the problems experienced by the other measures

  17. SOD: Salient Object Dataset
  • A dataset of salient objects, based on the Berkeley Segmentation Dataset (BSD) [Martin01]
  • 300 images, 7 subjects
  Source: Berkeley Segmentation Dataset; segmentations available in SOD

  18. Psychophysical experiments
  • Which error measure is closest to human judgements of shape similarity?
  • 9 subjects, 5 error measures:
    • Region Intersection (RI)
    • Mean Distance (MD)
    • Hausdorff Distance (HD)
    • Mixed Measure (MM)
    • Contour Mapping (CM)

  19. Psychophysical experiments
  • Experiment 1 (SOD): reference and test shapes all from SOD; the reference is a human segmentation, and the test cases are human segmentations
  • Experiment 2 (ALG): reference from SOD, test shapes algorithm-generated; the reference is a human segmentation, and the test cases are algorithm-generated

  20. Agreement with human subjects
  • On each trial, the human subject chooses Left or Right relative to the reference shape
  • An error measure M also chooses Left or Right, based on each test shape's error w.r.t. the reference shape
  • If M chooses the same side as the human, it is a case of agreement
  • Human-human consistency: defined by the agreement between human subjects
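  The agreement statistic described above can be sketched as follows. The data layout (`human_choices`, per-trial error pairs) is an illustrative assumption, not the paper's actual analysis code:

  ```python
  def agreement_rate(human_choices, measure_errors):
      """Fraction of two-alternative trials on which a measure agrees with a human.

      human_choices[t] is 'L' or 'R'; measure_errors[t] is (err_left, err_right)
      w.r.t. the reference shape. The measure "chooses" the lower-error side.
      """
      agree = 0
      for choice, (err_left, err_right) in zip(human_choices, measure_errors):
          pick = 'L' if err_left < err_right else 'R'
          agree += (pick == choice)
      return agree / len(human_choices)
  ```

  Averaging this rate over subjects and trials, and comparing it against human-human consistency, gives the yardstick used to rank the five measures.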

  21. Psychophysical experiments: results
  • Experiment 1 (SOD): reference and test shapes all from SOD
  • Experiment 2 (ALG): reference from SOD, test shapes algorithm-generated
  • Legend: RI = region intersection, MD = mean distance, HD = Hausdorff distance, MM = mixed measure, CM = contour mapping
  (Results chart omitted)

  22. Precision and Recall measures
  • For algorithm boundary A and ground-truth boundary B:
    • Precision: proportion of true-positive points on A
    • Recall: proportion of detected points on B
  • Martin's PR (M-PR) [Martin04]: minimum-cost bipartite matching, with cost proportional to distance
  • Estrada's PR (E-PR) [Estrada09]: 'no intervening contours' and 'same side' constraints
  • Contour Mapping PR (CM-PR): order-preserving mapping
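  Once a point matching between A and B has been built, turning it into precision and recall can be sketched as below. The `matches` format and the distance tolerance `tol` are illustrative assumptions; what distinguishes M-PR, E-PR, and CM-PR is how the matching itself is constructed, which this helper deliberately leaves abstract:

  ```python
  def precision_recall(matches, n_a, n_b, tol):
      """Precision/recall from a point matching between boundaries A and B.

      matches is a list of (i, j, dist) matched index pairs; a point counts
      as detected if its match lies within tol pixels. n_a and n_b are the
      numbers of points on A and B.
      """
      hit_a = {i for i, j, d in matches if d <= tol}  # matched points on A
      hit_b = {j for i, j, d in matches if d <= tol}  # matched points on B
      precision = len(hit_a) / n_a if n_a else 0.0
      recall = len(hit_b) / n_b if n_b else 0.0
      return precision, recall
  ```

  Plugging in an order-preserving matching rather than a nearest-point or unconstrained bipartite one is what yields the CM-PR variant.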

  23. Matching paradigm in Precision/Recall
  • Experiment 1 (SOD): reference and test shapes all from SOD
  • Experiment 2 (ALG): reference from SOD, test shapes algorithm-generated

  24. Summary
  • Analysis of available measures for the evaluation of salient object segmentation algorithms
  • A new measure: the Contour Mapping measure (CM)
    • Code available online: http://elderlab.yorku.ca/ContourMapping
  • A new dataset of salient objects (SOD)
    • Dataset available online: http://elderlab.yorku.ca/SOD
  • Psychophysical experiments: CM has a higher agreement with human subjects
  • Order-preserving matching paradigm in precision/recall analysis
    • Code available online: http://elderlab.yorku.ca/ContourMapping

  25. Thank You!
