
Markov Random Fields with Efficient Approximations


Presentation Transcript


  1. Markov Random Fields with Efficient Approximations Yuri Boykov, Olga Veksler, Ramin Zabih Computer Science Department CORNELL UNIVERSITY

  2. Introduction MAP-MRF approach (Maximum A Posteriori Probability estimation of an MRF) • Bayesian framework suitable for problems in computer vision (Geman and Geman, 1984) • Problem: high computational cost. Standard methods (simulated annealing) are very slow.

  3. Outline of the talk • Models where MAP-MRF estimation is equivalent to min-cut problem on a graph • generalized Potts model • linear clique potential model • Efficient methods for solving the corresponding graph problems • Experimental results • stereo, image restoration

  4. MRF framework in the context of stereo • image pixels (vertices) • neighborhood relationships (n-links) • f_p: disparity at pixel p; f: configuration (the field of all disparities) • MRF defining property: the disparity at p, conditioned on all other pixels, depends only on its neighbors • Hammersley-Clifford theorem: the MRF prior is a Gibbs distribution defined by clique potentials
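The two formulas on this slide were rendered as images in the original deck; a reconstruction in standard MRF notation (an assumption: N_p denotes the neighborhood of pixel p, and C ranges over the cliques of the neighborhood graph) might read:

```latex
% MRF defining property: f_p depends only on the labels of its neighbors
P\big(f_p \mid f_{\mathcal{P} \setminus \{p\}}\big) \;=\; P\big(f_p \mid f_{N_p}\big)

% Hammersley-Clifford theorem: the MRF prior is a Gibbs distribution
P(f) \;\propto\; \exp\Big(-\sum_{C} V_C(f)\Big)
```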

  5. MAP estimation of the MRF configuration • observed data O • Bayes rule combines the likelihood function (sensor noise) with the prior (MRF model) to give the posterior
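In symbols (a reconstruction of the slide's formula images; O denotes the observed data):

```latex
% Bayes rule: posterior = likelihood (sensor noise) x prior (MRF model)
P(f \mid O) \;\propto\; P(O \mid f)\, P(f)

% MAP estimate
f^{*} \;=\; \arg\max_{f}\; P(O \mid f)\, P(f)
```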

  6. Energy minimization Find the configuration f that minimizes the posterior energy function E(f): a data term (sensor noise) plus a smoothness term (MRF prior)
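Taking the negative log of the posterior gives the energy to minimize. The generic form is standard; the notation D_p for the per-pixel data penalty and V_{p,q} for the pairwise clique potential is an assumption (the exact constants are in the paper):

```latex
E(f) \;=\;
\underbrace{\sum_{p} D_p(f_p)}_{\text{data term (sensor noise)}}
\;+\;
\underbrace{\sum_{\{p,q\} \in \mathcal{N}} V_{p,q}(f_p, f_q)}_{\text{smoothness term (MRF prior)}}
```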

  7. Generalized Potts model Energy function E(f) with clique potential: a fixed penalty for a discontinuity at the neighboring pixel pair (p,q)
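The slide's formula images can be reconstructed as follows (a standard rendering; u_{p,q} is the discontinuity penalty and T(·) is the indicator function, with exact constants as in the paper):

```latex
% Generalized Potts clique potential: fixed penalty when neighbors disagree
V_{p,q}(f_p, f_q) \;=\; u_{p,q} \cdot T(f_p \neq f_q)

E(f) \;=\; \sum_{p} D_p(f_p)
\;+\; \sum_{\{p,q\} \in \mathcal{N}} u_{p,q} \cdot T(f_p \neq f_q)
```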

  8. Stereo image: a white rectangle in front of a black background • disparity configurations minimizing the energy E(f) • static cues for selecting the discontinuity penalties

  9. Minimization of E(f) via graph cuts • p-vertices (pixels) • terminals (possible disparity labels) • cost of n-links and cost of t-links

  10. Multiway cut Graph G = <V, E> • vertices V = pixels + terminals • edges E = n-links + t-links Remove a subset of edges C to obtain the graph G(C) = <V, E - C> • C is a multiway cut if the terminals are separated in G(C) • a multiway cut C yields some disparity configuration
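The separation condition on this slide is easy to state in code. A minimal sketch (an illustration, not the paper's implementation; the function name and edge representation are assumptions): removing the edge subset C from G and checking that no search from one terminal reaches another.

```python
from collections import defaultdict, deque

def is_multiway_cut(edges, cut, terminals):
    """Return True iff no two terminals are connected in G(C) = <V, E - C>.

    edges: iterable of undirected edges (u, v); cut: set of edges to remove.
    """
    remaining = [e for e in edges
                 if e not in cut and (e[1], e[0]) not in cut]
    adj = defaultdict(list)
    for u, v in remaining:
        adj[u].append(v)
        adj[v].append(u)
    # BFS from each terminal; C is a multiway cut iff no search
    # reaches a different terminal.
    for t in terminals:
        seen = {t}
        queue = deque([t])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    if v in terminals:
                        return False
                    seen.add(v)
                    queue.append(v)
    return True
```

For example, in the chain t1 - p - q - t2, removing the single n-link {p, q} separates the two terminals.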

  11. Main result (generalized Potts model) • Multiway cut problem: find the minimum-cost multiway cut C on the graph G • Under some technical conditions, the minimum multiway cut C on G gives the configuration f that minimizes E(f), the posterior energy function for the generalized Potts model

  12. Solving the multiway cut problem • Case of two terminals: max-flow algorithm (Ford and Fulkerson, 1962); polynomial time (almost linear in practice) • NP-complete if the number of labels > 2 (Dahlhaus et al., 1992) • Efficient approximation algorithms exist that are optimal within a factor of 2
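The two-terminal case is solved with the Ford-Fulkerson method; a compact Edmonds-Karp variant (BFS for shortest augmenting paths) can serve as a sketch. This is an illustration under assumed names, not the implementation used in the paper:

```python
from collections import defaultdict, deque

def max_flow(capacity, source, sink):
    """capacity: dict mapping directed edge (u, v) -> capacity.
    Returns the value of a maximum source-sink flow."""
    residual = defaultdict(int)   # residual capacities, incl. reverse edges
    adj = defaultdict(set)
    for (u, v), c in capacity.items():
        residual[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # recover the path, augment by its bottleneck capacity
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[e] for e in path)
        for u, v in path:
            residual[(u, v)] -= bottleneck
            residual[(v, u)] += bottleneck
        flow += bottleneck
```

By the max-flow/min-cut theorem, the value returned equals the cost of a minimum two-terminal cut, which is why this subroutine solves the two-label case exactly.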

  13. Our algorithm Initialize at arbitrary multiway cut C 1. Choose a pair of terminals 2. Consider connected pixels

  14. Our algorithm Initialize at arbitrary multiway cut C 1. Choose a pair of terminals 2. Consider connected pixels 3. Reallocate pixels between two terminals by running max-flow algorithm

  15. Our algorithm Initialize at an arbitrary multiway cut C 1. Choose a pair of terminals 2. Consider connected pixels 3. Reallocate pixels between the two terminals by running the max-flow algorithm 4. A new multiway cut C' is obtained Iterate until no pair of terminals improves the cost of the cut
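The iteration in slides 13-15 can be sketched schematically. The function names below (`reallocate`, `cost`) are placeholders for the paper's max-flow reallocation step and cut-cost evaluation, not its actual code:

```python
from itertools import combinations

def iterate_swaps(cut, terminals, reallocate, cost):
    """Iterate pairwise reallocations until no pair of terminals
    improves the cost of the cut.

    reallocate(cut, t1, t2) -> candidate cut (max-flow step in the paper)
    cost(cut) -> cost of a multiway cut
    """
    improved = True
    while improved:
        improved = False
        for t1, t2 in combinations(terminals, 2):
            candidate = reallocate(cut, t1, t2)
            if cost(candidate) < cost(cut):   # accept only strict improvement
                cut = candidate
                improved = True
    return cut
```

Because each accepted move strictly lowers the cut cost and the cost is bounded below, the loop terminates at a cut no pair of terminals can improve.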

  16. Experimental results (generalized Potts model) • Extensive benchmarking on synthetic images and on real imagery with dense ground truth • From University of Tsukuba • Comparisons with other algorithms

  17. Synthetic example: image, correlation result, and multiway cut result

  18. Real imagery with ground truth: ground truth vs. our results

  19. Comparison with ground truth

  20. Gross errors (> 1 disparity)

  21. Comparative results for normalized correlation: data and gross errors

  22. Statistics

  23. Related work (generalized Potts model) • Greig et al., 1989: a special case of our method (two labels) • Ferrari et al., 1995, 1997: two solutions with a highly restricted sensor noise model (function g)

  24. Linear clique potential model Energy function E(f) with clique potential: a penalty for a discontinuity at (p,q) proportional to the disparity difference |f_p - f_q|
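As with the Potts slide, the formula images can be reconstructed in standard notation (u_{p,q} is the per-pair penalty weight; exact constants as in the paper):

```latex
% Linear clique potential: penalty grows with the size of the disparity jump
V_{p,q}(f_p, f_q) \;=\; u_{p,q} \cdot |f_p - f_q|

E(f) \;=\; \sum_{p} D_p(f_p)
\;+\; \sum_{\{p,q\} \in \mathcal{N}} u_{p,q} \cdot |f_p - f_q|
```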

  25. Minimization of E(f) via graph cuts • the {p,q} part of the graph • cost of n-links and cost of t-links • a cut C yields some configuration

  26. Main result (linear clique potential model) • Under some technical conditions, the min-cut C on the graph gives the configuration f that minimizes E(f), the posterior energy function for the linear clique potential model

  27. Related work (linear clique potential model) • Ishikawa and Geiger, 1998 • earlier independently obtained a very similar result on a directed graph • Roy and Cox, 1998 • undirected graph with the same structure • no optimality properties since edge weights are not theoretically justified

  28. Experimental results (linear clique potential model) • Benchmarking on real imagery with dense ground truth • From University of Tsukuba • Image restoration of synthetic data

  29. Ground truth stereo image: results for the generalized Potts model and the linear clique potential model, compared to ground truth

  30. Image restoration of a noisy diamond image: generalized Potts model vs. linear clique potential model
