
A Non-local Cost Aggregation Method for Stereo Matching



Presentation Transcript


  1. A Non-local Cost Aggregation Method for Stereo Matching Qingxiong Yang City University of Hong Kong 2012 IEEE Conference on Computer Vision and Pattern Recognition

  2. Outline • Introduction • Related Works • Method • Experimental Results • Conclusion

  3. Introduction_________________________

  4. Introduction • Goal: obtain a fast and accurate disparity map. • Solution: non-local cost aggregation on a minimum spanning tree (MST). • Advantages: better results in low-texture regions; low computational complexity.

  5. Related Works_________________________

  6. Related Works [21] D. Scharstein and R. Szeliski. A taxonomy and evaluation of dense two-frame stereo correspondence algorithms. International Journal of Computer Vision (IJCV), 47:7–42, 2002.

  7. Related Works
  Local methods: • 1=>2=>3 • A local support region with winner-take-all • Implicit smoothness • Fast but inaccurate
  Global methods: • 1(=>2)=>3 • Energy minimization process (GC, BP, DP, cooperative) • Pre-processing • Explicit smoothness • Accurate but slow
  (Step numbers follow the taxonomy of [21]: 1. matching cost computation, 2. cost aggregation, 3. disparity computation/optimization.)

  8. Comparison (Rank in Middlebury)

  9. Reference (1/2) • C. Shi, G. Wang, X. Pei, H. Bei, and X. Lin. High-accuracy stereo matching based on adaptive ground control points. Submitted to IEEE TIP 2012. • X. Mei, X. Sun, M. Zhou, S. Jiao, H. Wang, and X. Zhang. On building an accurate stereo matching system on graphics hardware. GPUCV 2011. • A. Klaus, M. Sormann, and K. Karner. Segment-based stereo matching using belief propagation and a self-adapting dissimilarity measure. ICPR 2006. • Z. Wang and Z. Zheng. A region based stereo matching algorithm using cooperative optimization. CVPR 2008. • Anonymous. A dense stereo matching with reliability aggregation and propagation. CVPR 2012 submission 1170. • Q. Yang, L. Wang, R. Yang, H. Stewénius, and D. Nistér. Stereo matching with color-weighted correlation, hierarchical belief propagation and occlusion handling. IEEE TPAMI 2009.

  10. Reference (2/2) • X. Sun, X. Mei, S. Jiao, M. Zhou, and H. Wang. Stereo matching with reliable disparity propagation. 3DIMPVT 2011. • L. Xu and J. Jia. Stereo matching: an outlier confidence approach. ECCV 2008. • M. Bleyer, C. Rother, and P. Kohli. Surface stereo with soft segmentation. CVPR 2010. • Q. Yang, R. Yang, J. Davis, and D. Nistér. Spatial-depth super resolution for range images. CVPR 2007. • Y. Mizukami, K. Okada, A. Nomura, S. Nakanishi, and K. Tadamura. Sub-pixel disparity search for binocular stereo vision. ICPR 2012 submission 1439. • S. Zhu, L. Zhang, and H. Jin. A locally linear regression model for boundary preserving regularization in stereo matching. ECCV 2012. • M. Bleyer, M. Gelautz, C. Rother, and C. Rhemann. A stereo approach that handles the matting problem via image warping. CVPR 2009.

  11. Method_________________________

  12. Method

  13. Bilateral Filter • Every sample is replaced by a weighted average of its neighbors. • These weights reflect two forces: • How close the neighbor and the center sample are • How similar the neighbor and the center sample are • An edge-preserving and noise-reducing smoothing filter (a small code sketch follows slide 17)

  14. Bilateral Filter

  15. Bilateral Filter Center Sample : p Neighborhood : q

  16. Bilateral Filter Total Distance

  17. Bilateral Filter [Figure panels: original image, Gaussian weight, bilateral weight]
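To make the weights described on slides 13-17 concrete, here is a minimal sketch (not code from the paper) of a bilateral filter applied at a single pixel of a grayscale image; the window radius, the Gaussian form of the two weights, and the parameter values are assumptions chosen only for illustration.

    import numpy as np

    def bilateral_filter_pixel(img, y, x, radius=5, sigma_s=3.0, sigma_r=0.1):
        """Replace pixel (y, x) by a weighted average of its neighbors.

        img is a grayscale image as floats in [0, 1]. Each neighbor q gets a
        weight combining two terms:
          - spatial closeness: exp(-||p - q||^2 / (2 * sigma_s^2))
          - range similarity:  exp(-(I(p) - I(q))^2 / (2 * sigma_r^2))
        Illustrative sketch only; parameter values are arbitrary.
        """
        h, w = img.shape
        p_val = img[y, x]
        num, den = 0.0, 0.0
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                qy, qx = y + dy, x + dx
                if 0 <= qy < h and 0 <= qx < w:
                    spatial = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
                    rng = np.exp(-((img[qy, qx] - p_val) ** 2) / (2.0 * sigma_r ** 2))
                    wgt = spatial * rng
                    num += wgt * img[qy, qx]
                    den += wgt
        return num / den

The spatial term keeps the averaging local, while the range term suppresses neighbors that lie across an intensity edge, which is what makes the filter edge-preserving.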

  18. Minimum Spanning Tree • Kruskal's Algorithm: scan all edges in increasing weight order; if an edge is safe (it does not create a cycle), add it to the forest F. [Figure: the original graph with nodes A-J and weighted edges] (Graph example from slides by Jonathan Davis; a code sketch follows the walkthrough below)

  19. [Figure: the original graph shown next to its full edge list: A-B (4), A-D (1), B-C (4), B-D (4), B-J (10), C-E (2), C-F (1), D-H (5), D-J (6), E-G (2), F-G (3), F-I (5), G-I (3), G-J (4), H-J (2), I-J (3)]

  20. Sort Edges (in reality they are placed in a priority queue - not sorted - but sorting them makes the algorithm easier to visualize) [Figure: the edges in increasing weight order: A-D (1), C-F (1), C-E (2), E-G (2), H-J (2), F-G (3), G-I (3), I-J (3), A-B (4), B-D (4), B-C (4), G-J (4), F-I (5), D-H (5), D-J (6), B-J (10)]

  21. Add Edge [Figure: A-D (1) is added]

  22. Add Edge [Figure: C-F (1) is added]

  23. Add Edge [Figure: C-E (2) is added]

  24. Add Edge [Figure: E-G (2) is added]

  25. Add Edge [Figure: H-J (2) is added]

  26. Cycle, Don't Add Edge [Figure: F-G (3) would create a cycle and is skipped]

  27. Add Edge [Figure: G-I (3) is added]

  28. Add Edge [Figure: I-J (3) is added]

  29. Add Edge [Figure: A-B (4) is added]

  30. Cycle, Don't Add Edge [Figure: B-D (4) would create a cycle and is skipped]

  31. Add Edge [Figure: B-C (4) is added; the tree now spans all ten nodes]

  32. Minimum Spanning Tree [Figure: the original graph (left) and the resulting minimum spanning tree (right)]
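A minimal Python sketch of Kruskal's algorithm, matching the walkthrough above. The union-find (disjoint-set) structure used to detect cycles is a standard implementation choice, not something the slides spell out; the edge list is the example graph from the slides.

    def kruskal(num_nodes, edges):
        """edges: list of (weight, u, v); returns the MST as a list of edges."""
        parent = list(range(num_nodes))

        def find(x):                        # find the root of x's component
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):       # scan edges in increasing weight order
            ru, rv = find(u), find(v)
            if ru != rv:                    # "safe" edge: it joins two components
                parent[ru] = rv
                mst.append((w, u, v))
        return mst

    # Example graph from the slides (nodes A..J mapped to 0..9).
    names = "ABCDEFGHIJ"
    idx = {c: i for i, c in enumerate(names)}
    letter_edges = [(4,'A','B'),(1,'A','D'),(4,'B','C'),(4,'B','D'),(10,'B','J'),
                    (2,'C','E'),(1,'C','F'),(5,'D','H'),(6,'D','J'),(2,'E','G'),
                    (3,'F','G'),(5,'F','I'),(3,'G','I'),(4,'G','J'),(2,'H','J'),
                    (3,'I','J')]
    edges = [(w, idx[u], idx[v]) for w, u, v in letter_edges]
    print([(w, names[u], names[v]) for w, u, v in kruskal(len(names), edges)])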

  33. Cost Computation • C_d(p): matching cost for pixel p at disparity level d • C^A_d(p): aggregated cost, computed with bilateral weights: C^A_d(p) = Σ_q exp(-|p - q| / σ_S) · exp(-|I(p) - I(q)| / σ_R) · C_d(q) • σ_S and σ_R: constants used to adjust the spatial and range similarity.
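The slides do not show how the per-pixel matching cost C_d(p) is obtained. A common choice, assumed here purely for illustration (the paper may use a different cost), is a truncated absolute difference between the left image and the disparity-shifted right image:

    import numpy as np

    def matching_cost_volume(left, right, max_disp, trunc=0.05):
        """C[d, y, x] = min(|left(y, x) - right(y, x - d)|, trunc).

        left, right: grayscale images as float arrays in [0, 1].
        Pixels shifted outside the image keep the truncation value.
        Illustrative sketch; not necessarily the cost used in the paper.
        """
        h, w = left.shape
        cost = np.full((max_disp + 1, h, w), trunc, dtype=np.float32)
        for d in range(max_disp + 1):
            diff = np.abs(left[:, d:] - right[:, : w - d])
            cost[d][:, d:] = np.minimum(diff, trunc)
        return cost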

  34. Cost Aggregation on a Tree Structure • Weight between neighboring pixels p and q: w(p, q) = |I(p) - I(q)| (the image gradient) • Distance between p and q: D(p, q) = sum of the weights of the edges connecting p and q on the tree • Similarity between p and q: S(p, q) = exp(-D(p, q) / σ) • Aggregated cost => C^A_d(p) = Σ_q S(p, q) · C_d(q)
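As a direct, unoptimized reading of these definitions, the sketch below computes D(p, q) and S(p, q) on a tree stored with parent pointers; the parent/edge_weight representation and the σ value are assumptions made only for illustration. Slides 35-39 then show how the same sums can be computed for all pixels with just two tree traversals.

    import math

    def path_to_root(node, parent):
        """Return the list of nodes from `node` up to the root (inclusive)."""
        path = [node]
        while parent[node] is not None:
            node = parent[node]
            path.append(node)
        return path

    def tree_distance(p, q, parent, edge_weight):
        """D(p, q): sum of edge weights w(u, v) = |I(u) - I(v)| on the tree path.

        parent[v] is v's parent (None for the root); edge_weight[v] is the
        weight of the edge between v and parent[v]. Assumed representation.
        """
        anc_p = path_to_root(p, parent)
        depth = {v: i for i, v in enumerate(anc_p)}
        dist_q = 0.0
        while q not in depth:              # climb from q until hitting p's path
            dist_q += edge_weight[q]
            q = parent[q]
        dist_p = sum(edge_weight[v] for v in anc_p[:depth[q]])  # p up to meeting node
        return dist_p + dist_q

    def similarity(p, q, parent, edge_weight, sigma=0.1):
        return math.exp(-tree_distance(p, q, parent, edge_weight) / sigma)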

  35. Cost Aggregation on a MST • Claim 1. Let Tr denote a subtree hanging off a node s and let r denote the root node of Tr. Then the support node s receives from this subtree is the sum of the support node s receives from r and S(s, r) times the support node r receives from its own subtrees. • Support r receives from Tr = Σ_{q ∈ Tr} S(r, q) · C_d(q) • Support s receives from Tr = S(s, r) · Σ_{q ∈ Tr} S(r, q) · C_d(q) (because D(s, q) = D(s, r) + D(r, q) for every q in Tr, so S(s, q) = S(s, r) · S(r, q)) [Figure: node s with the subtree Tr rooted at r]

  36. Cost Aggregation on a MST • First (leaf-to-root) pass, Eqn. (6): C^{A↑}_d(v) = C_d(v) + Σ_{vc : P(vc) = v} S(v, vc) · C^{A↑}_d(vc) • C^{A↑}_d(v) = C_d(v), if node v is a leaf node • P(vc) denotes the parent of node vc

  37. Cost Aggregation on a MST

  38. Cost Aggregation on a MST • Second (root-to-leaf) pass, Eqn. (7): C^A_d(v) = S(P(v), v) · C^A_d(P(v)) + [1 - S^2(P(v), v)] · C^{A↑}_d(v), with C^A_d(v) = C^{A↑}_d(v) for the root node

  39. Cost Aggregation on a MST • Cost aggregation process: 1. Aggregate the original matching cost Cd from the leaf nodes towards the root node using Eqn. (6). 2. Aggregate from the root node towards the leaf nodes using Eqn. (7). • Complexity: 2 additions/subtractions + 3 multiplications per node, at each disparity level
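A minimal sketch of the two-pass aggregation summarized on this slide, for a single disparity level. It assumes the tree is stored with parent pointers, per-node weights to the parent, and a node list ordered from the root toward the leaves; none of these representational details come from the slides.

    import math

    def aggregate_on_tree(cost, parent, edge_weight, order, sigma=0.1):
        """Non-local cost aggregation on a tree for one disparity level.

        cost[v]        : matching cost C_d(v) for node v
        parent[v]      : parent of v (None for the root)
        edge_weight[v] : weight of the edge (v, parent[v])
        order          : nodes sorted from the root toward the leaves
        Returns the aggregated cost C^A_d(v) for every node.
        """
        n = len(cost)
        s = [0.0] * n                       # S(parent(v), v) = exp(-w / sigma)
        for v in range(n):
            if parent[v] is not None:
                s[v] = math.exp(-edge_weight[v] / sigma)

        # Pass 1 (leaves -> root), Eqn. (6): aggregate each node's subtree.
        up = list(cost)
        for v in reversed(order):           # children are visited before parents
            if parent[v] is not None:
                up[parent[v]] += s[v] * up[v]

        # Pass 2 (root -> leaves), Eqn. (7): add support from the rest of the tree.
        agg = list(up)
        for v in order:                     # parents are finalized before children
            p = parent[v]
            if p is not None:
                agg[v] = s[v] * agg[p] + (1.0 - s[v] ** 2) * up[v]
        return agg

In the full method these two passes would be repeated for every disparity level, after which each pixel would typically take the disparity with the minimum aggregated cost (winner-take-all).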

  40. Disparity Refinement • D: the left disparity map • Unstable pixels: caused by occlusion, lack of texture, specularity • Median filter
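The slide names causes of unstable pixels but not a detection rule. A common heuristic, assumed here and not necessarily the paper's exact criterion, is a left-right consistency check between the left and right disparity maps:

    import numpy as np

    def stable_pixel_mask(disp_left, disp_right, tol=1):
        """Mark a left-image pixel as stable if its disparity is confirmed by the
        right disparity map: |D_L(y, x) - D_R(y, x - D_L(y, x))| <= tol.

        Integer disparities are assumed; this is an illustrative heuristic only.
        """
        h, w = disp_left.shape
        ys, xs = np.mgrid[0:h, 0:w]
        xr = xs - disp_left                    # corresponding right-image column
        valid = (xr >= 0) & (xr < w)
        stable = np.zeros((h, w), dtype=bool)
        dl = disp_left[valid]
        dr = disp_right[ys[valid], xr[valid].astype(int)]
        stable[valid] = np.abs(dl - dr) <= tol
        return stable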

  41. Experimental Results_________________________

  42. Experimental Results • Device: a MacBook Air laptop computer with a 1.8 GHz Intel Core i7 CPU and 4 GB memory • Parameter: σ = 0.1 (non-local cost aggregation) • Sources: Middlebury http://vision.middlebury.edu/stereo/, HHI database (Book Arrival), Microsoft i2i database (Ilkay)

  43. Experimental Results • Time: • Proposed method, average runtime: 90 milliseconds (1.25× slower than the unnormalized box filter) • Unnormalized box filter, average runtime: 72 milliseconds • Local guided image filter, average runtime: 960 milliseconds. [7] C. Rhemann, A. Hosni, M. Bleyer, C. Rother, and M. Gelautz. Fast cost-volume filtering for visual correspondence and beyond. In CVPR, 2011. [24] P. Viola and M. Jones. Robust real-time face detection. International Journal of Computer Vision, 57(2):137–154, 2004.

  44. [7] C. Rhemann, A. Hosni, M. Bleyer, C. Rother, and M. Gelautz. Fast cost-volume filtering for visual correspondence and beyond. In CVPR, 2011.

  45. Experimental Results

  46. Experimental Results

  47. Experimental Results

  48. Experimental Results • Different disparity levels (depth of the spanning tree): Max = 7

  49. Max=10 Max=14

  50. Max=16 Max=20
