CPSC 643, Presentation 1 Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography Martin A. Fischler, Robert C. Bolles Artificial Intelligence Center SRI International, CA Communications of the ACM (Graphics and Image Processing), Volume 24, Number 6, June 1981.
Martin A. Fischler • Research Focus • Artificial Intelligence, Machine Vision, Switching Theory, Computer Organization, Information Theory • B.E.E. degree – City College of New York, NY • M.S. and Ph.D. – Stanford University, CA • Computer Scientist – SRI International, since 1977 • First published the RANSAC work as an SRI International technical report in 1980 • Published the RANSAC paper in Communications of the ACM (Graphics and Image Processing) in 1981 • Currently working on Visual Odometry and Visual SLAM
Back to 1981 • Computer Vision in 1981 • Focused on classification and recognition • Science-based (hadn't gotten to applications yet) • Initially focused largely on artificial worlds • Images were hard to come by • 3-D range sensing was almost viewed as cheating • Research was driven by sponsors' interests. IBM's first PC, 1981 (4.77 MHz); Apple II Plus, 1981 (max. 64K RAM) Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
Motivation • The Least Squares Algorithm • Optimizes the fit of a functional description to ALL of the presented data. Adapted from http://en.wikipedia.org/wiki/Linear_least_squares
Motivation • The Least Squares Algorithm • Least squares is an averaging technique that uses all of the presented data and is therefore sensitive to outliers (see the sketch below). Adapted from http://www.cs.unc.edu/~lazebnik/spring09/
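A tiny illustration of that sensitivity, as a sketch in Python on made-up data (the values, and the use of NumPy's polyfit as the least-squares solver, are only for illustration):

import numpy as np

x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0                           # points on the true line y = 2x + 1
y[-1] = -50.0                               # replace one point with a gross outlier
slope, intercept = np.polyfit(x, y, deg=1)  # least-squares fit uses ALL of the data
print(slope, intercept)                     # pulled far from the true (2.0, 1.0)

A single bad measurement is enough to drag both the slope and the intercept away from the values supported by the other nine points.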
Motivation • Robust Estimator • The robust function ρ behaves like squared distance for small ri but saturates for large ri, where ri is the residual of point i w.r.t. the model parameters θ and σ is a scale parameter (a representative form is given below). • A nonlinear optimization that must be solved iteratively. • The least squares solution can be used for initialization. Adapted from http://www.cs.unc.edu/~lazebnik/spring09/
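The formula shown on the original slide is not reproduced in this transcription; one common robust function with exactly this behavior (quadratic for small residuals, saturating for large ones), given here only as a representative example, is the Geman–McClure form

ρ(r_i; σ) = r_i^2 / (σ^2 + r_i^2)

For |r_i| ≪ σ this behaves like r_i^2 / σ^2, a scaled squared distance, while for |r_i| ≫ σ it saturates near 1, so a single gross outlier contributes only a bounded amount to the total cost.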
Motivation • Two types of error • Measurement error – inliers • Classification error – outliers • Existing Problems • Least squares and the robust estimator (at its initialization) treat inliers and outliers equally, as a single data set. • The robust estimator only rejects outliers in later iterations, whereas fitting the inliers and rejecting the outliers should happen in the same process. • Why not fit randomly chosen data subsets instead – RANSAC.
RANSAC • Notation • U = {x_i}: the set of data points, |U| = N • p: the model parameters • f: a function that computes the model parameters p from a sample S drawn from U • a per-point cost function evaluated for a single data point x • k: the number of iterations • Algorithm (a sketch in code follows below) • Select a random sample S_k from U • Compute the parameters p_k = f(S_k) • Compute the cost C_k over all points in U • Stop if C_k < C* or k > k*
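A minimal sketch of this loop in Python, for fitting a 2-D line from two-point samples. The data layout, the error tolerance, and the choice of cost C_k (the number of points outside the consensus set) are illustrative assumptions, not taken from the paper:

import numpy as np

def ransac_line(points, k_star=100, tol=1.0, c_star=10, rng=None):
    # points: (N, 2) array of (x, y); returns (slope, intercept) of the best model
    rng = np.random.default_rng(rng)
    n = len(points)
    best_p, best_cost = None, np.inf
    k = 0
    while k <= k_star:                               # stop if k > k*
        k += 1
        i, j = rng.choice(n, size=2, replace=False)  # random sample S_k, |S_k| = 2
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:                                 # degenerate sample, redraw
            continue
        slope = (y2 - y1) / (x2 - x1)                # p_k = f(S_k)
        intercept = y1 - slope * x1
        residuals = np.abs(points[:, 1] - (slope * points[:, 0] + intercept))
        cost = int(np.sum(residuals >= tol))         # C_k: points outside the consensus
        if cost < best_cost:
            best_p, best_cost = (slope, intercept), cost
        if best_cost < c_star:                       # stop if C_k < C*
            break
    return best_p

Once the loop stops, the model is usually re-estimated by least squares on the final consensus set, which is the smoothing step the original paper itself suggests.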
RANSAC Example Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset • Calculate model parameters p Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset • Calculate model parameters p • Calculate cost for each data point Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset • Calculate model parameters p • Calculate cost for each data point • Select the data that fit the current model Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset • Calculate model parameters p • Calculate cost for each data point • Select the data that fit the current model • Repeat sampling Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset • Calculate model parameters p • Calculate cost for each data point • Select the data that fit the current model • Repeat sampling Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC Example • Select data subset • Calculate model parameters p • Calculate cost for each data point • Select the data that fit the current model • Repeat sampling • Stop when Ck < C* or k > k* Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC • How many iterations? • The expected number of samples k is a function of the minimal sample size m and the fraction of outliers. • Choose k so that, with probability p, at least one random sample is free from outliers (the standard formula is given below).
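The formulas on the original slide are not reproduced in this transcription; reconstructed from the standard derivation (and consistent with the paper's analysis), with w = 1 − (fraction of outliers) the probability that a single data point is an inlier and m the minimal sample size:

E[k] = w^(-m)        k ≥ log(1 − p) / log(1 − w^m)

For example, with p = 0.99, w = 0.5, and a two-point model (m = 2), the second formula gives k = 17 samples.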
RANSAC Application: Location Determination Problem • Existence proofs of multiple solutions for the P3P, P4P, and P5P problems. • An algorithm for solving the general P3P. • An algorithm for solving the planar P4P problem. • An automatic gross-error filtering technique (RANSAC). Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
RANSAC Results: Location Determination Problem Final result (deviations): X: 0.1 ft, Y: 0.1 ft, Z: 0.1 ft; Heading: 0.01°, Pitch: 0.10°, Roll: 0.12° Adapted from http://www.ai.sri.com/people/fischler/
RANSAC Other Applications Adapted from http://graphics.cs.cmu.edu/courses/15-463/2006_fall/www/463.html
RANSAC Other Applications Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
RANSAC • Pros • Simple and general. • Applicable to many different problems. • Often works well in practice. • Cons • Sometimes too many iterations are required. • Can fail for extremely low inlier ratios. • Lots of parameters to tune. • Can't always get a good initialization of the model. • We can often do better than brute-force sampling.
END For more information please visit the website of the 25 Years of RANSAC Workshop: http://cmp.felk.cvut.cz/ransac-cvpr2006/ 1981 Ford F150 Adapted from http://www.lmctruck.com/Fordcust_photos.htm