Long-term image-based motion estimation Dennis Strelow and Sanjiv Singh
On the web Related materials: • these slides • related papers • movies • VRML models at: http://www.cs.cmu.edu/~dstrelow/maryland
Introduction (1) micro air vehicle (MAV) navigation AeroVironment Black Widow AeroVironment Microbat
Introduction (2) mars rover navigation Mars Exploration Rovers (MER) Hyperion
Introduction (3) robotic search and rescue Center for Robot-Assisted Search and Rescue, U. of South Florida Rhex
Introduction (4) NASA ISS personal satellite assistant
Introduction (5) Each of these problems requires: • 6 DOF motion • in unknown environments • without GPS or other absolute positioning • over the long term …and some of the problems require: • small, light, and cheap sensors
Introduction (6) Monocular, image-based motion estimation is a good candidate In particular, simultaneous estimation of: • multiframe motion • sparse scene structure is the most promising approach
Outline Image-based motion estimation Improving image-based motion estimation Improving feature tracking Reacquisition
Outline Image-based motion estimation • refresher • difficulties Improving image-based motion estimation Improving feature tracking Reacquisition
Image-based motion estimation: refresher (1) A two-step process is typical… First, sparse feature tracking: • Inputs: raw images • Outputs: projections
Image-based motion estimation: refresher (3) Second, estimation: • Input: projections from tracker • Outputs: 6 DOF camera position at the time of each image; 3D position of each tracked point
Image-based motion estimation: refresher (5) Algorithms exist For tracking: • Lucas-Kanade (Lucas and Kanade, 1981)
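For illustration, a minimal pyramidal Lucas-Kanade tracking sketch using OpenCV; the function choices, window size, and pyramid depth below are assumptions for the example, not the implementation behind these slides:

```python
# Sketch: pyramidal Lucas-Kanade tracking between two grayscale frames.
# Parameter values are illustrative, not tuned.
import cv2

def track_features(prev_img, next_img):
    # Extract corners in the first image based on local texture.
    prev_pts = cv2.goodFeaturesToTrack(prev_img, maxCorners=300,
                                       qualityLevel=0.01, minDistance=7)
    # Iteratively minimize the intensity matching error, coarse-to-fine
    # over several pyramid levels to handle larger motions.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_img, next_img, prev_pts, None, winSize=(15, 15), maxLevel=3)
    # Keep only features whose iterations converged.
    good = status.ravel() == 1
    return prev_pts[good], next_pts[good]
```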
Image-based motion estimation: refresher (6) For estimation: • SVD-based factorization (Tomasi and Kanade, 1992) • bundle adjustment (various, 1950’s) • Kalman filtering (Broida and Chellappa, 1990) • variable state dimension filter (McLauchlan, 1996)
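As a toy illustration of the bundle adjustment idea (not the authors' estimator or any of the filters listed above), the sketch below jointly refines camera poses and 3D points by minimizing reprojection error with SciPy; the pinhole model, parameterization, and names are assumptions:

```python
# Toy bundle adjustment sketch: minimize reprojection error over all
# 6 DOF camera poses and 3D point positions jointly. Illustrative only.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(point3d, rvec, tvec, focal):
    # Pinhole projection of one world point into one camera (no distortion).
    p_cam = Rotation.from_rotvec(rvec).apply(point3d) + tvec
    return focal * p_cam[:2] / p_cam[2]

def residuals(params, n_cams, n_pts, observations, focal):
    # params stacks [rvec, tvec] per camera followed by [x, y, z] per point.
    cams = params[:6 * n_cams].reshape(n_cams, 6)
    pts = params[6 * n_cams:].reshape(n_pts, 3)
    res = []
    for cam_idx, pt_idx, uv in observations:  # (camera, point, measured 2D projection)
        pred = project(pts[pt_idx], cams[cam_idx, :3], cams[cam_idx, 3:], focal)
        res.extend(pred - uv)
    return np.asarray(res)

# With initial guesses packed into x0 and observations from the tracker:
# result = least_squares(residuals, x0, args=(n_cams, n_pts, observations, focal))
```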
Image-based motion estimation: difficulties (1) So, the problem is solved?
Image-based motion estimation: difficulties (2) • If so, where are the automatic systems for estimating the motion of platforms like these, from images, in unknown environments?
Image-based motion estimation: difficulties (3) …and for automatically modeling • rooms • buildings • cities from a handheld camera?
Image-based motion estimation: difficulties (4) Estimation step can be very sensitive to: • incorrect or insufficient image feature tracking • camera modeling and calibration errors • outlier detection thresholds • sequences with degenerate camera motions
Image-based motion estimation: difficulties (5) …and for recursive methods in particular: • poor prior assumptions on the motion • poor approximations in state error modeling
Image-based motion estimation: difficulties (6) Example sequence: 151 images, 23 points
Image-based motion estimation: difficulties (8) For long-term motion estimation, these errors accumulate
Outline Image-based motion estimation Improving image-based motion estimation • overview • image and inertial measurements Improving feature tracking Reacquisition
Improving image-based motion estimation: image and inertial (1) Image and inertial measurements are highly complementary Inertial measurements can: • resolve the ambiguities in image-only estimates • establish the global scale
Improving image-based motion estimation: image and inertial (2) Image measurements can: • reduce the drift from integrating inertial measurements • distinguish between the effects of rotation, gravity, acceleration, bias, and noise in accelerometer readings
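To make the complementarity concrete, here is a minimal dead-reckoning sketch (an assumption-laden illustration, not from the paper): accelerometer bias and noise are double-integrated into position, so inertial-only estimates drift unless image measurements, with their direct geometric constraints, correct them.

```python
# Sketch: naive inertial dead reckoning. Any unmodeled bias or noise in the
# accelerometer is double-integrated into position error, which is exactly
# the drift that image measurements can bound. All values are illustrative.
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (assumed convention)

def propagate(p, v, R, accel_meas, gyro_meas, accel_bias, dt):
    # Remove the estimated bias, rotate the specific force into the world
    # frame, and add gravity back to get world-frame acceleration.
    a_world = R @ (accel_meas - accel_bias) + GRAVITY
    # Integrate acceleration twice; residual bias/noise accumulates here.
    v_new = v + a_world * dt
    p_new = p + v * dt + 0.5 * a_world * dt * dt
    # Integrate the gyro to update orientation (first-order approximation).
    R_new = R @ Rotation.from_rotvec(gyro_meas * dt).as_matrix()
    return p_new, v_new, R_new
```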
Improving image-based motion estimation: image and inertial (3)
Improving image-based motion estimation: image and inertial (4)
Improving image-based motion estimation: image and inertial (5) • Other examples: • global scale typically recovered to within 5% • better convergence than image-only estimation
Improving image-based motion estimation: image and inertial (6) Many more details in: Dennis Strelow and Sanjiv Singh. Motion estimation from image and inertial measurements. IJRR, September 2004.
Outline Image-based motion estimation Improving image-based motion estimation Improving feature tracking • Lucas-Kanade and real sequences • the “smalls” tracker Reacquisition
Improving feature tracking: Lucas-Kanade and real sequences (1) • Lucas-Kanade is the “go to” sparse feature tracker: • iterative minimization of the intensity matching error function • applied at several image resolutions to handle large motions • features extracted based on image texture • feature death based on iteration convergence and correlation error
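A sketch of the feature lifecycle just described, with the birth and death heuristics made explicit; the thresholds and population target are illustrative assumptions:

```python
# Sketch: per-frame feature management with Lucas-Kanade. Features die when
# the iteration fails to converge or the matching error is high; new features
# are extracted from image texture when the population drops. Illustrative.
import cv2
import numpy as np

MAX_ERR = 20.0       # matching-error threshold for feature death (illustrative)
MIN_FEATURES = 100   # replenish when fewer features than this survive

def step(prev_img, next_img, prev_pts):
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_img, next_img,
                                                     prev_pts, None)
    # Feature death: drop tracks that did not converge or match poorly.
    keep = (status.ravel() == 1) & (err.ravel() < MAX_ERR)
    next_pts = next_pts[keep]
    # Feature birth: top up from image texture when too few survive.
    if len(next_pts) < MIN_FEATURES:
        fresh = cv2.goodFeaturesToTrack(next_img, maxCorners=MIN_FEATURES,
                                        qualityLevel=0.01, minDistance=7)
        if fresh is not None:
            next_pts = np.concatenate([next_pts, fresh]) if len(next_pts) else fresh
    return next_pts
```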
Improving feature tracking: Lucas-Kanade and real sequences (2) • Advantages: • fast • subpixel resolution • can handle some large motions well • uses general minimization, so easily extendible
Improving feature tracking: Lucas-Kanade and real sequences (3) 0.1 pixel average reprojection error!
Improving feature tracking: Lucas-Kanade and real sequences (4) • But, Lucas-Kanade has some flaws: • does not exploit the rigid scene • poor heuristics for: • large motions • extracting features • detecting feature mistracking
Improving feature tracking: Lucas-Kanade and real sequences (5)
Improving feature tracking: Lucas-Kanade and real sequences (6)
Improving feature tracking: Lucas-Kanade and real sequences (7)
Improving feature tracking: Lucas-Kanade and real sequences (8)
Improving feature tracking: the “smalls” tracker (1) • smalls is a new sparse image feature tracker • designed to address these issues • i.e., designed for long-term motion estimation
Improving feature tracking: the “smalls” tracker (2) Leonard Smalls: tracker, lone biker of the apocalypse
Improving feature tracking: the “smalls” tracker (3) The smalls pipeline: • SIFT epipolar geometry • 1-D correlation matching along epipolar lines • geometric mistracking detection • feature death and birth • surviving features to 6 DOF output estimation
Improving feature tracking: the “smalls” tracker (4) • SIFT keypoints (Lowe, IJCV 2004): • image interest points • can be extracted despite large changes in viewpoint • to subpixel accuracy • A keypoint’s feature vectors in two images usually match
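For reference, a minimal SIFT extraction and matching sketch with OpenCV (a stand-in for Lowe's original implementation; the ratio-test threshold is an assumption):

```python
# Sketch: SIFT keypoint extraction and descriptor matching with a ratio test.
import cv2

def sift_matches(img1, img2, ratio=0.8):
    sift = cv2.SIFT_create()
    kp1, desc1 = sift.detectAndCompute(img1, None)
    kp2, desc2 = sift.detectAndCompute(img2, None)
    # Keep a match only if it is clearly better than the second-best
    # candidate (Lowe's ratio test), so the pair is likely the same keypoint.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(desc1, desc2, k=2)
    good = [m for m, n in pairs if m.distance < ratio * n.distance]
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in good]
```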
Improving feature tracking: the “smalls” tracker (5) Epipolar geometry between adjacent images is determined using: • SIFT extraction and matching • two-frame bundle adjustment • RANSAC
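As a simplified stand-in for the two-frame bundle adjustment plus RANSAC on the slides, OpenCV's RANSAC fundamental-matrix fit illustrates the same robust two-frame epipolar geometry step; the thresholds are assumptions:

```python
# Sketch: robust two-frame epipolar geometry from SIFT matches. This uses
# OpenCV's RANSAC fundamental-matrix estimator instead of the slides'
# two-frame bundle adjustment; thresholds are illustrative.
import cv2
import numpy as np

def epipolar_geometry(matches):
    pts1 = np.float32([p1 for p1, p2 in matches])
    pts2 = np.float32([p2 for p1, p2 in matches])
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC,
                                            ransacReprojThreshold=1.0,
                                            confidence=0.99)
    inliers = inlier_mask.ravel() == 1
    return F, pts1[inliers], pts2[inliers]
```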
Improving feature tracking: the “smalls” tracker (6) 1-D correlation matching along epipolar lines: • initial search position from nearby SIFT matches • discrete SSD search (e.g., 60 pixels) • 1-D Lucas-Kanade refines the match
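A sketch of the 1-D correlation search; the candidate sampling, window size, and the omitted subpixel refinement are assumptions about the details, not the authors' code:

```python
# Sketch: discrete SSD search for a feature's match along its epipolar line.
# line_points would be sampled (e.g. ~60 positions) around the location
# predicted from nearby SIFT matches. Illustrative only.
import numpy as np

def ssd_search_along_line(template, image, line_points):
    # template: (2w+1, 2w+1) patch around the feature in the previous image.
    w = template.shape[0] // 2
    best_pos, best_ssd = None, np.inf
    for x, y in line_points:
        x, y = int(round(x)), int(round(y))
        patch = image[y - w:y + w + 1, x - w:x + w + 1]
        if patch.shape != template.shape:
            continue  # candidate too close to the image border
        ssd = np.sum((patch.astype(np.float32) - template.astype(np.float32)) ** 2)
        if ssd < best_ssd:
            best_pos, best_ssd = (x, y), ssd
    # A 1-D Lucas-Kanade step along the line would refine best_pos to
    # subpixel accuracy (omitted here).
    return best_pos
```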
Improving feature tracking: the “smalls” tracker (7) Mistracking is checked using only three-frame geometric consistency, determined using: • three-frame bundle adjustment • RANSAC
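A hedged sketch of one way to implement such a check: triangulate each track from two frames and reject it if its reprojection error in the third frame is large. This substitutes simple triangulation for the slides' three-frame bundle adjustment, and the threshold is an assumption:

```python
# Sketch: three-frame geometric consistency check. A track is kept only if
# the point triangulated from frames 0 and 1 reprojects near its observed
# position in frame 2. Camera model and threshold are illustrative.
import cv2
import numpy as np

def consistent_over_three_frames(P0, P1, P2, x0, x1, x2, max_err=2.0):
    # P0, P1, P2: 3x4 projection matrices; x0, x1, x2: 2D observations
    # of the same feature (as float NumPy arrays).
    X = cv2.triangulatePoints(P0, P1, x0.reshape(2, 1), x1.reshape(2, 1))
    X = X / X[3]                        # homogeneous -> Euclidean scale
    proj = P2 @ X                       # reproject into the third frame
    proj = (proj[:2] / proj[2]).ravel()
    return np.linalg.norm(proj - np.asarray(x2, dtype=float)) < max_err
```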