615 Project: Real-time monocular vision-based SLAM
Adam Rachmielowski
Overview • SFM and SLAM • Extended Kalman filter • Visual SLAM details • Results • Next
Estimating structure and motion • Factorization [Tomasi & Kanade ’92] • Batch method • Efficient • Originally for affine camera • Missing data? • Finite camera [Sturm & Triggs] W = MX
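A minimal sketch of the factorization idea behind W = MX (my own illustration, not the project's code): under the affine camera model the centred measurement matrix has rank 3, so an SVD splits it into motion M and structure X.

```python
import numpy as np

def affine_factorization(W):
    """Factor a 2F x N measurement matrix W into motion M and structure X.

    W holds the image coordinates of N points tracked over F frames,
    with the per-frame centroid subtracted (affine camera assumption).
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    # Under the affine model the centred W has rank 3: keep the top 3 components.
    M = U[:, :3] * np.sqrt(s[:3])           # 2F x 3 motion (camera) matrix
    X = np.sqrt(s[:3])[:, None] * Vt[:3]    # 3 x N structure matrix
    return M, X                             # W ~= M @ X, up to an affine ambiguity
```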
Estimating structure and motion • Reconstruction from N views [Hartley & Zisserman '00] • Multiview geometric entities and algorithms described by Faugeras, Hartley, Zisserman, and others • Minimize global error with bundle adjustment • Can be used sequentially • Upgrade to Euclidean with auto-calibration • x′ᵀFx = 0, x = PX
SLAM • Simultaneous Localisation And Mapping • Estimate robot’s pose and map feature positions • Probabilistic framework maintains • current estimate • estimate uncertainty (covariance) • Update based on measurements and model • Many systems use • odometry and active sensors as measurement devices • limited motion models
Vision-based SLAM • Camera for measurements • Trinocular • 3D measurements by triangulation • Offline [Ayache, Faugeras ’89] • Real-time with SIFTs [Se, Lowe, Little ’01] • Real-time monocular [Chiuso et al. ’00]
Kalman filter [Swerling '58][Welch, Bishop '01] • Estimates the state of a dynamic system • Integrates noisy measurements to give an optimal estimate • Assumes Gaussian noise • First-order Markov process
KF: key variables • x̂_k: estimate of state at time k • P_k: error covariance (estimate uncertainty) • F: state transition function • z_k: measurement • H: maps state to measurement space • Q, R: noise covariances
KF: Two-phase estimation (Predict) • Predicted state: x̂⁻ = F x̂ • Predicted covariance: P⁻ = F P Fᵀ + Q
KF: Two-phase estimation (Update) • Innovation: y = z − H x̂⁻ • Innovation covariance: S = H P⁻ Hᵀ + R • Kalman gain: K = P⁻ Hᵀ S⁻¹ • Updated state: x̂ = x̂⁻ + K y • Updated covariance: P = (I − K H) P⁻
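The two phases above, written out as a small NumPy sketch of the standard linear KF equations (variable names are mine):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    # Predicted state and covariance
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    y = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x_pred + K @ y                     # updated state
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred  # updated covariance
    return x, P
```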
EKF: Extended Kalman filter • Allows non-linear process and measurement functions f, h • Apply f, h to the state • Apply their Jacobians F, H to the covariances • Linearize around the current estimate
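A sketch of the EKF measurement update: the non-linear function h is applied to the state itself, while its Jacobian enters the covariance terms. The finite-difference Jacobian here is only for illustration; an analytic Jacobian is normal in practice.

```python
import numpy as np

def numerical_jacobian(h, x, eps=1e-6):
    """Finite-difference Jacobian of h around the current estimate x."""
    hx = h(x)
    J = np.zeros((hx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (h(x + dx) - hx) / eps
    return J

def ekf_update(x_pred, P_pred, z, h, R):
    H = numerical_jacobian(h, x_pred)   # linearize around the prediction
    y = z - h(x_pred)                   # apply the non-linear function to the state
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + K @ y, (np.eye(x_pred.size) - K @ H) @ P_pred
```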
Visual SLAM details [Davison '03] • State representation x, P • Process model F (motion) • Measurement model H (projection) • State update • System initialization • Adding and removing features
State representation • Scene structure (feature points) • Depth from reference image [Azarbayejani, Pentland ’95] • x,y,z coordinates • Camera • Pose • Motion
State estimate vector • Points y_i • Camera x_v • 6-DOF pose • Constant-velocity motion model • Acceleration modeled as noise
Covariance matrix • Covariance blocks • P_xx: camera parameters • P_yiyi: point y_i • Off-diagonal blocks represent correlation between estimates
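One possible layout of the joint state and covariance described in the two slides above. The sizes follow a Davison-style parameterisation (3 position + 4 quaternion + 3 linear velocity + 3 angular velocity for the camera, 3 coordinates per point); treat the exact indices as illustrative.

```python
import numpy as np

CAM_DIM = 13   # r (3) + quaternion q (4) + v (3) + omega (3)
PT_DIM = 3     # x, y, z per feature point

def camera_block(P):
    """P_xx: covariance of the camera parameters."""
    return P[:CAM_DIM, :CAM_DIM]

def point_block(P, i):
    """P_yiyi: covariance of point i."""
    s = CAM_DIM + i * PT_DIM
    return P[s:s + PT_DIM, s:s + PT_DIM]

def cam_point_cross(P, i):
    """Off-diagonal block: correlation between the camera and point i."""
    s = CAM_DIM + i * PT_DIM
    return P[:CAM_DIM, s:s + PT_DIM]
```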
Process model • Points don't move: y_k = y_{k-1} • Add velocity and acceleration to the current camera parameters • Covariance updated using the Jacobian of the process model
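A sketch of a constant-velocity process model for the camera part of the state, with the feature points copied through unchanged. This assumes the 13-parameter camera layout used above; it is an illustration, not the project's implementation.

```python
import numpy as np

def quat_mult(a, b):
    """Hamilton product of quaternions a = (w, x, y, z) and b."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw])

def quat_from_omega(omega, dt):
    """Quaternion for a rotation of omega * dt (axis-angle)."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = omega / np.linalg.norm(omega)
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

def process_model(x, dt):
    """Constant-velocity prediction: move the camera, leave the points alone."""
    r, q, v, w = x[:3], x[3:7], x[7:10], x[10:13]
    x_new = x.copy()
    x_new[:3] = r + v * dt                              # position advances with velocity
    x_new[3:7] = quat_mult(q, quat_from_omega(w, dt))   # orientation advances with omega
    # Velocities are unchanged; unmodeled acceleration enters through the noise Q.
    # Feature points x[13:] are static: y_k = y_{k-1}.
    return x_new
```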
Measurement model • H models projection of the predicted points by the predicted camera • Innovation covariance S_i guides the feature-match search
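A sketch of the measurement model: project a predicted point through the predicted camera (plain pinhole model; the rotation R and translation t are assumed to have been extracted from the camera state already), and form the innovation covariance S_i that bounds the match search.

```python
import numpy as np

def project(K, R, t, y):
    """Pinhole projection of world point y with camera rotation R, translation t."""
    p_cam = R @ y + t            # point in the camera frame
    u = K @ p_cam
    return u[:2] / u[2]          # perspective division -> pixel coordinates

def innovation_covariance(H_i, P, R_meas):
    """S_i = H_i P H_i^T + R: uncertainty of the predicted measurement.

    Its 2x2 value (in pixels) defines the elliptical search region for point i.
    """
    return H_i @ P @ H_i.T + R_meas
```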
Making measurements / Update • Project innovation covariance to search ellipse • Warp template based on camera and point prediction • If viewing angle is good, match to get measurement • Compute Kalman gain and update state and covariance
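The "search ellipse" corresponds to a Mahalanobis gate on candidate matches; a minimal sketch (the 3-sigma gate is an illustrative choice):

```python
import numpy as np

def inside_search_ellipse(z_candidate, z_pred, S, n_sigma=3.0):
    """Accept a candidate pixel if its Mahalanobis distance from the
    predicted measurement lies within the n-sigma gate defined by S."""
    nu = z_candidate - z_pred
    d2 = nu @ np.linalg.solve(S, nu)   # nu^T S^-1 nu
    return d2 <= n_sigma ** 2
```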
System initialization • Need initial estimate and covariance • Calibration object • SFM • Process covariance • Small: small searches, but can only handle small accelerations • Large: can handle big accelerations, but need many measurements • Measurement covariance • Function of matching method (camera resolution)
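How the process/measurement covariance trade-off might be expressed in code; the noise magnitudes below are purely illustrative placeholders, not values from this project.

```python
import numpy as np

def make_process_noise(sigma_a=0.5, sigma_alpha=0.5, dt=1.0 / 30):
    """Process noise on the camera velocities: how much unmodeled (linear and
    angular) acceleration we allow per step. Larger values handle bigger
    accelerations but enlarge the search regions."""
    Q_cam = np.zeros((13, 13))
    Q_cam[7:10, 7:10] = (sigma_a * dt) ** 2 * np.eye(3)        # linear velocity noise
    Q_cam[10:13, 10:13] = (sigma_alpha * dt) ** 2 * np.eye(3)  # angular velocity noise
    return Q_cam

def make_measurement_noise(sigma_px=1.0):
    """Pixel-level matching uncertainty, tied to the matcher and camera resolution."""
    return sigma_px ** 2 * np.eye(2)
```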
Adding and removing features • Add • Select a salient feature in the desired region • Search along the epipolar line • Remove • If matching repeatedly fails [Davison '03]
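A sketch of the epipolar search when initializing a new feature: candidates in the second view are kept only if they lie close to the epipolar line l = Fx of the observation in the first view (F here is the fundamental matrix between the two views, not the state-transition matrix; the 2-pixel tolerance is illustrative).

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l = F x in the second image for pixel x = (u, v) in the first."""
    l = F @ np.array([x[0], x[1], 1.0])
    return l / np.linalg.norm(l[:2])   # normalize so point-line distance is in pixels

def near_epipolar_line(l, x2, tol=2.0):
    """Keep a candidate match x2 if it lies within tol pixels of the line."""
    return abs(l @ np.array([x2[0], x2[1], 1.0])) <= tol
```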
Preliminary results • Simulation [implemented with Birkbeck] • Behaves according to model • Initial estimate of camera and 4 key points is true value + small amount of noise • Initial estimate of other points is true value + significant noise • Initial covariance is scaled identity
Next • Real images (video sequence) • Feature matching • Tracking • SIFT features? • Real-time issues • Postponement [Davison '01] • Loop closing • Davison's system automatically corrects if a feature becomes visible and is correctly measured, but… • Prevent drift by incorporating explicit loop closing [Newman, Ho '05]