3D SLAM for Omni-directional Camera Yuttana Suttasupa Advisor: Asst.Prof. Attawith Sudsang
Introduction • Localization • The robot can estimate its location with respect to landmarks in an environment • Mapping • The robot can reconstruct the positions of landmarks that it encounters in an environment • SLAM • The robot builds a map and localizes itself simultaneously while traversing an unknown environment
The Problem • Propose a SLAM method for a hand-held omni-directional camera • The camera moves freely in an unknown indoor environment without a known camera motion model • Use only bearing data from omni-images, with no initialization information required • Reconstruct the 3D camera path and a 3D environment map (landmark-based)
The Problem • Input • a captured image sequence from an omni-directional camera
The Problem • Output • a camera state - 3D position and orientation • an environment map - 3D landmark positions
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
Omni-directional Camera • Our omni-directional camera • Two parabolic mirrors • CCD camera with 640×480 pixels @ 29.97 Hz • 360° horizontal field of view • -5° to 65° vertical field of view
Omni-directional Camera • Normal camera • Omni-directional camera
Omni Camera Calibration • Find a mapping function from 2D image points to 3D viewing rays • Using the Omnidirectional Camera Calibration Toolbox (Scaramuzza et al., 2006)
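The toolbox fits a polynomial that relates a pixel's radial distance from the image center to the z-component of its viewing ray. A minimal sketch of that pixel-to-ray mapping, with hypothetical polynomial coefficients and image center standing in for the calibrated values:

```python
import numpy as np

# Hypothetical calibration values; real ones come from the toolbox.
A0, A2, A3, A4 = -100.0, 0.002, 0.0, 0.0   # back-projection polynomial coefficients
CX, CY = 320.0, 240.0                      # assumed image center for a 640x480 sensor

def pixel_to_ray(u, v):
    """Map an image pixel (u, v) to a unit-length 3D viewing ray."""
    x, y = u - CX, v - CY                  # center the pixel coordinates
    rho = np.hypot(x, y)                   # radial distance from the image center
    z = A0 + A2 * rho**2 + A3 * rho**3 + A4 * rho**4
    ray = np.array([x, y, z])
    return ray / np.linalg.norm(ray)       # normalize to a unit bearing vector
```

Every observed feature point can then be turned into a direction in space without any knowledge of its range.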
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
EKF SLAM • Using an extended Kalman filter to solve the SLAM problem • Assume the robot position and map probability distributions are Gaussian • Predict the robot position and landmark distributions using a robot motion model • Correct the distributions using an observation model
EKF SLAM • The distribution representation • The robot probability distribution is represented by a state estimate x̂ and a state covariance P • Initial state • Assume the robot position distribution takes some known value at the initial state
EKF SLAM • Predict state • Using the robot motion model to predict the robot position • Predicted state: x̂(k|k-1) = f(x̂(k-1|k-1)) • Predicted estimate covariance: P(k|k-1) = F P(k-1|k-1) Fᵀ + Q
EKF SLAM • Correction state • Using the observation model to update the robot position and landmark positions • Observation model: ẑ = h(x̂(k|k-1)) • Innovation residual: y = z − ẑ, where z is the measurement of a landmark from the robot • Innovation covariance: S = H P(k|k-1) Hᵀ + R • Optimal Kalman gain: K = P(k|k-1) Hᵀ S⁻¹ • Updated state estimate: x̂(k|k) = x̂(k|k-1) + K y • Updated estimate covariance: P(k|k) = (I − K H) P(k|k-1)
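The predict and correct equations above can be sketched as one generic EKF step; `f`, `h` and their Jacobians `F`, `H` are placeholders for whatever motion and observation models are plugged in:

```python
import numpy as np

def ekf_step(x, P, f, F, Q, z, h, H, R):
    """One extended Kalman filter cycle: predict, then correct."""
    # Prediction: propagate state and covariance through the motion model
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Correction: fuse the measurement z through the observation model
    y = z - h(x_pred)                          # innovation residual
    S = H @ P_pred @ H.T + R                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # optimal Kalman gain
    x_new = x_pred + K @ y                     # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P_pred  # updated estimate covariance
    return x_new, P_new
```

With equal process and measurement noise, a scalar example lands the estimate halfway between prediction and measurement, which is a quick sanity check on the gain computation.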
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
Introduction to Problem • Feature detection Problem • How a computer recognizes objects from an image • Feature association Problem • How can we find feature relations between two images
Introduction to Problem • Observability Problem • The camera provides only bearing data • How can we estimate a high-dimensional state with low-dimensional measurements? • A bearing alone does not tell how far a landmark is from the camera
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
Solution to Problem • The proposed algorithm includes 3 steps • Image Processing • Detect features • Find feature associations • Calculate feature measurements • SLAM • Apply measurement data to SLAM • Features and reference frames management • Add and remove features from SLAM state • Add and remove reference frames from SLAM state
Solution to Problem • System Coordinates • World Frame • Camera Frame • Reference Frames • (figure: a landmark seen from the camera frame, a reference frame and the world frame)
Solution to Problem • SLAM State • Camera state – represent camera frame • Reference frame states – represent reference frames • Landmark states – represent landmark positions
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
Image Processing • Input • an image from an omni-directional camera • Old SLAM state • Output • Feature measurement • Feature association
Image Processing • Feature detection (for new features) • Using point features • Find corners in the image using the Harris corner detector
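As a sketch of what the detector computes, here is a NumPy-only Harris corner response (the thesis presumably uses a library implementation; the window size and the constant k below are typical defaults, not values from the source):

```python
import numpy as np

def harris_response(img, k=0.04, win=3):
    """Harris corner response map for a grayscale image (NumPy-only sketch)."""
    Iy, Ix = np.gradient(img.astype(float))        # image gradients
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    def box(a):
        # Sum each pixel's (2*win+1)^2 neighborhood via a padded box filter
        pad = np.pad(a, win)
        out = np.zeros_like(a)
        for dy in range(-win, win + 1):
            for dx in range(-win, win + 1):
                out += pad[win + dy : win + dy + a.shape[0],
                           win + dx : win + dx + a.shape[1]]
        return out
    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)   # smoothed structure tensor
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace                 # Harris corner measure

# A white square on black: corners score high, edges and flat areas do not
img = np.zeros((40, 40)); img[10:30, 10:30] = 1.0
R = harris_response(img)
```

Pixels with a large positive response (strong gradients in both directions) become the new point features.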
Image Processing • Feature associations • Determine which landmark each feature in the current image is associated with • Find the relation between the current image and old features in a previous image • Using optical flow to track features • Using template matching to refine the feature positions
Image Processing • Feature associations – feature tracking • Track features from the previous image to get the current feature positions • Using pyramidal Lucas-Kanade optical flow
Image Processing • Feature associations – feature position refinement • Tracking features with optical flow may cause feature drift • Using pyramidal template matching to correct a feature position • (figure: a drifted feature in the current image is corrected by matching the feature patch within a search region)
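The drift-correction idea can be sketched with plain SSD template matching: search a small region around the optical-flow estimate for the position that best matches the stored feature patch. The patch size, search radius, and SSD score below are illustrative choices, not the thesis's exact parameters:

```python
import numpy as np

def refine_feature(cur, patch, guess, radius=3):
    """Correct a drifted feature by exhaustive template matching around the
    optical-flow estimate `guess` (row, col of the patch's top-left corner)."""
    ph, pw = patch.shape
    best, best_pos = np.inf, guess
    r0, c0 = guess
    for r in range(r0 - radius, r0 + radius + 1):       # scan the search region
        for c in range(c0 - radius, c0 + radius + 1):
            cand = cur[r : r + ph, c : c + pw]
            ssd = np.sum((cand - patch) ** 2)           # sum of squared differences
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```

Even if optical flow drifts by a couple of pixels, the match snaps the feature back to its true location as long as the drift stays inside the search radius.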
Image Processing • Feature associations – feature position refinement • Select the patch from a reference image • The patch rotation and scale may not match • A transform function may need to be applied to the patch • (figure: the reference-image patch does not match the current image until transformed)
Image Processing • Feature associations – feature position refinement • Find the transform function by projecting a 3D patch created from the current image onto the reference image • (figure: the 3D patch on the image sphere links the current image and the reference image)
Image Processing • Find the transform function • Projecting every patch pixel leads to a computational cost problem • Use a perspective transform as the transform function instead • Need only 4 projected points to calculate the perspective transform function • (figure: real distortion vs. perspective distortion)
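The 4-point perspective transform can be sketched as a direct linear solve for the 8 unknown homography entries (fixing the bottom-right entry to 1); this is the standard construction, not code from the thesis:

```python
import numpy as np

def perspective_from_4pts(src, dst):
    """Solve the 3x3 perspective transform mapping 4 src points to 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in the 8 unknowns
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, p):
    """Apply a perspective transform to a 2D point."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]                     # projective normalization
```

Warping the whole patch through `H` is far cheaper than projecting every pixel through the full mirror model, at the cost of only approximating the true distortion.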
Image Processing • Feature associations – example
Image Processing • Feature measurements • Using feature points in the omni-image as the measurement data • Feature points must be converted into bearing-only measurements in the form of yaw and pitch angles • (figure: the ray r to a landmark in the x-y-z camera frame)
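Converting a viewing ray into the yaw/pitch bearing used as the measurement can be sketched as:

```python
import numpy as np

def ray_to_bearing(ray):
    """Convert a 3D viewing ray to a bearing-only (yaw, pitch) measurement."""
    x, y, z = ray
    yaw = np.arctan2(y, x)                    # azimuth angle in the x-y plane
    pitch = np.arctan2(z, np.hypot(x, y))     # elevation angle above the x-y plane
    return yaw, pitch
```

Note that the ray's length cancels out, which is exactly why the range to the landmark remains unobserved in a single measurement.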
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
Simultaneous localization and mapping (SLAM) • Using EKF SLAM to estimate the camera state, reference frame states and landmark states • Prediction • Determine how the camera moves • Find the state transition model (camera motion model) • Correction • Determine how to measure a landmark • Find the observation model
Simultaneous localization and mapping (SLAM) • Input • Measurement data from omni-image • Output • Estimated SLAM state • Camera state • Reference frame states • Landmark states
Simultaneous localization and mapping (SLAM) • Prediction • Determine how the camera moves • But the camera motion is unpredictable • Assume the camera can move freely in any direction with some limited velocity • (figure: the camera uncertainty grows from before to after the prediction)
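With no motion model, the prediction can be sketched as a random walk: the camera state stays put while its covariance is inflated by the assumed per-frame motion limits. The noise magnitudes and the 6-DOF pose layout below are illustrative assumptions:

```python
import numpy as np

def predict_camera(x_cam, P_cam, sigma_t=0.05, sigma_r=0.02):
    """Prediction without a known motion model: keep the camera state and
    inflate its covariance by the assumed per-frame translation/rotation limits."""
    # Process noise for a 6-DOF pose: 3 translation axes + 3 rotation axes
    Q = np.diag([sigma_t**2] * 3 + [sigma_r**2] * 3)
    return x_cam.copy(), P_cam + Q          # mean unchanged, uncertainty grows
```

The correction step then has to do all the work of pinning the camera back down from the bearing measurements.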
Simultaneous localization and mapping (SLAM) • Correction • Using a batch of measurements (current measurement data together with old measurement data at the reference frames) to update the SLAM state • (figure: a landmark observed from the current camera and from the reference frames)
Simultaneous localization and mapping (SLAM) • Correction • Measurement data for landmark i: bearing measurements taken from the current camera frame and from the reference frames • Observation model for each measurement: the predicted bearing y′ of the landmark position expressed in the coordinates of the frame X that made the measurement
Simultaneous localization and mapping (SLAM) • The correction step can be separated into 2 parts • Camera and reference frames correction • Landmarks correction
Simultaneous localization and mapping (SLAM) • Camera and reference frames correction • Assume that the measurement data measures the landmark positions accurately • The correction affects only the camera state and reference frame states • (figure: before and after the correction)
Simultaneous localization and mapping (SLAM) • Landmarks correction • Assume that the camera state is accurate • The correction affects only the landmark states • (figure: before and after the correction)
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation
Features and reference frames management • Remove features • The feature point is out of the image bounds • The landmark position is not accurate enough • (figure: the feature of a removed landmark lies outside the image bounds)
Features and reference frames management • Add features • Add new features • Using the Harris corner detector to detect new features • Add new features when we have a new reference frame • Add old features • Consider that an old landmark may appear in the omni-image again
Features and reference frames management • Add new features • Add new landmarks to the SLAM state • Estimate a landmark position by assuming a large variance for the range data
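The large-range-variance initialization can be sketched as placing the landmark at a guessed depth along its bearing ray, with the covariance stretched along the ray (unknown depth) and tight across it (accurate bearing). The depth guess and noise values below are illustrative assumptions:

```python
import numpy as np

def init_landmark(cam_pos, ray, r0=3.0, sigma_r=10.0, sigma_b=0.01):
    """Initialize a landmark along a unit bearing ray at a guessed depth r0,
    with a large range variance so the depth stays weakly constrained."""
    pos = cam_pos + r0 * ray                     # initial landmark position
    rr = np.outer(ray, ray)                      # projector onto the ray direction
    # Large variance along the ray, small variance perpendicular to it
    P = sigma_r**2 * rr + (r0 * sigma_b)**2 * (np.eye(3) - rr)
    return pos, P
```

Later bearing measurements from other viewpoints then collapse the long axis of this covariance and recover the true depth.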
Features and reference frames management • Add old features • Project an old landmark onto the current image • Check whether the feature is available in the image using template matching • (figure: a landmark projected through the image sphere to a feature in the image)
Features and reference frames management • Add reference frames • When there is no suitable reference frame for feature tracking • When the landmark count falls below some threshold • Select the current camera state as a new reference frame
Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation