
3D SLAM for Omni-directional Camera


Presentation Transcript


  1. 3D SLAM for Omni-directional Camera Yuttana Suttasupa Advisor: Asst.Prof. Attawith Sudsang

  2. Introduction • Localization • The robot can estimate its location with respect to landmarks in an environment • Mapping • The robot can reconstruct the positions of landmarks that it encounters in an environment • SLAM • The robot builds a map and localizes itself simultaneously while traversing an unknown environment

  3. The Problem • Propose a SLAM method for a hand-held omni-directional camera • The camera moves freely in an unknown indoor environment, with no known camera motion model • Uses only bearing data from omni-images and needs no initialization information • Reconstructs the 3D camera path and a 3D landmark-based environment map

  4. The Problem • Input • a captured image sequence from an omni-directional camera

  5. The Problem • Output • a camera state - 3D position and orientation • an environment map - 3D landmark positions

  6. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  7. Omni-directional Camera • Our omni-directional camera • Two parabolic mirrors • CCD camera with 640×480 pixels at 29.97 Hz • 360° horizontal field of view • -5° to 65° vertical field of view

  8. Omni-directional Camera • Normal camera • Omni-directional camera

  9. Omni-camera Calibration • Find a mapping function from 2D image points to 3D rays • Using the Omnidirectional Camera Calibration Toolbox (Scaramuzza et al., 2006)
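The pixel-to-ray mapping this slide refers to can be sketched roughly as follows, assuming a Scaramuzza-style polynomial model. The image center and polynomial coefficients below are hypothetical placeholders; the real values come from the calibration toolbox.

```python
import numpy as np

# Hypothetical intrinsics: principal point and polynomial coefficients
# a0..a4 of the Scaramuzza model (real values come from the toolbox).
CENTER = np.array([320.0, 240.0])
POLY = np.array([-250.0, 0.0, 2.0e-3, 0.0, 1.0e-7])

def pixel_to_ray(u, v):
    """Map a 2D image point to a unit 3D viewing direction."""
    x, y = u - CENTER[0], v - CENTER[1]   # shift to the image center
    rho = np.hypot(x, y)                  # radial distance in pixels
    z = np.polyval(POLY[::-1], rho)       # f(rho) = a0 + a1*rho + ... + a4*rho^4
    ray = np.array([x, y, z])
    return ray / np.linalg.norm(ray)      # normalize to a unit bearing
```

With these placeholder coefficients the center pixel maps to the ray (0, 0, -1), i.e. straight along the mirror axis.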

  10. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  11. EKF SLAM • Using an extended Kalman filter to solve the SLAM problem • Assume the robot position and map probability distributions are Gaussian • Predict the robot position and landmark distributions using a robot motion model • Correct the distributions using an observation model

  12. EKF SLAM • The distribution representation • Initial state • Assume the robot position distribution takes some known value at the initial state: the state x(0|0) and state covariance P(0|0) define the robot probability distribution

  13. EKF SLAM • Predict state • Using a robot motion model to predict the robot position • Predicted state: x̂(k|k−1) = f(x̂(k−1|k−1), u(k)) • Predicted estimate covariance: P(k|k−1) = F(k) P(k−1|k−1) F(k)ᵀ + Q(k)

  14. EKF SLAM • Correction state • Using an observation model to update the robot position and landmark positions • Innovation residual: ỹ(k) = z(k) − h(x̂(k|k−1)), where z(k) is the measurement and h is the observation model • Innovation covariance: S(k) = H(k) P(k|k−1) H(k)ᵀ + R(k) • Optimal Kalman gain: K(k) = P(k|k−1) H(k)ᵀ S(k)⁻¹ • Updated state estimate: x̂(k|k) = x̂(k|k−1) + K(k) ỹ(k) • Updated estimate covariance: P(k|k) = (I − K(k) H(k)) P(k|k−1)
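The predict/correct cycle of these two slides can be sketched as a generic EKF step. This is a minimal sketch, not the thesis code; the motion model f, observation model h, and their Jacobians F, H are supplied by the caller.

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Prediction: push the state through the motion model f
    with Jacobian F, adding process noise Q."""
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, h, H, R):
    """Correction: fold a measurement z with observation model h,
    Jacobian H, and measurement noise R into the estimate."""
    y = z - h(x)                          # innovation residual
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # optimal Kalman gain
    x_new = x + K @ y                     # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P  # updated estimate covariance
    return x_new, P_new
```

A one-dimensional toy run (identity motion, direct measurement) splits the difference between prior and measurement exactly as the textbook equations predict.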

  15. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  16. Introduction to Problem • Feature detection problem • How does a computer recognize features in an image? • Feature association problem • How can we find feature correspondences between two images?

  17. Introduction to Problem • Observability problem • The camera provides only bearing data • How can we estimate a high-dimensional state with low-dimensional measurements? (Figure: the camera sees a landmark's bearing but not its range - "How far is it?")

  18. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  19. Solution to Problem • The proposed algorithm consists of 3 steps • Image Processing • Detect features • Find feature associations • Calculate feature measurements • SLAM • Apply measurement data to SLAM • Features and reference frames management • Add and remove features from the SLAM state • Add and remove reference frames from the SLAM state

  20. Solution to Problem • System coordinates • World frame • Camera frame • Reference frames (Figure: a landmark observed from the camera frame, the world frame, and a reference frame)

  21. Solution to Problem • SLAM State • Camera state – represents the camera frame • Reference frame states – represent the reference frames • Landmark states – represent the landmark positions

  22. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  23. Image Processing • Input • an image from the omni-directional camera • the previous SLAM state • Output • feature measurements • feature associations

  24. Image Processing • Feature detection (for new features) • Using point features • Find corners in the image using the Harris corner detector
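A minimal Harris-response sketch in the spirit of this slide, using plain NumPy rather than an image-processing library. The window size and the k constant are the usual textbook defaults, not values from the thesis.

```python
import numpy as np

def box_blur(a, win=5):
    """Separable box filter used to sum the structure tensor over a window."""
    k = np.ones(win) / win
    a = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, a)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, a)

def harris_response(img, k=0.04, win=5):
    """Harris response: large and positive where the windowed structure
    tensor has two large eigenvalues, i.e. at a corner."""
    Iy, Ix = np.gradient(img.astype(float))   # image gradients
    Sxx = box_blur(Ix * Ix, win)              # windowed structure tensor
    Syy = box_blur(Iy * Iy, win)
    Sxy = box_blur(Ix * Iy, win)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
```

On a synthetic bright square, the response peaks at the square's corners and stays near zero in flat regions and negative along edges, which is the property corner detection relies on.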

  25. Image Processing • Feature associations • Determine which landmark each feature in the current image is associated with • Find the relation between the current image and old features in an old image • Using optical flow to track features • Using template matching to refine feature positions

  26. Image Processing • Feature associations - feature tracking • Track features from the previous image to get the current feature positions • Using pyramidal Lucas-Kanade optical flow
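A single-level Lucas-Kanade step can be sketched as follows. The pyramidal version repeats this coarse-to-fine and per feature window; for simplicity this sketch solves one global displacement over the whole image.

```python
import numpy as np

def lk_step(I, J):
    """One Lucas-Kanade step: solve the least-squares system
    [Ix Iy] d = (I - J) for the displacement d of frame J relative to I."""
    Iy, Ix = np.gradient(I)   # spatial gradients (axis 0 = y/rows)
    It = I - J                # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, It.ravel(), rcond=None)
    return d                  # estimated (dx, dy) shift of the content
```

Tracking a smooth synthetic pattern shifted by a known sub-pixel amount recovers that shift up to the first-order linearization error, which is why small per-level motions (hence the pyramid) are needed.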

  27. Image Processing • Feature associations - feature position refinement • Tracking features with optical flow may cause feature drift • Using pyramidal template matching to correct the feature positions (Figure: a search region and feature patch; the current image with a drifted feature, and the result after refinement using template matching)
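The refinement step might look like the following sketch, using zero-mean normalized cross-correlation as the matching score; the exact score and search strategy used in the thesis are not specified here.

```python
import numpy as np

def ncc(patch, window):
    """Zero-mean normalized cross-correlation of two equal-size arrays."""
    p = patch - patch.mean()
    w = window - window.mean()
    denom = np.sqrt((p * p).sum() * (w * w).sum())
    return (p * w).sum() / denom if denom > 0 else 0.0

def refine_position(image, patch, guess, radius=4):
    """Scan a (2*radius+1)^2 search region around the drifted `guess`
    (top-left corner, (row, col)) for the best NCC score of `patch`."""
    h, w = patch.shape
    best, best_pos = -2.0, guess
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = guess[0] + dy, guess[1] + dx
            if y < 0 or x < 0 or y + h > image.shape[0] or x + w > image.shape[1]:
                continue
            s = ncc(patch, image[y:y + h, x:x + w])
            if s > best:
                best, best_pos = s, (y, x)
    return best_pos
```

Planting a patch at a known location and starting from a drifted guess, the search snaps back to the true position, which is exactly the drift correction the slide describes.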

  28. Image Processing • Feature associations - feature position refinement • Select the patch from a reference image • The patch rotation and scale may not match • A transform function may need to be applied to the patch (Figure: the raw reference-image patch does not match; the transformed patch matches the current image)

  29. Image Processing • Feature associations - feature position refinement • Find the transform function by projecting a 3D patch, created from the current image, onto the reference image (Figure: a 3D patch on the image sphere, projected between the current image and the reference image)

  30. Image Processing • Find the transform function • Projecting every patch pixel is computationally expensive • Use a perspective transform as the transform function instead • Only 4 projected points are needed to calculate a perspective transform function (Figure: the real distortion versus the perspective-transform approximation)
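Solving for a perspective transform from the 4 projected points can be sketched as a direct linear solve for the 8 unknowns of H (fixing h33 = 1):

```python
import numpy as np

def homography_from_4pts(src, dst):
    """Solve the 8 unknowns of a perspective transform H (h33 = 1)
    from 4 point correspondences src[i] -> dst[i]."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h1 x + h2 y + h3) / (h7 x + h8 y + 1), similarly for v
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply the perspective transform to one 2D point."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

Once H is known, the whole patch is warped by this one matrix instead of projecting every pixel through the 3D patch, which is the cost saving the slide refers to.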

  31. Image Processing • Feature associations – example

  32. Image Processing • Feature measurements • Feature points in the omni-image are used as the measurement data • Feature points must be converted into bearing-only measurements in the form of yaw and pitch angles (Figure: a ray r from the camera origin to the landmark in the x-y-z camera frame)
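The conversion from a feature ray to the yaw/pitch bearing can be sketched as below; the axis conventions (yaw about z, pitch above the x-y plane) are assumed here and may differ from the thesis.

```python
import numpy as np

def bearing_measurement(ray):
    """Convert a 3D ray r = (x, y, z) in the camera frame into the
    bearing-only measurement (yaw, pitch) in radians."""
    x, y, z = ray
    yaw = np.arctan2(y, x)                 # azimuth around the z axis
    pitch = np.arctan2(z, np.hypot(x, y))  # elevation above the x-y plane
    return yaw, pitch
```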

  33. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  34. Simultaneous localization and mapping (SLAM) • Using EKF SLAM to estimate the camera state, reference frame states and landmark states • Prediction • Determine how the camera moves • Find the state transition model (camera motion model) • Correction • Determine how to measure a landmark • Find the observation model

  35. Simultaneous localization and mapping (SLAM) • Input • Measurement data from omni-image • Output • Estimated SLAM state • Camera state • Reference frame states • Landmark states

  36. Simultaneous localization and mapping (SLAM) • Prediction • Determine how the camera moves • But the camera motion is unpredictable • Assume that the camera can move freely in any direction with some velocity limit (Figure: the state distribution before and after prediction)
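The "free motion with a velocity limit" assumption amounts to a random-walk prediction: the state estimate is left unchanged while its covariance is inflated by how far the camera could have moved in one time step. A sketch, with a hypothetical 6-DOF state layout:

```python
import numpy as np

def predict_free_motion(x, P, dt, v_max, w_max):
    """Constant-position prediction for a camera with no motion model.
    Hypothetical state layout: x[0:3] position, x[3:6] orientation (rad)."""
    Q = np.zeros_like(P)
    Q[0:3, 0:3] = np.eye(3) * (v_max * dt) ** 2   # translation uncertainty
    Q[3:6, 3:6] = np.eye(3) * (w_max * dt) ** 2   # rotation uncertainty
    return x.copy(), P + Q                         # state unchanged, P inflated
```

The larger the assumed velocity bounds, the wider the predicted distribution, so the subsequent correction step leans more heavily on the measurements.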

  37. Simultaneous localization and mapping (SLAM) • Correction • Using a batch of measurements (including the current measurement data and old measurement data at the reference frames) to update the SLAM state (Figure: a landmark observed from the current camera and two reference frames)

  38. Simultaneous localization and mapping (SLAM) • Correction • Measurement data for landmark i: the bearing measurements of the landmark from the current frame and the reference frames • Observation model for each measurement: project the landmark into the observing frame, where y′ is the landmark position expressed in that frame's coordinates

  39. Simultaneous localization and mapping (SLAM) • The correction step can be separated into 2 parts • Camera and reference frames correction • Landmarks correction

  40. Simultaneous localization and mapping (SLAM) • Camera and reference frames correction • Assume that the measurement data measure the landmark positions accurately • The correction affects only the camera state and reference frame states (Figure: the state before and after correction)

  41. Simultaneous localization and mapping (SLAM) • Landmarks correction • Assume that the camera state is accurate • The correction affects only the landmark states (Figure: the state before and after correction)

  42. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  43. Features and reference frames management • Remove features • The feature point is out of the image bounds • The landmark position is not accurate enough (Figure: the feature of this landmark is out of bounds)

  44. Features and reference frames management • Add features • Add new features • Use the Harris corner detector to detect new features • Add new features when we have a new reference frame • Add old features • An old landmark may appear in the omni-image again

  45. Features and reference frames management • Add new features • Add new landmarks to the SLAM state • Estimate a landmark position by assuming a large variance for the range data
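Initializing a landmark with a large range variance can be sketched by unprojecting the bearing at an assumed depth and propagating a first-order covariance; the depth guess and sigmas below are illustrative, not the thesis values.

```python
import numpy as np

def init_landmark(cam_pos, yaw, pitch, depth_guess, sigma_depth, sigma_bearing):
    """Initialize a landmark from a bearing-only measurement by assuming
    a depth with a large variance along the viewing ray."""
    # Unit ray from the bearing angles
    d = np.array([np.cos(pitch) * np.cos(yaw),
                  np.cos(pitch) * np.sin(yaw),
                  np.sin(pitch)])
    p = cam_pos + depth_guess * d            # assumed landmark position
    # Jacobian of p w.r.t. (depth, yaw, pitch) for first-order covariance
    J = np.column_stack([
        d,
        depth_guess * np.array([-np.cos(pitch) * np.sin(yaw),
                                 np.cos(pitch) * np.cos(yaw), 0.0]),
        depth_guess * np.array([-np.sin(pitch) * np.cos(yaw),
                                -np.sin(pitch) * np.sin(yaw),
                                 np.cos(pitch)]),
    ])
    S = np.diag([sigma_depth ** 2, sigma_bearing ** 2, sigma_bearing ** 2])
    return p, J @ S @ J.T
```

The resulting covariance is elongated along the viewing ray and tight across it, so later observations from other viewpoints can collapse the depth uncertainty.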

  46. Features and reference frames management • Add old features • Project an old landmark onto the current image • Check whether the feature is visible in the image using template matching (Figure: the landmark projected through the image sphere onto the feature position)

  47. Features and reference frames management • Add reference frames • When no reference frame is suitable for feature tracking • When the number of landmarks is below some threshold • Select the current camera state as a new reference frame

  48. Outline • Introduction • Omni-directional Camera • EKF-SLAM • Introduction to Problem • Solution to Problem • Image Processing • SLAM • Features and reference frames management • Experimental Results • Result Evaluation

  49. Experimental Results

  50. Experimental Results
