
CoSLAM: Collaborative Visual SLAM in Dynamic Environments



Presentation Transcript


  1. CoSLAM: Collaborative Visual SLAM in Dynamic Environments
  Presenter: Jeongkyun Lee

  2. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013
  Contents
  • Introduction
  • Proposed method
  • Experimental results
  • References

  3. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Introduction
  • Viewpoint 1. Visual SLAM with a single camera
    • Structure-from-motion (SFM) based SLAM
      • Visual odometry [1]
      • Real-time SLAM using a local bundle adjustment method [2]
      • PTAM [3]
      • Etc.
    • Bayesian-inference (or filter) based SLAM
      • EKF monocular SLAM [4]

  4. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Introduction
  • Viewpoint 2. Visual SLAM with multiple cameras
    • Visual odometry with a stereo rig [1]
    • 6-DoF SLAM with stereo-in-hand [5]
    • Visual SLAM with a multi-camera rig [6]
    • Etc.
  • Viewpoint 3. SLAM in dynamic environments
    • SLAMMOT (SLAM and moving object tracking) [7]
    • SLAMIDE (SLAM in dynamic environments) [8]

  5. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Introduction
  • CoSLAM
    • Deals with moving objects, i.e., dynamic environments, by using multiple cameras.
    • The cameras interact with one another.
    • Combines SFM-based SLAM with filter-based SLAM to deal with moving objects.

  6. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method • Overall procedure

  7. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Properties
    • Main system: a conventional sequential SFM method.
    • Map points: 3D position vectors with uncertainty.
    • Measurements: KLT feature tracking (or active matching).
    • Update: map points are updated using the Kalman gain.
    • Refinement: bundle adjustment over key frames.

  8. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera pose estimation
    • Intra-camera pose estimation
      • Each camera works independently.
      • Track feature points within a camera and compute its pose.
    • Inter-camera pose estimation
      • Used in dynamic environments, where the number of static map points can be small, or the static points may be distributed within a small image region.
      • Uses both static and dynamic points to obtain poses for all cameras.
    • Switching conditions (see the sketch after this slide):
      • The number of dynamic points is greater than the number of static points, or
      • The area covered by the convex hull of the static feature points is less than 20% of the image area.
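A minimal sketch of the switching test on slide 8. The 20% area threshold comes from the slide; the function name, the array layout, and the fallback for degenerate hulls are illustrative assumptions, not CoSLAM's implementation.

```python
import numpy as np
from scipy.spatial import ConvexHull

def use_inter_camera(static_pts, dynamic_pts, img_w, img_h, area_ratio=0.2):
    """Return True if inter-camera pose estimation should be used."""
    # Condition 1: more dynamic points than static points.
    if len(dynamic_pts) > len(static_pts):
        return True
    # Condition 2: static points cover too small an image region.
    if len(static_pts) < 3:          # hull undefined; fall back to inter-camera
        return True
    hull = ConvexHull(np.asarray(static_pts, dtype=float))
    return hull.volume < area_ratio * img_w * img_h  # 2D "volume" is the area

# Example: static points clustered in one corner of a 640x480 image.
static = np.random.rand(50, 2) * 100            # all within a 100x100 patch
dynamic = np.random.rand(20, 2) * [640, 480]
print(use_inter_camera(static, dynamic, 640, 480))  # True (hull area < 20%)
```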

  9. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera pose estimation: Intra-camera pose estimation
    • Minimize the reprojection errors by the iteratively re-weighted least squares (IRLS) method (see the sketch after this slide):
      $\min_{R,t} \sum_i \rho\big( \| \pi(R, t, M_i) - m_i \| \big)$
    • $(R, t)$ is initialized according to the camera pose at the previous frame.
      • $\pi(R, t, M_i)$ : the image projection of the 3D point $M_i$
      • $m_i$ : the image feature point registered to $M_i$
      • $\| \cdot \|$ : measures the distance between two image points
      • $i$ : an index of feature points
      • $\rho$ : an M-estimator, the Tukey bi-weight function
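A minimal IRLS sketch with a Tukey bi-weight M-estimator, illustrating the robust minimization on slide 9 on a toy linear problem. The tuning constant c, the MAD-based scale estimate, and the line-fit example are generic assumptions; the paper optimizes a full 6-DoF pose rather than a linear model.

```python
import numpy as np

def tukey_weight(r, c=4.685):
    """IRLS weight derived from the Tukey bi-weight function."""
    w = np.zeros_like(r)
    inliers = np.abs(r) < c
    w[inliers] = (1.0 - (r[inliers] / c) ** 2) ** 2
    return w

def irls(A, b, iters=10):
    """Solve min sum_i rho(A x - b) by iteratively re-weighted least squares."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # plain LS initialization
    for _ in range(iters):
        r = A @ x - b                                 # current residuals
        s = 1.4826 * np.median(np.abs(r)) + 1e-12     # robust scale (MAD)
        w = tukey_weight(r / s)
        W = np.sqrt(w)[:, None]
        x = np.linalg.lstsq(W * A, W[:, 0] * b, rcond=None)[0]
    return x

# Example: a line fit with 30% gross outliers; IRLS downweights them.
rng = np.random.default_rng(0)
t = rng.uniform(0, 10, 100)
y = 2.0 * t + 1.0 + rng.normal(0, 0.1, 100)
y[:30] += rng.uniform(5, 20, 30)                      # outliers
A = np.column_stack([t, np.ones_like(t)])
print(irls(A, y))                                     # approx [2.0, 1.0]
```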

  10. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera pose estimation: Inter-camera pose estimation
    • In dynamic scenes the static points alone may be insufficient; therefore, both static and dynamic points are used.

  11. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera pose estimation: Inter-camera pose estimation
    • Only applied to cameras within the same group.
    • Initialized from the previous frame:
      $\min_{\{R_c, t_c\}} \sum_c \sum_{i \in S \cup D} v_i^c \, \rho\big( \| \pi(R_c, t_c, M_i) - m_i^c \| \big)$
      • $c$ : an index of cameras
      • $S \cup D$ : the set of 'static' and 'dynamic' map points
      • $v_i^c$ : the visibility of the i-th map point at camera c (1 or 0)

  12. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Map maintenance
    • Maintain the position uncertainty of each map point
      • to help point registration, and
      • to distinguish static and dynamic points.
    • A map point is stored as $(M, \Sigma)$ (see the sketch after this slide):
      • $M$ : the triangulated position
      • $\Sigma$ : the covariance matrix that measures the position uncertainty
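A minimal sketch of the map-point record described on slide 12, with the state labels from slide 18. The class and field names are illustrative assumptions; the slides do not give CoSLAM's actual data layout.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class MapPoint:
    position: np.ndarray                       # triangulated 3D position M, shape (3,)
    covariance: np.ndarray                     # position uncertainty Sigma, shape (3, 3)
    state: str = "static"                      # 'static' | 'dynamic' | 'false' | 'uncertain'
    observations: dict = field(default_factory=dict)  # camera id -> 2D feature point

pt = MapPoint(position=np.zeros(3), covariance=np.eye(3) * 1e-2)
pt.observations[0] = np.array([320.0, 240.0])
print(pt.state)  # 'static': all points start as static (slide 18)
```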

  13. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Map maintenance: Position uncertainty of map points
    • Only consider the uncertainty in feature detection and triangulation.
    • Assuming a feature detection error of $\mathcal{N}(0, \sigma^2 I)$, update the 3D position given a new observation (see the sketch after this slide):
      $K = \Sigma J^\top (J \Sigma J^\top + \sigma^2 I)^{-1}$
      $M \leftarrow M + K\,(m - \hat{m}), \quad \Sigma \leftarrow (I - K J)\,\Sigma$
      • $J$ : the Jacobian of the camera projection function that maps a 3D map point to its 2D image coordinates in all views
      • $n$ : the number of views used for triangulation
      • $\hat{m}$ : the image projection of $M$ in the current frame
      • $K$ : the Kalman gain
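A minimal sketch of the per-point Kalman step on slide 13, under the slide's assumption of feature-detection noise N(0, sigma^2 I). The pinhole projection, the numerical Jacobian, and all names are illustrative; CoSLAM stacks the Jacobians of all n views that observe the point.

```python
import numpy as np

def project(K_cam, R, t, M):
    """Pinhole projection of a 3D point M into one view."""
    p = K_cam @ (R @ M + t)
    return p[:2] / p[2]

def kalman_update(M, Sigma, J, m_obs, m_pred, sigma=1.0):
    """One Kalman step; J is the (2n x 3) stacked projection Jacobian."""
    S = J @ Sigma @ J.T + sigma**2 * np.eye(J.shape[0])   # innovation covariance
    K = Sigma @ J.T @ np.linalg.inv(S)                    # Kalman gain
    M_new = M + K @ (m_obs - m_pred)                      # position update
    Sigma_new = (np.eye(3) - K @ J) @ Sigma               # covariance update
    return M_new, Sigma_new

def num_jacobian(f, M, eps=1e-6):
    """Numerical 2x3 Jacobian for the sketch (analytic in practice)."""
    J = np.zeros((2, 3))
    for k in range(3):
        d = np.zeros(3); d[k] = eps
        J[:, k] = (f(M + d) - f(M - d)) / (2 * eps)
    return J

# Usage: J = num_jacobian(lambda X: project(K_cam, R, t, X), M)
```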

  14. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Map maintenance: Position uncertainty of map points
    • The computation is independent at each point (for efficiency), which enables parallel computation.
    • Even for static points, the reconstructed positions keep changing over time due to triangulation uncertainty.

  15. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Map maintenance: Map point generation: Intra-camera mapping
    • Reconstruct static map points from feature tracks in each individual camera.
    • If an unmapped feature track is long enough (> N frames), triangulate a 3D point from its beginning and end frames (see the sketch after this slide).
    • If the Mahalanobis distance stays below a threshold for all frames, a new map point is generated and marked as 'static'.
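A minimal two-view linear (DLT) triangulation sketch for the intra-camera mapping step on slide 15, using the first and last frames of a track, plus the Mahalanobis distance used in the per-frame consistency check. P1 and P2 are 3x4 projection matrices; this is a generic method, not CoSLAM's exact code, and the slides elide the numeric threshold.

```python
import numpy as np

def triangulate(P1, P2, m1, m2):
    """Triangulate a 3D point from two views by the DLT method."""
    A = np.vstack([
        m1[0] * P1[2] - P1[0],   # x1 * (p3 . X) = p1 . X
        m1[1] * P1[2] - P1[1],
        m2[0] * P2[2] - P2[0],
        m2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize

def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance for the consistency check."""
    d = x - mean
    return d @ np.linalg.solve(cov, d)
```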

  16. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Map maintenance: Map point generation: Inter-camera mapping
    • Applied only to unmapped feature points within the same camera group.
    • Match image features between different cameras by ZNCC (see the sketch after this slide).
    • To avoid ambiguous matches, search only points within a distance threshold of the epipolar line.
    • Accept a match if its ZNCC score exceeds a threshold.
    • Guided matching (disparity vector, winner-take-all strategy).
    • After matching features, triangulate the corresponding points within the same group.
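A minimal sketch of the two tests used on slide 16: the ZNCC (zero-mean normalized cross-correlation) score between patches, and the point-to-epipolar-line distance that restricts the search. Patch extraction and thresholds are illustrative assumptions.

```python
import numpy as np

def zncc(patch_a, patch_b):
    """ZNCC score in [-1, 1]; higher means a better match."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def epipolar_distance(F, m1, m2):
    """Distance of point m2 (image 2) to the epipolar line of m1 (image 1)."""
    l = F @ np.array([m1[0], m1[1], 1.0])        # line coefficients (a, b, c)
    return abs(l @ np.array([m2[0], m2[1], 1.0])) / np.hypot(l[0], l[1])
```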

  17. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Map maintenance: Point registration
    • Try to register newly detected, unmapped feature points to 'active' map points: points that are static and have corresponding feature points within the most recent frames.
    • Pipeline: feature detection -> select the closest active map point -> ZNCC comparison.
    • Project the active map points to the images, taking the position uncertainty into account, and check the ZNCC score between the projected point and the detected feature.
    • Also check the previous positions, using the Mahalanobis distances in all frames, before the feature is registered to the map point.
    • Since the registered point has a large error, re-triangulate its 3D position from the two observations with the largest viewpoint change.

  18. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Point classification
    • Distinguish static and dynamic points by the reprojection error.
    • Point states: 'static', 'dynamic', 'false', and 'uncertain'.
    • Initially, all points are considered static. At every frame, the reprojection errors of all 'static' points are checked (see the sketch after this slide).
    • A static point that fails the intra-camera check (Mahalanobis distance above a threshold) becomes 'uncertain'.
    • If the position re-triangulated from multiple cameras (inter-camera mapping) is consistent (Mahalanobis distance below a threshold), the point becomes 'dynamic'.
    • A dynamic point is projected into the previous frames; if the Mahalanobis distances stay below a threshold, it returns to 'static'.
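A minimal sketch of the classification state machine on slide 18, reusing the MapPoint record sketched after slide 12. The threshold (a chi-square-style value here) and the exact transition tests are illustrative assumptions; the slides elide the numeric thresholds.

```python
def classify(pt, err_intra, err_inter, errs_prev, thr=5.99):
    """Update pt.state from Mahalanobis reprojection errors."""
    if pt.state == "static" and err_intra > thr:
        pt.state = "uncertain"                   # failed the static check
    if pt.state == "uncertain" and err_inter is not None:
        if err_inter < thr:
            pt.state = "dynamic"                 # consistent re-triangulation
        else:
            pt.state = "false"                   # inconsistent in all views
    if pt.state == "dynamic" and errs_prev and max(errs_prev) < thr:
        pt.state = "static"                      # never actually moved
    return pt.state
```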

  19. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method • Point classification

  20. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera grouping: Grouping
    • Count the common map points between two cameras i and j.
    • If the count is > 0, connect cameras i and j by an edge weighted by the number of common points.
    • Extract a maximum-weight spanning tree for each camera group (see the sketch after this slide).
  • Camera grouping: Splitting
    • Edges are removed once the two cameras no longer share common feature points.
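A minimal maximum-weight spanning tree sketch (Kruskal on negatively sorted weights) for the grouping step on slide 20. Edge weights are the numbers of common map points; the three-camera graph is a toy example, not data from the paper.

```python
def max_spanning_tree(n_cams, edges):
    """edges: list of (weight, i, j). Returns the tree edges."""
    parent = list(range(n_cams))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]        # path halving
            x = parent[x]
        return x
    tree = []
    for w, i, j in sorted(edges, reverse=True):  # heaviest edges first
        ri, rj = find(i), find(j)
        if ri != rj:                             # no cycle: keep the edge
            parent[ri] = rj
            tree.append((w, i, j))
    return tree

# 3 cameras: 0-1 share 120 points, 1-2 share 80, 0-2 share 10.
print(max_spanning_tree(3, [(120, 0, 1), (80, 1, 2), (10, 0, 2)]))
# -> [(120, 0, 1), (80, 1, 2)]
```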

  21. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera grouping: Merging
    • Project the map points from one group onto the image planes of the other group. If the number of visible points is large (> 30%) and the area spanned by these points is large (> 70%), merge the groups.
    • Suppose two camera groups separate at the 1st frame and merge at the Fth frame.
    • Adjust all camera poses from frame 2 to F, and adjust the map points generated within these frames.
    • Camera poses in the 1st frame are kept fixed.

  22. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera grouping: Merging, Step 1
    • Estimate the correct relative poses between cameras at frame F.
    • Match SURF features.
    • Search for correspondences within a distance threshold of the epipolar line.
    • Merge 3D map points by averaging their positions.

  23. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera grouping: Merging, Step 2
    • Hard constraints: the relative camera poses at the Fth frame.
    • Select only (P+Q-1) relative poses by the spanning tree (P, Q: the numbers of cameras in the two groups).
    • They can be represented as two linear systems (rotations and translations) at the Fth frame:
      $R_n^F = \Delta R_{mn} R_m^F, \quad t_n^F = \Delta R_{mn} t_m^F + \Delta t_{mn}$
      • $(R_c^t, t_c^t)$ : the pose of camera c at frame t
      • $(\Delta R_{mn}, \Delta t_{mn})$ : the relative pose between cameras m and n at the Fth frame

  24. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera grouping: Merging, Step 2
    • Soft constraints: all the other relative poses from frame 2 to F-1.
    • For any cameras m and n connected by an edge, a similar linear system holds:
      $R_n^t \approx \Delta \bar{R}_{mn}^t R_m^t, \quad t_n^t \approx \Delta \bar{R}_{mn}^t t_m^t + \Delta \bar{t}_{mn}^t$
      • $(\Delta \bar{R}_{mn}^t, \Delta \bar{t}_{mn}^t)$ : the relative pose between m and n before merging

  25. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Camera grouping: Merging, Step 2
    • Solve the two constrained linear least squares problems (with the soft-constraint matrices and vectors augmented by zero elements and weighted by a scale factor).
    • Since the solution does not enforce the orthonormality of the rotation matrices, obtain the closest rotation matrices by SVD (see the sketch after this slide).
    • After updating the camera poses, the 3D positions of the map points are also updated by re-triangulating their corresponding feature points.
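A minimal sketch of the SVD orthonormalization on slide 25: projecting the unconstrained least-squares solution onto the closest rotation matrix in the Frobenius sense. This is the standard orthogonal Procrustes step; the example matrix is synthetic.

```python
import numpy as np

def closest_rotation(A):
    """Return the rotation matrix R minimizing ||R - A||_F."""
    U, _, Vt = np.linalg.svd(A)
    R = U @ Vt
    if np.linalg.det(R) < 0:         # guard against reflections
        U[:, -1] *= -1
        R = U @ Vt
    return R

# Example: a noisy near-rotation is projected back onto SO(3).
A = np.eye(3) + 0.05 * np.random.randn(3, 3)
R = closest_rotation(A)
print(np.allclose(R @ R.T, np.eye(3)), round(np.linalg.det(R), 6))  # True 1.0
```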

  26. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method
  • Refinement
    • Refine both the camera poses and the 3D map points from time to time by bundle adjustment over selected key frames (see the sketch after this slide).
    • Operate on the most recent K key frames.
    • To refine the camera poses of the other frames, apply a method similar to Merging, Step 2.
    • After the pose refinement, re-triangulate the points.
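A minimal windowed bundle-adjustment sketch for slide 26, minimizing reprojection error over a toy two-key-frame window with scipy.optimize.least_squares. The angle-axis parameterization, the intrinsics, and the synthetic data are assumptions for illustration, not the paper's optimizer.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, n_cams, n_pts, K_cam, obs):
    """obs: list of (cam_idx, pt_idx, 2D measurement)."""
    cams = params[:6 * n_cams].reshape(n_cams, 6)     # [angle-axis | t]
    pts = params[6 * n_cams:].reshape(n_pts, 3)
    res = []
    for c, i, m in obs:
        R = Rotation.from_rotvec(cams[c, :3]).as_matrix()
        p = K_cam @ (R @ pts[i] + cams[c, 3:])
        res.append(p[:2] / p[2] - m)                  # reprojection error
    return np.concatenate(res)

# Toy window: 2 key frames, 5 points, perfect observations, perturbed init.
rng = np.random.default_rng(1)
K_cam = np.array([[500., 0, 320], [0, 500., 240], [0, 0, 1]])
pts_true = rng.uniform([-1, -1, 4], [1, 1, 6], (5, 3))
cams_true = np.zeros((2, 6)); cams_true[1, 3] = 0.5   # second camera shifted
obs = []
for c in range(2):
    R = Rotation.from_rotvec(cams_true[c, :3]).as_matrix()
    for i, X in enumerate(pts_true):
        p = K_cam @ (R @ X + cams_true[c, 3:])
        obs.append((c, i, p[:2] / p[2]))
x0 = np.concatenate([cams_true.ravel(),
                     (pts_true + 0.05 * rng.standard_normal((5, 3))).ravel()])
sol = least_squares(residuals, x0, args=(2, 5, K_cam, obs))
print(sol.cost)   # ~0 after refinement
```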

  27. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Camera pose estimation
    • Intra-camera pose estimation
      • Input: tracked feature points
      • Output: camera pose
    • Inter-camera pose estimation
      • S: static points, D: dynamic points
    • [Flowchart: run intra-camera pose estimation; if the number of static points is below a threshold, or the area of the static points is below n% of the image, switch to inter-camera pose estimation.]

  28. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Camera pose estimation: use both static and dynamic points.

  29. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Map maintenance
    • Map points with position uncertainty, updated via the Kalman gain (slide 13).

  30. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Map maintenance
    • Map point generation
      • Intra-camera mapping: tracked points (> N frames) -> triangulation.
      • Inter-camera mapping: unmapped features -> guided matching (ZNCC, near the epipolar line) -> triangulation.

  31. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Map maintenance
    • Point registration through feature tracking
      • Newly detected but unmapped features are associated with active map points: feature detection -> smallest-Mahalanobis-distance point -> ZNCC check.

  32. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Map maintenance
    • Point classification: use the reprojection error and the re-triangulation of tracked features.

  33. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Camera grouping: Grouping and splitting
    • Count the number of common map points.
    • Construct an undirected graph and extract a spanning tree.
    • Only match features between edge-connected cameras.

  34. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Camera grouping: Merging
    • Trigger: the number of visible points and the area spanned by these points exceed thresholds (> n%).
    • Match SURF features -> relative pose (R, T) -> hard constraints + soft constraints.

  35. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Camera grouping: Merging
    • Optimize the camera poses under the hard and soft constraints.

  36. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Proposed method (Summary)
  • Refinement
    • Bundle adjustment over key frames, refining both camera poses and map points.

  37. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Experimental results
  • Drift
    • 96 m trajectories, with 1-camera, 2-camera, 3-camera, and 4-camera setups.
    • Average distance drift errors: 2.53 m, 1.57 m, 1.19 m, and 0.67 m.
    • Average scale drift errors: 0.76 m, 1.10 m, 0.96 m, and 1.00 m.

  38. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Experimental results • Dynamic scenes

  39. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Experimental results

  40. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 Experimental results • Run Time

  41. CoSLAM: Collaborative Visual SLAM in Dynamic Environments, TPAMI 2013 References
  [1] D. Nistér, O. Naroditsky, and J. Bergen. Visual odometry. In IEEE Proc. of CVPR, volume 1, 2004.
  [2] E. Mouragnon, M. Lhuillier, M. Dhome, F. Dekeyser, and P. Sayd. Real time localization and 3D reconstruction. In IEEE Proc. of CVPR, volume 1, pages 363-370, 2006.
  [3] G. Klein and D. Murray. Parallel tracking and mapping for small AR workspaces. In IEEE & ACM Proc. of Int'l Sym. on Mixed and Augmented Reality, pages 225-234, 2007.
  [4] A. Davison, I. Reid, N. Molton, and O. Stasse. MonoSLAM: Real-time single camera SLAM. IEEE Trans. on Pattern Analysis and Machine Intelligence, pages 1052-1067, 2007.
  [5] L. Paz, P. Piniés, J. Tardós, and J. Neira. Large-scale 6-DoF SLAM with stereo-in-hand. IEEE Trans. on Robotics, 24(5):946-957, 2008.
  [6] M. Kaess and F. Dellaert. Visual SLAM with a multi-camera rig. Georgia Institute of Technology, Tech. Rep. GIT-GVU-06-06, 2006.
  [7] C. Wang, C. Thorpe, S. Thrun, M. Hebert, and H. Durrant-Whyte. Simultaneous localization, mapping and moving object tracking. Int'l Journal of Robotics Research, 26(9):889, 2007.
  [8] C. Bibby and I. Reid. Simultaneous localisation and mapping in dynamic environments (SLAMIDE) with reversible data association. In Proc. of Robotics: Science and Systems, 2007.

  42. Thank you!
