This lecture delves into omnidirectional stereo vision, covering imaging principles, epipolar geometry, depth error characterization, and representation techniques. Explore issues such as sensor designs, calibration methods, and correspondence algorithms. Learn about omnistereo imaging with binocular and N-Ocular viewpoints, circular projection methods, and panoramic camera technologies like the Panoramic Annular Lens. Discover the benefits and challenges of various omnistereo configurations and techniques for achieving accurate 3D reconstructions.
CSc80000 Section 2, Spring 2005 • Omnidirectional Stereo Vision, Lecture 8 • Zhigang Zhu, Computer Science Department, The City College, CUNY • zhu@cs.ccny.cuny.edu • http://www-cs.engr.ccny.cuny.edu/~zhu/
Acknowledgements • Collaborators at UMass • Edward Riseman • Allen Hanson • Deepak Karuppiah • Howard Schultz • … • Supported by • NSF Environmental Monitoring • DARPA/ITO Mobile Autonomous Robot S/W • China NSF Scene Modeling • Paper (with references): http://www-cs.engr.ccny.cuny.edu/~zhu/zOmniStereo01.pdf @Z. Zhu CCNY
The Class of Omnistereo (omnidirectional stereo vision) • Omnidirectional Vision: how to look • Viewer-centered: outward looking • Object-centered: inward looking • Omnistereo Vision: how many viewpoints • Binocular/N-Ocular: a few (2 or more), fixed • Circular Projection: many, inside a small area • Dynamic Omnistereo: a few, but configurable • Object-centered: many, in a large space
Important Issues of Omnistereo • What this lecture is about • Omnistereo imaging principle for sensor designs • Epipolar geometry for correspondence • Depth error characterization in both direction and distance • Other important issues not in this talk • Sensor designs • Calibration methods • Correspondence algorithms
Omni Imaging & Representation • Omnidirectional (panoramic) Imaging • Catadioptric camera (single effective viewpoint) • ParaVision by RemoteReality, PAL, and many more • Image Mosaicing • Rotating camera, translating camera, arbitrary motion • Omnidirectional Representation • Cylindrical representation • Spherical representation
Panoramic Camera: the Panoramic Annular Lens (PAL), by Pal Greguss
Panoramic Mosaics from a Rotating Camera (ICMCS99), covering 0 to 360 degrees
Cylindrical Panorama • 1st frame • connecting frame • conic mosaic • head-tail stitching • panorama
Cylindrical Projection: the image projection (f, v) of a 3D point P (X, Y, Z), where D is the horizontal distance from P to the vertical axis of the cylindrical image, f is the azimuth, and v is the vertical coordinate on the cylinder
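The cylindrical projection above can be sketched in a few lines. This is a minimal sketch, assuming the usual conventions (Y along the cylinder's vertical axis, azimuth measured from the Z axis); the exact axis conventions are not fixed by the slide.

```python
import math

def cylindrical_project(X, Y, Z, f=1.0):
    """Project 3D point P(X, Y, Z) onto a cylindrical image of focal length f.

    Returns (phi, v): phi is the azimuth (horizontal image coordinate, in
    radians) and v the vertical image coordinate.  D is the horizontal
    distance from the cylinder's vertical axis to the point.
    """
    D = math.hypot(X, Z)          # horizontal distance to the vertical axis
    phi = math.atan2(X, Z)        # azimuth around the vertical axis
    v = f * Y / D                 # perspective scaling along the axis
    return phi, v

# A point straight ahead (X = 0) at distance 4 and height 2 maps to
# phi = 0 and v = f * 2 / 4.
phi, v = cylindrical_project(0.0, 2.0, 4.0, f=1.0)
```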
Binocular / N-Ocular Omnistereo: a few fixed viewpoints • Three configurations • Horizontally-aligned binocular (H-Bi) omnistereo • Vertically-aligned binocular (V-Bi) omnistereo • N-ocular omnistereo, trinocular case • Issues • Distance error across the full 360 degrees of directions • Distance error versus distance • Epipolar geometry
H-Bi Omnistereo: depth error. From an image pair { (f1, v1), (f2, v2) } to a 3D point P (X, Y, Z) by triangulation, with a fixed baseline B and horizontal disparity (vergence angle) • Depth accuracy is non-isotropic; the vergence angle is maximal only when f2 = 90 degrees • Does not make full use of the 360-degree view • Depth error proportional to Depth² / Baseline
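The triangulation step can be sketched as a plain two-ray intersection. This is an illustrative sketch, not the lecture's implementation: cameras are placed at O1 = (0, 0) and O2 = (B, 0), with bearings measured counterclockwise from the baseline; only the horizontal plane is shown.

```python
import math

def triangulate(phi1, phi2, B):
    """Intersect two bearing rays from cameras at O1=(0,0) and O2=(B,0).

    phi1, phi2 are ray directions measured counterclockwise from the
    baseline (+X axis).  Returns the intersection point (X, Y).
    """
    denom = math.sin(phi2 - phi1)   # ~0 near the singularity (parallel rays)
    t = B * math.sin(phi2) / denom  # range from O1 along its ray
    return t * math.cos(phi1), t * math.sin(phi1)

# Recover P = (1, 2) from two cameras 3 units apart.
phi1 = math.atan2(2, 1)          # bearing of P from O1
phi2 = math.atan2(2, 1 - 3)      # bearing of P from O2
X, Y = triangulate(phi1, phi2, B=3.0)
```

Differentiating this relation with respect to the measured angles is what yields the Depth²/Baseline error law quoted on the slide.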
H-Bi Omnistereo: singularity case. The vergence angle is zero when f1 = f2 = 0 or 180 degrees; the distance-ratio method applies instead • Visible epipoles: the image of each camera's center can be visible in the other camera • Vertical disparity along vertical epipolar lines
H-Bi Omnistereo: epipolar geometry. Given a point (f2, v2), search for (f1, v1) • The epipolar curves are sine curves in the non-singularity cases • The epipolar lines run along the v direction in the singularity cases (triangulation singularities: depth-blind spots)
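One way to see the shape of these epipolar curves is to back-project a camera-2 pixel at several candidate depths and reproject into camera 1's cylindrical image. This is a sketch under assumed conventions (shared vertical axis, camera 2 displaced by B along X; azimuth measured from Z), not the lecture's derivation.

```python
import math

def cyl_project(X, Y, Z, f=1.0):
    """Cylindrical projection: azimuth and vertical image coordinate."""
    D = math.hypot(X, Z)
    return math.atan2(X, Z), f * Y / D

def epipolar_curve(phi2, v2, B, f=1.0, depths=(2, 4, 8, 16, 32)):
    """Trace the epipolar curve in camera 1 for pixel (phi2, v2) of camera 2.

    Camera 2 sits at (B, 0, 0) relative to camera 1.  Each candidate
    horizontal depth D along camera 2's ray gives one curve point."""
    curve = []
    for D in depths:
        X = B + D * math.sin(phi2)   # scene point on camera 2's viewing ray
        Y = D * v2 / f
        Z = D * math.cos(phi2)
        curve.append(cyl_project(X, Y, Z, f))
    return curve

# As depth grows, the camera-1 azimuth approaches the camera-2 azimuth.
pts = epipolar_curve(phi2=0.5, v2=0.2, B=1.0)
```

Sampling phi2 over the full circle and plotting phi1 against v1 reproduces the sinusoid-like curves the slide mentions.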
V-Bi Omnistereo. From an image pair { (f1, v1), (f2, v2) } to a 3D point P (X, Y, Z) • Vertical baseline Bv • Vertical disparity v • Same as perspective stereo • Depth accuracy is isotropic in all directions • Depth error proportional to the square of distance • Epipolar lines are simply vertical lines • But NO stereo viewing without 3D reconstruction
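With a vertical baseline the cylindrical projections of a point share the same azimuth, so the depth recovery reduces to the familiar perspective-stereo formula. A minimal sketch, assuming v = f·Y/D for each camera:

```python
def vbi_depth(v1, v2, Bv, f=1.0):
    """Distance from vertical disparity in vertically-aligned omnistereo.

    Both cylindrical images see the point at the same azimuth, so matching
    is a 1D search along a vertical line, and D = f * Bv / (v1 - v2),
    exactly as in conventional perspective stereo.
    """
    return f * Bv / (v1 - v2)

# Point at height Y = 1, distance D = 5; lower camera at the origin,
# upper camera Bv = 0.5 above it:
#   v1 (lower) = 1/5 = 0.2,  v2 (upper) = (1 - 0.5)/5 = 0.1
D = vbi_depth(0.2, 0.1, Bv=0.5)
```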
N-Ocular Omnistereo. Why more viewpoints? Every point of the 360-degree FOV seen from the center of the sensor triangle can be covered by at least two pairs of rays from different cameras with good triangulation • Depth accuracy is still not isotropic, but is more uniform across directions • One stereo match can be verified using the second pair • However, there is no gain in epipolar geometry
Circular Projection Omnistereo: many viewpoints on a viewing circle • Omnivergent Stereo (Shum et al., ICCV99) • every point in the scene is imaged from two cameras that verge on that point with the maximum vergence angle; and • stereo recovery yields isotropic depth resolution in all directions • Solution: circular projection / concentric mosaics • A single off-center rotating camera (Peleg CVPR99, Shum ICCV99) • Full optical design (Peleg PAMI 2000) • My catadioptric omnistereo rig
Circular Projection: principle. Many viewpoints on a viewing circle: a virtual camera moving along the viewing circle captures two sets of rays on planes tangent to the circle, the left-eye rays in the clockwise direction and the right-eye rays in the counterclockwise direction (Case 1: an omni sensor; Case 2: two 1D sensors)
Circular Projection: geometry. Maximum vergence angles for the left and right rays, with a "baseline" and a "disparity" • P: 3D scene point • r: radius of the viewing circle • f1, f2: viewing directions of the left and right rays • f: vergence angle (angular disparity) • B: baseline length (< 2r) • D: distance OP
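The vergence and baseline follow from the tangent geometry: both eye rays are tangent to the viewing circle, so for a point at distance D from the center the half vergence angle theta satisfies sin(theta) = r/D. The relations below (phi = 2·asin(r/D), B = 2·r·cos(phi/2)) are derived from that tangency assumption, not copied from the slide.

```python
import math

def circular_stereo(D, r):
    """Left/right-ray geometry for circular-projection omnistereo.

    A scene point at distance D from the center O of a viewing circle of
    radius r is seen by two rays tangent to the circle, giving vergence
    phi = 2*asin(r/D) and baseline B = 2*r*cos(phi/2) < 2r (the chord
    between the two tangent viewpoints).
    """
    theta = math.asin(r / D)      # half the vergence (angular disparity)
    phi = 2 * theta
    B = 2 * r * math.cos(theta)
    return phi, B

# As D grows, the baseline approaches its 2r limit and phi shrinks;
# inverting, D = r / sin(phi / 2) recovers distance from angular disparity.
phi, B = circular_stereo(D=10.0, r=1.0)
```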
Circular Projection: properties • Depth estimation is isotropic • Same depth error in all directions • Makes full use of the 360-degree view • Depth error proportional to depth² / baseline • Same as H-Bi omnistereo • Limited baseline (B < 2r) • Horizontal epipolar lines • Superior to H-Bi omnistereo, since a single viewing circle provides both the left and right omni-images • Extension to concentric mosaics with viewing circles of different radii?
Circular Projection: implementation • Requirement: two sets of rays 180 degrees apart • Methods • 1: Two rectilinear cameras • 2: An omnidirectional camera • Question: can we do it with a single rectilinear camera? (Cameras: single? multiple? standard? special?)
Circular Projection: implementation (1), the single-camera approach • Rotate a rectilinear camera off its optical center • Take two image columns with angular separation 2b << 180 degrees • The viewing circle is smaller than the circular path of the optical center • Analogy: with your arm stretched out, the camera's viewfinder may be too far from your eyes
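Why the viewing circle is smaller than the optical center's path can be seen from the column geometry. This is a geometric sketch under an assumed outward-facing camera: a column at horizontal angle beta off the optical axis uses rays whose perpendicular distance from the rotation axis is R·sin(beta), which is the effective viewing-circle radius.

```python
import math

def viewing_circle_radius(R, beta):
    """Effective viewing-circle radius for the rotating single-camera setup.

    The optical center moves on a circle of radius R around the rotation
    axis.  An image column at angle beta off the (radially outward)
    optical axis collects rays at perpendicular distance R*sin(beta)
    from the axis, so r < R whenever beta < 90 degrees.
    """
    return R * math.sin(beta)

# With the camera 0.5 m from the axis and columns 10 degrees off-axis,
# the viewing circle shrinks to under a tenth of a meter.
r = viewing_circle_radius(R=0.5, beta=math.radians(10))
```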
Circular Projection: implementation (2), the catadioptric approach • Rotate a pair of mirrors with a camera around its optical center • Look outward at the scene through two slit windows • Larger viewing circle, since the mirrors enlarge the viewing angle • The camera's viewfinder sits right in front of your eyes
Dynamic Omnistereo: a few viewpoints moving freely (OmniVision 2000) • Requirements • Optimal configuration for any given point in the world • Change the vergence angle and the baseline freely • Issues • Dynamic calibration • View planning
Dynamic Omnistereo: depth error • Question 1: vergence angle • Maximum vergence angle (f2 = 90 degrees) • Question 2: baseline • The larger the better? • The error in estimating the baseline itself
Dynamic Omnistereo: mutual calibration • Sensors as calibration targets • Make use of the visible epipoles • Known target geometry • Cylindrical body of the moving platform
Mutual calibration and human tracking: an example. Pano 1 shows the image of the 2nd robot and the images of a person; Pano 2 shows the image of the 1st robot. Results: B = 180 cm, D1 = 359 cm, D2 = 208 cm
Dynamic Omnistereo: optimal view • Baseline error proportional to B² • Larger baseline, even larger baseline error • The overall distance error is minimized with the "best" baseline and the maximum vergence angle • Distance error with the optimal configuration is proportional to D^1.5
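One error model consistent with the slide's claims combines a triangulation term falling as D²/B with a baseline-estimation term growing with B; the model below (and its constants c1, c2) is an illustrative assumption, not the lecture's exact derivation, but it reproduces both the existence of a best baseline and the D^1.5 law.

```python
import math

def total_error(D, B, c1=1.0, c2=1.0):
    """Assumed error model: triangulation error c1*D**2/B, plus the effect
    of a baseline-estimation error (proportional to B**2) on the distance
    estimate, entering as roughly c2*D*B."""
    return c1 * D**2 / B + c2 * D * B

def best_baseline(D, c1=1.0, c2=1.0):
    # d/dB (c1*D^2/B + c2*D*B) = 0  ->  B* = sqrt(c1/c2) * sqrt(D)
    return math.sqrt(c1 / c2) * math.sqrt(D)

# At B = B*, the two terms balance and the total error is
# 2*sqrt(c1*c2) * D**1.5: the D^1.5 law quoted on the slide.
D = 16.0
B_opt = best_baseline(D)
E_opt = total_error(D, B_opt)
```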
Dynamic Omnistereo: optimal-view application • Track a single target with two robots • One stationary, one moving (rotation and shift) • An omnistereo head with reconfigurable vergence and baseline
Dynamic Omnistereo: error simulation • Student project in the spring of 2003 • Java applet: http://www-cs.engr.ccny.cuny.edu/~zhu/omnistereo/simulation/
Comparisons • Four cases • Fixed-viewpoint omnistereo • One fixed, one circular projection • Both circular projection • Dynamic omnistereo • Java interactive simulations: http://www-cs.engr.ccny.cuny.edu/~zhu/omnistereo/errormaps/
Object-Centered Omnistereo • Looking inward rather than looking outward • Modeling objects rather than scenes • Many viewpoints over a large space • Examples: modeling a building (inward rotation) and modeling the Earth (translation and outward rotation)
Omni modeling of an object • Inward-looking rotation • Many viewpoints over a large circle • Circular projection: a virtual viewing circle within the object • Can rotate a small object (e.g., a human) instead of moving the camera
Omni modeling of the Earth • Modeling the Earth • Airplane flying along great circles • Taking the leading and trailing edges of each frame • Data amount • 10^17 pixels at 10 cm²/pixel • 10^15 pixels at 1 m²/pixel • (10^12 = 1 Tera = 1000 Giga) • Modeling a small area • Rotation can be approximated as translation • Parallel-perspective stereo mosaics • Virtual fly-through
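The pixel counts above follow from Earth's surface area; a quick back-of-the-envelope check:

```python
import math

# Earth's surface area: 4*pi*R^2 with R ~ 6.371e6 m  ->  ~5.1e14 m^2
area_m2 = 4 * math.pi * 6.371e6 ** 2

# Convert to cm^2 (x 1e4) and divide by the per-pixel footprint.
pixels_10cm2 = area_m2 * 1e4 / 10   # at 10 cm^2/pixel: ~5e17, order 10^17
pixels_1m2 = area_m2 / 1            # at 1 m^2/pixel:  ~5e14, order 10^15
```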
Parallel-perspective stereo mosaics • Ideal model: sensor motion is a 1D translation with a nadir view • Two "virtual" pushbroom cameras build the "left" and "right" mosaics • Real applications • Airborne cameras (UMass, Sarnoff, ...) • Ground vehicles (Tsinghua, Osaka)
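The two virtual pushbroom cameras can be sketched as slit sampling of the video: under the ideal model, each mosaic stacks one fixed image column from every frame. This is an idealized sketch (one pixel-column of translation per frame, arbitrary slit offset); the real system must handle general motion and registration.

```python
import numpy as np

def pushbroom_mosaics(frames, slit_offset):
    """Build parallel-perspective 'left'/'right' mosaics from a video taken
    by a camera translating one pixel-column per frame (idealized model).

    frames: array of shape (n_frames, height, width).  The 'left' mosaic
    stacks the column slit_offset pixels ahead of the image center, the
    'right' mosaic the column slit_offset pixels behind it; the fixed
    column separation 2*slit_offset acts as the pair's fixed disparity.
    """
    n, h, w = frames.shape
    c = w // 2
    left = np.stack([f[:, c + slit_offset] for f in frames], axis=1)
    right = np.stack([f[:, c - slit_offset] for f in frames], axis=1)
    return left, right   # each of shape (height, n_frames)

frames = np.random.rand(20, 8, 11)
left, right = pushbroom_mosaics(frames, slit_offset=3)
```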
Re-organizing the images: a stereo pair with large FOVs and adaptive baselines
Recovering Depth from Mosaics (two views from different perspectives; GPS/IMU pose and height H from a laser profiler support recovering P(X, Y, Z)) • Parallel-perspective stereo mosaics • Depth accuracy independent of depth (in theory) • Fixed disparity, adaptive baseline displacement
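The fixed-disparity, adaptive-baseline idea can be made concrete with a simple triangulation. The relation below is an assumed model, not copied from the slide: with slits a fixed image distance 2*dy apart and focal length F, the two viewing rays are tilted by a fixed angle, so depth is linear in the measured camera displacement By between the two views of a point.

```python
def mosaic_depth(By, dy, F=1.0):
    """Depth from parallel-perspective stereo mosaics (assumed model).

    The two mosaics view each point through slits a fixed image distance
    2*dy apart (the fixed disparity); what varies with depth is the
    baseline By, the camera displacement between the two views of the
    point.  Triangulating two rays tilted by +/- atan(dy/F) gives
    Z = F * By / (2 * dy).  Because depth is linear in By, a fixed
    localization error in By yields a depth error independent of depth
    (in theory), matching the slide's claim.
    """
    return F * By / (2 * dy)

Z = mosaic_depth(By=4.0, dy=0.1)   # Z = 20 with F = 1
```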
Stereo mosaics of the Amazon rain forest (left mosaic, right mosaic, and depth map) • A 166-frame telephoto video sequence -> 7056 x 944 mosaics
Stereo viewing • Red: right view; blue/green: left view
Accuracy of 3D from stereo mosaics (ICCV01, VideoReg01) • Adaptive baselines and fixed disparity: uniform depth resolution in theory, and accuracy proportional to depth in practice • The 3D recovery accuracy of parallel-perspective stereo mosaics is comparable to that of perspective stereo with an optimal baseline
Conclusions