Branch and Bound in Rotation Space (ICCV 2007)
Direction (unit) vectors from cameras (blue) to points (black) are given: find the positions of the cameras and points.
Essential Matrix Estimation • Encodes the relative displacement (R, t) between two cameras. • Rotation • Translation • Needs at least 5 point correspondences.
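The slide's claim that E encodes the rotation and translation can be made concrete: the standard construction is E = [t]×R, and matched direction vectors satisfy the epipolar constraint x2ᵀ E x1 = 0. A minimal numpy sketch (function names are mine, not from the talk):

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential_matrix(R, t):
    """E = [t]_x R encodes the relative displacement (R, t) between two cameras."""
    return skew(t) @ R

# Epipolar (coplanarity) constraint: if X2 = R @ X1 + t relates the point's
# coordinates in the two camera frames, then for the unit direction vectors
# x1 = X1/|X1| and x2 = X2/|X2| we have x2 @ E @ x1 == 0.
```

Five correspondences suffice because E has five degrees of freedom (three for R, two for the direction of t, since scale is unobservable).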
Rotation Space: given the best current error, we can eliminate all rotations within the ball of radius 0.3 about the trial rotation.
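The pruning step the slide describes can be sketched as a one-line test. This assumes the residuals are angular errors, so perturbing the rotation by at most the ball's radius changes the error by at most that radius (the 1-Lipschitz bound used in branch-and-bound rotation search); the function name and signature are mine:

```python
def can_prune(trial_error, radius, best_error):
    """Branch-and-bound pruning test (sketch).

    If the error at the trial rotation, minus the largest possible decrease
    over a ball of the given angular radius (at most `radius`, by the
    1-Lipschitz bound on angular residuals), still exceeds the best error
    found so far, every rotation in the ball can be eliminated.
    """
    return trial_error - radius > best_error
```

With a trial error of 1.0 and radius 0.3, the ball is eliminated whenever the best error so far is below 0.7.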
Isometry of Rotations and Quaternions. The angle between two unit quaternions, arccos |q1 · q2|, is half the angle of the relative rotation between the corresponding rotations. All rotations within a delta-neighbourhood of a reference rotation form a circle on the quaternion sphere.
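This half-angle relation is easy to verify numerically: the dot product of two unit quaternions is the cosine of half the relative rotation angle. A small sketch (function names are mine):

```python
import numpy as np

def quat_from_angle_axis(theta, axis):
    """Unit quaternion (w, x, y, z) for a rotation of theta about a unit axis."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate([[np.cos(theta / 2.0)], np.sin(theta / 2.0) * axis])

def rotation_angle_between(q1, q2):
    """Angle of the relative rotation: twice the angle between the quaternions.

    The absolute value handles the q / -q double cover of rotation space.
    """
    d = abs(float(np.dot(q1, q2)))   # cos(angle between the quaternions)
    return 2.0 * np.arccos(min(d, 1.0))
```

Because quaternion distance and rotation distance differ only by this factor of two, a delta-neighbourhood of rotations maps to a (delta/2)-neighbourhood on the quaternion sphere, which is what makes the rotation-space search geometry tractable.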
Angle-axis Representation of Rotations: rotations are represented by a ball of radius pi in 3-dimensional space. Azimuthal Equidistant Projection: flatten out the meridians (longitude lines).
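The mapping from a unit quaternion to a point of this pi-ball (the rotation vector theta * axis) can be sketched as follows; the function name and the tolerance are mine:

```python
import numpy as np

def angle_axis_from_quat(q):
    """Map a unit quaternion (w, x, y, z) to the angle-axis vector theta * axis,
    a point in the solid ball of radius pi in R^3."""
    w, v = q[0], np.asarray(q[1:], float)
    s = np.linalg.norm(v)            # |v| = sin(theta / 2)
    if s < 1e-12:
        return np.zeros(3)           # identity rotation maps to the origin
    theta = 2.0 * np.arctan2(s, w)   # rotation angle in [0, 2*pi)
    if theta > np.pi:                # keep the representative inside the pi-ball
        theta -= 2.0 * np.pi
    return theta * (v / s)
```

Antipodal points on the ball's boundary (rotations by pi about opposite axes) represent the same rotation, which is why the flattening step in the azimuthal equidistant projection is needed when visualising the search domain.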
Performance: numbers of cubes left at each iteration (log-10 scale); remaining volume at each iteration (log-10 scale, in cubic radians).
Linear Programming, not SOCP. Point correspondence in two views; coplanarity constraint with uncertainty.
Multi-Camera Systems (Non-overlapping) – L inf Method. The translation direction lies in a polyhedron (green) determined by the point correspondences.
Each point correspondence gives two LP constraints on the direction t (epipolar direction).
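Where these two LP constraints come from: with the rotation fixed, the coplanarity constraint x2 · (t × R x1) = 0 is linear in t, since it rewrites as t · n = 0 with n = (R x1) × x2. Relaxing it by an uncertainty bound eps (a generic slack on the residual, standing in for the talk's uncertainty model) yields the pair of inequalities t · n ≤ eps and −t · n ≤ eps. A sketch, with the function name mine:

```python
import numpy as np

def epipolar_lp_constraints(x1, x2, R, eps):
    """Two LP constraint rows on the translation direction t from one
    correspondence (unit direction vectors x1, x2) with rotation R fixed.

    The coplanarity constraint x2 . (t x R @ x1) = 0 is t . n = 0 with
    n = (R @ x1) x x2; relaxed by eps it becomes t . n <= eps and
    -t . n <= eps.  Returns the rows (A, b) of A @ t <= b.
    """
    n = np.cross(R @ x1, x2)
    return np.vstack([n, -n]), np.array([eps, eps])
```

Stacking these rows over all correspondences gives the linear program on the direction t that the slide refers to, in the standard A_ub / b_ub inequality form accepted by off-the-shelf LP solvers.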
Essential matrix calculated from 3 points (above) or 4 points (below): possible rotations.
360-degree camera. Timing (in milliseconds) for E-matrix computation – 360-degree camera. Timing examples: 29 correspondences: 2.9 seconds; 794 correspondences: 75 seconds; 6572 correspondences: 3 min 30 seconds.
Further Application – 1D camera (e.g. a robot moving in a plane). Joint work with Kalle Astrom, Fredrik Kahl, Carl Olsson and Olof Enquist. Complete structure-and-motion problem for "planar motion"; optimal solution in the L-infinity norm; same idea of searching in rotation space.
Hockey Rink Data: reconstructed points and path; original and dual problems.
The method also works for rigidly placed multi-camera systems. • Can be considered as a single "generalized" camera. • One rotation, one translation to be estimated.
Robust 6DOF motion estimation from non-overlapping images, multi-camera systems. 4 images from the right, 4 images from the left. (Images: courtesy of UNC-Chapel Hill)
Generalized Cameras (Non-overlapping) Ladybug2 camera (The locally-central case) 5 cameras (horizontal) 1 camera (top)
Generalized Cameras (Non-overlapping) Experiment setup
Generalized Cameras (Non-overlapping). A figure-eight (infinity-like) path which the Ladybug2 camera follows (108 frames in total).
Robust 6DOF motion estimation from Non-overlapping images, Multi-camera systems Critical configuration
Generalized Cameras (Non-overlapping) – Linear Method Estimated path (Linear Method) vs. Ground truth
Generalized Cameras (Non-overlapping) – Linear Method Demo video : 16 sec (Click to play)
Multi-Camera Systems (Non-overlapping) – L inf Method E+SOCP: Motion of multi-camera rigs using SOCP method BB+LP : Motion of multi-camera rigs using L inf method
Multi-Camera Systems (Non-overlapping) – L inf Method Estimated path (L inf Method) vs. Ground truth
Multi-Camera Systems (Non-overlapping) – L inf Method Demo video : 16 sec (Click to play)