
Uncalibrated reconstruction



Presentation Transcript


  1. Uncalibrated reconstruction • Calibration with a rig • Uncalibrated epipolar geometry • Ambiguities in image formation • Stratified reconstruction • Autocalibration with partial scene knowledge UCLA Vision Lab

  2. Uncalibrated Camera Calibrated coordinates are related to pixel coordinates by a linear transformation, the calibration matrix K

  3. Uncalibrated Camera Calibrated camera: • image-plane coordinates • perspective projection. Uncalibrated camera: • pixel coordinates • camera extrinsic parameters • projection matrix
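
The two models can be written side by side. The slide's formulas were not transcribed, so the following uses standard (assumed) notation: x denotes calibrated image coordinates, x' pixel coordinates, K the calibration matrix, \Pi_0 = [I, 0] the canonical projection, and g = (R, T) the camera pose:

  \lambda x  = \Pi_0\, g\, X = [\,R \;\; T\,]\, X            (calibrated)
  \lambda x' = K \Pi_0\, g\, X = [\,KR \;\; KT\,]\, X        (uncalibrated)

  K = \begin{bmatrix} f s_x & f s_\theta & o_x \\ 0 & f s_y & o_y \\ 0 & 0 & 1 \end{bmatrix}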

  4. Taxonomy of Uncalibrated Reconstruction • K is known: back to the calibrated case • K is unknown: • Calibration (with a rig) – estimate K • Uncalibrated reconstruction despite the lack of knowledge of K • Autocalibration (recover K from images) • Use complete scene knowledge (a rig) • Use partial knowledge • Parallel lines, vanishing points, planar motion, constant intrinsics • Ambiguities, stratification (multiple views)

  5. Calibration with a Rig Use the fact that both 3-D and 2-D coordinates of feature points on a pre-fabricated object (e.g., a cube) are known.

  6. Calibration with a Rig • Given the 3-D coordinates of feature points on the known object and their 2-D images • Eliminate the unknown scales • Recover the projection matrix Π = [KR, KT] • Factor the leading 3 x 3 block KR into K and R using QR decomposition • Solve for the translation T from the last column of Π
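
A minimal NumPy/SciPy sketch of this pipeline, assuming N >= 6 known rig points X (N x 3) and their pixel coordinates x (N x 2); the function and variable names, and the use of scipy's RQ factorization, are my own choices, not code from the slides:

  import numpy as np
  from scipy.linalg import rq

  def calibrate_with_rig(X, x):
      """Estimate K, R, T from rig points X (N x 3) and pixels x (N x 2)."""
      N = X.shape[0]
      A = np.zeros((2 * N, 12))
      for i in range(N):
          Xh = np.append(X[i], 1.0)        # homogeneous 3-D point
          u, v = x[i]
          A[2 * i, 0:4] = Xh               # row for u:  P1.Xh - u * P3.Xh = 0
          A[2 * i, 8:12] = -u * Xh
          A[2 * i + 1, 4:8] = Xh           # row for v:  P2.Xh - v * P3.Xh = 0
          A[2 * i + 1, 8:12] = -v * Xh
      # Projection matrix (up to scale) = singular vector of the smallest singular value
      _, _, Vt = np.linalg.svd(A)
      P = Vt[-1].reshape(3, 4)
      # Factor the left 3x3 block KR into upper-triangular K and rotation R
      K, R = rq(P[:, :3])
      S = np.diag(np.sign(np.diag(K)))     # force a positive diagonal on K
      K, R = K @ S, S @ R
      if np.linalg.det(R) < 0:             # the overall sign of P is arbitrary
          R, P = -R, -P
      T = np.linalg.solve(K, P[:, 3])      # K*T is the last column of P
      return K / K[2, 2], R, T

With noisy measurements one would normalize the coordinates first and refine the result nonlinearly; the sketch only illustrates the linear steps listed on the slide.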

  7. Uncalibrated Camera vs. Distorted Space • The inner product in Euclidean space is what lets us compute distances and angles • Calibration can be viewed as a transformation of the spatial coordinates • This induces a transformation on the inner product (the metric of the space) • An uncalibrated camera in Euclidean space and a calibrated camera in the distorted space are equivalent
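
In symbols (a standard formulation, assumed rather than transcribed): with pixel coordinates x' = Kx, the Euclidean inner product of calibrated coordinates becomes a distorted inner product of pixel coordinates,

  \langle u, v \rangle = u^\top v
  \quad\longrightarrow\quad
  \langle u', v' \rangle_S = u'^{\top} S\, v',
  \qquad S = K^{-\top} K^{-1},

so the metric S governs how distances and angles are measured in the uncalibrated (distorted) space.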

  8. Calibrated vs. Uncalibrated Space

  9. Calibrated vs. Uncalibrated Space Distances and angles are modified by the induced metric S = K^{-T} K^{-1}

  10. Motion in the Distorted Space • In the calibrated space, coordinates in the two views are related by the Euclidean motion (R, T) • In the uncalibrated (distorted) space, coordinates are related by a conjugate of the Euclidean group
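
Concretely (standard relation, assumed): if calibrated coordinates move as X_2 = R X_1 + T, the distorted coordinates X' = K X move as

  X'_2 = K R K^{-1} X'_1 + K T,

i.e. with the motion (K R K^{-1}, K T), which is conjugate to the Euclidean motion (R, T).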

  11. Uncalibrated Camera or Distorted Space An uncalibrated camera with calibration matrix K viewing points in Euclidean space and moving with (R, T) is equivalent to a calibrated camera viewing points in a distorted space governed by S and moving with a motion conjugate to (R, T).

  12. Uncalibrated Epipolar Geometry • Epipolar constraint • Fundamental matrix F • Equivalent forms of F
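
Written out (standard two-view relations, assuming the same K in both views): for corresponding pixel coordinates x'_1, x'_2 and with \widehat{T} the skew-symmetric matrix of T,

  x_2'^{\top} F\, x_1' = 0, \qquad F = K^{-\top} \widehat{T} R\, K^{-1} = K^{-\top} E\, K^{-1},

so F is the uncalibrated counterpart of the essential matrix E = \widehat{T} R.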

  13. Properties of the Fundamental Matrix • Epipolar lines • Epipoles • Image correspondences
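
A small NumPy sketch of these properties, assuming a 3x3 fundamental matrix F and points given in homogeneous pixel coordinates (names are illustrative, not from the slides):

  import numpy as np

  def epipoles(F):
      """Epipoles: F e1 = 0 and F^T e2 = 0 (null spaces of F and F^T)."""
      U, S, Vt = np.linalg.svd(F)
      e1 = Vt[-1]          # epipole in image 1
      e2 = U[:, -1]        # epipole in image 2
      return e1 / e1[2], e2 / e2[2]   # normalization assumes finite epipoles

  def epipolar_line(F, x1):
      """Line in image 2 on which the match of x1 (homogeneous) must lie."""
      return F @ x1        # line coefficients (a, b, c): a*u + b*v + c = 0

  def epipolar_residual(F, x1, x2):
      """Epipolar constraint x2^T F x1; ideally zero for a true correspondence."""
      return float(x2 @ F @ x1)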

  14. Properties of the Fundamental Matrix A nonzero matrix F is a fundamental matrix if and only if it has a singular value decomposition (SVD) F = U Σ V^T with Σ = diag{σ1, σ2, 0} for some σ1 ≥ σ2 > 0. There is little structure in the matrix F except that det(F) = 0.

  15. What Does F Tell Us? • F can be inferred from point matches (eight-point algorithm) • Motion, structure and calibration cannot be extracted from a single fundamental matrix (two views) • F allows reconstruction up to a projective transformation (as we will see soon) • F encodes all the geometric information between two views when no additional information is available
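
A sketch of the (normalized) eight-point algorithm mentioned here, assuming N >= 8 correspondences given as N x 2 pixel arrays x1, x2; this follows the standard recipe rather than code from the slides:

  import numpy as np

  def _normalize(pts):
      """Shift to zero mean and scale to average distance sqrt(2)."""
      mean = pts.mean(axis=0)
      scale = np.sqrt(2.0) / np.mean(np.linalg.norm(pts - mean, axis=1))
      T = np.array([[scale, 0, -scale * mean[0]],
                    [0, scale, -scale * mean[1]],
                    [0, 0, 1.0]])
      ph = np.column_stack([pts, np.ones(len(pts))]) @ T.T
      return ph, T

  def eight_point(x1, x2):
      """Estimate F from N >= 8 correspondences (x2^T F x1 = 0)."""
      p1, T1 = _normalize(x1)
      p2, T2 = _normalize(x2)
      # Each correspondence gives one row of the linear system A f = 0
      A = np.column_stack([p2[:, 0:1] * p1, p2[:, 1:2] * p1, p1])
      _, _, Vt = np.linalg.svd(A)
      F = Vt[-1].reshape(3, 3)
      # Enforce rank 2 by zeroing the smallest singular value
      U, S, Vt = np.linalg.svd(F)
      F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
      # Undo the normalization
      F = T2.T @ F @ T1
      return F / F[2, 2]   # fix the scale (assumes F[2, 2] != 0)

The coordinate normalization is the usual conditioning step; without it the linear system is badly scaled for pixel-sized coordinates.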

  16. Decomposing the Fundamental Matrix • Decomposition of the fundamental matrix F into a skew-symmetric matrix and a nonsingular matrix • The decomposition of F is not unique • Unknown parameters – ambiguity • Corresponding projection matrices
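
One standard way to write the decomposition and the resulting projection matrices (this particular parameterization is assumed, not transcribed): with T' the epipole in the second view (F^T T' = 0),

  F \simeq \widehat{T'}\, M, \qquad M = \widehat{T'}^{\top} F + T' v^{\top}, \quad v \in \mathbb{R}^3,

  \Pi_{1p} = [\, I \;\; 0 \,], \qquad \Pi_{2p} = [\, M \;\; v_4\, T' \,],

where the free parameters (v, v_4) are exactly the 4-parameter ambiguity referred to on slide 23.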

  17. Ambiguities in Image Formation • Potential ambiguities • Ambiguity in K (it can be recovered uniquely – Cholesky or QR) • Structure of the motion parameters • Just an arbitrary choice of reference coordinate frame

  18. Ambiguities in Image Formation Structure of motion parameters

  19. Ambiguities in Image Formation Structure of the projection matrix • For any invertible 4 x 4 matrix H, the same images are produced by the projection matrices ΠH^{-1} applied to the transformed structure HX • In the uncalibrated case we cannot distinguish a camera imaging the world from a camera imaging a distorted world • In general, H is of the following form • In order to preserve the choice of the first reference frame we can restrict some degrees of freedom of H

  20. Structure of the Projective Ambiguity • For the i-th frame • 1st frame taken as the reference • Choosing the projective reference frame, the remaining ambiguity is a projective transformation • This ambiguity can be further decomposed into the product of an affine transformation and a purely projective one (geometric stratification, next slide)

  21. Geometric Stratification (cont)

  22. Projective Reconstruction • From image correspondences, extract F, from which extract the epipole T' and the matrix M • Canonical decomposition • Projection matrices Π1 = [I, 0], Π2 = [M, T']

  23. Projective Reconstruction • Given the projection matrices, recover the projective structure • This is a linear problem and can be solved using least-squares techniques. • Given 2 images and no prior information, the scene can be recovered up to a 4-parameter family of solutions. This is the best one can do without knowing the calibration!
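
A minimal linear-triangulation sketch of this step, assuming two 3x4 projection matrices P1, P2 (e.g. the canonical pair above) and matched points in homogeneous pixel coordinates; the names are illustrative, not from the slides:

  import numpy as np

  def triangulate(P1, P2, x1, x2):
      """Linear least-squares triangulation of one correspondence.

      Returns the homogeneous 4-vector X minimizing the algebraic error;
      with uncalibrated canonical cameras, X is only defined up to the
      projective ambiguity discussed on the previous slides.
      """
      # Each view contributes two rows: x ~ P X  =>  cross(x, P X) = 0
      A = np.vstack([
          x1[0] * P1[2] - x1[2] * P1[0],
          x1[1] * P1[2] - x1[2] * P1[1],
          x2[0] * P2[2] - x2[2] * P2[0],
          x2[1] * P2[2] - x2[2] * P2[1],
      ])
      _, _, Vt = np.linalg.svd(A)
      X = Vt[-1]
      return X / X[3]   # assumes the point is not on the plane at infinity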

  24. Affine Upgrade • Upgrade projective structure to an affine structure • Exploit partial scene knowledge • Vanishing points, no skew, known principal point • Special motions • Pure rotation, pure translation, planar motion, rectilinear motion • Constant camera parameters (multi-view)

  25. Affine Upgrade Using Vanishing Points The upgrade transformation maps points on the plane at infinity, identified via vanishing points, to points with affine coordinates
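
A standard way to write this upgrade (notation assumed): if the plane at infinity in the projective frame is identified, e.g. from three vanishing points, as \pi_\infty \simeq (p^{\top}, 1)^{\top}, then

  H = \begin{bmatrix} I & 0 \\ p^{\top} & 1 \end{bmatrix}, \qquad X_a \simeq H X_p, \qquad \Pi_a \simeq \Pi_p H^{-1},

which sends \pi_\infty to its canonical position (0, 0, 0, 1)^{\top}, so the upgraded structure X_a differs from the Euclidean one only by an affine transformation.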

  26. Vanishing Point Estimation from Parallelism

  27. Euclidean Upgrade • Exploit special motions (e.g. pure rotation) • If a Euclidean reconstruction is the goal, perform it directly (no stratification) • Multi-view reconstruction (later lectures) • Autocalibration (Kruppa's equations)

  28. Direct Autocalibration Methods The fundamental matrix satisfies the Kruppa equations. Since the fundamental matrix is known only up to scale, the equations involve an unknown scale factor as well. Under special motions, Kruppa's equations become linear. The solution of Kruppa's equations is sensitive to noise.
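
In one common matrix form (assumed here, with the same K in both views and e' the epipole in the second view), the Kruppa equations read

  F\, \omega^{*} F^{\top} \;\simeq\; \widehat{e'}\, \omega^{*} \widehat{e'}^{\,\top}, \qquad \omega^{*} = K K^{\top},

where \simeq denotes equality up to the unknown scale of F; each fundamental matrix yields two independent constraints on \omega^{*}, and K is then recovered from \omega^{*} by Cholesky factorization.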

  29. Direct Stratification from Multiple Views From the recovered projective projection matrices we obtain the absolute quadric constraints. Partial knowledge of K (e.g. zero skew, square pixels) renders the above constraints linear and easier to solve. The projection matrices can be recovered with the multiple-view rank method to be introduced later.
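
The constraints have the standard form (assumed, not transcribed): with \Pi_{ip} the recovered projective projection matrices and Q^{*}_{\infty} the absolute dual quadric,

  \Pi_{ip}\, Q^{*}_{\infty}\, \Pi_{ip}^{\top} \;\simeq\; K_i K_i^{\top},

so assumptions on K_i such as zero skew, square pixels, or a known principal point translate into linear constraints on the ten independent entries of the symmetric 4 x 4 matrix Q^{*}_{\infty}.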

  30. Direct Methods - Summary

  31. Summary of Calibration Methods
