
Camera Calibration



Presentation Transcript


  1. CS 636 Computer Vision Camera Calibration Nathan Jacobs

  2. overview • assignment 2 out • review from last time • review camera parameters • intrinsic • extrinsic • several methods for camera calibration • constraints vs. solution methods vs. error functions

  3. constraints vs. solution methods vs. error functions • constraints: • corresponding points • corresponding lines • solution methods: • linear • non-linear • error functions: • quadratic • robust

  4. review from last time… • Corresponding points: how do we find them? • Estimating motion from corresponding points • translational • affine • homography

  5. Our goal: Recovery of 3D structure • Recovery of structure from one image is inherently ambiguous: many 3D points X along the same viewing ray project to the same image point x

  6. Our goal: Recovery of 3D structure • Recovery of structure from one image is inherently ambiguous

  7. Our goal: Recovery of 3D structure • Recovery of structure from one image is inherently ambiguous

  8. Our goal: Recovery of 3D structure • We will need multi-view geometry

  9. Recall: Pinhole camera model • Principal axis: line from the camera center perpendicular to the image plane • Normalized (camera) coordinate system: camera center is at the origin and the principal axis is the z-axis

  10. Recall: Pinhole camera model

  11. Principal point • Principal point (p): point where principal axis intersects the image plane (origin of normalized coordinate system) • Normalized coordinate system: origin is at the principal point • Image coordinate system: origin is in the corner • How to go from normalized coordinate system to image coordinate system?

  12. Principal point offset • With the principal point at p = (px, py), the pinhole projection becomes (X, Y, Z) → (f X/Z + px, f Y/Z + py)

  13. Principal point offset • In matrix form x = K [I | 0] Xcam, with calibration matrix K = [[f, 0, px], [0, f, py], [0, 0, 1]]

  14. Pixel coordinates • mx pixels per meter in the horizontal direction, my pixels per meter in the vertical direction • Pixel size: 1/mx × 1/my meters • In pixel units, K = [[f·mx, 0, x0], [0, f·my, y0], [0, 0, 1]]
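
A minimal sketch in Python (NumPy assumed; all numeric values are illustrative, not from the slides) of how the calibration matrix K combines focal length, pixel densities, and the principal point, and how it maps a camera-frame point to pixel coordinates:

import numpy as np

f = 0.05                 # focal length in meters (illustrative)
mx, my = 20000, 20000    # pixels per meter, horizontal and vertical (illustrative)
px, py = 320, 240        # principal point in pixels (illustrative)

# Calibration matrix with alpha_x = f*mx, alpha_y = f*my (zero skew assumed).
K = np.array([[f * mx, 0.0,    px],
              [0.0,    f * my, py],
              [0.0,    0.0,    1.0]])

# Project a point given in the camera frame onto the image.
X_cam = np.array([0.2, -0.1, 2.0])
x_hom = K @ X_cam                 # homogeneous image point
print(x_hom[:2] / x_hom[2])       # divide by the depth to get (u, v) in pixels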

  15. Camera rotation and translation • In general, the camera coordinate frame will be related to the world coordinate frame by a rotation and a translation: Xcam = R (X - C), where X are the coordinates of the point in the world frame (non-homogeneous), C the coordinates of the camera center in the world frame, and Xcam the coordinates of the point in the camera frame

  16. Camera rotation and translation • In non-homogeneous coordinates: x = K R (X - C), i.e. P = K [R | -RC] • Note: the homogeneous camera center C is the null space of the camera projection matrix (PC = 0)
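
A sketch of assembling the full projection matrix P = K [R | -RC] and checking the null-space property stated above (NumPy assumed; the intrinsics, rotation, and camera center are illustrative):

import numpy as np

K = np.array([[1000.0, 0.0, 320.0],       # illustrative intrinsics
              [0.0, 1000.0, 240.0],
              [0.0,    0.0,   1.0]])
theta = np.deg2rad(10.0)                  # rotate the camera 10 degrees about the y-axis
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
C_tilde = np.array([1.0, 0.5, -2.0])      # camera center in world coordinates

# P = K [R | -R C]  (3x4 camera projection matrix)
P = K @ np.hstack([R, (-R @ C_tilde).reshape(3, 1)])

# The homogeneous camera center is the null space of P: P C = 0.
C = np.append(C_tilde, 1.0)
print(P @ C)                              # approximately [0, 0, 0]

# Project a world point: x = P X, then divide by the last coordinate.
X = np.array([0.0, 0.0, 3.0, 1.0])
x = P @ X
print(x[:2] / x[2])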

  17. Camera parameters • Intrinsic parameters • Principal point coordinates • Focal length • Pixel magnification factors • Skew (non-rectangular pixels) • Radial distortion

  18. Camera parameters • Intrinsic parameters • Principal point coordinates • Focal length • Pixel magnification factors • Skew (non-rectangular pixels) • Radial distortion • Extrinsic parameters • Rotation and translation relative to world coordinate system

  19. Calibrating the Camera Method 1: Use an object (calibration grid) with known geometry • Correspond image points to 3D points • Get a least-squares solution (or a non-linear solution)

  20. Linear method • Solve using linear least squares Ax=0 form • P has 11 degrees of freedom (12 parameters, but scale is arbitrary) • One 2D/3D correspondence gives us two linearly independent equations • Homogeneous least squares • 6 correspondences needed for a minimal solution
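
A sketch of that homogeneous (DLT) solution, assuming NumPy and that pts3d and pts2d are matched arrays of world and image points: each correspondence contributes two rows to A, and the right singular vector for the smallest singular value of A gives P up to scale.

import numpy as np

def calibrate_dlt(pts3d, pts2d):
    """Estimate the 3x4 projection matrix P from >= 6 2D-3D correspondences."""
    A = []
    for (X, Y, Z), (x, y) in zip(pts3d, pts2d):
        Xh = [X, Y, Z, 1.0]
        # Two linearly independent equations per correspondence.
        A.append([0, 0, 0, 0] + [-c for c in Xh] + [y * c for c in Xh])
        A.append(Xh + [0, 0, 0, 0] + [-x * c for c in Xh])
    A = np.asarray(A)
    # Homogeneous least squares: the p minimizing ||A p|| subject to ||p|| = 1
    # is the right singular vector of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)

In practice, normalizing the 2D and 3D points before building A (Hartley-style normalization) improves its conditioning.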

  21. Calibration with linear method • Advantages: easy to formulate and solve • Disadvantages • Doesn’t tell you the camera parameters directly • Doesn’t model radial distortion • Can’t impose constraints, such as a known focal length • Doesn’t minimize the right error function (see HZ p. 181) • Requires an expensive 3D rig to obtain known 3D locations • Non-linear methods are preferred • Define the error as the difference between projected points and measured points • Minimize the error using Newton’s method or another non-linear optimizer
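
A sketch of the non-linear alternative, minimizing reprojection error with scipy.optimize.least_squares over the 12 entries of P (P0 is assumed to come from the linear solution above; a robust loss such as Huber can replace the quadratic error):

import numpy as np
from scipy.optimize import least_squares

def reprojection_residuals(p, pts3d, pts2d):
    """Difference between measured image points and points projected by P = p.reshape(3, 4)."""
    P = p.reshape(3, 4)
    Xh = np.hstack([pts3d, np.ones((len(pts3d), 1))])   # homogeneous world points
    proj = (P @ Xh.T).T
    proj = proj[:, :2] / proj[:, 2:3]                    # perspective divide
    return (proj - pts2d).ravel()

# P0: initial 3x4 estimate (e.g. from DLT); pts3d, pts2d: Nx3 and Nx2 arrays (assumed given).
result = least_squares(reprojection_residuals, P0.ravel(),
                       args=(pts3d, pts2d), loss='huber', f_scale=1.0)
P_refined = result.x.reshape(3, 4)

Parameterizing over explicit intrinsics and extrinsics instead of the raw entries of P is what makes it possible to impose constraints such as a known focal length.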

  22. Recovering camera calibration • Once we’ve recovered the numerical form of the camera matrix, we still have to figure out the intrinsic and extrinsic parameters • This is a matrix decomposition problem…

  23. Given many examples of: • World points (X, Y, Z), and • Their image points (x, y) • Solve for P • Then P = K [R | t]: the left 3×3 block KR mixes the rotation and intrinsics together, and the last column carries the translation

  24. Finding K from P • Take the RQ decomposition of the left 3×3 block M = KR: the upper-triangular factor gives the intrinsic parameters and the orthogonal factor gives the rotation (equivalently, a QR decomposition of M⁻¹; see MathWorld)
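
A sketch of that decomposition step, assuming SciPy: RQ-decompose the left 3x3 block into an upper-triangular K and a rotation R, then recover the camera center from the last column.

import numpy as np
from scipy.linalg import rq

def decompose_projection(P):
    """Split P = K [R | -R C] into intrinsics K, rotation R, and camera center C."""
    M = P[:, :3]                       # left 3x3 block, M = K R
    K, R = rq(M)                       # K upper triangular, R orthogonal
    # Fix signs so the diagonal of K is positive (the flips are absorbed into R).
    S = np.diag(np.sign(np.diag(K)))
    K, R = K @ S, S @ R
    K = K / K[2, 2]                    # normalize so K[2, 2] = 1
    # The camera center satisfies P C = 0, so C = -M^{-1} p4.
    C = -np.linalg.inv(M) @ P[:, 3]
    return K, R, C

Since P is only defined up to scale, R can come out with determinant -1; negating P and repeating the decomposition fixes this.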

  25. Method 2: Grids on a Plane • A common method is to use many planes • This requires solving for R, T for each plane • Counting unknowns: for each plane (image), we have 3 unknowns for translation and 3 for rotation, plus 5 extra unknowns defining the intrinsic calibration • Basic idea: for a collection of planes, calculate the homography between the image and the real-world plane • http://www.vision.caltech.edu/bouguetj/calib_doc/
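
A sketch of the same planar-grid pipeline in Python using OpenCV (the Bouguet toolbox linked above is the MATLAB counterpart); the list of images, the checkerboard pattern_size, and the square size are assumed inputs:

import numpy as np
import cv2

pattern_size = (9, 6)      # inner corners of the checkerboard (assumed)
square = 0.025             # square size in meters (assumed)

# 3D grid corners on the plane Z = 0, shared by every view.
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for img in images:         # 'images': list of photos of the grid from different poses (assumed)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Jointly solves for the shared intrinsics K and distortion, plus one (R, t) per view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)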

  26. Corners..

  27. Camera-centric View

  28. World-centric View

  29. Calibrating the Camera Method 3: Use vanishing points • Find vanishing points corresponding to orthogonal directions [figure: vertical vanishing point (at infinity), vanishing line, and two horizontal vanishing points]

  30. Calibration by orthogonal vanishing points • The idea • Set the directions of the vanishing points, e.g. X1 = [1, 0, 0] • Use orthogonality as a constraint • Each VP provides one column of R • Model K with only f, u0, v0 • Why is there no translation? • Initial estimate of the focal length by assuming u0, v0 are zero (Szeliski 6.51) • Special case: two vanishing points projecting to y = 0 in the image
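
A sketch of the case where only f is unknown: if the principal point p is fixed (e.g. at the image center) and v1, v2 are finite vanishing points of orthogonal scene directions, the orthogonality constraint reduces to (v1 - p)·(v2 - p) = -f². All coordinates below are illustrative.

import numpy as np

def focal_from_orthogonal_vps(v1, v2, p):
    """Estimate f from two finite vanishing points of orthogonal directions.

    With K = [[f, 0, u0], [0, f, v0], [0, 0, 1]], orthogonality of the
    back-projected directions K^-1 v1 and K^-1 v2 gives (v1 - p).(v2 - p) = -f^2.
    """
    d = np.dot(np.asarray(v1, float) - p, np.asarray(v2, float) - p)
    if d >= 0:
        raise ValueError("vanishing points must lie on opposite sides of the principal point")
    return np.sqrt(-d)

p = np.array([320.0, 240.0])    # assume the principal point is at the image center
f = focal_from_orthogonal_vps([900.0, 250.0], [-150.0, 230.0], p)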

  31. Special Case: Pure Rotation • R, T describe how the camera is moved relative to the world coordinate system • Suppose we take two pictures of the world from a camera at the same location, but the camera has rotated between the two pictures

  32. First image, points in 3D, measured in camera coordinate system. • Second image, camera has rotated.

  33. …mostly on chalkboard • On image 1: p = KP • On image 2: p' = KRP • Therefore p = K (KR)⁻¹ p' = K R⁻¹ K⁻¹ p'

  34. p =KR-1K-1p’, so what? • Given a few examples of corresponding points p,p’, we can solve for a mapping KR-1K-1 of all points. • What kind of transformation is KR-1K-1? • Can extract K from Szelsiki Sec 6.3.4…

  35. summary • Four types of constraints for calibration: • 2D-3D correspondences • correspondences between many planes • vanishing points • pure rotational motion • Often two methods: • linear • non-linear

  36. for next time • motion • read Szeliski 8.1, 8.2, 8.4
