2nd Joint Advanced Student School Calibration Benjamin Fingerle, Christian Wachinger
A Definition of Calibration • “Calibration is the process of instantiating parameter values for mathematical models which map the physical environment to internal representations, so that the computer’s internal model matches the physical world.” • Mihran Tuceryan Benjamin Fingerle, Christian Wachinger
Augmented Reality Requires Highly Precise Pose Estimation • In an AR environment, reality is modelled in a virtual world by arranging digital counterparts of real objects, positioned and oriented based on data gathered by tracking technology • This virtual world is then enriched with context-based information and projected back to the user in the physical world • Hence any inaccuracy in estimating the pose of a real-world object, as well as an imprecise projection from the virtual to the real world, causes a loss of realism and thus of usability Benjamin Fingerle, Christian Wachinger
Additional Requirements for Calibration in AR Environments Calibration procedures for different objects have to be • As autonomous as possible • To make calibration a convenient process • To keep the number of user-related errors down • Efficient • Some applications even require real-time capabilities • Versatile • To make calibration procedures reusable in different AR setups Benjamin Fingerle, Christian Wachinger
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
A Motivating Scenario • A mobile user - Joe - is wearing an Optical See-Through Head-Mounted Display (OST-HMD) • Joe stands in front of an apparently empty table • But, looking through his display, Joe sees several 3D objects placed on the table • Using his hands, Joe can move the objects on the table Benjamin Fingerle, Christian Wachinger
A Motivating Scenario II [Figure: three renderings of the virtual objects on the table] 1: Wrongly positioned and oriented 2: Correctly positioned but wrongly oriented 3: Correctly posed Benjamin Fingerle, Christian Wachinger
Different Objects Have to Be Calibrated In the example, the following parameters have to be estimated • Pose of the table relative to the room • Pose of Joe's head relative to the room • Pose of Joe's hands relative to the room • Parameters of Joe's OST-HMD This is done using • 3DOF magnetic-pointer-based object calibration for the table • 6DOF magnetic tracking - a marker rigidly fixed to Joe's HMD • The SPAAM method for calibrating the OST-HMD • Stereo-vision-based tracking of Joe's hands To use the above, additional objects have to be calibrated • A magnetic tracker transmitter Benjamin Fingerle, Christian Wachinger
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
3DOF - Pointer Calibration • pw = pm + Rm pt (pointer tip pw in world coordinates, tracked marker position pm and rotation Rm, tip offset pt in marker coordinates) • Determination of the unknown vectors pw and pt • 6 unknown parameters, as pw and pt are 3D vectors • Several measurements have to be taken • Least-squares method Benjamin Fingerle, Christian Wachinger
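The least-squares step can be made concrete with a short sketch. Assuming N tracker readings (Ri, pi) of the marker are taken while the pointer tip rests on one fixed point, stacking pw = pi + Ri pt for all i gives a linear system in the six unknowns. The function name and data layout below are illustrative, not from the original slides.

```python
# A minimal sketch of 3DOF pointer (pivot) calibration, assuming N tracker
# readings (R_i, p_i) of the marker while the pointer tip rests on one fixed
# point.  From p_w = p_i + R_i p_t we get [R_i | -I] [p_t; p_w] = -p_i,
# which is solved for the 6 unknowns by linear least squares.
import numpy as np

def calibrate_pointer(rotations, positions):
    """rotations: list of 3x3 marker rotations R_i; positions: list of 3-vectors p_i."""
    A = np.vstack([np.hstack([np.asarray(R), -np.eye(3)]) for R in rotations])  # (3N, 6)
    b = -np.asarray(positions).reshape(-1)                                      # (3N,)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    p_tip_in_marker, p_tip_in_world = x[:3], x[3:]
    return p_tip_in_marker, p_tip_in_world
```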
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
3DOF - Pointer Based Object Calibration • Calculation of the transformation from the world coordinate system to the object coordinate system • Point coordinates are known both in the object (local) coordinate system, pl, and in the world coordinate system, pw • pw = R pl + T, with rotation R and translation T => 12 unknown parameters => Several measurements => Solving the optimization problem: minimize Σi || pwi - (R pli + T) ||² over R and T Benjamin Fingerle, Christian Wachinger
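As an illustration of this optimization problem, the sketch below estimates R and T from corresponding point lists. The slides pose it as a least-squares fit over 12 parameters; this sketch uses the common closed-form SVD (Kabsch/Horn) solution, which minimises the same error while keeping R a proper rotation; it is a substitution, not necessarily the method used in the original work.

```python
# A sketch of pointer-based object calibration: estimate R, T with
# p_w ≈ R p_l + T from corresponding point lists (Kabsch/Horn SVD solution).
import numpy as np

def estimate_rigid_transform(p_local, p_world):
    """p_local, p_world: (N, 3) arrays of corresponding points."""
    cl, cw = p_local.mean(axis=0), p_world.mean(axis=0)
    H = (p_local - cl).T @ (p_world - cw)                            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])      # avoid reflections
    R = Vt.T @ D @ U.T
    T = cw - R @ cl
    return R, T
```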
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
Stereo Vision Camera Calibration Motivation: • Joe's hands' poses are to be tracked by a static stereo-vision camera • This is done by triangulation • Analysing the two 2D images for known landmarks attached to Joe's hands • Inferring, for each landmark and each image, the 3D ray on which the landmark lies • Intersecting the two rays of each landmark to get its 3D position • Inferring the orientation by analysing the landmark positions (see the triangulation sketch below) Benjamin Fingerle, Christian Wachinger
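A hedged sketch of the intersection step: the two rays rarely meet exactly, so a common choice (assumed here, not stated in the slides) is the midpoint of their closest approach.

```python
# Triangulation sketch: ray i is o_i + t_i * d_i; return the midpoint of the
# closest points on the two rays as the landmark's 3D position estimate.
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """o1, o2: ray origins (camera centres); d1, d2: unit direction vectors."""
    # Solve for the ray parameters t1, t2 of the closest points on both rays.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```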
Intrinsic and Extrinsic Parameters Have to Be Calibrated • To be able to apply triangulation to camera images, several camera-specific parameters have to be known (intrinsic parameters) • So far the hands' poses are known relative to the camera coordinate system (CCS), but they are needed in the world coordinate system (WCS) • Thus the static camera's pose relative to the WCS has to be determined as well (extrinsic parameters) Benjamin Fingerle, Christian Wachinger
The Basic Camera Model (Pinhole Camera) • Intrinsic parameters that have to be determined • Focal length f [1 DOF] Benjamin Fingerle, Christian Wachinger
Spatial Relation of CCS to WCS Has to Be Known • Joe should be able to move the virtual objects displayed on the table by hand movements • The virtual objects' coordinates are known in the WCS • Joe's hands' poses so far are known relative to the CCS • To obtain the spatial relation between his hands and the virtual objects, the spatial relation between the CCS and the WCS has to be known Benjamin Fingerle, Christian Wachinger
Camera's Pose Relative to WCS Forms the Extrinsic Parameters • Extrinsic parameters that have to be estimated: • Rotation R [3DOF] • Translation T [3DOF] Benjamin Fingerle, Christian Wachinger
The Relation of 2D Image Points to their 3D Counterparts • Pc = R Pw + T • xu = f (xc/zc) • yu = f (yc/zc) Benjamin Fingerle, Christian Wachinger
Using CCDs Introduces Additional Intrinsic Parameters The use of CCD chips introduces additional intrinsic parameters that have to be calibrated • The image origin is shifted relative to the optical centre • Due to CCD-typical line-sampling imprecision, a horizontal scale factor has to be introduced Benjamin Fingerle, Christian Wachinger
CCD-Related Intrinsic Parameters • xm = sx (xu/∆x)(#xMem/#xCCD) + tx • ym = yu/∆y + ty • Additional intrinsic parameters • Shift S = (tx, ty) of the image relative to the optical centre [2 DOF] • Horizontal scale factor sx [1 DOF] Benjamin Fingerle, Christian Wachinger
Lens Distortion Has to Be Considered • Efficient algorithms for determining the intrinsic parameters f, tx, ty and sx together with the extrinsic parameters R and T exist • But optical tracking based on such calibrated cameras proved to be imprecise • This is due to lens distortion, from which common off-the-shelf cameras suffer • Lens distortion can be split into tangential and radial lens distortion, whereby the latter proved to be of special importance to optical tracking and thus to camera calibration Benjamin Fingerle, Christian Wachinger
Radial Lens Distortion Requires Two More Parameters • Modelled with an infinite series, truncated here after two terms • xu = xd (1 + k1 r² + k2 r⁴) • yu = yd (1 + k1 r² + k2 r⁴) • r = (xd² + yd²)^(1/2) • Additional intrinsic parameters: • Distortion coefficient k1 [1DOF] • Distortion coefficient k2 [1DOF] Benjamin Fingerle, Christian Wachinger
From WCS to Memory
Pw = (xw, yw, zw) | point in WCS
• xc = r1 xw + r2 yw + r3 zw + Tx , yc = r4 xw + r5 yw + r6 zw + Ty , zc = r7 xw + r8 yw + r9 zw + Tz
Pc = (xc, yc, zc) | point in CCS | R [3DOF] | T [3DOF]
• xu = f xc/zc , yu = f yc/zc
Pu = (xu, yu) | undistorted image | f [1DOF]
• xu = xd (1 + k1 r² + k2 r⁴) , yu = yd (1 + k1 r² + k2 r⁴) , r = (xd² + yd²)^(1/2)
Pd = (xd, yd) | distorted image | k1 [1DOF] | k2 [1DOF]
• xd = (∆x #xCCD / (sx #xMem)) (xm - tx) , yd = ∆y (ym - ty)
Pm = (xm, ym) | distorted memory image | S [2DOF] | sx [1DOF]
Benjamin Fingerle, Christian Wachinger
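The chain above can be written down directly. The sketch below (variable names are illustrative) maps a memory pixel back to distorted and then undistorted sensor coordinates, and a world point forward to undistorted coordinates; comparing the two is the basis of the calibration residual.

```python
# A hedged sketch of the camera model chain above.
import numpy as np

def pixel_to_undistorted(xm, ym, tx, ty, dx, dy, n_ccd, n_mem, sx, k1, k2):
    """Memory pixel (xm, ym) -> distorted (xd, yd) -> undistorted (xu, yu)."""
    xd = dx * n_ccd * (xm - tx) / (sx * n_mem)
    yd = dy * (ym - ty)
    r2 = xd * xd + yd * yd
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return xd * factor, yd * factor

def world_to_undistorted(Pw, R, T, f):
    """World point -> camera coordinates -> undistorted image coordinates."""
    xc, yc, zc = R @ Pw + T
    return f * xc / zc, f * yc / zc
```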
The "Tsai Calibration Method" Satisfies All Requirements Tsai's method • Takes a set of known non-coplanar calibration points in the WCS • Estimates both extrinsic and intrinsic parameters of a statically mounted off-the-shelf CCD camera • And works • Autonomously • Efficiently • With provable accuracy Benjamin Fingerle, Christian Wachinger
Tsai's Method Works in Two Stages • Prerequisites: • #xMem, #xCCD, ∆x, ∆y from the device specification • Image centre assumed: S = (tx, ty) = half the frame-buffer resolution • Measure non-coplanar calibration points Pwi = (xwi, ywi, zwi) in the WCS • Take an image and find the calibration points Pmi = (xmi, ymi) • Stage 1: Compute • Rotation matrix R • x- and y-components Tx, Ty of the translation T • Horizontal scale factor sx • Stage 2: Compute • Effective focal length f • Radial lens distortion coefficients k1 and k2 • z-component Tz of the translation T Benjamin Fingerle, Christian Wachinger
Stage 1 … Based on a parallelism observation: • Radial distortion does not influence the direction from the origin to the image point • The vector from (0, 0, f)ᵀ to (xd, yd, f)ᵀ is parallel to the vector from (0, 0, zc)ᵀ to (xc, yc, zc)ᵀ Thus the following holds • (xd, yd)ᵀ = c (xc, yc)ᵀ • xd = c xc , yd = c yc => xd yc = c xc yc = yd xc Now substitute xc and yc by their counterparts xw and yw transformed with R and translated by T • xd = (yd xw r1 sx + yd yw r2 sx + yd zw r3 sx + yd Tx sx - xd xw r4 - xd yw r5 - xd zw r6) / Ty Benjamin Fingerle, Christian Wachinger
Parallelism Constraint Benjamin Fingerle, Christian Wachinger
… Stage 1 • For each calibration memory point Pmi compute the interim distorted image point Pdi' while setting sx to 1 • For each pair Pdi' and Pwi formulate the former linear equation xdi = … • There are 7 free terms: (r1 sx/Ty), (r2 sx/Ty), (r3 sx/Ty), (sx Tx/Ty), (r4/Ty), (r5/Ty), (r6/Ty) • With more than 7 calibration points this system of linear equations is overdetermined and thus can be solved (with the least-squares method, see the sketch below) • From these 7 terms R, Tx, Ty and sx can be efficiently extracted by applying geometric observations Benjamin Fingerle, Christian Wachinger
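A hedged sketch of the stage-1 solve: one equation per calibration point, assembled from the radial alignment constraint above and solved by linear least squares. Extracting R, Tx, Ty and sx from the 7 terms is omitted here; the function name and data layout are illustrative.

```python
# Tsai stage 1 (sketch): solve the overdetermined linear system for the 7 terms.
import numpy as np

def tsai_stage1_terms(p_world, p_dist):
    """p_world: (N, 3) WCS points; p_dist: (N, 2) interim distorted image points (sx = 1)."""
    xw, yw, zw = p_world.T
    xd, yd = p_dist.T
    A = np.column_stack([yd * xw, yd * yw, yd * zw, yd,
                         -xd * xw, -xd * yw, -xd * zw])   # (N, 7)
    b = xd
    terms, *_ = np.linalg.lstsq(A, b, rcond=None)
    # terms = [r1*sx/Ty, r2*sx/Ty, r3*sx/Ty, sx*Tx/Ty, r4/Ty, r5/Ty, r6/Ty]
    return terms
```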
Stage 2 • Step 1: Compute an approximation of f and Tz by ignoring lens distortion • Step 2: Use the approximation of f and Tz to compute the exact solution of f, Tz, k1 and k2 Benjamin Fingerle, Christian Wachinger
… Stage 2, Step 1 … • Ignoring lens distortion leads from f (yc/zc) = yu = yd (1 + k1 r² + k2 r⁴) to f (yc/zc) = yu = yd • For each calibration point i formulate the linear equation f (yci/zci) = ydi • Substituting yc, zc and yd leads to f (r4 xwi + r5 ywi + r6 zwi + Ty) = ∆y (ymi - ty) (r7 xwi + r8 ywi + r9 zwi + Tz), which is linear in the unknowns f and Tz Benjamin Fingerle, Christian Wachinger
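With R, Ty known from stage 1, each calibration point therefore contributes one equation that is linear in f and Tz. A hedged numpy sketch of that solve (variable names assumed, not from the slides):

```python
# Tsai stage 2, step 1 (sketch): linear least-squares estimate of f and Tz.
# Per point: f*(r4 xw + r5 yw + r6 zw + Ty) - Tz*dy*(ym - ty) = dy*(ym - ty)*(r7 xw + r8 yw + r9 zw)
import numpy as np

def tsai_stage2_step1(p_world, ym, R, Ty, dy, ty):
    """p_world: (N, 3) WCS points; ym: (N,) memory row coordinates; R: 3x3 rotation (numpy array)."""
    a = p_world @ R[1] + Ty            # r4 xw + r5 yw + r6 zw + Ty
    c = p_world @ R[2]                 # r7 xw + r8 yw + r9 zw
    yd = dy * (ym - ty)
    A = np.column_stack([a, -yd])      # unknowns [f, Tz]
    b = yd * c
    (f, Tz), *_ = np.linalg.lstsq(A, b, rcond=None)
    return f, Tz
```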
… Stage 2, Step 2 • We get an overdetermined and thus solvable system of linear equations with the two free variables f and Tz • These approximate values are taken as the initial guess for an algorithm solving the system of nonlinear equations that computes the exact f and Tz as well as k1 and k2 • This initial guess is good enough to solve the equation system efficiently even though it is not linear Benjamin Fingerle, Christian Wachinger
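The slides do not specify the nonlinear solver; the sketch below simply hands the residual of the y-projection equation to a generic nonlinear least-squares routine (scipy), starting from the linear estimates. xd, yd are assumed to be the measured distorted sensor coordinates of the calibration points.

```python
# Tsai stage 2, step 2 (sketch): refine f, Tz, k1, k2 by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def tsai_stage2_step2(p_world, xd, yd, R, Ty, f0, Tz0):
    a = p_world @ R[1] + Ty                  # r4 xw + r5 yw + r6 zw + Ty
    c = p_world @ R[2]                       # r7 xw + r8 yw + r9 zw
    r2 = xd ** 2 + yd ** 2

    def residual(params):
        f, Tz, k1, k2 = params
        # y-projection equation: f * yc / zc = yd * (1 + k1 r^2 + k2 r^4)
        return f * a / (c + Tz) - yd * (1.0 + k1 * r2 + k2 * r2 ** 2)

    sol = least_squares(residual, x0=[f0, Tz0, 0.0, 0.0])
    f, Tz, k1, k2 = sol.x
    return f, Tz, k1, k2
```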
Conclusion: Tsai's Method Solves the Camera Calibration Problem INPUT: • Mono-view image of non-coplanar calibration points with known coordinates in the WCS • Device-specific data (CCD resolution, image centre in pixels, number of pixels scanned in a line) OUTPUT: • Extrinsic parameters • Camera pose relative to the WCS [6DOF] • Intrinsic parameters • Effective focal length [1DOF] • Horizontal scale factor [1DOF] • Radial lens distortion coefficients [2DOF] Benjamin Fingerle, Christian Wachinger
Different Variations of Tsai's Method Exist Depending on the circumstances, different variations of Tsai's method are appropriate: • Single view with coplanar calibration points • Single view with non-coplanar calibration points (presented) • Multiple views Benjamin Fingerle, Christian Wachinger
Tsai's Method Also Works for Stereo-Vision Cameras Remark • Camera tracking requires stereo-vision images • For stereo vision, two cameras are rigidly aligned in parallel Benjamin Fingerle, Christian Wachinger
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
Virtual Camera Calibration (Optical-See-Through) Setup: Benjamin Fingerle, Christian Wachinger
Virtual Camera Calibration (Optical-See-Through) Calculation of a projective matrix describing the mapping from 3D points to 2D points in the image plane • No explicit calculation of intrinsic camera parameters • No consideration of distortion Using a 6DOF tracker to get the pose of the camera => Head motion can be modelled => Simplified algorithm for virtual camera calibration Benjamin Fingerle, Christian Wachinger
Virtual Camera Calibration (Optical-See-Through) Benjamin Fingerle, Christian Wachinger
Virtual Camera Calibration (Optical-See-Through) Calculation of matrix A: Using the relationship A = GF • F: 4 x 4 transformation matrix • G: 3 x 4 projection matrix • F is determined by the tracker • G has to be calculated Benjamin Fingerle, Christian Wachinger
Virtual Camera Calibration (Optical-See-Through) Calculation of matrix G: • Choosing a single point with known coordinates pw • Calculating the coordinates in the marker coordinate system pm; pm = F pw • Getting the point coordinate in the image plane pi by aligning the cross-hair with the real point • pi = G pm • 12 unknown parameters (each point correspondence yields 2 equations) • At least 6 "calibration" points • A single "real" point is enough, since each new head pose yields a new pm via F Benjamin Fingerle, Christian Wachinger
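A hedged sketch of solving pi ~ G pm for the 3 x 4 matrix G: each correspondence gives two homogeneous linear equations, and the stacked system is solved via SVD in the spirit of a DLT. The function name and data layout are illustrative, not from the original slides.

```python
# Estimate the 3x4 projection matrix G from point correspondences (DLT-style sketch).
import numpy as np

def estimate_projection(p_marker, p_image):
    """p_marker: (N, 3) points in marker coordinates; p_image: (N, 2) image alignments; N >= 6."""
    rows = []
    for (X, Y, Z), (u, v) in zip(p_marker, p_image):
        pm = [X, Y, Z, 1.0]
        rows.append(np.concatenate([pm, [0, 0, 0, 0], [-u * c for c in pm]]))
        rows.append(np.concatenate([[0, 0, 0, 0], pm, [-v * c for c in pm]]))
    A = np.array(rows)                        # (2N, 12)
    _, _, Vt = np.linalg.svd(A)
    G = Vt[-1].reshape(3, 4)                  # null vector of A gives G up to scale
    return G / np.linalg.norm(G)
```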
Virtual Camera Calibration (Optical-See-Through) • Similar algorithm for stereoscopic displays • Instead of using a cross-hair a 3D object is used Benjamin Fingerle, Christian Wachinger
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
Image Calibration Calculation of distortion parameters for the scan converter and the frame grabber • M pv = L pd Benjamin Fingerle, Christian Wachinger
Image Calibration • Modelling of errors through linear transformations without rotation • Calculation of the transformation parameters by comparing the coordinates of certain points Benjamin Fingerle, Christian Wachinger
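A hedged sketch of that comparison: model the scan-converter/frame-grabber error as a per-axis scale and translation (no rotation) and fit it from corresponding points by least squares. This is an illustration of the idea, not the exact parameterisation used in the original work.

```python
# Fit an axis-aligned linear transformation (scale + offset per axis, no rotation).
import numpy as np

def fit_axis_aligned_transform(p_src, p_dst):
    """p_src, p_dst: (N, 2) corresponding 2D points; returns scale (2,) and offset (2,)."""
    scale = np.empty(2)
    offset = np.empty(2)
    for axis in range(2):
        A = np.column_stack([p_src[:, axis], np.ones(len(p_src))])
        (scale[axis], offset[axis]), *_ = np.linalg.lstsq(A, p_dst[:, axis], rcond=None)
    return scale, offset
```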
Agenda • Scenario • Pointer calibration • Object calibration • Camera calibration • Virtual Camera calibration • Image calibration • Auto-Calibration Benjamin Fingerle, Christian Wachinger
AR Applications Create the Desire for Auto-Calibration • Tracking assumes correct calibration of ceiling- or wall-mounted components • Specialised methods for getting their parameters are necessary Goal: • Calibration of AR devices without user interaction • Calibration during regular use Benjamin Fingerle, Christian Wachinger
AR Applications Create the Desire for Auto-Calibration Regular method: • Estimating the location of mobile units based on sighting data of fixed units with known locations • Sightings may contain more information than necessary for location determination => surplus data • Constraining the locations of mobile units => additional surplus data Using the surplus data for self-surveying! Benjamin Fingerle, Christian Wachinger
AR Applications Create the Desire for Auto-Calibration Three different methods for gathering surplus data: • People • Floor • Frame Processing the self-survey data: • Simulated annealing • Finding a best guess • Scoring the solution against the gathered data • Inverting the location algorithm Benjamin Fingerle, Christian Wachinger
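A generic, hedged sketch of the simulated-annealing idea (not the specific algorithm of the cited system): perturb the guessed fixed-unit positions and accept or reject candidates according to how well they score against the gathered sighting data. The scoring function is assumed to be supplied by the location algorithm (lower is better).

```python
# Generic simulated annealing over a flat list of parameters (e.g. unit coordinates).
import math
import random

def anneal(initial_guess, score, steps=10000, t_start=1.0, t_end=0.001):
    current, current_score = list(initial_guess), score(initial_guess)
    best, best_score = list(current), current_score
    for i in range(steps):
        t = t_start * (t_end / t_start) ** (i / steps)          # geometric cooling schedule
        candidate = [x + random.gauss(0.0, t) for x in current]  # small random perturbation
        cand_score = score(candidate)
        # Accept improvements always, worse candidates with Boltzmann probability.
        if cand_score < current_score or random.random() < math.exp((current_score - cand_score) / t):
            current, current_score = candidate, cand_score
            if cand_score < best_score:
                best, best_score = list(candidate), cand_score
    return best, best_score
```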
Auto-Calibration of Cameras Drawbacks of classic camera calibration • A calibration grid is not always available • Camera parameters change due to • Mechanical or thermal variations • Focusing and zooming Auto-calibration • Is highly flexible • Requires point matches from image sequences Benjamin Fingerle, Christian Wachinger