
Multi video camera calibration and synchronization


Presentation Transcript


  1. Multi video camera calibration and synchronization

  2. Motivation • Multi-camera applications are becoming common. Examples: stereo, surveillance… • Using multiple cameras we can overcome problems such as hidden (occluded) objects. • In general, more cameras mean more information about the scene.

  3. How does it look? • Multi-camera setup

  4. The scene • The filmed scene 1/3

  5. The scene • The filmed scene 2/3

  6. The scene • The filmed scene 3/3

  7. Perspective projection • Under perspective projection, a 3D point (X, Y, Z) maps to the image point (x, y) = (d·X/Z, d·Y/Z), where d is the focal length.

  8. The projection matrix • Object point P = (X, Y, Z, 1)* • Image point p = (x, y, 1)* • Using the pinhole model above, the projection matrix (so far) is a 3×4 matrix M such that p = MP. *Homogeneous coordinates

  9. Internal matrix • The internal matrix represents the inner camera settings • Focal length (d) • Principal point location, usually (0, 0) • Scaling factor

  10. External matrix • Includes all the orientation properties of the camera • Rotation • Translation

  11. Projection matrix summary • Internal parameters • External parameters • The result: p = MP
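To make p = MP concrete, here is a minimal numpy sketch (not from the original slides) that builds M = K[R | t] from example internal and external parameters and projects one homogeneous point; all numeric values are arbitrary illustrations.

import numpy as np

d = 800.0                      # focal length in pixels (example value)
cx, cy = 0.0, 0.0              # principal point, usually (0, 0) per the slides
K = np.array([[d, 0, cx],
              [0, d, cy],
              [0, 0, 1.0]])    # internal (intrinsic) matrix

R = np.eye(3)                  # rotation: camera aligned with the world here
t = np.array([[0.0], [0.0], [5.0]])    # translation
M = K @ np.hstack([R, t])      # 3x4 projection matrix

P = np.array([1.0, 2.0, 10.0, 1.0])    # 3D point in homogeneous coordinates
p = M @ P                      # image point, homogeneous
x, y = p[0] / p[2], p[1] / p[2]        # divide by the third coordinate
print(x, y)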

  12. Calibration • Camera calibration is used to coordinate between cameras. • Given a 3D point in the real world, find the projected point in the camera image. • The goal is to find the projection matrix M. • Using known 3D points and their corresponding image points, p = MP can be solved.
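One standard way to solve p = MP from known 3D-2D correspondences is the direct linear transform (DLT); the sketch below is a generic numpy illustration of that idea, not necessarily the exact method used in the talk.

import numpy as np

def estimate_projection_matrix(points_3d, points_2d):
    """points_3d: (n, 3) known world points; points_2d: (n, 2) image points; n >= 6."""
    rows = []
    for (X, Y, Z), (x, y) in zip(points_3d, points_2d):
        # each correspondence contributes two linear equations in the entries of M
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z, -x])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z, -y])
    A = np.asarray(rows)
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)    # null-space vector reshaped to M (defined up to scale)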

  13. When a full calibration is not necessary: homography • A mapping between a point on the ground plane as seen from one camera and the same point on the ground plane as seen from a second camera.

  14. When a homography can be used • When the images are of the same plane (Camera 1 / Camera 2 / Result)

  15. When a homography can be used • When the images are taken with the same camera that is only rotated

  16. Homography computation • Using the homography matrix H we can map a point from one image to the second image • So we have: p' = Hp • p and p' are given in homogeneous coordinates

  17. Homography computation • H is 3×3, defined up to scale • That gives 8 degrees of freedom • To find H we need 4 corresponding points
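A homography can be estimated from 4 (or more) corresponding points with the same DLT idea; the sketch below is a generic numpy illustration (OpenCV's cv2.findHomography offers a more robust version), not the talk's own code.

import numpy as np

def estimate_homography(src, dst):
    """src, dst: (n, 2) arrays of matching image points, n >= 4."""
    rows = []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -xp*x, -xp*y, -xp])
        rows.append([0, 0, 0, x, y, 1, -yp*x, -yp*y, -yp])
    A = np.asarray(rows)
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]             # fix the scale: 8 degrees of freedom

# Mapping a point: form p = (x, y, 1), compute p' = H @ p, then divide by p'[2].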

  18. Finding corresponding points • Manually, detecting similar features by hand. • Not accurate. • Good for 2 cameras, but what about 9 or more?

  19. Known solution • Automatic detection of known features. • A large working volume needs large objects. • Very hard to detect from a far distance.

  20. Feature detection in a wide-baseline setup • Noise • Hidden parts • Even assuming detection is possible, finding the correspondences is hard.

  21. Example of feature detection problems

  22. Goals of the calibration object • 360-degree view. • Robust to noise. • Accurate regardless of the distance (or zoom). • Easy to find corresponding points. • As automated as possible.

  23. Solution • Use easy-to-detect features (active features). • Use the benefits of the time dimension of video. • This creates an easy-to-detect list of corresponding points. • Find the homography using the list of points.

  24. Calibration object • Ultra bright LEDs. • Very bright, easy to detect.

  25. Use flickering as an identifier • Features flicker at a constant rate • Each feature has a different rate • The cameras film at a constant frame rate • The LED flicker can therefore be recovered from the video • The result: for each camera, a list of points ordered by increasing flicker frequency

  26. Detection method, first stage • Filter out unnecessary noise • Use the red channel only as the filter. • What about common red-channel filters in RGB such as R' = (R−B)+(R−G)? • These remove white pixels (where all channels have high intensity). • Not good here: an LED that causes high saturation appears white (and would be filtered out).
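For illustration, a rough numpy sketch of the two first-stage options discussed above; the threshold values and function names are assumptions, not taken from the talk.

import numpy as np

def red_channel_mask(frame, thresh=200):
    """frame: (H, W, 3) RGB image. Keep pixels with a strong red channel."""
    R = frame[:, :, 0].astype(np.int16)
    return R > thresh

def red_minus_others_mask(frame, thresh=100):
    """The (R-B)+(R-G) variant: it suppresses white pixels, so it also suppresses
    a saturated LED that appears white, which is why it is not used here."""
    R = frame[:, :, 0].astype(np.int16)
    G = frame[:, :, 1].astype(np.int16)
    B = frame[:, :, 2].astype(np.int16)
    return (R - B) + (R - G) > thresh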

  27. Filter example • Red channel only vs. (R−B)+(R−G)

  28. Detection method, second stage • Take advantage of the video timeline • The LED goes from the on state to the off state • Subtract consecutive frames (similar to background subtraction). • Detect candidate feature pixels using a threshold. • Save the detection frame number to recover the flicker rate.
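A minimal sketch of the frame-differencing idea, assuming the frames are already single-channel numpy arrays (e.g. the red channel from the first stage); the threshold is an illustrative value.

import numpy as np

def detect_blinks(frames, thresh=80):
    """frames: list of (H, W) arrays. Returns [(frame_idx, y, x), ...] candidates."""
    detections = []
    for k in range(1, len(frames)):
        # large inter-frame differences mark pixels where an LED switched on or off
        diff = np.abs(frames[k].astype(np.int16) - frames[k - 1].astype(np.int16))
        ys, xs = np.nonzero(diff > thresh)
        detections.extend((k, y, x) for y, x in zip(ys, xs))
    return detections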

  29. Detection method, third stage • So far we have candidate points and their frequencies. • Yet some of the candidates are noise. • Use the frequency as a second filter. • Most of the noise has a very short and inconsistent frequency.
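A possible form of the frequency filter, assuming the detection frame numbers have been grouped per candidate pixel; the exact criteria (minimum number of events, allowed jitter) are assumptions for illustration.

import numpy as np

def is_real_feature(frame_numbers, min_events=10, max_jitter=1.5):
    """frame_numbers: sorted frame indices at which this candidate was detected."""
    if len(frame_numbers) < min_events:
        return False                    # noise tends to blink only briefly
    gaps = np.diff(frame_numbers)
    return gaps.std() <= max_jitter     # a real LED has a near-constant flicker period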

  30. Noise and feature frequencies • Noise • Feature

  31. Frequency filter • Before

  32. Frequency filter • After

  33. Detection method, fourth stage • Once the LED pixels are detected, we need one pixel to represent each LED • The local maximum: the pixel with the highest intensity level • This handles different camera-to-feature distances and different zoom levels.
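A sketch of the local-maximum step using scipy's connected-component labelling; the talk does not specify the implementation, so this is only one plausible way to pick a representative pixel per LED.

import numpy as np
from scipy import ndimage

def blob_peaks(intensity, mask):
    """intensity: (H, W) image; mask: boolean LED-pixel mask from the earlier stages."""
    labels, n = ndimage.label(mask)     # group the LED pixels into blobs
    # for each blob, return the (y, x) of its brightest pixel
    return ndimage.maximum_position(intensity, labels, index=range(1, n + 1))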

  34. Local maximum example • Before

  35. Local maximum example • After

  36. Full tool example

  37. Synchronization • Given frame number k in the first camera, find the corresponding frame in the second camera. • Not all cameras start filming at the same time. • A known solution uses temporal features.

  38. Temporal features • Hard to find, not suitable for 9 cameras or more

  39. Automatic synchronization • Each feature has a different rate • The signature is based on the gaps between the LEDs' blinks. • Given an index, we search for the first time after this index at which the LED with the lowest frequency blinks, then the next one, and so on. • Given that the LEDs turned on at t0, t1, t2, t3, t4, t5, the resulting signature is • (t1−t0, t2−t1, t3−t2, t4−t3, t5−t4)
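A hedged sketch of the signature idea: the gaps between successive LED turn-on times form the signature, and two cameras can then be aligned by the shift that best matches their gap sequences; the matching criterion below is an assumption for illustration, not the talk's exact algorithm.

import numpy as np

def signature(on_times):
    """on_times: t0..tn at which the LEDs (lowest frequency first) turned on."""
    return np.diff(np.asarray(on_times, dtype=float))

def best_offset(sig_a, sig_b, max_shift=50):
    """Frame offset of camera B relative to camera A minimising the gap mismatch."""
    best, best_err = 0, np.inf
    for s in range(max_shift + 1):
        n = min(len(sig_a), len(sig_b) - s)
        if n <= 0:
            break
        err = np.mean(np.abs(sig_a[:n] - sig_b[s:s + n]))
        if err < best_err:
            best, best_err = s, err
    return best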

  40. Synchronization graph 1/2

  41. Synchronization graph 2/2

  42. Tool synchronization example

  43. The end • Thank you!!!
