
Multi-Focus Range Sensor using Coded Aperture



Presentation Transcript


  1. Multi-Focus Range Sensor using Coded Aperture Shinsaku HIURA, Osaka Univ. Takashi MATSUYAMA, Kyoto Univ.

  2. Depth from Focus vs. Depth from Defocus
• Depth from Focus: the focused position is searched for by moving the lens – not suitable for real-time measurement; single-image analysis, so a planar photograph is indistinguishable from a real object
• Depth from Defocus: distance is estimated from the amount of blurring – passive and with no physical motion, so it suits real-time measurement with small, lightweight equipment
• Multiple-image analysis gives stable estimation against varied textures (relative defocus analysis)
• A small, optimized sensor is necessary

  3. Multi-Focus Camera
• Converted from a color CCD camera
• Each CCD is shifted 1 mm along the optical axis
• Neutral density is adjusted by re-coating the prism surfaces
• As small and light as an ordinary color CCD camera

  4. Telecentric Optics
• The aperture is placed at the front focal plane
• Image size and intensity are equal on every image plane; only the blurring varies
• First applied to DFD by Nayar
• Usual optics vs. telecentric optics (figure)

  5. Problems of past Depth from Defocus research
• High-frequency information is lost by blur (a low-pass filter) → unstable range estimation, too sensitive to the texture or environment, and high-quality noiseless images are necessary (e.g., Nayar averages over 256 images to eliminate noise)
• If the “blur” effect is changed to a high-pass or band-pass filter, range estimation can be stabilized → structured aperture (coded aperture)

  6. Multi-focus camera with a Coded Aperture
• The blurring kernel is the scaled shape of the aperture
• The magnification ratio varies with object distance

  7. Multi-focus camera with a coded aperture

  8. Mathematical model of blurring
• The scene s(x,y) at distance u is blurred by the kernel a(x,y), the aperture shape scaled by the magnification K1, K2, K3 of each image plane Wm (focus position v)
• Each input image i1(x,y), i2(x,y), i3(x,y) is the convolution of s(x,y) with its scaled kernel
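The blurring model above can be sketched numerically (a minimal illustration, not the authors' implementation; the nearest-neighbour rescaling and the function name are my own choices): each input image is the scene convolved with the aperture shape magnified by Km.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def blur_with_aperture(scene, aperture, scale):
    """Blur `scene` with the aperture shape magnified by `scale`,
    i.e. the blur kernel a(x,y) scaled by the defocus magnification Km."""
    h, w = aperture.shape
    # nearest-neighbour rescale of the aperture mask (illustrative only)
    ys = (np.arange(int(h * scale)) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(int(w * scale)) / scale).astype(int).clip(0, w - 1)
    kernel = aperture[np.ix_(ys, xs)].astype(float)
    kernel /= kernel.sum()          # preserve image intensity
    # circular convolution via FFT (boundary handling omitted for brevity)
    return np.real(ifft2(fft2(scene) * fft2(kernel, s=scene.shape)))
```

A larger `scale` (object farther from the focused distance) spreads the same aperture pattern over more pixels, which is what makes the kernel size carry depth information.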

  9. Range estimation using FFT (p, q: spatial frequencies, v: focus position)
• Fourier transform of the blurring process
• The original image information is eliminated by dividing the spectra of two input images
• Evaluation function of range estimation: the first term is calculated from two input images, the second from the blurring model; the minimum residual is searched for by varying v (focus position)
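One way to realize this search is sketched below (my own hedged reconstruction: the kernel-lookup interface `kernels_for_v` is hypothetical, and the cross-multiplied residual |I1·A2 − I2·A1|² is used as an equivalent of comparing the measured ratio I1/I2 against the model ratio A1/A2 without dividing by spectral zeros):

```python
import numpy as np
from numpy.fft import fft2

def residual(i1, i2, kernel1, kernel2):
    """Residual for one candidate focus position v; kernel1/kernel2 are the
    blur kernels the model predicts for the two image planes at that v."""
    I1, I2 = fft2(i1), fft2(i2)
    A1 = fft2(kernel1, s=i1.shape)
    A2 = fft2(kernel2, s=i2.shape)
    # scene spectrum S cancels: S*A1*A2 - S*A2*A1 == 0 at the true depth
    return np.sum(np.abs(I1 * A2 - I2 * A1) ** 2)

def estimate_depth(i1, i2, kernels_for_v):
    """Search the candidate focus position v minimising the residual.
    `kernels_for_v` maps each candidate v to its (kernel1, kernel2) pair."""
    return min(kernels_for_v, key=lambda v: residual(i1, i2, *kernels_for_v[v]))
```

In practice the images would be windowed patch-by-patch before the FFT, as the process-flow slide indicates, to obtain a dense range map.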

  10. Process flow
• Windowing & FFT
• Eliminate scene texture information
• Minimize the residual → range value

  11. Restoration of the blur-free image
• Inverse filter (v: focused position, Wm: weight calculated from v)
• A high-quality image can be restored because multiple images are used and rich information is retained by the coded aperture
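A common multi-image inverse filter takes the following Wiener-style form (a sketch under my own assumptions; the slide's exact weights Wm derived from the estimated v are not reproduced here):

```python
import numpy as np
from numpy.fft import fft2, ifft2

def restore(images, kernels, eps=1e-3):
    """Restore a blur-free image from several differently defocused images.
    Each spectrum is weighted by the conjugate of its blur kernel, so
    frequencies suppressed in one image are recovered from another."""
    num = 0.0
    den = eps  # small regulariser guards against division by zero
    for img, ker in zip(images, kernels):
        A = fft2(ker, s=img.shape)
        num = num + np.conj(A) * fft2(img)
        den = den + np.abs(A) ** 2
    return np.real(ifft2(num / den))
```

Because the coded aperture keeps high-frequency content in at least one of the images, the denominator rarely vanishes and the restoration stays stable.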

  12. Aperture design (1)
• The spatial frequency response (gain vs. spatial frequency) must be structured for easy, stable analysis
• High-frequency information must be preserved

  13. Aperture design (2)
• The usual circular aperture is not optimal; it suits photographs with beautiful defocus
• Compared candidates (figure): one shows no observable blurring; one is monotonic with low gain when blurred; one is feasible, but more spectral peaks are desired

  14. Simple example: two-hole aperture
• Blurring kernel (hole diameter ignored)
• The 1-D Fourier transform of the blurring kernel is cos()
• The period of cos() varies with object distance
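The cosine claim can be checked directly (my own toy illustration): two impulses separated by distance d transform to cos(π·d·f), and d grows with the defocus magnification, so the cosine period encodes object distance.

```python
import numpy as np

def two_hole_spectrum(d, f):
    """1-D Fourier transform of the two-hole aperture kernel
    a(x) = (delta(x - d/2) + delta(x + d/2)) / 2, hole diameter ignored:
    A(f) = cos(pi * d * f). The period 1/d shrinks as the kernel is
    magnified, i.e. as the object moves away from the focused distance."""
    return np.cos(np.pi * d * f)
```

Note that the spectrum crosses zero periodically, which is why several kernel scales (the multi-focus camera's three image planes) are needed to avoid losing frequencies entirely.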

  15. Robustness of range estimation
• Residual of the evaluation function with varied range: the “valley” shape shows that robust depth estimation is possible

  16. Experiment: input images (first CCD, center CCD, last CCD)

  17. Restored blur-free image

  18. Reconstructed object shape: 3-D object shape and blur-free image (partial)

  19. Range analysis using convolution
• The blurring kernel is convolved optically; depth is estimated by searching for the focus position that gives identical images
• Applying each candidate kernel to the opposite image yields the same image at the true depth (because of the commutative law of convolution)
• The usual circular aperture cannot be used, because blurring twice gives almost flat images; the coded aperture enables this simple principle
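The cross-convolution principle can be sketched as follows (a minimal digital simulation with a hypothetical `kernels_for_v` lookup; in the sensor, one of the convolutions happens optically):

```python
import numpy as np
from numpy.fft import fft2, ifft2

def conv2_circ(img, ker):
    """Circular 2-D convolution via FFT (illustrative shortcut)."""
    return np.real(ifft2(fft2(img) * fft2(ker, s=img.shape)))

def depth_by_cross_convolution(i1, i2, kernels_for_v):
    """Convolve each input image with the OTHER image's candidate blur
    kernel. Since i1 = s*a1 and i2 = s*a2, commutativity gives
    i1*a2 = s*a1*a2 = i2*a1 exactly when the candidate v is correct."""
    def mismatch(v):
        a1, a2 = kernels_for_v[v]
        return np.sum((conv2_circ(i1, a2) - conv2_circ(i2, a1)) ** 2)
    return min(kernels_for_v, key=mismatch)
```

This is mathematically equivalent to the FFT residual but uses only convolutions, which is what makes it attractive for parallel image-processing hardware.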

  20. Experiment: measured scene, input image, range image

  21. Asymmetric aperture design
• The convolution kernel changes across the focus plane (the phase part of the spatial frequency response changes)
• Erroneous range estimation is suppressed by the asymmetric aperture because of this phase change
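A toy 1-D check of this claim (my own illustration, not the paper's aperture): defocus on the far side of the focus plane mirrors the blur kernel, so a symmetric aperture makes front/back defocus indistinguishable, while an asymmetric aperture changes only the phase of its spectrum and removes the ambiguity.

```python
import numpy as np

def spectrum(kernel):
    """1-D spectrum of a blur kernel."""
    return np.fft.fft(kernel)

# Mirror-symmetric two-hole kernel: identical to its mirror image,
# so kernels for front and back defocus have the SAME spectrum.
symmetric = np.array([0.5, 0.0, 0.5])

# Unequal hole transmittance: the mirrored kernel has the same spectral
# MAGNITUDE but a different PHASE, so the defocus side is distinguishable.
asymmetric = np.array([0.7, 0.0, 0.3])
```
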

  22. Aperture symmetry and robustness of range estimation (input image)

  23. Motion sequence measurement using input image recording
• Three images are recorded as an RGB image on an optical video disc recorder
• The image is deteriorated by Y/C translation and cross-talk between the RGB channels

  24. Result: finger motion (input images and range images)

  25. Conclusions
• A small, lightweight multi-focus camera was developed
• A coded aperture was applied to depth from defocus, achieving stable range estimation
• Range estimation and image restoration by FFT; range estimation by convolution
• Recorded images can be used for motion analysis because range estimation is robust enough
• Real-time range measurement is possible with image processing hardware; the simple method is easily ported to parallel hardware

  26. Real-time calculation using image processing hardware
• The simple convolution method can easily be ported to image processing hardware
• The massively parallel IMAP-Vision board was used for the experiment
• Spec: 10 G instructions/sec with 256 PEs
• 25 frames/sec can be achieved (however, this board has no RGB separate-capture interface, so the experiment covers calculation only)
