
Recovery of Chromaticity Image Free from Shadows via Illumination Invariance


Presentation Transcript


  1. Recovery of Chromaticity Image Free from Shadows via Illumination Invariance. Mark S. Drew¹, Graham D. Finlayson², & Steven D. Hordley². ¹School of Computing Science, Simon Fraser University, Canada; ²School of Information Systems, University of East Anglia, UK

  2. Overview
  - Introduction
  - Shadow-free greyscale images: illuminant invariance at a pixel gives a 1D image
  - Shadow-free chromaticity images: a better-behaved 2D colour image invariant to lighting
  - Application: a shadow-edge map aimed at re-integrating to obtain a full-colour, shadow-free image

  3. The Aim: Shadow Removal We would like to go from a colour image with shadows to the same colour image, but without the shadows.

  4. Why Shadow Removal? For computer vision, image enhancement, scene re-lighting, etc.; e.g., improved object tracking and segmentation. [Figure: two successive video frames ("snake" sequence); motion map in the original colour space vs. motion map in the invariant colour space]

  5. What is a shadow? [Figure: region lit by sunlight and sky-light vs. region lit by sky-light only] A shadow is a local change in illumination intensity and (often) illumination colour.

  6. Removing Shadows So, if we can factor out the illumination locally (at a pixel) it should follow that we remove the shadows. Can we factor out illumination locally? That is, can we derive an illumination-invariant colour representation at a single image pixel? Yes, provided that our camera and illumination satisfy certain restrictions ….

  7. Conditions for Illumination Invariance. Assumptions (but it works anyway…!):
  (1) If the sensors can be represented as delta functions (they respond only at a single wavelength),
  (2) and the illumination is restricted to the Planckian locus,
  (3) then we can find a 1D coordinate, a function of image chromaticities, which is invariant to illuminant colour and intensity;
  (4) this gives us a greyscale representation of our original image, but without the shadows (so it takes us a third of the way to the goal of this talk!);
  (5) but the greyscale value in fact lives in a 2D log-chromaticity colour space (so it takes us two thirds of the way) [and exponentiating goes back to a rank-3 colour].

  8. Chromaticity. [Figure: colour image, greyscale, chromaticity] 2D chromaticity carries much more information than 1D greyscale: can we obtain a shadowless chromaticity image?

  9. Image Formation. Camera responses depend on 3 factors: light (E), surface (S), and sensor (Q): $R_k = \sigma \int E(\lambda)\, S(\lambda)\, Q_k(\lambda)\, d\lambda$, where $\sigma$ is Lambertian shading.
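
A minimal NumPy sketch of this image-formation equation; the illuminant, reflectance, and sensor curves below are made-up toy spectra for illustration, not the paper's data.

```python
import numpy as np

# Minimal numerical sketch of the Lambertian image-formation model,
# R_k = sigma * integral of E(lambda) S(lambda) Q_k(lambda) d(lambda).
# All spectra below are illustrative assumptions, sampled every 10 nm.
wavelengths = np.arange(400, 701, 10).astype(float)        # nm
E = np.ones_like(wavelengths)                               # toy flat illuminant
S = np.linspace(0.2, 0.8, wavelengths.size)                 # toy surface reflectance
Q = np.stack([np.exp(-0.5 * ((wavelengths - c) / 30.0) ** 2)
              for c in (450.0, 540.0, 610.0)])              # toy B, G, R sensor curves
sigma = 0.7                                                 # Lambertian shading term

R = sigma * np.trapz(E * S * Q, wavelengths, axis=1)        # three camera responses
print(R)
```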

  10. Using Delta-Function Sensitivities. [Figure: sensitivity curves $Q_1(\lambda)$, $Q_2(\lambda)$, $Q_3(\lambda)$ plotted against wavelength $\lambda$] Delta functions $Q_k(\lambda) = q_k\, \delta(\lambda - \lambda_k)$ "select" single wavelengths: $R_k = \sigma\, E(\lambda_k)\, S(\lambda_k)\, q_k$.

  11. Characterizing Typical Illuminants Most typical illuminants lie on, or close to, the Planckian locus (the red line in the figure) So, let’s represent illuminants by their equivalent Planckian black-body illuminants ...

  12. Planckian Black-body Radiators: $E(\lambda, T) = I\, c_1\, \lambda^{-5} \left( e^{c_2/(\lambda T)} - 1 \right)^{-1}$. Here $I$ controls the overall intensity of light, $T$ is the temperature, and $c_1$, $c_2$ are constants. For typical illuminants, $c_2 \gg \lambda T$. So, Wien's approximation: $E(\lambda, T) \approx I\, c_1\, \lambda^{-5}\, e^{-c_2/(\lambda T)}$.

  13. How good is this approximation? [Figure: Planck's law vs. Wien's approximation at 2500 K, 5500 K, and 10000 K]
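
A small NumPy sketch of this check, evaluating both formulas over the visible range at the three temperatures above; it uses the standard radiation constants for $c_1$ and $c_2$ and is illustrative only.

```python
import numpy as np

# Compare Planck's law with Wien's approximation over the visible range
# at three colour temperatures; I is set to 1 throughout.
c1, c2 = 3.74183e-16, 1.4388e-2        # radiation constants (W m^2 and m K)
lam = np.arange(400, 701, 10) * 1e-9   # wavelengths in metres

def planck(lam, T, I=1.0):
    return I * c1 * lam ** -5 / (np.exp(c2 / (lam * T)) - 1.0)

def wien(lam, T, I=1.0):
    return I * c1 * lam ** -5 * np.exp(-c2 / (lam * T))

for T in (2500.0, 5500.0, 10000.0):
    rel_err = np.max(np.abs(planck(lam, T) - wien(lam, T)) / planck(lam, T))
    print(f"T = {T:7.0f} K, max relative error = {rel_err:.4f}")
```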

  14. Back to the image formation equation. For delta-function sensors and Planckian illumination we have: $R_k = \sigma\, I\, c_1\, \lambda_k^{-5}\, e^{-c_2/(\lambda_k T)}\, S(\lambda_k)\, q_k$, where the light term is $I\, c_1\, \lambda_k^{-5}\, e^{-c_2/(\lambda_k T)}$ and the surface term is $S(\lambda_k)$.

  15. Band-ratio chromaticity. [Figure: RGB colour cube with the plane G = 1] Let us define a set of 2D band-ratio chromaticities $r_k = R_k / R_p$, $k \ne p$, where $p$ is one of the channels (Green, say). This is a perspective projection onto $G = 1$.
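
A minimal NumPy sketch of these band-ratio chromaticities; the function name and the epsilon guard against division by zero are illustrative assumptions.

```python
import numpy as np

# Band-ratio chromaticities: divide the other two channels by a chosen
# channel p (Green, index 1, by default).  `rgb` is an (H, W, 3) float image.
def band_ratio_chromaticity(rgb, p=1, eps=1e-6):
    keep = [k for k in range(3) if k != p]
    return rgb[..., keep] / (rgb[..., p:p + 1] + eps)   # (H, W, 2): {R/G, B/G}
```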

  16. Band-ratios remove shading and intensity. Let's take logs: $r'_k = \log(r_k) = \log(s_k / s_p) + (e_k - e_p)/T$, with $s_k = c_1\, \lambda_k^{-5}\, S(\lambda_k)\, q_k$ and $e_k = -c_2 / \lambda_k$. Shading and intensity are gone, and as $T$ varies this gives a straight line.
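
A small NumPy sketch of this straight-line behaviour; the sensor wavelengths, surface reflectances, and gains are assumed values for illustration.

```python
import numpy as np

# For delta-function sensors under a Wien-approximated Planckian light,
# the log band-ratios trace a straight line as colour temperature T varies.
c1, c2 = 3.74183e-16, 1.4388e-2
lam_k = np.array([610e-9, 540e-9, 450e-9])    # "delta" R, G, B sensor wavelengths
S_k = np.array([0.6, 0.5, 0.4])               # surface reflectance at lam_k
q_k = np.array([1.0, 1.0, 1.0])               # sensor gains

def log_band_ratios(T, sigma=1.0, I=1.0):
    E = I * c1 * lam_k ** -5 * np.exp(-c2 / (lam_k * T))   # Wien approximation
    R = sigma * E * S_k * q_k                              # responses at lam_k
    return np.log(R[[0, 2]] / R[1])                        # {log R/G, log B/G}

pts = np.array([log_band_ratios(T) for T in np.linspace(2500.0, 10000.0, 14)])
d = np.diff(pts, axis=0)
d /= np.linalg.norm(d, axis=1, keepdims=True)
print(d)   # every row is the same unit vector, so the 14 points are collinear
```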

  17. Calibration: find the invariant direction. [Figure: Macbeth ColorChecker (24 patches); log-ratio chromaticities for 6 surfaces under 14 different Planckian illuminants, HP912 camera]

  18. Deriving the Illuminant Invariant This axis is invariant to shading + illuminant intensity/colour

  19. Algorithm: plot the log-ratio chromaticities, subtract the mean for each colour patch, and take the SVD; the 2nd eigenvector gives the invariant direction.
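
A sketch of this calibration step in NumPy, assuming the log band-ratio chromaticities have already been arranged per patch and per illuminant; the array layout and function name are assumptions.

```python
import numpy as np

# `chi` holds log band-ratio chromaticities, shape (patches, illuminants, 2).
# Subtract each patch's mean, pool the points, and take an SVD; the first
# right singular vector is the lighting direction, the second the invariant one.
def invariant_direction(chi):
    centred = chi - chi.mean(axis=1, keepdims=True)     # remove per-patch mean
    pooled = centred.reshape(-1, 2)
    _, _, vt = np.linalg.svd(pooled, full_matrices=False)
    return vt[1]                                        # unit 2-vector, invariant direction
```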

  20. Algorithm, cont'd: form the greyscale $I'$ in log-space by projecting onto the invariant direction, $I' = r' \cdot e^{\perp}$, then exponentiate: $I = \exp(I')$.
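
A minimal sketch of this projection-and-exponentiation step; the names `chi` and `e_perp` are illustrative.

```python
import numpy as np

# Project each pixel's log band-ratio chromaticity onto the invariant
# direction e_perp (a unit 2-vector from calibration), then exponentiate.
def invariant_greyscale(chi, e_perp):
    I_log = chi @ np.asarray(e_perp, dtype=float)   # I' = chi . e_perp per pixel
    return np.exp(I_log)                            # greyscale invariant image
```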

  21. Obtaining the invariant chromaticity image (1): we observe that the line in 2D log-chromaticity space is still 2D if we use a projector rather than a rotation: $\tilde{r}' = P_{e^{\perp}}\, r'$, with $P_{e^{\perp}} = e^{\perp} (e^{\perp})^{T}$, giving a 2-vector at each pixel.
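
A minimal sketch of the projector form; the names are illustrative.

```python
import numpy as np

# Keep a 2-vector per pixel by projecting with the rank-1 matrix
# P = e_perp e_perp^T instead of rotating down to one coordinate.
def project_chromaticity(chi, e_perp):
    e_perp = np.asarray(e_perp, dtype=float)
    P = np.outer(e_perp, e_perp)        # 2x2 projector onto the e_perp direction
    return chi @ P                      # (H, W, 2) projected log chromaticities
```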

  22. Obtaining the invariant chromaticity image (2): however, we have removed all lighting! So we put back an offset in the e-direction equal to a regression on the top 1% brightest pixels.
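
A rough sketch of putting the offset back; the slide specifies a regression on the top 1% brightest pixels, while this simplification just uses their mean residual along e. The brightness measure (e.g. R+G+B per pixel) and the helper name are assumptions.

```python
import numpy as np

# Add back an offset along the lighting direction e so the projected
# chromaticities agree with the brightest (assumed unshadowed) pixels.
def add_light_offset(chi, chi_proj, e, brightness, top=0.01):
    e = np.asarray(e, dtype=float)
    bright = brightness.ravel() >= np.quantile(brightness, 1.0 - top)
    resid = (chi.reshape(-1, 2) - chi_proj.reshape(-1, 2))[bright]
    offset = resid.mean(axis=0) @ e       # scalar offset along e
    return chi_proj + offset * e          # put the offset back at every pixel
```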

  23. Obtaining the invariant chromaticity image (3): with the offset in the e-direction put back, we can move to the more familiar L1-chromaticity: $\{r, g, b\} = \{R, G, B\} / (R + G + B)$.
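
For reference, a one-line L1-chromaticity conversion in NumPy; the epsilon guard against all-zero pixels is an added assumption.

```python
import numpy as np

# L1 chromaticity: {r, g, b} = {R, G, B} / (R + G + B), per pixel.
def l1_chromaticity(rgb, eps=1e-6):
    return rgb / (rgb.sum(axis=-1, keepdims=True) + eps)
```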

  24. Obtaining the invariant chromaticity image (4): in terms of L1-chromaticity. [Figure: original vs. recovered L1-chromaticity]

  25. Obtaining the invariant chromaticity image (5): the projection line becomes a rank ~3 curve in L1 chromaticity space.

  26. Obtaining the invariant chromaticity image (6): we can do better at fitting the recovered chromaticity to the original by regressing on the brightest quartile of pixels.
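
A rough sketch of this brightest-quartile fit; the affine form of the regression and the function name are assumptions here, not the paper's exact procedure.

```python
import numpy as np

# Fit a least-squares affine map from recovered chromaticities to the
# originals using only the brightest quartile of pixels (assumed to be
# outside shadow), then apply the map everywhere.
def regress_on_brightest(recovered, original, brightness, quantile=0.75):
    rec = recovered.reshape(-1, 2)
    org = original.reshape(-1, 2)
    bright = brightness.ravel() >= np.quantile(brightness, quantile)
    X = np.c_[rec[bright], np.ones(bright.sum())]
    coeffs, *_ = np.linalg.lstsq(X, org[bright], rcond=None)
    X_all = np.c_[rec, np.ones(rec.shape[0])]
    return (X_all @ coeffs).reshape(recovered.shape)
```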

  27. Regressing improves the chromaticity. [Figure: original, recovered, and regressed chromaticity]

  28. Some Examples. [Figure: colour image, chromaticity, and recovered shadow-free chromaticity]

  29. Main Advantage: the chromaticity invariant (in [0,1]) is better-behaved than the greyscale invariant, making it better for shadow-free re-integration (ECCV02).

  30. Acknowledgements The authors would like to thank the Natural Sciences and Engineering Research Council of Canada, and Hewlett-Packard Incorporated for their support of this work.
