  1. Edges and Contours – Chapter 7

  2. Visual perception • We don’t need to see all the color detail to recognize the scene content of an image • That is, some data provides critical information for recognition, while other data just makes things look “good”

  3. Visual perception • Sometimes we see things that are not really there!!! Kanizsa Triangle (and variants)

  4. Edges • Edges (single points) and contours (chains of edges) play a dominant role in (various) biological vision systems • Edges are spatial positions in the image where the intensity changes along some orientation (direction) • The larger the change in intensity, the stronger the edge • Basis of edge detection is the first derivative of the image intensity “function”

  5. First derivative – continuous f(x) • Slope of the line tangent to the function at a point

  6. First derivative – discrete f(u) • Slope of the line joining the two points adjacent to the selected point, i.e. the samples at u−1 and u+1
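  As a worked example (my notation, not from the slides): approximating the derivative at position u by the slope of the line through its two neighbors gives the central difference f'(u) ≈ (f(u+1) − f(u−1)) / 2, i.e. the 1-D kernel [−1, 0, 1] scaled by 1/2.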

  7. Discrete edge detection • Formulated as two partial derivatives • Horizontal gradients yield vertical edges • Vertical gradients yield horizontal edges • Upon detection we can learn the magnitude (strength) and orientation of the edge • More in a minute…
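  A minimal sketch of this idea in Python/NumPy (my own function names and kernel choice; the slides do not prescribe an implementation): estimate the two partial derivatives with central differences, then combine them into magnitude and orientation maps.

```python
import numpy as np
from scipy.ndimage import convolve

def gradient_maps(image):
    """Estimate horizontal/vertical gradients, then edge magnitude and orientation."""
    img = image.astype(float)
    hx = np.array([[-0.5, 0.0, 0.5]])   # horizontal gradient -> vertical edges
    hy = hx.T                            # vertical gradient   -> horizontal edges
    dx = convolve(img, hx, mode='reflect')
    dy = convolve(img, hy, mode='reflect')
    magnitude = np.hypot(dx, dy)         # edge strength
    orientation = np.arctan2(dy, dx)     # edge direction in radians
    return magnitude, orientation
```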

  8. NOTE • In the following images, only the positive magnitude edges are shown • This is an artifact of the ImageJ Process->Filters->Convolve… command • Implemented as an edge operator, the code would have to compensate for this
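  A sketch of what that compensation might look like (an assumption on my part: the artifact is that negative filter responses are clipped when the result is stored as an 8-bit image): do the convolution in floating point, then either keep the magnitude or re-center the range before converting back.

```python
import numpy as np
from scipy.ndimage import convolve

def signed_edge_to_uint8(image, kernel):
    """Preserve both positive and negative edge responses in an 8-bit result."""
    response = convolve(image.astype(float), kernel, mode='reflect')
    # Option 1: keep only the magnitude of the response
    # out = np.abs(response)
    # Option 2: re-center so a zero response maps to mid-gray (128)
    out = response / 2.0 + 128.0
    return np.clip(out, 0, 255).astype(np.uint8)
```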

  9. Detecting edges – sharp image (panels: original image, vertical edges, horizontal edges)

  10. Detecting edges – blurry image (panels: original image, vertical edges, horizontal edges)

  11. The problem… • Localized (small neighborhood) detectors are susceptible to noise

  12. The solution • Extend the neighborhood covered by the filter • Make the filter 2 dimensional • Perform a smoothing step prior to the derivative • Since the operators are linear filters, we can combine the smoothing and derivative operations into a single convolution
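  A small illustration of that last point, using Prewitt-style weights as an example: because convolution is linear and associative, the 3-tap box smoothing in one direction and the central difference in the other collapse into a single 3x3 kernel (their outer product).

```python
import numpy as np

smooth = np.array([[1], [1], [1]])   # 3x1 box smoothing (column vector)
deriv  = np.array([[-1, 0, 1]])      # 1x3 central difference (row vector)

# The two separable steps combine into one 3x3 convolution kernel.
hx = smooth @ deriv
print(hx)
# [[-1  0  1]
#  [-1  0  1]
#  [-1  0  1]]   <- the Prewitt-style H_x kernel, up to a scale factor
```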

  13. Edge operator • The following edge operators produce two results • A “magnitude” edge map (image) • An “orientation” edge map (image)

  14. Prewitt operator • 3x3 neighborhood • Equivalent to averaging followed by derivative • Note that these are convolutions, not matrix multiplications
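  A sketch of the Prewitt operator as described (these are the standard Prewitt kernels; the 1/6 normalization is one common convention and may not match the slides exactly):

```python
import numpy as np
from scipy.ndimage import convolve

PREWITT_HX = np.array([[-1, 0, 1],
                       [-1, 0, 1],
                       [-1, 0, 1]]) / 6.0   # average vertically, differentiate horizontally
PREWITT_HY = PREWITT_HX.T                   # average horizontally, differentiate vertically

def prewitt_edges(image):
    img = image.astype(float)
    dx = convolve(img, PREWITT_HX, mode='reflect')
    dy = convolve(img, PREWITT_HY, mode='reflect')
    return np.hypot(dx, dy), np.arctan2(dy, dx)   # magnitude map, orientation map
```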

  15. Prewitt – sharp image

  16. Prewitt – blurry image

  17. Prewitt – noisy image • Clearly this is not a good solution…what went wrong? • The smoothing just smeared out the noise • How could you fix it? • Perform non-linear noise removal first
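  The slides do not name the non-linear filter; a median filter is the usual choice, so one possible fix (a sketch, using SciPy's built-in Prewitt gradients for brevity) could look like:

```python
import numpy as np
from scipy.ndimage import median_filter, prewitt

def prewitt_after_median(image):
    # Non-linear noise removal first: a 3x3 median suppresses impulse noise
    # without smearing it out the way linear smoothing does.
    denoised = median_filter(image.astype(float), size=3)
    dx = prewitt(denoised, axis=1)   # horizontal gradient -> vertical edges
    dy = prewitt(denoised, axis=0)   # vertical gradient   -> horizontal edges
    return np.hypot(dx, dy)
```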

  18. Prewitt magnitude and direction

  19. Prewitt magnitude and direction

  20. Sobel operator • 3x3 neighborhood • Equivalent to averaging followed by derivative • Note that these are convolutions, not matrix multiplications • Same as Prewitt but with the center row/column weighted more heavily
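  The Sobel kernels differ from Prewitt only in the doubled center weight; a sketch analogous to the Prewitt one above (again, the 1/8 normalization is a convention, not necessarily the slides'):

```python
import numpy as np
from scipy.ndimage import convolve

SOBEL_HX = np.array([[-1, 0, 1],
                     [-2, 0, 2],
                     [-1, 0, 1]]) / 8.0   # center row gets double weight
SOBEL_HY = SOBEL_HX.T                     # center column gets double weight

def sobel_edges(image):
    img = image.astype(float)
    dx = convolve(img, SOBEL_HX, mode='reflect')
    dy = convolve(img, SOBEL_HY, mode='reflect')
    return np.hypot(dx, dy), np.arctan2(dy, dx)
```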

  21. Sobel – sharp image

  22. Sobel – blurry image

  23. Sobel – noisy image • Clearly this is not a good solution…what went wrong? • The smoothing just smeared out the noise • How could you fix it? • Perform non-linear noise removal first

  24. Sobel magnitude and direction

  25. Sobel magnitude and direction

  26. Sobel magnitude and direction • Still not good…how could we fix this now? • Using the direction information can help eliminate edges due to noise (noise produces lots of randomly oriented, non-homogeneous directions) • This is a “higher level” (intelligent) function

  27. Roberts operator • Looks for diagonal gradients rather than horizontal/vertical • Everything else is similar to Prewitt and Sobel operators
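  A sketch with the usual 2x2 Roberts kernels (sign conventions vary between texts):

```python
import numpy as np
from scipy.ndimage import convolve

# 2x2 kernels sensitive to the two diagonal gradient directions
ROBERTS_H1 = np.array([[ 0, 1],
                       [-1, 0]])
ROBERTS_H2 = np.array([[-1, 0],
                       [ 0, 1]])

def roberts_edges(image):
    img = image.astype(float)
    d1 = convolve(img, ROBERTS_H1, mode='reflect')
    d2 = convolve(img, ROBERTS_H2, mode='reflect')
    return np.hypot(d1, d2)   # diagonal gradients combined into a magnitude map
```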

  28. Roberts magnitude and direction

  29. Roberts magnitude and direction

  30. Roberts magnitude and direction

  31. Compass operators • An alternative to computing edge orientation as an estimate derived from two oriented filters (horizontal and vertical) • Compass operators employ multiple oriented filters • The two most famous are • Kirsch • Nevatia-Babu

  32. Kirsch Filter • Eight 3x3 kernels • Theoretically must perform eight convolutions • Realistically, only compute four convolutions; the other four are merely sign changes • The kernel that produces the maximum response is deemed the winner • Choose its magnitude • Choose its direction
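  The slide's kernel images are not reproduced here, so the coefficients below are Sobel-style stand-ins for the eight directions, not necessarily the exact Kirsch values; what the sketch does show is the selection logic the slide describes: convolve with four oriented kernels, obtain the other four responses by a sign change, and let the maximum response decide both magnitude and direction.

```python
import numpy as np
from scipy.ndimage import convolve

# Four oriented 3x3 kernels (0, 45, 90, 135 degrees); the remaining four
# directions are simply the negatives of these.
K0 = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
K1 = np.array([[-2, -1, 0], [-1, 0, 1], [0, 1, 2]])
K2 = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])
K3 = np.array([[0, -1, -2], [1, 0, -1], [2, 1, 0]])

def compass_edges(image):
    img = image.astype(float)
    responses = [convolve(img, k, mode='reflect') for k in (K0, K1, K2, K3)]
    responses += [-r for r in responses]        # sign changes give the other four
    stack = np.stack(responses)                 # shape (8, height, width)
    magnitude = stack.max(axis=0)               # winning (maximum) response
    direction = stack.argmax(axis=0) * 45.0     # orientation of the winner, in degrees
    return magnitude, direction
```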

  33. Kirsch filter kernels (panels: vertical edges, L-R diagonal edges, horizontal edges, R-L diagonal edges)

  34. Kirsch filter

  35. Nevatia-Babu Filter • Twelve 5x5 kernels • Theoretically must perform twelve convolutions • Increments of approximately 30° • Realistically, only compute six convolutions; the other six are merely sign changes • The kernel that produces the maximum response is deemed the winner • Choose its magnitude • Choose its direction

  36. Nevatia-Babu filter
