Foundations of Computer Vision, Lecture 14
Roger S. Gaborski
RECALL: Orange Flower Example
Orange Flower Example
• 10-pixel sample from an orange flower:
>> mr = mean(data(:,1))
mr = 1
>> mg = mean(data(:,2))
mg = 0.5325
>> mb = mean(data(:,3))
mb = 0
>> distMeasure = sqrt((I(:,:,1)-mr).^2 + (I(:,:,2)-mg).^2 + (I(:,:,3)-mb).^2);
>> figure, imshow(distMeasure, [])
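The per-pixel Euclidean color distance above can be sketched in Python/NumPy (variable names mirror the MATLAB; the tiny image here is made up for illustration, and a real script would load one instead):

```python
import numpy as np

# Tiny synthetic 2x2 RGB image in [0, 1], standing in for the real photo.
I = np.array([[[1.0, 0.5325, 0.0], [0.0, 1.0, 0.0]],
              [[1.0, 0.5,    0.1], [0.0, 0.0, 1.0]]])

# Mean flower color estimated from the 10-pixel sample above.
mr, mg, mb = 1.0, 0.5325, 0.0

# Per-pixel Euclidean distance to the mean color, exactly as in the
# MATLAB expression sqrt((I(:,:,1)-mr).^2 + ...).
distMeasure = np.sqrt((I[:, :, 0] - mr) ** 2 +
                      (I[:, :, 1] - mg) ** 2 +
                      (I[:, :, 2] - mb) ** 2)

print(distMeasure)  # small values = pixels close to the flower color
```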
• The flower should be at pixel locations with a small distance measure
• Threshold the distance measure, but what threshold value?
>> figure, hist(distMeasure(:),100)
>> Iflower = distMeasure < .45;
>> figure, imshow(Iflower)
>> Fred = I(:,:,1).*Iflower;
>> Fgrn = I(:,:,2).*Iflower;
>> Fblu = I(:,:,3).*Iflower;
>> F(:,:,1) = Fred;
>> F(:,:,2) = Fgrn;
>> F(:,:,3) = Fblu;
>> figure, imshow(F)
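The threshold-and-mask step looks like this in NumPy (a sketch; the 0.45 threshold comes from inspecting the histogram, and the image and distance map are synthetic stand-ins):

```python
import numpy as np

# Synthetic 2x2 image and distance map standing in for the real ones.
I = np.array([[[1.0, 0.5, 0.0], [0.0, 0.0, 1.0]],
              [[0.9, 0.6, 0.1], [0.2, 0.2, 0.2]]])
distMeasure = np.array([[0.1, 1.5],
                        [0.2, 1.0]])

# Binary mask: pixels whose color is close to the flower mean.
Iflower = distMeasure < 0.45

# Apply the mask to every color plane at once; broadcasting replaces
# the three per-plane multiplications in the MATLAB code.
F = I * Iflower[:, :, np.newaxis]

print(F[0, 1])  # masked-out pixels become zero in all three planes
```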
Extract and Analyze Brandy
>> im = imread('IMGP1715.JPG');
>> imSm = imresize(im, .25);
>> figure, imshow(imSm)
Approaches
• Gray scale thresholding
1. Gray Scale Thresholding
• Approach: first convert to gray scale (losing color information), then threshold
>> imSmGray = rgb2gray(imSm);
>> imSmGray = im2double(imSmGray);
>> figure, imshow(imSmGray)
>> figure, imshow(im2bw(imSmGray, graythresh(imSmGray)))
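MATLAB's `graythresh` implements Otsu's method, which picks the threshold maximizing the between-class variance of the gray-level histogram. A minimal NumPy sketch (function name chosen to echo MATLAB; the bimodal test signal is made up):

```python
import numpy as np

def graythresh(img, nbins=256):
    """Otsu's method: the threshold that maximizes between-class variance."""
    hist, edges = np.histogram(img.ravel(), bins=nbins, range=(0.0, 1.0))
    p = hist / hist.sum()                  # gray-level probabilities
    omega = np.cumsum(p)                   # class-0 probability
    mu = np.cumsum(p * np.arange(nbins))   # class-0 cumulative mean
    mu_t = mu[-1]                          # global mean
    # Between-class variance for every candidate threshold.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    k = np.nanargmax(sigma_b)              # NaNs at the ends are ignored
    return edges[k + 1]                    # normalized threshold in [0, 1]

# Bimodal test signal: a dark half and a bright half.
img = np.concatenate([np.full(100, 0.2), np.full(100, 0.8)])
level = graythresh(img)
bw = img > level                           # analogous to im2bw
```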
Clearly unsuccessful. WHY DID IT FAIL?
The intensity values of Brandy's pixels are very close to those of the background, which makes it hard to segment based on the intensity distribution alone.
Grayscale Histogram
>> max(imSmGray(:))
ans = 0.9804
>> min(imSmGray(:))
ans = 0.0510
>> figure, imhist(imSmGray)
No clear line separates the dog from the background.
Approaches
• Gray scale thresholding
• Detect edges and then segment
2. Edge Detection: Sobel
2. Edge Detection: Laplacian of Gaussian
2. Edge Detection: Canny
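The Sobel variant of these detectors thresholds the gradient magnitude of the grayscale image. A hedged NumPy/SciPy sketch (the 8x8 step image and the 0.5 x max threshold are illustrative choices, not the MATLAB `edge` defaults):

```python
import numpy as np
from scipy import ndimage

# Synthetic image: dark left half, bright right half (one vertical edge).
img = np.zeros((8, 8))
img[:, 4:] = 1.0

# Sobel responses in x and y, combined into a gradient magnitude.
gx = ndimage.sobel(img, axis=1)
gy = ndimage.sobel(img, axis=0)
mag = np.hypot(gx, gy)

# Threshold the magnitude to get a binary edge map.
edges = mag > 0.5 * mag.max()
```

On Brandy's photo this fails for the reason given below: where dog and grass intensities match, the gradient magnitude stays small and no edge is found.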
Reason for the Failures
• Gray scale thresholding and edge detection both operate in gray scale space, taking only the intensity value of each pixel into account. Because the intensity values of the dog and the grass are very similar, the noise is hard to eliminate, and the edge detection algorithms fail for the same reason.
• They ignore the most informative component: the distinct colors of the brown dog and the green grass.
Approaches
• Gray scale thresholding
• Detect edges and then segment
• Color segmentation
  – Color spaces: RGB
  – Euclidean distance
  – Mahalanobis distance
3. Color Segmentation: Euclidean Distance
Manually select pixels
3. Color Segmentation: Mahalanobis Distance
Manually select pixels. (Figure labels: Brandy; noise from brown earth.)
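The Mahalanobis distance accounts for the covariance of the hand-picked color sample, not just its mean. A NumPy sketch (the sample pixels and test image are hypothetical values invented for illustration):

```python
import numpy as np

# Hypothetical hand-picked Brandy pixels, one RGB row per sample.
samples = np.array([[0.60, 0.45, 0.30],
                    [0.62, 0.47, 0.31],
                    [0.58, 0.44, 0.28],
                    [0.61, 0.46, 0.33],
                    [0.59, 0.43, 0.29]])

m = samples.mean(axis=0)              # mean sample color
C = np.cov(samples, rowvar=False)     # 3x3 color covariance
Cinv = np.linalg.inv(C)

def mahalanobis(pixels, m, Cinv):
    """Per-pixel Mahalanobis distance for an H x W x 3 image."""
    d = pixels - m                    # deviations from the mean color
    # (x - m)^T C^-1 (x - m) for every pixel, via einsum.
    return np.sqrt(np.einsum('...i,ij,...j->...', d, Cinv, d))

img = np.array([[[0.60, 0.45, 0.30],    # Brandy-like pixel
                 [0.20, 0.60, 0.20]]])  # grass-like pixel
dist = mahalanobis(img, m, Cinv)
```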
Original Brandy picture. Some regions of the scene have a very similar color to Brandy.
Discussion
None of the three planes alone will work for segmenting Brandy from the grass. However, by combining the R, G, and B planes, we can roughly segment Brandy from the grass with the Euclidean distance, and achieve a desirable segmentation with the Mahalanobis distance (which takes into account the correlations between the color planes).
Individual Color Planes
>> figure, subplot(2,3,1), imshow(imSm(:,:,1)), title('Red')
>> subplot(2,3,2), imshow(imSm(:,:,2)), title('Green')
>> subplot(2,3,3), imshow(imSm(:,:,3)), title('Blue')
>> subplot(2,3,4), imshow(im2bw(imSm(:,:,1),graythresh(imSm(:,:,1)))), title('Red Threshold')
>> subplot(2,3,5), imshow(im2bw(imSm(:,:,2),graythresh(imSm(:,:,2)))), title('Green Threshold')
>> subplot(2,3,6), imshow(im2bw(imSm(:,:,3),graythresh(imSm(:,:,3)))), title('Blue Threshold')
HSV Color Space
Displaying the three HSV planes together as if they were RGB does not produce a meaningful image.
>> imH = rgb2hsv(imSm);
>> figure, imshow(imH)
• Hue: the distinct hues of the brown dog (Brandy) and the green grass make this plane ideal for separating the dog from the background
• Saturation and Value: hard to distinguish the dog from the background
>> figure, subplot(1,3,1), imshow(imH(:,:,1)), title('Hue')
>> subplot(1,3,2), imshow(imH(:,:,2)), title('Saturation')
>> subplot(1,3,3), imshow(imH(:,:,3)), title('Value')
Histogram of the Hue plane
>> imhist(imH(:,:,1))
(Figure labels: grass distribution, dog distribution, separating line.)
>> level = graythresh(imH(:,:,1))
level = 0.1725   (automatic threshold)
>> figure, imshow(imH(:,:,1) > level)
Dog pixels have gray level value 0. Still, some background regions have a hue very similar to Brandy's.
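The hue-thresholding step can be sketched in NumPy by computing the hue plane directly from RGB (the same H channel `rgb2hsv` produces; the brown and green test pixels and the 0.17 level are illustrative):

```python
import numpy as np

def rgb_to_hue(img):
    """Hue plane of an RGB image in [0, 1], matching rgb2hsv's H channel."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    mn = img.min(axis=-1)
    diff = np.where(mx == mn, 1.0, mx - mn)   # avoid divide-by-zero
    h = np.zeros_like(mx)
    h = np.where(mx == r, (g - b) / diff, h)
    h = np.where(mx == g, 2.0 + (b - r) / diff, h)
    h = np.where(mx == b, 4.0 + (r - g) / diff, h)
    h = (h / 6.0) % 1.0
    return np.where(mx == mn, 0.0, h)          # gray pixels get hue 0

# Brown (dog-like) vs. green (grass-like) test pixels.
img = np.array([[[0.6, 0.4, 0.2],    # brown: hue near 0.08
                 [0.2, 0.6, 0.2]]])  # green: hue near 0.33
hue = rgb_to_hue(img)
level = 0.17                          # threshold in the spirit of graythresh
mask = hue > level                    # True for grass, False for the dog
```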
Summary
• Color is the ideal descriptor for segmenting Brandy from the grass (distinct colors)
• Edge detection algorithms fail when the intensity values of adjacent pixels are very similar to each other
• We will continue with color segmentation and morphological processing in the next lecture
• Follow-up assignments on region growing and color segmentation will be posted on the course website shortly; you will be informed when they are posted
Approaches
• We look at two approaches for color segmentation:
  – Region segmentation using a distance measurement
  – Region growing using seeds and a distance measurement
L*a*b* Color Space
• 'L*': luminosity (brightness) layer
• 'a*': chromaticity layer indicating where the color falls along the red-green axis
• 'b*': chromaticity layer indicating where the color falls along the blue-yellow axis
Distance Measure in L*a*b* Space
• Use only the a* and b* planes
• Manually sample the image
• Estimate the mean of the a* and b* values
• Calculate the distance as before
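The steps above can be sketched in NumPy, assuming the image has already been converted to L*a*b* by some converter (e.g. `skimage.color.rgb2lab`); the pixel values and sample means here are made up:

```python
import numpy as np

# Hypothetical L*a*b* image (H x W x 3): L* in [0, 100], a*/b* roughly
# in [-128, 127].  In practice this comes from an RGB-to-Lab converter.
lab = np.array([[[60.0,  20.0, 30.0],    # brownish, dog-like pixel
                 [55.0, -40.0, 35.0]]])  # greenish, grass-like pixel

# Mean a* and b* of the hand-picked sample region (assumed values).
ma, mb = 20.0, 30.0

# Distance uses only the chromaticity planes, ignoring luminosity L*.
dist = np.sqrt((lab[..., 1] - ma) ** 2 + (lab[..., 2] - mb) ** 2)
```

Dropping L* makes the measure insensitive to shading, which is exactly why the a*/b* planes are preferred here.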
K-means Clustering
• MATLAB: IDX = kmeans(X,k) partitions the points in the n-by-p data matrix X into k clusters
• How many clusters? k
• Distance measure: Euclidean
Very Simple Example
• Consider 1-dimensional data (the algorithm works with n-dimensional data)
• Assume k = 2
• Assume initial cluster centers: randomly pick 2 values
Very Simple Example
• Measure the distance between the centers and the remaining points
• Assign each point to the closer center
• Recalculate the centers based on membership:
z1 = 1
z2 = (2+3+5+6+7+8+9)/7 = 5.7143
Very Simple Example
• New centers: z1 = 1.000, z2 = 5.7143
• Assign points to the closer new center
• Recalculate the centers based on membership:
z1 = (1+2+3)/3 = 2.0
z2 = (5+6+7+8+9)/5 = 7.0
Very Simple Example
• No points are reassigned, so the algorithm is done:
z1 = 2.0
z2 = 7.0
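The worked example above can be reproduced with a short NumPy implementation of 1-D k-means (a sketch that assumes no cluster ever goes empty; the data and initial centers match the slides):

```python
import numpy as np

def kmeans_1d(data, centers, max_iter=100):
    """Plain k-means on 1-D data, following the steps on the slides."""
    centers = np.asarray(centers, dtype=float)
    for _ in range(max_iter):
        # Assign each point to its nearest center.
        labels = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
        # Recalculate each center as the mean of its members
        # (assumes every cluster keeps at least one member).
        new = np.array([data[labels == k].mean() for k in range(len(centers))])
        if np.allclose(new, centers):      # no center moved: converged
            break
        centers = new
    return centers

data = np.array([1., 2., 3., 5., 6., 7., 8., 9.])
centers = kmeans_1d(data, [1.0, 2.0])      # the two randomly picked values
print(centers)                             # [2. 7.]
```

The intermediate centers match the slides: [1, 5.7143] after the first pass, then [2, 7] after the second, at which point no point is reassigned.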
K-means Clustering
• The K-means algorithm has self-organizing properties
• n-dimensional vectors may be considered points in an n-dimensional Euclidean space
• By a Euclidean space we mean an R^n space with the distance between vectors x and y defined as:
d(x,y) = sqrt( (x1-y1)^2 + (x2-y2)^2 + ... + (xn-yn)^2 )
• The Euclidean norm, or length, of x is:
||x|| = sqrt( x1^2 + x2^2 + ... + xn^2 )
• K-means is one of many techniques that cluster by minimum distance. Why does using minimum distance make sense?
K-means Clustering
• Two vectors that represent points in n space that are geometrically close may in some sense belong together
• Notation:
Norm or length of vector x: ||x|| = ( Σi xi^2 )^(1/2)
Distance between two vectors: ||x-z|| = ( Σi (xi-zi)^2 )^(1/2)
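These two definitions correspond directly to simple NumPy expressions (the vectors here are arbitrary examples):

```python
import numpy as np

x = np.array([3.0, 4.0])
z = np.array([1.0, 1.0])

# Euclidean norm ||x|| = sqrt(3^2 + 4^2) = 5
norm_x = np.sqrt(np.sum(x ** 2))

# Distance ||x - z|| = sqrt((3-1)^2 + (4-1)^2) = sqrt(13)
dist_xz = np.sqrt(np.sum((x - z) ** 2))
```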
K-Means Algorithm
• We measure how close vectors are
• We establish cluster centers and partition the vectors among them so that the distance from each vector to its assigned center is minimal relative to the other centers
• With k-means you need to know the number of cluster centers, k, in advance