

1. Use of Autocorrelation Kernels in Kernel Canonical Correlation Analysis for Texture Classification
Yo Horikawa, Kagawa University, Japan

2. ・Support vector machine (SVM)
・Kernel canonical correlation analysis (kCCA) with autocorrelation kernels
→ Invariant texture classification using only raw pixel data, without explicit feature extraction.
Goals: compare the performance of the two kernel methods, and discuss the effects of the order of the autocorrelation kernels.

3. Support vector machine (SVM)
Sample data: $x_i$ ($1 \le i \le n$), each belonging to a class $c_i \in \{-1, 1\}$.
The SVM learns a discriminant function for test data $x$:
$d(x) = \mathrm{sgn}\bigl(\sum_{i=1}^{n'} \alpha_i c_i K(x, x_{s_i}) + b\bigr)$
where $\alpha_i$ and $b$ are obtained by solving a quadratic programming problem.
Kernel function: the inner product of nonlinear maps $\varphi(x)$, i.e., $K(x_i, x_j) = \varphi(x_i) \cdot \varphi(x_j)$.
Support vectors $x_{s_i}$ ($1 \le i \le n' \le n$): a subset of the sample data.
The feature extraction process is done implicitly in the SVM, through the kernel function and the support vectors.
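The slide does not fix an implementation, so here is a minimal sketch of how $d(x)$ can be obtained in practice with scikit-learn's SVC and a precomputed Gram matrix; the toy data and the `gram_matrix` helper are illustrative assumptions, while the soft-margin value C = 100 is the one quoted on slide 9.

```python
# Minimal sketch: SVM with a custom (precomputed) kernel via scikit-learn.
import numpy as np
from sklearn.svm import SVC

def gram_matrix(A, B, kernel):
    """Gram matrix G[i, j] = kernel(A[i], B[j]) for row-vector datasets."""
    return np.array([[kernel(a, b) for b in B] for a in A])

# Toy data: 20 training vectors of dimension 100, two classes.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((20, 100))
y_train = np.repeat([-1, 1], 10)
X_test = rng.standard_normal((5, 100))

linear = lambda a, b: a @ b                  # kernel (i) of slide 9: linear
K_train = gram_matrix(X_train, X_train, linear)
K_test = gram_matrix(X_test, X_train, linear)

clf = SVC(kernel="precomputed", C=100.0)     # soft margin C = 100 (slide 9)
clf.fit(K_train, y_train)                    # solves the QP for alpha_i, b
print(clf.predict(K_test))                   # d(x) = sgn(sum_i a_i c_i K(x, x_si) + b)
```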

4. Autocorrelation kernel
The $k$th-order autocorrelation of data $x_i(t)$:
$r_{x_i}(t_1, t_2, \cdots, t_{k-1}) = \int x_i(t)\, x_i(t+t_1) \cdots x_i(t+t_{k-1})\, dt$
The inner product between $r_{x_i}$ and $r_{x_j}$ can be calculated from the $k$th power of the (2nd-order) cross-correlation function:
$r_{x_i} \cdot r_{x_j} = \int \bigl\{ \int x_i(t)\, x_j(t+t_1)\, dt \bigr\}^k dt_1$
The explicit calculation of the autocorrelations is thus avoided, so high-order autocorrelations become tractable at a practical computational cost.
・Linear autocorrelation kernel: $K(x_i, x_j) = r_{x_i} \cdot r_{x_j}$
・Gaussian autocorrelation kernel: $K(x_i, x_j) = \exp(-\mu |r_{x_i} - r_{x_j}|^2) = \exp(-\mu (r_{x_i} \cdot r_{x_i} + r_{x_j} \cdot r_{x_j} - 2\, r_{x_i} \cdot r_{x_j}))$
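The identity above is the crux of the method, so a quick numerical check may help. The sketch below verifies it for a small 1-D discrete signal with circular lags (an assumption made here so the identity holds exactly), comparing the brute-force inner product over all lag tuples with the kernel-trick evaluation for k = 3.

```python
# Sketch: verify r_xi . r_xj = sum_t1 (cross-correlation(t1))^k, k = 3,
# for 1-D signals of length T with circular (mod T) lags.
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
T, k = 8, 3
xi, xj = rng.standard_normal(T), rng.standard_normal(T)

def autocorr(x, lags):
    """kth-order autocorrelation r_x(t1, ..., t_{k-1}), circular lags."""
    return sum(x[t] * np.prod([x[(t + l) % T] for l in lags]) for t in range(T))

# Brute force: inner product over all (k-1)-tuples of lags, O(T^{k-1}) terms.
lhs = sum(autocorr(xi, lags) * autocorr(xj, lags)
          for lags in product(range(T), repeat=k - 1))

# Kernel trick: one cross-correlation, then the kth power summed over lags.
cross = np.array([sum(xi[t] * xj[(t + t1) % T] for t in range(T))
                  for t1 in range(T)])
rhs = np.sum(cross ** k)

print(np.isclose(lhs, rhs))   # True: the autocorrelations never materialize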

5. Calculation of autocorrelation kernels
$r_{x_i} \cdot r_{x_j}$ for 2-dimensional image data $x(l, m)$ ($1 \le l \le L$, $1 \le m \le M$):
・Calculate the cross-correlations between $x_i(l, m)$ and $x_j(l, m)$:
$r_{x_i, x_j}(l_1, m_1) = \sum_{l=1}^{L-l_1} \sum_{m=1}^{M-m_1} x_i(l, m)\, x_j(l+l_1, m+m_1) / (LM)$  ($0 \le l_1 \le L_1-1$, $0 \le m_1 \le M_1-1$)
・Sum up the $k$th power of the cross-correlations over the lag window:
$r_{x_i} \cdot r_{x_j} = \sum_{l_1=0}^{L_1-1} \sum_{m_1=0}^{M_1-1} \{ r_{x_i, x_j}(l_1, m_1) \}^k$
[Figure: the overlap of $x_i(l, m)$ with the shifted $x_j(l+l_1, m+m_1)$ inside the $L \times M$ image, accumulated over the $L_1 \times M_1$ lag window.]
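The two formulas above translate directly into code. The following is a sketch, not the authors' implementation: the lag window L1 = M1 = 10 matches slide 9, while μ and the random 50×50 test patches are illustrative assumptions.

```python
# Sketch of the 2-D autocorrelation kernels of slides 4-5.
import numpy as np

def autocorr_kernel(xi, xj, k, L1=10, M1=10):
    """Linear autocorrelation kernel r_xi . r_xj for 2-D images xi, xj."""
    L, M = xi.shape
    value = 0.0
    for l1 in range(L1):
        for m1 in range(M1):
            # cross-correlation at lag (l1, m1), normalized by the image area
            c = np.sum(xi[:L - l1, :M - m1] * xj[l1:, m1:]) / (L * M)
            value += c ** k                  # kth power, summed over lags
    return value

def gaussian_autocorr_kernel(xi, xj, k, mu=1.0):
    """Gaussian autocorrelation kernel exp(-mu * |r_xi - r_xj|^2)."""
    kii = autocorr_kernel(xi, xi, k)
    kjj = autocorr_kernel(xj, xj, k)
    kij = autocorr_kernel(xi, xj, k)
    return np.exp(-mu * (kii + kjj - 2.0 * kij))

rng = np.random.default_rng(2)
a, b = rng.random((50, 50)), rng.random((50, 50))   # stand-ins for patches
print(autocorr_kernel(a, b, k=2), gaussian_autocorr_kernel(a, b, k=2))
```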

6. Kernel canonical correlation analysis (kCCA)
Pairs of feature vectors of sample objects: $(x_i, y_i)$ ($1 \le i \le n$).
kCCA finds projections (canonical variates) $(u, v)$ that yield the maximum correlation between $\varphi(x)$ and $\theta(y)$:
$(u, v) = (w_\varphi \cdot \varphi(x),\; w_\theta \cdot \theta(y))$, with $w_\varphi = \sum_{i=1}^n f_i \varphi(x_i)$ and $w_\theta = \sum_{i=1}^n g_i \theta(y_i)$,
where $f^T = (f_1, \cdots, f_n)$ and $g^T = (g_1, \cdots, g_n)$ are the eigenvectors of the (regularized) generalized eigenvalue problem
$\begin{pmatrix} 0 & \Phi\Theta \\ \Theta\Phi & 0 \end{pmatrix} \begin{pmatrix} f \\ g \end{pmatrix} = \lambda \begin{pmatrix} \Phi^2 + \gamma_x I & 0 \\ 0 & \Theta^2 + \gamma_y I \end{pmatrix} \begin{pmatrix} f \\ g \end{pmatrix}$
with the Gram matrices $\Phi_{ij} = \varphi(x_i) \cdot \varphi(x_j)$, $\Theta_{ij} = \theta(y_i) \cdot \theta(y_j)$, and $I$ the $n \times n$ identity matrix.
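The displayed eigenproblem was lost in the transcript, so the form above (and the sketch below) follows the standard regularized kCCA formulation, using the symbols defined on this slide and the regularization constants γx, γy quoted on slide 9; treat the exact regularization as an assumption.

```python
# Sketch of regularized kernel CCA as a generalized symmetric eigenproblem.
import numpy as np
from scipy.linalg import eigh

def kcca(Phi, Theta, gx=1e-3, gy=1e-3):
    """Return eigenvalues (descending) and the coefficient blocks f, g."""
    n = Phi.shape[0]
    Z = np.zeros((n, n))
    A = np.block([[Z, Phi @ Theta],
                  [Theta @ Phi, Z]])                       # coupling block
    B = np.block([[Phi @ Phi + gx * np.eye(n), Z],
                  [Z, Theta @ Theta + gy * np.eye(n)]])    # regularized norms
    vals, vecs = eigh(A, B)            # solves A v = lambda B v, B SPD
    order = np.argsort(vals)[::-1]     # canonical correlations: top eigenvalues
    return vals[order], vecs[:n, order], vecs[n:, order]
```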

7. Application of kCCA to classification problems
Use an indicator vector as the second feature vector: $y = (y_1, \cdots, y_C)$ corresponding to $x$, with $y_c = 1$ if $x$ belongs to class $c$ and $y_c = 0$ otherwise ($C$: the number of classes). The mapping $\theta$ of $y$ is not used.
$C-1$ eigenvectors $f_k = (f_{k1}, \cdots, f_{kn})$ ($1 \le k \le C-1$) corresponding to non-zero eigenvalues are obtained.
The canonical variates $u_k$ ($1 \le k \le C-1$) for a test object $(x, ?)$ are calculated by
$u_k = \sum_{i=1}^n f_{ki}\, \varphi(x_i) \cdot \varphi(x) = \sum_{i=1}^n f_{ki}\, K(x_i, x)$
Classification methods, e.g., the nearest-neighbor method, can then be applied in the canonical variate space $(u_1, \cdots, u_{C-1})$.
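Putting the pieces together, a hedged end-to-end sketch: the indicator Gram matrix is Θ = Y Yᵀ, test variates come from the kernel trick, and a 1-nearest-neighbor rule decides the class. It reuses the `kcca` and `autocorr_kernel` sketches above and assumes integer labels 0, …, C−1.

```python
# Sketch of kCCA-based classification with an indicator second view.
import numpy as np

def classify(X_train, labels, X_test, n_classes, k=2):
    n = len(X_train)
    Phi = np.array([[autocorr_kernel(a, b, k) for b in X_train] for a in X_train])
    Y = np.eye(n_classes)[labels]                 # indicator vectors y
    _, F, _ = kcca(Phi, Y @ Y.T)                  # sketch from slide 6
    F = F[:, :n_classes - 1]                      # C-1 informative directions
    U_train = Phi @ F                             # variates of sample data
    preds = []
    for x in X_test:
        kx = np.array([autocorr_kernel(xi, x, k) for xi in X_train])
        u = kx @ F                                # u_k = sum_i f_ki K(x_i, x)
        nearest = np.argmin(np.linalg.norm(U_train - u, axis=1))  # 1-NN
        preds.append(labels[nearest])
    return np.array(preds)
```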

8. Classification experiment
4-class classification problems with SVM and kCCA.
Original images: 512×512 pixels (256 gray levels) from the VisTex database and the Brodatz album.
Sample and test images: 50×50 pixels, extracted from the original images with random shifts and scaling, rotation, and Gaussian noise (100 images each).
Table 1. Sample and test sets. Fig. 1. Texture images.
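For concreteness, a sketch of how such patches might be generated; the scaling range, rotation range, and noise level are assumptions for illustration, as the slide does not specify them.

```python
# Sketch (assumed details): random 50x50 patches from a 512x512 original,
# with random scaling, rotation and additive Gaussian noise.
import numpy as np
from scipy.ndimage import rotate, zoom

def random_patch(image, rng, size=50, noise_sd=5.0):
    img = zoom(image, rng.uniform(0.8, 1.2), order=1)        # random scaling
    img = rotate(img, rng.uniform(0, 360), reshape=False, order=1)
    l = rng.integers(0, img.shape[0] - size)                 # random shift
    m = rng.integers(0, img.shape[1] - size)
    patch = img[l:l + size, m:m + size].astype(float)
    return patch + rng.normal(0.0, noise_sd, patch.shape)    # Gaussian noise
```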

9. Kernel functions K(xi, xj)
(ⅰ) Linear kernel: $x_i \cdot x_j$
(ⅱ) Gaussian kernel: $\exp(-\mu \|x_i - x_j\|^2)$
(ⅲ) Linear autocorrelation kernel: $r_{x_i} \cdot r_{x_j}$
(ⅳ) Gaussian autocorrelation kernel: $\exp(-\mu |r_{x_i} - r_{x_j}|^2) = \exp(-\mu (r_{x_i} \cdot r_{x_i} + r_{x_j} \cdot r_{x_j} - 2\, r_{x_i} \cdot r_{x_j}))$
Range of correlation lags: $L_1 = M_1 = 10$ (in the 50×50 pixel images).
The simple nearest-neighbor classifier is used for classification with the canonical variates $(u_1, \cdots, u_{C-1})$ in kCCA.
Parameter values are chosen empirically (soft margin: C = 100; Gaussian: μ; regularization: γx, γy).
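The four kernels, collected as one sketch so they can be dropped into the precomputed-kernel SVM or kCCA code above; μ is an illustrative value, and the autocorrelation kernels are the slide-5 sketches.

```python
# Sketch: the four kernel functions (i)-(iv), images as 2-D numpy arrays.
import numpy as np

mu = 1e-4   # illustrative width; the paper tunes mu empirically
kernels = {
    "linear":            lambda a, b: np.sum(a * b),                       # (i)
    "gaussian":          lambda a, b: np.exp(-mu * np.sum((a - b) ** 2)),  # (ii)
    "autocorr_linear":   lambda a, b: autocorr_kernel(a, b, k=2),          # (iii)
    "autocorr_gaussian": lambda a, b: gaussian_autocorr_kernel(a, b, k=2), # (iv)
}
```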

10. Fig. 2. Correct classification rates (CCR (%)) in SVM: (a) Brodatz album, (b) VisTex database.

11. Fig. 3. Correct classification rates (CCR (%)) in kCCA: (a) Brodatz album, (b) VisTex database.

12. Comparison of the performances
Classification experiments on the Brodatz album texture images (D4 vs. D84 and D5 vs. D92) with various filtering methods and vector quantization learning (Randen and Husøy, 1999) yield CCRs of about 90%.
SVM and kCCA with autocorrelation kernels show comparable performance.
Table 2. Highest correct classification rates in SVM and kCCA.

13. Effects of the order of autocorrelation kernels
In face detection experiments (Popovici and Thiran, 2001), the CCR increases as the autocorrelation order increases.
In the present texture classification experiment, however, the lower-order (k = 2, 3, 4) kernels show better performance.
The best order may therefore depend on the objects.

14. As the order $k$ of the autocorrelation kernels increases, the generalization ability and robustness are lost:
$r_{x_i} \cdot r_{x_j} = \sum_{t_1} ( r_{x_i, x_j}(t_1) )^k \to \delta_{i,j}$ (up to normalization) as $k \to \infty$,
and for test data $x$ ($\ne x_i$), $r_{x_i} \cdot r_x \to 0$.
In kCCA this means $\Phi = I$ and $\Theta$ a block matrix, with eigenvectors $f = (p_1, \cdots, p_1, p_2, \cdots, p_2, \cdots, p_C, \cdots, p_C)$ ($f_i = p_c$ if $x_i \in$ class $c$).
For sample data, the canonical variates then lie on a line through the origin corresponding to their class: $u_{x_i} = (r_{x_i} \cdot r_{x_i})\, p_c$, with $p_c = (p_{c,1}, \cdots, p_{c,C-1})$, if $x_i \in$ class $c$.
For test data, however, $u_x \approx 0$.
Modification: use an $l_p$-norm-like function $(r_{x_i} \cdot r_{x_j})^{1/k} = |\sum_{t_1} ( r_{x_i, x_j}(t_1) )^k |^{1/k}$, as in the sketch below.
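The collapse toward Φ = I, and the slower decay of the 1/k-power variant, can be seen numerically. The sketch below uses zero-mean random images as stand-ins for texture patches (an assumption; real textures behave less extremely but show the same trend) and the `autocorr_kernel` sketch from slide 5.

```python
# Sketch: normalized off-diagonals of the Gram matrix vanish as k grows
# (Phi -> I), while the |sum_t r(t)^k|^(1/k) variant decays far more slowly.
import numpy as np

rng = np.random.default_rng(3)
imgs = [rng.standard_normal((50, 50)) for _ in range(4)]   # stand-in patches

for k in (2, 4, 10):
    G = np.array([[autocorr_kernel(a, b, k) for b in imgs] for a in imgs])
    Gmod = np.abs(G) ** (1.0 / k)              # l_p-norm-like modification
    for name, Mat in (("raw", G), ("1/k-power", Gmod)):
        D = np.sqrt(np.outer(np.diag(Mat), np.diag(Mat)))
        off = np.abs(Mat / D - np.eye(len(imgs))).max()
        print(f"k={k:2d} {name:10s} max normalized off-diagonal = {off:.2e}")
```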

15. Fig. 4. Scatter diagrams of the canonical variates $(u_1, u_2)$ and $(u_3, u_1)$ of Test 1 data of the Brodatz album texture images in kCCA: (a) linear kernel (ⅰ); (b) Gaussian kernel (ⅱ); (c) 2nd-order correlation kernel (ⅲ); (d) 3rd-order correlation kernel (ⅲ); (e) 4th-order correlation kernel (ⅲ); (f) 10th-order correlation kernel (ⅲ). Plotted are squares (■) for D4, crosses (×) for D84, circles (●) for D5 and triangles (Δ) for D92. For the high-order kernels, most of the test data fall at $u \approx 0$.

16. Summary
SVM and kCCA with autocorrelation kernels were applied to texture classification.
Their performance competes with conventional feature extraction and learning methods.
The Gaussian autocorrelation kernel of order 2 or 4 gives the highest correct classification rates.
The generalization ability of the autocorrelation kernels decreases as the order of the correlation increases.
