This study presents a method for image denoising using support vector regression in the wavelet domain. It introduces mother and father wavelets and the 2-D discrete wavelet transform, then develops support vector regression theory and algorithms, with attention to the ε-insensitive loss function, kernel methods, LS-SVM regression, and block matrix decompositions. Experimental results and references to related work are also provided.
Wavelet domain image denoising via support vector regression Source: Electronics Letters, 2004, Vol. 40, pp. 1479–1481 Authors: H. Cheng, J.W. Tian, J. Liu and Q.Z. Yu Presented by: C.Y. Gun Date: 4/7/2005
Outline • Abstract • Introduction (Mother Wavelet, Father Wavelet) • Support vector regression • Proposed theory and algorithm • Experimental results and discussion
Introduction Families of Wavelets: • Father wavelet ϕ(t) – generates the scaling functions (low-pass filter) • Mother wavelet ψ(t) – generates the wavelet functions (high-pass filter) • All other members of the wavelet family are scalings (dilations) and translations of either the mother or the father wavelet
Introduction • Father wavelets (low-pass filters)
Introduction • Mother wavelets (high-pass filters)
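For a concrete look at these two functions, here is a minimal sketch using the PyWavelets library (an assumption on my part; the slides name no software), with Daubechies-2 as an arbitrary example wavelet:

```python
import pywt

# Approximate the father (scaling) and mother (wavelet) functions of an
# orthogonal wavelet by cascading its filter bank several levels deep.
wavelet = pywt.Wavelet('db2')           # illustrative choice, not from the paper
phi, psi, x = wavelet.wavefun(level=8)  # phi: father/scaling, psi: mother/wavelet

# The associated low-pass and high-pass decomposition filters:
print('low-pass (father):', wavelet.dec_lo)
print('high-pass (mother):', wavelet.dec_hi)
```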
Introduction • 2-D DWT for an image (figures: the separable transform filters the rows and columns, splitting the image into an approximation subband LL and detail subbands LH, HL, HH; repeating on LL gives further decomposition levels)
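A minimal sketch of the one-level 2-D DWT, again assuming PyWavelets; the input array is arbitrary test data:

```python
import numpy as np
import pywt

image = np.random.rand(256, 256)  # placeholder image

# One-level 2-D DWT: LL holds the coarse approximation, while
# (LH, HL, HH) hold the horizontal, vertical and diagonal details.
LL, (LH, HL, HH) = pywt.dwt2(image, 'db2')

# Perfect reconstruction from the four subbands:
rec = pywt.idwt2((LL, (LH, HL, HH)), 'db2')
print(np.allclose(image, rec))  # True (up to floating-point error)
```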
Support vector regression • Standard linear regression equation: f(x) = ⟨w, x⟩ + b • The linear case is a special case of the nonlinear regression equation f(x) = ⟨w, φ(x)⟩ + b, obtained when the feature map φ is the identity
Support vector regression • Idea: we define a « tube » of radius ε around the regression function (ε ≥ 0) • No error is counted if y lies inside the « tube » or « band »
Support vector regression • We therefore define an ε-insensitive loss function • L1: L(y, f(x)) = max(0, |y − f(x)| − ε) • L2: L(y, f(x)) = max(0, |y − f(x)| − ε)²
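A small sketch of the two loss variants in NumPy (my own illustrative helpers, not code from the paper):

```python
import numpy as np

def eps_insensitive_l1(y, f, eps):
    """L1 epsilon-insensitive loss: zero inside the tube, linear outside."""
    return np.maximum(0.0, np.abs(y - f) - eps)

def eps_insensitive_l2(y, f, eps):
    """L2 variant: zero inside the tube, quadratic outside."""
    return np.maximum(0.0, np.abs(y - f) - eps) ** 2

y = np.array([1.0, 2.0, 3.0])
f = np.array([1.1, 2.6, 2.9])
print(eps_insensitive_l1(y, f, eps=0.2))  # [0.  0.4 0. ]
```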
Support vector regression • Graphical representation
Support vector regression • Slack variables eᵢ are defined for each observation: they measure how far the observation lies outside the ε-tube
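A minimal sketch of ε-SVR on synthetic 1-D data, assuming scikit-learn; the data and hyperparameters are illustrative only:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 100).reshape(-1, 1)
y = np.sin(x).ravel() + 0.1 * rng.standard_normal(100)

# epsilon sets the tube radius; points inside the tube incur no loss
# and do not become support vectors.
model = SVR(kernel='rbf', C=10.0, epsilon=0.1).fit(x, y)
print('support vectors:', len(model.support_))
```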
Support vector regression • Kernel methods: K(x, x′) = ⟨φ(x), φ(x′)⟩, so inner products in feature space can be computed without evaluating φ explicitly
Support vector regression Basic kernels for vectorial data: – Linear kernel: K(x, x′) = ⟨x, x′⟩ (feature space is Q-dimensional if Q is the dim of x; the map is the identity!) – RBF kernel: K(x, x′) = exp(−‖x − x′‖² / (2σ²)) (feature space is infinite-dimensional) – Polynomial kernel of degree two: K(x, x′) = ⟨x, x′⟩² (feature space is d(d+1)/2-dimensional if d is the dim of x)
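Sketches of these three kernels in NumPy (the inputs and σ are arbitrary illustrations):

```python
import numpy as np

def linear_kernel(x, z):
    return x @ z                                   # plain inner product

def rbf_kernel(x, z, sigma=1.0):
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def poly2_kernel(x, z):
    return (x @ z) ** 2                            # homogeneous degree 2

x, z = np.array([1.0, 2.0]), np.array([0.5, -1.0])
print(linear_kernel(x, z), rbf_kernel(x, z), poly2_kernel(x, z))
```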
LS-SVM Regression We define the following optimization problem: min over w, b, e of J(w, e) = ½⟨w, w⟩ + (γ/2) Σᵢ eᵢ², subject to yᵢ = ⟨w, φ(xᵢ)⟩ + b + eᵢ, i = 1, …, N Or, via the Lagrangian: L(w, b, e; α) = J(w, e) − Σᵢ αᵢ (⟨w, φ(xᵢ)⟩ + b + eᵢ − yᵢ)
LS-SVM Regression Eliminating w and e from the optimality conditions, as in ‘Least squares support vector machine classifiers’ [5], yields the linear system [0 1ᵀ; 1 Ω + γ⁻¹I][b; α] = [0; y], where Ωᵢⱼ = K(xᵢ, xⱼ) ….(1)
LS-SVM Regression The resulting LS-SVM model for function estimation is f(x) = Σᵢ αᵢ K(x, xᵢ) + b ….(2)
LS-SVM Regression Solving (1) with A = Ω + γ⁻¹I gives b = (1ᵀA⁻¹y)/(1ᵀA⁻¹1) and α = A⁻¹(y − b·1) ….(3)
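A minimal sketch of LS-SVM regression via the linear system (1), in NumPy; the RBF kernel, data, and hyperparameters are my illustrative choices, not the paper's:

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM linear system (1) for b and alpha."""
    n = len(y)
    # Kernel (Gram) matrix Omega with an RBF kernel.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Omega = np.exp(-sq / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # b, alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    """Evaluate model (2): f(x) = sum_i alpha_i K(x, x_i) + b."""
    sq = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * sigma ** 2)) @ alpha + b

X = np.linspace(0, 1, 50).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + 0.05 * np.random.default_rng(0).standard_normal(50)
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, X[:3]))
```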
Proposed theory and algorithm Block matrix decompositions The main formula we need concerns the inverse of a block matrix M = [A B; C D]
Proposed theory and algorithm M⁻¹ = [A⁻¹ + A⁻¹B S⁻¹C A⁻¹, −A⁻¹B S⁻¹; −S⁻¹C A⁻¹, S⁻¹] where S = D − C A⁻¹B is the Schur complement of A
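A quick numerical check of this identity in NumPy (random blocks, purely illustrative; the identity shifts keep the blocks invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.random((3, 3)) + 3 * np.eye(3), rng.random((3, 2))
C, D = rng.random((2, 3)), rng.random((2, 2)) + 3 * np.eye(2)

Ai = np.linalg.inv(A)
S = D - C @ Ai @ B                      # Schur complement of A
Si = np.linalg.inv(S)

# Assemble the block inverse from the formula above and compare.
M_inv = np.block([[Ai + Ai @ B @ Si @ C @ Ai, -Ai @ B @ Si],
                  [-Si @ C @ Ai,              Si]])
M = np.block([[A, B], [C, D]])
print(np.allclose(M_inv, np.linalg.inv(M)))  # True
```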
Proposed theory and algorithm (Block diagram: Original Image → DWT → detail subbands H, V → SVR → denoised coefficients)
Proposed theory and algorithm where fmis the modified wavelet coefficient, p=0.3×max( f ) . Max( f ) is the maximal value of the wavelet coefficient in that detail subband.
Reference • 1 Mallat, S.G.: ‘A theory for multiresolution signal decomposition: the wavelet representation’, IEEE Trans. Pattern Anal. Mach. Intell., 1989, 11, (7), pp. 674–693 • 2 Donoho, D.L., and Johnstone, I.M.: ‘Ideal spatial adaptation via wavelet shrinkage’, Biometrika, 1994, 81, pp. 425–455 • 3 Chang, S.G., Yu, B., and Vetterli, M.: ‘Adaptive wavelet thresholding for image denoising and compression’, IEEE Trans. Image Process., 2000, 9, pp. 1532–1546 • 4 Vapnik, V.: ‘The nature of statistical learning theory’ (Springer-Verlag, New York, 1995) • 5 Suykens, J.A.K., and Vandewalle, J.: ‘Least squares support vector machine classifiers’, Neural Process. Lett., 1999, 9, (3), pp. 293–300