Image Denoising Using Scale Mixtures of Gaussians in the Wavelet Domain
Portilla, J. (Universidad de Granada); Strela, V. (Drexel University); Wainwright, M.J. (University of California, Berkeley); Simoncelli, E.P. (New York University)
IEEE Transactions on Image Processing
Outline
• Introduction
• Image probability model
  • Gaussian scale mixtures
  • GSM model for wavelet coefficients
  • Prior density for multiplier
• Image denoising
  • Bayes least squares estimator
  • Local Wiener estimate
  • Posterior distribution of the multiplier
• Results
  • Implementation
  • Denoising digital camera images
Introduction
• Survey of image denoising techniques:
• Hard thresholding:
    if (fabs(coef[i]) <= thresh)
        coef[i] = 0.0;
• Soft thresholding:
    if (fabs(coef[i]) <= thresh)
        coef[i] = 0.0;
    else if (coef[i] > 0.0)
        coef[i] = coef[i] - thresh;
    else
        coef[i] = coef[i] + thresh;
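The same two rules in vectorized form, as a minimal NumPy sketch (the names coef and thresh simply mirror the fragments above):

    import numpy as np

    def hard_threshold(coef: np.ndarray, thresh: float) -> np.ndarray:
        """Zero out coefficients whose magnitude is at or below the threshold."""
        return np.where(np.abs(coef) <= thresh, 0.0, coef)

    def soft_threshold(coef: np.ndarray, thresh: float) -> np.ndarray:
        """Shrink coefficient magnitudes toward zero by the threshold amount."""
        return np.sign(coef) * np.maximum(np.abs(coef) - thresh, 0.0)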
Image probability model
• An image is decomposed into oriented subbands at multiple scales by a wavelet transform.
• We assume the coefficients within each local neighborhood are characterized by a Gaussian scale mixture (GSM) model [3].
• The neighborhood may include coefficients from nearby scales and orientations.
[3] D. Andrews and C. Mallows, "Scale mixtures of normal distributions," J. Royal Stat. Soc., vol. 36, pp. 99–, 1974.
-Gaussian scale mixtures
• We denote by xc the center (reference) coefficient.
• We denote by x the vector containing xc and its neighborhood coefficients (from nearby positions, scales, and orientations).
• The vector x is characterized by a Gaussian scale mixture (GSM) model:
  x = √z · u   (equality in distribution)
• u: zero-mean Gaussian vector.
• z: independent positive scalar random variable.
-Gaussian scale mixtures
• The vector x is thus an infinite mixture of Gaussian vectors, whose density is:
  p(x) = ∫ p(x|z) pz(z) dz = ∫ exp( -xᵀ(zCu)⁻¹x / 2 ) / ( (2π)^(N/2) |zCu|^(1/2) ) · pz(z) dz
• Cu: the covariance matrix of u.
• N: the dimensionality of x and u (in our case, the size of the neighborhood).
• Without loss of generality, one can assume E{z} = 1, which implies Cu = Cx.
Still needed: the neighborhood (Cx & N)? p(z)?
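To make the model concrete, here is a small NumPy sketch (illustrative only) that draws samples x = √z·u for an example covariance Cu and approximates the mixture density by summing p(x|z) over a discrete set of z values; the lognormal choice for p(z) and the grid are assumptions, not taken from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    N = 4                                    # neighborhood size (illustrative)
    Cu = np.eye(N) + 0.5 * np.ones((N, N))   # example covariance of u

    # Draw GSM samples: x = sqrt(z) * u, with u ~ N(0, Cu) and z a positive scalar.
    z = rng.lognormal(mean=0.0, sigma=0.5, size=1000)
    u = rng.multivariate_normal(np.zeros(N), Cu, size=1000)
    x = np.sqrt(z)[:, None] * u

    # Approximate p(x) = integral of p(x|z) p(z) dz with a discrete grid of z values.
    z_grid = np.geomspace(1e-2, 1e2, 200)
    log_pz = -np.log(z_grid) ** 2 / (2 * 0.5 ** 2) - np.log(z_grid)   # lognormal, up to a constant
    pz = np.exp(log_pz)
    pz /= pz.sum()                           # treat as a discrete pmf on the grid

    def gaussian_density(x_vec, C):
        """Zero-mean multivariate Gaussian density with covariance C."""
        quad = x_vec @ np.linalg.solve(C, x_vec)
        norm = (2 * np.pi) ** (len(x_vec) / 2) * np.sqrt(np.linalg.det(C))
        return np.exp(-0.5 * quad) / norm

    p_x = sum(gaussian_density(x[0], zk * Cu) * pk for zk, pk in zip(z_grid, pz))
    print("approximate GSM density at x[0]:", p_x)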
-GSM model for wavelet coefficients
• The GSM model can account for both the marginal distribution of wavelet coefficients and the correlation between the amplitudes of neighboring coefficients; the question is how to treat the multipliers across neighborhood boundaries [4, 2].
• One option is a marginal model for the multipliers, treating them as independent variables [33].
• Another is a joint density over the full set of multipliers [2].
[2] M. J. Wainwright, E. P. Simoncelli, and A. S. Willsky, "Random cascades on wavelet trees and their use in modeling and analyzing natural imagery," Applied and Computational Harmonic Analysis, vol. 11, no. 1, pp. 89–123, July 2001.
[4] M. J. Wainwright and E. P. Simoncelli, "Scale mixtures of Gaussians and the statistics of natural images," in Adv. Neural Information Processing Systems, S. A. Solla, T. K. Leen, and K.-R. Müller, Eds., Cambridge, MA, May 2000, vol. 12, pp. 855–861, MIT Press.
[33] V. Strela, "Denoising via block Wiener filtering in wavelet domain," in 3rd European Congress of Mathematics, Barcelona, July 2000, Birkhäuser Verlag.
-GSM model for wavelet coefficients
• An alternative approach is to use the GSM only as a local description.
• The model then implicitly defines a local Markov model, described by the conditional density of a coefficient given its surrounding neighborhood.
• The choice of neighborhood is described in the section "Implementation". Get Neighbors (Cx & N).
-Prior density for multiplier
• The maximum likelihood (ML) approach estimates a nonparametric pz(z) by maximizing the log-likelihood of the observed neighborhood vectors:
  maximize over pz:  Σm log ∫ p(xm|z) pz(z) dz
• m: the index of neighborhoods.
• The solution is computed numerically.
• Advantage: it is easily extended for use with the noisy observations (xm → noisy observation).
• Drawback: heavy computational cost.
-Prior density for multiplier
• Results:
-Prior density for multiplier
• An alternative is a noninformative prior (Jeffreys' prior) [36].
• Jeffreys' prior for the multiplier:
  pz(z) ∝ 1/z
• Advantage: it does not require fitting any parameters to the noisy observation.
• It also gives better denoising performance in the image domain.
Get p(z). Model completed.
[36] G. E. P. Box and C. Tiao, Bayesian Inference in Statistical Analysis, Addison-Wesley, Reading, MA, 1992.
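In practice the prior can be handled on a discrete set of z values; a minimal sketch (the grid size and range are arbitrary illustrative choices, not the authors' implementation):

    import numpy as np

    # Log-spaced grid of multiplier values; endpoints and size are illustrative.
    z_grid = np.geomspace(1e-3, 1e3, 256)

    # Jeffreys-style noninformative prior p(z) proportional to 1/z, normalized as a
    # discrete pmf over the grid so it can be used directly in the numerical integrals.
    prior = 1.0 / z_grid
    prior /= prior.sum()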
Image denoising
• Decompose the image into pyramid subbands at different scales and orientations;
• Denoise each subband;
• Invert the transform, obtaining the denoised image (sketched below).
• We assume the image is corrupted by independent additive Gaussian noise of known covariance.
• The noise and the image content are independent.
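A high-level sketch of this three-step pipeline; the decomposition and reconstruction callables stand in for a steerable-pyramid (or other multi-scale, oriented) transform and are placeholders, not the authors' code:

    from typing import Callable, List
    import numpy as np

    def denoise_image(
        image: np.ndarray,
        decompose: Callable[[np.ndarray], List[np.ndarray]],
        reconstruct: Callable[[List[np.ndarray]], np.ndarray],
        denoise_subband: Callable[[np.ndarray], np.ndarray],
    ) -> np.ndarray:
        """Pyramid-domain denoising: decompose, denoise each subband, reconstruct."""
        pyramid = decompose(image)                        # oriented subbands at several scales
        cleaned = [denoise_subband(band) for band in pyramid]
        return reconstruct(cleaned)                       # invert the transform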
Image denoising
• A vector y corresponding to a neighborhood of N observed coefficients can be expressed as:
  y = x + w = √z·u + w
• Both u and w are zero-mean Gaussian vectors, with covariance matrices Cu and Cw.
• The density of y conditioned on z is a zero-mean Gaussian with covariance Cy|z = zCu + Cw:
  p(y|z) = exp( -yᵀ(zCu + Cw)⁻¹y / 2 ) / ( (2π)^(N/2) |zCu + Cw|^(1/2) )
Image denoising
• Since w is derived from the image-domain noise through the (linear) pyramid transformation, it is easy to compute the noise covariance matrix Cw.
• Taking the expectation of Cy|z over z yields:
  Cy = E{z}·Cu + Cw
• Choosing E{z} = 1 results in:
  Cu = Cy − Cw
• Ensure that Cu is positive semidefinite (by performing an eigenvector decomposition and setting any negative eigenvalues to zero), as sketched below.
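A minimal NumPy sketch of this covariance estimate with the positive semidefinite projection (variable names are illustrative; `neighborhoods` is assumed to hold one observed neighborhood vector per row):

    import numpy as np

    def estimate_signal_cov(neighborhoods: np.ndarray, Cw: np.ndarray) -> np.ndarray:
        """Estimate Cu = Cy - Cw and project it onto the positive semidefinite cone."""
        # Sample covariance of the observed (noisy) neighborhood vectors, assuming E{z} = 1.
        Cy = neighborhoods.T @ neighborhoods / neighborhoods.shape[0]
        Cu = Cy - Cw

        # Eigen-decompose and clip any negative eigenvalues to zero.
        eigvals, eigvecs = np.linalg.eigh(Cu)
        eigvals = np.maximum(eigvals, 0.0)
        return eigvecs @ np.diag(eigvals) @ eigvecs.T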
-Bayes least squares estimator
• For each neighborhood, we wish to estimate xc (the reference coefficient at the center) from y.
• The Bayes least squares (BLS) estimate is just the conditional mean:
  E{xc|y} = ∫ p(z|y) E{xc|y, z} dz     (8)
-Local Wiener estimate E{x|y,z}
• Under the GSM, conditioned on z this is simply a local linear (Wiener) estimate:
  E{x|y, z} = zCu (zCu + Cw)⁻¹ y     (9)
• Writing S for the symmetric square root of the noise covariance (Cw = SSᵀ) and {Q, Λ} for the eigenvector/eigenvalue decomposition of S⁻¹Cu S⁻ᵀ:
  zCu + Cw = SQ (zΛ + I) QᵀSᵀ
• This diagonalization (of zCu + Cw) does not depend on z, so it is computed only once for each subband.
-Local Wiener estimate E{x|y,z}
• We can now simplify (9). Defining M = SQ and v = M⁻¹y:
  E{x|y, z} = zM Λ (zΛ + I)⁻¹ v
• Finally, for the reference coefficient:
  E{xc|y, z} = Σn zmcnλnvn / (zλn + 1)     (12)
• mij: element (i-th row, j-th column) of the matrix M
• λn: the diagonal elements of Λ
• vn: the elements of v
• c: the index of the reference coefficient
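A NumPy sketch of this computation for one neighborhood, under the definitions above (Cu, Cw, and the observed vector y are given; the reference index c and the grid of z values are illustrative choices):

    import numpy as np

    def wiener_given_z(y, Cu, Cw, z_values, c=0):
        """E{x_c | y, z} for each z in z_values, using the one-time diagonalization."""
        # Symmetric square root of Cw, then eigen-decomposition of S^{-1} Cu S^{-T}.
        w_vals, w_vecs = np.linalg.eigh(Cw)
        S = w_vecs @ np.diag(np.sqrt(w_vals)) @ w_vecs.T
        S_inv = np.linalg.inv(S)
        lam, Q = np.linalg.eigh(S_inv @ Cu @ S_inv.T)

        M = S @ Q
        v = np.linalg.solve(M, y)

        # E{x_c | y, z} = sum_n z * M[c, n] * lam[n] * v[n] / (z * lam[n] + 1)
        z = np.asarray(z_values)[:, None]
        return np.sum(z * M[c] * lam * v / (z * lam + 1.0), axis=1)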
-Posterior distribution of the multiplier
• The other component of (8) is the posterior of z given the observation, obtained by Bayes' rule:
  p(z|y) = p(y|z) pz(z) / ∫ p(y|α) pz(α) dα
• where p(y|z) is the zero-mean Gaussian density with covariance zCu + Cw given earlier; using the diagonalization, its exponent simplifies to
  yᵀ(zCu + Cw)⁻¹y = Σn vn² / (zλn + 1)
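Putting the pieces together, a sketch of the numerical BLS estimate for one neighborhood, reusing the diagonalization above with a discrete prior over z (the grid and prior follow the earlier sketches and are assumptions, not the authors' exact implementation):

    import numpy as np

    def bls_estimate(y, Cu, Cw, z_grid, prior, c=0):
        """Numerical E{x_c | y} = sum_k p(z_k | y) E{x_c | y, z_k} over a grid of z."""
        # One-time diagonalization (as in the local Wiener sketch).
        w_vals, w_vecs = np.linalg.eigh(Cw)
        S = w_vecs @ np.diag(np.sqrt(w_vals)) @ w_vecs.T
        S_inv = np.linalg.inv(S)
        lam, Q = np.linalg.eigh(S_inv @ Cu @ S_inv.T)
        M = S @ Q
        v = np.linalg.solve(M, y)

        z = np.asarray(z_grid)[:, None]                     # shape (K, 1)
        denom = z * lam + 1.0                               # shape (K, N)

        # log p(y | z_k) up to a z-independent constant (it cancels in the posterior).
        log_like = -0.5 * np.sum(v ** 2 / denom + np.log(denom), axis=1)
        post = np.exp(log_like - log_like.max()) * prior
        post /= post.sum()                                  # p(z_k | y) on the grid

        # E{x_c | y, z_k} for every grid point, then average under the posterior.
        wiener = np.sum(z * M[c] * lam * v / denom, axis=1)
        return np.sum(post * wiener)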
Results
• Implementation:
• Decompose the image with the steerable pyramid [14]: 5 scales, 8 oriented highpass residual subbands, and one lowpass (non-oriented) residual band.
• Hand-optimized neighborhood structure: a 3 × 3 region surrounding xc, together with the coefficient at the same location and orientation at the next coarser scale [19] (sketched below).
• Tested on a set of 8-bit grayscale images with additive Gaussian white noise at 10 different variances.
[14] E. P. Simoncelli, W. T. Freeman, E. H. Adelson, and D. J. Heeger, "Shiftable multi-scale transforms," IEEE Trans. Information Theory, vol. 38, no. 2, pp. 587–607, March 1992, Special Issue on Wavelets.
[19] R. W. Buccigrossi and E. P. Simoncelli, "Image compression via joint statistical characterization in the wavelet domain," IEEE Trans. Image Proc., vol. 8, no. 12, pp. 1688–1701, December 1999.
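A sketch of how such neighborhood vectors might be gathered from a subband and its parent (coarser-scale) band; the 3 × 3 + parent layout follows the description above, while mapping to the parent by simple index halving with edge clamping is an assumption for illustration only:

    import numpy as np

    def gather_neighborhoods(band: np.ndarray, parent: np.ndarray) -> np.ndarray:
        """Return one row per pixel: the 3x3 spatial neighbors plus the parent coefficient."""
        rows, cols = band.shape
        padded = np.pad(band, 1, mode="reflect")
        vectors = []
        for i in range(rows):
            for j in range(cols):
                spatial = padded[i:i + 3, j:j + 3].ravel()           # 3x3 region (center included)
                par = parent[min(i // 2, parent.shape[0] - 1),
                             min(j // 2, parent.shape[1] - 1)]        # same location, coarser scale
                vectors.append(np.concatenate([spatial, [par]]))
        return np.array(vectors)                                      # shape: (rows*cols, 10)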
Results
[30] Xin Li and Michael T. Orchard, "Spatially adaptive image denoising under overcomplete expansion," in IEEE Int'l Conf on Image Proc, Vancouver, September 2000.
-Denoising digital camera images
• We obtain images from a Canon G1 digital camera (2160×1440 CCD, quantized to 10 bits).
• The noise is strongly dependent on the signal:
-Denoising digital camera images
• In the subband domain, we assume a noise model in which the noise covariance is scaled locally (Cw → αx·Cw):
  y = √z·u + √αx·w
• αx: a secondary multiplier on the local noise variance, with E{αx} = 1, which depends on the pixel variance over a spatial neighborhood (see Appendix C).
• Once the values of αx have been computed, replace (zλn + 1) in (14) and (12) with (zλn + αx), as sketched below.
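As a sketch, the earlier BLS computation changes only in the (zλn + 1) terms; here `alpha` is the local noise-variance multiplier for the current neighborhood (how it is computed from the pixel data follows the paper's Appendix C and is not reproduced here):

    import numpy as np

    def posterior_and_wiener_terms(z_grid, lam, v, M_c, alpha=1.0):
        """Per-z likelihood and Wiener terms with a local noise multiplier alpha.

        With alpha = 1.0 this reduces to the expressions of (12) and (14); for camera
        noise, (z*lam + 1) is replaced by (z*lam + alpha) in both places.
        """
        z = np.asarray(z_grid)[:, None]
        denom = z * lam + alpha
        log_like = -0.5 * np.sum(v ** 2 / denom + np.log(denom), axis=1)
        wiener = np.sum(z * M_c * lam * v / denom, axis=1)
        return log_like, wiener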