Entropy-constrained overcomplete-based coding of natural images André F. de Araujo, Maryam Daneshi, Ryan Peng Stanford University
Outline • Motivation • Overcomplete-based coding: overview • Entropy-constrained overcomplete-based coding • Experimental results • Conclusion • Future work
Motivation (1) • Study of new (and unusual) schemes for image compression • Recently, new methods have been developed using the overcomplete approach • However, they target restricted compression scenarios • They do not fully exploit this approach's characteristics for compression
Motivation (2) Why? Sparsity of the coefficients leads to better overall RD performance
Overcomplete coding: overview (1) • K > N implies: • Bases are not linearly independent • Example: • 8x8 blocks: N = 64 basis functions are needed to span the space of all possible signals • An overcomplete basis could have K = 128 • Two main tasks: • Sparse coding • Dictionary learning
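A minimal dimension sketch of this 8x8-block example, assuming a randomly initialized unit-norm dictionary (the dictionaries in this work are learned, as described later):

```python
import numpy as np

N, K = 64, 128                     # 8x8 blocks: signal dimension N, dictionary size K
rng = np.random.default_rng(0)

D = rng.standard_normal((N, K))    # K > N: the atoms cannot all be linearly independent
D /= np.linalg.norm(D, axis=0)     # unit-norm atoms (columns)

y = rng.standard_normal(N)         # one vectorized 8x8 block
# rank(D) = N < K, so y = D @ x has infinitely many solutions x
print(D.shape, np.linalg.matrix_rank(D))   # (64, 128) 64
```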
Overcomplete coding: overview (2) • Sparse coding (“atom decomposition”) • Compute the representation coefficients x based on the signal y (given) and dictionary D (given) • Overcomplete D: infinitely many solutions, so an approximation is computed • Commonly used algorithms: Matching Pursuits (MP), Orthogonal Matching Pursuits (OMP)
Overcomplete coding: overview (3) Sparse coding (OMP) • Input: Dictionary D, signal y, number of non-zero coefficients (NNZ) (or error target ε) • Output: Coefficient vector x • 1. Set r = y (r: residual) • 2. Project r on every basis of D • 3. Select the basis from D with maximum projection; update x and the residual r • 4. Stop if the target NNZ is reached (or ||r||² < ε). Otherwise, go to 2
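A minimal numpy sketch of this loop, assuming unit-norm atoms; the function name and the eps default are illustrative, not the authors' code:

```python
import numpy as np

def omp(D, y, nnz, eps=1e-10):
    """Greedy OMP sketch: D (N x K, unit-norm columns), signal y, target NNZ."""
    r = y.copy()                              # 1. residual starts as the signal
    support, x = [], np.zeros(D.shape[1])
    while len(support) < nnz and np.sum(r ** 2) >= eps:
        proj = D.T @ r                        # 2. project residual on every atom
        k = int(np.argmax(np.abs(proj)))      # 3. atom with maximum projection
        if k in support:                      # numerically nothing new to add
            break
        support.append(k)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef                     # re-fit all selected coefficients
        r = y - D[:, support] @ coef          # 4. update residual, then loop
    return x
```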
Overcomplete coding: overview (4) • Dictionary learning • Two basic stages (analogy with K-means) • Sparse coding stage: use a pursuit algorithm to compute x (OMP is usually employed) • Dictionary update stage: adopt a particular strategy for updating the dictionary • Convergence issues: as the first stage does not guarantee the best match, the cost can increase and convergence cannot be assured
Overcomplete coding: overview (5) • Dictionary learning • Most relevant algorithms in the literature: K-SVD and MOD • Sparse coding stage is done in the same way • Codebook update stage is different: • MOD: update the entire dictionary using the optimal adjustment for a given coefficient matrix • K-SVD: update each basis one at a time using an SVD formulation; introduces changes in both the dictionary and the coefficients
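A sketch of the MOD update under these definitions, with Y holding the training signals column-wise and X the current coefficient matrix; the small regularizer is an added numerical safeguard, not part of the original formulation:

```python
import numpy as np

def mod_update(Y, X, eps=1e-8):
    """MOD sketch: for fixed coefficients X (K x M) and signals Y (N x M),
    the dictionary minimizing ||Y - D X||_F^2 is D = Y X^T (X X^T)^{-1}."""
    K = X.shape[0]
    D = Y @ X.T @ np.linalg.inv(X @ X.T + eps * np.eye(K))
    norms = np.maximum(np.linalg.norm(D, axis=0), eps)
    return D / norms                          # keep atoms unit-norm
```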
Entropy-const. OC-based coding (1) • We introduce a compression scheme which employs entropy-constrained stages • RD-OMP • Introduced by Gharavi-Alkhansar (ICIP 1998); uses the Lagrangian cost with a variable number of NNZ coefficients to select basis vectors • EC Dictionary Learning • Introduced in this work; uses a framework inspired by ECVQ to select basis vectors
Entropy-const. OC-based coding (2) • RD-OMP – key ideas • Introduction of the Lagrangian cost J = D + λR • Estimation of the rate cost R (λ is fixed) • Stopping criterion / variable number of NNZ coefficients • Once no more improvement in the Lagrangian cost is achieved, the algorithm stops
Entropy-const. OC-based coding (3) RD-OMP • Input: Dictionary D, input signal y • Output: coefficient vector x • 1. For every basis k (from 1 to K), calculate the Lagrangian cost Jk • 2. Pick the coefficient with the smallest Jk • 3. Stop if the Lagrangian cost no longer improves; otherwise go to 1.
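A minimal numpy sketch of this selection rule; atom_bits (a per-atom rate estimate, in bits) is a stand-in for the pmf-based rate model of the previous slide:

```python
import numpy as np

def rd_omp(D, y, lam, atom_bits):
    """RD-OMP sketch: for every candidate atom k, evaluate the Lagrangian cost
    J = ||y - D x||^2 + lam * R and keep adding the best atom while J decreases.
    atom_bits[k] estimates the bits to code atom k's index and coefficient."""
    K = D.shape[1]
    support, x = [], np.zeros(K)
    best_J = float(np.sum(y ** 2))            # Lagrangian cost of coding nothing
    while True:
        best_k = None
        for k in range(K):                    # 1. evaluate every basis k
            if k in support:
                continue
            trial = support + [k]
            coef, *_ = np.linalg.lstsq(D[:, trial], y, rcond=None)
            J = np.sum((y - D[:, trial] @ coef) ** 2) + lam * atom_bits[trial].sum()
            if J < best_J:                    # 2. keep the smallest Lagrangian cost
                best_J, best_k, best_trial, best_coef = J, k, trial, coef
        if best_k is None:                    # 3. no improvement -> stop
            return x
        support = best_trial
        x[:] = 0.0
        x[support] = best_coef
```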
Entropy-const. OC-based coding (4) • EC Dictionary Learning – key ideas • Dictionary update strategy • K-SVD modifies dictionary and coefficients, so a reduction in the Lagrangian cost is not assured • We use MOD, which provides the optimal adjustment assuming fixed coefficients • Introduction of a “Rate cost update” stage • Analogous to the ECVQ algorithm for training data • Two pmfs must be updated: indices and coefficients
Entropy-const. OC-based coding (5) EC-Dictionary Learning • Input: training signals y • Output: Dictionary D • 1. Initialize the dictionary D from the training data • 2. Sparse coding stage: RD-OMP finds the coefficients • 3. Rate cost update stage: pmf updates (indices and coefficients); codeword length update from the updated pmfs • 4. Dictionary update stage: MOD dictionary update • 5. Stop when the Lagrangian cost converges; otherwise go to 2
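A sketch of the overall training loop, reusing the rd_omp and mod_update sketches above; the coefficient pmf update and the Lagrangian-cost stopping test are simplified here (fixed iteration count, fixed coefficient bit budget), so this only illustrates the alternation of the three stages:

```python
import numpy as np

def ec_dictionary_learning(Y, D0, lam, n_iter=20, coef_bits=8.0):
    """EC dictionary learning sketch: alternate RD-OMP coding, a rate-cost
    update (index pmf -> codeword lengths, -log2 p), and a MOD update."""
    D = D0.copy()
    K, M = D.shape[1], Y.shape[1]
    atom_bits = np.full(K, np.log2(K) + coef_bits)    # uniform initial codeword lengths
    for _ in range(n_iter):
        # Sparse coding stage: RD-OMP for every training signal
        X = np.column_stack([rd_omp(D, Y[:, m], lam, atom_bits) for m in range(M)])
        # Rate cost update stage: index-usage pmf and codeword-length update
        counts = np.count_nonzero(X, axis=1) + 1      # +1 avoids log2(0)
        atom_bits = -np.log2(counts / counts.sum()) + coef_bits
        # Dictionary update stage: MOD (optimal D for the fixed coefficients X)
        D = mod_update(Y, X)
    return D
```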
Experiments (Setup) • Rate calculation: optimal codebook (entropy) for each subband • Test images: Lena, Boats, Harbour, Peppers • Training dictionary experiments • Training data: 18 Kodak images downsampled to 128x128 (does not include the images being coded) • Images downsampled to 128x128 due to very high computational complexity (for other experiments, higher resolutions were employed: 512x512, 256x256)
Experiments (Sparse Coding) • Comparison of Sparse coding methods
Experiments (Dict. learning) • Comparison of dictionary learning methods
Experiments (Compression schemes) (1) • Scheme 1: Training and coding on the same image (dictionary is sent) • Scheme 2: Training on a set of natural images and applying the dictionary to other images
Conclusion • Improvement of sparse coding: • RD-OMP • Improvement of dictionary learning • Entropy-constrained overcomplete dictionary learning • Better overall performance compared to standard techniques
Future work • Extension of implementation to higher resolution images • Further investigation of trade-off between K and N • Evaluation against directional transforms • Low complexity implementation of the algorithms