Recovery of Clustered Sparse Signals from Compressive Measurements Chinmay Hegde Richard Baraniuk Volkan Cevher volkan@rice.edu Piotr Indyk
The Digital Universe
• Size: ~300 billion gigabytes generated in 2007
• digital bits > stars in the universe; growing by a factor of 10 every 5 years; > Avogadro's number (6.02×10²³) in 15 years
• Growth fueled by multimedia / multisensor data: audio, images, video, surveillance cameras, sensor nets, …
• In 2007, digital data generated > total storage; by 2011, ½ of the digital universe will have no home
[Source: IDC Whitepaper "The Diverse and Exploding Digital Universe," March 2008]
Challenges • Acquisition • Compression • Processing
Approaches • Do nothing / ignore: be content with where we are… • generalizes well • robust
Approaches • Finite Rate of Innovation • Sketching / Streaming • Compressive Sensing • common thread: SPARSITY [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
Agenda • A short review of compressive sensing • Beyond sparse models • Potential gains via structured sparsity • Block-sparse model • (K,C)-sparse model • Conclusions
Compressive Sensing 101 • Goal: Recover a sparse or compressible signal from measurements • Problem: Random projection not full rank • Solution: Exploit the sparsity/compressibility geometry of acquired signal
Compressive Sensing 101 • Goal: Recover a sparse or compressible signal from measurements • Problem: Random projection not full rank, but satisfies the Restricted Isometry Property (RIP) • Solution: Exploit the sparsity/compressibility geometry of acquired signal • iid Gaussian • iid Bernoulli • …
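The measurement setup above can be sketched in a few lines of NumPy; the dimensions and the rule of thumb for the number of measurements M are illustrative choices, not figures from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 512, 10                       # signal length, sparsity (illustrative)
M = 4 * K * int(np.log2(N // K))     # heuristic M = O(K log(N/K)) measurements

# K-sparse signal: K randomly placed nonzero coordinates
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# iid Gaussian random projection; 1/sqrt(M) scaling gives ~unit-norm columns,
# and such matrices satisfy the RIP with high probability
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                          # M << N compressive measurements
```

The key point is that Φ has far fewer rows than columns, so it cannot be full rank as a map on all of ℝᴺ, yet it approximately preserves distances between K-sparse signals.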
Compressive Sensing 101 • Goal: Recover a sparse or compressible signal from measurements • Problem: Random projection not full rank • Solution: Exploit the model geometry of acquired signal
Basic Signal Models • Sparse signal: only K out of N coordinates nonzero • model: union of K-dimensional subspaces aligned w/ coordinate axes
Basic Signal Models • Sparse signal: only K out of N coordinates nonzero • model: union of K-dimensional subspaces • Compressible signal: sorted coordinates decay rapidly to zero (power-law decay vs. sorted index); well-approximated by a K-sparse signal (simply by thresholding)
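The thresholding step mentioned above is simple enough to state in code; this is a minimal sketch with an illustrative power-law signal (the helper name and decay exponent are assumptions, not from the slides):

```python
import numpy as np

def best_k_term(x, K):
    """Best K-term approximation: zero out all but the K largest-magnitude entries."""
    xk = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-K:]     # indices of the K largest magnitudes
    xk[idx] = x[idx]
    return xk

# Compressible signal: sorted coefficients decaying like i^(-p)
N, K, p = 256, 16, 1.5
x = np.arange(1, N + 1, dtype=float) ** (-p)

xk = best_k_term(x, K)
err = np.linalg.norm(x - xk)             # small: only the power-law tail remains
```

Because the coefficients decay like a power law, the approximation error is just the ℓ₂ norm of the tail, which is tiny relative to the signal itself.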
Recovery Algorithms
• Goal: given y = Φx, recover x
• ℓ1 and convex optimization formulations • basis pursuit, Dantzig selector, Lasso, …
• Greedy algorithms • iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP)
• at their core: iterative sparse approximation
Performance of Recovery
• Using ℓ1 methods, IT, CoSaMP, SP
• Sparse signals • noise-free measurements: exact recovery • noisy measurements: stable recovery
• Compressible signals • recovery as good as K-sparse approximation: ‖x − x̂‖₂ ≤ C₁ ‖x − x_K‖₁/√K + C₂ ‖n‖₂ (CS recovery error ≤ signal K-term approx error + noise)
Sparse Signals • pixels: background-subtracted images • wavelets: natural images • Gabor atoms: chirps/tones
Sparsity as a Model • Sparse/compressible signal model captures simplistic primary structure [figure: sparse image]
Beyond Sparse Models • Sparse/compressible signal model captures simplistic primary structure • Modern compression/processing algorithms capture richer secondary coefficient structure • pixels: background-subtracted images • wavelets: natural images • Gabor atoms: chirps/tones
Sparse Signals • Defn: K-sparse signals live in the union of all K-dim canonical subspaces
Model-Sparse Signals • Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
Model-Sparse Signals • Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces • Structured subspaces • <> fewer subspaces • <> relaxed RIP • <> fewer measurements [Blumensath and Davies]
Model-Sparse Signals • Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces • Structured subspaces • <> increased signal discrimination • <> improved recovery perf. • <> faster recovery
Block-Sparsity • Motivation: sensor networks • Signal model • intra-sensor: sparsity • inter-sensor: common sparse supports • union of subspaces when signals are concatenated [figure: sensors × frequency] *Individual block sizes may vary… [Tropp, Eldar, Mishali; Stojnic, Parvaresh, Hassibi; Baraniuk, VC, Duarte, Hegde]
Block-Sparsity Recovery
• Sampling bound (B = # of blocks)
• Problem-specific solutions
• Mixed ℓ2/ℓ1-norm solutions (ℓ2 within a block, ℓ1 across blocks) [Eldar, Mishali (provable); Stojnic, Parvaresh, Hassibi]
• Greedy solutions: simultaneous orthogonal matching pursuit [Tropp]
• Model-based recovery framework [Baraniuk, VC, Duarte, Hegde]
Standard CS Recovery • Iterative Thresholding [Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; …] • update signal estimate • prune signal estimate (best K-term approx) • update residual
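The three-step loop above can be sketched as iterative hard thresholding (IHT) in plain NumPy; the problem sizes, iteration count, and the spectral rescaling of Φ (which guarantees the residual never grows) are illustrative choices, not details from the slides:

```python
import numpy as np

def hard_threshold(x, K):
    """Prune step: best K-term approximation of x."""
    idx = np.argsort(np.abs(x))[-K:]
    xk = np.zeros_like(x)
    xk[idx] = x[idx]
    return xk

def iht(y, Phi, K, iters=200):
    """Iterative hard thresholding: gradient step, then prune, then repeat."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x + Phi.T @ (y - Phi @ x)   # update signal estimate (uses residual)
        x = hard_threshold(x, K)        # prune signal estimate (best K-term approx)
    return x

# Small demo: K-sparse signal, M < N Gaussian measurements
rng = np.random.default_rng(1)
N, M, K = 256, 100, 5
x0 = np.zeros(N)
x0[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N))
Phi = 0.99 * Phi / np.linalg.norm(Phi, 2)   # spectral norm < 1: monotone descent
y = Phi @ x0
xhat = iht(y, Phi, K)
```

With ‖Φ‖ < 1 the measurement residual is non-increasing across iterations; model-based variants (next slide) simply swap the prune step for a model approximation.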
Model-based CS Recovery • Iterative Model Thresholding [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde] • update signal estimate • prune signal estimate (best K-term model approx) • update residual
Model-based CS Recovery • Provable guarantees [Baraniuk, VC, Duarte, Hegde]: CS recovery error ≤ C₁ · (signal K-term model approx error) + C₂ · noise • update signal estimate • prune signal estimate (best K-term model approx) • update residual
Block-Sparse CS Recovery • Iterative Block Thresholding • within block: ℓ2 energy • across blocks: sort + threshold
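The block-thresholding prune step described above fits in a few lines; the block size, the number of blocks kept, and the toy signal are illustrative (the slides note that real block sizes may vary):

```python
import numpy as np

def block_threshold(x, block_size, num_keep):
    """Keep the num_keep blocks of x with the largest l2 energy, zero the rest."""
    blocks = x.reshape(-1, block_size)
    energy = np.linalg.norm(blocks, axis=1)   # l2 energy within each block
    keep = np.argsort(energy)[-num_keep:]     # sort + threshold across blocks
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(-1)

x = np.array([0.1, 0.2, 5.0, 4.0, 0.0, 0.1, 3.0, 2.5])
xb = block_threshold(x, block_size=2, num_keep=2)
# keeps the two high-energy blocks [5.0, 4.0] and [3.0, 2.5]
```

Dropping this in as the prune step of the iterative loop from the previous slides turns standard IT into block-sparse model recovery.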
Block-Sparse Signal • target • CoSaMP recovery (MSE = 0.723) • block-sparse model recovery (MSE = 0.015) • Blocks are pre-specified.
Clustered Sparsity • (K,C)-sparse signals (1-D): K nonzero coefficients grouped into at most C clusters • Stable recovery • Recovery: w/ model-based framework • model approximation via dynamic programming (recursive / bottom-up)
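To make the model approximation concrete, here is the simplest special case, C = 1: the best K-sparse single-cluster approximation is found by sliding a length-K window and keeping the window with the largest energy (the general C > 1 case on this slide uses dynamic programming instead; the function name and toy signal are illustrative):

```python
import numpy as np

def best_single_cluster(x, K):
    """Best (K, C=1)-sparse approximation: the length-K window of largest energy."""
    energies = np.array([np.sum(x[i:i + K] ** 2)
                         for i in range(len(x) - K + 1)])
    start = int(np.argmax(energies))      # window with the most signal energy
    out = np.zeros_like(x)
    out[start:start + K] = x[start:start + K]
    return out

x = np.array([0.1, 0.0, 2.0, 3.0, 2.5, 0.2, 0.0, 1.0])
xc = best_single_cluster(x, K=3)          # keeps the cluster [2.0, 3.0, 2.5]
```

Swapping this (or its C-cluster dynamic-programming generalization) in as the prune step of the model-based iteration yields clustered-sparse recovery.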
Simulation via Block-Sparsity • Clustered sparse <> approximable by block-sparse* (*if we are willing to pay a 3× sampling penalty) • Proof by (adversarial) construction …
Clustered Sparsity in 2D • Model clustering of significant pixels in space domain using a graphical model (MRF) • Ising model approximation via graph cuts [VC, Duarte, Hegde, Baraniuk] • target • Ising-model recovery • CoSaMP recovery • LP (FPC) recovery
Conclusions • Why CS works: stable embedding for signals with special geometric structure • Sparse signals >> model-sparse signals • Compressible signals >> model-compressible signals • Greedy model-based signal recovery algorithms • upshot: provably fewer measurements, more stable recovery • new model: clustered sparsity