A Single-letter Characterization of Optimal Noisy Compressed Sensing
Dongning Guo, Dror Baron, Shlomo Shamai
Setting • Replace samples by more general measurements based on a few linear projections (inner products): the sparse signal (few non-zeros) is observed through the measurements y = Φx
Signal Model • Signal entry Xn = Bn Un • iid Bn ~ Bernoulli(γ) makes the signal sparse • iid Un ~ PU • Per-entry distribution PX: a Bernoulli(γ) indicator times a multiplier drawn from PU
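A minimal sketch of drawing a signal from this model; the sparsity rate value (0.1) and the choice of a standard Gaussian for PU are illustrative placeholders, not values prescribed by the slides:

```python
import numpy as np

def draw_signal(N, gamma=0.1, rng=None):
    """Draw X_n = B_n * U_n with B_n ~ Bernoulli(gamma) and U_n ~ P_U
    (P_U taken to be standard Gaussian here purely for illustration)."""
    rng = np.random.default_rng() if rng is None else rng
    B = rng.random(N) < gamma           # sparse support indicators
    U = rng.standard_normal(N)          # multipliers drawn from P_U
    return B * U

x = draw_signal(N=1000)
print("fraction of non-zeros:", np.mean(x != 0))
```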
Measurement Noise • Measurement process is typically analog • Analog systems add noise, non-linearities, etc. • Assume Gaussian noise for ease of analysis • Can be generalized to non-Gaussian noise
Noise Model • Noiseless measurements y0 = Φx • Additive Gaussian noise z • Noisy measurements y = y0 + z = Φx + z • Φ has unit-norm columns, so the SNR is the energy of the noiseless measurements relative to the noise energy
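A sketch of the measurement process under these assumptions; the dense i.i.d. Gaussian Φ and the way the noise variance is set from a target SNR are illustrative choices, not the slides' exact conventions:

```python
import numpy as np

def measure(x, M, snr=10.0, rng=None):
    """Noisy measurements y = Phi @ x + z with unit-norm columns in Phi.
    The dense Gaussian Phi and the SNR convention are illustrative."""
    rng = np.random.default_rng() if rng is None else rng
    N = x.size
    Phi = rng.standard_normal((M, N))
    Phi /= np.linalg.norm(Phi, axis=0)                  # unit-norm columns
    y0 = Phi @ x                                        # noiseless measurements
    noise_var = np.mean(y0 ** 2) / snr                  # noise level from target SNR
    y = y0 + np.sqrt(noise_var) * rng.standard_normal(M)
    return Phi, y, noise_var
```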
Allerton 2006 [Sarvotham, Baron, & Baraniuk] • Model the measurement process as a channel: CS measurement plays the role of the channel encoder and CS decoding the role of the channel decoder in the usual source encoder / channel encoder / channel / channel decoder / source decoder chain • Measurements provide information!
Single-Letter Bounds • Theorem [Sarvotham, Baron, & Baraniuk 2006]: for a sparse signal with rate-distortion function R(D), there is a lower bound on the measurement rate needed to achieve distortion D at a given SNR • Numerous single-letter bounds: [Aeron, Zhao, & Saligrama], [Akcakaya & Tarokh], [Rangan, Fletcher, & Goyal], [Gastpar & Reeves], [Wang, Wainwright, & Ramchandran], [Tune, Bhaskaran, & Hanly], …
What Single-letter Characterization? • Ultimately, what can one say about Xn given Y and Φ? • The channel posterior p(Xn | Y, Φ) is a sufficient statistic, but it is very complicated • Want a simple characterization of its quality • Work in the large-system limit: N → ∞ with the measurement rate M/N held fixed
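To see why the exact posterior is complicated, here is a brute-force computation of P(Bn = 1 | y, Φ) for the Bernoulli-Gaussian model; it marginalizes over all 2^N supports, so it is feasible only for tiny N. The helper name and the use of SciPy are mine, not from the slides:

```python
import itertools
import numpy as np
from scipy.stats import multivariate_normal

def support_posteriors(y, Phi, gamma, noise_var):
    """Exact P(B_n = 1 | y, Phi) by summing over all 2^N supports;
    with Gaussian U and Gaussian noise, y given a support is Gaussian."""
    M, N = Phi.shape
    log_w, supports = [], []
    for bits in itertools.product([0, 1], repeat=N):
        s = np.array(bits, dtype=bool)
        cov = noise_var * np.eye(M) + Phi[:, s] @ Phi[:, s].T
        log_like = multivariate_normal(np.zeros(M), cov).logpdf(y)
        log_prior = s.sum() * np.log(gamma) + (N - s.sum()) * np.log(1 - gamma)
        log_w.append(log_like + log_prior)
        supports.append(s)
    w = np.exp(np.array(log_w) - max(log_w))
    w /= w.sum()
    return np.array(supports, dtype=float).T @ w        # one probability per entry
```

Even at N = 20 this already requires about a million covariance factorizations, which is why a simple single-letter characterization is valuable.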
Main Result: Single-letter Characterization • Result 1: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to a scalar Gaussian observation of xn that is easy to compute… • Estimation quality from (Y, Φ) is just as good as from this noisier (degraded) scalar observation; the channel posterior of Xn matches that of the degraded scalar channel
Details • η ∈ (0,1) is the fixed point of a scalar equation involving the SNR and the MMSE of the scalar channel • Take-home point: degraded scalar channel • Non-rigorous owing to replica method w/ symmetry assumption • used in CDMA detection [Tanaka 2002, Guo & Verdu 2005] • Related analysis [Rangan, Fletcher, & Goyal 2009]: MMSE estimate (not posterior) using [Guo & Verdu 2005], extended to several CS algorithms, particularly LASSO
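The slides do not spell out the fixed-point equation, so the sketch below assumes the standard replica/state-evolution form τ² = σ² + (N/M)·mmse(τ²), where mmse(τ²) is the MMSE of estimating one entry from the scalar channel Xn + τ·noise. The posterior-mean formula for the Bernoulli-Gaussian prior is exact; the fixed-point form is an assumption, and under this reading η corresponds to σ²/τ²:

```python
import numpy as np
from scipy.stats import norm

def posterior_mean(y, gamma, tau2):
    """E[X | Y = y] for the scalar channel Y = X + sqrt(tau2)*N(0,1),
    with X = B*U, B ~ Bernoulli(gamma), U ~ N(0,1)."""
    p1 = gamma * norm.pdf(y, scale=np.sqrt(1.0 + tau2))
    p0 = (1.0 - gamma) * norm.pdf(y, scale=np.sqrt(tau2))
    pi = p1 / (p1 + p0)                  # P(B = 1 | y)
    return pi * y / (1.0 + tau2)

def scalar_mmse(gamma, tau2, n=200_000, rng=None):
    """Monte-Carlo estimate of the scalar MMSE at effective noise tau2."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = (rng.random(n) < gamma) * rng.standard_normal(n)
    y = x + np.sqrt(tau2) * rng.standard_normal(n)
    return np.mean((x - posterior_mean(y, gamma, tau2)) ** 2)

def effective_noise(gamma, sigma2, mu, iters=30):
    """Iterate the assumed fixed point tau2 <- sigma2 + mmse(tau2)/mu,
    with mu = M/N the measurement rate."""
    tau2 = sigma2 + gamma / mu           # start from the prior-variance term
    for _ in range(iters):
        tau2 = sigma2 + scalar_mmse(gamma, tau2) / mu
    return tau2
```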
Decoupling Result • Result 2: In the large-system limit, any arbitrary (constant) number L of input elements decouple: their joint posterior factors into the product of the individual scalar posteriors • Take-home point: “interference” from each individual signal entry vanishes
Sparse Measurement Matrices [Baron, Sarvotham, & Baraniuk] • LDPC-like measurement matrix (sparse) • Mostly zeros in Φ; nonzeros ~ P • Each row contains ≈ Nq randomly placed nonzeros • Fast matrix-vector multiplication ⇒ fast encoding / decoding
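A sketch of such a sparse matrix and its fast matrix-vector product; drawing the nonzeros as ±1 is an illustrative stand-in for whatever distribution P the slide refers to:

```python
import numpy as np
from scipy.sparse import csr_matrix

def sparse_measurement_matrix(M, N, q, rng=None):
    """LDPC-like measurement matrix: roughly N*q randomly placed nonzeros
    per row; nonzero values drawn as +/-1 (illustrative choice)."""
    rng = np.random.default_rng() if rng is None else rng
    per_row = max(1, int(round(N * q)))
    rows, cols, vals = [], [], []
    for m in range(M):
        cols.extend(rng.choice(N, size=per_row, replace=False))
        rows.extend([m] * per_row)
        vals.extend(rng.choice([-1.0, 1.0], size=per_row))
    return csr_matrix((vals, (rows, cols)), shape=(M, N))

Phi = sparse_measurement_matrix(M=250, N=500, q=0.02)
y0 = Phi @ np.ones(500)                  # sparse matvec: fast encoding
```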
CS Decoding Using BP [Baron, Sarvotham, & Baraniuk] • Measurement matrix represented by a bipartite graph connecting signal entries x to measurements y • Estimate the input iteratively • Implemented via nonparametric BP [Bickson, Sommer, …]
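Nonparametric BP itself is involved; as a stand-in, here is a bare-bones iterative decoder that alternates a residual step with the scalar posterior-mean denoiser from the earlier sketch. It is not the authors' CS-BP algorithm, just a simplified illustration of iterative estimation on the same signal model:

```python
import numpy as np
# reuses posterior_mean(y, gamma, tau2) from the scalar-channel sketch above

def iterative_decode(y, Phi, gamma, noise_var, iters=30, step=0.5):
    """Simplified iterative decoder: residual step + scalar MMSE denoiser.
    A proper BP/AMP decoder would also track the effective noise tau2."""
    M, N = Phi.shape
    x_hat = np.zeros(N)
    tau2 = noise_var + gamma             # crude, fixed effective-noise guess
    for _ in range(iters):
        r = y - Phi @ x_hat              # measurement residual
        x_hat = posterior_mean(x_hat + step * (Phi.T @ r), gamma, tau2)
    return x_hat
```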
Identical Single-letter Characterization w/ BP • Result 3: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to the same degraded scalar observation as before (identical degradation) • Sparse matrices are just as good • BP is asymptotically optimal!
Decoupling Between Two Input Entries • [Figure: joint density of two decoupled input entries (N = 500, M = 250, sparsity rate 0.1, SNR = 10)]
CS-BP vs Other CS Methods • [Figure: MMSE versus number of measurements M for CS-BP and other CS methods (N = 1000, sparsity rate 0.1, q = 0.02)]
Conclusion • Single-letter characterization of CS • Decoupling • Sparse matrices just as good • Asymptotically optimal CS-BP algorithm