Reconstruction Algorithms for Compressive Sensing II Presenter: 黃乃珊 Advisor: Prof. 吳安宇 Date: 2014/04/08
Schedule • 19:30 @ EEII-225
Outline • Reconstruction Algorithms for Compressive Sensing • Bayesian Compressive Sensing • Iterative Thresholding • Approximate Message Passing • Implementation of Reconstruction Algorithms • Lab1: OMP Simulation • Reference
Recovery Algorithms for Compressive Sensing • Linear Programming • Basis Pursuit (BP) • Greedy Algorithm • Matching Pursuit • Orthogonal Matching Pursuit (OMP) • Stagewise Orthogonal Matching Pursuit (StOMP) • Compressive Sampling Matching Pursuit (CoSaMP) • Subspace Pursuit (SP) • Iterative Thresholding • Iterative Hard Thresholding (IHT) • Iterative Soft Thresholding (IST) • Bayesian Compressive Sensing (BCS) • Approximate Message Passing (AMP)
Compressive Sensing in Mathematics • Sampling matrices should satisfy the restricted isometry property (RIP) • Random Gaussian matrices • Reconstruction solves an underdetermined system • Linear Programming • Orthogonal Matching Pursuit (OMP) (Block diagram: sampling, channel, and reconstruction chain)
Compressive Sensing in Linear Algebra • Reconstruction is composed of two parts: • Localize the nonzero terms • Approximate the nonzero values • Correlation finds the locations of the nonzero terms • Solving a least-squares problem finds the values • Projection (pseudo-inverse): coefficient = pseudo-inverse of the input basis × measurement
Orthogonal Matching Pursuit (OMP) [3] • Use a greedy algorithm to iteratively recover the sparse signal • Procedure: • Initialize the residual and support set • Find the column most correlated with the residual • Take the set union (add one column every iteration) • Solve the least-squares problem on the support • Update the data and residual • Back to step 2, or output [14]
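The procedure above can be sketched in Python/NumPy as follows (a minimal sketch; the function name, variable names, and stopping tolerance are illustrative, not from the slides):

```python
import numpy as np

def omp(Phi, y, k, tol=1e-6):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = Phi @ x."""
    m, n = Phi.shape
    residual = y.copy()          # step 1: initialize residual and support
    support = []
    x_hat = np.zeros(n)
    for _ in range(k):
        # step 2: find the column most correlated with the residual
        idx = int(np.argmax(np.abs(Phi.T @ residual)))
        # step 3: set union (add one column per iteration)
        if idx not in support:
            support.append(idx)
        # step 4: solve the least-squares problem on the current support
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        # step 5: update the residual
        residual = y - Phi[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x_hat[support] = coef
    return x_hat
```

After the least-squares step the residual is orthogonal to the chosen columns, which is why OMP never re-selects an atom.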
Iterative Thresholding [4] • Iterative hard thresholding (IHT) • Iterative soft thresholding (IST) [2]
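A sketch of the two thresholding operators and the IHT iteration x ← H_k(x + Φᵀ(y − Φx)) of [4]; the unit step size assumes ‖Φ‖₂ < 1, as required by the convergence analysis in [4] (names and iteration count here are illustrative):

```python
import numpy as np

def hard_threshold(x, k):
    """H_k: keep the k largest-magnitude entries, zero the rest (IHT)."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def soft_threshold(x, lam):
    """S_lam: shrink every entry toward zero by lam (IST)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def iht(Phi, y, k, iters=100):
    """Iterative hard thresholding: x <- H_k(x + Phi^T (y - Phi x))."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + Phi.T @ (y - Phi @ x), k)
    return x
```

IST replaces H_k with the soft operator and a decreasing threshold schedule; only the nonlinearity differs.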
Compressive Sensing: From Mathematics to Engineering • The Fourier transform was invented in 1812 and published in 1822; not until the FFT was developed in 1965 did the Fourier transform start to change the world • Hardware design is limited by the algorithm • An engineering perspective can make compressive sensing more powerful in practical applications
Message Passing • Messages pass from sender to receiver • Reliable, in-order delivery • Belief propagation (BP) • Sum-product message passing • Calculates distributions for unobserved nodes on a graph • Ex. low-density parity-check (LDPC) codes, turbo codes • Approximate message passing (AMP) [8][9][10]
Approximate Message Passing (AMP) • Builds on iterative soft thresholding (IST) • Approximate message passing (AMP) [8][9][10] • The Onsager reaction term cancels the self-feedback effect • Approximates the sum-product messages for basis pursuit • Fast with good performance, but not suited to all random inputs
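The AMP iteration of [8] adds the Onsager correction (‖x‖₀/m)·zᵗ⁻¹ to the IST residual update. A sketch under stated assumptions: the threshold schedule below (a multiple of the residual's empirical standard deviation) is one common heuristic, not the specific policy of [8], and `alpha` is an illustrative tuning parameter:

```python
import numpy as np

def soft(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def amp(A, y, iters=30, alpha=1.5):
    """AMP for y = A x with a soft-threshold denoiser and Onsager correction."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(iters):
        # heuristic threshold: a multiple of the residual's empirical std
        lam = alpha * np.linalg.norm(z) / np.sqrt(m)
        x_new = soft(x + A.T @ z, lam)
        # Onsager reaction term: (||x||_0 / m) times the previous residual;
        # for the soft threshold, the average derivative is the active fraction
        onsager = (np.count_nonzero(x_new) / m) * z
        z = y - A @ x_new + onsager
        x = x_new
    return x
```

Dropping the `onsager` term recovers plain IST, which converges noticeably more slowly on the same problem.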
Relevance Vector Machine (RVM) • Uses Bayesian inference for regression and probabilistic classification • Support Vector Machine (SVM) • Classification and regression analysis • RVM is faster than SVM but at risk of local minima
Bayesian Compressive Sensing [5][6][7] • Considers CS from a Bayesian perspective • Provides a full posterior density function • Adopts the relevance vector machine (RVM) • Solves the maximum a posteriori (MAP) problem efficiently • Adaptive Compressive Sensing • Adaptively selects projections with the goal of reducing uncertainty • Bayesian Compressive Sensing via Belief Propagation
Compressive Sensing in Engineering • A. Message passing • Sum-product message passing • Ex. low-density parity-check (LDPC) codes • B. Bayesian model • Bayesian learning, a kind of machine learning • C. Adaptive filtering framework • Self-adjusts to optimize the desired signal
Outline • Reconstruction Algorithms for Compressive Sensing • Bayesian Compressive Sensing • Iterative Thresholding • Approximate Message Passing • Implementation of Reconstruction Algorithms • Lab1: OMP Simulation • Reference
Implementation of Reconstruction Algorithms • Choose greedy pursuit rather than linear programming • Optimization-based recovery is more accurate, but its implementation is very complex and time-consuming • Design issues • Matrix multiplication • Matrix inverse • Related works • OMP – ASIC & FPGA • CoSaMP – FPGA • IHT – GPU • AMP – ASIC & FPGA (Figure: processing flow in greedy pursuits, highlighting matrix multiplication and matrix inverse)
OMP with Cholesky Decomposition • [11] is the earliest hardware implementation • The alternative (square-root-free) Cholesky decomposition avoids square-root calculations • Bottleneck • Kernel 1: 655/1645 cycles • Kernel 2 (matrix inversion): 769/1645 cycles [11]
OMP with QR Decomposition • Cholesky's latency grows with increasing dimension • QRD-RLS and the fast inverse square-root algorithm are used in [14] • Columns with low coherence are removed by an empirical threshold to reduce computation time • Tradeoff between MSE and reconstruction cycles (Figures: reconstruction time and normalized MSE)
Outline • Reconstruction Algorithms for Compressive Sensing • Bayesian Compressive Sensing • Iterative Thresholding • Approximate Message Passing • Implementation of Reconstruction Algorithms • Lab1: OMP Simulation • Reference
OMP Simulation • Please design SolveOMP.m • Test the recovery performance of OMP with different numbers of measurements and different sparsity levels
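As a starting point, a Python analogue of the requested experiment (the lab itself calls for a MATLAB SolveOMP.m; the function and parameter names here are illustrative):

```python
import numpy as np

def solve_omp(Phi, y, k):
    """Minimal OMP solver, a Python sketch of the SolveOMP.m the lab asks for."""
    m, n = Phi.shape
    r, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ r))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

def success_rate(n=64, k=4, m=32, trials=50, seed=0):
    """Fraction of random trials where OMP recovers a k-sparse signal
    from m Gaussian measurements of an n-dimensional input."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        Phi = rng.standard_normal((m, n)) / np.sqrt(m)
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        x_hat = solve_omp(Phi, Phi @ x, k)
        hits += np.linalg.norm(x_hat - x) < 1e-6 * max(1.0, np.linalg.norm(x))
    return hits / trials
```

Sweeping m at fixed k (or k at fixed m) and plotting the success rate traces the familiar recovery phase-transition curve.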
Reference [1] E. J. Candès and M. B. Wakin, "An Introduction to Compressive Sampling," IEEE Signal Processing Magazine, vol. 25, no. 2, pp. 21-30, Mar. 2008. [2] G. Pope, "Compressive Sensing: A Summary of Reconstruction Algorithms," Swiss Federal Institute of Technology Zurich. [3] J. A. Tropp and A. C. Gilbert, "Signal Recovery from Random Measurements via Orthogonal Matching Pursuit," IEEE Transactions on Information Theory, vol. 53, no. 12, pp. 4655-4666, Dec. 2007. [4] T. Blumensath and M. E. Davies, "Iterative Hard Thresholding for Compressed Sensing," Applied and Computational Harmonic Analysis, vol. 27, no. 3, pp. 265-274, 2009. [5] S. Ji, Y. Xue, and L. Carin, "Bayesian Compressive Sensing," IEEE Transactions on Signal Processing, vol. 56, no. 6, pp. 2346-2356, Jun. 2008. [6] M. E. Tipping, "Sparse Bayesian Learning and the Relevance Vector Machine," Journal of Machine Learning Research, vol. 1, pp. 211-244, 2001. [7] D. Baron, S. Sarvotham, and R. G. Baraniuk, "Bayesian Compressive Sensing via Belief Propagation," IEEE Transactions on Signal Processing, vol. 58, no. 1, pp. 269-280, 2010. [8] D. L. Donoho, A. Maleki, and A. Montanari, "Message-Passing Algorithms for Compressed Sensing," Proceedings of the National Academy of Sciences, vol. 106, no. 45, 2009. [9] D. L. Donoho, A. Maleki, and A. Montanari, "Message Passing Algorithms for Compressed Sensing: I. Motivation and Construction," IEEE Information Theory Workshop (ITW), Jan. 2010. [10] D. L. Donoho, A. Maleki, and A. Montanari, "Message Passing Algorithms for Compressed Sensing: II. Analysis and Validation," IEEE Information Theory Workshop (ITW), Jan. 2010.
Reference [11] A. Septimus and R. Steinberg, "Compressive Sampling Hardware Reconstruction," IEEE International Symposium on Circuits and Systems (ISCAS), pp. 3316-3319, May-Jun. 2010. [12] L. Bai, P. Maechler, M. Muehlberghuber, and H. Kaeslin, "High-Speed Compressed Sensing Reconstruction on FPGA Using OMP and AMP," 19th IEEE International Conference on Electronics, Circuits and Systems (ICECS), pp. 53-56, Dec. 2012. [13] P. Blache, H. Rabah, and A. Amira, "High Level Prototyping and FPGA Implementation of the Orthogonal Matching Pursuit Algorithm," 11th International Conference on Information Science, Signal Processing and their Applications (ISSPA), pp. 1336-1340, Jul. 2012. [14] J. L. V. M. Stanislaus and T. Mohsenin, "Low-Complexity FPGA Implementation of Compressive Sensing Reconstruction," International Conference on Computing, Networking and Communications (ICNC), pp. 671-675, Jan. 2013.