CS: Compressed Sensing Jialin Peng
Outline • Introduction • Exact/Stable Recovery Conditions • ℓ1-norm based recovery • OMP based recovery • Some related recovery algorithms • Sparse Representation • Applications
Introduction Conventional pipeline: data compression → data storage/transmission → receiving → decompression • high-density sensor • high-speed sampling • …… • A large amount of sampled data will be discarded • A certain minimum number of samples (the Nyquist rate) is required in order to perfectly capture an arbitrary bandlimited signal
Sparse Property • Important classes of signals have naturally sparse representations with respect to fixed bases (i.e., Fourier, Wavelet), or concatenations of such bases. • Audio, images … • Although the images (or their features) are naturally very high dimensional, in many applications images belonging to the same class exhibit degenerate structure. • Low dimensional subspaces, submanifolds • representative samples—sparse representation
The Goal Develop anend-to-end system • Sampling • processing • reconstruction • All operations are performed at a low rate: below the Nyquist-rate of the input (too costly, or even physically impossible) • Relying on structure in the input
Sparse: the simplest choice is the best one • Signals can often be well approximated as a linear combination of just a few elements from a known basis or dictionary. • When this representation is exact, we say that the signal is sparse. Remark: In many cases these high-dimensional signals contain relatively little information compared to their ambient dimension.
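A minimal numpy sketch of this idea (the signal and its frequencies are illustrative choices, not from the slides): a sum of two sinusoids is dense in the time domain, yet exactly 4-sparse in the Fourier basis, so keeping only the 4 largest coefficients reconstructs it almost perfectly.

```python
import numpy as np

# A signal that is a sum of two sinusoids: dense in time,
# but only 4 nonzero coefficients in the Fourier basis.
t = np.linspace(0, 1, 256, endpoint=False)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.cos(2 * np.pi * 12 * t)

X = np.fft.fft(x)
k = 4
small = np.argsort(np.abs(X))[:-k]   # indices of all but the k largest
Xk = X.copy()
Xk[small] = 0                        # keep only the k largest coefficients
xk = np.fft.ifft(Xk).real

rel_err = np.linalg.norm(x - xk) / np.linalg.norm(x)
```

For natural signals the representation is only compressible (coefficients decay fast but are not exactly zero), so the error is small rather than negligible.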
Introduction Compressed-sensing pipeline: modified sensor → reduced data storage → receiving & storage → optimization-based reconstruction • Sparse priors on the signal • Nonuniform sampling • Imaging algorithm: optimization • Alleviated sensor requirements • Reduced data • ……
Introduction The measurement model is y = Ax, where A is the sensing matrix.
Compression Find the most concise representation. Compressed sensing: a sparse or compressible representation • A finite-dimensional signal having a sparse or compressible representation can be recovered from a small set of linear, nonadaptive measurements • How should we design the sensing matrix A to ensure that it preserves the information in the signal x? • How can we recover the original signal x from the measurements y? • Nonlinearity: 1. Unknown nonzero locations result in a nonlinear model: the choice of which dictionary elements are used can change from signal to signal. 2. Nonlinear recovery algorithms are required. A signal is compressible if it is well approximated by a signal with only k nonzero coefficients.
Introduction How can we recover the unknown signal? Exact/Stable Recovery Conditions • Let A be a matrix of size m × n with m ≪ n. • For a k-sparse signal x ∈ ℝⁿ, let y = Ax be the measurement vector. • Our goal is to exactly/stably recover the unknown signal x from the measurements. • The problem is under-determined. • Thanks to sparsity, we can reconstruct the signal via min ‖x‖₀ s.t. Ax = y.
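The setup above can be sketched in a few lines of numpy (the dimensions m = 24, n = 64, k = 3 are illustrative choices): the system is under-determined, yet the unknown carries only k degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 24, 3                  # ambient dimension, measurements, sparsity

# k-sparse signal: k nonzero entries at random positions
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
y = A @ x                                     # m measurements, m << n
```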
Exact/stable recovery conditions • The spark of a given matrix A • Null space property (NSP) of order k • The restricted isometry property Remark: verifying that a general matrix A satisfies any of these properties has combinatorial computational complexity
Exact/stable recovery conditions Restricted Isometry Property • The restricted isometry constant (RIC) δ_k is defined as the smallest constant which satisfies: (1 − δ_k)‖x‖₂² ≤ ‖Ax‖₂² ≤ (1 + δ_k)‖x‖₂² for all k-sparse x. • The restricted orthogonality constant (ROC) θ_{k,k'} is the smallest number such that: |⟨Ax, Ax'⟩| ≤ θ_{k,k'} ‖x‖₂ ‖x'‖₂ for all k-sparse x and k'-sparse x' with disjoint supports.
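The remark about combinatorial complexity can be made concrete: computing δ_k exactly requires checking every k-column submatrix. A brute-force sketch (feasible only for tiny matrices, and assuming the definition above):

```python
import itertools
import numpy as np

def restricted_isometry_constant(A, k):
    """Brute-force delta_k: the smallest delta with
    (1 - delta)||x||^2 <= ||Ax||^2 <= (1 + delta)||x||^2
    for all k-sparse x. Exponential in n: tiny matrices only."""
    n = A.shape[1]
    delta = 0.0
    for S in itertools.combinations(range(n), k):
        G = A[:, list(S)].T @ A[:, list(S)]   # Gram matrix of the k columns
        eigs = np.linalg.eigvalsh(G)          # extreme eigenvalues bound ||Ax||^2/||x||^2
        delta = max(delta, abs(eigs[0] - 1.0), abs(eigs[-1] - 1.0))
    return delta
```

For a matrix with orthonormal columns the constant is 0, since every column submatrix is an exact isometry.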
Exact/stable recovery conditions • Solving the ℓ₀ minimization is NP-hard, so we usually relax it to ℓ₁ or ℓ_p (0 < p < 1) minimization.
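The ℓ₁ relaxation (basis pursuit) is a linear program, so it can be solved with a standard LP solver. A sketch using scipy's `linprog` via the usual splitting x = u − v with u, v ≥ 0 (the test signal below is an illustrative example, not from the slides):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||x||_1  s.t.  Ax = y, posed as a linear program:
    write x = u - v with u, v >= 0 and minimize sum(u) + sum(v)."""
    m, n = A.shape
    c = np.ones(2 * n)                  # objective: sum of u and v entries
    A_eq = np.hstack([A, -A])           # A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
A = rng.standard_normal((25, 50))
x_true = np.zeros(50)
x_true[[4, 21, 33]] = [1.5, -2.0, 0.8]
x_hat = basis_pursuit(A, A @ x_true)    # typically recovers x_true exactly
```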
Exact/stable recovery conditions • For the inaccurate measurement , the stable reconstruction model is
Exact/stable recovery conditions • Some other Exact/Stable Recovery Conditions:
Exact/stable recovery conditions • Baraniuk et al. have proved that for some random matrices, such as • Gaussian, • Bernoulli, • …… we can exactly/stably reconstruct the unknown signal with overwhelmingly high probability.
Exact/stable recovery conditions cf: ℓ₂ minimization
Exact/stable recovery conditions • Some evidence indicates that ℓ_p minimization with 0 < p < 1 can exactly/stably recover the signal with fewer measurements.
Quicklook Interpretation • Dimensionality-reducing projection. • Approximately isometric embedding (the RIP), i.e., pairwise Euclidean distances are nearly preserved in the reduced space
Quicklook Interpretation • The ℓ2 norm penalizes large coefficients heavily, therefore solutions tend to have many smaller coefficients. • In the ℓ1 norm, many small coefficients tend to carry a larger penalty than a few large coefficients.
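This contrast is easy to see numerically (the dimensions below are illustrative): the minimum ℓ2-norm solution of an underdetermined system matches the measurements but spreads energy across essentially all coordinates, which is why ℓ2 minimization fails to find the sparse answer.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 40]] = [1.0, -2.0, 0.5]
y = A @ x_true

# Minimum l2-norm solution of Ax = y via the pseudoinverse:
# feasible, but dense rather than sparse.
x_l2 = np.linalg.pinv(A) @ y
n_big = np.count_nonzero(np.abs(x_l2) > 1e-6)   # typically all 50 entries
```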
Algorithms • ℓ1 minimization algorithms: iterative soft thresholding, iteratively reweighted least squares, … • Greedy algorithms: Orthogonal Matching Pursuit, iterative thresholding • Combinatorial algorithms
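Of the greedy algorithms, Orthogonal Matching Pursuit is the simplest to state: pick the column most correlated with the residual, then re-fit by least squares on the chosen support. A minimal sketch (the test problem is an illustrative example):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit sketch: greedily add the column most
    correlated with the residual, then re-fit the coefficients on the
    chosen support by least squares."""
    n = A.shape[1]
    support = []
    residual = y.astype(float).copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # orthogonal to chosen columns
        if np.linalg.norm(residual) < 1e-12:
            break
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((50, 100))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
x_true = np.zeros(100)
x_true[[7, 23, 61, 90]] = [2.0, -1.0, 1.5, -0.7]
y = A @ x_true
x_hat = omp(A, y, k=4)                  # typically recovers x_true exactly
```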
CS builds upon the fundamental fact that • we can represent many signals using only a few non-zero coefficients in a suitable basis or dictionary. • Nonlinear optimization can then enable recovery of such signals from very few measurements.
Sparse property • The basis for representing the data • From fixed incoherent bases to task-specific (often overcomplete or redundant) dictionaries
MRI Reconstruction MR images are usually sparse in certain transform domains, such as finite difference and wavelet.
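The finite-difference sparsity mentioned above is easy to illustrate (the piecewise-constant signal below is an illustrative stand-in for one row of an idealized MR image): the signal is dense in the pixel domain, but its finite differences are extremely sparse, which is the structure total-variation-based reconstruction exploits.

```python
import numpy as np

# A piecewise-constant signal: dense itself, sparse under finite differences.
x = np.concatenate([np.full(30, 1.0), np.full(40, 3.0), np.full(30, 2.0)])
dx = np.diff(x)   # nonzero only at the two jump locations
```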
Sparse Representation • Consider a family of images representing natural and typical image content: • Such images are very diverse vectors in a very high-dimensional ambient space • Do they occupy the entire space? • No: spatially smooth images occur much more often than highly non-smooth and disorganized images • An ℓ1-norm measure enforces sparsity of the signal/image derivatives. • Sparse representation
Matrix completion algorithms • Recovering an unknown (approximately) low-rank matrix from a sampled set Ω of its entries. • NP-hard: min rank(X) s.t. X_{ij} = M_{ij}, (i,j) ∈ Ω • Convex relaxation: min ‖X‖_* s.t. X_{ij} = M_{ij}, (i,j) ∈ Ω • Unconstrained: min λ‖X‖_* + ½ Σ_{(i,j)∈Ω} (X_{ij} − M_{ij})²
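A simple heuristic in this family is hard-impute: alternate between imposing the observed entries and projecting onto rank-r matrices via a truncated SVD. This is a sketch of that heuristic, not the nuclear-norm convex program above; the function name and the rank-1 test problem are illustrative.

```python
import numpy as np

def complete_low_rank(M_obs, mask, r, iters=300):
    """Hard-impute sketch for matrix completion: alternate between
    imposing the observed entries (mask == 1) and taking the best
    rank-r approximation via a truncated SVD."""
    X = mask * M_obs
    for _ in range(iters):
        X = mask * M_obs + (1 - mask) * X      # keep observed entries
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r]        # best rank-r approximation
    return X

rng = np.random.default_rng(2)
u, v = rng.standard_normal(20), rng.standard_normal(20)
M = np.outer(u, v)                             # rank-1 ground truth
mask = (rng.random((20, 20)) < 0.7).astype(float)  # observe ~70% of entries
X = complete_low_rank(mask * M, mask, r=1)
```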