Enhancing Sparsity by Reweighted L-1 Minimization
Authors: Emmanuel Candes, Michael Wakin, and Stephen Boyd
A review by Jeremy Watt
The Basis pursuit problem
min_x ||x||_1  s.t.  Ax = y
The related Lasso problem
min_x ||Ax - y||_2^2 + λ||x||_1
The Basis pursuit problem
The L-1 norm ||x||_1 measures: cardinality and magnitude (sensitive to outliers, since large coefficients dominate the sum)
The L-0 "norm" ||x||_0 measures: cardinality only
Ideal weightings: compensate for magnitudes
• Say we know supp(x0) and the magnitudes |x0,i|
• Want to recover x0 from y = A x0
• So the ideal weightings are w_i = 1 / |x0,i|
Ideal weightings
• Only the support of x0 can enter the model (entries off the support carry infinite weight)
• Nullifies true magnitudes: each weighted term w_i |x_i| = |x_i| / |x0,i| counts every true nonzero equally
• Any solution must stay feasible w.r.t. the true support
Algorithm for the general problem
• Start from uniform weights; after each weighted solve, update w_i^(t+1) = 1 / (|x_i^(t)| + ε)
• Early iterations may find inaccurate signal estimates, but the largest signal coefficients are likely to be identified as nonzero.
• Once these locations are identified, their influence is downweighted in order to allow more sensitivity for identifying the remaining small but nonzero signal coefficients.
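The iteration above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names are my own, and solving each weighted problem as a linear program via `scipy.optimize.linprog` (using the epigraph trick from the epilogue) is one convenient choice among many.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_min(A, y, w):
    """Solve min_x sum_i w_i |x_i|  s.t.  A x = y  as a linear program.

    Epigraph trick: introduce u with -u <= x <= u and minimize sum_i w_i u_i
    over the stacked variable z = [x; u].
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), w])          # objective acts on u only
    I = np.eye(n)
    # Inequalities:  x - u <= 0  and  -x - u <= 0  (i.e. |x_i| <= u_i)
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    # Equality constraint A x = y touches only the x part of z
    A_eq = np.hstack([A, np.zeros((m, n))])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]

def reweighted_l1(A, y, n_iter=4, eps=0.1):
    """Reweighted L-1: uniform weights first, then w_i = 1/(|x_i| + eps)."""
    n = A.shape[1]
    w = np.ones(n)
    for _ in range(n_iter):
        x = weighted_l1_min(A, y, w)
        w = 1.0 / (np.abs(x) + eps)
    return x
```

The first pass (uniform weights) is exactly basis pursuit; subsequent passes downweight the coefficients already identified as large, as the slide describes.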
Simulated example
• Signal length = 512
• # spikes = 130
• Measurement matrix with indep. normal entries
• ε = 0.1
• 2 iterations of the algorithm performed for perfect recovery
TV Minimization Image Reconstruction
Data: y = sampled Fourier coefficients of the image; F = sampled Fourier matrix
Goal: reconstruct the original image
Leverage: image gradient sparsity (small total variation)
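The reweighted scheme carries over to total variation. A sketch in standard isotropic-TV notation, where D denotes the discrete image gradient operator (my notation, not necessarily the paper's), and y, F, ε are as above:

```latex
% Reweighted TV reconstruction: each gradient magnitude gets its own weight
\min_{x}\ \sum_{i,j} w_{ij}^{(t)} \bigl\| (Dx)_{ij} \bigr\|_2
\quad \text{s.t.}\quad F x = y,
\qquad
w_{ij}^{(t+1)} = \frac{1}{\bigl\| (Dx^{(t)})_{ij} \bigr\|_2 + \epsilon}
```

As in the spike example, large gradients (true edges) are downweighted after the first solve so that small but genuine image features are not penalized away.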
Concluding thoughts on reweighted L-1 minimization
• An attempt to nullify the "magnitude problem" of the L-1 norm (large coefficients act, in some sense, as outliers)
• The same motivation leads to Iteratively Reweighted Least Squares
• Many results superior to standard L-1
• Generalizations to other sparsity problems
• Deeper justification of its efficacy: it is a Majorization-Minimization algorithm for an alternative sparse recovery problem
For more details and experiments see the paper or talk to me!
Epilogue: Majorization-Minimization (MM) justification
Standard epigraph trick
• Replace each |x_i| with a new variable u_i and the constraints -u_i ≤ x_i ≤ u_i
• Smooths the objective (absolute values are removed)
• Adds linear inequality constraints to the model
Epilogue: Majorization-Minimization (MM) justification
In the MM approach:
• Majorize the objective function (use a first-order approximation)
• Form a sub-problem with this objective and the original constraints
• Solve a series of such sub-problems to solve the original problem
In our case the t-th sub-problem takes the reweighted L-1 form
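The derivation can be written out in a few lines. This sketch follows the paper's log-sum framing of the alternative sparse recovery problem, with A, y, ε as used throughout:

```latex
% Alternative sparse recovery problem (log-sum penalty), in epigraph form:
\min_{x,\,u}\ \sum_i \log(u_i + \epsilon)
\quad \text{s.t.}\quad Ax = y,\ \ -u_i \le x_i \le u_i .

% The concave term is majorized by its first-order expansion at u^{(t)}:
\log(u_i + \epsilon) \;\le\; \log\bigl(u_i^{(t)} + \epsilon\bigr)
  + \frac{u_i - u_i^{(t)}}{u_i^{(t)} + \epsilon} .

% Dropping constants, the t-th sub-problem is a weighted L-1 problem:
\min_{x,\,u}\ \sum_i \frac{u_i}{u_i^{(t)} + \epsilon}
\quad \text{s.t.}\quad Ax = y,\ \ -u_i \le x_i \le u_i ,
% i.e. reweighted L-1 with weights w_i^{(t)} = 1 / (|x_i^{(t)}| + \epsilon).
```

Since each majorizer touches the true objective at the current iterate, every sub-problem solve decreases the log-sum objective, which is the MM guarantee.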