Sparse Regression-based Hyperspectral Unmixing
Marian-Daniel Iordache 1,2, José M. Bioucas-Dias 2, Antonio Plaza 1
1 Department of Technology of Computers and Communications, University of Extremadura, Caceres, Spain
2 Instituto de Telecomunicações, Instituto Superior Técnico, Technical University of Lisbon, Lisbon
IGARSS 2011
Hyperspectral imaging concept
Outline
• Linear mixing model
• Spectral unmixing
• Sparse regression-based unmixing
• Sparsity-inducing regularizers (ℓ1, TV, GL)
• Algorithms
• Results
Hyperspectral linear unmixing
Linear mixing model (LMM): incident radiation interacts with only one component (checkerboard-type scenes):
$\mathbf{y} = \mathbf{M}\boldsymbol{\alpha} + \mathbf{n}, \quad \boldsymbol{\alpha} \ge 0, \quad \mathbf{1}^T\boldsymbol{\alpha} = 1$
Estimate: the mixing matrix $\mathbf{M}$ (endmember signatures) and the fractional abundances $\boldsymbol{\alpha}$.
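As a quick illustration of the LMM (a minimal sketch with made-up sizes and signatures, not the talk's data), the following generates one mixed pixel:

import numpy as np

rng = np.random.default_rng(0)
L, p = 224, 3                       # bands, endmembers (illustrative sizes)
M = rng.random((L, p))              # endmember signatures (columns of M)
alpha = np.array([0.6, 0.3, 0.1])   # abundances: alpha >= 0, sums to one
n = 0.01 * rng.standard_normal(L)   # additive noise
y = M @ alpha + n                   # observed mixed spectrum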
Algorithms for SLU: three-step approach
• Dimensionality reduction (identify the subspace spanned by the columns of $\mathbf{M}$)
• Endmember determination (identify the columns of $\mathbf{M}$)
• Inversion (for each pixel, identify the vector of proportions $\boldsymbol{\alpha}$)
Sparse regression merges the last two steps by working with a fixed spectral library.
Sparse regression-based SLU
• Spectral vectors can be expressed as linear combinations of a few pure spectral signatures obtained from a (potentially very large) spectral library $\mathbf{A}$
• Unmixing: given $\mathbf{y}$ and $\mathbf{A}$, find the sparsest solution of $\mathbf{y} = \mathbf{A}\mathbf{x}, \ \mathbf{x} \ge 0$
• Advantage: sidesteps endmember estimation
Sparse regression-based SLU
Problem P0 (library $\mathbf{A} \in \mathbb{R}^{L \times m}$, $m \gg L$: underdetermined system):
$(P_0)\!: \ \min_{\mathbf{x}} \|\mathbf{x}\|_0 \ \text{subject to} \ \|\mathbf{y}-\mathbf{A}\mathbf{x}\|_2 \le \delta, \ \mathbf{x} \ge 0$
Very difficult (NP-hard). Approximations to P0:
• OMP – orthogonal matching pursuit [Pati et al., 1993]
• BP – basis pursuit [Chen et al., 1998]
• BPDN – basis pursuit denoising
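For illustration, a bare-bones OMP in the spirit of the greedy approximation above (a sketch, not the cited authors' implementation; assumes the columns of A have unit norm):

import numpy as np

def omp(A, y, k):
    """Greedily pick k columns of A that best explain y."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))  # most correlated atom
        As = A[:, support]
        xs, *_ = np.linalg.lstsq(As, y, rcond=None)              # least-squares refit on support
        residual = y - As @ xs                                   # update residual
    x = np.zeros(A.shape[1])
    x[support] = xs
    return x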
Convex approximations to P0
CBPDN – constrained basis pursuit denoising:
$\min_{\mathbf{x}} \|\mathbf{x}\|_1 \ \text{subject to} \ \|\mathbf{y}-\mathbf{A}\mathbf{x}\|_2 \le \delta, \ \mathbf{x} \ge 0$
Equivalent (Lagrangian) problem:
$\min_{\mathbf{x}} \tfrac{1}{2}\|\mathbf{y}-\mathbf{A}\mathbf{x}\|_2^2 + \lambda\|\mathbf{x}\|_1 \ \text{subject to} \ \mathbf{x} \ge 0$
Striking result: in given circumstances, related to the coherence among the columns of matrix $\mathbf{A}$, BP(DN) yields the sparsest solution ([Donoho 06], [Candès et al. 06]).
Efficient solvers for CBPDN: SUnSAL, C-SUnSAL [Bioucas-Dias, Figueiredo, 2010]
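SUnSAL itself is an ADMM solver; as a hedged stand-in, here is a simple projected ISTA for the Lagrangian form above (illustrative only, and slower in practice than the cited solvers):

import numpy as np

def nn_bpdn_ista(A, y, lam, n_iter=500):
    """min_x 0.5*||y - A x||^2 + lam*||x||_1  subject to x >= 0."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size: 1 / ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                 # gradient of the quadratic term
        x = np.maximum(x - t * (grad + lam), 0)  # soft-threshold fused with x >= 0 projection
    return x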
Application of CBPDN to SLU
Extensively studied in [Iordache et al., 10, 11]
• Simulated data
  • Six libraries (A1, …, A6)
  • Endmembers randomly selected from the libraries
  • Fractional abundances uniformly distributed over the simplex
• Real data
  • AVIRIS Cuprite
  • Library: calibrated version of USGS (A1)
Hyperspectral libraries
Bad news: hyperspectral libraries exhibit high mutual coherence
Good news: hyperspectral mixtures are sparse ($k \le 5$ very often)
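Mutual coherence is cheap to check for any library; a minimal sketch (the matrix A stands for whatever library is loaded):

import numpy as np

def mutual_coherence(A):
    """Largest absolute correlation between distinct normalized columns of A."""
    An = A / np.linalg.norm(A, axis=0)  # normalize each column
    G = np.abs(An.T @ An)               # Gram matrix of correlations
    np.fill_diagonal(G, 0.0)            # discard self-correlations
    return G.max()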
Reconstruction errors (SNR = 30 dB)
[Figure: reconstruction errors of the tested algorithms, including ISMA [Rogge et al., 2006]]
Real data – AVIRIS Cuprite
Beyond ℓ1 regularization
Rationale: introduce new sparsity-inducing regularizers to counter the sparse-regression limits imposed by the high coherence of hyperspectral libraries.
New regularizers: total variation (TV) and group lasso (GL).
With $\mathbf{X}$ the matrix collecting all vectors of fractions, the criterion combines an ℓ1 regularizer, a TV regularizer, and a GL regularizer:
$\min_{\mathbf{X} \ge 0} \ \tfrac{1}{2}\|\mathbf{A}\mathbf{X}-\mathbf{Y}\|_F^2 + \lambda_{\ell_1}\|\mathbf{X}\|_{1,1} + \lambda_{TV}\,\mathrm{TV}(\mathbf{X}) + \lambda_{GL}\,\mathrm{GL}(\mathbf{X})$
Total variation and group lasso regularizers
$\mathrm{TV}(\mathbf{X}) \equiv \sum_{\{i,j\} \in \varepsilon} \|\mathbf{x}_i - \mathbf{x}_j\|_1$, where $\mathbf{x}_i$ is the vector of fractions at the i-th pixel and $\varepsilon$ is the set of neighboring pixels in each image band: promotes similarity between neighboring fractions.
$\mathrm{GL}(\mathbf{X}) \equiv \sum_{g} \|\mathbf{X}_g\|_F$, summed over predefined groups of atoms of $\mathbf{A}$: promotes groups of atoms of $\mathbf{A}$ (group sparsity).
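To make the two penalties concrete, a short sketch evaluating them on an abundance array (the (atoms, rows, cols) layout and the group structure are illustrative assumptions, not the talk's notation):

import numpy as np

def tv_penalty(X):
    """Anisotropic TV: l1 differences between each pixel and its right/down neighbors."""
    dh = np.abs(X[:, :, 1:] - X[:, :, :-1]).sum()
    dv = np.abs(X[:, 1:, :] - X[:, :-1, :]).sum()
    return dh + dv

def gl_penalty(X, groups):
    """Group lasso: sum of Frobenius norms over predefined groups of atoms."""
    return sum(np.linalg.norm(X[g]) for g in groups)

# usage: X with shape (m_atoms, rows, cols); groups like [[0, 1], [2, 3, 4]]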
GLTV_SUnSAL for hyperspectral unmixing
Criterion: the composite ℓ1 + TV + GL objective above, subject to $\mathbf{X} \ge 0$.
GLTV_SUnSAL algorithm: based on C-SALSA [Afonso et al., 11]. Applies the augmented Lagrangian method and alternating optimization to decompose the initial problem into a sequence of simpler optimizations, as sketched below.
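The variable splitting behind this decomposition can be sketched as follows (consistent with the C-SALSA approach; the exact splitting used in GLTV_SUnSAL may differ), here for the ℓ1 + TV terms, with GL handled by one more split:
$\min \ \tfrac{1}{2}\|\mathbf{Y}-\mathbf{V}_1\|_F^2 + \lambda_{\ell_1}\|\mathbf{V}_2\|_{1,1} + \lambda_{TV}\|\mathbf{V}_4\|_{1,1} + \iota_{+}(\mathbf{V}_5)$
$\text{subject to} \ \mathbf{V}_1 = \mathbf{A}\mathbf{X}, \ \mathbf{V}_2 = \mathbf{X}, \ \mathbf{V}_3 = \mathbf{X}, \ \mathbf{V}_4 = \mathbf{H}\mathbf{V}_3, \ \mathbf{V}_5 = \mathbf{X}$
with $\mathbf{H}$ the horizontal/vertical difference operator. Each term now has a simple proximal step (a residual update, two soft thresholdings, a projection onto the nonnegative orthant), and the method alternates a quadratic solve in $\mathbf{X}$ with these steps plus dual updates.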
GLTV_SUnSAL results: ℓ1 and GL regularizers
Library A2, 2 groups active, MC runs = 20, SNR = ∞
GLTV_SUnSAL (ℓ1): SRE = 5.2 dB
GLTV_SUnSAL (ℓ1 + GL): SRE = 15.4 dB
GLTV_SUnSAL results: ℓ1 and TV regularizers
[Figure: abundance maps for endmember #5 of the library, estimated with ℓ1 and ℓ1 + TV at SNR = 20 dB and SNR = 30 dB]
Real data – AVIRIS Cuprite
Concluding remarks
• Shown that the sparse regression framework has a strong potential for linear hyperspectral unmixing
• Tailored new regression criteria to cope with the high coherence of hyperspectral libraries
• Developed optimization algorithms for the above criteria
• To be done: research dictionary learning techniques