
Easy Does It: User Parameter Free Dense and Sparse Methods for Spectral Estimation


Presentation Transcript


  1. Easy Does It: User Parameter Free Dense and Sparse Methods for Spectral Estimation. Jian Li, Department of Electrical and Computer Engineering, University of Florida, Gainesville, Florida, USA

  2. Spectral Estimation • The goal of spectral estimation is to determine how power distributes over frequency from a finite number of data samples. • Diverse Applications • For example: synthetic aperture radar (SAR) imaging. • Data-Independent Approaches • FFT, Matched Filter, Delay-and-Sum (DAS) • Poor resolution • High sidelobe levels, especially with missing data. [Figure: a SAR imaging example using the FFT.]
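To make the data-independent baseline concrete, here is a minimal FFT (zero-padded periodogram/DAS) sketch in Python; the grid size, test signal, and scaling are illustrative assumptions, not values from the slides.

```python
import numpy as np

def fft_spectrum(y, n_grid=1024):
    """Data-independent spectral estimate: zero-padded FFT (periodogram/DAS).

    y      : complex data samples (1-D array)
    n_grid : number of frequency grid points (zero-padding length)
    Returns the estimated power on a uniform frequency grid.
    """
    n = len(y)
    Y = np.fft.fft(y, n_grid) / n       # matched-filter output at each grid frequency
    return np.abs(Y) ** 2               # power: limited resolution, high sidelobes

# Illustrative example: two closely spaced sinusoids in noise
rng = np.random.default_rng(0)
n = 64
t = np.arange(n)
y = np.exp(2j * np.pi * 0.20 * t) + 0.5 * np.exp(2j * np.pi * 0.23 * t)
y = y + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
power = fft_spectrum(y)
```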

  3. Data-Adaptive Spectral Estimation • Data-Adaptive Approaches • Examples: APES, Capon • Multiple snapshots are needed to form reliable sample covariance matrices – these methods fail for single or few snapshots and for irregularly sampled data • High computational complexity • High resolution • Low sidelobe levels • Recent Development • Iterative Adaptive Approach (IAA) • Applicable to the single-snapshot scenario • High computational complexity • High resolution • Low sidelobe levels • Dense and accurate [Figure: SAR imaging comparison, WFFT vs. IAA.]
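For reference, the Capon estimate mentioned above has the standard form below (L snapshots y_l); it makes the snapshot requirement explicit, since with one or few snapshots the sample covariance matrix is rank deficient and cannot be inverted reliably.

```latex
\hat{P}_{\mathrm{Capon}}(\omega)
  = \frac{1}{\mathbf{a}^{H}(\omega)\,\hat{\mathbf{R}}^{-1}\,\mathbf{a}(\omega)},
\qquad
\hat{\mathbf{R}} = \frac{1}{L}\sum_{l=1}^{L}\mathbf{y}_{l}\,\mathbf{y}_{l}^{H}.
```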

  4. Iterative Adaptive Approach (IAA) • Each iteration of IAA includes two steps (user parameter free): • Estimate the coefficients • Update the covariance matrix estimate (see the sketch below)
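The two steps (the equations on this slide were figures in the original deck) can be sketched as follows; this is a minimal single-snapshot reconstruction based on the published IAA updates, with the grid, initialization, and iteration count as assumptions.

```python
import numpy as np

def iaa(y, A, n_iter=15):
    """Iterative Adaptive Approach (single snapshot), user parameter free.

    y : (N,)  complex data vector
    A : (N,K) steering/dictionary matrix with columns a_k
    Returns s : (K,) complex amplitude estimates on the frequency grid.
    """
    # Matched-filter (DAS) initialization: s_k = a_k^H y / (a_k^H a_k)
    s = (A.conj().T @ y) / np.sum(np.abs(A) ** 2, axis=0)
    for _ in range(n_iter):
        # Covariance update: R = A diag(|s_k|^2) A^H
        R = (A * np.abs(s) ** 2) @ A.conj().T
        R_inv_y = np.linalg.solve(R, y)   # R^{-1} y
        R_inv_A = np.linalg.solve(R, A)   # R^{-1} a_k for every k
        # Coefficient update: s_k = a_k^H R^{-1} y / (a_k^H R^{-1} a_k)
        s = (A.conj() * R_inv_y[:, None]).sum(axis=0) / (A.conj() * R_inv_A).sum(axis=0)
    return s
```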

  5. IAA-R (IAA with Regularization) • The noise effect is taken into account explicitly (see the reconstruction below). • Still user parameter free!
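The regularized covariance was also shown as a figure; a hedged reconstruction of its usual form is below. The key point is that the noise power estimate is updated inside the iterations (e.g., from the fitting residual), so no user-selected loading level is needed.

```latex
\mathbf{R}
  = \mathbf{A}\,\mathrm{diag}\!\left(|\hat{s}_{1}|^{2},\ldots,|\hat{s}_{K}|^{2}\right)\mathbf{A}^{H}
  \;+\; \hat{\sigma}\,\mathbf{I}.
```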

  6. Active Sensing Example • Active sensing (radar, sonar, etc.) • Received signal decomposition

  7. Range-Doppler Imaging: Matched Filter Initialization

  8. Movies Are Nice • Local Quadratic Convergence of IAA Proven.

  9. Radar GMTI Example • The goal of ground moving target indication (GMTI) is to detect slow-moving targets against a stationary background. [Figure: terrain map; yellow or green dots mark moving vehicles.]

  10. STAP • STAP: space-time adaptive processing • Datacube: MN samples for a fixed range bin [Figure: STAP datacube with N antenna elements, M pulses (slow time), and range bins (fast time); after J. Ward '94.]
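To make the angle-Doppler (space-time) structure concrete, here is a minimal space-time steering-vector sketch; the element/pulse counts, normalization, and Kronecker ordering are assumptions for illustration.

```python
import numpy as np

def space_time_steering(n_elements, n_pulses, spatial_freq, doppler_freq):
    """Length M*N space-time steering vector for one angle-Doppler cell.

    spatial_freq : normalized spatial frequency (element-to-element phase / 2*pi)
    doppler_freq : normalized Doppler frequency (pulse-to-pulse phase / 2*pi)
    """
    a = np.exp(2j * np.pi * spatial_freq * np.arange(n_elements))  # spatial steering (N,)
    b = np.exp(2j * np.pi * doppler_freq * np.arange(n_pulses))    # temporal steering (M,)
    return np.kron(b, a)                                           # space-time vector (M*N,)

# One column of the angle-Doppler dictionary used for a fixed range bin
v = space_time_steering(n_elements=8, n_pulses=16, spatial_freq=0.1, doppler_freq=0.25)
```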

  11. Adaptive Processing • Space-Time Adaptive Processor (Guerci et al. ’06)

  12. Angle-Doppler Imaging in STAP • Clutter power distribution over angle-Doppler for a fixed range bin. [Figure: DAS and IAA angle-Doppler images; color scale in dB.]

  13. Target Detection for Fixed Angle (Simulated Ground Truth) • Target angle: 195° • A total of 200 targets with constant power • Average SCNR over range: -18.94 dB • Ground truth denoted by x

  14. Range-Doppler Images [Figure: range-Doppler images (color scale in dB) for GLC (partial knowledge), Ideal (total knowledge), IAA, and Prior (wrong knowledge).]

  15. ROC Curves • Median CFAR algorithm applied to target detection (see the sketch below) • GLC detector: automatic diagonal loading, sample number N = 20 • Prior detector: wrong prior knowledge of the clutter-and-noise covariance matrix
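The slides do not spell the detector out; the sketch below is a generic median-based CFAR test (window sizes and scale factor are illustrative assumptions, not the settings used in these experiments).

```python
import numpy as np

def median_cfar(power, n_ref=16, n_guard=2, scale=5.0):
    """Generic median-CFAR detector sketch for a 1-D power profile.

    Each cell under test is compared against scale * median of the
    surrounding reference cells (guard cells excluded on both sides).
    """
    detections = np.zeros(len(power), dtype=bool)
    for i in range(len(power)):
        left = power[max(0, i - n_guard - n_ref): max(0, i - n_guard)]
        right = power[i + n_guard + 1: i + n_guard + 1 + n_ref]
        ref = np.concatenate([left, right])
        if ref.size > 0 and power[i] > scale * np.median(ref):
            detections[i] = True
    return detections
```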

  16. KASSPER Data Set • Main-beam width: 5° • Target azimuth angles: 190° to 200° (3-D target detection) • A total of 246 targets with varying power • Slow-moving targets and/or weak targets present

  17. ROC Curves (KASSPER Data) • Median CFAR algorithm applied to target detection

  18. Sparse Approaches • Related work: the non-convex ℓ0 sparsity measure is replaced by the ℓ1 norm to yield a convex optimization problem. • LASSO: the least absolute shrinkage and selection operator • BP: basis pursuit, very similar to LASSO • FOCUSS: focal underdetermined system solver • SBL: sparse Bayesian learning • L1-SVD: ℓ1 singular value decomposition, similar to BP • CoSaMP: compressive sampling matching pursuit • Most existing algorithms require large computation times and user parameters that are hard to decide, and their performance is sensitive to the choice of user parameter. • The underlying problem: minimize the sparsity of the coefficient vector such that the data equation y = As is satisfied (see the standard formulations below).
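For concreteness, the two standard convex formulations behind BP and LASSO are written out below; the regularization parameter λ (or the constraint tolerance in noisy BP variants) is exactly the user parameter that is hard to choose.

```latex
\text{(BP)}\quad
  \min_{\mathbf{s}} \|\mathbf{s}\|_{1}
  \ \ \text{s.t.}\ \ \mathbf{y} = \mathbf{A}\mathbf{s},
\qquad\qquad
\text{(LASSO)}\quad
  \min_{\mathbf{s}} \tfrac{1}{2}\|\mathbf{y}-\mathbf{A}\mathbf{s}\|_{2}^{2}
  + \lambda\|\mathbf{s}\|_{1}.
```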

  19. Kragh et al. Approach • Kragh et al. use an optimization transfer technique to obtain an iterative procedure. • A recent paper on SAR imaging states: “…” • This is FOCUSS.

  20. SLIM • Sparse Learning via Iterative Minimization (SLIM) Solves the User Parameter Problem! (Tan, Roberts, Li, and Stoica, 2010) • SLIM Assumes the Following Hierarchical Bayesian Model • SLIM is a MAP Approach (see the reconstruction below)
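The hierarchical model and MAP cost were shown as figures; the hedged reconstruction below follows the published SLIM formulation (Gaussian noise with unknown power η and a sparsity-promoting prior governed by the exponent q).

```latex
\mathbf{y}\mid\mathbf{s},\eta \sim \mathcal{CN}(\mathbf{A}\mathbf{s},\,\eta\mathbf{I}),
\qquad
p(s_{k}) \propto \exp\!\Big(-\tfrac{2}{q}\big(|s_{k}|^{q}-1\big)\Big),
\quad 0 < q \le 1,
```

and the MAP estimate minimizes

```latex
g(\mathbf{s},\eta) = N\log\eta
  + \frac{1}{\eta}\,\|\mathbf{y}-\mathbf{A}\mathbf{s}\|_{2}^{2}
  + \sum_{k}\frac{2}{q}\big(|s_{k}|^{q}-1\big).
```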

  21. SLIM Iterations • SLIM iterates the following steps, starting with DAS (see the sketch below). • Given q, SLIM is user parameter free – easy to use!
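A minimal sketch of these iterations (single snapshot) is given below, based on the published SLIM update equations; the initialization details and iteration count are assumptions.

```python
import numpy as np

def slim(y, A, q=1.0, n_iter=15):
    """SLIM sketch: sparse learning via iterative minimization (single snapshot).

    y : (N,) data, A : (N,K) dictionary, q in (0,1] controls sparsity.
    Given q there are no other user parameters: the noise power eta
    is re-estimated at every iteration.
    """
    n = len(y)
    s = (A.conj().T @ y) / np.sum(np.abs(A) ** 2, axis=0)  # DAS initialization
    eta = np.mean(np.abs(y - A @ s) ** 2)                  # initial noise power
    for _ in range(n_iter):
        p = np.abs(s) ** (2.0 - q)                         # P = diag(|s_k|^{2-q})
        R = (A * p) @ A.conj().T + eta * np.eye(n)         # A P A^H + eta I
        s = p * (A.conj().T @ np.linalg.solve(R, y))       # s_k = p_k a_k^H R^{-1} y
        eta = np.mean(np.abs(y - A @ s) ** 2)              # noise power update
    return s
```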

  22. Regularized Minimization in SLIM • A cyclic approach with majorization-minimization is employed to minimize the cost function. • Conjugate gradient iterations combined with FFTs can be used for an efficient implementation of SLIM. • For a fixed noise variance (i.e., making it a user parameter), SLIM reduces to the FOCUSS/Kragh et al. approach.
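The FFT part of the speedup comes from the fact that, on a uniform frequency grid, the dictionary A is a partial DFT matrix, so products with A and A^H never require forming the matrix explicitly. A minimal sketch under that assumption (the grid convention here is an illustration, not necessarily the one used in the paper):

```python
import numpy as np

# Assumed uniform-grid dictionary: A[n, k] = exp(2j*pi*n*k/K), n = 0..N-1, k = 0..K-1.

def A_mul(s, n_samples):
    """Compute A @ s via a length-K inverse FFT (K = len(s))."""
    return len(s) * np.fft.ifft(s)[:n_samples]

def AH_mul(y, n_grid):
    """Compute A^H @ y via a zero-padded K-point FFT."""
    return np.fft.fft(y, n_grid)

# Consistency check against the explicit matrix
N, K = 32, 128
A = np.exp(2j * np.pi * np.outer(np.arange(N), np.arange(K)) / K)
rng = np.random.default_rng(1)
s = rng.standard_normal(K) + 1j * rng.standard_normal(K)
y = rng.standard_normal(N) + 1j * rng.standard_normal(N)
assert np.allclose(A @ s, A_mul(s, N))
assert np.allclose(A.conj().T @ y, AH_mul(y, K))
```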

  23. FFT for GOTCHA

  24. SLIM for GOTCHA

  25. SLIM for GOTCHA

  26. IAA (Dense) vs. SLIM (Sparse) • IAA is dense; SLIM is sparse. • IAA is more accurate; SLIM's amplitude estimates tend to be biased downward. • IAA has high resolution; SLIM has even higher resolution. • IAA's fast implementation is trickier, especially for non-uniformly sampled data; SLIM is faster and its fast implementation is more straightforward.

  27. Concluding Remarks • We need to devise dense and sparse methods that are user parameter free – easy to use in practice, • And accurate, • And with high resolution, • And computationally efficient.

  28. Thank you!
