Multiscale Analysis for Intensity and Density Estimation Rebecca Willett's MS Defense Thanks to Rob Nowak, Mike Orchard, Don Johnson, Rich Baraniuk, Eric Kolaczyk, and Tycho Hoogland
Why study Poisson Processes? Astrophysics Network analysis Medical Imaging
Multiresolution Analysis Examining data at different resolutions (e.g., seeing the forest, the trees, the leaves, or the dew) yields different information about the structure of the data. Multiresolution analysis is effective because it sees the forest (the overall structure of the data) without losing sight of the trees (data singularities).
Beyond Wavelets Multiresolution analysis is a powerful tool, but what about… image edges? Non-Gaussian noise? Inverse problems? Piecewise polynomial- and platelet-based methods address these issues.
Computational Harmonic Analysis • Define Class of Functions to Model Signal • Piecewise Polynomials • Platelets • Derive basis or other representation • Threshold or prune small coefficients • Demonstrate near-optimality
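As a toy illustration of the "derive a representation, then threshold small coefficients" recipe (not the Poisson/platelet machinery itself), here is a minimal 1-D Haar hard-thresholding sketch; the signal, noise level, and threshold value are made up for the example.

```python
import numpy as np

def haar_decompose(signal):
    """One full 1-D Haar decomposition (length must be a power of two)."""
    coeffs = []
    approx = signal.astype(float)
    while len(approx) > 1:
        pairs = approx.reshape(-1, 2)
        coeffs.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))  # detail coefficients
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)        # coarser approximation
    return approx, coeffs

def haar_reconstruct(approx, coeffs):
    """Invert haar_decompose."""
    for detail in reversed(coeffs):
        parent = approx
        left = (parent + detail) / np.sqrt(2)
        right = (parent - detail) / np.sqrt(2)
        approx = np.empty(2 * len(parent))
        approx[0::2], approx[1::2] = left, right
    return approx

# Piecewise-constant signal in Gaussian noise; keep only large coefficients.
rng = np.random.default_rng(0)
x = np.repeat([2.0, 5.0, 3.0, 7.0], 16) + 0.3 * rng.standard_normal(64)
approx, coeffs = haar_decompose(x)
thresholded = [np.where(np.abs(d) > 0.5, d, 0.0) for d in coeffs]
x_hat = haar_reconstruct(approx, thresholded)
```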
Approximation with Platelets Consider approximating this image:
E.g. Haar analysis Terms = 2068, Params = 2068
Wedgelets [Figure: original image, Haar wavelet partition, and wedgelet partition]
E.g. Haar analysis with wedgelets Terms = 1164, Params = 1164
E.g. Platelets Terms = 510, Params = 774
Platelet Approximation Theory Error decay rates: • Fourier: O(m^{-1/2}) • Wavelets: O(m^{-1}) • Wedgelets: O(m^{-1}) • Platelets: O(m^{-min(α,β)})
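For reference, the rates above describe how the squared error of the best m-term approximation decays; a generic statement of this kind, with notation assumed here (f the true image, f_m its best m-term approximation in the given dictionary, α and β smoothness parameters of the image away from edges and of the edge contours), is:

```latex
% Meaning of the decay rates above (notation assumed, not copied from the slide).
\[
  \| f - f_m \|_2^2 = O\!\bigl(m^{-\min(\alpha,\beta)}\bigr) \quad \text{(platelets)},
  \qquad
  \| f - f_m \|_2^2 = O\!\bigl(m^{-1}\bigr) \quad \text{(wavelets, wedgelets)}.
\]
```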
Maximum Penalized Likelihood Estimation Goal: maximize the penalized likelihood, i.e., the log-likelihood minus a complexity penalty; the MPLE is the maximizer of this objective.
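The objective itself appeared as an image on the original slide; a standard way to write it (notation assumed: θ indexes candidate partition-based fits, pen(θ) is the complexity penalty, γ the penalty weight from the Penalty Parameter slide) is:

```latex
% Sketch of the penalized-likelihood objective; notation is assumed.
\[
  \hat{\theta}_{\mathrm{MPLE}}
  \;=\;
  \arg\max_{\theta \in \Theta}
  \Bigl\{ \log p(x \mid \theta) \;-\; \gamma \, \mathrm{pen}(\theta) \Bigr\}
\]
```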
The Algorithm Given the data, fit each cell of the recursive partition with the best of: • Constant estimate • Wedge estimate • Platelet estimate • Wedged platelet estimate • Inherit from finer scale
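A minimal sketch of the underlying tree-pruning idea, restricted to constant fits on 1-D dyadic intervals with a Poisson log-likelihood; the wedge/platelet fits and the inherit-from-finer-scale step are omitted, and the penalty value is an assumption tied to the penalty parameter discussed on the next slide.

```python
import numpy as np

def poisson_loglik(counts):
    """Log-likelihood of counts under one constant Poisson rate (its MLE value)."""
    total, n = counts.sum(), len(counts)
    rate = max(total / n, 1e-12)
    return total * np.log(rate) - n * rate  # x! terms cancel in comparisons

def prune(counts, penalty):
    """Best penalized log-likelihood and estimate over pruned dyadic partitions."""
    # Option 1: keep this interval as a single constant region (pay one penalty).
    keep_score = poisson_loglik(counts) - penalty
    keep_est = np.full(len(counts), counts.mean())
    if len(counts) == 1:
        return keep_score, keep_est
    # Option 2: split into two dyadic children and recurse.
    half = len(counts) // 2
    ls, le = prune(counts[:half], penalty)
    rs, re = prune(counts[half:], penalty)
    split_score, split_est = ls + rs, np.concatenate([le, re])
    return (keep_score, keep_est) if keep_score >= split_score else (split_score, split_est)

# Example: piecewise-constant intensity observed through Poisson counts.
rng = np.random.default_rng(1)
lam = np.repeat([3.0, 12.0], 32)
x = rng.poisson(lam)
_, lam_hat = prune(x, penalty=2.0)
```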
Penalty Parameter The penalty parameter γ balances fidelity to the data (likelihood) against complexity (penalty): as γ → 0, the estimate approaches the MLE; as γ → ∞, the estimate approaches a constant.
Inverse Problems Goal: estimate μ from observations x ~ Poisson(Pμ). The EM algorithm (Nowak and Kolaczyk, '00) alternates an E-step (expected complete data) with an M-step (multiscale MPLE).
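The update equations were images on the slide; below is a sketch of the classical unpenalized EM iteration for x ~ Poisson(Pμ) (a Richardson-Lucy-style multiplicative update), with the understanding that the Nowak-Kolaczyk algorithm replaces the plain M-step with the multiscale MPLE. The operator P and the data here are hypothetical.

```python
import numpy as np

def poisson_em(x, P, n_iter=100):
    """Unpenalized EM for x ~ Poisson(P @ mu): multiplicative Richardson-Lucy updates."""
    n, m = P.shape
    mu = np.full(m, x.sum() / P.sum())            # positive initial guess
    col_sums = P.sum(axis=0)
    for _ in range(n_iter):
        ratio = x / np.clip(P @ mu, 1e-12, None)  # E-step: data over forward projection
        mu = mu * (P.T @ ratio) / col_sums        # M-step: complete-data MLE in closed form
    return mu

# Hypothetical blur operator and observations.
rng = np.random.default_rng(2)
m = 64
mu_true = np.repeat([2.0, 10.0, 4.0], [20, 24, 20])
P = np.array([[np.exp(-0.5 * ((i - j) / 2.0) ** 2) for j in range(m)] for i in range(m)])
P /= P.sum(axis=0, keepdims=True)
x = rng.poisson(P @ mu_true)
mu_hat = poisson_em(x, P, n_iter=50)
```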
Hellinger Loss • Upper bound for affinity (like squared error) • Relates expected error to Lp approximation bounds
Bound on Hellinger Risk (follows from Li & Barron '99): the bound splits into a KL-distance term (approximation error) and an estimation error term.
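The bound itself did not survive extraction; schematically (constants and the normalization by the number of observations are suppressed, and the notation is assumed) it takes the form below, with the KL term playing the role of approximation error and the penalty term the role of estimation error:

```latex
% Schematic form of the risk bound referenced above; exact constants and
% sample-size scaling are assumptions, not read off the slide.
\[
  \mathbb{E}\!\left[ H^2\!\bigl(f, \hat{f}\bigr) \right]
  \;\lesssim\;
  \min_{\theta \in \Theta}
  \Bigl\{
    \underbrace{\mathrm{KL}\bigl(f \,\|\, f_\theta\bigr)}_{\text{approximation error}}
    \;+\;
    \underbrace{\mathrm{pen}(\theta)}_{\text{estimation error}}
  \Bigr\}
\]
```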
Bounding the KL • We can show a bound on the KL (approximation) term • Recall the platelet approximation result • Choose the optimal d to balance the two terms
Near-optimal Risk • Maximum risk is within a logarithmic factor of the minimum risk • The penalty structure is effective
Conclusions CHA with Piecewise Polynomials or Platelets • Effectively describe Poisson or multinomial data • Strong approximation capabilities • Fast MPLE algorithms for estimation and reconstruction • Near-optimal characteristics
Major Contributions • Risk analysis for piecewise polynomials • Platelet representations and approximation theory • Shift-invariant methods • Fast algorithms for wedgelets and platelets
Future Work • Risk analysis for platelets
Multiscale Likelihood Factorization • Probabilistic analogue to the orthonormal wavelet decomposition • Parameters play the role of wavelet coefficients • Allows the MPLE framework, with penalization based on the complexity of the underlying partition
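To make the analogy concrete, the Poisson likelihood factorizes across scales into coarse-scale Poisson terms and conditional binomial (multinomial, in higher dimensions) splitting terms; one parent-child factor, in notation assumed here rather than copied from the slide, is:

```latex
% One parent-child factor of the multiscale likelihood factorization (1-D case).
% x_L, x_R are the counts in a node's two children; lambda_L, lambda_R their intensities.
\[
  p(x_L, x_R \mid \lambda_L, \lambda_R)
  =
  \underbrace{p\bigl(x_L + x_R \,\bigm|\, \lambda_L + \lambda_R\bigr)}_{\text{Poisson, coarser scale}}
  \cdot
  \underbrace{p\Bigl(x_L \,\Bigm|\, x_L + x_R,\; \rho = \tfrac{\lambda_L}{\lambda_L + \lambda_R}\Bigr)}_{\text{binomial splitting factor}}
\]
```

The splitting parameters ρ at each scale play the role that wavelet coefficients play in an orthonormal decomposition.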
Poisson Processes • Goal: estimate a spatially varying intensity function, λ(i,j), from observations of Poisson random variables x(i,j) with intensities λ(i,j) • The MLE of λ would simply equal x; we will use complexity regularization to yield a smoother estimate.
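A tiny sketch of the point, with a made-up intensity field: the raw counts are the MLE of λ but are very noisy, which is what motivates regularization.

```python
import numpy as np

# The unregularized MLE of a Poisson intensity field is just the observed counts,
# whose per-pixel relative error scales like 1/sqrt(lambda).
rng = np.random.default_rng(3)
lam = np.full((64, 64), 5.0)   # hypothetical (here constant) intensity field
x = rng.poisson(lam)           # observed counts = pixelwise MLE of lam
print("mean absolute error of the MLE:", np.abs(x - lam).mean())
```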
Complexity Regularization Balances an accurate model against a parsimonious model: a penalty for each constant region results in fewer splits; a bigger penalty for each polynomial or platelet region (more degrees of freedom) makes it more efficient to use a constant where one is likely to suffice.