
Representation and Compression of Multi-Dimensional Piecewise Functions


Presentation Transcript


1. Representation and Compression of Multi-Dimensional Piecewise Functions
Dror Baron, Signal Processing and Systems (SP&S) Seminar, June 2009
Joint work with: Venkat Chandrasekaran, Michael Wakin, Richard Baraniuk

2. The Challenge of Multi-D Horizon Functions
• Signals have edges: images (2D), video (3D), light field imaging (4D, 5D)
• Horizon class model: multidimensional, with discontinuities separating smooth areas
• Main challenge: sparse representation
• Related applications: approximation, compression, denoising, classification, segmentation, …
(Figure: example horizon functions for N = 2 and N = 3)

3. Existing tool: 1D Wavelets
• Advantages for 1D signals: efficient filter bank implementation, multiresolution framework, sparse representation for smooth signals
• Success motivates application to 2D, but…

4. 2D Signal Representations
• Challenge: geometry, i.e., discontinuities along 1D contours
• Separable 2D wavelets (squares) fail to capture geometric structure
• Response:
  • tight frames: curvelets [Candès & Donoho], contourlets [Do & Vetterli], bandelets [Mallat]
  • geometric tilings: wedgelets [Donoho], wedgeprints [Wakin et al.]

5. Wedgelet Dictionary [Donoho]
• Piecewise linear, multiscale representation, with each atom supported over a square dyadic block
• Tree-structured approximation
• Intended for C^2 discontinuities
(Figure: wedgelet decomposition)

6. Non-Separable Representations have Potential to be Sparse
(Figure: separable wavelets vs. a non-separable geometric tiling)

7. Signal Representations in Higher Dimensions
• Failure of separable wavelets is more pronounced in N > 2 dimensions
• Similar problems exist: smooth regions separated by discontinuities, and the discontinuities are often smooth functions in N-1 dimensions
• Shortcomings of existing work: not yet extended to higher dimensions; intended for efficient (sparse) representation of C^2 discontinuities

8. Goals
• Develop a representation for higher-dimensional data containing discontinuities: a smooth N-dimensional function with an (N-1)-dimensional smooth discontinuity
• Optimal rate-distortion (RD) performance: metric entropy gives the order of the RD function
• Flow of research: from N = 2 dimensions with C^2-smooth discontinuities, to N ≥ 2 dimensions with arbitrary smoothness

9. Piecewise Constant Horizon Functions [Donoho]
• f: binary function in N dimensions
• b: C^K smooth (N-1)-dimensional horizon/boundary discontinuity
• Let x ∈ [0,1]^N and y = (x_1, …, x_{N-1}) ∈ [0,1]^{N-1} (the defining formula is written out below)
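The slide's defining formula does not survive in the transcript; a minimal write-up of the standard piecewise constant horizon model (the sign convention at the boundary is assumed) is:

```latex
f(x) \;=\;
\begin{cases}
1, & x_N \ge b(x_1,\ldots,x_{N-1}),\\
0, & x_N <   b(x_1,\ldots,x_{N-1}),
\end{cases}
\qquad x \in [0,1]^N,\quad b \in C^K\big([0,1]^{N-1}\big).
```

In words: f is 1 above the horizon b and 0 below it, so all of the complexity of f lives in the (N-1)-dimensional boundary.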

10. Example Horizon Class Functions
(Figure: examples for N = 2 and N = 3)

11. Compression Problem
• Approximate f with R bits
• Squared L2 error metric (energy)
• Need the optimal tradeoff between rate R and L2 distortion (see the expression below)
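In symbols, writing \hat f_R for an R-bit approximation of f (this notation is assumed, not taken from the slide), the distortion being traded off against the rate R is:

```latex
D(R) \;=\; \big\| f - \hat f_R \big\|_{L_2}^2
\;=\; \int_{[0,1]^N} \big( f(x) - \hat f_R(x) \big)^2 \, dx .
```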

12. Compression via Implicit Approximation
• Edge detection: estimate the horizon discontinuity b, then encode it using (N-1)-dimensional wavelets [Cohen et al.]
• Implicitly approximate f from the encoded b
• Theorem [Kolmogorov & Tihomirov; Clements]: metric entropy (O(·) lower bound, L1 distortion) for a C^K smooth (N-1)-D function; see the rate below
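The rate itself is not reproduced in the transcript; the standard form of the Kolmogorov & Tihomirov metric-entropy result for this class, which is presumably what the slide displays, is that with R bits the best achievable L1 distortion for a C^K smooth function of N-1 variables scales as

```latex
D_b(R) \;\sim\; \left(\frac{1}{R}\right)^{\!K/(N-1)} .
```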

13. Metric Entropy for Horizon Functions
• Problems with edge detection: edge detection is often impractical; we want to approximate f (not b); we require a solution that provides an estimate in N dimensions without explicit knowledge of b
• Theorem: metric entropy for an N-D horizon function f with a C^K smooth (N-1)-D discontinuity (see below)
• Converse result: our algorithms achieve this RD performance
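Again the displayed rate is missing from the transcript; consistent with the boundary result above (and with the wedgelet case N = 2, K = 2, where the optimal decay is 1/R^2), the horizon-function rate referred to here should take the form

```latex
D_f(R) \;=\; \big\| f - \hat f_R \big\|_{L_2}^2 \;\sim\; \left(\frac{1}{R}\right)^{\!K/(N-1)} ,
```

i.e., the squared-L2 error of approximating the binary function f decays at the same order as the L1 error of approximating its boundary b, because the squared error equals the measure of the region where f and its approximation disagree.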

14. Motivation for Solution: Taylor's Theorem
• For a C^K function b in N-1 dimensions, Taylor's theorem gives a local polynomial expansion in terms of the derivatives of b (sketched below)
• Key idea: order-(K-1) polynomial approximation on small regions
• Challenge: organize a tractable discrete dictionary for piecewise polynomial approximation
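A sketch of the expansion being invoked (multi-index notation assumed; the slide's exact form is not in the transcript): around a point y_0,

```latex
b(y) \;=\; \sum_{|\alpha| \le K-1} \frac{\partial^{\alpha} b(y_0)}{\alpha!}\,\big(y - y_0\big)^{\alpha}
\;+\; O\!\big(\|y - y_0\|^{K}\big),
```

so on a region of sidelength 2^{-j} an order-(K-1) polynomial can match b to within O(2^{-Kj}).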

15. Surflets: Piecewise Polynomial Approximations on Dyadic Hypercubes
• Surflet at scale j: an N-dimensional atom defined on a hypercube X_j of size 2^{-j} × 2^{-j} × … × 2^{-j}; a horizon function whose discontinuity is an order K-1 polynomial ("surface"-let)
• Tile surflets to form a multiscale approximation to f
(Figure: 2D surflets; the wedgelet is the K = 2 case, shown alongside K = 3 and K = 4 examples)
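A minimal sketch of a single 2D surflet atom as a discrete image patch, assuming the "one above / zero below a polynomial boundary" convention from the horizon definition (the function name and sampling details are illustrative, not from the talk):

```python
import numpy as np

def surflet_patch(coeffs, side=32):
    """Binary 2D surflet atom on a dyadic square, sampled at side x side pixels.

    coeffs: polynomial coefficients (c_0, ..., c_{K-1}) of the boundary
            b(y) = c_0 + c_1*y + ... + c_{K-1}*y**(K-1), with y in [0, 1).
    Returns a 0/1 array that is 1 where x_2 >= b(x_1).
    """
    y = (np.arange(side) + 0.5) / side           # horizontal sample positions x_1
    b = np.polyval(coeffs[::-1], y)              # boundary height b(x_1)
    x2 = (np.arange(side) + 0.5) / side          # vertical sample positions x_2
    return (x2[:, None] >= b[None, :]).astype(np.uint8)

# K = 2 (a wedgelet: linear boundary) versus K = 3 (quadratic boundary)
wedge = surflet_patch([0.3, 0.5])        # b(y) = 0.3 + 0.5*y
quad  = surflet_patch([0.2, 0.1, 0.8])   # b(y) = 0.2 + 0.1*y + 0.8*y**2
```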

16. 3D Surflets
(Figure: 3D surflet examples for K = 2 and K = 3)

17. Discrete Surflet Dictionary
• Describe each surflet by its polynomial coefficients
(Figure: example 2D surflets, wedgelet K = 2 plus K = 3 and K = 4, and 3D surflets, K = 2 and K = 3)

18. Quantization
• Challenge: with naïve quantization of the coefficients, the dictionary size blows up with K and N
• Surflet coefficients approximate Taylor coefficients
• Higher-order coefficients can be quantized with less precision while producing the same order of error for all coefficients
• Response: for an order-l coefficient, use a step-size on the order of 2^{-(K-l)j} (so ~2^{-Kj} for the constant term, with coarser steps such as ~2^{-(K-2)j} for order-2 coefficients; the reasoning is sketched below)
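A back-of-the-envelope check of that step-size choice, assuming a dyadic block of sidelength 2^{-j} (this is a sketch of the rationale, not the slide's own derivation): an order-l coefficient multiplies a monomial of size about 2^{-lj} on the block, so quantizing it with step

```latex
\Delta_{\ell,j} \;\sim\; 2^{-(K-\ell)j}
\qquad\Longrightarrow\qquad
\underbrace{\Delta_{\ell,j}}_{\text{coefficient error}} \cdot \underbrace{2^{-\ell j}}_{\text{monomial size}}
\;=\; O\!\big(2^{-Kj}\big),
```

which matches the O(2^{-Kj}) Taylor remainder on the block, so every coefficient contributes error of the same order while the number of quantization levels stays manageable.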

19. Approximation without Edge Detection
• "Taylor surflets": obtained by quantizing the derivatives of b; requires knowledge/estimation of b
• "L2-best surflets": obtained by searching the dictionary for the best fit; requires no explicit knowledge of b; fast search algorithm via manifolds
• Theorem: Taylor and L2-best surflets have the same asymptotic performance
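A brute-force illustration of the "L2-best" idea on one dyadic block (the fast manifold-based search mentioned on the slide is not shown here; names are illustrative, and surflet_patch is the sketch from above):

```python
import numpy as np

def best_surflet(block, candidate_coeffs):
    """Exhaustively pick the dictionary surflet closest to `block` in squared L2 error.

    block: 2D array of pixel values on one dyadic square.
    candidate_coeffs: iterable of quantized coefficient tuples from the discrete dictionary.
    Returns (best_coeffs, best_error).
    """
    best, best_err = None, np.inf
    for coeffs in candidate_coeffs:
        atom = surflet_patch(coeffs, side=block.shape[0])   # see the earlier sketch
        diff = np.asarray(block, dtype=float) - atom
        err = float(np.sum(diff ** 2))                      # squared L2 error on the block
        if err < best_err:
            best, best_err = coeffs, err
    return best, best_err
```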

20. Tree-structured Surflet Approximation
• Arrange surflets on a 2^N-tree: each node is either a leaf or has 2^N children, and every node is labeled with a surflet
• Leaf nodes provide the approximation
• Interior nodes are useful for predictive coding
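One way to picture the data structure (an illustrative sketch, not the authors' implementation; in N dimensions each interior node splits its hypercube into 2**N dyadic children):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SurfletNode:
    coeffs: Tuple[float, ...]        # quantized surflet coefficients for this dyadic hypercube
    children: List["SurfletNode"] = field(default_factory=list)  # empty for a leaf, else 2**N children

    def is_leaf(self) -> bool:
        return not self.children
```

The leaves tile the domain and carry the actual approximation; interior-node surflets exist so that a child's coefficients can later be predicted from its parent's.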

21. Tree-structured Surflet Encoder
• A surflet leaf encoder achieves near-optimal RD performance
• Top-down predictive encoder: code all nodes in the surflet tree; use parent surflets to predict children; a constant number of bits per surflet regardless of scale; layered, with a coarse-scale approximation in the early bits
• Theorem: the top-down predictive encoder achieves the optimal RD performance (the metric-entropy rate above)

22. Discretization
• Signals are often acquired discretely (pixels/voxels)
• Pixelization artifacts appear at fine scales
• Approach to discrete data: discretize the continuous surflet dictionary; use the regular dictionary at coarse scales and a smaller dictionary at fine scales
• Theorem: the predictive encoder achieves the same RD performance at low rate with the discretized dictionary

23. Numerical Example
• N = 2, K = 3
• 1024 × 1024 pixels
• Scale-adaptive dictionaries
• Wedgelets: 482 bits, 29.9 dB; Surflets: 275 bits, 30.2 dB

24. RD Results
• Dictionary 1: fixed-scale wedgelets
• Dictionary 2: wedgelets + scale adaptivity
• Dictionary 3: surflets + scale adaptivity

25. Piecewise Smooth Horizon Functions
• g1, g2: real-valued smooth functions, N-dimensional, C^Ks smooth
• b: C^Kd smooth (N-1)-dimensional horizon/boundary discontinuity
• Theorem: metric entropy for a C^Ks smooth N-D horizon function f with a C^Kd smooth discontinuity
(Figure: 2D example showing regions g1([x1, x2]) and g2([x1, x2]) separated by the boundary b(x1))
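The model being generalized here is, presumably (the slide's own formula is not in the transcript), the earlier horizon construction with the two constant regions replaced by smooth functions:

```latex
f(x) \;=\;
\begin{cases}
g_1(x), & x_N \ge b(x_1,\ldots,x_{N-1}),\\
g_2(x), & x_N <   b(x_1,\ldots,x_{N-1}),
\end{cases}
\qquad g_1, g_2 \in C^{K_s},\quad b \in C^{K_d},
```

so the piecewise constant case corresponds to g_1 = 1 and g_2 = 0, and the metric entropy now depends on both smoothness orders K_s and K_d.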

26. Surfprints
• Challenge: wavelets are good in smooth regions but wasteful near the discontinuity, while surflets are good near edges
• Response: surfprints project surflets onto the wavelet subspace

27. Tree-structured Surfprint Encoder
• Discontinuity information is needed at finer scales
• Top-down encoder; prediction is not used
• Coarse scales: keep wavelet nodes
• Intermediate scales: nodes containing the discontinuity
• Maximal depth: surfprints
• Theorem: the top-down encoder achieves near-optimal RD performance
(Figure: tree with wavelet nodes at coarse and intermediate scales and surfprint nodes at maximal depth)

28. Conclusions and Future Work
• Metric entropy (converse): piecewise constant/smooth horizon functions, arbitrary dimension and arbitrary smoothness
• Multiresolution compression framework (achievable):
  • quantization scheme → tractable dictionary size
  • predictive top-down coding → optimal performance
  • scale-adaptive approach to discretization
  • surfprints at maximal depth → near-optimal performance
• Future research: algorithms

  29. THE END
