Design in Imaging: From Compressive to Comprehensive Sensing
Lior Horesh. Joint work with E. Haber, L. Tenorio, A. Conn, U. Mello, J. Fohring
IBM TJ Watson Research Center
SIAM Imaging Science 2012, Philadelphia, PA, May 2012
Exposition – Ill-posed Inverse Problems • Aim: infer the model m • Given: experimental design y, measurements d, observation model • Ill-posedness: naïve inversion fails… the missing information must be filled in
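A minimal sketch of why naïve inversion fails. This is not from the talk: it is a generic 1-D deblurring example (Gaussian blur operator, sinusoidal model) chosen only to illustrate ill-posedness, with Tikhonov regularization standing in for "filling in the missing information".

```python
import numpy as np

# Illustrative ill-posed problem: d = A m + noise, where A is a 1-D Gaussian
# blur, so its small singular values make naive inversion amplify the noise.
rng = np.random.default_rng(0)
n = 50
x = np.linspace(0, 1, n)
A = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.05**2))  # blur kernel
A /= A.sum(axis=1, keepdims=True)                              # row-normalize

m_true = np.sin(2 * np.pi * x)                     # ground-truth model
d = A @ m_true + 1e-3 * rng.standard_normal(n)     # noisy data

m_naive = np.linalg.solve(A, d)                    # naive inversion blows up
err_naive = np.linalg.norm(m_naive - m_true) / np.linalg.norm(m_true)

# Tikhonov regularization supplies the missing information and tames the error
alpha = 1e-3
m_reg = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ d)
err_reg = np.linalg.norm(m_reg - m_true) / np.linalg.norm(m_true)
print(err_naive, err_reg)   # naive error is far larger than the regularized one
```

Even with 0.1% noise, the naive solve is dominated by amplified noise, while the regularized reconstruction stays close to the true model.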
How to improve model recovery? • How can we... • Improve the observation model? • Extract more information in the measurement procedure? • Define appropriate distance measures / noise models? • Incorporate more meaningful a priori information? • Provide more efficient optimization schemes?
Evolution of Inversion: From Simulation to Design • Forward problem (simulation, description) • Given: model m & observation model • Simulate: data d • Inverse problem (estimation, prediction) • Given: data d & observation model • Infer: model m (and uncertainties) • Design (prescription) • Given: inversion scheme & observation model • Find: the ‘best’ experimental settings y, regularization S, …
Which experimental design is best? Design 1 | Design 2 | Design 3
Design questions – Diffuse Optical Tomography • Where should light sources and optodes be placed? • In what sequence should they be activated? • What laser source intensities should be used? Arridge 1999
Design questions – Electromagnetic inversion • What frequencies should be used? • What acquisition trajectory should be considered? Newman 1996, Haber & Ascher 2000
Design questions – Seismic inversion • How can simultaneous sources be used? Claerbout 2000
Design questions – Limited-Angle Tomography • How many projections should we collect, and at what angles? • How accurate should the projection data be?
Design the experimental layout – Stonehenge, 2500 B.C.
Design the experimental process – Galileo Galilei, 1564–1642
Respect experimental constraints… French nuclear test, Mururoa, 1970
Ill- vs. well-posed optimal experimental design • Previous work • Well-posed problems – well established [Fedorov 1997, Pukelsheim 2006] • Ill-posed problems – under-researched [Curtis 1999, Bardow 2008] • Many practical problems in engineering and the sciences are ill-posed. What makes ill-posed problems so special?
Optimality criteria in well-posed problems • For linear inversion, employ the Tikhonov regularized least-squares solution • Bias-variance decomposition • For over-determined problems • A-optimal design problem
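The slide's equations (shown as images in the original deck) can be sketched numerically. For the standard linear-Gaussian setting d = A m + e with e ~ N(0, s²I), the Tikhonov estimator m̂ = (AᵀA + αI)⁻¹Aᵀd has MSE = ‖bias‖² + trace(variance); the operator A and all constants below are made up for illustration.

```python
import numpy as np

# Bias-variance decomposition of the Tikhonov regularized LS estimator,
# verified against a Monte Carlo estimate of the mean squared error.
rng = np.random.default_rng(1)
k, n = 40, 25                       # under-determined: n < k (ill-posed)
A = rng.standard_normal((n, k)) / np.sqrt(n)
m_true = rng.standard_normal(k)
s, alpha = 0.05, 0.1

R = np.linalg.solve(A.T @ A + alpha * np.eye(k), A.T)  # reconstruction operator
bias = (R @ A - np.eye(k)) @ m_true                    # deterministic bias term
var = s**2 * np.trace(R @ R.T)                         # average variance term
mse_theory = bias @ bias + var

# Monte Carlo check of the decomposition
errs = []
for _ in range(2000):
    d = A @ m_true + s * rng.standard_normal(n)
    errs.append(np.sum((R @ d - m_true) ** 2))
mse_mc = np.mean(errs)
print(mse_theory, mse_mc)   # the two estimates agree closely
```

The A-optimal design problem of the slide then minimizes the trace (variance) term over admissible designs; in the ill-posed regime the bias term does not vanish, which is the crux of the talk.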
Optimality criteria in well-posed problems • Optimality criteria of the information matrix • A-optimality: average variance • D-optimality: volume of the uncertainty ellipsoid • E-optimality: minimax (worst-direction variance) • Almost a complete alphabet… Stone, DeGroot and Bernardo
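The three classical criteria are simple functionals of the information matrix M; a small sketch with two hypothetical diagonal information matrices (a balanced and a lopsided design) shows how they rank designs differently.

```python
import numpy as np

# Classical alphabetic optimality criteria of an information matrix M.
def a_crit(M):   # A-optimality: average variance; smaller trace(M^-1) is better
    return np.trace(np.linalg.inv(M))

def d_crit(M):   # D-optimality: uncertainty-ellipsoid volume; larger logdet is better
    return np.linalg.slogdet(M)[1]

def e_crit(M):   # E-optimality: minimax; larger smallest eigenvalue is better
    return np.linalg.eigvalsh(M).min()

M1 = np.diag([1.0, 1.0, 1.0])   # balanced design
M2 = np.diag([5.0, 1.0, 0.1])   # lopsided design
print(a_crit(M1), a_crit(M2))   # 3.0 vs 11.2
print(d_crit(M1), d_crit(M2))   # 0.0 vs log(0.5) ≈ -0.69
print(e_crit(M1), e_crit(M2))   # 1.0 vs 0.1
```

All three criteria prefer the balanced design here, but in general they can disagree, which is why the "alphabet" exists.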
The problems... • Ill-posedness: controlling variance alone reduces the error only mildly [Johansen 1996] • Non-linearity: bias-variance decomposition is impossible. What strategy can be used? Proposition 1 – common practice so far: trial and error…
Experimental Design by Trial and Error • Pick a model • Run the observation model for each candidate experimental design and generate data • Invert and compare the recovered models • Choose the experimental design that provides the best model recovery
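The four steps above can be sketched directly; every operator and constant below is an illustrative assumption (random linear designs, Tikhonov inversion), not the talk's actual observation model.

```python
import numpy as np

# Trial-and-error design: simulate each candidate design on a picked model,
# invert, and keep the design with the best reconstruction.
rng = np.random.default_rng(2)
k = 30
m = np.sin(np.linspace(0, 3, k))            # step 1: pick a model

def invert(A, d, alpha=1e-2):
    """Tikhonov inversion for observation operator A and data d."""
    return np.linalg.solve(A.T @ A + alpha * np.eye(k), A.T @ d)

# Five hypothetical candidate designs: random 15-row observation operators
designs = [rng.standard_normal((15, k)) / np.sqrt(15) for _ in range(5)]

errors = []
for A in designs:                           # steps 2-3: simulate and invert
    d = A @ m + 0.01 * rng.standard_normal(A.shape[0])
    m_rec = invert(A, d)
    errors.append(np.linalg.norm(m_rec - m) / np.linalg.norm(m))

best = int(np.argmin(errors))               # step 4: choose the best design
print(best, errors[best])
```

The weakness the talk goes on to address is visible here: the verdict depends on the single picked model and the single noise draw.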
The problems... • Ill-posedness: controlling variance alone reduces the error only mildly [Johansen 1996] • Non-linearity: bias-variance decomposition is impossible. What other strategy can be used? Proposition 2 – minimize bias and variance altogether. How to define the optimality criterion?
Optimality criteria for design • Loss: depends on the noise and on the unknown model • Mean square error: depends on the unknown model • Bayes risk: computationally infeasible
Optimality criteria for design • Bayes empirical risk • Assume a set of authentic model examples is available • Discrepancy between training and recovered models [Vapnik 1998] • How can y and S be regularized? • Regularized Bayesian empirical risk
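A minimal sketch of the Bayesian empirical risk: average, over a set of training models, the discrepancy between each training model and its reconstruction under a candidate design, plus a regularization penalty on the design. The operators, the diagonal-weight parameterization of the design, and the penalty weight beta are all illustrative assumptions.

```python
import numpy as np

# Regularized Bayesian empirical risk of a design, estimated on training models.
rng = np.random.default_rng(3)
k, n, n_train = 20, 40, 8
A = rng.standard_normal((n, k)) / np.sqrt(n)        # candidate observations
training = [rng.standard_normal(k) for _ in range(n_train)]

def empirical_risk(w, alpha=1e-2, beta=0.1, noise=0.01):
    """Average training discrepancy under design weights w, plus a penalty."""
    W = np.diag(w)
    risk = 0.0
    for m in training:
        d = W @ (A @ m + noise * rng.standard_normal(n))
        WA = W @ A
        m_rec = np.linalg.solve(WA.T @ WA + alpha * np.eye(k), WA.T @ d)
        risk += np.sum((m_rec - m) ** 2)
    return risk / n_train + beta * np.sum(np.abs(w))   # design regularization

w_all = np.ones(n)          # design that uses every observation
w_half = np.ones(n)
w_half[n // 2:] = 0.0       # design that uses only the first half
print(empirical_risk(w_all), empirical_risk(w_half))
```

Replacing the intractable Bayes risk by this finite training average is the Vapnik-style move the slide refers to; the penalty term regularizes the design itself.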
Other designers / key players • This is one doctrine • Other interesting choices were developed by • Y. Marzouk et al., MIT (2011, 2012) • A. Curtis et al., University of Edinburgh (1999, 2010) • D. Coles & M. Prange, Schlumberger (2008, 2012) • S. Körkel et al., Heidelberg University (2011) • A. Bardow, RWTH Aachen University (2008, 2009)
Differentiable Observation Sparsity-Controlled Design • Assume: a fixed number of observations • Design preference: only a small number of sources / receivers is activated • The observation model • Regularized risk Horesh, Haber & Tenorio 2011
Differentiable Observation Sparsity-Controlled Design • The total number of observations may be large • Derivatives of the forward operator w.r.t. y are difficult to compute • Effective when activation of each source and receiver is expensive Horesh, Haber & Tenorio 2011
Weights Formulation: Sparsity-Controlled Design • Assume: a predefined set of candidate experimental setups is given • Design preference: a small number of observations Haber, Horesh & Tenorio 2010
Weights Formulation: Sparsity-Controlled Design • Let the candidate setups be a discretization of the design space [Pukelsheim 1994] • The (candidate) observation operator is weighted by w • w is the inverse of the standard deviation • reasonable standard deviation (w > 0) – conduct the experiment • infinite standard deviation (w = 0) – do not conduct the experiment Haber, Horesh & Tenorio 2010
Weights Formulation: Sparsity-Controlled Design • Without a penalty, more observations give better recovery • Desired solution: many w's are 0 • Add a penalty to promote sparsity in observation selection • Fewer degrees of freedom • No explicit access to the observation operator needed
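A crude combinatorial sketch of the effect described above, and emphatically not the authors' continuous formulation: starting from all weights on, greedily zero out any weight whose removal lowers (risk + penalty per active observation). The operators, the training model, and the penalty weight beta are invented for illustration.

```python
import numpy as np

# Greedy sparsification of observation weights under a count penalty.
rng = np.random.default_rng(7)
k, n = 12, 30
A = rng.standard_normal((n, k)) / np.sqrt(n)     # candidate observations
m_true = rng.standard_normal(k)                  # a known training model
d = A @ m_true + 0.05 * rng.standard_normal(n)
alpha, beta = 1e-2, 0.02

def penalized_risk(w):
    """Reconstruction error plus a sparsity penalty on active observations."""
    W2 = np.diag(w * w)
    m_rec = np.linalg.solve(A.T @ W2 @ A + alpha * np.eye(k), A.T @ W2 @ d)
    return np.sum((m_rec - m_true) ** 2) + beta * np.count_nonzero(w)

w = np.ones(n)
improved = True
while improved:                                  # first-improvement descent
    improved = False
    base = penalized_risk(w)
    for i in np.flatnonzero(w):
        trial = w.copy()
        trial[i] = 0.0
        if penalized_risk(trial) < base:         # dropping obs i pays off
            w = trial
            improved = True
            break
print(np.count_nonzero(w))   # many weights end up exactly zero
```

Redundant observations are pruned because their marginal contribution to recovery is smaller than the penalty, while informative ones survive, exactly the trade-off the penalty is meant to encode.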
The optimization problems • Leads to a stochastic bi-level optimization problem • Direct formulation • Weights formulation Haber, Horesh & Tenorio 2010, 2011
The optimization problem(s) • In general, a (multi-level) optimization problem • Difficult to solve (although not impossible) [Alexandrov & Dennis 1994] • To simplify, make assumptions on • F – linear or nonlinear • S – quadratic or general matrix form • y – discrete or continuous • Important – sufficient improvement can be “good enough”
How many samples are needed? • The optimist – choose a single m and design for it • The pessimist – assume that m is the worst it can be • The realist – choose an ensemble of m's and average • A Bayesian estimate of the frequentist risk [Rubin 1984]
Linear Design • Assume m has a second moment • Then the risk of the linear reconstruction can be written in closed form • Best linear design
The Best (Linear) Experiment • Considering a discretization of the design space • We obtain
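In the linear case with a Gaussian model prior, the expected reconstruction error of the best linear estimator has a closed form, the trace of the posterior covariance C_post = (A_wᵀA_w / s² + C_prior⁻¹)⁻¹, where A_w stacks the selected candidate observations. The greedy selection below is an illustrative stand-in for the talk's weights machinery, with invented candidates and prior.

```python
import numpy as np

# Greedy A-optimal selection of candidate observations in a linear-Gaussian model.
rng = np.random.default_rng(4)
k, n_cand, budget, s = 15, 60, 10, 0.1
rows = rng.standard_normal((n_cand, k))      # candidate observation rows
C_prior_inv = np.eye(k)                      # assumed unit Gaussian prior

def post_trace(idx):
    """Trace of the posterior covariance for the selected observation rows."""
    A = rows[list(idx)]
    return np.trace(np.linalg.inv(A.T @ A / s**2 + C_prior_inv))

chosen = []
remaining = set(range(n_cand))
for _ in range(budget):                      # add the row that helps most
    best = min(remaining, key=lambda i: post_trace(chosen + [i]))
    chosen.append(best)
    remaining.remove(best)

random_pick = list(rng.choice(n_cand, size=budget, replace=False))
print(post_trace(chosen), post_trace(random_pick))  # greedy is typically lower
```

Because the design criterion is available in closed form, no inversion needs to be run while searching; that is the computational appeal of the linear design setting.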
Borehole Ray Tomography – Observation model • Use 32² × 3 = 3072 rays • Goal: choose 500 optimal observations
Borehole Ray Tomography – Experimental Design Assessment • Assess design performance on unseen models (cross-validation) • Test model | Non-optimal design | Optimal design
Non-linear Design • Use stochastic optimization – approximate the expectation by sampling [Shapiro, Dentcheva & Ruszczyński 2009]
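A sketch of the sample-average approximation (SAA) idea: the design risk is an expectation over models and noise, replaced by a finite sample average that stabilizes as the sample grows. The model distribution, noise level, and inversion below are assumptions for illustration only.

```python
import numpy as np

# Sample-average approximation of a design risk (expectation over m and noise).
rng = np.random.default_rng(5)
k, n = 10, 20
A = rng.standard_normal((n, k)) / np.sqrt(n)   # a fixed candidate design

def one_sample_loss(alpha=1e-1, noise=0.05):
    """Draw a model and a noise realization, invert, and return the squared error."""
    m = rng.standard_normal(k)
    d = A @ m + noise * rng.standard_normal(n)
    m_rec = np.linalg.solve(A.T @ A + alpha * np.eye(k), A.T @ d)
    return np.sum((m_rec - m) ** 2)

small = np.mean([one_sample_loss() for _ in range(10)])
large = np.mean([one_sample_loss() for _ in range(5000)])
large2 = np.mean([one_sample_loss() for _ in range(5000)])
print(small, large, large2)   # the large-sample averages agree with each other
```

Two independent 5000-sample averages nearly coincide, while 10-sample estimates fluctuate; in the talk's setting each sample is expensive (a full inversion), so sample size is a real design choice.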
Non-linear Design • Solution through sensitivity calculations • The sensitivities of the recovered model with respect to the design
Non - linear design • The reduced gradient • Can use steepest descent/LBFGS/Truncated Gauss-Newton to solve the problem
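A toy reduced-space descent loop in the spirit of the solvers named above (an illustrative stand-in, not the talk's LBFGS / truncated Gauss-Newton implementation): the lower-level inversion is solved in closed form for each design, and the reduced gradient with respect to the design weights is approximated by finite differences rather than by the analytic sensitivities.

```python
import numpy as np

# Reduced-space steepest descent on design weights w for a bi-level problem:
# the lower level is a weighted Tikhonov inversion with closed-form solution.
rng = np.random.default_rng(6)
k, n = 8, 16
A = rng.standard_normal((n, k)) / np.sqrt(n)
m_true = rng.standard_normal(k)
d_clean = A @ m_true
alpha = 1e-2

def upper_objective(w):
    """Reconstruction error of the lower-level inversion under weights w."""
    W2 = np.diag(w**2)
    m_rec = np.linalg.solve(A.T @ W2 @ A + alpha * np.eye(k),
                            A.T @ W2 @ d_clean)
    return np.sum((m_rec - m_true) ** 2)

def fd_grad(f, w, h=1e-6):
    """Central finite-difference approximation of the reduced gradient."""
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = h
        g[i] = (f(w + e) - f(w - e)) / (2 * h)
    return g

w = 0.5 * np.ones(n)
f0 = upper_objective(w)
for _ in range(50):                                   # projected descent step
    w = np.clip(w - 0.1 * fd_grad(upper_objective, w), 0.0, 1.0)
f1 = upper_objective(w)
print(f0, f1)   # the design objective decreases along the iterations
```

Analytic sensitivities (as on the previous slides) replace the finite differences in practice, since each difference costs two lower-level solves; LBFGS or truncated Gauss-Newton then accelerates the same reduced-space iteration.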
Impedance Tomography – Observation model • Governing equations • Following finite-element discretization • Given: model and design settings • Find: data (n measurements, n < k) • Design: find the optimal source-receiver configuration
Impedance Tomography – Designs comparison • Naive design | True model | Optimized design Horesh, Haber & Tenorio 2011
Magneto-tellurics Tomography – Observation model • Governing equations • Following finite-volume discretization • Given: model and design settings (frequencies) • Find: data • Design: find an optimal set of frequencies
Magneto-tellurics Tomography – Designs comparison • Test model | Naive design | Optimal linearized design | Optimized non-linear design Haber, Horesh & Tenorio 2008, 2010