Applications of Automatic Differentiation and the Cramér-Rao Lower Bound to Parameter Mapping July 31, 2013 Jason Su
Cramér-Rao Lower Bound How precisely can I measure something with this pulse sequence?
CRLB: What is it? • A lower limit on the variance of any unbiased estimator of a parameter • The best you can do at estimating, say, T1 with a given pulse sequence and signal equation g(T1) • Estimators that achieve the bound are called “efficient” • The minimum variance unbiased estimator (MVUE) is efficient
CRLB: Fisher Information Matrix • Typically calculated for a given tissue, θ • Interpretation • J captures the sensitivity of the signal equation to changes in a parameter • Its “invertibility” or conditioning is how separable parameters are from each other, i.e. the specificity of the measurement
CRLB: How does it work? • A common formulation • Unbiased estimator • A signal equation with normally distributed noise • Measurement noise is independent
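Under these assumptions the Fisher matrix has the closed form J = (1/σ²)·JgᵀJg, where Jg is the Jacobian of the signal equation with respect to the parameters, and the bound is var(θ̂ᵢ) ≥ [J⁻¹]ᵢᵢ. A minimal numerical sketch of that recipe; the mono-exponential recovery model and all tissue, timing, and noise values here are illustrative stand-ins, not the sequence from the slides:

```python
import numpy as np

def crlb(jac, sigma):
    """CRLB for an unbiased estimator with i.i.d. Gaussian noise.

    jac   : (K, P) Jacobian of the signal g w.r.t. the parameters theta,
            one row per measurement.
    sigma : noise standard deviation.
    Returns diag(J^-1), the per-parameter variance lower bounds.
    """
    J = jac.T @ jac / sigma**2           # Fisher information matrix
    return np.diag(np.linalg.inv(J))

# Illustrative model: mono-exponential recovery g(t) = M0*(1 - exp(-t/T1))
M0, T1, sigma = 1.0, 1000.0, 0.01        # hypothetical tissue and noise values
t = np.array([100.0, 500.0, 1000.0, 3000.0])   # sample times (ms)
E = np.exp(-t / T1)
jac = np.stack([1 - E,                   # dg/dM0
                -M0 * t * E / T1**2],    # dg/dT1
               axis=1)
var_M0, var_T1 = crlb(jac, sigma)
print(np.sqrt(var_T1))                   # best achievable std. dev. of T1-hat
```

Note how the bound scales with σ²: halving the noise quarters the minimum achievable variance, independent of the model.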
Automatic Differentiation The most criminally underused tool in your computational toolbox?
AD: What is it? • Automatic differentiation is NOT: • Analytic differentiation
AD: What is it? • Automatic differentiation is NOT: • Analytic differentiation • Symbolic differentiation syms x1 x2; f = 1/(1 + exp(-x1/x2)); df_dx1 = diff(f, x1) >> 1/(x2*exp(x1/x2)*(1/exp(x1/x2) + 1)^2)
AD: What is it? • Automatic differentiation is NOT: • Analytic differentiation • Symbolic differentiation • Numeric differentiation (finite difference) f = @(x1, x2) 1/(1 + exp(-x1/x2)); h = 1e-8; df_dx1 = (f(x1+h, x2) - f(x1, x2))/h;
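The step size makes or breaks a finite difference: too large and truncation error dominates, too small and floating-point round-off takes over. A quick sketch of that trade-off for the same logistic function (the evaluation point is an arbitrary illustrative choice):

```python
import math

f = lambda x1, x2: 1 / (1 + math.exp(-x1 / x2))

def df_dx1(x1, x2):
    """Analytic derivative of the logistic w.r.t. x1, for reference."""
    e = math.exp(-x1 / x2)
    return e / (x2 * (1 + e) ** 2)

x1, x2 = 1.0, 2.0
exact = df_dx1(x1, x2)

def fd(h):
    """Forward difference with step h."""
    return (f(x1 + h, x2) - f(x1, x2)) / h

err_good = abs(fd(1e-8) - exact)    # near the sweet spot ~sqrt(machine eps)
err_tiny = abs(fd(1e-12) - exact)   # round-off error dominates
print(err_good, err_tiny)
```

Shrinking the step past the sweet spot makes the error worse by orders of magnitude; AD sidesteps this trade-off entirely because no differencing ever happens.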
AD: What is it? • Automatic differentiation IS: • Fast, esp. for many input partial derivatives • Symbolic requires substitution of symbolic objects • Numeric requires multiple function calls for each partial
AD: What is it? • Automatic differentiation IS: • Fast, esp. for many input partial derivatives • Effective for computing higher derivatives • Symbolic generates huge expressions • Numeric becomes even more inaccurate
AD: What is it? • Automatic differentiation IS: • Fast, esp. for many input partial derivatives • Effective for computing higher derivatives • Adept at analyzing complex algorithms • Bloch simulations • Loops and conditional statements • 1.6 million-line FEM model
AD: What is it? • Automatic differentiation IS: • Fast, esp. for many input partial derivatives • Effective for computing higher derivatives • Adept at analyzing complex algorithms • Accurate to machine precision
AD: What is it? • Some disadvantages: • Exact details of the implementation are hidden • Hard to accelerate
AD: How does it work? • Numeric: implements the limit definition of the derivative • Symbolic: N-line function -> single-line expression • Automatic: N-line function -> M-line function • A technology to automatically augment programs with statements that compute derivatives
AD: How does it work? f = @(x1, x2) 1/(1 + exp(-x1/x2)); • Build a graph of intermediate quantities and apply the chain rule
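One way to see this chain-rule bookkeeping concretely is forward-mode AD with dual numbers: every intermediate quantity in the graph carries its derivative along, and each arithmetic operator applies its local chain rule. A minimal operator-overloading sketch for the same logistic function — this is an illustration of the principle, not the implementation used by any of the packages named later:

```python
import math

class Dual:
    """A value paired with its derivative; arithmetic applies the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __truediv__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val / other.val,
                    (self.dot * other.val - self.val * other.dot) / other.val**2)

    def __rtruediv__(self, other):       # scalar / Dual
        return Dual(other) / self

    def __neg__(self):
        return Dual(-self.val, -self.dot)

def exp(x):
    """exp for duals: d/dx exp(u) = exp(u) * u'."""
    return Dual(math.exp(x.val), math.exp(x.val) * x.dot)

f = lambda x1, x2: 1 / (1 + exp(-x1 / x2))

# Seed x1 with derivative 1 to get df/dx1; x2 is held constant (dot = 0)
y = f(Dual(1.0, 1.0), Dual(2.0, 0.0))
print(y.val, y.dot)   # function value and df/dx1, exact to machine precision
```

Running the function once yields both the value and the partial derivative, with no symbolic expression growth and no finite-difference step to tune.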
AD: How do I use it? • Applications • Gradient-based optimization methods • Uncertainty propagation • Transparent calculation of the Jacobian of a multiple-input, multiple-output function • Packages • MATLAB • ADiMat • AD for MATLAB, Adiff • Python • pyautodiff • uncertainties, algopy, CasADi
What is Parameter Mapping? • Start with a signal model for your data • Collect a series of scans, typically with only 1 or 2 sequence variables changing • Fit model to data • Motivation • Reveals quantifiable physical properties of tissue unlike conventional imaging • Maps are ideally scanner independent
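For DESPOT1/VFA (detailed below), "fit model to data" is especially clean: the SPGR signal S = M0·sin α·(1−E1)/(1−E1·cos α), with E1 = exp(−TR/T1), becomes a line when S/sin α is plotted against S/tan α, with slope E1. A sketch with simulated noiseless data; the flip angles and tissue values are illustrative choices, not the optimized protocol:

```python
import numpy as np

def spgr(alpha_deg, M0, T1, TR):
    """SPGR signal equation: S = M0*sin(a)*(1 - E1)/(1 - E1*cos(a))."""
    a = np.deg2rad(alpha_deg)
    E1 = np.exp(-TR / T1)
    return M0 * np.sin(a) * (1 - E1) / (1 - E1 * np.cos(a))

# Simulated noiseless two-angle acquisition (illustrative values)
TR, M0, T1 = 5.0, 1000.0, 1000.0      # ms, a.u., ms
alphas = np.array([3.0, 15.0])        # flip angles in degrees
S = spgr(alphas, M0, T1, TR)

# DESPOT1 linearization: S/sin(a) = E1 * (S/tan(a)) + M0*(1 - E1)
a = np.deg2rad(alphas)
E1_fit, intercept = np.polyfit(S / np.tan(a), S / np.sin(a), 1)
T1_fit = -TR / np.log(E1_fit)         # invert E1 = exp(-TR/T1)
M0_fit = intercept / (1 - E1_fit)
print(T1_fit, M0_fit)                 # recovers the simulated T1 and M0
```

The linear form is what makes DESPOT1 fast enough for whole-brain mapping: each voxel is a tiny least-squares line fit rather than a nonlinear optimization.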
Parameter Mapping • Some examples • FA/MD mapping with DTI – most widely known mapping sequence • T1 mapping – relevant in study of contrast agent relaxivity and diseases • B1 mapping – important for high field applications
Evaluating all the choices • T1 mapping methods • Spin-echo inversion recovery • Look-Locker • DESPOT1/VFA • MPnRAGE family
DESPOT1 T1 Mapping • Protocol optimization • What is the acquisition protocol which best maximizes our T1 precision? Christensen 1974, Homer 1984, Wang 1987, Deoni 2003
DESPOT1: Protocol Optimization • Acquiring N images: with what flip angles, and how long should we scan each? • Cost function, with weight λ = 0 for M0 • Coefficient of variation (CoV = σT1/T1, with σT1 from the CRLB) for a single T1 • The sum of CoVs over a range of T1s
Problem Setup • Minimize the CoV of T1 with Jacobians implemented by AD • Constraints • TR = TRmin = 5ms • Solver • Sequential least squares programming with multiple start points (scipy.optimize.fmin_slsqp)
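A sketch of the objective being minimized: the CRLB standard deviation of T1 over T1, for a candidate set of flip angles at fixed TR. Central finite differences stand in here for the AD Jacobians of the actual work, and the tissue and noise values are illustrative:

```python
import numpy as np

def spgr(theta, alpha, TR):
    """SPGR signal; theta = (M0, T1)."""
    M0, T1 = theta
    E1 = np.exp(-TR / T1)
    return M0 * np.sin(alpha) * (1 - E1) / (1 - E1 * np.cos(alpha))

def t1_cov(alpha_deg, theta=(1000.0, 1000.0), TR=5.0, sigma=1.0):
    """CRLB coefficient of variation of T1 for a set of flip angles.

    Central differences stand in for the AD-computed Jacobians.
    """
    alpha = np.deg2rad(np.asarray(alpha_deg))
    theta = np.asarray(theta, float)
    # Jacobian of the signal w.r.t. (M0, T1), one row per flip angle
    Jac = np.empty((alpha.size, 2))
    for i in range(2):
        h = 1e-4 * theta[i]
        tp, tm = theta.copy(), theta.copy()
        tp[i] += h
        tm[i] -= h
        Jac[:, i] = (spgr(tp, alpha, TR) - spgr(tm, alpha, TR)) / (2 * h)
    fisher = Jac.T @ Jac / sigma**2
    cov = np.linalg.inv(fisher)          # CRLB covariance matrix
    return np.sqrt(cov[1, 1]) / theta[1] # CoV of the T1 estimate

# The optimized two-angle protocol from the results slide should beat
# an ad hoc pair of angles at the same T1 = 1000ms, TR = 5ms
print(t1_cov([2.373, 13.766]), t1_cov([5.0, 25.0]))
```

Wrapping `t1_cov` in a solver such as `scipy.optimize.fmin_slsqp` over the flip angles reproduces the optimization described above; AD simply replaces the differencing loop with exact derivatives.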
Results: T1=1000ms • N = 2 • α = [2.373 13.766]° • This agrees with Deoni 2003 • Corresponds to the pair of flip angles producing a fixed fraction of the Ernst-angle signal, previously approximated as 0.71
Results: T1=500-5000ms • N = 2 • α = [1.4318 8.6643]° • Compare with the single-T1 optimum at T1 = 2750ms: α = [1.4311 8.3266]° • Contradicts Deoni 2004, which suggests collecting a range of flip angles to cover more T1s
Future Work • More protocol optimization • DESPOT2-FM: free parameters incl. SPGR or bSSFP, αs, phase-cycle • mcDESPOT: precision of MWF has recently been under question (Lankford 2012) • Exploration of other pulse sequences • Comparison of competing methods
Questions? • Slides available at http://mr.jason.su/ • Python source code available soon