
A Bayesian approach to discovering free-form differential equations from data


Presentation Transcript


  1. A Bayesian approach to discovering free-form differential equations from data. Steven Atkinson¹, Waad Subber¹, Genghis Khan¹, Liping Wang¹, Roger Ghanem² (¹GE Research, ²USC)

  2. Background

  3. Motivation Background ML works in narrow domains; human knowledge generalizes. In science, knowledge takes the form of governing laws. Can ML be used as a partner in the scientific process?
  • Formulate hypotheses
  • Test and criticize theories
  • Guide experimentation and data acquisition
  • Provide interpretable results

  4. Using raw numerical data, uncover governing equations. (Pipeline diagram: Data → Models → Result, with compose, exploration, and refine steps.)

  5. Outline Differential equation discovery pipeline:
  • Curve fitting
  • Operator composition
  • Genetic programming
  • Criticism and adaptive data acquisition
  Propagation of epistemic uncertainty. Examples.

  6. Differential equation discovery pipeline

  7. Curve fitting Differential equation discovery pipeline Identify independent (spatiotemporal) and dependent variables in the data set. Learn function representations of the data (traditional supervised learning; highly flexible). Must mind the differentiability of the models.
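The curve-fitting step can be sketched as follows: fit a Gaussian process to noisy 1D samples and differentiate the posterior mean analytically. This is a minimal illustration, not the deck's actual implementation; the RBF kernel, length scale, and jitter below are illustrative assumptions. It shows why differentiability matters: the kernel choice determines how many derivatives the surrogate has (the RBF kernel is infinitely smooth).

```python
import numpy as np

# Fit a GP (RBF kernel) to samples of sin(x), then differentiate the
# posterior mean in closed form. Hyperparameters are illustrative.
def rbf(a, b, ell=0.7):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

X = np.linspace(0.0, 2 * np.pi, 30)
y = np.sin(X)

K = rbf(X, X) + 1e-6 * np.eye(len(X))   # small jitter for stability
alpha = np.linalg.solve(K, y)           # (K + sigma^2 I)^{-1} y

def mean(xs):
    return rbf(xs, X) @ alpha

def dmean(xs, ell=0.7):
    # d/dx* k(x*, x_i) = -(x* - x_i)/ell^2 * k(x*, x_i)
    k = rbf(xs, X, ell)
    return (-(xs[:, None] - X[None, :]) / ell ** 2 * k) @ alpha

x_test = np.array([np.pi])
print(mean(x_test))    # close to sin(pi) = 0
print(dmean(x_test))   # close to cos(pi) = -1
```

The same pattern extends to gradients and higher derivatives, which is what the operator-composition step consumes.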

  8. Function composition Differential equation discovery pipeline Define operator composition as a tree; operands are functions, not data. Enables the use of operators like derivatives, gradients, divergences, etc., through automatic differentiation (Theano, Chainer, TensorFlow, PyTorch, JAX, …)

  9. Function composition Differential equation discovery pipeline
  m = Model()
  m.fit(x, y)                               # fit a differentiable surrogate to data
  f = m(x2)                                 # evaluate the fitted function on new inputs
  fx = m.grad()(x2)                         # first derivative, itself a function
  fxxfx = (m.grad().grad() + m.grad())(x2)  # compose operators: f'' + f'
  plot(x2, [f, fx, fxxfx])
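A runnable analogue of the slide's pseudocode: operators act on functions and return new functions, so derivatives compose like a tree. The `Fn` wrapper here is hypothetical (not the deck's API), and central finite differences stand in for automatic differentiation to keep the sketch dependency-free; with PyTorch or JAX, `.grad()` would be exact autodiff instead.

```python
import math

class Fn:
    """A function object whose operators return new function objects."""
    def __init__(self, f):
        self.f = f

    def __call__(self, x):
        return self.f(x)

    def grad(self, h=1e-4):
        # Returns the derivative as a new Fn (central finite difference).
        return Fn(lambda x: (self.f(x + h) - self.f(x - h)) / (2 * h))

    def __add__(self, other):
        return Fn(lambda x: self.f(x) + other(x))

m = Fn(math.sin)                             # stand-in for a fitted model
f = m(1.0)                                   # sin(1)
fx = m.grad()(1.0)                           # close to cos(1)
fxx_fx = (m.grad().grad() + m.grad())(1.0)   # close to -sin(1) + cos(1)
```

Because `.grad()` and `+` return `Fn` objects, arbitrary differential operators can be assembled before any data are plugged in, exactly the property the evolutionary search exploits.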

  10. Function composition Differential equation discovery pipeline Example: synthetic function

  11.-14. Function composition Differential equation discovery pipeline Examples (figure-only slides)

  15. Genetic programming Differential equation discovery pipeline Define a set of primitives and terminals. Initialize a population of individuals (differential equations). Iterate, mutating and mating individuals; retain fit individuals, discard unfit ones.
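A minimal sketch of the evolutionary loop, under simplifying assumptions: primitives {add, mul, sub}, terminals {x, 1}, and plain symbolic regression of x² + x as the target rather than a differential operator. The deck evolves candidate equations over fitted functions, but the initialize/mutate/select shape of the loop is the same.

```python
import random

random.seed(0)
PRIMS = [('add', lambda a, b: a + b),
         ('mul', lambda a, b: a * b),
         ('sub', lambda a, b: a - b)]
TERMS = ['x', 1.0]
XS = [i / 4.0 for i in range(-8, 9)]
target = lambda x: x * x + x

def rand_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    name, _ = random.choice(PRIMS)
    return (name, rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    name, a, b = tree
    return dict(PRIMS)[name](evaluate(a, x), evaluate(b, x))

def fitness(tree):
    try:
        err = sum((evaluate(tree, x) - target(x)) ** 2 for x in XS) / len(XS)
    except OverflowError:
        return float('inf')
    return err if err == err else float('inf')   # map NaN to inf

def mutate(tree):
    if random.random() < 0.3 or not isinstance(tree, tuple):
        return rand_tree(2)                      # replace a random subtree
    name, a, b = tree
    if random.random() < 0.5:
        return (name, mutate(a), b)
    return (name, a, mutate(b))

pop = [rand_tree() for _ in range(60)]
best0 = min(fitness(t) for t in pop)
for gen in range(30):
    pop.sort(key=fitness)
    elite = pop[:20]                             # retain fit, discard unfit
    pop = elite + [mutate(random.choice(elite)) for _ in range(40)]
best = min(fitness(t) for t in pop)
```

Because the elite are carried over unchanged each generation, the best fitness is non-increasing; crossover (mating) is omitted here for brevity.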

  16. Genetic programming: fitness function Differential equation discovery pipeline Fitness of an individual: the residual of the candidate equation evaluated on the fitted functions. Only requires a good curve fit at the evaluation points!
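The residual-based fitness can be illustrated concretely. Suppose the fitted curve is u(x) = e⁻ˣ, which solves u′ + u = 0 exactly; each candidate equation is scored by the mean squared residual of its operator applied to u. No ODE solve is needed, only the fitted function and its derivatives. The candidate set and the finite-difference derivative are illustrative simplifications.

```python
import math

def d(f, x, h=1e-5):
    # central finite difference stands in for autodiff here
    return (f(x + h) - f(x - h)) / (2 * h)

u = lambda x: math.exp(-x)          # stand-in for a fitted curve
xs = [0.1 * i for i in range(1, 30)]

candidates = {
    "u' + u": lambda x: d(u, x) + u(x),        # true governing equation
    "u' - u": lambda x: d(u, x) - u(x),
    "u' + u*u": lambda x: d(u, x) + u(x) ** 2,
}

fitness = {name: sum(r(x) ** 2 for x in xs) / len(xs)
           for name, r in candidates.items()}
best = min(fitness, key=fitness.get)
print(best)   # "u' + u" has the smallest residual
```

The key property from the slide holds: the fitness only touches the fitted function at the evaluation points, so the same curve fit scores every individual in the population.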

  17. Notes Unlike traditional GP approaches, we operate on dependent variable functions, not independent variable data.

  18. Quantification and propagation of epistemic uncertainty

  19. Quantification of epistemic uncertainty We use Bayesian modeling with Gaussian processes to capture epistemic uncertainty. (Could also use Bayesian NNs, etc…)
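A small sketch of why the GP posterior variance serves as epistemic uncertainty: at a fixed test point, the variance shrinks as observations are added. The kernel, length scale, and noise level below are illustrative assumptions, not values from the deck.

```python
import numpy as np

def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def posterior_var(X, x_star, noise=1e-6):
    # Standard GP predictive variance: k** - k*^T (K + noise I)^{-1} k*
    K = rbf(X, X) + noise * np.eye(len(X))
    k_star = rbf(x_star, X)
    prior = rbf(x_star, x_star).diagonal()
    return prior - np.einsum('ij,jk,ik->i', k_star, np.linalg.inv(K), k_star)

x_star = np.array([0.25])
few = np.linspace(0.0, 1.0, 3)      # sparse design
many = np.linspace(0.0, 1.0, 15)    # dense design (superset of the sparse one)
v_few = posterior_var(few, x_star)[0]
v_many = posterior_var(many, x_star)[0]
print(v_few, v_many)                # variance shrinks as data are added
```

This is the quantity that gets propagated through the composed operators, so that competing candidate equations can be compared under uncertainty rather than by point estimates alone.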

  20. Examples

  21. Overview Examples

  22. First-order ODE Examples Ground truth: .

  23. 1D Elliptic problem (heterogeneous conductivity) Examples Ground truth: .

  24. 1D Elliptic problem (nonlinear conductivity) Examples Ground truth: .

  25. 2D Elliptic problem (heterogeneous conductivity) Examples Ground truth: . • Data taken from [Atkinson and Zabaras, 2019]

  26. Effect of data on fitness uncertainty As more data become available, epistemic uncertainty is reduced and it becomes easier to distinguish between competing hypotheses.

  27. Adaptive data acquisition Using elliptic.Heterogeneous. Acquisition function: residual of the fittest individual. Iterate until the residual falls below a tolerance.

  28. Investigating poorly-explained regions reduces curve-fitting uncertainty!

  29. Conclusions

  30. Conclusions Demonstrated a method to uncover governing differential equations from data. Results are human-readable and compatible with traditional theoretical and simulation methods (including physics-informed ML).

  31. Code Gaussian processes with PyTorch: pip install gptorch. PIRATE (Physics-Informed Research Assistant for Theory Extraction): coming soon.

  32. Thank you This work was funded through DARPA's Artificial Intelligence Exploration (AIE) PA-18-02-02, Artificial Intelligence Research Assistant (AIRA), agreement no. HR00111990032.
