
Understanding Bayesian Methods for Parameter Estimation

Explore Bayesian methods for parameter estimation and data assimilation with crop models. Learn about likelihood functions, prior distributions, and their role in estimating parameters. Discover examples and comparisons between Bayesian and frequentist approaches.



Presentation Transcript


  1. October 2006. Bayesian methods for parameter estimation and data assimilation with crop models. Part 2: Likelihood function and prior distribution. David Makowski and Daniel Wallach, INRA, France.

  2. Previously • Notions in probability - Joint probability - Conditional probability - Marginal probability • Bayes’ theorem

  3. Objectives of part 2 • Introduce the notion of prior distribution. • Introduce the notion of likelihood function. • Show how to estimate parameters with a Bayesian method.

  4. Part 2: Likelihood function and prior distributions Estimation of parameters (θ) Parameter = numerical value not calculated by the model and not observed. Information available to estimate parameters: - A set of observations (y). - Prior knowledge about parameter values.

  5. Part 2: Likelihood function and prior distributions Two distributions in Bayes’ theorem • Prior parameter distribution = probability distribution describing our initial knowledge about parameter values. • Likelihood function = function relating data to parameters.
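
For reference, Bayes' theorem (recalled in Part 1) combines these two distributions into the posterior distribution P(θ | y), where P(θ) is the prior and P(y | θ) the likelihood:

```latex
P(\theta \mid y) = \frac{P(y \mid \theta)\, P(\theta)}{P(y)} \propto P(y \mid \theta)\, P(\theta)
```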

  6. Part 2: Likelihood function and prior distributions Diagram: prior information about parameter values and measurements are combined by the Bayesian method into combined information about the parameters.

  7. Part 2: Likelihood function and prior distributions Example Estimation of crop yield θ by combining a measurement with expert knowledge. Diagram: a field with unknown yield θ; the expert judges the plot to be about 5 t/ha ± 2; the measurement is y = 9 t/ha ± 1.

  8. Part 2: Likelihood function and prior distributions Example Estimation of crop yield θ by combining a measurement with expert knowledge. • One parameter to estimate: the crop yield θ. • Two types of information available: - A measurement equal to 9 t/ha with a standard error equal to 1 t/ha. - An estimate provided by an expert, equal to 5 t/ha with a standard error equal to 2 t/ha.

  9. Part 2: Likelihood function and prior distributions Prior distribution • It describes our belief about the parameter values before we observe the measurements. • It is based on past studies, expert knowledge, and the literature.

  10. Part 2: Likelihood function and prior distributions Example (continued) Definition of a prior distribution θ ~ N( µ, τ² ) • Normal probability distribution. • Expected value µ equal to 5 t/ha. • Standard error τ equal to 2 t/ha.
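
A minimal sketch, assuming Python with SciPy (not part of the original slides), of this prior for the yield example; the variable names are illustrative:

```python
# Prior distribution of the yield theta: N(5, 2^2), i.e. mean 5 t/ha, standard error 2 t/ha.
from scipy.stats import norm

prior = norm(loc=5.0, scale=2.0)   # theta ~ N(mu = 5, tau^2 = 2^2)

# Density of the prior at a few candidate yields (t/ha)
for theta in (3.0, 5.0, 7.0, 9.0):
    print(f"p(theta = {theta}) = {prior.pdf(theta):.3f}")
```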

  11. Part 2: Likelihood function and prior distributions Example (continued) Plot of the prior distribution

  12. Part 2: Likelihood function and prior distributions Likelihood function • A likelihood function is a function relating data to parameters. • It is equal to the probability that the measurements would have been observed given some parameter values. • Notation: P(y | θ)

  13. Part 2: Likelihood function and prior distributions Example (continued) Statistical model: y = θ + ε with ε ~ N( 0, σ² ), so y | θ ~ N( θ, σ² ).

  14. Part 2: Likelihood function and prior distributions Example (continued) Definition of a likelihood function y | θ ~ N( θ, σ² ) • Normal probability distribution. • Measurement y assumed unbiased and equal to 9 t/ha. • Standard error σ assumed equal to 1 t/ha.
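
A minimal sketch, again assuming SciPy, of this likelihood viewed as a function of θ for the fixed observation y = 9 t/ha:

```python
# Likelihood P(y | theta): density of the observation y under N(theta, sigma^2).
from scipy.stats import norm

y, sigma = 9.0, 1.0   # measurement and its standard error (t/ha)

def likelihood(theta):
    """P(y | theta) for the fixed observation y."""
    return norm(loc=theta, scale=sigma).pdf(y)

# The likelihood is largest for theta values close to the observation
for theta in (7.0, 8.0, 9.0, 10.0):
    print(f"L(theta = {theta}) = {likelihood(theta):.3f}")
```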

  15. Part 2: Likelihood function and prior distributions Example (continued) Plot of the likelihood function, with the maximum likelihood estimate indicated.

  16. Part 2: Likelihood function and prior distributions Maximum likelihood Likelihood functions are also used by frequentists to implement the maximum likelihood method. The maximum likelihood estimator is the value of θ maximizing P(y | θ).
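
A short sketch of this idea on the same example, assuming SciPy; for this one-observation normal model the maximum of P(y | θ) is reached at θ = y = 9 t/ha:

```python
# Maximum likelihood: find the theta that maximizes P(y | theta),
# i.e. minimizes the negative log-likelihood.
from scipy.optimize import minimize_scalar
from scipy.stats import norm

y, sigma = 9.0, 1.0

def neg_log_lik(theta):
    """Negative log of P(y | theta) under the normal model of the slides."""
    return -norm(loc=theta, scale=sigma).logpdf(y)

result = minimize_scalar(neg_log_lik)
print(f"Maximum likelihood estimate: {result.x:.2f} t/ha")  # about 9.0, the measurement itself
```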

  17. Part 2: Likelihood function and prior distributions Plot of the likelihood function and of the prior probability distribution.

  18. Part 2: Likelihood function and prior distributions Example (continued) Analytical expression of the posterior distribution: θ | y ~ N( µpost, σpost² ).
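
For the normal prior and normal likelihood of this example, the standard conjugate result (not reproduced in the transcript) is µpost = (µ/τ² + y/σ²) / (1/τ² + 1/σ²) and σpost² = 1 / (1/τ² + 1/σ²). A minimal sketch of this update in Python, using the values of the example; the variable names are illustrative:

```python
# Normal-normal conjugate update for the yield example.
mu, tau = 5.0, 2.0      # prior mean and standard error (expert knowledge)
y, sigma = 9.0, 1.0     # measurement and its standard error

post_precision = 1.0 / tau**2 + 1.0 / sigma**2           # 1 / sigma_post^2
mu_post = (mu / tau**2 + y / sigma**2) / post_precision  # precision-weighted average
var_post = 1.0 / post_precision

print(f"Posterior mean: {mu_post:.2f} t/ha")  # 8.2, between the prior mean (5) and the measurement (9)
print(f"Posterior variance: {var_post:.2f}")  # 0.8, smaller than both tau^2 = 4 and sigma^2 = 1
```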

  19. Part 2: Likelihood function and prior distributions Plot of the posterior probability distribution together with the prior probability distribution and the likelihood function.

  20. Part 2: Likelihood function and prior distributions Example (continued) Discussion of the posterior distribution • The result is a probability distribution (the posterior distribution). • The posterior mean is intermediate between the prior mean and the observation. • The weight of each depends on the prior variance and the measurement error. • The posterior variance is lower than both the prior variance and the measurement error variance. • An estimate is obtained even though only one data point was used.

  21. Part 2: Likelihood function and prior distributions Frequentist versus Bayesian Bayesian analysis introduces an element of subjectivity: the prior distribution. But its representation of the uncertainty is easy to understand: - the uncertainty is assessed conditionally on the observations, - the calculations are straightforward once the posterior distribution is known.

  22. Part 2: Likelihood function and prior distributions Which is better? Bayesian methods often lead to - more realistic estimated parameter values, - in some cases, more accurate model predictions. Problems arise when the prior information is wrong and one has strong confidence in it.

  23. Part 2: Likelihood function and prior distributions Difficulties in estimating crop model parameters Which likelihood function? - Unbiased errors? - Independent errors? Which prior distribution? - What do the parameters really represent? - Level of uncertainty? - Symmetric distribution?

  24. Part 2: Likelihood function and prior distributions Practical considerations • The analytical expression of the posterior distribution can be derived for simple applications. • For complex problems, the posterior distribution must be approximated.
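
As an illustration of what 'approximated' can mean in practice (the next part presents importance sampling; the grid evaluation below is only a sketch, not the course's algorithm), assuming NumPy and SciPy and reusing the yield example:

```python
# Grid-based approximation of the posterior (illustration only): evaluate
# prior x likelihood on a grid of candidate theta values and normalize numerically.
import numpy as np
from scipy.stats import norm

mu, tau = 5.0, 2.0      # prior: theta ~ N(mu, tau^2)
y, sigma = 9.0, 1.0     # likelihood: y | theta ~ N(theta, sigma^2)

theta_grid = np.linspace(0.0, 15.0, 1501)
unnormalized = norm(mu, tau).pdf(theta_grid) * norm(theta_grid, sigma).pdf(y)
weights = unnormalized / unnormalized.sum()   # discrete approximation of the posterior

print(f"Approximate posterior mean: {(theta_grid * weights).sum():.2f} t/ha")  # close to 8.2
```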

  25. Next part: Importance sampling, an algorithm to approximate the posterior probability distribution.
