The Uniform Prior and the Laplace Correction
Supplemental material (not on the exam)
Bayesian Inference
We start with:
• P(θ) - the prior distribution over the values of θ
• P(x1, …, xn | θ) - the likelihood of the examples given a known value of θ
Given examples x1, …, xn, we can compute the posterior distribution over θ:
$$P(\theta \mid x_1, \ldots, x_n) = \frac{P(x_1, \ldots, x_n \mid \theta)\, P(\theta)}{P(x_1, \ldots, x_n)}$$
where the marginal likelihood is
$$P(x_1, \ldots, x_n) = \int P(x_1, \ldots, x_n \mid \theta)\, P(\theta)\, d\theta$$
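The same computation can be approximated numerically. Below is a minimal sketch (not part of the original slides) that discretizes θ on a grid, assumes a uniform prior and a small hypothetical coin-flip sample, and normalizes likelihood × prior by the marginal likelihood:

```python
import numpy as np

# Minimal sketch: grid approximation of the posterior for a coin-flip model.
# The data and grid resolution are hypothetical, chosen only for illustration.
theta = np.linspace(0.0005, 0.9995, 1000)    # grid over the parameter theta
d_theta = theta[1] - theta[0]

prior = np.ones_like(theta)                  # uniform prior: P(theta) = 1 on (0, 1)

data = [1, 0, 1, 1, 0]                       # 1 = heads, 0 = tails (hypothetical sample)
h, n = sum(data), len(data)

likelihood = theta**h * (1.0 - theta)**(n - h)      # P(x_1, ..., x_n | theta)
marginal = np.sum(likelihood * prior) * d_theta     # P(x_1, ..., x_n) via a Riemann sum
posterior = likelihood * prior / marginal           # P(theta | x_1, ..., x_n)

print("marginal likelihood ~", marginal)
print("posterior integrates to ~", np.sum(posterior) * d_theta)
```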
Binomial Distribution: Laplace Estimate
• In this case the unknown parameter is θ = P(H).
• Simplest prior: P(θ) = 1 for 0 < θ < 1 (the uniform prior).
• Likelihood:
$$P(x_1, \ldots, x_n \mid \theta) = \theta^h (1-\theta)^{n-h}$$
where h is the number of heads in the sequence.
• Marginal likelihood:
$$P(x_1, \ldots, x_n) = \int_0^1 \theta^h (1-\theta)^{n-h}\, d\theta$$
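As a quick sanity check (not in the original slides), take the smallest non-trivial case n = 2, h = 1:
$$\int_0^1 \theta (1-\theta)\, d\theta = \frac{1}{2} - \frac{1}{3} = \frac{1}{6}$$
which already matches the general closed form $\frac{1}{(n+1)\binom{n}{h}} = \frac{1}{3 \cdot 2}$ derived on the next slides.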
Marginal Likelihood
Using integration by parts (with $u = (1-\theta)^{n-h}$ and $dv = \theta^h\, d\theta$) we have, for h < n:
$$\int_0^1 \theta^h (1-\theta)^{n-h}\, d\theta = \frac{n-h}{h+1} \int_0^1 \theta^{h+1} (1-\theta)^{n-h-1}\, d\theta$$
Multiplying both sides by $\binom{n}{h}$, and noting that $\binom{n}{h}\frac{n-h}{h+1} = \binom{n}{h+1}$, we have
$$\binom{n}{h} \int_0^1 \theta^h (1-\theta)^{n-h}\, d\theta = \binom{n}{h+1} \int_0^1 \theta^{h+1} (1-\theta)^{n-h-1}\, d\theta$$
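A quick numerical check of this invariant (an illustration, not the slides' own code): for a fixed n, the quantity $\binom{n}{h}\int_0^1 \theta^h (1-\theta)^{n-h}\, d\theta$ should come out the same for every h.

```python
import math
import numpy as np

# Verify that comb(n, h) * integral of theta^h * (1-theta)^(n-h) over [0, 1]
# does not depend on h, which is the invariant the recursion relies on.
n = 6                                       # hypothetical sample size
theta = np.linspace(0.0, 1.0, 200001)
d_theta = theta[1] - theta[0]

for h in range(n + 1):
    integrand = theta**h * (1.0 - theta)**(n - h)
    integral = np.sum(integrand) * d_theta
    print(h, math.comb(n, h) * integral)    # approximately the same value for every h
```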
Marginal Likelihood - Cont.
• Applying this identity repeatedly, the recursion terminates when h = n:
$$\binom{n}{n} \int_0^1 \theta^n\, d\theta = \frac{1}{n+1}$$
Thus
$$P(x_1, \ldots, x_n) = \int_0^1 \theta^h (1-\theta)^{n-h}\, d\theta = \frac{1}{(n+1)\binom{n}{h}}$$
We conclude that the posterior is
$$P(\theta \mid x_1, \ldots, x_n) = (n+1)\binom{n}{h}\, \theta^h (1-\theta)^{n-h}$$
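The closed form is easy to confirm numerically. The sketch below (an illustration, not the slides' own code) compares a Riemann-sum estimate of the marginal likelihood with $\frac{1}{(n+1)\binom{n}{h}}$ and checks that the resulting posterior integrates to 1; the counts h and n are hypothetical.

```python
import math
import numpy as np

# Verify the closed-form marginal likelihood and the resulting posterior.
n, h = 10, 3                                  # hypothetical counts: 3 heads in 10 flips
theta = np.linspace(0.0, 1.0, 200001)
d_theta = theta[1] - theta[0]

likelihood = theta**h * (1.0 - theta)**(n - h)
marginal_numeric = np.sum(likelihood) * d_theta
marginal_closed = 1.0 / ((n + 1) * math.comb(n, h))
print(marginal_numeric, marginal_closed)      # the two agree closely

posterior = (n + 1) * math.comb(n, h) * likelihood
print(np.sum(posterior) * d_theta)            # ~ 1: the posterior is properly normalized
```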
Bayesian Prediction
• How do we predict using the posterior?
• We can think of this as computing the probability of the next element in the sequence.
• Assumption: if we know θ, the probability of X_{n+1} is independent of X_1, …, X_n, so
$$P(x_{n+1} \mid x_1, \ldots, x_n) = \int_0^1 P(x_{n+1} \mid \theta, x_1, \ldots, x_n)\, P(\theta \mid x_1, \ldots, x_n)\, d\theta = \int_0^1 P(x_{n+1} \mid \theta)\, P(\theta \mid x_1, \ldots, x_n)\, d\theta$$
Bayesian Prediction - Cont.
• Plugging in the posterior, and applying the same closed-form integral with n+1 tosses and h+1 heads:
$$P(X_{n+1} = H \mid x_1, \ldots, x_n) = \int_0^1 \theta\, (n+1)\binom{n}{h}\, \theta^h (1-\theta)^{n-h}\, d\theta = \frac{(n+1)\binom{n}{h}}{(n+2)\binom{n+1}{h+1}}$$
• Thus, we conclude that
$$P(X_{n+1} = H \mid x_1, \ldots, x_n) = \frac{h+1}{n+2}$$
• This is the Laplace correction: it behaves like the frequency estimate h/n with one extra "imaginary" head and one extra "imaginary" tail added to the counts.
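To illustrate the effect (a hypothetical example, not from the slides), the closed form (h+1)/(n+2) can be checked against direct numerical integration of the predictive probability and compared with the frequency estimate h/n, which is undefined or extreme for very small samples:

```python
import math
import numpy as np

def laplace_estimate(h: int, n: int) -> float:
    """Bayesian predictive probability of heads under a uniform prior."""
    return (h + 1) / (n + 2)

# Check the closed form against direct integration of
# theta * posterior(theta) over [0, 1].
theta = np.linspace(0.0, 1.0, 200001)
d_theta = theta[1] - theta[0]

for h, n in [(0, 0), (0, 3), (3, 3), (7, 10)]:        # hypothetical counts
    posterior = (n + 1) * math.comb(n, h) * theta**h * (1.0 - theta)**(n - h)
    predictive = np.sum(theta * posterior) * d_theta  # P(X_{n+1} = H | data)
    mle = h / n if n > 0 else float("nan")            # frequency estimate, when defined
    print(h, n, round(float(predictive), 4), laplace_estimate(h, n), mle)
```

For small samples, (h+1)/(n+2) stays strictly between 0 and 1 even when every observed toss came up the same way, which is the practical point of the correction.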