The Likelihood Function - Introduction
• Recall: a statistical model for some data is a set of distributions, one of which corresponds to the true unknown distribution that produced the data.
• The distribution fθ can be either a probability density function or a probability mass function.
• The joint probability density function or probability mass function of iid random variables X1, …, Xn is
fθ(x1, …, xn) = fθ(x1) · fθ(x2) ··· fθ(xn) = ∏ i=1..n fθ(xi).
week 4
The Likelihood Function
• Let x1, …, xn be sample observations taken on the corresponding random variables X1, …, Xn, whose distribution depends on a parameter θ. The likelihood function, defined on the parameter space Ω, is given by
L(θ | x1, …, xn) = fθ(x1, …, xn) = ∏ i=1..n fθ(xi), θ ∈ Ω.
• Note that for the likelihood function we fix the data, x1, …, xn, and vary the value of the parameter.
• The value L(θ | x1, …, xn) is called the likelihood of θ. It is the probability of observing the data values we observed given that θ is the true value of the parameter. It is not the probability of θ given that we observed x1, …, xn.
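A minimal sketch of this definition: for a fixed iid Bernoulli sample (a hypothetical data set chosen here for illustration), the likelihood is the product of the pmf values, evaluated at several candidate values of θ.

```python
def bernoulli_likelihood(theta, data):
    # L(theta | x1..xn) = prod over i of theta^xi * (1 - theta)^(1 - xi)
    # The data are fixed; theta is the variable.
    L = 1.0
    for x in data:
        L *= theta**x * (1 - theta)**(1 - x)
    return L

data = [1, 0, 0, 1, 0]  # fixed observed sample (hypothetical)
for theta in (0.2, 0.4, 0.6):
    print(theta, bernoulli_likelihood(theta, data))
```

Running this shows that different parameter values assign different likelihoods to the same fixed sample.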
Examples
• Suppose we toss a coin n = 10 times and observe 4 heads. With no knowledge whatsoever about the probability of getting a head on a single toss, the appropriate statistical model for the data is the Binomial(10, θ) model. The likelihood function is given by
L(θ | 4) = C(10, 4) θ⁴ (1 − θ)⁶, 0 ≤ θ ≤ 1.
• Suppose X1, …, Xn is a random sample from an Exponential(θ) distribution (with θ as the rate parameter, so fθ(x) = θe^(−θx)). The likelihood function is
L(θ | x1, …, xn) = ∏ i=1..n θe^(−θxi) = θⁿ e^(−θ Σ xi), θ > 0.
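The coin-toss likelihood above can be evaluated numerically; a grid search over θ (a sketch, not part of the original slides) locates the value that makes the observed 4 heads in 10 tosses most likely.

```python
from math import comb

def binom_likelihood(theta, n=10, k=4):
    # L(theta | k) = C(n, k) * theta^k * (1 - theta)^(n - k)
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Evaluate the likelihood on a grid of candidate theta values
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=binom_likelihood)
print(best)  # the sample proportion 4/10
```

The grid maximum lands at θ = 0.4, the sample proportion of heads, anticipating the maximum likelihood estimator discussed next.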
Maximum Likelihood Estimators
• In the likelihood function, different values of θ attach different probabilities to a particular observed sample.
• The likelihood function, L(θ | x1, …, xn), can be maximized over θ to give the parameter value that attaches the highest possible probability to the observed sample.
• We can therefore maximize the likelihood function to find an estimator of θ.
• This estimator is a statistic: it is a function of the sample data. It is denoted by θ̂ and called the maximum likelihood estimator (MLE).
The log likelihood function
• l(θ) = ln(L(θ)) is the log likelihood function.
• Because ln is strictly increasing, the likelihood function and the log likelihood function attain their maximum at the same value of θ.
• It is often easier to maximize l(θ).
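As a sketch of this point, here is a numeric maximization of the exponential log likelihood l(θ) = n ln θ − θ Σ xi (rate parameterization, hypothetical data), compared with the closed-form answer θ̂ = 1/x̄ obtained by calculus.

```python
import math

data = [0.5, 1.2, 0.8, 2.0, 1.0]  # hypothetical sample
n, s = len(data), sum(data)

def log_lik(theta):
    # l(theta) = n*ln(theta) - theta * sum(x_i), Exponential rate parameterization
    return n * math.log(theta) - theta * s

# Maximize numerically over a fine grid of theta values
grid = [i / 1000 for i in range(1, 5000)]
numeric_mle = max(grid, key=log_lik)
closed_form = n / s  # calculus answer: theta_hat = 1 / x_bar
print(numeric_mle, closed_form)
```

Working with l(θ) turns the product in L(θ) into a sum, which is both numerically stabler and easier to differentiate.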
Examples
Properties of MLE
• The MLE is invariant: if θ̂ is the MLE of θ, then for a function g the MLE of g(θ) is g(θ̂), i.e., the function g evaluated at the MLE.
• Proof:
• Examples:
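The invariance property can be checked numerically (a sketch, not the slides' proof): maximize the Binomial(10, θ) likelihood for the observed 4 heads once over θ and once over the reparameterization η = θ², and compare the results.

```python
from math import comb, sqrt

def lik(theta, n=10, k=4):
    # Binomial(10, theta) likelihood for the observed 4 heads
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

theta_grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(theta_grid, key=lik)

# Reparameterize by eta = theta^2 and maximize the induced likelihood
eta_grid = [i / 10000 for i in range(1, 10000)]
eta_hat = max(eta_grid, key=lambda e: lik(sqrt(e)))

print(theta_hat, eta_hat)  # eta_hat agrees with theta_hat squared
```

The maximizer in the η parameterization is θ̂², exactly as invariance predicts: reparameterizing the model does not change which distribution the MLE picks out.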
Important Comment
• Some MLEs cannot be determined using calculus. This occurs whenever the support of the distribution depends on the parameter θ.
• These are best found by graphing the likelihood function and inspecting where it is largest.
• Example:
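A standard instance of this situation (supplied here as an illustration, since the slide's example is not shown) is the Uniform(0, θ) model: the likelihood is θ⁻ⁿ for θ ≥ max(xi) and 0 otherwise, so it jumps at max(xi) and then decreases, and no derivative condition applies. A sketch:

```python
def uniform_likelihood(theta, data):
    # L(theta) = theta^(-n) if every xi <= theta, else 0
    # (the support [0, theta] depends on the parameter)
    if theta < max(data):
        return 0.0
    return theta ** (-len(data))

data = [0.3, 0.7, 0.55]  # hypothetical sample
grid = [i / 1000 for i in range(1, 2001)]
mle = max(grid, key=lambda t: uniform_likelihood(t, data))
print(mle)  # equals max(data) = 0.7
```

Plotting (or, as here, scanning) the likelihood shows it is maximized at the sample maximum, θ̂ = max(xi), a point where the function is not differentiable.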