This article discusses maximum likelihood estimation (MLE) of unknown parameters of Gamma random variables. It covers the difficulties that can arise: likelihood equations with multiple or no solutions, non-differentiable likelihoods, and likelihoods that are too complicated to maximize explicitly. It also works through fitting data with a linear model and deriving maximum likelihood estimators for several parameters.
Let X1, …, Xn be i.i.d. Gamma random variables with unknown parameters α and β. Determine the ML estimators for α and β.
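A sketch of the derivation, assuming the shape–rate parameterization with density f(x; α, β) = β^α x^(α−1) e^(−βx) / Γ(α) (the parameter names α, β are a conventional choice, not fixed by the original statement):
\[
\ell(\alpha,\beta) = n\alpha\log\beta + (\alpha-1)\sum_{i=1}^n \log x_i - \beta\sum_{i=1}^n x_i - n\log\Gamma(\alpha).
\]
Setting \(\partial\ell/\partial\beta = n\alpha/\beta - \sum_i x_i = 0\) gives \(\hat\beta = \hat\alpha/\bar x\). Substituting back, \(\hat\alpha\) must solve
\[
\log\hat\alpha - \psi(\hat\alpha) = \log\bar x - \frac{1}{n}\sum_{i=1}^n \log x_i,
\]
where \(\psi\) is the digamma function. This equation has no closed-form solution and is typically solved numerically.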
In general, the (log-)likelihood equations can have more than one solution, or no solution at all. Moreover, the (log-)likelihood function may not even be differentiable, or it may be too complicated to maximize explicitly.
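As a concrete illustration of that last point, the Gamma shape equation above can be solved numerically. Below is a minimal sketch (not part of the original exercise); scipy, the simulated data, and the parameter values alpha_true, beta_true are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import digamma

# Simulated Gamma data (hypothetical shape/rate values, for illustration only)
rng = np.random.default_rng(0)
alpha_true, beta_true = 3.0, 2.0
x = rng.gamma(shape=alpha_true, scale=1.0 / beta_true, size=500)

n = len(x)
xbar = x.mean()

# Profiling out beta (beta = alpha / xbar) leaves a single equation in alpha:
#   log(alpha) - digamma(alpha) = log(xbar) - mean(log x),
# which has no closed-form solution, so we find its root numerically.
c = np.log(xbar) - np.log(x).mean()

def profile_score(alpha):
    return np.log(alpha) - digamma(alpha) - c

alpha_hat = brentq(profile_score, 1e-6, 1e6)  # monotone in alpha, so one root
beta_hat = alpha_hat / xbar

print(f"alpha_hat = {alpha_hat:.3f}, beta_hat = {beta_hat:.3f}")
```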
We wish to fit data (x1, y1), …, (xn, yn) with the linear model Yj = 1 + βxj + ej, where ej ~ N(0, σ²).
• Find the maximum likelihood estimator for β.
• Either prove or disprove unbiasedness of the estimator you computed.
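A sketch of the derivation for this model (intercept fixed at 1, as stated):
\[
\ell(\beta,\sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{j=1}^n (y_j - 1 - \beta x_j)^2,
\]
so maximizing over \(\beta\) amounts to minimizing \(\sum_j (y_j - 1 - \beta x_j)^2\), which gives
\[
\hat\beta = \frac{\sum_{j=1}^n x_j (y_j - 1)}{\sum_{j=1}^n x_j^2}.
\]
For unbiasedness, note \(E[Y_j - 1] = \beta x_j\), hence
\[
E[\hat\beta] = \frac{\sum_j x_j \,\beta x_j}{\sum_j x_j^2} = \beta,
\]
so the estimator is unbiased.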
Let X1, …, Xn be a random sample from N(0, θ).
• (a) Obtain an estimator of 1/θ using the method of moments.
• (b) Find the likelihood function and use it to obtain the maximum likelihood estimator of 1/θ.
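A sketch of the solution, assuming θ denotes the variance of the normal distribution. Since \(E[X^2] = \theta\), equating the second sample moment to its population value gives
\[
\hat\theta_{\mathrm{MM}} = \frac{1}{n}\sum_{i=1}^n X_i^2, \qquad \widehat{1/\theta}_{\mathrm{MM}} = \frac{n}{\sum_{i=1}^n X_i^2}.
\]
The likelihood is
\[
L(\theta) = (2\pi\theta)^{-n/2}\exp\!\Big(-\frac{1}{2\theta}\sum_{i=1}^n X_i^2\Big),
\]
and setting \(\frac{d\log L}{d\theta} = -\frac{n}{2\theta} + \frac{1}{2\theta^2}\sum_i X_i^2 = 0\) gives \(\hat\theta = \frac{1}{n}\sum_i X_i^2\). By the invariance property of the MLE, \(\widehat{1/\theta} = n/\sum_i X_i^2\), which here coincides with the moment estimator.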