Estimation

• Samples are collected to estimate characteristics of the population of particular interest.
• Parameter – a numerical characteristic of the population (e.g. μ, σ)
• Distributional characteristics – the pdf and cdf
• Statistic – a numerical characteristic of the sample, used to estimate parameters
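As a quick illustration, the sketch below draws a sample from a hypothetical Normal population (the values μ = 10, σ = 2, the seed, and the sample size are assumed for illustration) and computes the statistics that estimate those parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: Normal with mu = 10, sigma = 2 (assumed values).
mu, sigma = 10.0, 2.0
sample = rng.normal(mu, sigma, size=50)   # a random sample of size n = 50

# Statistics computed from the sample estimate the population parameters.
x_bar = sample.mean()                     # estimates mu
s = sample.std(ddof=1)                    # estimates sigma (n - 1 denominator)
print(f"x_bar = {x_bar:.3f} (mu = {mu}),  s = {s:.3f} (sigma = {sigma})")
```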
Point Estimation

• A sample statistic is often used to estimate and draw conclusions about a population parameter (θ).
• The sample statistic is called a point estimator of θ.
• For a particular sample, the calculated value of the statistic is called a point estimate of θ.
Point Estimation

• Let X1, X2, …, Xn be a random sample of size n from the population of interest, and let Y = u(X1, X2, …, Xn) be a statistic used to estimate θ.
• Then Y is called an estimator of θ.
• A specific value of the estimator, y = u(x1, x2, …, xn), is called an estimate of θ.
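The estimator/estimate distinction shows up clearly in simulation: the function u is fixed, but each new sample yields a different realized estimate. A minimal sketch, with the sample mean as an illustrative choice of u (population and sample size assumed):

```python
import numpy as np

rng = np.random.default_rng(1)

# The estimator is Y = u(X1, ..., Xn); here u is the sample mean (illustrative choice).
def u(sample):
    return sample.mean()

# Each new random sample yields a different realized estimate y = u(x1, ..., xn).
for trial in range(3):
    sample = rng.normal(10.0, 2.0, size=25)
    print(f"trial {trial}: y = {u(sample):.3f}")
```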
Estimator

• Discussion is limited to random variables whose pdf has a known functional form.
• The pdf typically depends on an unknown parameter θ, which can take on any value in the parameter space Ω, i.e. f(x; θ).
• It is often necessary to pick one member of this family as most likely to be true, i.e. to pick the "best" value of θ for f(x; θ).
• The best estimator can depend on the distribution being sampled.
Properties of an Estimator

• If E[Y] = θ, then the statistic Y is called an unbiased estimator of θ. Otherwise, it is said to be biased.
• E[Y - θ] is the bias of an estimator.
• In many cases, the "best" estimator is an unbiased estimator.
Properties of an Estimator

• Another important property is small variance. If two estimators are both unbiased, we prefer the one with smaller variance.
• Minimize E[(Y - θ)²] = Var[Y - θ] + (E[Y - θ])² = Var[Y] + (bias)²
• The estimator Y that minimizes E[(Y - θ)²] is said to have minimum mean square error (MSE).
• If we consider only unbiased estimators, the statistic Y that minimizes the MSE is called the minimum variance unbiased estimator (MVUE).
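A Monte Carlo sketch of these ideas, comparing two estimators of the population variance θ = σ² (the n and n - 1 denominators); the Normal population, sample size, and replication count are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 0.0, 1.0, 10, 100_000    # assumed simulation settings
theta = sigma**2                              # parameter: population variance

samples = rng.normal(mu, sigma, size=(reps, n))
y_biased = samples.var(axis=1, ddof=0)        # divides by n     -> biased
y_unbiased = samples.var(axis=1, ddof=1)      # divides by n - 1 -> unbiased

for name, y in [("ddof=0", y_biased), ("ddof=1", y_unbiased)]:
    bias = y.mean() - theta
    mse = np.mean((y - theta) ** 2)           # estimates E[(Y - theta)^2]
    print(f"{name}: bias = {bias:+.4f}, MSE = {mse:.4f}")
```

Note that for a Normal population the biased (ddof=0) estimator actually shows the smaller MSE here, which is why "best" was hedged above: the minimum-MSE estimator need not be unbiased.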
Properties of an Estimator

• The efficiency of an estimator θ̂1 compared to another estimator θ̂2 is equal to the ratio of their mean square errors:
  eff(θ̂1, θ̂2) = MSE(θ̂2) / MSE(θ̂1)
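Under the MSE-ratio definition above, a simulation sketch comparing the sample mean and sample median as estimators of a Normal mean (the population, sample size, and replication count are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 0.0, 25, 100_000             # assumed: Normal(0, 1) population

samples = rng.normal(theta, 1.0, size=(reps, n))
mse_mean = np.mean((samples.mean(axis=1) - theta) ** 2)
mse_median = np.mean((np.median(samples, axis=1) - theta) ** 2)

# Efficiency of the mean relative to the median: MSE(median) / MSE(mean).
# Roughly 1.5 here; the asymptotic value for Normal data is pi/2 ~ 1.57.
print(f"relative efficiency ~ {mse_median / mse_mean:.2f}")
```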
Method of Maximum Likelihood

• An important method for finding an estimator.
• Let X1, X2, …, Xn be a random sample of size n from f(x; θ).
• The likelihood function is the joint pdf of X1, X2, …, Xn evaluated at the observed values x1, x2, …, xn and viewed as a function of the parameter of interest:
  L(θ) = f(x1, x2, …, xn; θ) = f(x1; θ) f(x2; θ) ⋯ f(xn; θ)
  This is the probability (density) of observing x1, x2, …, xn if the pdf is f(x; θ).
• The value of θ that maximizes L(θ) is the value of θ most likely to have produced x1, x2, …, xn.
Maximum Likelihood Estimator

• The maximum likelihood estimator (MLE) of θ is found by setting the derivative of L(θ) with respect to θ equal to zero and solving for θ.
• The MLE can also be found by maximizing the natural log of L(θ), which is often easier to differentiate.
• For more than one parameter, the maximum likelihood equations are formed and solved simultaneously to arrive at the MLEs.
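A sketch for an Exponential(λ) sample, where setting d/dλ log L(λ) = 0 gives the closed form λ̂ = 1/x̄; a numeric maximization (using SciPy, assumed available, with illustrative true rate and sample size) confirms it:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
lam_true = 2.0                                # assumed true rate parameter
x = rng.exponential(1.0 / lam_true, size=200)

# Log-likelihood of an Exponential(lam) sample:
#   log L(lam) = n*log(lam) - lam * sum(x)
def neg_log_lik(lam):
    return -(len(x) * np.log(lam) - lam * x.sum())

# d/dlam log L = n/lam - sum(x) = 0  =>  lam_hat = 1 / x_bar
closed_form = 1.0 / x.mean()
numeric = minimize_scalar(neg_log_lik, bounds=(1e-6, 100.0), method="bounded").x
print(f"closed form = {closed_form:.4f}, numeric = {numeric:.4f}")
```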
Invariance Property

• If t is the MLE of θ and u(θ) is a function of θ, then the MLE of u(θ) is u(t).
• Plug the MLE(s) into the function to get the MLE of the function.
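Continuing the exponential example (regenerated here so the sketch is self-contained; true rate, sample size, and the threshold t are assumed for illustration), invariance gives MLEs of the mean 1/λ and a tail probability by plugging in λ̂:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.exponential(0.5, size=200)            # Exponential sample, true rate 2 (assumed)

lam_hat = 1.0 / x.mean()                      # MLE of the rate lam
# By invariance, plugging lam_hat into any function u(lam) gives the MLE of
# u(lam): e.g. the mean 1/lam and the tail probability P(X > t) = exp(-lam*t).
t = 1.0                                       # hypothetical threshold
print(f"MLE of mean = {1.0 / lam_hat:.4f}, MLE of P(X > {t}) = {np.exp(-lam_hat * t):.4f}")
```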
Method of Moments

• An important method for finding an estimator.
• Let X1, X2, …, Xn be a random sample of size n from f(x). The kth population moment is E[X^k]. The kth sample moment is (1/n) Σ Xi^k, summing over the n observations.
• The method of moments sets the sample moment(s) equal to the corresponding population moment(s) and solves for the parameter(s) of interest.
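A sketch for a two-parameter Gamma(α, θ) sample (shape-scale form; the true values, seed, and sample size are assumed for illustration). Matching the first two moments yields closed-form estimators:

```python
import numpy as np

rng = np.random.default_rng(6)
alpha_true, theta_true = 3.0, 2.0             # assumed Gamma shape and scale
x = rng.gamma(alpha_true, theta_true, size=1000)

# Sample moments: m_k = (1/n) * sum(x_i^k)
m1 = np.mean(x)       # set equal to E[X]   = alpha * theta
m2 = np.mean(x**2)    # set equal to E[X^2] = alpha * theta^2 * (alpha + 1)

# Solving the two moment equations: m2 - m1^2 = alpha * theta^2, so
theta_hat = (m2 - m1**2) / m1
alpha_hat = m1 / theta_hat
print(f"alpha_hat = {alpha_hat:.3f}, theta_hat = {theta_hat:.3f}")
```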