
Estimation



Presentation Transcript


  1. Estimation • Samples are collected to estimate characteristics of the population of particular interest. • Parameter – numerical characteristic of the population (e.g. μ, σ) • Distributional characteristics – pdf and cdf • Statistic – numerical characteristic of the sample used to estimate parameters

  2. Point Estimation • A sample statistic is often used to estimate and draw conclusions about a population parameter (θ). • The sample statistic is called a point estimator of θ. • For a particular sample, the calculated value of the statistic is called a point estimate of θ.

  3. Point Estimation • Let X1, X2, …, Xn be a random sample of size n from the population of interest, and let Y = u(X1, X2, …, Xn) be a statistic used to estimate θ. • Then Y is called an estimator of θ. • A specific value of the estimator, y = u(x1, x2, …, xn), is called an estimate of θ.
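The distinction between an estimator (the function u) and an estimate (its value on one sample) can be sketched in a few lines; the data below are illustrative, not from the presentation.

```python
# The estimator is the function u(X1, ..., Xn); here u is the sample mean,
# a common point estimator of the population mean mu.
def sample_mean(xs):
    return sum(xs) / len(xs)

# Illustrative observed values x1, ..., xn (hypothetical data).
sample = [4.2, 3.9, 5.1, 4.7, 4.4]

# The estimate is the calculated value of the statistic for this sample.
estimate = sample_mean(sample)
print(estimate)  # → 4.46 (up to floating-point rounding)
```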

  4. Estimator • Discussion is limited to random variables whose pdf has a known functional form. • The pdf typically depends on an unknown parameter θ, which can take on any value in the parameter space Ω, i.e. f(x; θ). • It is often necessary to pick one member of the family of pdfs as most likely to be true, i.e. pick the “best” value of θ for f(x; θ). • The best estimator can depend on the distribution being sampled.

  5. Properties of an Estimator • If E[Y] = θ, then the statistic Y is called an unbiased estimator of θ. Otherwise, it is said to be biased. • E[Y − θ] is the bias of an estimator. • In many cases, the “best” estimator is an unbiased estimator.
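A small simulation sketch of bias, assuming a standard normal population (so the true variance is 1): the divide-by-n variance estimator is biased, while the divide-by-(n−1) version is unbiased. The sample size and trial count are arbitrary choices for illustration.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

def var_biased(xs):
    # Divide by n: E[estimator] = (n-1)/n * sigma^2, so it is biased.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def var_unbiased(xs):
    # Divide by n-1: E[estimator] = sigma^2, so it is unbiased.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Approximate each estimator's expectation by averaging over many samples.
n, trials = 5, 20000
b = u = 0.0
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    b += var_biased(xs)
    u += var_unbiased(xs)

print(b / trials)  # near (n-1)/n = 0.8: biased low
print(u / trials)  # near 1.0: unbiased
```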

  6. Properties of an Estimator • Another important property is small variance. If two estimators are both unbiased, we prefer the one with smaller variance. • Minimize E[(Y − θ)²] = Var[Y] + (E[Y − θ])². • The estimator Y that minimizes E[(Y − θ)²] is said to have minimum mean square error (MSE). • If we consider only unbiased estimators, the statistic Y that minimizes the MSE is called the minimum variance unbiased estimator (MVUE).
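The decomposition MSE = variance + bias² can be checked directly on a small discrete example (the sampling distribution below is hypothetical, with each value of Y equally likely):

```python
theta = 5.0                      # true parameter value (assumed)
values = [4.0, 5.0, 5.5, 6.5]    # hypothetical equally likely values of Y

ey = sum(values) / len(values)                           # E[Y]
mse = sum((y - theta) ** 2 for y in values) / len(values)  # E[(Y - theta)^2]
var = sum((y - ey) ** 2 for y in values) / len(values)     # Var[Y]
bias = ey - theta                                          # E[Y - theta]

print(mse)              # → 0.875
print(var + bias ** 2)  # → 0.875, matching the decomposition
```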

  7. Properties of an Estimator • The efficiency of an estimator θ1 compared to another estimator θ2 is equal to the ratio of their mean square errors, MSE(θ2)/MSE(θ1).
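As an illustration of relative efficiency (a simulation sketch, assuming a standard normal population): the sample median and sample mean are both reasonable estimators of the center, but the mean has smaller MSE, so the median's efficiency relative to the mean comes out well below 1 (near the classical asymptotic value 2/π ≈ 0.64).

```python
import random

random.seed(1)  # fixed seed for repeatability

def median(xs):
    s = sorted(xs)
    return s[len(s) // 2]  # middle value; assumes odd n

# True center is 0, so each squared estimate is a squared error.
n, trials = 11, 20000
mse_mean = mse_med = 0.0
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    mse_mean += (sum(xs) / n) ** 2
    mse_med += median(xs) ** 2

# Efficiency of the median compared to the mean: MSE(mean) / MSE(median).
eff = mse_mean / mse_med
print(eff)  # well below 1: the mean is the more efficient estimator here
```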

  8. Method of Maximum Likelihood • An important method for finding an estimator. • Let X1, X2, …, Xn be a random sample of size n from f(x; θ). • The likelihood function is the joint pdf of X1, X2, …, Xn evaluated at the observed values x1, x2, …, xn, viewed as a function of the parameter of interest: L(θ) = f(x1, x2, …, xn; θ) = f(x1; θ) f(x2; θ) ··· f(xn; θ), the probability of observing x1, x2, …, xn if the pdf is f(x; θ). • The value of θ that maximizes L(θ) is the value of θ most likely to have produced x1, x2, …, xn.

  9. Maximum Likelihood Estimator • The maximum likelihood estimator (MLE) of θ is found by setting the derivative of L(θ) with respect to θ equal to zero and solving for θ. • The MLE can also be found by maximizing the natural log of L(θ), which is often easier to differentiate. • For more than one parameter, the maximum likelihood equations are formed and solved simultaneously to arrive at the MLEs.
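A worked sketch for one standard case (not from the slides): for an exponential pdf f(x; θ) = θe^(−θx), setting d/dθ of log L(θ) = n log θ − θ Σxi to zero gives the closed form θ̂ = n / Σxi = 1/x̄. The data below are illustrative.

```python
import math

# Illustrative observed sample from an exponential population.
data = [0.5, 1.2, 0.8, 2.0, 1.5]

def log_likelihood(theta, xs):
    # log L(theta) = sum of log f(x_i; theta) for f(x; theta) = theta*exp(-theta*x)
    return sum(math.log(theta) - theta * x for x in xs)

# Solving d/d(theta) log L(theta) = n/theta - sum(x_i) = 0 gives:
theta_hat = len(data) / sum(data)  # here 5 / 6.0

# Sanity check: the log-likelihood at theta_hat beats nearby values.
assert log_likelihood(theta_hat, data) > log_likelihood(0.9 * theta_hat, data)
assert log_likelihood(theta_hat, data) > log_likelihood(1.1 * theta_hat, data)
print(theta_hat)
```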

  10. Invariance Property • If t is the MLE of θ and u(θ) is a function of θ, then the MLE of u(θ) is u(t). • Plug the MLE(s) into the function to get an MLE estimate of the function.
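Continuing the exponential example as a sketch of invariance: the mean of an exponential distribution is u(θ) = 1/θ, so the MLE of the mean is simply 1/θ̂, which equals the sample mean. The data are the same illustrative values as above.

```python
data = [0.5, 1.2, 0.8, 2.0, 1.5]       # illustrative sample

theta_hat = len(data) / sum(data)      # t, the MLE of the rate theta
mean_hat = 1.0 / theta_hat             # u(t): MLE of u(theta) = 1/theta, by invariance

# The plugged-in value coincides with the sample mean x-bar.
print(mean_hat)                        # → 1.2
print(sum(data) / len(data))           # → 1.2
```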

  11. Method of Moments • An important method for finding an estimator. • Let X1, X2, …, Xn be a random sample of size n from f(x). The kth population moment is E[Xᵏ]. The kth sample moment is (1/n) Σ Xᵢᵏ. • The method of moments involves setting the sample moment(s) equal to the population moment(s) and solving for the parameter(s) of interest.
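A minimal sketch of the method of moments, assuming a Uniform(0, θ) population: the first population moment is E[X] = θ/2, so setting it equal to the first sample moment x̄ and solving gives θ̂ = 2x̄. The sample values are illustrative.

```python
# Illustrative sample assumed to come from a Uniform(0, theta) population.
data = [1.1, 2.8, 0.7, 3.4, 2.0]

m1 = sum(data) / len(data)  # first sample moment, (1/n) * sum(x_i)
theta_hat = 2 * m1          # solve E[X] = theta / 2 = m1 for theta
print(theta_hat)            # → 4.0
```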
