
Confidence Interval & Unbiased Estimator



  1. Confidence Interval & Unbiased Estimator Review and Foreword

  2. Central limit theorem vs. the weak law of large numbers
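For reference, minimal standard statements of the two results this slide asks you to compare (textbook forms, not taken from the slides): both concern the sample mean of i.i.d. variables with mean μ and variance σ², but the weak law asserts convergence in probability of the mean itself, while the CLT describes the limiting distribution of its standardized fluctuations.

```latex
% Weak law of large numbers: the sample mean converges in probability to mu.
\bar{X}_n \xrightarrow{P} \mu,
\quad\text{i.e.}\quad
\lim_{n\to\infty} P\!\left(|\bar{X}_n - \mu| > \varepsilon\right) = 0
\quad\text{for every } \varepsilon > 0.

% Central limit theorem: the standardized sample mean converges
% in distribution to a standard normal.
\frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}} \xrightarrow{d} N(0,1).
```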

  3. Weak law vs. strong law • Personal research • Search on the web or the library • Compare and tell me why
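As a starting point for the comparison exercise, the standard statement of the strong law (again a textbook form, not from the slides): it strengthens the weak law's convergence in probability to almost-sure convergence.

```latex
% Strong law of large numbers: the sample mean converges almost surely,
% which implies the convergence in probability asserted by the weak law.
P\!\left(\lim_{n\to\infty} \bar{X}_n = \mu\right) = 1.
```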

  4. Cont.

  5. Maximum Likelihood estimator • Suppose the i.i.d. random variables X1, X2, …, Xn, whose joint distribution is assumed known except for an unknown parameter θ, are observed and constitute a random sample. • By independence, f(x1, x2, …, xn | θ) = f(x1 | θ)f(x2 | θ)…f(xn | θ). The likelihood f(x1, x2, …, xn | θ) is determined by the observed sample (x1, x2, …, xn) once a value of θ is fixed; the maximum likelihood estimate is the θ that maximizes it. • Differentiate the (log-)likelihood with respect to θ, set the first-order condition equal to zero, and solve in terms of X1, X2, …, Xn to obtain the estimator θ̂.
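A minimal sketch of the recipe above, using an example of my own choosing (not from the slides): the MLE of the rate λ of an exponential distribution. The log-likelihood is l(λ) = n·log(λ) − λ·Σxᵢ; setting the derivative n/λ − Σxᵢ to zero gives the closed form λ̂ = n / Σxᵢ = 1/x̄.

```python
# Hypothetical example: maximum-likelihood estimation of the rate
# parameter lambda of an exponential distribution from an i.i.d. sample.
#
# Log-likelihood:        l(lam) = n*log(lam) - lam*sum(x)
# First-order condition: n/lam - sum(x) = 0  =>  lam_hat = n / sum(x)

def exponential_mle(sample):
    """Closed-form MLE for the exponential rate: 1 / sample mean."""
    return len(sample) / sum(sample)

sample = [0.5, 1.2, 0.7, 2.0, 0.9, 1.1]
lam_hat = exponential_mle(sample)

# Check the first-order condition: the derivative of the
# log-likelihood, evaluated at lam_hat, is (numerically) zero.
n, s = len(sample), sum(sample)
score = n / lam_hat - s
print(lam_hat, score)
```

The same three steps (write the likelihood, differentiate, solve the first-order condition) apply to any parametric family; only the closed form changes.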

  6. Confidence interval

  7. Confidence vs. Probability • Probability describes the distribution of a random variable (here, the random interval before sampling) • Confidence (trust) describes how well a specific sampling outcome approximates the reality (the population parameter)

  8. 100(1-α)% Confidence intervals
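A small sketch of the standard known-variance interval for a mean, x̄ ± z_{α/2}·σ/√n (the numbers below are my own illustration, not the book's):

```python
import math

# 100(1-alpha)% confidence interval for the population mean mu
# when the population standard deviation sigma is known:
#     xbar +/- z_{alpha/2} * sigma / sqrt(n)
# For alpha = 0.05, z_{alpha/2} is approximately 1.96.

def mean_ci(xbar, sigma, n, z=1.96):
    half_width = z * sigma / math.sqrt(n)
    return (xbar - half_width, xbar + half_width)

lo, hi = mean_ci(xbar=50.0, sigma=10.0, n=100)
print(lo, hi)  # 48.04 51.96
```

When σ is unknown and n is small, the same formula is used with the sample standard deviation and a t critical value in place of z.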

  9. 100(1-α)% confidence intervals for (μ1 -μ2)
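The standard known-variance form of this interval (textbook formula, supplied here since the slide body did not survive extraction):

```latex
% 100(1-alpha)% confidence interval for mu_1 - mu_2 with known variances,
% based on independent samples of sizes n_1 and n_2.
(\bar{X} - \bar{Y}) \;\pm\; z_{\alpha/2}
\sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}
```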

  10. Approximate 100(1-α)% confidence intervals for p
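A sketch of the approximate (Wald) interval for a proportion, p̂ ± z_{α/2}·√(p̂(1−p̂)/n), based on the normal approximation to the binomial; the sample counts below are my own example:

```python
import math

# Approximate 100(1-alpha)% confidence interval for a proportion p,
# valid for large n by the normal approximation to the binomial:
#     phat +/- z_{alpha/2} * sqrt(phat*(1-phat)/n)

def proportion_ci(successes, n, z=1.96):
    phat = successes / n
    half_width = z * math.sqrt(phat * (1 - phat) / n)
    return (phat - half_width, phat + half_width)

lo, hi = proportion_ci(successes=40, n=100)
print(lo, hi)
```

The interval is only approximate: its actual coverage can drift from 100(1−α)% when n is small or p̂ is near 0 or 1.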

  11. Unbiased estimators

  12. Linear combination of several unbiased estimators • If d1, d2, d3, …, dn are independent unbiased estimators • then any estimator of the form d = λ1d1 + λ2d2 + λ3d3 + … + λndn with λ1 + λ2 + … + λn = 1 is also unbiased. • The mean square error of any estimator equals its variance plus the square of its bias • r(d, θ) = E[(d(X) − θ)²] = E[(d − E[d])²] + (E[d] − θ)²
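A numerical check, using simulated data of my own construction, of both claims on this slide: a convex combination of unbiased estimators is unbiased, and MSE = variance + bias² (the latter is an algebraic identity, so it holds to floating-point precision):

```python
import random

random.seed(0)
theta = 5.0

# Simulate many replications of two unbiased estimators of theta
# (sample means of different sizes) and combine them with weights
# that sum to 1.
combined = []
for _ in range(10_000):
    d1 = sum(random.gauss(theta, 2.0) for _ in range(5)) / 5    # unbiased
    d2 = sum(random.gauss(theta, 2.0) for _ in range(20)) / 20  # unbiased
    combined.append(0.3 * d1 + 0.7 * d2)  # lambda_1 + lambda_2 = 1

m = sum(combined) / len(combined)
mse = sum((d - theta) ** 2 for d in combined) / len(combined)
var = sum((d - m) ** 2 for d in combined) / len(combined)
bias = m - theta
print(bias)                  # near zero: combination is unbiased
print(mse, var + bias ** 2)  # equal up to floating-point error
```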

  13. The Bayes estimator

  14. The value of additional information • The Bayes estimator • The observed sample revises the prior distribution of θ • The posterior distribution of θ has smaller variance than the prior • Ref. pp.274-275
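A sketch of the idea in the conjugate normal-normal model (my own illustration, not necessarily the book's example): with prior θ ~ N(μ₀, τ₀²) and data Xᵢ ~ N(θ, σ²), the posterior is normal with precision 1/τ₀² + n/σ², so its variance is always smaller than the prior's, which is the "value of additional information" here.

```python
# Conjugate normal-normal update: prior theta ~ N(mu0, tau0_sq),
# data X_i ~ N(theta, sigma_sq), i = 1..n with sample mean xbar.
# Posterior precision = prior precision + data precision, and the
# posterior mean is a precision-weighted average of mu0 and xbar.

def normal_posterior(mu0, tau0_sq, sigma_sq, xbar, n):
    precision = 1.0 / tau0_sq + n / sigma_sq
    post_var = 1.0 / precision
    post_mean = post_var * (mu0 / tau0_sq + n * xbar / sigma_sq)
    return post_mean, post_var

post_mean, post_var = normal_posterior(mu0=0.0, tau0_sq=4.0,
                                       sigma_sq=1.0, xbar=2.5, n=10)
print(post_mean, post_var)  # posterior variance < prior variance 4.0
```

The posterior mean (the Bayes estimator under squared-error loss) is pulled from the prior mean toward the sample mean, more strongly as n grows.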
