Confidence Interval & Unbiased Estimator Review and Foreword
Weak law vs. strong law • Personal research • Search on the web or in the library • Compare the two laws and explain the difference
Maximum Likelihood estimator • Suppose the i.i.d. random variables X1, X2, …, Xn, whose joint distribution is assumed known except for an unknown parameter θ, are observed and constitute a random sample. • By independence the likelihood factors: f(x1,x2,…,xn|θ) = f(x1|θ)f(x2|θ)…f(xn|θ). • The maximum likelihood estimator is the value of θ that makes the observed sample (x1,x2,…,xn) most probable: differentiate the (log-)likelihood with respect to θ, set the first-order condition equal to zero, and solve for θ as a function of X1, X2, …, Xn.
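A minimal sketch of the recipe above, assuming i.i.d. exponential data with an illustrative true rate of 2.0 (the sample and all names are assumptions, not from the slides). The first-order condition n/λ − Σxᵢ = 0 yields the closed-form MLE λ̂ = n/Σxᵢ:

```python
import math
import random

random.seed(0)
true_rate = 2.0  # assumed for illustration
xs = [random.expovariate(true_rate) for _ in range(10_000)]

def log_likelihood(rate, xs):
    # log f(x1,...,xn | rate) = sum_i [log(rate) - rate * x_i]
    return sum(math.log(rate) - rate * x for x in xs)

# d/d(rate) [n*log(rate) - rate*sum(xs)] = n/rate - sum(xs) = 0
# => closed-form MLE: rate_hat = n / sum(xs) = 1 / sample mean
rate_hat = len(xs) / sum(xs)

# Sanity check: the first-order-condition solution beats nearby candidates.
assert log_likelihood(rate_hat, xs) >= log_likelihood(rate_hat * 1.05, xs)
assert log_likelihood(rate_hat, xs) >= log_likelihood(rate_hat * 0.95, xs)
```

With 10,000 observations, rate_hat lands close to the assumed true rate of 2.0, as the consistency of the MLE suggests.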
Confidence vs. Probability • Probability describes the distribution of a random variable (here, the random interval before the sample is drawn) • Confidence (trust) describes how often the interval-construction procedure captures the true population parameter; once a specific interval has been computed from a sample, the parameter either lies in it or does not
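The distinction can be sketched by simulation (the population parameters, sample size, and 95% level below are illustrative assumptions): each computed interval either covers μ or not, but across many repetitions the procedure covers μ about 95% of the time.

```python
import random
import statistics

random.seed(1)
mu, sigma, n = 10.0, 2.0, 50   # assumed population and sample size
z = 1.96                       # standard normal 97.5% quantile -> 95% CI

def interval_covers_mu():
    # Draw one sample, build the known-sigma CI: xbar +/- z * sigma / sqrt(n)
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(xs)
    half = z * sigma / n ** 0.5
    return xbar - half <= mu <= xbar + half

# "95% confidence" is a property of the procedure, not of one interval:
coverage = sum(interval_covers_mu() for _ in range(2000)) / 2000
```

Running this, coverage comes out near 0.95, while any single interval has no 95% "probability" of containing μ after it is observed.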
Linear combination of several unbiased estimators • If d1, d2, d3, …, dn are independent unbiased estimators of θ • then any weighted average d = λ1d1 + λ2d2 + λ3d3 + … + λndn with λ1 + λ2 + … + λn = 1 is also an unbiased estimator of θ • The mean square error of any estimator equals its variance plus the square of its bias • r(d, θ) = E[(d(X) − θ)²] = E[(d − E[d])²] + (E[d] − θ)²
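Both claims above can be checked numerically (the parameter θ = 5.0, the weights, and the sample sizes are illustrative assumptions): a convex combination of independent unbiased estimators stays unbiased, and the sample MSE splits exactly into variance plus squared bias.

```python
import random
import statistics

random.seed(2)
theta = 5.0  # assumed true parameter (a population mean)

def combined_estimate(weights, n_each=20):
    # d1..dk: independent sample means, each unbiased for theta;
    # d = sum(lambda_i * d_i) with sum(lambda_i) == 1 is again unbiased.
    ds = [statistics.fmean(random.gauss(theta, 1.0) for _ in range(n_each))
          for _ in weights]
    return sum(w * d for w, d in zip(weights, ds))

weights = [0.5, 0.3, 0.2]          # lambdas summing to 1
draws = [combined_estimate(weights) for _ in range(5000)]

mean_d = statistics.fmean(draws)                            # ~ theta
mse = statistics.fmean((d - theta) ** 2 for d in draws)     # r(d, theta)
var = statistics.pvariance(draws)                           # E[(d - E[d])^2]
bias_sq = (mean_d - theta) ** 2                             # (E[d] - theta)^2
# Decomposition holds exactly for these sample moments: mse == var + bias_sq
```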
The value of additional information • The Bayes estimator • The observed sample revises the prior distribution of θ into the posterior distribution • The posterior distribution of θ has smaller variance than the prior: the data add information • Ref. pp. 274-275
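A standard conjugate case sketches the update above (the prior, the known variance, and the five observations are illustrative assumptions): with a normal prior on θ and normal data of known variance, the posterior is normal with a strictly smaller variance.

```python
import statistics

# Assumed prior: theta ~ N(mu0, tau0_sq); data x_i ~ N(theta, sigma_sq)
mu0, tau0_sq = 0.0, 4.0
sigma_sq = 1.0
xs = [2.1, 1.8, 2.4, 2.0, 1.9]  # hypothetical observations
n = len(xs)
xbar = statistics.fmean(xs)

# Conjugate normal-normal update: precisions (1/variance) add.
post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
post_mean = post_var * (mu0 / tau0_sq + n * xbar / sigma_sq)
# post_var < tau0_sq: the sample sharpens the distribution of theta,
# and post_mean (the Bayes estimator under squared loss) lies between
# the prior mean mu0 and the sample mean xbar.
```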