5. Consistency
We cannot always achieve unbiasedness of estimators.
-For example, σhat is not an unbiased estimator of σ; it is only consistent (illustrated in the sketch below)
-Where unbiasedness cannot be achieved, consistency is the minimum requirement for an estimator
-Consistency requires MLR.1 through MLR.4; in fact, a weaker condition suffices: u has zero mean and is uncorrelated with each of the x's
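A minimal NumPy sketch (not from the original slides; the true σ = 2 and the one-regressor setup are illustrative assumptions) showing that σhat = sqrt(SSR/(n-2)) is biased downward in small samples yet converges to σ as n grows:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 2.0  # true error standard deviation (illustrative)

    def mean_sigma_hat(n, reps=5000):
        # Average sigma-hat = sqrt(SSR / (n - 2)) across many simulated samples
        vals = np.empty(reps)
        for r in range(reps):
            x = rng.normal(size=n)
            y = 1.0 + 0.5 * x + rng.normal(scale=sigma, size=n)
            b1 = np.cov(x, y)[0, 1] / np.var(x, ddof=1)  # OLS slope
            b0 = y.mean() - b1 * x.mean()                # OLS intercept
            ssr = np.sum((y - b0 - b1 * x) ** 2)
            vals[r] = np.sqrt(ssr / (n - 2))
        return vals.mean()

    for n in (5, 10, 50, 500):
        print(n, round(mean_sigma_hat(n), 4))
    # Mean of sigma-hat is below 2 for small n (biased) but approaches 2 (consistent)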
5. Intuitive Consistency
While the actual proof of consistency is complicated, it can be explained intuitively:
-Each sample of n observations produces a Bjhat with a given distribution
-MLR.1 through MLR.4 cause this Bjhat to be unbiased with mean Bj
-If the estimator is consistent, the distribution becomes more tightly centered around Bj as n increases
-As n tends to infinity, Bjhat's distribution collapses to the single point Bj (the sketch below illustrates this collapse)
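A minimal NumPy sketch (illustrative; the true slope B1 = 0.5 is an assumption) of the collapsing sampling distribution: the mean of B1hat stays at B1 while its spread shrinks roughly like 1/sqrt(n):

    import numpy as np

    rng = np.random.default_rng(1)
    beta1 = 0.5  # true slope (illustrative)

    def slope_draws(n, reps=5000):
        # Sampling distribution of the OLS slope estimator for sample size n
        b1 = np.empty(reps)
        for r in range(reps):
            x = rng.normal(size=n)
            u = rng.normal(size=n)  # MLR.4 holds: u independent of x
            y = 1.0 + beta1 * x + u
            b1[r] = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
        return b1

    for n in (25, 100, 400, 1600):
        d = slope_draws(n)
        print(n, round(d.mean(), 3), round(d.std(), 3))
    # Mean stays near 0.5 (unbiased); standard deviation shrinks as n grows (consistent)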
5. Empirical Consistency
In general, if obtaining more data does NOT get us closer to our parameter of interest, we are using a poor (inconsistent) estimator.
-Fortunately, the same assumptions imply both unbiasedness and consistency:
Theorem 5.1 (Consistency of OLS)
Under assumptions MLR.1 through MLR.4, the OLS estimator Bjhat is consistent for Bj for all j = 0, 1, ..., k.
Theorem 5.1 Notes
While a general proof of this theorem requires matrix algebra, the single-independent-variable case can be proved from our B1hat estimator:

  B1hat = B1 + [(1/n) * sum_i (x_i1 - x1bar) * u_i] / [(1/n) * sum_i (x_i1 - x1bar)^2]

which uses the fact that y_i = B0 + B1*x_i1 + u_i and previously seen algebraic properties.
Theorem 5.1 Notes
Using the law of large numbers, the numerator and denominator above converge in probability to the population quantities Cov(x1,u) and Var(x1)
-Since Var(x1) ≠ 0 (MLR.3), we can use probability limits (Appendix C) to conclude:

  plim B1hat = B1 + Cov(x1,u) / Var(x1) = B1    (since Cov(x1,u) = 0)

-Note that MLR.4, which ensures x1 and u aren't correlated, is essential to the above
-Technically, Var(x1) and Var(u) must also be finite (a numerical check of this convergence follows below)
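A numerical check of this argument (illustrative NumPy sketch; standard-normal x1 and u are assumed, so Cov(x1,u) = 0 and Var(x1) = 1): the sample covariance and variance converge to their population counterparts, driving B1hat to B1:

    import numpy as np

    rng = np.random.default_rng(2)
    beta1 = 0.5  # true slope (illustrative)

    for n in (100, 10_000, 1_000_000):
        x = rng.normal(size=n)
        u = rng.normal(size=n)      # population Cov(x1,u) = 0 under MLR.4
        y = 1.0 + beta1 * x + u
        num = np.cov(x, u)[0, 1]    # -> Cov(x1,u) = 0 by the law of large numbers
        den = np.var(x, ddof=1)     # -> Var(x1) = 1 by the law of large numbers
        print(n, round(beta1 + num / den, 5))  # B1hat via the decomposition; tends to 0.5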
5. Correlation and Inconsistency
-If MLR.4 fails, consistency fails
-That is, correlation between u and ANY x generally causes all OLS estimators to be inconsistent
-"If the error is correlated with any of the independent variables, then OLS is biased and inconsistent"
-In the simple regression case, the INCONSISTENCY in B1hat (or ASYMPTOTIC BIAS) is:

  plim B1hat - B1 = Cov(x1,u) / Var(x1)
5. Correlation and Inconsistency
-Since Var(x1) is always positive, the sign of the inconsistency depends on the sign of Cov(x1,u)
-If the covariance is small relative to the variance, the inconsistency can be negligible
-However, we cannot estimate this covariance, because u is unobserved (see the sketch below)
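A minimal sketch (illustrative values: B1 = 0.5 and Cov(x1,u) = 0.4) showing that when MLR.4 fails, B1hat settles at B1 + Cov(x1,u)/Var(x1) rather than at B1, no matter how large n gets:

    import numpy as np

    rng = np.random.default_rng(3)
    beta1, rho = 0.5, 0.4

    for n in (100, 10_000, 1_000_000):
        x = rng.normal(size=n)              # Var(x1) = 1
        u = rho * x + rng.normal(size=n)    # MLR.4 fails: Cov(x1,u) = rho = 0.4
        y = 1.0 + beta1 * x + u
        b1hat = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
        print(n, round(b1hat, 4))
    # b1hat converges to 0.5 + 0.4/1 = 0.9, not to the true 0.5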
5. Correlation and Inconsistency
Consider the following true model:

  y = B0 + B1*x1 + B2*x2 + v

where we satisfy MLR.1 through MLR.4 (v has a zero mean and is uncorrelated with x1 and x2)
-By Theorem 5.1 our OLS estimators (Bjhat) are consistent
-If we omit x2 and regress y on x1 alone, then u = B2*x2 + v and

  plim B1tilde = B1 + B2*delta1,    where delta1 = Cov(x1,x2) / Var(x1)
5. Correlation and Inconsistency
Practically, inconsistency can be viewed the same way as bias:
-Inconsistency deals with the population covariance and variance
-Bias deals with the sample covariance and variance
-If x1 and x2 are uncorrelated, then delta1 = 0 and B1tilde is consistent (but not necessarily unbiased); otherwise B1tilde converges to the wrong value, as the sketch below shows
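A minimal sketch of the omitted variable case (illustrative values: B1 = 0.5, B2 = 1.0, delta1 = 0.3): the short regression's B1tilde converges to B1 + B2*delta1, not to B1:

    import numpy as np

    rng = np.random.default_rng(4)
    beta1, beta2, delta1 = 0.5, 1.0, 0.3    # delta1 = Cov(x1,x2) / Var(x1)

    for n in (100, 10_000, 1_000_000):
        x1 = rng.normal(size=n)
        x2 = delta1 * x1 + rng.normal(size=n)  # x2 correlated with x1
        v = rng.normal(size=n)                 # true model satisfies MLR.1-MLR.4
        y = 1.0 + beta1 * x1 + beta2 * x2 + v
        b1tilde = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)  # x2 omitted
        print(n, round(b1tilde, 4))
    # b1tilde converges to 0.5 + 1.0 * 0.3 = 0.8, not to the true 0.5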
5. Inconsistency
-The direction of the inconsistency can be read off the same table used for omitted variable bias, interpreting "bias" as asymptotic bias and using the population (rather than sample) correlation:

                   Corr(x1,x2) > 0    Corr(x1,x2) < 0
  B2 > 0           positive bias      negative bias
  B2 < 0           negative bias      positive bias
5. Inconsistency Notes
If OLS is inconsistent, adding observations does not fix it
-In fact, increasing the sample size makes the problem worse: the estimator just gets closer to the wrong value
-In the k-regressor case, correlation between one x variable and u generally makes ALL coefficient estimators inconsistent
-The one exception is when xj is correlated with u but ALL other variables are uncorrelated with both xj and u
-Here only Bjhat is inconsistent (the sketch below illustrates this)
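A minimal sketch of this exception (illustrative setup: x3 alone is correlated with u, while x1 and x2 are independent of both x3 and u): only the coefficient on x3 comes out wrong, even with a very large sample:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 1_000_000
    x1, x2, x3 = rng.normal(size=(3, n))   # x1, x2 uncorrelated with x3
    u = 0.4 * x3 + rng.normal(size=n)      # u correlated with x3 only
    y = 1.0 + 0.5 * x1 - 0.5 * x2 + 0.5 * x3 + u

    X = np.column_stack([np.ones(n), x1, x2, x3])
    print(np.linalg.lstsq(X, y, rcond=None)[0].round(3))
    # approx [1.0, 0.5, -0.5, 0.9]: only the x3 coefficient is inconsistent (0.9 vs 0.5)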