
5. Consistency

Learn about the importance of consistency in estimators, intuitive explanations, empirical consistency, correlation impact, and more. Dive into theorem proofs with practical insights on avoiding inconsistency.



  1. 5. Consistency We cannot always achieve unbiasedness of estimators. -For example, σ̂ is not an unbiased estimator of σ -It is only consistent -Where unbiasedness cannot be achieved, consistency is the minimum requirement for an estimator -Consistency of OLS requires assumptions MLR.1 through MLR.4; in particular, no correlation between the error u and any of the x's

  2. 5. Intuitive Consistency While the actual proof of consistency is complicated, it can be explained intuitively -Each sample of n observations produces a β̂j with a given sampling distribution -MLR.1 through MLR.4 make this β̂j unbiased, with mean βj -If the estimator is consistent, the distribution becomes more tightly concentrated around βj as n increases -As n tends to infinity, β̂j's distribution collapses to the single point βj
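The "collapsing distribution" idea above can be seen in a minimal simulation sketch (not from the slides; the true parameter values and sample sizes here are made up for illustration): for each sample size n we draw many samples where MLR.1-MLR.4 hold, estimate the slope by OLS, and watch the spread of the estimates shrink around the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 2.0  # hypothetical true parameters

def ols_slope(x, y):
    """OLS slope estimate: sample Cov(x, y) / sample Var(x)."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

spreads = {}
for n in (25, 400, 6400):
    estimates = []
    for _ in range(500):
        x = rng.normal(size=n)
        u = rng.normal(size=n)        # error uncorrelated with x: MLR.4 holds
        y = beta0 + beta1 * x + u
        estimates.append(ols_slope(x, y))
    spreads[n] = np.std(estimates)

# The standard deviation of the beta1-hat estimates falls as n grows:
# the sampling distribution collapses toward the true beta1.
```

Each 16-fold increase in n cuts the spread roughly by a factor of 4, consistent with the usual 1/√n rate.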

  3. 5. Empirical Consistency In general, if obtaining more data DOES NOT get us closer to our parameter of interest, we are using a poor (inconsistent) estimator. -Fortunately, the same assumptions imply both unbiasedness and consistency:

  4. Theorem 5.1 (Consistency of OLS) Under assumptions MLR.1 through MLR.4, the OLS estimator β̂j is consistent for βj, for all j = 0, 1, …, k.

  5. Theorem 5.1 Notes While a general proof of this theorem requires matrix algebra, the single-independent-variable case can be proved from our β̂1 estimator: β̂1 = β1 + [n⁻¹ Σi (xi1 − x̄1) ui] / [n⁻¹ Σi (xi1 − x̄1)²] which uses the fact that yi = β0 + β1xi1 + ui and previously seen algebraic properties
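The decomposition above is an exact algebraic identity in any sample, not just an approximation, which a quick numerical check illustrates (a sketch with made-up parameter values, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 1.0, 2.0  # hypothetical true parameters
n = 1000
x = rng.normal(size=n)
u = rng.normal(size=n)
y = beta0 + beta1 * x + u

# Direct OLS slope: sample Cov(x, y) / sample Var(x)
b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Decomposition from the slide: beta1 + sample Cov(x, u) / sample Var(x)
b1_decomp = beta1 + np.cov(x, u, bias=True)[0, 1] / np.var(x)

# The two agree up to floating-point error, since
# Cov(x, y) = beta1 * Var(x) + Cov(x, u).
```

The law of large numbers then drives the sample Cov(x, u) term to its population value, which is where the next slide picks up.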

  6. Theorem 5.1 Notes By the law of large numbers, the numerator and denominator converge in probability to the population quantities Cov(x1, u) and Var(x1) -Since Var(x1) ≠ 0 (MLR.3), we can use properties of probability limits (Appendix C) to conclude: plim β̂1 = β1 + Cov(x1, u)/Var(x1) = β1 -Note that MLR.4, which implies x1 and u are uncorrelated, is essential to the above -Technically, Var(x1) and Var(u) must also be finite

  7. 5. Correlation and Inconsistency -If MLR.4 fails, consistency fails -That is, correlation between u and ANY x generally causes all OLS estimators to be inconsistent -"If the error is correlated with any of the independent variables, then OLS is biased and inconsistent" -In the simple regression case, the INCONSISTENCY in β̂1 (also called its ASYMPTOTIC BIAS) is: plim β̂1 − β1 = Cov(x1, u)/Var(x1)

  8. 5. Correlation and Inconsistency -Since the variance is always positive, the sign of the inconsistency matches the sign of Cov(x1, u) -If the covariance is small relative to the variance, the inconsistency is negligible -However, we cannot estimate this covariance, since u is unobserved

  9. 5. Correlation and Inconsistency Consider the following true model: y = β0 + β1x1 + β2x2 + v which satisfies MLR.1 through MLR.4 (v has zero mean and is uncorrelated with x1 and x2) -By Theorem 5.1, our OLS estimators β̂j are consistent -If we omit x2 and run an OLS regression of y on x1 alone, then the composite error is u = β2x2 + v and plim β̃1 = β1 + β2δ1, where δ1 = Cov(x1, x2)/Var(x1)
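The omitted-variable plim above can be checked numerically. In this sketch (not from the slides; β1 = 2, β2 = 3, and δ1 = 0.5 are made-up values) a very large sample stands in for the probability limit:

```python
import numpy as np

rng = np.random.default_rng(2)
beta0, beta1, beta2 = 1.0, 2.0, 3.0  # hypothetical true parameters
n = 200_000                           # large n so sample moments approximate plims

x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)    # x2 correlated with x1
v = rng.normal(size=n)
y = beta0 + beta1 * x1 + beta2 * x2 + v

# Short regression omitting x2; its composite error is u = beta2*x2 + v
b1_tilde = np.cov(x1, y, bias=True)[0, 1] / np.var(x1)

# Predicted probability limit: beta1 + beta2 * delta1,
# where delta1 = Cov(x1, x2) / Var(x1) (about 0.5 by construction)
delta1 = np.cov(x1, x2, bias=True)[0, 1] / np.var(x1)
plim_pred = beta1 + beta2 * delta1
```

Here β̃1 lands near 2 + 3·0.5 = 3.5, far from the true β1 = 2, exactly as the formula predicts.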

  10. 5. Correlation and Inconsistency Practically, inconsistency can be viewed much like bias -Inconsistency involves the population covariance and variance -Bias involves their sample counterparts -If x1 and x2 are uncorrelated, then δ1 = 0 and β̃1 is consistent (though not necessarily unbiased)

  11. 5. Inconsistency -The direction of the inconsistency can be determined using the same table as for the direction of bias:

  12. 5. Inconsistency Notes If OLS is inconsistent, adding observations does not fix it -In fact, with more data the estimator only converges more tightly to the wrong value -In the k-regressor case, correlation between one x variable and u generally makes ALL coefficient estimators inconsistent -The one exception is when xj is correlated with u but ALL other variables are uncorrelated with both xj and u -In that case only β̂j is inconsistent
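The first point above can be illustrated with a simulation sketch (made-up parameter values, not from the slides): when x is correlated with u, the gap between β̂1 and the true β1 does not shrink as n grows, but instead settles at the asymptotic bias Cov(x, u)/Var(x).

```python
import numpy as np

rng = np.random.default_rng(3)
beta1 = 2.0  # hypothetical true slope

def slope_gap(n):
    """|beta1-hat - beta1| when MLR.4 fails, at sample size n."""
    z = rng.normal(size=n)
    x = z + rng.normal(size=n)          # Var(x) = 2
    u = 0.8 * z + rng.normal(size=n)    # Cov(x, u) = 0.8: MLR.4 fails
    y = beta1 * x + u
    b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    return abs(b1_hat - beta1)

gaps = {n: slope_gap(n) for n in (100, 10_000, 1_000_000)}

# The gap settles near Cov(x, u)/Var(x) = 0.8/2 = 0.4 rather than
# shrinking toward 0: more observations do not rescue an
# inconsistent estimator.
```

Contrast this with the first simulation, where MLR.4 held and the estimates did collapse onto the true value.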
