
PS 233


Presentation Transcript


    1. PS 233 Intermediate Statistical Methods, Lecture 16: Correcting for Autocorrelation

    2. The Problem of Autocorrelation. Autocorrelation is one of two possible violations of our assumption E(ee′) = σ²I_n. Specifically, it is a violation of the assumption E(e_t e_{t-1}) = 0. Coefficients are unbiased, but standard errors and t-tests are wrong. Generally, standard errors are TOO SMALL.

    3. Patterns of Autocorrelation. Autocorrelation can be across one term: e_t = ρe_{t-1} + u_t. Or autocorrelation can be a more complex function, such as e_t = ρ_1 e_{t-1} + ρ_2 e_{t-2} + u_t. As it turns out, the AR(1) process is a VERY robust correction for temporal autocorrelation problems.
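
A minimal Stata sketch of simulating an AR(1) error process with ρ = 0.7 (illustrative only; the variable names t, u, and e are placeholders, not from the lecture):
    clear
    set obs 100
    generate t = _n
    tsset t
    generate u = rnormal()
    generate e = u
    replace e = 0.7*L.e + u in 2/100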

    4. Patterns of Autocorrelation. AR(1) is robust because ρ_2 represents the impact of e_{t-2} controlling for the impact of e_{t-1}. Most of the correlation from previous errors will be transmitted through the impact of e_{t-1}. One exception to this is seasonal or quarterly autocorrelation.

    5. Patterns of Autocorrelation. A second exception is spatial autocorrelation. It is difficult to know what pattern to adjust for with spatial autocorrelation. For most time-series problems, an AR(1) correction will be sufficient; at the least, it captures most of the temporal dependence.

    6. Diagnosing Autocorrelation. Since the coefficients are unbiased, we can use the observed residuals to estimate and diagnose autocorrelation. One strategy is to estimate the dependence directly by regressing the residuals on their lagged values.
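
A minimal Stata sketch of this residual regression (y, x1, x2, and ehat are placeholder names, not from the lecture; it assumes the data have already been declared time series with tsset):
    regress y x1 x2
    predict ehat, residuals
    regress ehat L.ehat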

    7. Diagnosing Autocorrelation. This is flexible because we can specify any set of lags we want. A common statistic for testing for AR(1) autocorrelation is the Durbin-Watson statistic. Durbin-Watson is the ratio of the squared distances between successive errors to their overall variance.

    8. The Durbin-Watson Statistic. DW = Σ(e_t - e_{t-1})² / Σe_t², with the numerator summed over t = 2, ..., T and the denominator over all t. This works out to approximately 2(1 - r), where r is the sample correlation of e_t and e_{t-1}.

    9. The Durbin-Watson Statistic. Thus DW is equal to 2 minus two times the correlation of e_t and e_{t-1}. Durbin-Watson is used both as a diagnostic for autocorrelation and as an estimate of ρ. Note that the DW statistic is a function of the estimated residuals and thus depends on the values of the independent variables.

    10. The Durbin-Watson Statistic. But it turns out that the DW statistic does have a known distribution. The DW distribution has upper and lower bounds based solely on the sample size, the number of parameters, and ρ. DW varies from 0 to 4; if ρ = 0, then the DW statistic equals 2.

    11. The Durbin-Watson Statistic. Durbin-Watson is symmetrically distributed around 2. Values greater than 2 indicate negative autocorrelation; values less than 2 indicate positive autocorrelation. Stata will calculate this value for you.
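
In Stata the statistic comes from a post-estimation command once the data are tsset (y, x1, and x2 are again placeholder names):
    regress y x1 x2
    estat dwatson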

    12. GLS and Correcting Autocorrelation. If we do have autocorrelated errors, how do we solve the problem? The basic approach is the same as for heteroskedasticity: weight the data by some matrix F such that F′F = Ω⁻¹, where E(ee′) = σ²Ω. But what is the proper weight?
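
For reference, a standard result not displayed on the slide: with AR(1) errors e_t = ρe_{t-1} + u_t, the error covariance matrix Ω has elements Cov(e_t, e_s) = σ_u² ρ^|t-s| / (1 - ρ²), so the correlation between errors decays geometrically with their distance in time, and F must be chosen to undo exactly this pattern.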

    13. GLS and Defining F. We begin with our equation: y_t = X_t β + e_t. Now recall that e_t = ρe_{t-1} + u_t. Consequently: y_t - ρy_{t-1} = (X_t - ρX_{t-1})β + u_t, where u_t is well behaved (not autocorrelated).

    14. GLS and Defining F. Thus if we define the transformed data y_t* = y_t - ρy_{t-1} and X_t* = X_t - ρX_{t-1} for t = 2, ..., T, then y_t* = X_t*β + u_t satisfies the classical assumptions, and our GLS estimator is b_GLS = (X*′X*)⁻¹X*′y*.

    15. Completing the GLS Model. The only problem with this system is that we lose the first observation in the time series. That is not a big problem if N is large, but the first observation can be recovered. There IS no previous value, so AR(1) autocorrelation is not an issue for it.

    16. Completing GLS: Recovering the First Observation. The problem is that the data have been transformed, and so the variance of the errors has been transformed as well. We can't simply use the untransformed X_1 and y_1 observations because this would create heteroskedasticity. Thus we need an appropriate weight for the first case so that E(ee′) = σ²I.

    17. Completing GLS: Recovering the First Observation. Recall that e_t = ρe_{t-1} + u_t. Therefore Var(e_t) = ρ²Var(e_{t-1}) + Var(u_t). And since u_t is not autocorrelated (and Var(e_t) = Var(e_{t-1}) under stationarity), Var(e_t) = σ_u² / (1 - ρ²).

    18. Recovering the First Observation. Thus in correcting for ρ we have altered the variance of the errors by the factor 1 - ρ². Consistent with our previous correction for heteroskedasticity, we weight the first observation by √(1 - ρ²), so y_1* = √(1 - ρ²)y_1 and X_1* = √(1 - ρ²)X_1. And GLS is then OLS applied to the full set of transformed observations.

    19. The Transformed Data for GLS: The Prais-Winsten Method. The transformed data are y_1* = √(1 - ρ²)y_1 and X_1* = √(1 - ρ²)X_1 for the first observation, and y_t* = y_t - ρy_{t-1} and X_t* = X_t - ρX_{t-1} for t = 2, ..., T, where ρ is replaced by an estimate from the residuals (for example, via the Durbin-Watson statistic).
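
In Stata the Prais-Winsten estimator is available directly (a sketch; y, x1, and x2 are placeholder names and the data must be tsset):
    prais y x1 x2
Adding the corc option (prais y x1 x2, corc) gives the Cochrane-Orcutt variant, which drops the first observation instead of reweighting it.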

    20. Autocorrelation: An Example. Let's look at presidential approval. We have quarterly data from 1949 to 1985. Create a variable that counts the time units of observation: gen time = (year - 1949)*4 + quarter. This yields a count variable running from 1 to 148. Then tell Stata you have time-series data with the command tsset time, quarterly (for quarterly data).
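
A sketch of the commands behind the output on the following slides (the actual regressors in the approval model are not shown in the transcript, so approval, x1, and x2 are placeholders):
    regress approval x1 x2
    estat dwatson
    prais approval x1 x2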

    21.-25. Autocorrelation: An Example (slide content not reproduced in the transcript).
