Econometrics – Chapter 12: Autocorrelation
Autocorrelation – Chapter 12
• OLS is the best econometric tool if all Gauss-Markov assumptions are satisfied
• If they are not, there are better estimation techniques
• Autocorrelation (Correlated Errors)
• Heteroskedasticity (Non-Constant Variances)
• Endogenous RHS Variables
Autocorrelation – continued: Consequences
• Consequences of data that exhibit Autocorrelation:
• OLS estimates (β̂) of the True Values of the β parameters remain unbiased
• OLS Estimates of the β parameters are consistent
• OLS Estimates of the β parameters become inefficient: other unbiased estimators exist that have lower variances
• Standard Errors of the OLS estimates become Biased and Inconsistent (illustrated in the simulation sketch below)
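A minimal Monte Carlo sketch of these consequences, assuming a simple model Y_t = β0 + β1·X_t + ε_t with AR(1) errors and a persistent regressor; the parameter values and data-generating choices here are illustrative assumptions, not from the chapter:

```python
import numpy as np
import statsmodels.api as sm

# Monte Carlo sketch (illustrative values): persistent regressor + AR(1) errors.
rng = np.random.default_rng(0)
T, rho, beta1, reps = 200, 0.8, 2.0, 500
slopes, reported_se = [], []

for _ in range(reps):
    x = np.zeros(T)
    eps = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.8 * x[t - 1] + rng.normal()      # serially correlated regressor
        eps[t] = rho * eps[t - 1] + rng.normal()  # AR(1) error: eps_t = rho*eps_{t-1} + mu_t
    y = 1.0 + beta1 * x + eps
    res = sm.OLS(y, sm.add_constant(x)).fit()
    slopes.append(res.params[1])
    reported_se.append(res.bse[1])

print(np.mean(slopes))       # close to 2.0: the OLS slope is unbiased
print(np.std(slopes))        # true sampling spread of the slope estimates
print(np.mean(reported_se))  # average OLS standard error: noticeably smaller than the true spread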
Autocorrelation – continued: Detecting Serial Correlation
• First-Order Autoregression of the Error Terms: AR(1)
• ε_t = ρ·ε_{t-1} + μ_t
• ε_t = error term in time t
• ρ = autocorrelation coefficient
• μ_t = a random variable that is not serially correlated
• Assume that −1 < ρ < +1
• If not, shocks do not dissipate over time (see the simulation sketch below)
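A minimal sketch of what a simulated AR(1) error process looks like; ρ = 0.8 and the sample size are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)   # seeded generator for reproducibility
T, rho = 200, 0.8                # rho inside (-1, 1), so shocks dissipate

mu = rng.normal(size=T)          # serially uncorrelated shocks mu_t
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + mu[t]   # AR(1): eps_t = rho*eps_{t-1} + mu_t

# With |rho| < 1 the errors wander but keep pulling back toward zero;
# replacing rho with 1 turns eps into a random walk whose shocks never die out.
print(eps[:5])
```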
Autocorrelation – continued: Detecting Serial Correlation
• Durbin-Watson Statistic [AR(1)]
• DW ≈ 2 − 2ρ, with 0 ≤ DW ≤ 4
• If ρ = 0, then DW ≈ 2 (No Autocorrelation)
• Look up the critical values of DW (the lower critical value LC and the upper critical value UC) in a DW Table
• DW above UC = No Autocorrelation; DW below LC = Positive Autocorrelation; DW between LC and UC = the test is inconclusive
• Autocorrelation occurs most often in Time Series Data
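A minimal sketch of computing the Durbin-Watson statistic from OLS residuals with statsmodels; the regression Y_t = 1 + 2·X_t + ε_t and the simulated data are purely illustrative:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Simulate a regression whose errors follow an AR(1) process with rho = 0.8.
rng = np.random.default_rng(1)
T, rho = 200, 0.8
x = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + eps

res = sm.OLS(y, sm.add_constant(x)).fit()
dw = durbin_watson(res.resid)    # DW ~ 2 - 2*rho_hat, bounded between 0 and 4
print(f"DW = {dw:.2f}")          # a value well below 2 points to positive autocorrelation
```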
Autocorrelation – continued: Correcting for Serial Correlation
• If AR(1) errors exist, we need a better estimation method than OLS
• Y_t = β0 + β1·X_t + ε_t (12.4.1)
• ε_t = ρ·ε_{t-1} + μ_t [AR(1)] (12.4.2)
• Substituting 12.4.2 into 12.4.1: Y_t = β0 + β1·X_t + ρ·ε_{t-1} + μ_t (12.4.3)
• Y_{t-1} = β0 + β1·X_{t-1} + ε_{t-1} (12.4.4)
• ρ·Y_{t-1} = ρ·β0 + ρ·β1·X_{t-1} + ρ·ε_{t-1} (12.4.5)
• Y_t = (1−ρ)·β0 + ρ·Y_{t-1} + β1·X_t − ρ·β1·X_{t-1} + μ_t (12.4.3 − 12.4.5)
• Y_t = γ0 + γ1·Y_{t-1} + γ2·X_t + γ3·X_{t-1} + μ_t (Cochrane-Orcutt Transformation)
Autocorrelation – continued: Correcting for Serial Correlation
• Estimates obtained after correcting for Autocorrelation are more reliable
• Corrected estimates have different interpretations
• Y_t = γ0 + γ1·Y_{t-1} + γ2·X_t + γ3·X_{t-1} + μ_t
• γ1 = ρ (the persistence of the shock)
• γ2 = β1
• γ3 = −ρ·β1 (implied by the transformation above)
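One way to carry out this correction in practice is iterative feasible GLS; statsmodels' GLSAR class alternates between estimating ρ from the residuals and re-estimating the βs on quasi-differenced data, in the spirit of the Cochrane-Orcutt procedure. A minimal sketch with simulated data (the model and parameter values are illustrative assumptions):

```python
import numpy as np
import statsmodels.api as sm

# Simulated regression with AR(1) errors (rho = 0.8 and beta1 = 2 are illustrative).
rng = np.random.default_rng(2)
T, rho = 200, 0.8
x = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + eps

# GLSAR with one AR lag; iterative_fit alternates between estimating rho from
# the residuals and re-estimating the betas on the quasi-differenced data.
model = sm.GLSAR(y, sm.add_constant(x), rho=1)   # rho=1 here means "AR order 1"
results = model.iterative_fit(maxiter=10)
print(model.rho)        # estimated autocorrelation coefficient
print(results.params)   # corrected estimates of beta0 and beta1
```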
Autocorrelation – continued: Correcting for Serial Correlation
• When ρ = 1 (Unit Roots)
• If ε_t = ρ·ε_{t-1} + μ_t [the AR(1) process] with ρ = 1
• Then ε_t follows a Random Walk, i.e. ε_t has a Unit Root
• The Error Terms fail to tend back to 0
• We cannot predict values of Y from X
• We correct this problem by "Estimating in first differences"
Autocorrelation – continued: Correcting for Serial Correlation
• Y_t = β0 + β1·X_t + ε_{t-1} + μ_t (12.5.1)
• Note: ε_t = ρ·ε_{t-1} + μ_t and ρ = 1
• Y_{t-1} = β0 + β1·X_{t-1} + ε_{t-1} (12.5.2)
• Subtracting 12.5.2 from 12.5.1: Y_t − Y_{t-1} = β1·(X_t − X_{t-1}) + μ_t (12.5.3)
• ∆Y_t = β1·∆X_t + μ_t (12.5.4)
• Estimating in First Differences solves the problem of Unit Roots (see the sketch below)
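A minimal sketch of estimating equation 12.5.4 in first differences, with simulated random-walk errors; β1 = 2 and the data-generating choices are illustrative assumptions:

```python
import numpy as np
import statsmodels.api as sm

# Simulate a regression whose errors are a random walk (rho = 1).
rng = np.random.default_rng(3)
T = 200
x = rng.normal(size=T)
eps = np.cumsum(rng.normal(size=T))   # cumulative sum of shocks = random-walk error
y = 1.0 + 2.0 * x + eps

# Estimate eq. 12.5.4 in first differences; beta0 drops out in the subtraction.
dy, dx = np.diff(y), np.diff(x)       # delta Y_t and delta X_t
diff_res = sm.OLS(dy, dx).fit()       # no constant, matching delta Y_t = beta1*delta X_t + mu_t
print(diff_res.params)                # should be close to the true beta1 = 2
```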
Autocorrelation – continued: More Complicated Autocorrelation Structures
• Second-order Autoregressive (AR) processes
• First-order Moving Average (MA) processes
• ARMA processes with Lags
• Time Series Data can involve these richer error structures, which also need correcting (one estimation sketch follows below)
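As one illustration, statsmodels can estimate a regression and an ARMA-type error structure jointly via its ARIMA class; the sketch below fits AR(2) plus MA(1) errors. The data and the (2, 0, 1) order are assumed for illustration, not taken from the chapter:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Placeholder data: in practice y and x would be the actual time series.
rng = np.random.default_rng(4)
T = 200
x = rng.normal(size=T)
y = 1.0 + 2.0 * x + rng.normal(size=T)

# order=(p, d, q): AR order 2, no differencing, MA order 1 for the error process.
model = ARIMA(y, exog=x, order=(2, 0, 1))
results = model.fit()
print(results.summary())   # reports the coefficient on x plus the AR and MA terms
```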
Autocorrelation – continued: Summary – Chapter 12
• When unobserved, persistent shocks affect the value of a Dependent Variable, regressions will tend to suffer from Serial Correlation
• If Serial Correlation is present, OLS Estimates will be Unbiased and Consistent, but Inefficient
• Standard Errors are Biased, causing parameters to appear more precisely measured than they are
Autocorrelation – continued: Summary – Chapter 12
• Parameters may appear to be more statistically significant than they are
• The Durbin-Watson Statistic can be used to detect first-order Autoregression in the Error Terms
• The Cochrane-Orcutt Transformation (or another method) can be used to produce estimates more efficient than OLS
• An Error Term is a Random Walk, or has a Unit Root, if ρ = 1