Econ 240C Lecture 15
Part I. Conditional Heteroskedasticity • An Example
Producer Price Index for Finished Goods • April 1947-April 2003 • 1982=100 • Seasonally adjusted rate (SAR)
Transformations • PPI is evolutionary • Take logarithms • Then difference • Obtain the fractional changes, i.e. the inflation rate for producer goods
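The log-difference transformation above can be sketched as follows; the PPI values here are hypothetical illustrative numbers, not the actual series:

```python
import numpy as np

# Hypothetical monthly PPI levels (illustrative values, not the actual data)
ppi = np.array([100.0, 100.4, 100.9, 101.1, 100.8, 101.5])

# dlnppi: first difference of the natural log, i.e. the fractional
# change in the index -- the inflation rate for producer goods
dlnppi = np.diff(np.log(ppi))
```

Differencing the log gives approximately the percentage change, since ln(x[t]) - ln(x[t-1]) = ln(x[t]/x[t-1]) ≈ (x[t] - x[t-1])/x[t-1] for small changes.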
Modeling dlnppi • Try an AR(2)
Modeling dlnppi • Try an ARMA(1,1)
ARMA(1,1) Model of Producer Goods Inflation • Residuals from ARMA(1, 1) model are orthogonal but not normal • Are we done?
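To see what an ARMA(1,1) process looks like, here is a minimal simulation sketch; the coefficients phi and theta are hypothetical, not the estimates from the dlnppi regression:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
phi, theta = 0.6, 0.3   # hypothetical AR and MA coefficients (|phi| < 1)

eps = rng.standard_normal(T)   # white-noise shocks
y = np.zeros(T)
for t in range(1, T):
    # y(t) = phi*y(t-1) + eps(t) + theta*eps(t-1)
    y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]
```

Fitting such a model to dlnppi in EViews leaves residuals that are orthogonal but, as the lecture notes, not normal, which motivates Part II.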
Part II. Examine Residuals • Trace of residuals • Trace of square of residuals
Episodic variance • Not homoskedastic • So we call it heteroskedastic, conditional on the dates or episodes when the variance kicks up • Hence the name “conditional heteroskedasticity”
Clues • Check trace of residuals squared • Check correlogram of residuals squared
Clues • Check the trace of the squared residuals • residuals can be obtained from the Actual, Fitted, Residuals table • Check the correlogram of the squared residuals • an EVIEWS option, alongside the correlogram of the residuals • Heteroskedasticity of residuals • Histogram of residuals • kurtotic residuals are a clue
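The correlogram check above can be sketched by hand; this computes sample autocorrelations of the squared residuals, using simulated stand-in residuals rather than the actual ARMA(1,1) output:

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelations of x at lags 1..nlags (demeaned)."""
    x = np.asarray(x, dtype=float)
    xd = x - x.mean()
    denom = np.sum(xd ** 2)
    return np.array([np.sum(xd[k:] * xd[:-k]) / denom
                     for k in range(1, nlags + 1)])

rng = np.random.default_rng(0)
resid = rng.standard_normal(500)      # stand-in for the ARMA residuals
rho_sq = acf(resid ** 2, nlags=10)    # correlogram of squared residuals
```

For truly homoskedastic residuals these autocorrelations hover near zero; significant spikes in the squared-residual correlogram are the ARCH clue.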
Part III: Modeling Conditional Heteroskedasticity • Robert Engle: UCSD • Autoregressive error variance model
Modeling the error • Model the error e(t) as the product of two independent parts, white noise wn(t) and the square root of the conditional variance h(t): • e(t) = wn(t)*[h(t)]^(1/2), where wn(t) ~ N(0,1)
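This construction can be sketched as an ARCH(1) simulation; the parameters a0 and a1 are hypothetical values, chosen so that a1 < 1:

```python
import numpy as np

rng = np.random.default_rng(42)
T = 1000
a0, a1 = 0.2, 0.5               # hypothetical ARCH(1) parameters, a1 < 1

wn = rng.standard_normal(T)     # wn(t) ~ N(0, 1)
e = np.zeros(T)
h = np.zeros(T)
h[0] = a0 / (1 - a1)            # start at the unconditional variance
e[0] = wn[0] * np.sqrt(h[0])
for t in range(1, T):
    h[t] = a0 + a1 * e[t - 1] ** 2     # conditional variance h(t)
    e[t] = wn[t] * np.sqrt(h[t])       # e(t) = wn(t) * [h(t)]^(1/2)
```

Because wn(t) is drawn independently of h(t), which depends only on the past, the independence used in the expectation arguments below holds by construction.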
Modeling the error • Assume that wn(t) is independent of [h(t)]^(1/2) • Then the density f{wn(t)*[h(t)]^(1/2)} is the product of two densities, g and k: • f = g[wn(t)]*k{[h(t)]^(1/2)} • So expectations can be written as products of expectations • This is analogous to writing the probability P(A and B) as P(A)*P(B) when events A and B are independent
Modeling the error • We would like the error e(t) to have the usual properties: mean zero, orthogonality, and constant unconditional variance • E e(t) = E{[h(t)]^(1/2)*wn(t)} = E{[h(t)]^(1/2)}*E[wn(t)], the product of expectations, by independence • We may not know E{[h(t)]^(1/2)}, but we know E[wn(t)] = 0, so E e(t) = 0
Modeling the error, e(t) • In a similar fashion, • E[e(t)*e(t-1)] = E({[h(t)]^(1/2)*wn(t)}*{[h(t-1)]^(1/2)*wn(t-1)}) • = E{[h(t)]^(1/2)*[h(t-1)]^(1/2)}*E[wn(t)*wn(t-1)] • We may not know the first expectation, but we know the second is zero since white noise is orthogonal, so e(t) is also orthogonal, i.e. E[e(t)*e(t-1)] = 0
Modeling the error, e(t) • The unconditional variance of e(t) is E[e(t)]^2 = E{[wn(t)]^2*h(t)} • And once again by independence, E[e(t)]^2 = E[wn(t)]^2*E[h(t)] • The first expectation is 1, since white noise has variance one, and E h(t) = E{a0 + a1*[e(t-1)]^2} • So E[e(t)]^2 = a0 + a1*E[e(t-1)]^2 • And since E[e(t)]^2 = E[e(t-1)]^2 = Var e(t), • Var e(t) = a0/(1 - a1), a constant
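The fixed point of the recursion E[e(t)]^2 = a0 + a1*E[e(t-1)]^2 can be checked numerically; a0 and a1 are the same hypothetical values as before:

```python
# Iterate v = a0 + a1*v from an arbitrary start; with a1 < 1 it
# contracts geometrically to the fixed point a0 / (1 - a1)
a0, a1 = 0.2, 0.5
v = 1.0                     # arbitrary starting variance
for _ in range(200):
    v = a0 + a1 * v
# v is now a0 / (1 - a1) = 0.4, the constant unconditional variance
```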
Modeling the error, e(t) • The conditional mean is the expected value of e(t) given information at time t-1: • Et-1 e(t) = Et-1{wn(t)*[h(t)]^(1/2)} • And by independence, Et-1 e(t) = Et-1[wn(t)]*Et-1{[h(t)]^(1/2)} • Our best guess at time t-1 of the shock wn(t) next period is zero, so our best guess of the shock e(t) for next period is also zero
Modeling the error, e(t) • The conditional variance of e(t) is • Et-1[e(t)]^2 = Et-1{[wn(t)]^2*h(t)}, and by independence • Et-1[e(t)]^2 = Et-1[wn(t)]^2*Et-1 h(t), where the first conditional expectation is 1, and the second is Et-1 h(t) = Et-1{a0 + a1*[e(t-1)]^2} = a0 + a1*Et-1[e(t-1)]^2 = a0 + a1*[e(t-1)]^2 • So Et-1[e(t)]^2 = a0 + a1*[e(t-1)]^2 = h(t) • i.e. the conditional variance is h(t), and it depends on the squared error from the previous period: the autoregressive feature
Generalizations • Autoregressive conditional heteroskedasticity, or ARCH, can be extended with more lagged terms: • h(t) = a0 + a1*[e(t-1)]^2 + a2*[e(t-2)]^2 + … • The conditional variance h(t) can also be modeled by adding lagged conditional variance terms: • h(t) = a0 + a1*[e(t-1)]^2 + a2*[e(t-2)]^2 + … + b1*h(t-1) + … • This extension was suggested by Bollerslev and is called GARCH, generalized ARCH
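The GARCH extension can be sketched the same way as the ARCH(1) case; this simulates a GARCH(1,1) process with hypothetical parameters satisfying a1 + b1 < 1:

```python
import numpy as np

rng = np.random.default_rng(7)
T = 1000
a0, a1, b1 = 0.1, 0.2, 0.7   # hypothetical GARCH(1,1) parameters, a1 + b1 < 1

e = np.zeros(T)
h = np.zeros(T)
h[0] = a0 / (1 - a1 - b1)    # unconditional variance
e[0] = rng.standard_normal() * np.sqrt(h[0])
for t in range(1, T):
    # h(t) = a0 + a1*e(t-1)^2 + b1*h(t-1): the lagged-variance term b1*h(t-1)
    # is Bollerslev's generalization of ARCH
    h[t] = a0 + a1 * e[t - 1] ** 2 + b1 * h[t - 1]
    e[t] = rng.standard_normal() * np.sqrt(h[t])
```

The b1 term makes volatility persistent: a large shock raises h(t) not just for one period, as in ARCH(1), but decays out gradually at rate b1.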
From the correlogram of the squared residuals from the ARMA(1, 1) model for dlnppi, it looks like there may be AR and MA structure
Diagnostics • “Residuals”, estimate of e(t) • Actual, fitted and residuals • GENR resgar = resid • “Standardized residuals”, estimate of wn(t) = e(t)/[h(t)]^(1/2) • Correlogram of residuals • Histogram of residuals • Correlogram of residuals squared • Where do we find the estimate of h(t)? • PROCS menu in the equation window, make GARCH variance series
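The standardization step can be sketched directly; here e and h are simulated stand-ins for the GARCH residual series and the fitted variance series that EViews would produce:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-ins: e plays the role of the GARCH residuals, h the fitted
# conditional variance series (made GARCH variance series in EViews)
h = 0.5 + rng.random(800)                  # hypothetical positive variances
e = rng.standard_normal(800) * np.sqrt(h)  # residuals with that variance

# Standardized residuals: estimate of wn(t) = e(t)/[h(t)]^(1/2)
std_resid = e / np.sqrt(h)
```

If the GARCH model has captured the conditional heteroskedasticity, the standardized residuals should look like unit-variance white noise, with no remaining structure in the correlogram of their squares.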
Actual, fitted, residuals from GARCH Model