Ka-fu Wong, University of Hong Kong
Volatility Measurement, Modeling, and Forecasting
Importance of volatility
• Good volatility forecasts are crucial for the implementation and evaluation of asset and derivative pricing theories, as well as for trading and hedging strategies.
• Two assets:
  • a risky asset and a riskless asset (i.e., volatility = 0)
  • The risky asset generally has a higher expected return than the riskless asset.
  • We would like to invest in a portfolio consisting of the two assets.
  • When the risky asset has very high volatility, the portfolio will consist of the riskless asset only.
  • When the risky asset has very low volatility, the portfolio will hold more of the risky asset.
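As a hedged illustration of this trade-off (not part of the original slides), the textbook mean-variance allocation between one risky and one riskless asset puts a weight on the risky asset that shrinks toward zero as its variance grows; here μ is the risky asset's expected return, r_f the riskless rate, σ² the variance, and λ the investor's risk aversion:

```latex
w^{*} \;=\; \frac{\mu - r_f}{\lambda\,\sigma^{2}},
\qquad
w^{*} \to 0 \ \text{as}\ \sigma^{2} \to \infty .
```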
Importance of volatility
• The variance of inflation may have an impact on various macroeconomic and investment decisions.
• High variance in inflation may also imply a welfare loss.
• Previous studies have tried to measure the time-varying variance of inflation.
Clustering of volatility
• It is a well-established fact, dating back to Mandelbrot (1963) and Fama (1965), that financial returns display pronounced volatility clustering.
• Therefore, models of volatility should allow for such clustering.
Example: AR(1)
AR(1): y_t = φ y_{t-1} + e_t, with e_t ~ WN(0, σ²)
Repeated substitution:
y_t = φ(φ y_{t-2} + e_{t-1}) + e_t = φ² y_{t-2} + φ e_{t-1} + e_t
    = φ²(φ y_{t-3} + e_{t-2}) + φ e_{t-1} + e_t = φ³ y_{t-3} + φ² e_{t-2} + φ e_{t-1} + e_t
    …
    = e_t + φ e_{t-1} + φ² e_{t-2} + φ³ e_{t-3} + φ⁴ e_{t-4} + φ⁵ e_{t-5} + …
E(e_t) = 0, so E(y_t) = 0
Var(e_t) = E[(e_t − E(e_t))²] = σ²
Var(y_t) = E[(y_t − E(y_t))²] = σ²(1 + φ² + φ⁴ + φ⁶ + …)
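Summing the geometric series (assuming |φ| < 1) gives the closed form for the unconditional variance, a step left implicit on the slide:

```latex
\operatorname{Var}(y_t)
  = \sigma^{2}\sum_{i=0}^{\infty}\phi^{2i}
  = \frac{\sigma^{2}}{1-\phi^{2}},
\qquad |\phi| < 1 .
```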
Homoskedasticity vs. Heteroskedasticity
So far, the innovations are assumed to be i.i.d. It is possible to allow the variance to change across observations, i.e., heteroskedasticity: Var(e_t | Ω_{t-1}) = σ_t², where Ω_{t-1} is the information available at time t-1.
A general linear process
Consider a general linear process: y_t = b_0 e_t + b_1 e_{t-1} + b_2 e_{t-2} + …, with b_0 = 1.
The innovations e_t need not be i.i.d.
Two examples
Consider a general linear process y_t = b_0 e_t + b_1 e_{t-1} + b_2 e_{t-2} + …; the innovations e_t need not be i.i.d.
AR(1): y_t = φ y_{t-1} + e_t, so y_t = e_t + φ e_{t-1} + φ² e_{t-2} + φ³ e_{t-3} + φ⁴ e_{t-4} + …, i.e., b_i = φ^i.
MA(2): y_t = e_t + θ_1 e_{t-1} + θ_2 e_{t-2}, i.e., b_0 = 1, b_1 = θ_1, b_2 = θ_2, b_3 = b_4 = … = 0.
Unconditional means and variances
Consider a general linear process y_t = b_0 e_t + b_1 e_{t-1} + b_2 e_{t-2} + ….
AR(1): y_t = e_t + φ e_{t-1} + φ² e_{t-2} + φ³ e_{t-3} + φ⁴ e_{t-4} + … (b_i = φ^i)
  E(y_t) = E(e_t) + φ E(e_{t-1}) + φ² E(e_{t-2}) + … = 0
  V(y_t) = V(e_t) + φ² V(e_{t-1}) + φ⁴ V(e_{t-2}) + …
MA(2): y_t = e_t + θ_1 e_{t-1} + θ_2 e_{t-2} (b_0 = 1, b_1 = θ_1, b_2 = θ_2, b_3 = b_4 = … = 0)
  E(y_t) = E(e_t) + θ_1 E(e_{t-1}) + θ_2 E(e_{t-2}) = 0
  V(y_t) = V(e_t) + θ_1² V(e_{t-1}) + θ_2² V(e_{t-2})
Conditional variances change with the forecast horizon but are not time-varying for a given horizon
Consider a general linear process and the conditional information set Ω_{t-1}.
MA(2): y_t = e_t + θ_1 e_{t-1} + θ_2 e_{t-2} (b_0 = 1, b_1 = θ_1, b_2 = θ_2, b_3 = b_4 = … = 0)
The conditional mean is time-varying: E(y_t | Ω_{t-1}) = θ_1 e_{t-1} + θ_2 e_{t-2}.
The h-step-ahead forecast is time-varying: E(y_{t+1} | Ω_t) = θ_1 e_t + θ_2 e_{t-1}, and E(y_{t+2} | Ω_{t+1}) = θ_1 e_{t+1} + θ_2 e_t.
Conditional variances change with the forecast horizon but are not time-varying for a given horizon
Consider a general linear process and the conditional information set Ω_{t-1}.
MA(2): y_t = e_t + θ_1 e_{t-1} + θ_2 e_{t-2} (b_0 = 1, b_1 = θ_1, b_2 = θ_2, b_3 = b_4 = … = 0)
The conditional variance is not time-varying. The conditional prediction-error variance is
E[(y_t − E(y_t | Ω_{t-1}))² | Ω_{t-1}] = E(e_t² | Ω_{t-1}) = σ², which does not vary with t.
ARCH(p) process
ARCH(p): AutoRegressive Conditional Heteroskedasticity of order p. The conditional variance σ_t² = Var(e_t | Ω_{t-1}) depends on the lagged squared innovations.
Examples:
(1) ARCH(1): σ_t² = ω + γ_1 e_{t-1}²
(2) ARCH(2): σ_t² = ω + γ_1 e_{t-1}² + γ_2 e_{t-2}²
ARCH(p) process
Examples:
(1) ARCH(1): σ_t² = ω + γ_1 e_{t-1}²
(2) ARCH(2): σ_t² = ω + γ_1 e_{t-1}² + γ_2 e_{t-2}²
ARCH implies volatility clustering: large changes tend to be followed by large changes, and small by small, of either sign.
ARCH(p) process
Some properties:
(1) Unconditional mean
(2) Unconditional variance
(3) Conditional variance
Examples:
(1) ARCH(1): σ_t² = ω + γ_1 e_{t-1}²
(2) ARCH(2): σ_t² = ω + γ_1 e_{t-1}² + γ_2 e_{t-2}²
ARCH(1)
• σ_t² = ω + γ_1 e_{t-1}²
• Note that
  E[e_t²] = E[ E(e_t² | Ω_{t-1}) ] = E(σ_t²) = σ²
  E[(e_t − E(e_t))²] = ?
  E[σ_t²] = ω + γ_1 E[e_{t-1}²]
  σ² = ω + γ_1 σ²
  σ² = ω / (1 − γ_1)
How to simulate ARCH(1)?
• Suppose we want to generate T observations of e_t with the ARCH(1) property:
  e_t | Ω_{t-1} ~ N(0, σ_t²), where σ_t² = ω + γ_1 e_{t-1}².
(1) Fix the parameters. Compute the unconditional variance of e_t: σ² = ω / (1 − γ_1).
(2) Generate T+1 draws of standard normal random variables, v_0, v_1, …, v_T.
(3) Generate e_t recursively:
  For t = 0: σ_t² = σ², e_t = v_t σ_t
  For t = 1: σ_t² = ω + γ_1 e_{t-1}², and e_t = v_t σ_t
  For t = 2: σ_t² = ω + γ_1 e_{t-1}², and e_t = v_t σ_t
  …
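A minimal Python sketch of this recursion (not from the slides; the parameter values ω = 0.2 and γ₁ = 0.5 are illustrative assumptions):

```python
import numpy as np

def simulate_arch1(T, omega=0.2, gamma1=0.5, seed=0):
    """Simulate T observations of an ARCH(1) innovation series e_t,
    following the three steps described on the slide."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(T + 1)          # step (2): i.i.d. N(0,1) draws
    sigma2 = omega / (1.0 - gamma1)         # step (1): unconditional variance, used at t = 0
    e = np.empty(T + 1)
    for t in range(T + 1):
        if t > 0:
            sigma2 = omega + gamma1 * e[t - 1] ** 2  # step (3): conditional variance
        e[t] = v[t] * np.sqrt(sigma2)
    return e[1:]                            # drop the start-up observation

e = simulate_arch1(1000)
print(e.var(), 0.2 / (1 - 0.5))  # sample variance should be near omega / (1 - gamma1)
```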
The inflation example of Engle (1982)
Variables: the log of the quarterly manual wage rates; the first difference of the log of the quarterly consumer price index; lagged 4 periods.
Engle, Robert F. (1982): “Autoregressive Conditional Heteroscedasticity with Estimates of the Variance of United Kingdom Inflation,” Econometrica, 50(4): 987-1007.
The inflation example of Engle (1982): OLS regression
Restriction imposed.
The inflation example of Engle (1982): ML estimation with ARCH(1)
The ARCH model comes closer to truly random residuals after standardizing for their conditional distributions.
GARCH(p,q)
Backward substitution on σ_t² yields an infinite-order ARCH process with some restrictions on the coefficients. (Analogy: an ARMA(p,q) process can be written as an MA(∞) process.)
GARCH can be viewed as a parsimonious way to approximate a high-order ARCH process.
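For the GARCH(1,1) case, σ_t² = ω + γ e_{t-1}² + β σ_{t-1}², the backward substitution mentioned above works out as follows (a sketch, assuming |β| < 1):

```latex
\sigma_t^{2}
  = \omega + \gamma e_{t-1}^{2} + \beta \sigma_{t-1}^{2}
  = \omega + \gamma e_{t-1}^{2}
    + \beta\left(\omega + \gamma e_{t-2}^{2} + \beta \sigma_{t-2}^{2}\right)
  = \cdots
  = \frac{\omega}{1-\beta} + \gamma \sum_{i=0}^{\infty} \beta^{i} e_{t-1-i}^{2},
```

so GARCH(1,1) is an ARCH(∞) process whose coefficients decay geometrically.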
Important properties of GARCH(p,q)
(1) The unconditional variance is fixed, but the conditional variance is time-varying.
Important properties of GARCH(p,q)
(2) The unconditional distribution of conditionally Gaussian GARCH is symmetric and leptokurtic.
Real-world financial asset returns are often found to be symmetrically distributed with fatter tails than the Gaussian distribution. An ordinary Gaussian distribution does not provide a good approximation of asset returns, but a Gaussian distribution with GARCH does.
Important properties of GARCH(p,q)
(3) The conditional prediction-error variance varies with the conditioning information set: the conditional mean is an unbiased forecast, the conditional variance of the prediction error is time-varying, and as the forecast horizon grows the conditional variance approaches the unconditional variance.
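For GARCH(1,1), a sketch of this convergence (not shown on the slide): the h-step-ahead conditional variance forecast follows a simple recursion whose limit is the unconditional variance,

```latex
E\!\left[\sigma_{t+h}^{2}\mid \Omega_{t}\right]
  = \omega + (\gamma+\beta)\,E\!\left[\sigma_{t+h-1}^{2}\mid \Omega_{t}\right]
  \;\longrightarrow\; \frac{\omega}{1-\gamma-\beta}
  \quad \text{as } h \to \infty, \ \ \gamma+\beta<1 .
```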
Important properties of GARCH(p,q)
(4) If e_t follows a GARCH process, then e_t² follows an ARMA process.
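A short derivation of this for GARCH(1,1), added for clarity (not from the slides), with v_t = e_t² − σ_t² as the martingale-difference innovation:

```latex
e_t^{2} = \sigma_t^{2} + v_t
\;\Rightarrow\;
e_t^{2} = \omega + \gamma e_{t-1}^{2} + \beta \sigma_{t-1}^{2} + v_t
        = \omega + (\gamma+\beta)\, e_{t-1}^{2} + v_t - \beta v_{t-1},
```

using σ_{t-1}² = e_{t-1}² − v_{t-1}; hence e_t² follows an ARMA(1,1) process.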
Extension of ARCH and GARCH Models: Threshold GARCH
• When the lagged return is positive (good news yesterday), D = 0, so the effect of the lagged squared return on the current conditional variance is simply α.
• When the lagged return is negative (bad news yesterday), D = 1, so the effect of the lagged squared return on the current conditional variance is α + γ.
• Allowing for an asymmetric response has proved useful for modeling “leverage effects” in stock returns, which occur when γ > 0 (bad news raises volatility by more than good news).
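The threshold (GJR) GARCH(1,1) specification implied by these bullets can be written as follows (a sketch using the slide's α, γ, D notation; the β σ_{t-1}² term is the usual GARCH component):

```latex
\sigma_t^{2}
  = \omega + \alpha\, e_{t-1}^{2} + \gamma\, D_{t-1}\, e_{t-1}^{2}
    + \beta\, \sigma_{t-1}^{2},
\qquad
D_{t-1} =
\begin{cases}
0, & e_{t-1} \ge 0 \ \text{(good news)},\\
1, & e_{t-1} < 0 \ \text{(bad news)}.
\end{cases}
```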
Extension of ARCH and GARCH Models: exponential GARCH (EGARCH)
• Volatility is driven by both the size and the sign of shocks (positive and negative). Hence the model allows for an asymmetric response depending on the sign of the news.
• When the shock is positive, the impact of (e_{t-1}/σ_{t-1}) on ln(σ_t²) is α + γ.
• When the shock is negative, the impact of (e_{t-1}/σ_{t-1}) on ln(σ_t²) is −α + γ.
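One common EGARCH(1,1) specification consistent with these impacts (a sketch; the parameter names α, γ, β are chosen here to match the bullets) is:

```latex
\ln\sigma_t^{2}
  = \omega
    + \alpha \frac{e_{t-1}}{\sigma_{t-1}}
    + \gamma \left|\frac{e_{t-1}}{\sigma_{t-1}}\right|
    + \beta \ln\sigma_{t-1}^{2}.
```

For a shock of a given magnitude, the middle terms contribute (α + γ) times that magnitude when the shock is positive and (−α + γ) times it when the shock is negative, matching the bullets above.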
Extension of ARCH and GARCH Models: GARCH with exogenous variables
• Financial market volume, for example, often helps to explain market volatility.
Extension of ARCH and GARCH Models: GARCH-in-Mean (i.e., GARCH-M)
The conditional variance enters the conditional mean regression as a regressor.
• High risk, high return.
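A typical GARCH-M conditional mean equation (a sketch; x_t stands for any other regressors, and some variants use σ_t rather than σ_t² in the risk-premium term):

```latex
y_t = x_t'\beta + \delta\,\sigma_t^{2} + e_t,
\qquad e_t \mid \Omega_{t-1} \sim N\!\left(0,\sigma_t^{2}\right),
```

with δ > 0 capturing "high risk, high return."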
Estimating, Forecasting, and Diagnosing GARCH Models
• Diagnostics:
  • Estimate the model without GARCH in the usual way.
  • Look at the time-series properties of the squared residuals: correlogram, AIC, SIC, etc.
  • An ARMA(1,1) structure in the squared residuals suggests GARCH(1,1).
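A minimal Python sketch of this diagnostic step (assumed tooling: statsmodels; the data and the OLS mean regression are placeholders, not from the slides):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.stattools import acf

# Placeholder data: y is the series of interest, X the mean regressors.
rng = np.random.default_rng(1)
X = sm.add_constant(rng.standard_normal((500, 1)))
y = X @ np.array([0.1, 0.5]) + rng.standard_normal(500)

# Step 1: estimate the mean model without GARCH in the usual way (OLS here).
resid = sm.OLS(y, X).fit().resid

# Step 2: examine the time-series properties of the squared residuals.
sq = resid ** 2
print(acf(sq, nlags=10))               # correlogram of squared residuals
print(acorr_ljungbox(sq, lags=[5, 10]))  # Ljung-Box test for remaining ARCH effects
```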
Estimating, Forecasting, and Diagnosing GARCH Models
• Estimation: usually by maximum likelihood under the assumption of a normal conditional distribution. Maximum likelihood estimation finds the parameter values that maximize the likelihood function.
• Forecasting:
  • In financial applications, volatility forecasts are often of direct interest.
  • They also give better forecast confidence intervals: the 1-step-ahead conditional variance is used rather than the constant unconditional variance.
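A minimal estimation-and-forecast sketch using the third-party `arch` Python package (an assumption about tooling, not the software used in the original course; `returns` is a placeholder series of percentage returns):

```python
import numpy as np
from arch import arch_model

# Placeholder return series; replace with actual percentage returns.
returns = np.random.default_rng(2).standard_normal(1000)

# GARCH(1,1) with a constant mean, estimated by (quasi-)maximum likelihood
# under a conditionally normal distribution.
am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")
print(res.params)

# 1-step-ahead conditional variance forecast, e.g. for interval forecasts.
fcast = res.forecast(horizon=1)
print(fcast.variance.iloc[-1])
```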
Application: Stock Market Volatility
• Objective: model and forecast the volatility of daily returns on the New York Stock Exchange.
• Data:
  • Daily returns on the New York Stock Exchange (NYSE) from January 1, 1988, through December 31, 2001.
  • Excluding holidays, there are 3531 observations.
  • Estimation: observations 1-3461.
  • Forecast: observations 3462-3531.
Correlogram, Squared Standardized ARCH(5) residuals, NYSE Returns
GARCH(1,1) Model, NYSE Returns (coefficients on e_{t-1}² and σ_{t-1}²)
Correlogram, Squared Standardized GARCH(1,1) residuals, NYSE Returns
Estimated Conditional Standard Deviation, GARCH(1,1) Model, NYSE Returns
Estimated Conditional Standard Deviation, Exponential Smoothing, NYSE Returns
Conditional Standard Deviation, History and Forecast, GARCH(1,1) Model
Conditional Standard Deviation, Extended History and Extended Forecast, GARCH(1,1) Model
Is GARCH(1,1) enough most of the time?
• 330 GARCH-type models are compared in terms of their ability to forecast the one-day-ahead conditional variance.
• The models are evaluated out-of-sample using six different loss functions, where the realized variance is substituted for the latent conditional variance.
Hansen, Peter R. and Asger Lunde (2005): “A Forecast Comparison of Volatility Models: Does Anything Beat a GARCH(1,1)?” Journal of Applied Econometrics, 20: 873-889.
Is GARCH(1,1) enough most of the time?
• Data:
  • DM-$ spot exchange rate data:
    • the estimation sample spans the period from October 1, 1987 through September 30, 1992 (1254 observations), and
    • the out-of-sample evaluation sample spans the period from October 1, 1992 through September 30, 1993 (n = 260).
  • IBM stock returns:
    • the estimation period spans the period from January 2, 1990 through May 28, 1999 (2378 days), and
    • the evaluation period spans the period from June 1, 1999 through May 31, 2000 (n = 254).
Loss functions for forecast evaluation
MSE₂ and R2LOG are similar to the R² of the Mincer-Zarnowitz (MZ) regressions.
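As commonly defined in this literature (stated here for reference, not reproduced from the slide; σ̂_t² denotes the realized-variance proxy and h_t the model's conditional variance forecast over n evaluation periods):

```latex
\mathrm{MSE}_2 = n^{-1}\sum_{t=1}^{n}\left(\hat{\sigma}_t^{2} - h_t\right)^{2},
\qquad
\mathrm{R2LOG} = n^{-1}\sum_{t=1}^{n}\left[\ln\!\left(\hat{\sigma}_t^{2}/h_t\right)\right]^{2}.
```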
The test
Compare the loss of each alternative GARCH model with the loss of the GARCH(1,1) benchmark, giving the benefit of the doubt to the benchmark, i.e., GARCH(1,1). The maintained hypothesis is that GARCH(1,1) is better unless there is strong evidence against it.
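One way to formalize this benchmark comparison (a sketch in the spirit of the superior-predictive-ability test used in the cited paper; L is a chosen loss function, model 0 is the GARCH(1,1) benchmark, and models k = 1, …, m are the alternatives):

```latex
d_{k,t} = L\!\left(\hat{\sigma}_t^{2}, h_{0,t}\right) - L\!\left(\hat{\sigma}_t^{2}, h_{k,t}\right),
\qquad
H_0:\; \max_{k}\; E\!\left[d_{k,t}\right] \le 0,
```

i.e., under the null no competing model attains lower expected loss than the benchmark, so the benchmark is rejected only with strong evidence against it.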