Class 2: Measuring Market Risk
What does the company measure risk for?
What can we say about Sberbank shares? What can we predict?
• Movements in asset prices are (almost) unpredictable
• If someone could predict tomorrow's price, (s)he would trade on this information and move today's price to fair value
• This is the basis for the Efficient Markets hypothesis
• Prices move due to economic news arriving
  • Positive and negative news move the market up and down
• Although we can't predict future prices…
• …we can predict risk!
How to measure risk?
• Size of the position
  • $XXX mln
• Magnitude of price changes
  • Volatility: st. deviation / EWMA / GARCH
• Sensitivity (exposure) to risk factors (indices)
  • Beta / duration / delta
• Potential losses
  • VaR / CVaR (Expected Shortfall)
How to model market risk?
[Figure: simulated price paths from t0 to tn, with the probability distribution of the price (40/50/60) at maturity]
From prices to returns
• Prices are convenient for graphical analysis, but…
  • Non-stationary: properties of the stochastic process change over time
  • Must be corrected for dividends and stock splits
• Thus, prices should be normalized to compare dynamics over time and across securities
• Transforming prices into returns: R_t = (P_t + D_t − P_{t−1}) / P_{t−1}
• Return = percentage growth in the portfolio's value, including capital gain and accumulated dividends/coupons
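A minimal sketch of the return formula above (the prices and dividends are made-up numbers, not Sberbank data):

```python
# R_t = (P_t + D_t - P_{t-1}) / P_{t-1}
prices = [100.0, 102.0, 99.0, 101.5]     # P_t (hypothetical)
dividends = [0.0, 0.0, 1.0, 0.0]         # D_t paid at date t (hypothetical)

returns = [
    (prices[t] + dividends[t] - prices[t - 1]) / prices[t - 1]
    for t in range(1, len(prices))
]
print(returns)   # [0.02, -0.0196..., 0.0252...]
```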
Sberbank's risks in 2005-2013
• Average weekly return: 0.64% (33% per annum): r_avg = (1/T) Σ_{t=1..T} r_t
  • r_t is the return, T is the length of the sample period
• Total risk = volatility
  • Variance: σ² = (1/T) Σ_{t=1..T} (r_t − r_avg)²
  • Standard deviation σ: 6.6% (47% p.a.)
• Systematic risk = beta
  • In the market model R_{i,t} = α_i + β_i·R_{M,t} + ε_{i,t}
  • R_{M,t}: market index return, ε_{i,t}: error term
  • β_Sberbank = 1.1
• However, risks change over time!
• Let's compute risks based on a rolling window of 1 year
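A rough sketch (not the course's own code) of the rolling 1-year estimates, assuming `stock` and `market` are aligned numpy arrays of weekly returns:

```python
import numpy as np

def rolling_vol_and_beta(stock, market, window=52):
    """Rolling volatility (total risk) and market-model beta (systematic risk)."""
    vols, betas = [], []
    for t in range(window, len(stock) + 1):
        s, m = stock[t - window:t], market[t - window:t]
        vols.append(s.std(ddof=0))                                   # sigma over the window
        betas.append(np.cov(s, m, ddof=0)[0, 1] / m.var(ddof=0))     # beta = Cov(R_i, R_M) / Var(R_M)
    return np.array(vols), np.array(betas)
```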
How to improve the volatility measure?
• Basic approach: historical volatility
• Moving Average (MA) with equal weights
• How long should the estimation period be?
How to improve the volatility measure?
• EWMA: σ²_t = λ·σ²_{t−1} + (1−λ)·r²_{t−1} = (1−λ) Σ_{k≥1} λ^{k−1}·r²_{t−k}
• Exponentially Weighted Moving Average quickly absorbs shocks
• How to choose λ?
  • Minimize the Root Mean Squared Error: RMSE = √[(1/T) Σ_{t=1..T} (σ²_t − r²_t)²]
  • λ = 0.94 for developed markets
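A minimal sketch of the EWMA recursion with λ = 0.94; `returns` is assumed to be a list of daily returns:

```python
def ewma_variance(returns, lam=0.94):
    """EWMA recursion: sigma²_t = lam * sigma²_{t-1} + (1 - lam) * r²_{t-1}."""
    var = returns[0] ** 2          # seed the recursion with the first squared return (an assumption)
    forecasts = []
    for r in returns:
        forecasts.append(var)                  # variance forecast made before observing r
        var = lam * var + (1 - lam) * r ** 2   # update after observing r
    return forecasts
```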
How to improve the volatility measure?
• GARCH(1,1): σ²_t = a + b·σ²_{t−1} + c·ε²_{t−1}
• The Generalized AutoRegressive Conditional Heteroskedasticity model is more general and flexible than EWMA
• Can include additional effects:
  • More lags
  • Stronger reaction to negative shocks (leverage effect)
• Is a more general model always good?
  • More parameters lead to larger estimation error
  • GARCH is used less frequently in risk management than EWMA
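A sketch of the GARCH(1,1) variance recursion with hypothetical, pre-estimated parameters a, b, c (in practice they are fitted by maximum likelihood, for example with the `arch` package):

```python
def garch_variance(returns, a=1e-5, b=0.90, c=0.08):
    """GARCH(1,1): sigma²_t = a + b * sigma²_{t-1} + c * eps²_{t-1} (zero-mean returns assumed)."""
    var = a / (1 - b - c)          # start at the unconditional (long-run) variance
    forecasts = []
    for r in returns:
        forecasts.append(var)
        var = a + b * var + c * r ** 2
    return forecasts
```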
How to improve the volatility measure?
• Implied volatility: based on options' market prices and the (Black-Scholes) model
  • Forward-looking!
  • But depends on the model
  • Only for assets with liquid options
• Realized volatility: based on intraday data
  • E.g., prices over hourly intervals
  • May be biased by trading effects
  • Only for liquid assets
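A small sketch of the realized-volatility calculation: the daily realized variance is the sum of squared intraday (e.g. hourly) returns; the hourly returns below are made up:

```python
import math

hourly_returns = [0.002, -0.004, 0.001, 0.003, -0.002, 0.005, -0.001, 0.002]  # one trading day (hypothetical)
realized_variance = sum(r ** 2 for r in hourly_returns)
realized_vol = math.sqrt(realized_variance)   # daily realized volatility
print(realized_vol)
```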
How to improve the beta measure?
• Historical beta
  • Estimation error: if you estimated a beta of 2, the true beta is probably 1.5
  • Low predictive ability
  • Trade-off when choosing the estimation period
• Non-linear effects
  • E.g., allow beta to depend on the market index
• More factors: multi-factor model R_{i,t} = α_i + Σ_k β_{k,i}·I_{k,t} + ε_{i,t}
  • Global / regional / country market indices
  • Industry indices
  • Macro-factors: oil price, inflation, exchange rates, interest rates, …
  • Investment styles: small-cap, value (low P/E), momentum (past winners)
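A rough sketch of estimating the multi-factor model by OLS; `asset` is a vector of asset returns and `factors` a T×K matrix of factor returns (both hypothetical inputs):

```python
import numpy as np

def multifactor_betas(asset, factors):
    """OLS estimates of alpha and the factor betas in R_i = alpha + sum_k beta_k * I_k + eps."""
    X = np.column_stack([np.ones(len(asset)), factors])     # constant + factor returns
    coef, *_ = np.linalg.lstsq(X, asset, rcond=None)
    alpha, betas = coef[0], coef[1:]
    return alpha, betas
```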
How to apply the beta approach to other assets?
• Bonds:
  • Duration D measures the elasticity of the bond's price to interest rates
  • Convexity C measures the second-order effect
  • For a small change in the interest rate y: ΔP/P ≈ −D·(Δy/y) + ½·C·(Δy/y)²
• Derivatives:
  • Delta δ measures sensitivity to the underlying asset's price
  • Gamma γ measures the second-order effect
  • Vega measures sensitivity to volatility
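As an illustration of the first-plus-second-order sensitivity idea, here is a sketch of the bond price-change approximation written in the common modified-duration convention ΔP/P ≈ −D·Δy + ½·C·(Δy)² (the slide states the elasticity form with Δy/y); the duration, convexity and yield move below are hypothetical:

```python
def bond_price_change(duration, convexity, dy):
    """Approximate relative price change for a yield move dy (modified-duration convention)."""
    return -duration * dy + 0.5 * convexity * dy ** 2

# Hypothetical bond: modified duration 5, convexity 30, yield rises by 50 bps
print(bond_price_change(duration=5.0, convexity=30.0, dy=0.005))   # ≈ -0.0246, i.e. roughly a 2.5% price drop
```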
Why are volatility and beta not good enough?
• Drawbacks of volatility
  • It measures speculative risk: both negative and positive deviations
  • It does not capture fat tails and asymmetry
• Drawbacks of beta
  • Can't capture non-linear effects (especially important for derivatives)
  • Ignores omitted risks
• Even more important problems:
  • Communication: how to explain it to your boss
  • Comparability of different types of risk
How to interpret VaR?
• Value-at-Risk measures potential losses
• Maximum loss due to market fluctuations over a certain time period with a given probability (confidence level)
  • Prob(Loss over 1 day < VaR) = 95%: RiskMetrics (original approach of investment banks)
  • Prob(Loss over 10 days < VaR) = 99%: Basel approach (banking regulators), with a higher confidence level and longer holding period
• What is a better confidence level and holding period?
  • The higher the confidence level, the lower the precision
  • The holding period depends on the time necessary to close or hedge the position
Method 1: Historical simulation
• Nonparametric approach
  • Assumes that the distribution of future returns is well approximated by the empirical distribution of returns over a certain period in the past
• Applied directly to the asset:
  • VaR = percentile of historical returns
  • Easy to obtain by sorting the returns or plotting a histogram
• Applied to the portfolio:
  • Portfolio return = function of assets or risk factors
  • Must know portfolio weights, factor betas or the price function (e.g. Black-Scholes)
  • Compute hypothetical historical returns of the current portfolio
• Note: similar to bootstrap
  • Uses past returns as possible scenarios
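A minimal sketch of the percentile calculation, assuming `returns` is an array of past (or hypothetical current-portfolio) returns:

```python
import numpy as np

def historical_var(returns, confidence=0.95):
    """VaR as the loss at the (1 - confidence) percentile of historical returns."""
    return -np.percentile(returns, 100 * (1 - confidence))

# var_95 = historical_var(returns)   # a positive number = potential loss as a fraction of value
```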
Critique: pros and cons
• Easy and simple
• Model-free
  • No need to assume a normal distribution or forecast volatility
  • Correlations are embedded
• Choice of the sample period
  • Usually, at least 1 year
  • Using a short history, we may miss rare shocks
• Slow reaction to changes in risks
• Hard to extrapolate to a longer horizon
How to modify the historical simulation approach?
• Time-weighted historical simulation: more weight to recent observations
  • Each historical return is assigned a probability weight
  • Probabilities decrease geometrically with the lag: (1−λ)·λ^t for lag t
  • Usually, 0.95 < λ < 0.99
  • Sort the returns and compute a percentile by accumulating the weights
  • Higher weight for observations from the same month
    • For seasonal commodities, such as natural gas
• Filtered historical simulation: combining HS with dynamic variance (volatility scaling)
  • Estimate the time series of σ_t
  • Compute the historical standardized returns R_t/σ_t
  • Compute the percentile of standardized returns and multiply it by the current volatility forecast σ_0 to obtain VaR
  • This solves the problem that current volatility may differ from the volatility prevailing in the past
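A rough sketch of filtered historical simulation, reusing the `ewma_variance` sketch from the EWMA slide; all names are illustrative:

```python
import numpy as np

def filtered_hs_var(returns, confidence=0.95, lam=0.94):
    """Filtered HS: percentile of standardized returns, rescaled by the current volatility forecast."""
    sigmas = np.sqrt(ewma_variance(returns, lam))           # sigma_t for each date
    standardized = np.asarray(returns) / sigmas             # R_t / sigma_t
    current_sigma = np.sqrt(lam * sigmas[-1] ** 2 + (1 - lam) * returns[-1] ** 2)  # sigma_0 forecast
    return -np.percentile(standardized, 100 * (1 - confidence)) * current_sigma
```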
Method 2: delta-normal approach for a single asset
• Assuming that portfolio returns are normally distributed: VaR = k_{1−α}·V·σ_t
  • Quantile k_{1−α}: 1.65 (95%) or 2.33 (99%)
  • With daily data, we usually assume that the expected return is 0
• Holding period may be extended up to 10 days
  • T-day VaR = daily VaR · √T
  • Assuming stationarity and zero autocorrelation
Example: VaR (95%, 1d) for 1 mln IBM shares with price $24; annual σ = 60%
• Size of the position: calculate the $ value of the exposure
• Size of risk (volatility): from annual to daily via the square root of time (divide by 16)
• Normal distribution and confidence level: multiply by 1.645
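Putting the steps together in code (a sketch; the position is read as 1 mln shares at $24 each):

```python
shares = 1_000_000              # size of the position (assumed: 1 mln shares)
price = 24.0                    # $ per share
annual_sigma = 0.60             # annual volatility
k_95 = 1.645                    # normal quantile for the 95% confidence level

exposure = shares * price                   # $ value of the exposure
daily_sigma = annual_sigma / 16             # square root of time: sqrt(256 trading days) = 16
var_95_1d = exposure * daily_sigma * k_95   # 1-day 95% VaR
print(f"VaR(95%, 1d) ≈ ${var_95_1d:,.0f}")  # ≈ $1.48 mln under this reading
```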
Histogram of daily S&P 500 returns and the normal distribution, 2001-2010 • Can we justify normal distribution and delta-normal VaR?
Delta-normal method for a portfolio: risk mapping
• Decomposition of the portfolio into multiple risk factors: R_p = β'F + ε
• VaR = k_{1−α}·V·√(β'·Σ_F·β)
  • R_p is decomposed based on a Taylor series
  • β: vector of portfolio weights or sensitivities of portfolio return to factor returns
  • Σ_F: covariance matrix of risk factors
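A sketch of the risk-mapping calculation; the two-factor betas, volatilities and correlation below are hypothetical:

```python
import numpy as np

def delta_normal_var(V, beta, cov, k=1.645):
    """Portfolio VaR = k * V * sqrt(beta' * Sigma_F * beta)."""
    beta = np.asarray(beta)
    return k * V * np.sqrt(beta @ np.asarray(cov) @ beta)

# Hypothetical: $10 mln portfolio mapped to two factors with daily vols 1% and 2%, correlation 0.3
cov = np.array([[0.01**2, 0.3 * 0.01 * 0.02],
                [0.3 * 0.01 * 0.02, 0.02**2]])
print(delta_normal_var(V=10_000_000, beta=[0.6, 0.4], cov=cov))
```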
Critique: pros and cons
• Quick computations
• Easy to extrapolate to a longer horizon
• Quick reaction to changes in risks
• Strong assumption of a normal distribution
• Cannot properly handle complicated derivatives with non-linear payoffs
• Computations grow rapidly with the number of assets/factors
How to deal with deviations from the normal distribution?
• Fat tails / skewness
  • Adjusted quantiles based on an (asymmetric) Student's t distribution or a mixture of normal distributions
  • Modified VaR: the Cornish-Fisher expansion, taking into account skewness S and kurtosis K:
    z_cf = z_c + (1/6)(z_c² − 1)·S + (1/24)(z_c³ − 3z_c)·(K − 3) − (1/36)(2z_c³ − 5z_c)·S², VaR = −(μ + z_cf·σ)
  • μ is the mean, σ is the std deviation, z_c is the number of std deviations for VaR (left-tail quantile, e.g. −1.65 for 95%)
• Non-linear relationships (e.g., for options): dV = δ·dS + ½·γ·dS² + …
  • Delta-gamma approximation: VaR = |δ|·k_{1−α}·σ·S − ½·γ·(k_{1−α}·σ·S)²
  • Will this method understate or overstate risk in the presence of options?
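A sketch of the Cornish-Fisher adjustment in one common sign convention (the left-tail quantile z is negative here); the input moments are hypothetical:

```python
def cornish_fisher_var(mu, sigma, skew, kurt, z=-1.645):
    """Modified VaR via the Cornish-Fisher expansion (kurt is ordinary kurtosis, 3 for a normal)."""
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * (kurt - 3) / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return -(mu + z_cf * sigma)

# Hypothetical daily returns: mean 0.05%, vol 1.5%, negative skew, fat tails
print(cornish_fisher_var(mu=0.0005, sigma=0.015, skew=-0.8, kurt=6.0))
```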
Method 3: Monte Carlo simulation
• Model the (multivariate) factor distributions
  • Stocks: Geometric Brownian Motion / with jumps
  • Interest rates: Vasicek / CIR / multi-factor models
• Generate scenarios and compute the realized P&L
  • Using factor innovations from the model
• Plot and analyze the empirical distribution of P&L
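A minimal Monte Carlo sketch for a single stock under Geometric Brownian Motion; the position, drift and volatility are hypothetical (they echo the earlier IBM example):

```python
import numpy as np

def monte_carlo_var(S0, position, mu, sigma, horizon_days=1, n_sims=100_000,
                    confidence=0.95, seed=0):
    """Simulate GBM price changes over the horizon and read VaR off the P&L distribution."""
    rng = np.random.default_rng(seed)
    dt = horizon_days / 252
    z = rng.standard_normal(n_sims)
    S_T = S0 * np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
    pnl = position * (S_T - S0)
    return -np.percentile(pnl, 100 * (1 - confidence))

# Hypothetical: 1 mln shares at $24, 60% annual vol, zero drift
print(monte_carlo_var(S0=24.0, position=1_000_000, mu=0.0, sigma=0.60))
```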
Critique: pros and cons
• Most powerful and flexible
  • Can be applied to the most complicated instruments
    • E.g. path-dependent options
  • Allows modeling tail risk with higher precision
• Intellectual and technological skills required
  • Complexity
  • Looks like a black box
• Lengthy computations
  • Slower reaction to changes in risks
• Model risk
  • E.g., in estimating cross-factor dependencies
Which approaches to measure VaR are used by banks? • Most banks rely on Historical Simulation method with Full revaluation
Back-testing VaR
• Verification of how precisely VaR is measured
• Compare the % of violations (cases when the losses exceed VaR) with the predicted frequency
• Test whether the difference is significant:
  • H0: % violations = expected frequency
  • p-value = 1 − binomdist(#violations, #obs., exp. freq., TRUE)
  • E.g., 1 − binomdist(18, 252, 0.05, TRUE) = 0.07
• Historical approach: based on the actual P&L
  • Required by Basel
  • Helps to identify the model's weaknesses, mistakes in the data, and intra-day trading
  • Often, the actual P&L produces a lower-than-expected frequency of VaR violations due to day trading
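The same test in Python (1 − binomdist(x, n, p, TRUE) corresponds to the binomial survival function in scipy):

```python
from scipy.stats import binom

violations, days, expected_freq = 18, 252, 0.05
p_value = 1 - binom.cdf(violations, days, expected_freq)   # = binom.sf(violations, days, expected_freq)
print(p_value)   # cannot reject H0 at the 5% level if p_value > 0.05
```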
Violations: delta-normal (5%), historical simulation (7%), filtered hist. simulation (8%)
Back-testing VaR
• Basel: back-test is based on one year of daily data
• Small sample problem
  • Need long history for high confidence level (99%) to ensure statistical accuracy of VaR forecasts
How else can we back-test different VaR models (besides the percentage of violations)?
• Accuracy: the difference between VaR and actual daily P&L
  • An accurate model will be highly reactive, in the sense that it will rise and fall in a way that corresponds to daily fluctuations in the P&L. As a result, it will have high information content; management will be able to see changes in market conditions reflected quickly. Excess RWAs will be avoided, as VaR falls rapidly when volatility declines.
• Stability: the change in VaR from day to day
  • A stable model will not be prone to surprising leaps in VaR when risk positions change only slightly.
  • A stable model will avoid sudden drops in VaR when data points fall out of the time series and will not be overly reactive to small, short-term changes in market conditions.
• Precision in predicting losses: mean violation