
Modelling the CRM for the Correlation Trading Portfolio

Dherminder Kainth, Jan Kwiatowski & Douglas Muirden, Royal Bank of Scotland, May 19, 2010




Presentation Transcript


  1. Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden, Royal Bank of Scotland, May 19, 2010

  2. Agenda • Regulatory Requirements • Challenges in Meeting Regulatory Requirements • RBS Approach to CRM Calculation • Modelling Approaches and Assumptions • Price Risk • Simulation of Market • Default Risk • Appendix • Computational Implementation of CRM

  3. Regulatory Requirements • The All Price Risk Measure is a special form of the Incremental Risk Charge, described in 7.10.55S R (1), for positions in the correlation trading book • The “All Price Risk Measure” must • Adequately capture all price risks at the 99.9% confidence level over a capital horizon of one year • Under the assumption of a constant level of risk • And be run at least weekly • The price risks to be captured include: • Defaults, including the ordering of defaults; • Credit spread risk; • Volatility of implied correlations, including the cross effect between spreads and correlations; • Index to single-name basis and implied correlation of index to bespoke portfolio basis; • Recovery rate volatility; • The risk of dynamic hedging and the cost of rebalancing • Though interest rate and foreign exchange risk are not explicitly mentioned, we consider them to be included in “All Price Risk”

  4. Agenda • Regulatory Requirements • Challenges in Meeting Regulatory Requirements • RBS Approach to CRM Calculation • Modelling Approaches and Assumptions • Price Risk • Simulation of Market • Default Risk • Timing and Next Steps • Appendix • Computational Implementation of CRM

  5. Naive Implementation of CRM • Naively implementing the CRM, i.e., computing the 99.9% worst loss over a 1-year horizon on RBS’s entire correlation trading portfolio, is very difficult • For example, using Monte Carlo, we would need to evolve the market forwards in time, pricing and hedging the portfolio as the desk does, and tracking P&L over a 1-year horizon • A back-of-the-envelope calculation immediately reveals the high likelihood of failure: Figure 1: Numbers and types of trades in our portfolio, along with representative times to compute PV and risks for one trade on one computer

  6. Naive Implementation of CRM (cont’d) • Assuming a rehedging frequency of once a month, a grid of 300 computers, and the minimum number of paths needed to estimate the 99.9% confidence limit (i.e., 1,000), we would need ~2,600 hours to compute results for just the bespoke CDOs • Recalibration of the market (which needs to happen at every valuation and hedging time point) adds substantially to this timing • Over the next few slides we highlight: • How one might address the core issue of computational intractability • Issues in simulating the market • The subjectivity of hedging • We will in effect pose a series of questions; the decisions that we have made form the basis of the RBS approach to computing the CRM, discussed in more detail in the following section.
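The back-of-the-envelope arithmetic above can be sketched as follows. The per-trade revaluation time of 234 seconds is an illustrative assumption chosen to reproduce the ~2,600-hour figure, not a number taken from Figure 1.

```python
# Back-of-envelope estimate of the naive CRM cost (illustrative inputs only).
def naive_crm_hours(n_trades, secs_per_reval, n_paths, rehedges_per_year, n_cpus):
    """Wall-clock hours to revalue every trade at every rehedge date on
    every Monte Carlo path, spread evenly across a compute grid."""
    total_revals = n_trades * n_paths * rehedges_per_year
    return total_revals * secs_per_reval / n_cpus / 3600.0

# e.g. 1,000 bespoke CDOs, ~234s per revaluation, 1,000 paths,
# monthly rehedging, a 300-machine grid:
hours = naive_crm_hours(1_000, 234, 1_000, 12, 300)
print(round(hours))  # 2600
```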

  7. Possible Areas of Optimisation: Pricing Algorithms • Choice of Algorithm • Can we use convolution? • Can importance sampling for the Monte Carlo replace recursion? • The 1-factor Gaussian copula (with random recovery) is very popular because rapid computational schemes exist. The ASB algorithm (or variants thereof) is commonly used in the industry because it returns quasi-exact PVs and risks rapidly • Faster pricing approaches are well known in the literature; however, these are to some extent (uncontrolled) approximations to the true price: • LHP (Large Homogeneous Portfolio) • Conditional Gaussian approach (Shelton) • Saddlepoint methods • Stein • The choice of scheme depends on the trade-off between accuracy and speed • Optimisation of the Implementation • Parallelisation of the code - currently valuations and risks are computed on a grid. Buy more computers? • Grid performance does not necessarily scale linearly - data passing is a limiting factor • Front office pricing code focuses on accuracy: potential speed-ups by, for example, relaxing tolerances whilst maintaining high levels of accuracy • Rewriting time-critical parts of the code in assembler?
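The recursion-based schemes mentioned above all exploit conditional independence: given the common factor, the conditional loss distribution can be built up one name at a time. A minimal sketch on an integer loss grid, assuming homogeneous unit losses (no random recovery, which would widen the grid):

```python
import numpy as np

# Sketch of an ASB-style recursion for the factor-conditional loss
# distribution, given conditional default probabilities p_i and integer
# loss units u_i per name. Illustrative homogeneous example.
def conditional_loss_dist(p, u, grid_size):
    dist = np.zeros(grid_size)
    dist[0] = 1.0                       # start: no names defaulted
    for pi, ui in zip(p, u):
        new = dist * (1.0 - pi)                       # name survives
        new[ui:] += dist[:grid_size - ui] * pi        # name defaults: shift loss
        dist = new
    return dist

p = np.full(100, 0.02)    # 100 names, 2% conditional default probability
u = np.full(100, 1)       # one loss unit per name
dist = conditional_loss_dist(p, u, 101)
print(abs(dist.sum() - 1.0) < 1e-12)                       # True: a distribution
print(abs((np.arange(101) * dist).sum() - 2.0) < 1e-9)     # True: mean = 100 * 0.02
```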

  8. Possible Approaches: Changing Mapping Approaches • When pricing bespoke tranches within a copula-based model, we apply mapping technologies to determine the base correlation for the bespoke - this reflects the different riskiness of the bespoke tranche relative to the index • Loss Fraction (“LF”) mapping (the approach used by RBS and much of the industry) is slow - it requires the inversion of prices to determine correlations • Consider the use of a faster mapping technique such as At-the-Money (“ATM”) mapping • However, the RBS front office uses LF mapping to risk manage its correlation book, and LF deltas differ from ATM deltas • Valuing the current portfolio and hedges using ATM mapping rather than LF will therefore make the book appear unhedged • If we used ATM mapping, we would need to modify RBS’s current portfolio to achieve the same “level of risk” as under LF mapping and then apply the different mapping technique

  9. Possible Approaches: Changing Mapping Approaches • We demonstrate the effect of the different mapping approaches in two scenarios: • Figure 2 shows the effect of ATM and LF mapping when mapping iTraxx S9 to CDX S9. Due to the substantial differences between the two indices, neither mapping method produces satisfactory results; however, we note that LF mapping shifts the market correlation curve in the right direction (as opposed to ATM mapping) • Figure 3 shows the effect of ATM and LF mapping when mapping iTraxx S9 5-year to 7-year. The two methods produce similar results, with slightly higher correlation values for LF mapping (Figure 2: Mapping iTraxx9 to CDX9 using ATM mapping and LF mapping; Figure 3: Mapping iTraxx9 5Y to 7Y using ATM mapping and LF mapping)

  10. Subjectivity in Hedging • Typically traders hedge a position in a CDO tranche [a, b] primarily using the constituent CDSs and the index, and sometimes with an additional tranche [l, u] • Strategies include: • Delta hedging movements in the single name CDSs • Delta hedging movements in the index • Delta and gamma hedging movements in the index • Hedging parallel shifts in correlation • Hedging default risk • Regression-based hedging • Traders are free to use some or all of the strategies outlined above; the choice will change depending on market conditions and trader outlook • Algorithmically predicting the hedging strategy is therefore very difficult • Hedging is computationally expensive; furthermore it is very subjective, and implementing only a simplistic approach will give rise to greater slippage

  11. Simulation of the Market • Simulating the universe of observed prices relevant to the CDO book forwards over periods of up to one year is challenging • We need to model possible movements in yield curves and FX rates • We need to capture the dynamics of market-implied CDS spreads to model the price risk. Desiderata for the evolution of the CDS spreads include: • The impact of rating migrations (jumps?) • Empirical co-dependence between CDS spreads shows regional and sectoral variation • Co-dependence between CDS spreads is time dependent - showing regime-like behaviour • Level-dependent volatility • Modelling the index tranche market is, if anything, even more challenging • The observed index tranche market comprises a set of standard tranches on the liquid indices • Given the occurrence of defaults, some of these detachments have changed - e.g., for high yield the original (0, 10%) tranche has been completely wiped out

  12. Simulation of the Market • Typically these index tranche prices are mapped into base correlations using the (random recovery) Gaussian copula. In simulating the market forwards in time, we need to evolve the price / correlation surface • Can we evolve correlations, e.g., additively? • Correlations are clearly bounded between 0 and 1 • However, the problem is far more subtle than this: it rapidly becomes clear that an arbitrary set of correlations does not describe a valid set of prices • Applying historical moves in base correlation to the current base correlation curve can lead to arbitrage situations, for example negative tranche spreads • In the following graphs, the 3-month move in base correlations from September 2008 to December 2008 is applied to the current base correlation curve to obtain a shifted correlation curve • As can be seen from the graph on the bottom left, the resulting shifted base correlation curve produces tranche spreads which eventually become negative

  13. Evolving correlations can lead to arbitrage opportunities (Figure 4: Historic Base Correlation Moves (iTraxx 5y); Figure 5: Historic change applied to spot; Figure 6: Base Correlation → Tranche Prices; Figure 7: Base Correlation → Tranche Prices, zoomed in)

  14. Simulation of the Market (cont’d) • For the prices of index tranches to be admissible (i.e., for the absence of arbitrage), a set of strong conditions (which have effectively never been violated by the market quoted points) must hold • Typically these conditions are expressed in terms of the ETL (Expected Tranche Loss), denoted here by ETL(K, T) • Intuitively, this is just the price of a European (capped call) option on the loss; more formally, we define it as the expected loss on an equity tranche of width K at time T, as seen from time 0: ETL(K, T) = E[min(L_T, K)] • A number of boundary conditions are immediately apparent: • An equity tranche cannot lose more than its width, i.e., ETL(K, T) ≤ K • To ensure no arbitrage, the density of the loss distribution must be non-negative for all strikes and times. The ETL is just a normalised price of a call option on the loss; hence the non-negativity of the loss density implies that ETL(K, T) is a concave function of K • Losses cannot be reversed - hence the ETL of an equity tranche must be a constant or increasing function of T
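The admissibility conditions above lend themselves to a direct numerical check on a grid of strikes and times. A minimal sketch, assuming a uniform strike grid and un-normalised expected losses E[min(L_T, K)] (the example surface is synthetic):

```python
import numpy as np

# Check the three no-arbitrage conditions on an ETL surface: (1) an equity
# tranche cannot lose more than its width, (2) the loss density is
# non-negative, i.e. E[min(L_T, K)] is concave in K (uniform K grid assumed),
# and (3) losses are irreversible, so the ETL is non-decreasing in T.
def etl_admissible(etl, strikes):
    capped = np.all(etl <= strikes[None, :] + 1e-12)           # (1) ETL <= K
    concave_in_k = np.all(np.diff(etl, n=2, axis=1) <= 1e-12)  # (2) concave in K
    increasing_in_t = np.all(np.diff(etl, axis=0) >= -1e-12)   # (3) monotone in T
    return bool(capped and concave_in_k and increasing_in_t)

strikes = np.array([0.03, 0.06, 0.09, 0.12])
etl = np.array([[0.010, 0.015, 0.018, 0.020],    # T = 1y (synthetic)
                [0.020, 0.030, 0.036, 0.040]])   # T = 2y (synthetic)
print(etl_admissible(etl, strikes))  # True
```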

  15. Agenda • Regulatory Requirements • Challenges in Meeting Regulatory Requirements • RBS Approach to CRM Calculation • Modelling Approaches and Assumptions • Price Risk • Simulation of Market • Default Risk • Appendix • Computational Implementation of CRM

  16. RBS Approach – Disaggregation of the CRM Calculation into Default and Price Risk • Issues with Simulation • An infeasibly large number of computations is required to estimate the 99.9th percentile • Calculating hedges is computationally very expensive • The hedging strategy is very subjective – dependent upon the market and the trader’s view of the future • Definition of Price & Default Risk • We define Price Risk as the impact on the portfolio of all moves in the market except default events • Default Risk is defined as the impact on portfolio value of default events • Default events are irreversible; price moves are reversible - names cannot come back out of default • Different time horizons apply to Price Risk and Default Risk: • We can hedge price risk – hence the time horizon for price risk depends on the hedge frequency (days to 1 month) • Defaults have a longer natural timescale – the number of defaults in 1 month is minimal • RBS chosen approach: evaluate Price Risk and Default Risk separately, then aggregate to obtain the CRM • The constant level of risk allows convolution of Price Risk (up to 1 month for re-hedging), cf. IRC • Reduces the number of computations required for Price Risk • Removes the need for extensive computation of sensitivities and reduces subjectivity in the choice of hedging algorithm • Default risk must be evaluated separately – defaults are irreversible. Use Monte Carlo for default risk • Enables the development of an importance sampling algorithm for Default Risk • Is more conservative: double counts defaults combined with large spread move scenarios

  17. Price Risk - Constant Level of Risk • Mathematically speaking, the constant level of risk assumption translates to assuming an identical loss distribution after each time interval Δ, corresponding to 1/(hedging frequency); i.e., after every hedge interval we are able to re-hedge such that the overall riskiness of RBS’s portfolio is identical to today’s level • Assume Δ = 1 month. Then the constant-level-of-risk P/L distribution over 1 year is the convolution of 12 copies of the 1-month P/L distribution • This is very powerful: • We do not need to compute actual hedges, just monthly P/L • Convolution allows us to get easily into the tail, i.e., to estimate the 99.9% level • This leads to significant savings in time – the computation becomes feasible without the need to move away from our books-and-records valuation approaches (i.e., CRM and desk approaches are consistent) • Removes the subjectivity in the choice of hedging approach • Obviously convolution cannot be used for defaults (names that default over a month would need to come back out of default!)
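The 12-fold convolution above can be sketched numerically: discretise the 1-month P/L sample onto a uniform cash grid and convolve twelve copies, which reaches the 99.9th percentile far more cheaply than simulating a full year of paths. The monthly P/L sample below is synthetic (standard normal), purely for illustration:

```python
import numpy as np

# Constant-level-of-risk convolution: 1-year P/L = 12-fold convolution of
# the 1-month P/L distribution, discretised on a 0.05-wide grid.
rng = np.random.default_rng(0)

monthly_pnl = rng.normal(0.0, 1.0, size=500)        # ~500 simulated 1M P/Ls
hist, _ = np.histogram(monthly_pnl, bins=241, range=(-6.025, 6.025))
p_month = hist / hist.sum()                         # monthly P/L probabilities

p_year = np.array([1.0])
for _ in range(12):                                 # convolve 12 monthly copies
    p_year = np.convolve(p_year, p_month)

# 99.9% loss quantile of the annual distribution (support starts at 12 * -6).
support = -72.0 + 0.05 * np.arange(p_year.size)
loss_999 = support[np.searchsorted(np.cumsum(p_year), 0.001)]
print(loss_999 < monthly_pnl.min())  # True: the convolved tail reaches far
                                     # beyond the worst single monthly scenario
```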

  18. Constant Level of Risk - Convolution

  19. Convolution lets us get into the tails!

  20. Price Risk – RBS Algorithm • Choose a time horizon over which the portfolio could be re-hedged (2–4 weeks) • Simulate the market (index tranches, single name CDS, yield curves, basis, etc.) over the hedging interval using our historical simulation algorithm (see below) • Compute the P/L over this period; repeat ~200–500 times to compute a distribution • Use pricing technologies consistent with those used for books-and-records valuations (essentially identical analytics) • Use stressed market scenarios and probability weights (see below) to compute the full 1-month P/L distribution • Convolve N times (N = 12 if the hedge frequency is 1 month) to obtain the full P/L distribution over 1 year • The use of convolution implies the absence of autocorrelation, i.e., this month’s P/L distribution is uncorrelated with next month’s • We will quantify this by examining the impact on price risk of changing the hedging horizon • From a final-number perspective, the impact of autocorrelation will be captured via the use of stressed starting scenarios

  21. Stressed Starting Scenarios • More significantly, however, the constant level of risk assumption implies that (at the end of each hedging interval, despite significant market moves) we are able to re-hedge our CDO portfolio to the same level of riskiness as today • This is a strong assumption. We therefore aim to apply an approach similar to that used for the IRC, using stressed starting scenarios • Algorithmically: • Choose 5 starting scenarios, i.e., the market is in one of 5 starting states, each with a weight – the Gauss-Hermite weight. Our CDO positions will only be partially hedged in each scenario; the cost of this partial hedging will form part of the final P/L distribution • The starting scenarios correspond to dates on which the iTraxx, CDX and HY indices assumed the values implied by the Gauss-Hermite percentiles • The market is then evolved as per the algorithm above; the total loss distribution for 1 month is computed, accounting for the impact of the stressed scenarios • Hedging: • Allow partial (risk-based) re-hedging of the book when switching to stressed scenarios • Model the relevant cost of re-hedging – based on applicable market bid/offers but also including a liquidity premium

  22. Stressed Starting Scenarios • Choose stress scenarios to be the market on particular days in our history • Proxy stress events by the absolute level of iTraxx spreads • Choose the days in history corresponding to stress events by finding days when the quantile of the index matches the probability levels implied by Gauss-Hermite
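The date-selection step above can be sketched as follows: convert 5 Gauss-Hermite nodes to standard-normal probability levels, then pick the historical dates whose index level sits at the matching empirical quantile. The iTraxx spread history below is synthetic, purely for illustration:

```python
import numpy as np
from math import erf, sqrt

# Map Gauss-Hermite nodes to probability levels and pick matching dates.
def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

nodes, weights = np.polynomial.hermite.hermgauss(5)
z = nodes * np.sqrt(2.0)               # standard-normal abscissae
probs = weights / np.sqrt(np.pi)       # Gauss-Hermite scenario weights
levels = [norm_cdf(x) for x in z]      # probability level of each scenario

rng = np.random.default_rng(1)
itraxx = np.cumsum(rng.normal(0.0, 2.0, 1000)) + 100.0   # synthetic spreads
order = np.argsort(itraxx)             # dates sorted by spread level
scenario_dates = [order[int(p * (itraxx.size - 1))] for p in levels]

print(len(scenario_dates), round(float(probs.sum()), 6))  # 5 scenarios, weights sum to 1
```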

  23. Agenda • Regulatory Requirements • Challenges in Meeting Regulatory Requirements • RBS Approach to CRM Calculation • Modelling Approaches and Assumptions • Price Risk • Simulation of Market • Default Risk • Appendix • Computational Implementation of CRM

  24. Price Risk: Simulation of Market Variables • Typical Approach in the Industry: • Choose a stochastic differential equation (SDE) to describe the market data parameter (e.g., FX) that we wish to simulate • This immediately introduces model dependence • Estimate the parameters of the SDE (e.g., by Kalman filtering) • Simulate the SDE forwards to generate a possible future time series • Issues – why don’t we do this? • Strong model dependence – if we model the market using a diffusion, we will never predict any jumps! • Estimation is dependent upon the quality of the history • Very difficult when we want to simulate a group of inter-related variables (e.g., spreads, yield curves, FX, rates) consistently • Estimation is very difficult in the multidimensional case! • Such approaches typically attempt to capture co-dependence using a static correlation; real co-dependences are far more complex – time dependent and exhibiting regimes • Such an approach will struggle to preserve the shapes of curves (e.g., yield curves)

  25. Price Risk – Simulation of Market Variables • Write the intra-period market change as Δ(t,t+1)market = {Δ(t,t+1)spreads, Δ(t,t+1)FX, Δ(t,t+1)YC, …} • Derive a time series of intra-period changes in market variables (FX, interest rates, etc.) from history, jumping between randomly chosen historical dates • Historic changes cannot be applied to current data directly – define a transformation function H(·) and apply H(Δ(t,t+1)market) to today’s market • Apply changes with random sign-reversal, flipping the sign of the entire market change at once: the drift is random, while directional correlations are preserved

  26. Price Risk – Simulation of Market Variables • RBS uses the Mahal, Rebonato et al. approach: • Apply a sequence of historical market changes to the current market • The starting date is randomly chosen • The dates of the selected changes must agree across all risk drivers • Randomly jump from the sequence to a new date • Randomised trend reversal • This preserves directional inter-dependence (so there is no need to model correlations etc.) • Historical changes must be applicable to the current market • e.g., if the current spread is 10bps, it is not realistic to apply a ±100bps historical change • Transform the risk drivers: y = H[x]; y_sim = y_today + Δy_hist; x_sim = H⁻¹[y_sim] • e.g., for proportional changes: H[x] = ln(x) (we use this transformation for FX rates) • Historical changes should look like ‘white noise’ (not dependent on the current market)
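The transformation step above can be sketched in a few lines: for FX, take H[x] = ln(x) as on the slide, apply the historical change in transformed space (optionally flipping its sign for the randomised trend reversal), and map back with H⁻¹. The FX rates are illustrative values:

```python
import numpy as np

# One historical-simulation step through a transformation H:
# x_sim = H^-1[ H(x_today) + sign * (H(x_hist_to) - H(x_hist_from)) ].
def simulate_step(x_today, x_hist_from, x_hist_to, sign, H=np.log, H_inv=np.exp):
    d_hist = H(x_hist_to) - H(x_hist_from)    # historical change in y = H[x]
    return H_inv(H(x_today) + sign * d_hist)  # map back to market space

fx_today = 1.2500
fx_sim = simulate_step(fx_today, 1.4000, 1.3300, sign=+1)  # a -5% log move
print(round(float(fx_sim), 4))  # 1.1875, i.e. 1.25 * (1.33 / 1.40)
```

The same function with `sign=-1` implements the randomised trend reversal: the magnitude of the historical move is kept while its direction across the whole market is flipped.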

  27. Price Risk – Market Simulation: Yield Curves • Individual rates: simple CEV-type transformations, with a local volatility σ(x) that varies with the rate x between lower and upper bounds xL and xR • To be calibrated: a, xL, xR, C • We also have ‘band reversion’ parameters, but these are not necessary for 1-month changes • It is also necessary to check the shape of the simulated curves • See Mahal et al.: ‘barbell’ effects, shape reversion, etc. • Again, not a problem for short time horizons (Figure: the CEV transformation σ(x) between xL and xR. Source: RBS)

  28. Price Risk – Market Simulation: Spreads • General Approach • The history of an individual name is not necessarily relevant to modelling the spread dynamics of the same name today (e.g., Ford) • For obligors that have experienced downgrades or corporate actions, a direct map to their own spread history and historical spread changes would be unrepresentative of the behaviour they are likely to exhibit today • For any date, bucket names by industrial sector and spread percentile band • For each path (start date), randomly map each name to a name in the same historical bucket • Apply the corresponding changes from the mapped name • This introduces more randomness and therefore a wider range of plausible outcomes • Preserves correlations across an industry • Captures cross-gamma risk concentrated by name • Transformation • The simplest model would be H[x] = ln(x) (as used in the Regulatory Stress Test) • However, we would expect some dependency on current spread levels, perhaps similar to interest rates • This is work in progress (Source: RBS)
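The bucketing and random name-mapping above can be sketched as follows. All data are synthetic, and the buckets here are a simplified stand-in (sector × above/below the sector median spread) for the sector and spread-percentile bands on the slide:

```python
import numpy as np

# Bucket names by sector and spread band; on each path, every name draws a
# donor from its own bucket and inherits that donor's historical log-change.
rng = np.random.default_rng(2)
n_names = 12
sectors = np.array([0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1])
spreads = rng.uniform(50, 500, n_names)          # today's spreads (bps)
hist_changes = rng.normal(0, 0.1, n_names)       # historical log-changes

def bucket(i):
    """(sector, above-or-below sector median spread) -- a 2-band example."""
    same_sector = spreads[sectors == sectors[i]]
    return (int(sectors[i]), bool(spreads[i] >= np.median(same_sector)))

buckets = {}
for i in range(n_names):
    buckets.setdefault(bucket(i), []).append(i)

# One path: map each name to a random donor in the same bucket.
mapped = [int(rng.choice(buckets[bucket(i)])) for i in range(n_names)]
simulated = spreads * np.exp(hist_changes[np.array(mapped)])
print(all(bucket(i) == bucket(m) for i, m in enumerate(mapped)))  # True
```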

  29. Price Risk – Market Simulation: Index Loss Fractions • General Considerations • We need a parameterisation of index tranche prices beyond correlation • Simulated prices must be non-arbitrageable across detachment points and across maturities • RBS is in the process of testing two alternative models, both involving Index Loss Fractions (“ILFs”) • ILFs are the ratio of the Expected Tranche Loss for an equity tranche with strike K to the total expected loss (EL) of the index (i.e., the expected tranche loss on an equity tranche covering the full capital structure) • Index Loss Fractions underlie loss fraction mapping • RBS first simulates single name CDS spreads and the basis – we can therefore compute the EL • We then propose to simulate the ILFs (i.e., the above ratios), and then convert to tranche prices

  30. Price Risk – Market Simulation: Loss Fraction Bounds • ILFs for any maturity need to be concave functions of the detachment point. We model changes so that simulated ILFs automatically have this property • Simulate the equity tranche, and for successively more senior tranches find lower and upper bounds for the ILFs, say LB and UB • Define the tranche ‘theta’ as the ratio θ = (ILF − LB) / (UB − LB), which must lie between 0 and 1 • Clearly we cannot just add changes to θ (given the bounds); instead map θ onto the range (−∞, ∞) using the inverse normal cumulative distribution: S = Φ⁻¹[θ] • Additive changes in S will therefore always be valid. Are we done?
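The bounded-ratio construction above can be sketched directly: θ = (ILF − LB)/(UB − LB) lies in (0, 1), S = Φ⁻¹(θ) lives on (−∞, ∞), so any additive change in S maps back to an ILF that respects the bounds. The LB, UB and ILF values are illustrative:

```python
from statistics import NormalDist

# Map a bounded ILF to the unbounded variable S and back.
nd = NormalDist()

def ilf_to_s(ilf, lb, ub):
    theta = (ilf - lb) / (ub - lb)       # must lie strictly in (0, 1)
    return nd.inv_cdf(theta)

def s_to_ilf(s, lb, ub):
    return lb + nd.cdf(s) * (ub - lb)    # always inside (lb, ub)

lb, ub, ilf = 0.60, 0.90, 0.75           # theta = 0.5, so S = 0
s_shifted = ilf_to_s(ilf, lb, ub) + 1.0  # apply an additive historical move
ilf_new = s_to_ilf(s_shifted, lb, ub)
print(lb < ilf_new < ub)  # True: the simulated ILF respects the bounds
```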

  31. Market Simulation – Loss Fraction Bounds (Figures: base correlation (y-axis) versus detachment point (x-axis) for 0–3%, 0–7% and 0–x% equity tranches; the right-hand graphs show magnified views of the corresponding left-hand graphs)

  32. Price Risk – Market Simulation: Loss Fraction Bounds • Let us look at a plot of historical S values • Figure 8 shows changes in S for 5-year CDX versus the index Expected Loss • There is clearly a pattern: as we go to higher expected loss, the range over which S can vary decreases • This effect appears more significant in the data than it really is – there are fewer data points at large EL • Hence S is not a good quantity to simulate • Figure 9 shows the impact of scaling S by Expected Loss, i.e., Z = S · G(EL), where G(·) is calibrated to different indices and maturities • No pattern remains, i.e., we can apply historical changes in Z to today’s market (Figure 8: Change in S versus Expected Loss – Unscaled; Figure 9: Change in S versus Expected Loss – Scaled. Source: RBS)

  33. Agenda • Regulatory Requirements • Challenges in Meeting Regulatory Requirements • RBS Approach to CRM Calculation • Modelling Approaches and Assumptions • Price Risk • Simulation of Market • Default Risk • Appendix • Computational Implementation of CRM

  34. Default Risk – Summary Schematic: Explanation • Simulating Defaults • Simulate defaults over a 1-year liquidity horizon • The approach employs the same PD/default correlation structure as the IRC • Through-the-cycle (i.e., long term) PDs are based on, for example, historically experienced default rates • However, we don’t know what stage of the credit cycle we will be in 1 year in the future; hence we need to stress these PDs • Use a Merton firm-value (Gaussian copula) type approach – familiar from the IRC as described by the IRB • Stress the common factor to give default correlation/contagion effects • Non-default spreads are driven by the same systematic effects • We need to integrate over the systematic effects (use Gauss-Hermite for a 1-factor model, Monte Carlo if multi-factor) • Recovery rates are randomised (driven by the systematic effects) • We therefore obtain a set of defaulted names (defaulted as per “real world” dynamics) and the times of default up to 1 year • Valuation • Given a set of defaults over 1 year, we would expect the spreads of the non-defaulted firms to have changed: • If we did not allow contagion, FTD baskets would always make money on a default • We need to know the form of the entire market – index tranche prices, CDS spreads, FX, yield curves, basis etc. We propose to do this by using the value of the common factor to pick out dates where the empirical cumulative probability of the iTraxx / CDX index level corresponds to the cumulative probability of the common factor.
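The Merton-style simulation above can be sketched as follows: conditional on the common factor M, name i defaults over the horizon if its asset return falls below Φ⁻¹(PD_i), and stressing M inflates the conditional default probability. The 2% PD and 30% factor loading are illustrative assumptions, not the IRC calibration:

```python
import numpy as np
from statistics import NormalDist

# One-factor Merton (Gaussian copula) default simulation with a stressed
# common factor m; illustrative homogeneous parameters.
nd = NormalDist()
rng = np.random.default_rng(3)

n_names, rho = 100, 0.30
pd_1y = 0.02                            # through-the-cycle 1-year PD
threshold = nd.inv_cdf(pd_1y)           # default barrier in asset space

def simulate_defaults(m):
    """Default indicators for one draw m of the common factor."""
    eps = rng.standard_normal(n_names)
    assets = np.sqrt(rho) * m + np.sqrt(1.0 - rho) * eps
    return assets < threshold

def cond_pd(m):
    """Conditional default probability given the common factor m."""
    return nd.cdf(float((threshold - np.sqrt(rho) * m) / np.sqrt(1.0 - rho)))

# A stressed factor (m = -2, a bad credit-cycle state) inflates the
# conditional PD well above the 2% through-the-cycle level.
print(cond_pd(-2.0) > 0.10 > pd_1y > cond_pd(+2.0))  # True
n_defaults_stressed = int(simulate_defaults(-2.0).sum())
```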

  35. Default Risk – Detailed Explanation • Then we: • Revalue the portfolio under the given market scenario incorporating randomised recoveries, defaults and blown-out spreads (V1) • Revalue the portfolio under the given market scenario incorporating blown-out spreads but with no defaults (V2) • Default P/L = V1 − V2 • The impact of price risk is already captured elsewhere • A series of default events will cause the spread environment to change (possibly markedly). The aim is to capture this cross effect – these products are nonlinear! • Tail risk is identified by a 2-stage estimation process (importance sampling): • 1. Run a large number of simulations (10,000) using approximate revaluation, and select the subset (1,000) giving the largest approximate losses • 2. Compute the corresponding losses using exact revaluations, and find an appropriate tail average of these
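The two-stage estimation above can be sketched as follows. The "approximate" and "exact" losses here are synthetic stand-ins for the cheap proxy and the full revaluation models:

```python
import numpy as np

# Two-stage tail estimation: cheap proxy on all scenarios, exact revaluation
# only on the pre-selected worst subset, then a tail average.
rng = np.random.default_rng(4)

n_fast, n_keep = 10_000, 1_000
approx_loss = rng.lognormal(0.0, 1.0, n_fast)   # stage 1: cheap proxy model
worst = np.argsort(approx_loss)[-n_keep:]       # keep the 1,000 worst scenarios

# Stage 2: exact revaluation on the selected scenarios only (here the
# "exact" model is the proxy with a small perturbation).
exact_loss = approx_loss[worst] * rng.normal(1.0, 0.02, n_keep)

# The 99.9th percentile of 10,000 scenarios lies within the 10 worst;
# average over the tail beyond it.
tail_average = np.sort(exact_loss)[-10:].mean()
print(tail_average > np.median(approx_loss))  # True: deep in the loss tail
```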

  36. Default Risk - Modelling Contagion Effects

  37. Optimisation – Improving on Stein? • The 1-factor Gaussian copula (with random recovery) is very popular because rapid computational schemes exist • All such schemes are predicated on the fact that, after conditioning on the common factor, credits become conditionally independent • The standard approach – the so-called ASB algorithm – computes this conditional loss distribution using recursion and is essentially exact • Various approximations exist, all of which seek to approximate this conditional loss distribution • Probably the most accurate approach in the literature is an application of the Stein approximation (El Karoui, 2008) • We have implemented Stein and extensively investigated its use for this problem. We have also developed an alternative (novel, i.e., not seen in the literature) Poisson approximation • Both approaches are significantly quicker than standard recursion (by a factor of ~3) • Both methods have been compared with Random Recovery Recursion on actual index tranche and bespoke portfolios, for a range of: • Spread scenarios • Correlations • Maturities • Attachments / detachments • Our testing has encompassed stress events such as those produced by our market simulation.

  38. Normal Approximation • The conditional loss distribution is bounded between 0 and the (factor-dependent) maximum loss. When the portfolio expected loss is neither too low nor too high, the loss distribution can be close to normal • Otherwise, however, the distribution can accumulate at either extreme and the normal approximation deteriorates • The figures below compare a 100-name homogeneous loss distribution with its approximating normal for different levels of expected portfolio loss. Extremely low or high expected losses will always arise, since we are integrating across the market factor • (Note that these figures are qualitative comparisons only, where discrete distributions are normalised by grid size. The tranche prices themselves give the true quantitative comparison.)

  39. Standard Poisson Approximation • A Poisson distribution is a natural approximation to the true conditional loss distribution when expected losses are low • The figures below compare the same 100-name homogeneous distribution with the usual Poisson approximation. As the portfolio expected loss increases, the accuracy deteriorates • The ranges of accuracy of the Poisson and normal are complementary, so a threshold for expected loss can be specified at which the approximation changes from Poisson to normal. For the example here this would typically be set around 0.10 to 0.15 • If recoveries are inhomogeneous, however, the distribution will be sparse with a small loss unit or grid size, and the standard Poisson approach becomes problematic

  40. Adjusted Poisson Approximation • The standard Poisson approximation uses the same loss grid as the true distribution. If instead we allow the approximating Poisson to have its own loss unit, we have an extra parameter and a more flexible approach • At low expected losses, the adjusted Poisson is very similar to the standard Poisson and the grid size is very close to that of the true distribution (homogeneous in this example) • As the expected loss increases, the grid size decreases and the adjusted Poisson smoothly changes over to be very close to normal
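The extra parameter above can be pinned down by moment matching, a natural (though not the only) choice: pick the loss unit u and intensity λ so the adjusted Poisson matches the mean and variance of the true conditional distribution. For a homogeneous portfolio of n names with conditional PD p and unit loss, mean = np and variance = np(1 − p), giving u = 1 − p and λ = np/(1 − p):

```python
# Moment-matched adjusted Poisson: loss = u * N with N ~ Poisson(lam).
n, p = 100, 0.10
mean, var = n * p, n * p * (1 - p)   # true conditional moments
u = var / mean                       # adjusted loss unit (= 1 - p here)
lam = mean / u                       # adjusted intensity

print(abs(lam * u - mean) < 1e-12)       # True: mean matched
print(abs(lam * u * u - var) < 1e-12)    # True: variance matched

# The standard Poisson (u = 1, lam = n * p) matches the mean but
# overstates the variance by a factor of 1 / (1 - p):
print(abs(n * p - var / (1 - p)) < 1e-12)  # True
```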

  41. Comparison (Poisson vs Stein) – Price Differences versus Random Recovery Recursion: 0–3% Tranche (panels: Poisson, Stein)

  42. Comparison (Poisson vs Stein) – Price Differences versus Random Recovery Recursion: 9–12% Tranche (panels: Poisson, Stein)

  43. Comparison (Poisson vs Stein) – Price Difference Comparison for all Bespoke Tranches

  44. Default Risk Approximation • Default risk modelling is time consuming • Under full revaluation, each time a default occurs the model must, for each trade: • Remove the defaulted name from the portfolio • Calculate the expected recovery of the defaulted name • Calculate the adjusted portfolio expected loss • Iteratively re-calibrate the loss fraction curve(s) based on the new portfolio expected loss • Re-price the adjusted tranche using the new attachment and detachment points • In scenarios where we are simulating a number of defaults (i.e., tail risk), computation times increase dramatically

  45. Default Risk Approximation – PV Interpolation • The time-consuming step in calculating the PV impact of defaults is the recalibration of the tranche loss fraction curve required for the new portfolio after defaulted names have been removed • For the calculation of the PV of a tranche on a portfolio that has experienced defaults, this recalibration step can be circumvented if we keep the portfolio the same, but adjust the tranche attachment point down by the loss amount: • Operationally, for each trade portfolio, the mid spreads and durations of 15 tranches beneath the attachment point of the original tranche are pre-calculated • These tranches have the same thickness as the original transaction • The specific pre-calculated tranches depend on the original tranche’s attachment and detachment • Based on the simulated number of defaults, a loss amount is calculated, and the corresponding loss in subordination of the original tranche is determined • The PV of the defaulted tranche is then calculated by interpolating the subordination-adjusted tranche against the pre-calculated tranches (Diagram: Simulate Defaults → Calculate PV Impact; a 5–6% tranche losing 0.5% of subordination is repriced under full revaluation on the reduced portfolio, and under PV interpolation as an effective 4.5–5.5% tranche on the original portfolio)
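The interpolation step above can be sketched as follows. The pre-computed PV profile is synthetic (a smooth decreasing function of attachment), purely to illustrate the mechanics:

```python
import numpy as np

# PV-interpolation shortcut: pre-compute PVs for same-width tranches at 15
# attachment points below the original one; on a default, shift the
# attachment down by the lost subordination and interpolate.
orig_attach, width = 0.06, 0.03
pre_attach = np.linspace(0.0, orig_attach, 15)    # 15 lower attachment points
pre_pv = -np.exp(-20.0 * pre_attach)              # synthetic PV profile

def defaulted_tranche_pv(loss):
    """PV after portfolio losses eat `loss` of subordination (same width)."""
    new_attach = max(orig_attach - loss, 0.0)
    return float(np.interp(new_attach, pre_attach, pre_pv))

pv_before = defaulted_tranche_pv(0.0)
pv_after = defaulted_tranche_pv(0.015)            # 1.5% of subordination lost
print(pv_after < pv_before)  # True: less subordination, more negative PV
```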
