
Modelling the CRM for the Correlation Trading
Portfolio
Dherminder Kainth, Jan Kwiatowski & Douglas Muirden
Royal Bank of Scotland
May 19, 2010
Agenda
 Regulatory Requirements
 Challenges in Meeting Regulatory Requirements
 RBS Approach to CRM Calculation
 Modelling Approaches and Assumptions
 Price Risk
• Simulation of Market
 Default Risk
 Appendix
• Computational Implementation of CRM
RBS00000
2
Regulatory Requirements

The All Price Risk Measure represents a special form of the Incremental Risk Charge, described in 7.10.55S R (1), for positions in the correlation trading book
The "All Price Risk Measure" must:
• Adequately capture all price risks at the 99.9% confidence interval over a capital horizon of one year
• Under the assumption of a constant level of risk
• And be run at least weekly
Price risks captured include:
• Defaults, including the ordering of defaults;
• Credit spread risk;
• Volatility of implied correlations, including the cross effect between spreads and correlations;
• Index to single name basis, and implied correlation of an index to bespoke portfolio basis;
• Recovery rate volatility;
• Risks of dynamic hedging and the cost of rebalancing;
• Though interest rate and foreign exchange risk have not been explicitly mentioned, we consider these to be included in "All Price Risk"
Agenda
 Regulatory Requirements
 Challenges in Meeting Regulatory Requirements
 RBS Approach to CRM Calculation
 Modelling Approaches and Assumptions
 Price Risk
• Simulation of Market
 Default Risk
 Timing and Next Steps
 Appendix
 Computational Implementation of CRM
Naive Implementation of CRM

Naively implementing the CRM, i.e., computing the 99.9% worst loss over a 1-year horizon on RBS's entire correlation trading portfolio, is very difficult

For example, using Monte Carlo, we would need to evolve the market forwards in time, pricing and hedging the portfolio as the desk does, tracking P&L over a 1-year horizon

A back of the envelope calculation immediately reveals the high likelihood of failure:
Trade counts:
• Bespoke: ~500
• Nth To Default: ~500
• Index CDS: ~3,000
• Index tranches: ~3,000
• Single name CDS: c. 75,000
Computation timings (seconds, one trade, one computer):
• PV: ~10
• Parallel CR01: ~200
• Default Delta: ~60
• Recovery Delta: ~200
• Base Correlation Sensitivity: ~60
Figure 1: Numbers and types of trades in our portfolio along with representative times to compute PV and risks for one trade on one computer
Naive Implementation of CRM (cont’d)

Assuming a rehedging frequency of once every month, a grid of 300 computers, and the minimum number of paths needed to compute the 99.9% confidence limit (i.e., 1,000), we see that we would need ~2,600 hours to compute results for just the bespoke CDOs
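The arithmetic behind the ~2,600-hour figure can be reproduced in a few lines. The assumption that every monthly time point costs PV plus parallel CR01, recovery delta and base correlation sensitivity per bespoke (470s in total) is ours, chosen to be consistent with the timings quoted above:

```python
# Figures from the table above; the 470s-per-trade-per-time-point breakdown
# (PV + parallel CR01 + recovery delta + base correlation sensitivity) is an
# assumption, not stated explicitly on the slide.
seconds_per_trade = 10 + 200 + 200 + 60
n_bespoke = 500
rehedge_points = 12        # monthly re-hedging over a 1-year horizon
n_paths = 1_000            # minimum paths to resolve the 99.9th percentile
grid_size = 300            # computers

total_hours = seconds_per_trade * n_bespoke * rehedge_points * n_paths / grid_size / 3600
```

This lands at roughly 2,600 hours for the bespoke book alone, before recalibration costs.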

Recalibration of the market (which needs to happen for every valuation and hedging time point) adds substantially to this timing

Over the next few slides we highlight:
• How one might address the core issue of computational intractability
• Issues in simulating the market
• Subjectivity of hedging

We will in effect pose a series of questions; the decisions that we have made form the basis of the RBS approach to computing
the CRM. This will be discussed in more detail in the following section.
Possible Areas of Optimisation: Pricing Algorithms
Choice of Algorithm
• Can we use convolution?
• Importance sampling for the Monte Carlo
• Replace recursion?
The 1-factor Gaussian copula (with random recovery) is very popular because rapid computational schemes exist. The ASB algorithm (or variants thereof) is commonly used in the industry because it returns (quasi-exact) PVs and risks rapidly.
• Faster pricing approaches are well known in the literature; however, these are to some extent (uncontrolled) approximations to the true price:
– LHP (Large Homogeneous Portfolio)
– Conditional Gaussian approach (Shelton)
– Saddlepoint methods
– Stein
• The choice of scheme depends on the trade-off between accuracy and speed
Optimisation of the Implementation
• Parallelisation of the code: currently valuation and risks are computed on a grid. Buy more computers?
– The performance of grids does not necessarily scale linearly; data passing is a limiting factor
• Front office pricing code focuses on accuracy: potential speed-ups by, for example, relaxing tolerances whilst maintaining high levels of accuracy
• Rewriting time-critical parts of the code in assembler?
Possible Approaches: Changing Mapping Approaches

When pricing bespoke tranches within a copula based model, we apply mapping technologies to determine the base
correlation for the bespoke - this reflects the different riskiness of the bespoke tranche relative to the index

Loss Fraction (“LF”) mapping (the approach used by RBS and much of the industry) is slow - it requires the inversion of prices
to determine correlations

Consider the use of a faster mapping technique such as At the Money (“ATM”) mapping
• RBS front office uses LF mapping to risk manage their correlation book
• LF deltas differ from ATM deltas
• Valuing the current portfolio and its hedges using ATM mapping rather than LF will make the book appear unhedged
• If we used ATM mapping, we would need to modify RBS's current portfolio to achieve the same "level of risk" as per LF mapping and then apply the different mapping technique
Possible Approaches: Changing Mapping Approaches
Figure 2: Mapping iTraxx S9 to CDX S9 using ATM mapping and LF mapping
Figure 3: Mapping iTraxx S9 5Y to 7Y using ATM mapping and LF mapping
[Charts: base correlation (y-axis, 0% to 100%) versus strike (x-axis); each figure shows the index market correlation curve alongside the ATM-mapped and LF-mapped curves]
We demonstrate the effect of the different mapping approaches in two scenarios:
• Figure 2 shows the effect of ATM and LF mapping when mapping iTraxx S9 to CDX S9. Due to the important differences between the two indices, neither mapping method produces satisfactory results. However, we note that LF mapping shifts the market correlation curve in the right direction (as opposed to ATM mapping)
• Figure 3 shows the effect of ATM and LF mapping when mapping iTraxx S9 5-year to 7-year. The two mapping methods produce similar results, with slightly higher correlation values for LF mapping
Subjectivity in Hedging

Typically traders hedge a position in a CDO tranche [a, b] using primarily the constituent CDSs and the index, and sometimes
with an additional tranche [l, u]
• Delta hedging movements in the Single Name CDS
• Delta-hedging movements in the index
• Delta and gamma hedging movements in the index
• Hedging parallel shifts in correlation
• Hedging default risk
• Regression based hedging
Traders are free to use some or all of the strategies outlined above; the choice will change depending on market conditions and trader outlook
Algorithmically predicting the hedging strategy is therefore very difficult
Hedging is computationally expensive; furthermore, it is very subjective, and implementing only a simplistic approach will give rise to greater slippage
Simulation of the Market
Simulating the universe of observed prices relevant to the CDO book forwards over periods of up to one year is challenging

We need to model possible movements in yield curves and FX rates

We need to capture the dynamics of market-implied CDS spreads to model the price risk. Desiderata for the evolution of the CDS spreads include:
• Impact of rating migrations (jumps?)
• Empirical co-dependence between CDS spreads shows regional and sectoral variation
• Co-dependence between CDS spreads is time dependent - showing regime like behaviour
• Level dependent volatility


Modelling the index tranche market is, if anything, even more challenging
The observed index tranche market comprises:
CDX (5, 7, 10 years)      iTraxx (5, 7, 10 years)   High Yield (5 years)
Attach   Detach           Attach   Detach           Attach   Detach
0%       3%               0%       3%               0%       10%
3%       7%               3%       6%               10%      15%
7%       10%              6%       9%               15%      25%
10%      15%              9%       12%              25%      35%
15%      30%              12%      22%              35%      100%
30%      100%             22%      100%
(initial maturity shown)
Given the occurrence of defaults, some of these detachment points have changed; e.g., for High Yield the original (0%, 10%) tranche has been completely wiped out
Simulation of the Market

Typically these index tranche prices are mapped into base correlations using the (random recovery) Gaussian copula. In
simulating the market forwards in time, we need to evolve the price / correlation surface

Can we evolve correlations e.g., additively?
• Pretty clear that correlations are bounded between (0, 1)

However, the problem is far more subtle than this: it rapidly becomes clear that an arbitrary set of correlations does not describe a valid set of prices

Applying historical moves in base correlation to the current base correlation curve can lead to arbitrage situations, for example,
negative tranche spreads
• In the following graphs, the 3 month move in base correlations from September 2008 to December 2008 is applied to the
current base correlation curve to obtain a shifted correlation curve
• As can be seen from the graph on the bottom left, the resulting shifted base correlation curve results in tranche spreads
which eventually become negative
Evolving correlations can lead to arbitrage opportunities
Figure 4: Historic base correlation moves (iTraxx 5Y)
Figure 5: Historic change applied to spot
Figure 6: Base correlation to tranche prices
Figure 7: Base correlation to tranche prices, zoomed in
Simulation of the Market (cont’d)

For the prices of index tranches to be admissible (i.e., for the absence of arbitrage), a set of strong conditions must hold; these conditions have effectively never been violated for the market quoted points.
Typically the conditions are expressed in terms of the Expected Tranche Loss (ETL), denoted here by:

ETL(K, T) = E0[min(L_T, K)]

Intuitively, this is just the price of a European (capped call) option on the loss. More formally, it is the expected loss at time T on an equity tranche of width K, as seen from time 0.
A number of boundary conditions are immediately apparent:
• An equity tranche cannot lose more than its width, i.e., ETL(K, T) ≤ K
• To ensure no arbitrage, the density of the loss distribution must be non-negative for all strikes and times. The ETL is just a normalised price of a call option on the loss; hence the non-negativity of the loss density implies that ETL(K, T) is concave in K
• Losses cannot be reversed; hence the ETL of an equity tranche must be a constant or increasing function of T
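The three boundary conditions can be checked mechanically on a discretised ETL surface. This is an illustrative sketch (the strike grid, tolerance and test surface are assumptions), using divided differences so that the concavity check also works on an uneven strike grid:

```python
import numpy as np

def etl_admissible(etl, strikes, atol=1e-12):
    """Check the no-arbitrage conditions on a surface etl[i][j] = ETL(K_j, T_i):
    (1) ETL(K, T) <= K; (2) ETL concave in K (non-negative loss density);
    (3) ETL non-decreasing in T (losses cannot be reversed)."""
    etl = np.asarray(etl, dtype=float)
    K = np.asarray(strikes, dtype=float)
    cap_ok = np.all(etl <= K[None, :] + atol)
    slopes = np.diff(etl, axis=1) / np.diff(K)[None, :]   # divided differences in K
    concave_ok = np.all(np.diff(slopes, axis=1) <= atol)  # slopes must be non-increasing
    mono_ok = np.all(np.diff(etl, axis=0) >= -atol)       # non-decreasing in maturity
    return bool(cap_ok and concave_ok and mono_ok)
```

A valid surface, e.g. ETL(K, T) = p(T) min(0.3, K) for a two-point loss distribution, passes all three checks; reversing the maturity ordering fails the monotonicity condition.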
Agenda
 Regulatory Requirements
 Challenges in Meeting Regulatory Requirements
 RBS Approach to CRM Calculation
 Modelling Approaches and Assumptions
 Price Risk
• Simulation of Market
 Default Risk
 Appendix
• Computational Implementation of CRM
RBS Approach – Disaggregation of CRM calculation into Default and Price Risk
Issues with Simulation
1. Unfeasibly large number of computations required to estimate 99.9th percentile.
2. Calculating hedges is computationally very expensive
3. Hedging strategy is very subjective – dependent upon the market and trader’s view of the future
Definition of Price & Default Risk
• We term Price Risk the impact on the portfolio of all market moves except default events
• Default Risk is defined as the impact on portfolio value of default events
• Default events are irreversible; price moves are reversible (names cannot come back out of default)
• Different time horizons apply to Price Risk and Default Risk:
– We can hedge price risk; hence the time horizon for price risk depends on hedge frequency (days to 1 month)
– Defaults have a longer natural timescale; the number of defaults in 1 month is minimal
RBS chosen approach: Evaluate Price Risk and Default Risk separately, then aggregate to obtain CRM
 Constant level of risk allows convolution of Price Risk (up to 1 month for re-hedging) cf. IRC
 Reduces number of computations required for Pricing Risk
 Removes need for extensive computation of sensitivities and reduces subjectivity in choice of hedging algorithm
 Need to evaluate default risk separately – defaults are irreversible. Use Monte Carlo for default risk.
 Enables development of an importance sampling algorithm for Default Risk
 Is more conservative: double counts defaults combined with large spread move scenarios
Price Risk - Constant Level of Risk

Mathematically speaking, the constant level of risk assumption translates to assuming an identical loss distribution after each time interval Δ, corresponding to 1/(hedging frequency)
i.e., after every hedge interval we are able to re-hedge such that the overall riskiness of RBS's portfolio is identical to today's level
Assume Δ = 1 month. Then the constant-level-of-risk P/L distribution over 1 year is the convolution of 12 copies of the 1-month P/L distribution. This is very powerful:
• We do not need to compute actual hedges, just monthly P/L.
• Convolution allows us to get easily into the tail i.e., to estimate 99.9%
• This leads to significant savings in time – the computation becomes feasible without the need to move away from
our books and records valuation approaches (i.e., CRM and desk approaches are consistent)
• Removes the subjectivity in choice of hedging approach
• Obviously convolution cannot be used for defaults (names that default over a month would need to come back out
of default !)
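The convolution step can be sketched directly: discretise the 1-month P/L density on a uniform grid, convolve it with itself 12 times, and read the 99.9% worst loss out of the resulting 1-year distribution. The normal-shaped 1M distribution and the grid below are purely illustrative:

```python
import numpy as np

def convolve_pl(pdf, n_periods):
    """Density of the sum of n_periods i.i.d. draws from a discretised pdf."""
    out = pdf
    for _ in range(n_periods - 1):
        out = np.convolve(out, pdf)       # full convolution: supports add up
    return out / out.sum()

grid = np.linspace(-5.0, 5.0, 2001)       # 1M P/L grid (illustrative units)
pdf_1m = np.exp(-0.5 * grid**2)
pdf_1m /= pdf_1m.sum()

pdf_1y = convolve_pl(pdf_1m, 12)          # constant level of risk: 12 monthly copies
grid_1y = np.linspace(12 * grid[0], 12 * grid[-1], len(pdf_1y))
cdf = np.cumsum(pdf_1y)
loss_999 = grid_1y[np.searchsorted(cdf, 0.001)]   # 99.9% worst (left-tail) outcome
```

Because the tail of the convolved density is computed exactly from the monthly density, no Monte Carlo paths are "wasted" getting into the 99.9% region.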
Constant Level of Risk - Convolution
Convolution lets us get into the tails!
Price Risk – RBS Algorithm
1. Choose time-horizon over which portfolio could be re-hedged (2 – 4 weeks)
2. Simulate the market (index tranches, single name CDS, yield curves, basis etc.) over the hedging interval using our historical simulation algorithm (see below)
3. Compute P/L over this period; repeat ~200 – 500 times to compute a distribution
• Use pricing technologies consistent (essentially identical analytics) with those used for books and records valuations
4. Use stressed market scenarios and probability weight (see below) to compute the full 1M P/L distribution.
5. Convolve N times (N = 12 if hedge frequency = 1M) to obtain full P/L distribution over 1Y
• The use of convolution implies the absence of autocorrelation, i.e., this month's 1M P/L distribution is uncorrelated with next month's P/L distribution
• We will quantify this by examining the impact on price risk of changing the hedging horizon
• From a final number perspective, the impact of autocorrelation will be captured via the use of stressed starting scenarios
Stressed Starting Scenarios

More significantly, however, the constant level of risk assumption implies that (at the end of each hedging interval, despite
significant market moves) we are able to re-hedge our CDO portfolio to the same level of riskiness as today

This is a strong assumption. We therefore aim to apply an approach similar to that used for the IRC, where we use stressed
starting scenarios

Algorithmically:
• Choose 5 starting scenarios, i.e., the market is in one of 5 starting scenarios (each with a weight: the Gauss-Hermite weight)
– Our CDO positions will only be partially hedged to this scenario; the cost of this partial hedging will be part of the final P/L distribution
– The starting scenarios will correspond to dates on which the iTraxx, CDX and HY indices assumed the values implied by the Gauss-Hermite percentiles
• The market is then evolved as per the algorithm above; the total loss distribution for 1M is computed, accounting for the impact of the stressed scenarios
Hedging
• Allow partial (risk-based) re-hedging of the book when switching to stressed scenarios
• Model the relevant cost of re-hedging, based on applicable market bid/offers but also including a liquidity premium
Stressed Starting Scenarios

Choose stress scenarios to be the market on particular days in our history.
• Proxy stress events by absolute iTraxx spread levels.
• Choose days in history corresponding to stress events by finding days when the quantile of the index matches the probability levels implied by Gauss-Hermite.
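The scenario-selection idea can be sketched as follows: take 5 Gauss-Hermite nodes for a standard normal factor, convert each node to a cumulative probability, and pick the historical day whose index level sits at that empirical quantile. The iTraxx history below is a random stand-in, not real data:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from statistics import NormalDist

nodes, weights = hermgauss(5)
z = nodes * np.sqrt(2.0)                  # nodes rescaled for a standard normal factor
probs = weights / np.sqrt(np.pi)          # normalised Gauss-Hermite weights, sum to 1
quantiles = [NormalDist().cdf(float(v)) for v in z]   # cumulative probability per node

rng = np.random.default_rng(0)
itraxx_history = rng.lognormal(mean=4.5, sigma=0.4, size=1000)  # stand-in spread history

order = np.argsort(itraxx_history)
scenario_days = [int(order[int(q * (len(order) - 1))]) for q in quantiles]
scenario_spreads = [float(itraxx_history[d]) for d in scenario_days]
```

The five selected days span the history from a very benign to a very stressed spread environment, each carrying its Gauss-Hermite weight in the final loss distribution.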
Agenda
 Regulatory Requirements
 Challenges in Meeting Regulatory Requirements
 RBS Approach to CRM Calculation
 Modelling Approaches and Assumptions
 Price Risk
• Simulation of Market
 Default Risk
 Appendix
• Computational Implementation of CRM
Price Risk: Simulation of Market Variables
Typical Approach in the Industry:
 Choose a stochastic differential equation (SDE) to describe the market data parameter (e.g., FX) that
we wish to simulate
• Immediately introduces model dependence.
• Estimate the parameters of the SDE (Kalman Filtering)
• Simulate the SDE forwards to generate a possible future time series
 Issues – why don’t we do this?
• Strong model dependence – if we estimate a market using a diffusion, we will never predict any
jumps!
• Estimation dependent upon quality of history
• Very difficult when we want to simulate a group of inter related variables (e.g. spreads, yield
curves, FX, rates) consistently
• Estimation very difficult in the multidimensional case!
• Typically such approaches attempt to capture co-dependence using a static correlation; real co-dependences are far more complex, being time dependent and showing regimes
• Such an approach will struggle to preserve the shapes of curves (e.g. yield curves)
Price Risk – Simulation of Market Variables
Δ(t,t+1)market = {Δ(t,t+1)spreads, Δ(t,t+1)FX, Δ(t,t+1)YC, …}
[Diagram: a historical time series t0, t1, t2, … supplies intra-period market changes Δ01, Δ12, Δ23; the simulation jumps to a random point in history and applies the transformed change H(Δ(t,t+1)market) to today's market]
• Derive a time-series of intra-period changes in market variables (FX, interest rates, etc.)
• Historic changes cannot be applied to current data directly; define a transformation function H()
• Apply changes as well as sign-reversals: drift is random, directional correlations are preserved. Change the sign of the entire market.
Price Risk
Simulation of Market Variables
RBS uses the Mahal, Rebonato et al. approach:
 Apply sequence of historical market changes to current market
• Starting date is randomly chosen
• Dates of selected changes must agree across all risk drivers.
• Randomly jump from sequence to a new date
• Randomised trend reversal
• Preserves directional inter-dependence (so, no need to model correlations etc.).
 Historical changes must be applicable to current market
• e.g. if current spread is 10bps, not realistic to apply ±100bps historical change
• Transform risk drivers:
  y = H[x]
  y_sim = y_today + Δy_hist
  x_sim = H⁻¹[y_sim]
• e.g. for proportional changes: H[x] = ln(x) (we use this transformation for FX rates)
• Historical changes should look like "white noise" (not dependent on current market)
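The transform-and-shift step can be illustrated for FX, where H[x] = ln(x) makes historical changes proportional. The helper name and the example numbers are ours:

```python
import math

def apply_hist_change(x_today, dy_hist, H=math.log, H_inv=math.exp):
    """Shift today's level by a historical change expressed in H-space,
    then map back to the original risk-driver space."""
    return H_inv(H(x_today) + dy_hist)

# a historical -10% proportional FX move applied to today's rate of 1.50
fx_sim = apply_hist_change(1.50, math.log(0.9))
```

Because the shift is additive in log-space, the simulated rate is always positive regardless of the size of the historical move.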
Price Risk
Market Simulation – Yield Curves
Individual rates
• Simple CEV-type transformations:

  H[x] = ∫ dx / s(x),   where   s(x) = s·(x/xL)            if x ∈ [0, xL]
                                s(x) = s                    if x ∈ [xL, xR]
                                s(x) = s·(1 + C·(x − xR))   if x ∈ [xR, ∞)

• To be calibrated: s, xL, xR, C
• We also have 'band reversion' parameters, but these are not necessary for 1-month changes
• Also necessary to check the shape of simulated curves
• See Mahal et al.: 'barbell' effects, shape reversion, etc.
• Again, not a problem for short time-horizons
[Chart: the CEV transformation s(x) versus rate x, linear below xL, flat between xL and xR, linear above xR. Source: RBS]
Price Risk
Market Simulation – Spreads
General Approach
• The history of an individual name is not necessarily relevant to modelling the spread dynamics of the same name today (e.g., Ford)
– For obligors that have experienced downgrades or corporate actions, a direct map to their spread history and historical spread changes would be unrepresentative of the behaviour they are likely to exhibit today
• For any date, bucket names by sector and spread percentile range
• For each path (start-date), randomly map each name into a name in the same historical bucket
• Apply the corresponding changes from the mapped name
– Introduces more randomness and therefore a wider range of plausible outcomes
– Preserves correlations across an industry
– Captures cross-gamma risk concentrated by name
Transformation
• The simplest model would be H[x] = ln(x) (as used in the Regulatory Stress Test)
• However, we would expect some dependency on current levels of spreads, maybe similar to interest rates
• This is work in progress
[Chart: spread mapping exercise, with names bucketed by industrial sector (A: Auto, B: Energy, C: Metals, …, K: Telco) and spread percentile band (0 to 100). Source: RBS]
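The sector/percentile bucketing and name remapping described above can be sketched as follows. The sector labels, the 10-band percentile grid and the donor-selection rule are assumptions for illustration:

```python
import numpy as np

def bucket_key(sector, spread, all_spreads, n_bands=10):
    """Bucket a name by its sector and its empirical spread-percentile band."""
    pct = (np.asarray(all_spreads) < spread).mean()
    band = min(int(pct * n_bands), n_bands - 1)
    return (sector, band)

def remap_changes(sectors, spreads, hist_changes, rng):
    """Give each name the historical change of a random name from its bucket."""
    spreads = np.asarray(spreads, dtype=float)
    buckets = {}
    for i, (s, x) in enumerate(zip(sectors, spreads)):
        buckets.setdefault(bucket_key(s, x, spreads), []).append(i)
    out = np.empty_like(spreads)
    for i, (s, x) in enumerate(zip(sectors, spreads)):
        donor = rng.choice(buckets[bucket_key(s, x, spreads)])   # same-bucket name
        out[i] = hist_changes[donor]
    return out
```

When every bucket holds a single name the mapping degenerates to the identity; wider buckets inject the extra randomness described above while keeping sector-level correlation.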
Price Risk
Market Simulation – Index Loss Fractions
General Considerations
• Need a different parameterisation of index tranche prices, beyond correlation
• Simulated prices must be non-arbitrage-able across detachment points and across maturities
RBS is in the process of testing two alternative models, both involving Index Loss Fractions ("ILFs"):

ILF_t(K) = E[min(L_t, K)] / E[L_t]

• ILFs are effectively the ratio of the Expected Tranche Loss for an equity tranche with strike K to the total expected loss (EL) of the index (i.e., the expected tranche loss on an equity tranche of width 100%)
• Index Loss Fractions underlie loss fraction mapping
• RBS first simulates single name CDS spreads and the basis; we can therefore compute the EL
• We then propose to simulate the ILFs (i.e., the above ratios), and then convert to tranche prices
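The ILF definition can be illustrated on sampled portfolio losses; the beta-distributed losses below are a stand-in, and the resulting curve is automatically increasing and concave in the strike:

```python
import numpy as np

def ilf(losses, strikes):
    """ILF(K) = E[min(L, K)] / E[L], estimated from sampled portfolio losses."""
    losses = np.asarray(losses, dtype=float)
    return np.array([np.minimum(losses, K).mean() for K in strikes]) / losses.mean()

rng = np.random.default_rng(1)
losses = rng.beta(0.5, 20.0, size=100_000)      # stand-in loss distribution on [0, 1]
strikes = np.array([0.03, 0.07, 0.10, 0.15, 0.30, 1.00])
fractions = ilf(losses, strikes)
```

Note that the sample estimate inherits the no-arbitrage shape for free: each min(L, K) is concave in K, so any average of them is too, and ILF at a 100% strike is exactly 1.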
Price Risk
Market Simulation – Loss Fractions Bounds
Loss Fraction Bounds
• ILFs for any maturity need to be concave functions of the detachment point. We model changes so that simulated ILFs automatically have this property
• Simulate the equity tranche, and for successively senior tranches find lower and upper bounds for the ILFs, say LB and UB
• Define the tranche 'Theta' as the following ratio:

θ = (ILF − LB) / (UB − LB)

– θ must lie between 0 and 1
• Clearly we cannot just add changes to θ given the bounds; instead map θ onto the range (−∞, ∞) using the inverse normal cumulative distribution, say:

S = Φ⁻¹[θ]
 Additive changes in S will therefore always be valid. Are we done?
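The bounded-variable trick can be sketched with the standard library's NormalDist; the LB/UB values and shock size below are assumptions:

```python
from statistics import NormalDist

nd = NormalDist()

def to_S(ilf, lb, ub):
    """Map a bounded ILF into S-space via theta = (ILF - LB) / (UB - LB)."""
    return nd.inv_cdf((ilf - lb) / (ub - lb))

def from_S(S, lb, ub):
    """Map an S-space value back into the (LB, UB) interval."""
    return lb + (ub - lb) * nd.cdf(S)

lb, ub = 0.40, 0.85
S0 = to_S(0.70, lb, ub)
shocked = from_S(S0 + 2.5, lb, ub)      # a large additive shock in S-space
```

However large the additive move in S, the mapped-back value stays strictly inside (LB, UB), which is exactly why additive changes in S are always valid.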
Market Simulation – Loss Fractions Bounds
• Y-axis: base correlation; X-axis: detachment point
• Right-hand graphs show a magnified view of the corresponding left-hand graph
[Charts: simulated base correlation curves for the 0-x, 0-3 and 0-7 tranches]
Price Risk
Market Simulation – Loss Fractions Bounds
Let us look at a plot of historical S's
• Figure 8 shows a plot of changes in S for 5-year CDX versus index Expected Loss
– There is clearly a pattern: as we go to higher expected loss, the range over which S can vary decreases
– This effect appears more significant in the data than it is; there are fewer data points at larger EL
• Hence S is not a good quantity to simulate
• Figure 9 shows the impact of scaling S by Expected Loss, i.e., Z = S · G(EL), where G(·) is calibrated to different indices and maturities
– No pattern remains, i.e., we can apply historical changes in Z to today's market
Figure 8: Change in S versus Expected Loss, unscaled
Figure 9: Change in S versus Expected Loss, scaled
[Scatter plots: change in S (y-axis, −8.0 to 6.0) versus index expected loss (x-axis, 0.0% to 14.0%). Source: RBS]
Agenda
 Regulatory Requirements
 Challenges in Meeting Regulatory Requirements
 RBS Approach to CRM Calculation
 Modelling Approaches and Assumptions
 Price Risk
• Simulation of Market
 Default Risk
 Appendix
 Computational Implementation of CRM
Default Risk
Summary Schematic – Explanation
Simulating Defaults
 Simulate defaults over a 1-year liquidity horizon
 Approach employs same PD/Default correlation structure as IRC
• Through-the-cycle (i.e., long term) PDs based on, for example, historically experienced default rates
• However, we don't know what stage of the credit cycle we will be in one year from now. Hence we need to stress these PDs.
• Use a Merton firm value (Gaussian copula) type approach – familiar from IRC as described by the IRB.
• Stress the common factor to give default correlation/contagion effects
• Non-default spreads driven by the same systematic effects
• We need to integrate over systematic effects (use Gauss-Hermite if 1 factor model, Monte Carlo if multi factor)
 Recovery rates randomised (driven by systematic effects)
 Therefore we have a set of defaulted names (defaulted as per “real world” dynamics) and the times of default up to 1 year
Valuation
Given a set of defaults over 1 year, we would expect that the spreads of the non-defaulted firms will have changed:
• If we did not allow contagion, FTD baskets would always make money on a default
We need to know the form of the entire market: index tranche prices, CDS spreads, FX, yield curves, basis etc. We propose to do this by using the value of the common factor to pick out dates where the empirical cumulative probability of the iTraxx / CDX index level corresponds to the cumulative probability of the common factor.
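The one-factor Merton/Gaussian-copula machinery with Gauss-Hermite integration over the common factor can be sketched as follows; the PD and correlation values are illustrative:

```python
import math
from numpy.polynomial.hermite import hermgauss
from statistics import NormalDist

nd = NormalDist()

def cond_pd(pd, rho, m):
    """Default probability conditional on the common factor M = m:
    Phi((Phi^-1(PD) - sqrt(rho) * m) / sqrt(1 - rho))."""
    return nd.cdf((nd.inv_cdf(pd) - math.sqrt(rho) * m) / math.sqrt(1.0 - rho))

def uncond_pd(pd, rho, n_nodes=40):
    """Integrate the conditional PD over the factor with Gauss-Hermite."""
    nodes, weights = hermgauss(n_nodes)
    total = 0.0
    for x, w in zip(nodes, weights):
        total += (w / math.sqrt(math.pi)) * cond_pd(pd, rho, math.sqrt(2.0) * x)
    return total

recovered = uncond_pd(0.02, 0.30)   # integrating over M recovers the input PD
```

Stressing the common factor towards its adverse tail pushes every conditional PD up simultaneously, which is what produces the default correlation/contagion effect described above.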
Default Risk Detailed Explanation

Then we:
• Revalue the portfolio under the given market scenario, incorporating randomised recoveries, defaults, and spreads blown out (V1)
• Revalue the portfolio under the given market scenario, incorporating spreads blown out but with no defaults (V2)
• Default P/L = V1 − V2

The impact of price risk is already captured
• A series of default events will cause the spread environment to change (possibly markedly). The aim is to capture this cross effect; these products are nonlinear!

Tail risk identified by a 2-stage estimation process (importance sampling):
1. Large number of simulations (10,000) using approximate revaluation; select the subset (1,000) giving the largest approximate losses
2. Compute the corresponding losses using exact revaluations; find an appropriate tail average of these
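The two-stage estimation can be sketched with stand-in pricers; the quadratic "approximate" and "exact" loss functions below replace the real approximate and exact revaluations:

```python
import numpy as np

def two_stage_tail(scenarios, approx_loss, exact_loss, n_keep=1000, tail=0.999):
    """Stage 1: rank all scenarios by a cheap approximate loss and keep the worst
    n_keep. Stage 2: revalue those exactly and average beyond the tail percentile."""
    approx = np.array([approx_loss(s) for s in scenarios])
    worst = np.argsort(approx)[-n_keep:]            # largest approximate losses
    exact = np.sort([exact_loss(scenarios[i]) for i in worst])
    n_tail = max(1, int(len(scenarios) * (1.0 - tail)))
    return exact[-n_tail:].mean()                   # tail average of exact losses

rng = np.random.default_rng(2)
scens = rng.normal(size=10_000)
est = two_stage_tail(scens, lambda s: s * s, lambda s: 1.05 * s * s)
```

The exact pricer is only invoked on 10% of the scenarios; the estimate matches a full exact run whenever the approximate ranking correctly identifies the tail.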
Default Risk - Modelling Contagion Effects
Optimisation – Improving on Stein?
• The 1-factor Gaussian copula (with random recovery) is very popular because rapid computational schemes exist
– All such schemes are predicated on the fact that, after conditioning on the common factor, credits become conditionally independent
– The standard approach, the so-called ASB algorithm, computes this conditional loss distribution using recursion and is essentially exact
– Various approximations, all of which seek to approximate this conditional loss distribution, exist
– Probably the most accurate approach in the literature is an application of the Stein approximation (El Karoui, 2008)
• We have implemented Stein and extensively investigated its use for this problem. We have also developed an alternative (novel, i.e., not seen in the literature) Poisson approximation
– Both approaches are significantly quicker than standard recursion (factor ~3)
• Both methods have been compared with Random Recovery Recursion on actual index tranche and bespoke portfolios, for a range of:
– Spread scenarios
– Correlations
– Maturities
– Attachments / detachments
• Our testing has encompassed stress events such as those produced by our market simulation
Normal Approximation
• The conditional loss distribution is bounded between 0 and the (factor-dependent) maximum loss. When the portfolio expected loss is not too low or too high, the loss distribution can be close to normal.
• Otherwise, however, the distribution can accumulate at either extreme and the normal approximation deteriorates.
• The figures below compare a 100-name homogeneous loss distribution with its approximating normal for different levels of expected portfolio loss. Extreme low or high expected losses will always arise since we are integrating across the market factor.
• (Note that these figures are qualitative comparisons only, where discrete distributions are normalised by grid size. The tranche prices themselves give the true quantitative comparison.)
[Figures: exact loss distribution versus approximating normal, for expected portfolio loss = 0.01, 0.10 and 0.30]
Standard Poisson Approximation
• A Poisson distribution is a natural approximation to the true conditional loss distribution when expected losses are low.
• The figures below compare the same 100-name homogeneous distribution with the usual Poisson approximation. As portfolio expected loss increases, the accuracy deteriorates.
• The ranges of accuracy of the Poisson and the normal are complementary, so a threshold for expected loss can be specified at which the approximation changes from Poisson to normal. For the example here this would typically be set around 0.10 to 0.15.
• If recoveries are inhomogeneous, however, the distribution will be sparse with a small loss unit or grid size, and the standard Poisson approach becomes problematic.
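The complementary accuracy ranges can be quantified with a small script comparing the homogeneous binomial default-count distribution against its Poisson approximation. The 100-name portfolio matches the figures; using total variation distance as the error metric is our choice:

```python
import math

def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

def tv_binom_poisson(n, p):
    """Total variation distance between Binomial(n, p) and Poisson(n*p),
    summed over the shared support 0..n (the remaining Poisson mass is tiny)."""
    lam = n * p
    return 0.5 * sum(abs(binom_pmf(n, p, k) - math.exp(-lam) * lam**k / math.factorial(k))
                     for k in range(n + 1))

tv_low = tv_binom_poisson(100, 0.01)    # low expected loss: Poisson fits well
tv_high = tv_binom_poisson(100, 0.30)   # higher expected loss: the fit degrades
```

This reproduces the qualitative message of the figures: the Poisson error is negligible at 1% expected loss and an order of magnitude larger at 30%.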
[Figures: binomial loss distribution versus standard Poisson approximation at increasing levels of expected portfolio loss]
Adjusted Poisson Approximation
• The standard Poisson approximation uses the same loss grid as the true distribution. If instead we allow the approximating Poisson to have its own loss unit, we have an extra parameter and a more flexible approach.
• At low expected losses, the adjusted Poisson is very similar to the standard Poisson and the grid size is very close to that of the true distribution (homogeneous in this example).
• As expected loss increases, the grid size decreases and the adjusted Poisson smoothly changes over to be very close to normal.
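One natural reading of the extra loss-unit parameter is to choose the Poisson intensity and loss unit so that the approximation matches the mean and variance of the true conditional loss. This moment-matching closed form is our interpretation of the slide, not something stated on it:

```python
def adjusted_poisson_params(mean_loss, var_loss):
    """Moment-match a Poisson with its own loss unit u:
    lam * u = mean and lam * u^2 = variance, so u = var/mean, lam = mean^2/var."""
    u = var_loss / mean_loss
    lam = mean_loss ** 2 / var_loss
    return lam, u

# homogeneous example: 100 names, PD 1%, loss 1/100 per name
n, p = 100, 0.01
mean_loss = p                              # n * p * (1/n)
var_loss = n * p * (1.0 - p) / n**2        # binomial count variance in loss units
lam, u = adjusted_poisson_params(mean_loss, var_loss)
```

At this low expected loss the fitted unit u = (1 − p)/n sits just below the true grid of 1/n and lam is close to n·p, consistent with the adjusted Poisson collapsing to the standard one.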
[Figures: binomial loss distribution versus adjusted Poisson approximation at increasing levels of expected portfolio loss]
Comparison (Poisson vs Stein)
Price Differences versus Random Recovery Recursion: 0 – 3% Tranche
Pricing Error: 0-3% Tranche 5Y (CDX9 portfolio)
[Surface plots: AssetLegPV relative error versus Random Recovery Recursion (Stein − FullRec, and Poisson − FullRec), over spread scaling factors 0.1 to 8.0 and flat correlations 5% to 95%; errors lie between −1% and +6%]
Comparison (Poisson vs Stein)
Price Differences versus Random Recovery Recursion: 9 – 12% Tranche
Pricing Error: 9-12% Tranche 5Y (CDX9 portfolio)
[Surface plots: AssetLegPV relative error versus Random Recovery Recursion (Stein − FullRec, and Poisson − FullRec), over spread scaling factors 0.1 to 8.0 and flat correlations 5% to 95%; errors range from −40% to +10%]
Comparison (Poisson vs Stein)
Price Difference Comparison for all Bespoke Tranches
[Histograms: number of trades as a fraction of book size (y-axis, 0% to 80%) versus PV difference as % of notional (x-axis, −1.00% to +1.00%), for Stein vs Full Recursion and Poisson vs Full Recursion on bespoke tranches]
Default Risk Approximation

Default Risk modelling is time consuming
• Under full revaluation, each time a default occurs the model must, for each trade:
– Remove the defaulted name from the portfolio
– Calculate the expected recovery of the defaulted name
– Calculate the adjusted portfolio expected loss
– Iteratively re-calibrate the loss fraction curve(s) based on the new portfolio expected loss
– Re-price the adjusted tranche using new attachment and detachment points

In scenarios where we are simulating a number of defaults occurring (i.e. tail risk), computation times increase dramatically
Default Risk Approximation – PV Interpolation

The time consuming step in calculating the PV impact of defaults is the recalibration of the tranche loss fraction curve
required for the new portfolio after defaulted names have been removed

For calculation of the PV of a tranche on a portfolio that has experienced defaults, this recalibration step can be
circumvented if we keep the portfolio the same, but readjust the tranche attachment point by the loss amount:
[Diagram: a 5%-6% tranche on a 0%-100% portfolio; a simulated default produces a 0.5% loss after recovery. Under full revaluation the portfolio shrinks and the tranche is re-priced; under PV interpolation the portfolio is unchanged and the tranche is simply shifted down to 4.5%-5.5%]
Operationally, for each trade portfolio, the (mid spreads and durations of) 15 tranches beneath the attachment point of the original tranche are pre-calculated
• These tranches are of the same tranche thickness as the original transaction
• The specific pre-calculated tranches depend on the original tranche attach and detach
Based on the simulated number of defaults, a loss amount is calculated, and the corresponding loss in subordination of the original tranche is calculated
The PV of the defaulted tranche is calculated by interpolating the subordination-adjusted tranche against the pre-calculated tranches
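The interpolation shortcut can be sketched as follows; the 16-point attachment ladder and the exponential PV profile are stand-ins for the real pre-calculated tranches:

```python
import numpy as np

def defaulted_tranche_pv(attach_ladder, pv_ladder, orig_attach, loss):
    """Interpolate the tranche PV at the attachment point reduced by the loss,
    instead of re-calibrating the loss fraction curve for the shrunk portfolio."""
    adj_attach = max(orig_attach - loss, 0.0)     # subordination eaten by the loss
    return float(np.interp(adj_attach, attach_ladder, pv_ladder))

attach_ladder = np.linspace(0.0, 0.05, 16)   # 15 steps below a 5% attachment
pv_ladder = -np.exp(-40.0 * attach_ladder)   # stand-in PV profile vs subordination
pv_after = defaulted_tranche_pv(attach_ladder, pv_ladder, orig_attach=0.05, loss=0.005)
```

Each default in the simulation then costs one table lookup and an interpolation rather than a full loss-fraction recalibration.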