Advanced Risk Management I
Lecture 5
Value at Risk & co.
Map of the exposures
• Equity
– Country
– Sector
• Bond
– Currency and bucket
– Issuer (rating class) and bucket
• Foreign exchange
– Value of the exposures in foreign currency
Profit and loss
• Define, at time t, for a given market,
– A set of maturities t1,t2,…tn
– A set of nominal cash-flows c1,c2,…cn
– A set of discount factors P(t,t1),P(t,t2)…P(t,tn)
• The mark-to-market value at time t is
V(t) = c1P(t,t1)+ c2P(t,t2)+ …+cn P(t,tn)
• At time t+, the mark-to-market is
P(t+,ti)=(1+ri) P(t ,ti) for every i, so that
V(t+)-V(t) = c1r1P(t,t1)+ c2r2P(t,t2)+ …+cnrn P(t,tn)
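A minimal Python sketch of the two formulas above, computing the mark-to-market value and the one-period profit and loss of a cash-flow map; the cash flows, discount factors and percentage changes are made-up illustration values, not data from the lecture.

```python
import numpy as np

# Mark-to-market value V(t) = sum_i c_i P(t, t_i) and
# P&L V(t+) - V(t) = sum_i c_i r_i P(t, t_i).
# All inputs below are illustrative numbers.
c = np.array([100_000.0, 100_000.0, 1_100_000.0])  # nominal cash flows c_1..c_n
P = np.array([0.97, 0.94, 0.90])                   # discount factors P(t, t_i)
r = np.array([0.001, -0.002, 0.0015])              # percentage changes of the P(t, t_i)

V_t = np.sum(c * P)        # mark-to-market value at time t
pnl = np.sum(c * r * P)    # profit and loss between t and t+

print(f"V(t) = {V_t:,.2f}")
print(f"V(t+) - V(t) = {pnl:,.2f}")
```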
Risk measurement
• The key problem for the construction of a risk
measurement system is then the joint distribution
of the percentage changes of value r1, r2,…rn.
• The simplest hypothesis is a multivariate normal
distribution. The RiskMetrics™ approach assumes a
“locally” normal distribution, consistent with a
GARCH model.
Risk measurement methodologies
• Parametric approach: assumes a conditionally
normal distribution (EWMA model) and is based
on volatility and correlation parameters
• Monte Carlo simulation: risk-factor scenarios are
simulated from a given distribution, the position is
revalued, and the empirical distribution of losses
is computed
• Historical simulation: risk-factor scenarios are
simulated from market history, the position is
revalued, and the empirical distribution of losses
is computed.
Value-at-Risk
• Define Xi = ri ci P(t,ti) the profit and loss on
bucket i. The loss is then given by –Xi. A
risk measure is a function ρ(Xi).
• Value-at-Risk:
VaRα(Xi) = qα(–Xi) = inf{x : Prob(–Xi ≤ x) > α}
• The function qα(.) is the α-level quantile of
the distribution of losses (–Xi).
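A minimal sketch of the definition above: VaR is obtained as the α-quantile of the loss distribution, here estimated on a set of P&L scenarios (as one would do in a historical or Monte Carlo simulation). The scenarios are simulated only to make the example self-contained.

```python
import numpy as np

# VaR_alpha(X) = alpha-quantile of the loss -X, estimated on P&L scenarios.
rng = np.random.default_rng(0)
pnl_scenarios = rng.normal(loc=0.0, scale=10_000.0, size=250)  # illustrative X_i scenarios

alpha = 0.99
losses = -pnl_scenarios
var_99 = np.quantile(losses, alpha)   # inf{x : Prob(-X <= x) >= alpha}, empirically

print(f"99% VaR = {var_99:,.2f}")
# By construction, roughly 1 - alpha = 1% of the scenarios lose more than the VaR.
print(f"fraction of scenarios with X <= -VaR: {np.mean(pnl_scenarios <= -var_99):.3f}")
```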
VaR as “margin”
• Value-at-Risk is the analogue of the concept of
“margin” in the futures market.
• In futures markets, positions are marked-to-market
every day, and for each position a margin (a cash
deposit) is posted by both the buyer and the seller,
to ensure enough capital is available to absorb the
losses within a trading day.
• Likewise, a VaR is the amount of capital allocated
to a given risk to absorb losses within a holding
period horizon (unwinding period).
VaR as “capital”
• It is easy to see that VaR can also be seen as
the amount of capital that must be allocated
to a risk position to limit the probability of
loss to a given confidence level.
VaRα(Xi) = qα(–Xi) = inf{x : Prob(–Xi ≤ x) > α}
= inf{x : Prob(x + Xi > 0) > α}
= inf{x : Prob(x + Xi ≤ 0) < 1 – α}
VaR and distribution
• Call FX the distribution of Xi. Notice that
• F_X(–VaRα(Xi)) = Prob(Xi ≤ –VaRα(Xi))
= Prob(–Xi > VaRα(Xi))
= Prob(–Xi > F_{–X}^(–1)(α))
= Prob(F_{–X}(–Xi) > α) = 1 – α
• So, we may conclude
Prob(Xi ≤ –VaRα(Xi)) = 1 – α
VaR methodologies
• Parametric: assume profit and losses to be
(locally) normally distributed.
• Monte Carlo: assumes the probability
distribution to be known, but the pay-off is
not linear (e.g. options)
• Historical simulation: no assumption about
profit and losses distribution.
VaR in a parametric approach
• pi=ciP(t,ti) marking-to-market of cash flow i
ri, percentage daily change of i-th factor
Xi, profits and losses piri
• Example: ri has normal distribution with mean μi and
volatility σi. Take α = 99%
Prob(ri ≤ μi – 2.33 σi) = 1%
If μi = 0, Prob(Xi = ri pi ≤ –2.33 σi pi) = 1%
VaRi = 2.33 σi pi = Maximum probable loss (1%)
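A minimal numeric sketch of the bucket VaR above; the exposure and the daily volatility are made-up values, while 2.33 is the 99% normal quantile used in the slide.

```python
# Parametric VaR of a single bucket: VaR_i = 2.33 * sigma_i * p_i (alpha = 99%, zero mean).
p_i = 1_000_000.0   # marked-to-market value c_i * P(t, t_i), illustrative
sigma_i = 0.012     # daily volatility of the percentage change r_i, illustrative
z_99 = 2.33         # 99% quantile of the standard normal

var_i = z_99 * sigma_i * p_i
print(f"99% daily VaR = {var_i:,.2f}")   # 27,960.00 with these numbers
```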
Volatility estimation
• Volatility estimation is the key issue in the
parametric approach
• Choice of the information: implied and
historical
• Measurement risk
• Model risk
Volatility information
• Historical volatility
– Pros: historical info available for a large set of markets
– Cons: history never repeats itself in the same way
• Implied vol
– Pros: forward looking
– Cons: available for a limited number of markets
Measurement risk
• Estimation risk of volatility can be reduced
using more information on
– Opening and closing prices
– Maximum and minimum price in the period
• Estimators: i) Garman and Klass; ii)
Parkinson; iii) Rogers and Satchell; iv)
Yang and Zhang
Estimation risk (1)
• Oi and Ci are opening and closing prices of
day i respectively
• Hi and Li are the highest and lowest prices
of day i.
• Parkinson:
σ̂²_P = (1/T) Σ_{i=1..T} (H_i – L_i)² / (4 ln 2)
• Garman-Klass (GK):
σ̂²_GK = (a/T) Σ_{i=1..T} (O_i – C_{i–1})² / f + ((1 – a)/T) Σ_{i=1..T} (H_i – L_i)² / [(1 – f) 4 ln 2]
where f is the fraction of the day during which the market is closed (the “opening jump” window) and a is a weighting constant.
Estimation risk (2)
• Define: oi = Oi – Ci-1, hi = Hi – Oi, li = Li –
Oi, ci = Ci – Oi. Moreover, 2o and 2c are
variances computed with opening and
closing prices respectively
• Rogers-Satchell:
σ̂²_RS = (1/T) Σ_{i=1..T} [h_i(h_i – c_i) + l_i(l_i – c_i)]
• Yang-Zhang:
σ̂²_YZ = σ̂²_O + k σ̂²_C + (1 – k) σ̂²_RS
Estimation risk (3)
• Parkinson: 5 times more efficient
– Mean return = 0; “opening jump” f = 0
• Garman and Klass: 6 times more efficient
– Mean return = 0; “opening jump” f ≠ 0
• Rogers and Satchell:
– Mean return ≠ 0; “opening jump” f = 0
• Yang and Zhang:
– Mean return ≠ 0; “opening jump” f ≠ 0
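A minimal sketch of the four estimators as reconstructed above, assuming O, H, L, C are log prices. The price series is simulated and the constants f and a are arbitrary illustrative choices; only k = 0.1875 comes from the example that follows.

```python
import numpy as np

def range_estimators(O, H, L, C, f=0.2, a=0.17, k=0.1875):
    """Daily variance estimates from open/high/low/close log prices."""
    o = O[1:] - C[:-1]      # opening jump o_i = O_i - C_{i-1}
    h = H[1:] - O[1:]       # h_i = H_i - O_i
    l = L[1:] - O[1:]       # l_i = L_i - O_i
    c = C[1:] - O[1:]       # c_i = C_i - O_i
    range_sq = (H[1:] - L[1:]) ** 2

    var_park = np.mean(range_sq / (4 * np.log(2)))                    # Parkinson
    var_gk = a * np.mean(o ** 2) / f + (1 - a) * var_park / (1 - f)   # Garman-Klass with jump
    var_rs = np.mean(h * (h - c) + l * (l - c))                       # Rogers-Satchell
    var_yz = np.var(o, ddof=1) + k * np.var(c, ddof=1) + (1 - k) * var_rs  # Yang-Zhang
    return {"Parkinson": var_park, "Garman-Klass": var_gk,
            "Rogers-Satchell": var_rs, "Yang-Zhang": var_yz}

# Illustrative log-price series (14 days, i.e. 13 usable observations).
rng = np.random.default_rng(1)
C = np.cumsum(rng.normal(0, 0.01, 14)) + np.log(45_000)
O = C - rng.normal(0, 0.005, 14)
H = np.maximum(O, C) + np.abs(rng.normal(0, 0.004, 14))
L = np.minimum(O, C) - np.abs(rng.normal(0, 0.004, 14))

for name, v in range_estimators(O, H, L, C).items():
    print(f"{name:15s} daily volatility = {np.sqrt(v):.4%}")
```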
Example: Italian blue chips
[Figure: Italian blue chips index level, ranging roughly between 41,000 and 47,000, over 26/09/00–13/10/00]
Results

Date     | h        | l         | o         | c         | (h–l)²/(4·ln2) | h(h–c)+l(l–c)
27/09/00 | 1.57304% | 0.00000%  | -1.08562% | 0.90659%  | 0.00892%       | 0.01048%
28/09/00 | 0.00000% | -1.42275% | 0.39039%  | -0.79109% | 0.00730%       | 0.00899%
29/09/00 | 0.24260% | -0.97856% | 0.26074%  | -0.81075% | 0.00538%       | 0.00420%
02/10/00 | 1.87499% | 0.00000%  | -0.34253% | 1.72715%  | 0.01268%       | 0.00277%
03/10/00 | 0.72456% | -0.48452% | -0.06747% | 0.11533%  | 0.00527%       | 0.00732%
04/10/00 | 0.59880% | -0.67934% | -0.42497% | 0.19419%  | 0.00589%       | 0.00836%
05/10/00 | 0.15623% | -0.49416% | 0.37639%  | -0.19563% | 0.00153%       | 0.00202%
06/10/00 | 0.17383% | -1.60912% | 0.05003%  | -1.45675% | 0.01147%       | 0.00529%
09/10/00 | 0.00222% | -1.25356% | -0.45117% | -0.97338% | 0.00569%       | 0.00353%
10/10/00 | 0.29532% | -0.77015% | 0.65145%  | -0.12238% | 0.00409%       | 0.00622%
11/10/00 | 0.12797% | -1.77694% | -0.89458% | -1.27056% | 0.01309%       | 0.01079%
12/10/00 | 0.54148% | -2.07786% | 0.56945%  | -1.48141% | 0.02475%       | 0.02335%
13/10/00 | 3.39528% | 0.00000%  | -1.41515% | 3.39528%  | 0.04158%       | 0.00000%

Number of observations: 13
Volatility estimates: Opening 0.65599%; Closing 1.40101%; Parkinson 1.06566%; Rogers-Satchell 0.84726%; Yang-Zhang (k = 0.1875) 1.17542%
Model risk
• Beyond estimation risk, it may happen that
volatility itself changes over time, making the
distribution non-normal.
• Garch models: shocks hitting the return change
the volatility of the next-period return.
• Stochastic volatility models: volatility may depend
on variables other than the return itself.
Blue chips volatility
[Figure: blue chips volatility, ranging roughly between 0 and 0.25, over 03/03/95–03/01/00]
Garch(p,q) models
• Conditional distribution of the returns is normal, but
volatility changes in time following an
autoregressive process of the ARMA(p,q) kind. For
example, the Garch(1,1) model is:
ε_t ~ N(0, σ_t)
R_t = ε_t
σ²_t = ω + α₁ ε²_{t–1} + β₁ σ²_{t–1}
Garch: ABC…
• In a Garch model the unconditional distribution
of returns is NOT normal; in particular it is
leptokurtic (“fat tails”): extreme events are more
likely than under the normal distribution
• In a Garch model the future variance is forecasted
recursively with the formula
σ̂²_{t+i} = ω + (α₁ + β₁) σ̂²_{t+i–1}
• The degree of persistence is given by α₁ + β₁ ≤ 1
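A minimal sketch of the recursive variance forecast above; the Garch parameters are made-up values, not estimates.

```python
import numpy as np

# Recursive Garch(1,1) forecast: sigma2_{t+i} = omega + (alpha1 + beta1) * sigma2_{t+i-1}.
omega, alpha1, beta1 = 1e-6, 0.08, 0.90   # illustrative parameters, persistence 0.98
sigma2 = 0.012 ** 2                        # current daily variance (illustrative)

forecasts = []
for _ in range(10):                        # 10-day-ahead variance forecasts
    sigma2 = omega + (alpha1 + beta1) * sigma2
    forecasts.append(sigma2)

print([f"{np.sqrt(s):.4%}" for s in forecasts])
# With alpha1 + beta1 < 1 the forecast reverts to the unconditional variance
# omega / (1 - alpha1 - beta1); with alpha1 + beta1 = 1 (Igarch) the current
# variance level persists forever.
```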
A special Garch…
• Assume: ω = 0 and α₁ + β₁ = 1. This is
integrated Garch (Igarch) without drift:
– i) volatility is persistent: every shock remains
in the history of volatility forever
– ii) the best predictor of time t + i volatility is
time t + i – 1 volatility
– iii) time t volatility is given by (with λ = β₁ = 1 – α₁)
σ²_t = (1 – λ) ε²_{t–1} + λ σ²_{t–1}
…called EWMA
• Notice that IGarch(1,1) with ω = 0 is the same as a
model in which volatility is updated with a moving
average with exponentially decaying weights
(EWMA).
• The model, with parameter λ = 0.94, is employed by
RiskMetrics™ to evaluate volatility and correlations.
• The model corresponds to an estimate of volatility
that weights the more recent observations more
heavily (with λ = 0.94 virtually all of the weight falls
on the last 75 observations)
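A minimal sketch of the EWMA update with λ = 0.94 as used by RiskMetrics; the daily return series is simulated only to make the example run.

```python
import numpy as np

# EWMA update: sigma2_t = (1 - lam) * r_{t-1}^2 + lam * sigma2_{t-1}, lam = 0.94.
lam = 0.94
rng = np.random.default_rng(2)
returns = rng.normal(0.0, 0.01, 500)   # illustrative daily returns

sigma2 = returns[0] ** 2               # simple initialisation
for r in returns[1:]:
    sigma2 = (1 - lam) * r ** 2 + lam * sigma2

print(f"EWMA daily volatility estimate: {np.sqrt(sigma2):.4%}")
# Equivalently, sigma2_t is a weighted average of past squared returns with
# exponentially decaying weights (1 - lam) * lam^k on the return of k+1 days ago.
```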
Volatility estimates
Ghost feature
• Tuning the weights of the EWMA scheme allows one to
reduce the relevance of a phenomenon called the
ghost feature.
• Ghost feature: a shock keeps affecting the VaR
estimate with the same weight for the whole period it
remains in the sample, and when it exits the
sample, the VaR estimate changes with no
apparent motivation.
Cross-section aggregation
• Once the Value-at-Risk is computed for every
factor and position, the measure is aggregated
across factors or across different business units.
• Aggregation is performed according to two
methods
– Undiversified VaR: the algebraic sum of individual VaR
values
– Diversified VaR: quadratic sum computed with
correlation matrix C.
Cross-section aggregation:
diversified VaR
VaR = √(dᵀ C d)
where d = (VaR₁, VaR₂, …, VaR_N)ᵀ is the vector of individual VaR values and C is the correlation matrix, with ones on the diagonal and the correlations ρ_ij (e.g. ρ_1N) off the diagonal.
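A minimal sketch of the two aggregation rules; the individual VaR figures and the correlation matrix are made-up illustrations.

```python
import numpy as np

# Undiversified VaR: sum of individual VaRs. Diversified VaR: sqrt(d' C d).
d = np.array([20_000.0, 15_000.0, 8_000.0])   # individual VaR_1..VaR_N (illustrative)
C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])               # correlation matrix (illustrative)

undiversified = d.sum()
diversified = np.sqrt(d @ C @ d)

print(f"undiversified VaR = {undiversified:,.2f}")
print(f"diversified VaR   = {diversified:,.2f}")   # lower, since correlations are below 1
```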
VaR: temporal aggregation
• Aggregating VaR over different unwinding periods
requires assumptions about the dynamic
process describing losses
• The relationship is
VaR(unwinding period) = √(unwinding period in days) × Daily VaR
• Notice: the relationship is based on the assumption
that:
– i) shocks are not serially correlated
– ii) the portfolio does not change during the unwinding
period
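A minimal sketch of the square-root-of-time rule above; the daily VaR figure is just the illustrative number from the earlier parametric example.

```python
import numpy as np

# VaR over the unwinding period = sqrt(number of days) * daily VaR,
# assuming serially uncorrelated shocks and an unchanged portfolio.
daily_var = 27_960.0        # illustrative daily VaR
holding_days = 10
print(f"10-day VaR = {np.sqrt(holding_days) * daily_var:,.2f}")
```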
Example
• Position: 1 mil. euros on Italian equity and
0.5 mil. euros on US equity. Stocks on the
US market are denominated in dollars.
• Exposure:
1 000 000 Euro Italy equity
500 000 Euro US equity
500 000 Euro US/Euro exchange rate risk
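A minimal sketch of a diversified VaR for the exposure above. The daily volatilities and the correlation matrix are hypothetical values, not given in the lecture; they are here only to make the example run.

```python
import numpy as np

# Three risk factors: Italian equity, US equity, USD/EUR exchange rate.
exposure = np.array([1_000_000.0, 500_000.0, 500_000.0])   # euro, from the slide
vol = np.array([0.015, 0.013, 0.006])                      # hypothetical daily volatilities
z_99 = 2.33

d = z_99 * vol * exposure                                  # individual 99% daily VaRs
C = np.array([[1.0, 0.6, 0.1],
              [0.6, 1.0, 0.1],
              [0.1, 0.1, 1.0]])                            # hypothetical correlations

print(f"undiversified VaR = {d.sum():,.2f} euro")
print(f"diversified VaR   = {np.sqrt(d @ C @ d):,.2f} euro")
```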
Value-at-Risk validation
• Once one has built a system for the computation
of VaR, how to test its effectiveness?
• A possible strategy is to verify how many times in
past history losses have been higher than the VaR
measure computed for the corresponding periods.
• These are called validation procedures (or
backtesting)
What P&L is used for validation?
• Notice again the difference between market price
and marked-to-market price
• Since VaR has the goal of evaluating the marked-to-market
loss of a position, the validation procedure
must be carried out on the same concept of value
• Since market prices are determined by elements
other than the mark-to-market (liquidity factors
and others), it would be a mistake to evaluate the
VaR measure directly on market losses.
Kupiec test
• A statistical test, suggested by Kupiec, is based on
the hypothesis that losses exceeding VaR be
independent.
• Based on this hypothesis, one may compare the
number x of episodes of excess losses out of a
sample of N cases with the binomial distribution with
exceedance probability 1 – α:
P(x) = C(N, x) (1 – α)^x α^(N – x)
where C(N, x) is the binomial coefficient.
Likelihood ratio
• The test is based on the ratio between the
probability of observing x excess losses under the
empirical frequency x/N and the same probability
under the theoretical probability 1 – α.
• The test, which is distributed as a chi-square with
one degree of freedom, is
LR = –2 ln[(1 – p)^(N – x) p^x] + 2 ln[(1 – x/N)^(N – x) (x/N)^x],  with p = 1 – α
Example
• In applications one typically takes one year of data
and a 1% probability level
• If we count 4 excess losses in one year (N = 250),
LR = 2 ln[(4/250)^4 (246/250)^246] – 2 ln[0.01^4 × 0.99^246] = 0.77
• Since the 1% critical value of the chi-square distribution with
one degree of freedom is 6.6349, the hypothesis of
accuracy of the VaR measure is not rejected (the p-value of 0.77 is 38.02%).
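A minimal sketch reproducing the likelihood ratio of the example above (x = 4 excess losses out of N = 250 days, exceedance probability 1%).

```python
from math import log, sqrt, erfc

def kupiec_lr(x, N, p):
    """LR = -2 ln[(1-p)^(N-x) p^x] + 2 ln[(1-x/N)^(N-x) (x/N)^x], chi-square(1) under H0."""
    phat = x / N
    log_h0 = (N - x) * log(1 - p) + x * log(p)
    log_h1 = (N - x) * log(1 - phat) + x * log(phat)
    return 2 * (log_h1 - log_h0)

lr = kupiec_lr(x=4, N=250, p=0.01)
p_value = erfc(sqrt(lr / 2))            # survival function of a chi-square(1)
print(f"LR = {lr:.2f}, p-value = {p_value:.2%}")   # about 0.77 and 38%
```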
Christoffersen extension
• A flaw of the Kupiec test is that it is based on the
hypothesis of independent excess losses.
• Christoffersen proposed an extension taking serial
dependence into account. It is a joint test of the
two hypotheses.
• The joint test may be written as
LRcc = LRun + LRind
where LRun is the unconditional test and LRind is
that of independence. It is distributed as a chi-square with 2 degrees of freedom.
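A minimal sketch of the independence component, written from the standard first-order Markov formulation of Christoffersen's test; the exceedance sequence below is made up. The joint statistic is then LRcc = LRun + LRind, chi-square with 2 degrees of freedom.

```python
from math import log, exp

def lr_independence(hits):
    """Christoffersen independence LR on a 0/1 sequence of VaR exceedances."""
    # Transition counts n[i][j]: number of times state i is followed by state j.
    n = [[0, 0], [0, 0]]
    for prev, curr in zip(hits[:-1], hits[1:]):
        n[prev][curr] += 1
    pi01 = n[0][1] / (n[0][0] + n[0][1])
    pi11 = n[1][1] / (n[1][0] + n[1][1]) if (n[1][0] + n[1][1]) > 0 else 0.0
    pi = (n[0][1] + n[1][1]) / sum(map(sum, n))

    def ll(p, zeros, ones):
        # Bernoulli log-likelihood, with the 0*log(0) = 0 convention.
        return zeros * log(1 - p) + ones * log(p) if 0 < p < 1 else 0.0

    log_markov = ll(pi01, n[0][0], n[0][1]) + ll(pi11, n[1][0], n[1][1])
    log_iid = ll(pi, n[0][0] + n[1][0], n[0][1] + n[1][1])
    return 2 * (log_markov - log_iid)

hits = [0] * 120 + [1, 1] + [0] * 126 + [1, 1]   # 250 days, 4 clustered exceedances
lr_ind = lr_independence(hits)
lr_cc = 0.77 + lr_ind                            # LRun taken from the Kupiec example above
print(f"LRind = {lr_ind:.2f}, LRcc = {lr_cc:.2f}, p-value = {exp(-lr_cc / 2):.2%}")
```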