Examination into the Applications of
the Linear Value at Risk Model
Abstract
This paper offers a detailed investigation into the concept of VaR, its origins and
applications. We begin by examining the origins of VaR, from both an internal need and
an external requirement to manage risk exposure, before providing a brief description of
the historical simulation and Monte Carlo simulation techniques for implementing VaR.
Greater emphasis is given to the delta-normal method, its advantages and disadvantages.
Through the construction of functions in Matlab, we examine the application of the
delta-normal method to single-asset and multi-asset portfolios, before examining the
limitations of this method in using VaR to value portfolios containing options.
Table of Contents
1. Introduction
2. The Basel Accord
3. Literature Review
4. Theoretical Methodology
    Assumptions
5. Practical Methodology
6. Results
    Application of VaR Linear Model to Portfolios containing options
7. Conclusion
8. Bibliography
1. Introduction
The financial world is an ever-changing and dynamic global system that offers investors
the opportunity to achieve substantial gains, whilst simultaneously exposing them to large
amounts of risk inherent in their investments. In recent times there has been a change of
emphasis in the way in which financial institutions, and even individual investors, pursue
the maximum possible rewards. No longer is it acceptable to focus solely on the upside
benefits of an investment opportunity. Instead, more rational behaviour now involves
awareness and management of any potential downside risk. As such, improvements are
continuously being sought to relate the regulatory capital that must be available to the
underlying risks that a firm undertakes.
The emergence of Value at Risk (“VaR”) in recent times has been the result of a twofold
development, consisting of both internal and external factors.
In relation to external developments, regulations have played a major role in the
development of risk measurement techniques. In 1996 under the Basel Accord
Amendment, the Basel Committee on Banking Supervision recommended two types of
models for measuring market risk on a daily basis. Given the importance of this
Amendment, we devote an entire section of this report to outlining its developments.
Not only have these models become industry standards for measuring risk for external
regulatory purposes, but they are also constructed and implemented for internal risk
management and control.
The internal need for VaR stems from the repercussions associated with an ever-changing
financial world. The past few decades have witnessed tremendous volatility in exchange
rates, interest rates, commodity prices and a vast array of derivative instruments for use in
managing the risks of such changes. This proliferation of derivative instruments has been
accompanied by significant increases in the number of securities traded along with
significant growth in financing opportunities. In addition, there has been colossal growth
in foreign trade resulting in financial links amongst international companies. As a
consequence of these factors, many companies have portfolios that include large positions
in cash instruments and (often very complex) derivatives. The complexity of such
portfolios, and the vast number of trading transactions that occur daily, mean that the
magnitudes of these risks are constantly changing. Further complicating this issue is the fact
that many of these risks are not immediately obvious. (Linsmeier & Pearson, 2000)
identify that these trends have led to a demand for a “portfolio-level quantitative measure
of market risk that a manager can succinctly report to senior management charged with
the oversight of risk and trading operations”. VaR is the leading summary measure of this
type.
In short, VaR is an attempt to provide a single number summarising the total risk in a
portfolio of financial assets. This single number is generally expressed in the following
way:
“I am X percent certain that there will not be a loss of more than V dollars in the next N
days”.
The elegance of the VaR measure is its applicability to various levels of operational scale.
It is as useful to an individual investor holding perhaps only a few assets as it is to an
entire financial institution managing countless portfolios. This report intends to examine
VaR in its applications as a linear model, as well as analysing its performance for a portfolio
containing options.
2. The Basel Accord
The 1988 Basel Accord typified a major milestone in the history of banking regulation.
Otherwise referred to as Basel I, and issued by the Basel Committee on Banking
Supervision (BCBS), its primary focus was credit risk and the appropriate risk-weighting
of assets. Under the terms of this Accord, bank assets were to be classified and then
grouped into categories according to their level of credit risk. Assets carrying a risk
weighting of 0% were typically short-term cash-type instruments, including currency,
bullion and debt issued by the bank’s own domestic government. The next weighting
bracket was 20%; within this bracket fell mortgage-backed securities with the highest
AAA rating. A 50% weighting bracket included municipal revenue bonds and residential
mortgages. Finally, a 100% weighting bracket was established to include corporate debt.
Some assets were given no rating, and banks operating on an international scale were
required to hold capital equal to 8% of their risk-weighted assets (RWA).
In 1996 the BCBS issued an Amendment to the 1988 Basel Accord in order to address the
price risks to which banks are exposed, with particular emphasis on those risks inherent in
a bank’s trading activities. A key principle under this Amendment was the extension of
the Cooke Ratio (the ratio of eligible capital, consisting of Tier 1, Tier 2 and, newly, Tier 3
capital, to risk-weighted assets) to cover market risk. In addition, two alternative
approaches were outlined for measuring market risk: a standardised method and an
internal models approach. Under the latter, banks were allowed to make use of proprietary
in-house models to measure their market risks, on the condition that these models
compute a daily VaR at the 99th percentile, with an industry-wide base time horizon of
ten trading days. These calculations must also be based upon historical observations
covering at least the previous year as a minimum observation period. Although outlined in
1996, this Amendment did not come into effect until 1998, and it eventually gave rise to
the widespread implementation of the VaR model.
In more recent years, further amendments have been applied to the regulations imposed in
1988 and 1996. Basel II, published in 2004, was developed to establish an international
standard for banking regulators to control the level of capital reserves held by banks to
compensate for the financial and operational risks to which they were exposed. It is
centred upon a “three pillars” approach:
• Pillar I focuses on the regulatory capital a bank is required to hold in respect of
the financial, operational and credit risks to which it is exposed. It is within this
pillar that the VaR model is incorporated.
• Pillar II is concerned with the regulatory framework for the issues outlined in
Pillar I.
• Finally, Pillar III aims to complement Pillars I and II by developing disclosure
requirements that give market participants a means of gauging the capital
adequacy of an institution.
In theory, Basel II attempted to ensure that a bank held sufficient capital reserves in
proportion to the risks it took on through its normal operations, i.e. its borrowing and
lending activities. Generally speaking, under these rules, the greater the risk to which a
bank is exposed, the greater the amount of capital it needs to hold to safeguard its
solvency and overall economic stability.
Despite the strong intentions of Basel II, it proved extremely difficult to implement and
was ultimately overtaken by the 2007/2008 financial crisis. Basel II has now been
superseded by Basel III, which is unlikely to be fully implemented until 2018 and which
focuses on strengthening the capital reserves and regulation of banks in light of problems
exposed by the recent financial crisis.
3. Literature Review
The concept of risk management in finance, economics and statistics has grown in
importance since the emergence of the first financial institutions. In particular, financial
risk management has become a pressing concern for regulators and banking executives.
VaR analysis was not a concept that emerged purely from academia; it was triggered by
the stock market crash of 1987. This was the first time within a major financial crisis
that academically trained quantitative analysts held senior positions in the banking
world and had to worry about firm-wide survival. (Jorion, 2006)
The crash of 1987 was not predicted by standard statistical models, leading analysts to
doubt their models and question everything developed up to that point. Historical
back-testing revealed that such crises recur, roughly once or twice a decade, and
overpower the statistical models used in trading, investment management and derivative
pricing. When these extremely rare events were included in the models, results were
dominated by them, leading trading strategies to fail. Conversely, when such events were
excluded, profits between crisis points were much smaller than the losses suffered during
the crises. Institutions were set to fail as a result.
(Taleb, 2007) (Kolman, 1997)
VaR was developed to separate these extreme events, which were studied using historical
data over a long-term timeline and over broad market events, from standard price
movements, which could be easily extracted from specific markets. The aim was a risk
metric that would signal extreme events in the marketplace in advance, by increasing the
estimated VaR.
The first models of VaR were kept quite confined within quantitative trading groups at
financial institutions during the early 1990s, most notably Bankers Trust. The instability
surrounding the “Dot Com” bubble of the 1990s left many financial firms in trouble.
Since many trading desks already computed a risk-management VaR, and it was the only
common risk measure that could be both defined for all businesses and aggregated
without strong assumptions, it was the natural choice for reporting firm-wide risk. The
most famous account is that of J.P. Morgan’s CEO, Dennis Weatherstone, and the
so-called “4.15 report”, a one-page account of all the firm’s risk stating:
“I am “X”% certain there will not be a loss of “V” dollars over the next “N” days”
which was made available to him 15 minutes before the market closed. (Kolman, 1997)
The VaR risk measure was thus adapted for this purpose, and J.P. Morgan carried out
further extensive development of the metric, publishing their methodology and the
estimates of the necessary underlying parameters in 1994. (Spencer & Longerstaey, 1996)
Two years later, this methodology was spun off into an independent for-profit business
now named the “RiskMetrics Group” (Anon., 1996). In
1997, the U.S. Securities and Exchange Commission (SEC) ruled that public corporations
must disclose quantitative information about their derivative activity. Major banks and
dealers chose to implement the rule by including VaR information in the notes to their
financial statements. (Jorion, 2006) (SEC, 1997)
It is agreed among most academics that there are several approaches to evaluating VaR in
the firm; the main three approaches are historical simulation, the delta-normal approach,
and Monte Carlo simulation (Linsmeier & Pearson, 2000). There are some arbitrary
factors to consider when estimating VaR: one must choose the probability “x” and the
holding period “t”. Without them, VaR numbers are meaningless. Theory provides little
guidance about the choice of probability; it is determined primarily by how the designer
of the risk-management system wants to interpret the VaR number, e.g. the RiskMetrics
system uses 5 percent, whereas Mobil Oil’s 1998 annual report indicates that Mobil uses
0.3 percent. For the entity’s time horizon, those that actively trade their portfolios, such as
financial firms, typically use one day, whereas institutional investors and nonfinancial
corporations may use longer periods. The choice of holding period can have an even
larger impact, because the VaR computed using a t-day holding period is approximately
√t times as large as the VaR computed using a one-day holding period. VaR numbers are
not comparable across entities: one company’s VaR differs from another’s depending on
portfolio selection and diversification (Markowitz, 1952), holding period and probability
level (Linsmeier & Pearson, 2000). (Pearson, 2002)
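The square-root-of-time scaling just described can be sketched in a few lines; a Python illustration (the function name and dollar figure are ours, purely for demonstration):

```python
import math

def scale_var(one_day_var, horizon_days):
    """Scale a one-day VaR to a t-day horizon with the square-root-of-time
    rule, which holds under i.i.d. normally distributed daily changes."""
    return one_day_var * math.sqrt(horizon_days)

# A $1,000,000 one-day VaR scaled to a ten-day holding period:
ten_day_var = scale_var(1_000_000, 10)   # roughly $3.16m
```

This scaling is why the Basel ten-day requirement is often met in practice by multiplying a one-day figure by √10.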
A recent survey by Britain’s Financial Services Authority has revealed that 42 percent of
banks use the Delta-Normal approach, 31 percent use Historical Simulation, and 23
percent use the Monte Carlo approach. The Monte Carlo analysis of linear positions with
normal returns, for instance, should yield the same result as the Delta-Normal method
(Jorion, 2006).
In historical simulation, one simply uses real historical data to build an empirical model
for the portfolio profit and loss. There are no assumptions about the analytic form of this
distribution, nor about the type of co-movements between assets or risk factors. It is for
this reason that option prices and other complex positions can be evaluated, which makes
the method favourable among institutions. The absence of distributional assumptions and
the direct estimation of the variation without the use of a covariance matrix are the main
strengths of the historical VaR model. Historical data on the underlying assets and risk
factors for the portfolio are collected on a daily basis covering several years; these data
are used to calculate the portfolio value each day, keeping current portfolio weights
constant. The values of any options are calculated using pricing models. This model has
some drawbacks: data for all assets may not be available, and a long historical series may
contain a number of extreme market events not relevant to current circumstances.
Filtering out these events is a possibility to obtain normal market behaviour, but it has to
be done carefully so as not to throw out a relevant part of the data, since in VaR models
the relevant part is precisely the exceptional losses. (Alexander, 2001)
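The mechanics described above can be condensed into a short sketch. The paper's own code is Matlab; this is a hypothetical Python illustration in which the P&L series and all names are ours:

```python
def historical_var(pnl_history, confidence=0.99):
    """Historical-simulation VaR: the loss at the lower (1 - confidence)
    percentile of the empirical P&L distribution. No distributional
    assumptions are made about the data."""
    ordered = sorted(pnl_history)                  # worst outcomes first
    cutoff = int((1 - confidence) * len(ordered))  # index of the tail quantile
    return -ordered[cutoff]

# A toy series of 500 daily P&L figures from -250 to +249:
pnl = list(range(-250, 250))
var_99 = historical_var(pnl, 0.99)   # the loss exceeded on only 1% of past days
```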
Yet another method of computing VaR is the Monte Carlo simulation approach. In this
approach, a simulation of the hypothetical future values of the portfolio is performed by
drawing pseudo-random return vectors from an assumed joint distribution of the
percentage changes in the underlying risk factors, and then computing hypothetical new
values of those risk factors and hypothetical new values of the portfolio. This provides a
large data set with thousands of possibilities for the portfolio; a histogram of the
differences between the generated and current portfolio values is obtained and, as in the
historical method above, VaR is measured from the lower percentile of the distribution
(Pearson, 2002). This model can be used regardless of the option content of a portfolio
and it is easy to implement, although it does take up a lot of computational time for large
portfolios. (Linsmeier & Pearson, 2000)
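For a single linear position under assumed normal returns, the simulation loop is brief; a Python sketch (the parameter values, names and trial count are illustrative):

```python
import random

def monte_carlo_var(value, mu, sigma, confidence=0.99,
                    trials=100_000, seed=42):
    """Monte Carlo VaR of one linear position: draw pseudo-random normal
    returns, revalue the position, and read the VaR off the lower tail
    of the simulated P&L distribution."""
    rng = random.Random(seed)
    pnl = sorted(value * rng.gauss(mu, sigma) for _ in range(trials))
    return -pnl[int((1 - confidence) * trials)]

# For a linear position this should sit close to the delta-normal answer,
# 2.33 * 10,000,000 * 0.02, i.e. about 466,000 for a one-day 99% VaR.
var_mc = monte_carlo_var(10_000_000, 0.0, 0.02)
```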
The Delta-Normal method is viewed as the best method for computing VaR for portfolios
with linear positions whose distributions are close to the normal probability density
function. Using the Delta-Normal method, Value at Risk is relatively easy to compute,
fast, and accurate. In addition, it is not too prone to model risk (due to faulty assumptions
or computations). Because the method is analytical, it allows easy analysis of the VaR
results using marginal and component VaR measures. The method requires a procedure
known as “risk mapping”. Estimating the covariance matrix of returns for a large
portfolio of instruments is quite hard, so the actual instruments are mapped to a set of
simpler standard positions or instruments, such as the returns on market indexes. Once
these standard positions are identified and the covariance matrix of their changes in value
has been determined, the standard deviation of any portfolio of the standardised positions
can be computed using known formulas and the VaR calculated. This mapping captures
the exposures of the portfolio to the standard risk factors and amounts to replacing the
portfolio with a linear approximation. (Pearson, 2002)
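Once the mapped exposures and their covariance matrix are in hand, the portfolio standard deviation is the square root of the quadratic form w'Σw; a Python sketch with hypothetical figures (two standard positions whose returns have a 0.5 correlation):

```python
import math

def delta_normal_var(exposures, cov, z=2.33):
    """Delta-normal VaR of a portfolio mapped onto standard positions.
    exposures: currency amounts exposed to each standard risk factor.
    cov: covariance matrix of the factor returns.
    Portfolio variance is the quadratic form w' * cov * w."""
    n = len(exposures)
    variance = sum(exposures[i] * cov[i][j] * exposures[j]
                   for i in range(n) for j in range(n))
    return z * math.sqrt(variance)

# Daily volatilities of 2% and 1% with covariance 0.0001 (correlation 0.5):
cov = [[0.0004, 0.0001],
       [0.0001, 0.0001]]
mapped_var = delta_normal_var([1_000_000, 2_000_000], cov)
```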
Another way to look at the delta of a portfolio containing options is to consider the
“Greek” construction of the number. It is the rate of change of the value of the portfolio P
with respect to the current price S of the underlying:

Δ = ∂P / ∂S

Hull’s book derives this linear approximation elegantly and discusses how to extract the
standard deviation to use in the VaR calculation. (Hull, 2011)
The graphical “risk mapping” presented in Pearson’s book illustrates the linear, or delta,
approximation of the value of the portfolio. (Pearson, 2002)
FIGURE 1: PORTFOLIO VALUE AGAINST FTSE-100 INDEX
This approach will work when the linear approximation accurately describes the changes
in the value of the portfolio. From Figure 1, taken from Pearson’s book, it can be seen that
the quality of the linear approximation depends on the curvature of the function
expressing the value of the portfolio in terms of the underlying asset. This curvature is
captured by the portfolio’s second derivative, or the portfolio’s “Greek” Gamma. Gamma
measures the curvature of the relationship between the portfolio value and the underlying
market variable, while Delta measures the rate of change of the portfolio value with
respect to the underlying market variable (the first derivative). (Hull, 2011)
Gamma is a characteristic of options and option-like instruments, implying that the
accuracy of the linear approximation is determined by the number of options in the
portfolio and their characteristics. The simplest case is a portfolio with no options,
consisting of positions in stock, bonds, FX, and commodities. In this case the change in
the value of the portfolio is linearly dependent on the percentage changes in the prices of
the assets comprising the portfolio. (Hull, 2011) As a result, the Delta approach may not
provide an accurate VaR estimate for portfolios that contain a large number of options.
(Pearson, 2002)
For portfolios with downward curvature, or Gamma less than zero, the linear
approximation always lies above the actual portfolio value, so the loss predicted by the
approximation will be less than the actual loss; thus the delta-normal approach
underestimates the VaR of such portfolios. Conversely, for positive-Gamma portfolios
(where the approximation lies below the true portfolio value), the predicted loss will
exceed the true loss and the method will overestimate the VaR. (Pearson, 2002) Portfolios
consisting mostly of purchased options have positive Gamma, while written-option
portfolios have negative Gamma. Thus the Delta-Normal approach tends to underestimate
the VaR of written-option portfolios and overestimate that of bought-option portfolios.
Apart from the delta approximation itself, this approach carries some other assumptions,
including, as in all VaR models, that the portfolio does not change during the holding
period. The Delta method also assumes that market factors are normally distributed,
which is not the case in reality. Changes in financial rates and prices typically have fat
tails relative to the Normal Distribution: in actual distributions, both extreme and very
small changes occur more frequently than a Normal Distribution with the same variance
predicts.
The Delta-Normal method is easy to implement because it involves a simple matrix
multiplication. It is also computationally fast, even with a large number of assets, because
it replaces each position by its linear exposure. Portfolios that are linear combinations of
normally distributed risk factors are themselves normally distributed. It only requires the
market values and exposures of current positions, combined with risk data. This method is
important not only for its own sake but also because it illustrates the “mapping” principle
in risk management.
FIGURE 2: ERROR FROM DELTA APPLICATION
VaR has come under some criticism from academics over the last number of years for not
being what is known as a “coherent” risk measure, as it does not respect the sub-additivity
property. An immediate consequence is that Value at Risk might discourage
diversification. Value at Risk is, however, coherent under the assumption of elliptically
distributed losses (e.g. normally distributed) when the portfolio value is a linear function
of the asset prices. However, in this case Value at Risk becomes equivalent to a
mean-variance approach in which the risk of a portfolio is measured by the variance of
the portfolio's return. In a
paper by Artzner, Delbaen, Eber and Heath (1999), the authors mathematically define
four properties that a measure of risk must satisfy to be coherent:
• Translational invariance: if we add an amount of cash K to a portfolio, its risk
measure should go down by K.
• Sub-additivity: the risk measure for two portfolios after they have been merged
should be no greater than the sum of their risk measures before they were merged.
• Positive homogeneity: changing the size of a portfolio by a factor X while keeping
the relative amounts of different items in the portfolio the same should result in the
risk measure being multiplied by X.
• Monotonicity: if a portfolio has lower returns than another portfolio for every state
of the world, its risk measure should be greater. (Artzner, Delbaen, Eber, & Heath, 1999)
Artzner, Delbaen, Eber and Heath (1999) go on to discuss three methods currently used
for measuring market risk and comment on their performance in relation to coherence.
They examine the 1995 SPAN system (Chicago Mercantile Exchange), the SEC rules
used by the NASD in 1996 and the Fed in 1994, and lastly quantile-based VaR.
VaR is usually defined in terms of net wins or P&L and therefore ignores the difference
between money at one date and money at a different date, which, for small time periods
and a single currency, may be acceptable. It uses quantiles, which requires paying
attention to discontinuities and intervals of quantile numbers. The authors concluded that
VaR is not sub-additive, fails to recognise concentration of risks, and fails to encourage a
reasonable allocation of risks among agents. Value at Risk does not behave nicely with
respect to the addition of risks, even independent ones, thereby creating severe
aggregation problems. The use of Value at Risk does not encourage and, indeed,
sometimes prohibits diversification, because Value at Risk does not take into account the
economic consequences of the events whose probabilities it controls. (Artzner, Delbaen,
Eber, & Heath, 1999)
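The sub-additivity failure can be made concrete with a standard two-bond counterexample (our illustration, not drawn from the cited paper): two independent bonds each default with probability 4%, so each has a 95% VaR of zero, yet the merged portfolio does not.

```python
def discrete_var(outcomes, confidence=0.95):
    """VaR of a discrete loss distribution: the smallest loss level l
    such that P(loss > l) <= 1 - confidence.
    outcomes is a list of (loss, probability) pairs."""
    for level in sorted({loss for loss, _ in outcomes}):
        tail = sum(p for loss, p in outcomes if loss > level)
        if tail <= 1 - confidence:
            return level

# One bond: defaults with probability 4% for a loss of 100.
single = [(0, 0.96), (100, 0.04)]
# Two independent such bonds merged into one portfolio:
both = [(0, 0.96**2), (100, 2 * 0.96 * 0.04), (200, 0.04**2)]

# Each bond alone: P(loss > 0) = 4% <= 5%, so the 95% VaR is 0.
# Merged: P(loss > 0) = 7.84% > 5%, so the portfolio VaR jumps to 100,
# exceeding the sum of the individual VaRs and violating sub-additivity.
```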
One way to remedy the deficiencies of Value at Risk is to use Expected Shortfall in one
of several forms. In a paper by Acerbi and Tasche (2002), the properties of Expected
Shortfall are defined and discussed. Again they reiterate that VaR has been heavily
criticised for not being sub-additive: the measured risk of a portfolio can be larger than
the sum of the stand-alone risks of its components when measured by VaR. Hence,
managing risk by VaR may fail to stimulate diversification. Moreover, VaR does not take
into account the severity of an incurred damage event. The use of coherent risk measures
such as worst conditional expectation (WCE) or tail conditional expectation (TCE) was
advised against, as WCE is mainly used in a theoretical setting while TCE tends more
towards practical application. Thus the authors set out to construct a risk measure that
was both coherent and easy to compute. (Acerbi & Tasche, 2002)
Expected Shortfall evaluates the risk of an investment in a conservative way, focusing on
the less profitable outcomes. For high values of α it ignores the most profitable but
unlikely possibilities, while for small values of α it focuses on the worst losses. On the
other hand, unlike the discounted maximum loss, even for lower values of α Expected
Shortfall does not consider only the single most catastrophic outcome. A value of α often
used in practice is 5%. Expected Shortfall is stated as the “average loss in the worst α%
of cases”: the expected return on the portfolio in the worst α% of outcomes. It is an
alternative to Value at Risk that is more sensitive to the shape of the loss distribution in
the tail. (Acerbi & Tasche, 2002)
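On a finite sample the two measures differ only in whether the tail is averaged; a minimal Python sketch (the toy P&L sample and the 20% tail level are illustrative):

```python
def es_and_var(pnl_history, alpha=0.05):
    """Expected Shortfall ('average loss in the worst alpha fraction of
    cases') alongside the VaR at the same level, from sampled P&L."""
    ordered = sorted(pnl_history)            # worst outcomes first
    k = max(1, int(alpha * len(ordered)))    # size of the alpha-tail
    var = -ordered[k - 1]                    # loss at the tail boundary
    es = -sum(ordered[:k]) / k               # average loss inside the tail
    return var, es

sample = [-100, -50, -10, 0, 5, 10, 20, 30, 40, 50]
var_20, es_20 = es_and_var(sample, alpha=0.2)   # VaR 50, ES 75
```

Expected Shortfall is never smaller than VaR, because it averages the whole tail beyond the VaR boundary.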
VaR is criticised for not painting a full picture of the risks a company faces. A measure
that produces better incentives for traders than VaR is Expected Shortfall, also sometimes
referred to as conditional VaR or tail loss. Where VaR asks the question “how bad can
things get?”, Expected Shortfall asks, “if things do get bad, what is our expected loss?”
Regulators make extensive use of VaR, and its importance as a risk measure is therefore
unlikely to diminish. However, Expected Shortfall has a number of advantages over VaR,
which has led many financial institutions to use it as a risk measure internally. (Hull, 2011)
4. Theoretical Methodology
The covariance VaR was initially introduced in October 1994 by J.P. Morgan as a means
of easily reporting to senior management the current risks to which the Company’s
portfolios were exposed. This section outlines the theoretical methodology underlying
the implementation of the VaR model in practice.
ASSUMPTIONS
In constructing the VaR of a linear portfolio under the covariance VaR methodology, the
only information required is a covariance matrix of all assets contained within the
portfolio. This covariance matrix consists of the variances and covariances of the returns
on the assets in the portfolio, calculated using a variety of methods such as GARCH
models or moving-average methods. As identified under the 1996 Basel Amendment, it
is recommended that observations from at least one preceding year are used in the
construction of this matrix.
The fundamental assumption required in implementing this methodology is that the
portfolio P&L is normally distributed. Although the validity of this assumption is subject
to extensive debate and analysis, if we consider the change in P&L over an n-day period,
this change can be denoted as:

ΔnPt = Pt+n − Pt

with ΔnPt ∼ N(µt, σt²).

The 100α% n-day VaR is the number VaRα,n such that:

Prob(ΔnPt < −VaRα,n) = α

Applying the standard normal transformation:

Prob( (ΔnPt − µt)/σt < (−VaRα,n − µt)/σt ) = α

Given that (ΔnPt − µt)/σt ∼ N(0, 1), denoting (ΔnPt − µt)/σt by the standard normal
variate Zt:

Prob(Zt < (−VaRα,n − µt)/σt) = α

Prob(Zt < −Zα) = α

with (−VaRα,n − µt)/σt = −Zα,

where Zα represents the 100(1−α)th percentile of the standard normal density.
Rearranging the above results in the following expression for calculating VaR:

VaRα,n = Zα σt − µt

As previously identified, VaR is only a suitable risk measure over small time frames and
for the associated short-term risks. It is therefore reasonable to assume that µt = 0 over
short time periods, whilst Zα is a constant extracted from the standard Normal
Distribution tables. As such, it is the volatility of the P&L, denoted σt, that governs the
VaR for the entire portfolio.
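The closed form above can be checked numerically against the defining property Prob(ΔnPt < −VaRα,n) = α; a Python sketch with illustrative parameter values:

```python
from statistics import NormalDist

# Illustrative figures: alpha = 1%, zero mean, and a 10-day P&L
# standard deviation of the size seen in the worked example in Section 6.
alpha, mu, sigma = 0.01, 0.0, 632_455.53

z_alpha = NormalDist().inv_cdf(1 - alpha)   # about 2.33 for alpha = 1%
var = z_alpha * sigma - mu                  # VaR(alpha, n) = Z_a * sigma - mu

# Defining property: the probability of a loss beyond the VaR equals alpha.
prob = NormalDist(mu, sigma).cdf(-var)      # recovers 0.01
```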
5. Practical Methodology
Value at Risk calculates the maximum loss expected (the worst-case scenario) on an
investment over a given time period and at a specified degree of confidence. We created
two functions in Matlab for this purpose. The first function calculates the VaR of a single
asset over a given number of days. The second calculates the VaR for an entire portfolio.
We begin by looking at a single asset: given specified inputs for the position value, daily
volatility and the number of days, we then calculate the VaR for different assets.
FIGURE 3: SINGLE ASSET MATLAB FUNCTION
function ValueAtRisk = AssetRisk(Value, Volatility, Days, Confidence)
% AssetRisk  Delta-normal VaR of a single asset position.
%   Value      - current value of the position
%   Volatility - daily volatility of the asset (e.g. 0.02 for 2%)
%   Days       - holding period in days (default 1)
%   Confidence - standard normal multiplier (default 2.33, i.e. 99%)
if (nargin < 4)
    Confidence = 2.33;      % z-value for 99% one-tailed confidence
end
if (nargin < 3)
    Days = 1;               % default to a one-day horizon
end
SD = Volatility * Value * sqrt(Days);   % scaled standard deviation
ValueAtRisk = Confidence * SD;
end
Figure 3 contains a copy of the function code created using Matlab. The function itself is
denoted “AssetRisk” and takes numerical inputs for the value of the position, its daily
volatility, the number of days and the required confidence multiplier. As a precaution, the
function also applies defaults where inputs are omitted. For instance, if no confidence
multiplier is entered, the function uses a default of 2.33, corresponding to a 99%
one-tailed confidence level. In addition, a one-day period is set as the default number of
days.
Using these inputs, the function calculates the standard deviation of the change in value
over the period analysed via the following formula:

Standard Deviation = Value × Volatility × √Days

This scaling is based upon the relationship:

10-day VaR = √10 × 1-day VaR

which holds when the portfolio changes on successive days are drawn from independent
and identically distributed Normal Distributions.

Next, from the standard deviation derived above, we calculate the VaR of the position by
multiplying it by the critical value taken from the Standard Normal Distribution.
6. Results
Using the methodology outlined in Sections 4 and 5, we conducted testing of our created
function and obtained the following results:

FIGURE 4: VaR OF SINGLE ASSET

The output generated by our function is a single value. From the first test conducted, as
shown in Figure 4, we can see that for an asset valued at $10,000,000, with a volatility of
2% per day, over a 10-day period and at the default 99% confidence level, we obtain a
VaR of 1.4736e+06. This can be interpreted as: “we would expect to lose no more than
$1,473,600 over the next 10-day period, for 99 such periods out of 100, based on the
portfolio’s current composition and recent market behaviour.”
We then conducted further tests, changing each input in turn to examine its effect on the
VaR produced. In our final test, we omitted the period and confidence inputs, and our
function successfully applied the defaults of a 99% confidence level over a one-day
period.
We then progressed to a function able to calculate the VaR of an entire portfolio; its code
is shown in Figure 5. This function takes the inputs assets and position. Assets is a matrix
in which each column contains the percentage changes of one asset in the portfolio over a
period of time, whilst position is a vector of the amounts invested (held) in each asset.
Again, from these inputs, our function calculates the one-day VaR at the 99% confidence
level.
FIGURE 5: ASSET PORTFOLIO MATLAB FUNCTION
function PVar = pvar(assets, position)
% Delta-normal VaR of a portfolio; assets is a matrix of daily
% percentage changes (one column per asset), position the amounts held.
cm = corrcoef(assets);          % correlation matrix of the assets
vol = mean(assets);             % mean daily change, used as volatility
Var = zeros(1, length(vol));
for i = 1:length(vol)
    Var(i) = AssetRisk(position(i), vol(i));
end
TVar = Var.';
cvar = cm * TVar;
PVar = Var * cvar;              % quadratic form Var * cm * Var'
PVar = sqrt(PVar);
end
The code begins by calculating the correlation matrix of the asset returns using Matlab's
inbuilt corrcoef function. A mean value is then taken for each asset and used as its daily
volatility. We then use the single-asset function created previously to calculate the VaR
of each stock, through a "for loop" which computes each in turn and stores the results in
a vector denoted Var. The final stage multiplies this vector by the correlation matrix and
takes the square root of the resulting quadratic form to arrive at the portfolio VaR.
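The same portfolio calculation can be sketched in Python with NumPy (a minimal sketch with simulated returns; note that, unlike the Matlab code above, which uses the mean of each return series as its volatility, this sketch uses the sample standard deviation, the more conventional choice):

```python
import numpy as np

def portfolio_var(returns, position, confidence=2.33):
    """Delta-normal portfolio VaR.

    returns  : (observations x assets) matrix of daily percentage changes
    position : amounts invested in each asset
    """
    cm = np.corrcoef(returns, rowvar=False)        # correlation matrix
    vol = returns.std(axis=0)                      # daily volatility per asset
    var = confidence * vol * np.asarray(position)  # per-asset VaR vector
    return float(np.sqrt(var @ cm @ var))          # combine via correlations

# Simulated example: three assets, $1,000,000 invested in each
rng = np.random.default_rng(0)
rets = rng.normal(0.0, 0.015, size=(245, 3))
print(portfolio_var(rets, [1e6, 1e6, 1e6]))
```

When two assets are perfectly correlated, this reduces to the simple sum of their individual VaRs, which is a useful sanity check on the matrix algebra.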
To establish a dummy portfolio for testing purposes, we extracted data from Yahoo
Finance. Our portfolio consists of Apple, Google and Microsoft, with 245 daily
observations of each stock's closing price between 28th March 2014 and 20th March
2015. To convert the data into a workable format, we transform the prices into
logarithmic returns before importing the data into Matlab.
FIGURE 6: VaR FOR ASSET PORTFOLIO
The daily changes in the assets matrix can be seen in Figure 6.
The output value of 1.3603e+06, shown in Figure 6, equates to $1,360,300: the
portfolio's one-day VaR at the 99% confidence level. This is a general function and can
be used for portfolios of stocks carrying different positions. In the above example we
assumed equally weighted investments of $1,000,000 in each stock. To test our function
more robustly, we now change the amount invested in each stock and observe the
difference in the portfolio VaR.
FIGURE 7: ASSET PORTFOLIO RESULTS WITH DIFFERENT WEIGHTINGS
Figure 7 shows the impact on VaR of changing our positions in Apple to $10,000 and in
Microsoft to $500,000. The one-day portfolio VaR fell to $68,238, leading us to believe
that one of these two stocks carried greater risk than the rest. We then changed the
positions again, decreasing Google to $250,000 and increasing Apple back to $1,000,000;
the VaR rose to $1,362,000, confirming our suspicion that Apple was a very risky asset
compared with the other two considered.
However, it is important to consider that these conclusions only apply to the time period
analysed. In extending this period, further testing would be required before drawing any
additional conclusions.
APPLICATION OF VAR LINEAR MODEL TO PORTFOLIOS CONTAINING OPTIONS
Since the implementation of the Basel Accord and the imposition of minimum capital
requirements by regulators as insurance against market risk, VaR has become a near-
universal measure of risk across businesses and industries. Despite its popularity, and its
usefulness in providing a single risk measure easily understood by senior management,
the parametric linear VaR model considered above is applicable only to a portfolio whose
return, or profit and loss (P&L), is a linear function of its risk factor returns or its asset
returns.
The fundamental assumption underlying the linear VaR model is that risk factor returns
are normally distributed and that their joint distribution is multivariate normal. The
covariance matrix of risk factor returns is therefore all that is needed to capture the
dependency amongst them. As such, the linear VaR model can only be applied to linear
portfolios, or to those assumed to be linear in their risk factors.
For portfolios containing options, the linear VaR model can provide only an
approximation of portfolio risk. Option portfolios have too many non-linear
characteristics to be captured by a covariance VaR model. For instance, the change in the
value of an option is not linearly related to the percentage change in the value of the
underlying variable, whereas the linear model assumes that the change in the value of a
portfolio is a linear function of the percentage changes in the underlying variables. To
generate a more accurate measure of risk, it is necessary to apply one of the simulation
methods available in alternative versions of the VaR model, for instance Monte Carlo
simulation.
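This non-linearity can be illustrated numerically. The Python sketch below (using the standard Black–Scholes call price, with hypothetical parameter values of our own choosing) compares an option's actual price change for a large move in the underlying against the linear, delta-based approximation the linear model implicitly relies on:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Hypothetical option: at the money, 1% rate, 20% vol, 6 months to expiry
S0, K, r, sigma, T = 100.0, 100.0, 0.01, 0.20, 0.5
d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
delta = norm_cdf(d1)                         # option delta

dS = -10.0                                   # a large downward move
actual = bs_call(S0 + dS, K, r, sigma, T) - bs_call(S0, K, r, sigma, T)
linear = delta * dS                          # what the linear model assumes
print(actual, linear)  # here the linear approximation overstates the loss
```

Because the call price is convex in the underlying, the delta approximation here overstates the loss on a large downward move; a linear VaR model would misprice this risk accordingly.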
Having shown how our function works for a variety of asset portfolios, we now consider
portfolio VaR in more detail. As our previous functions show, calculating portfolio VaR
requires the correlation of each of the assets contained within the portfolio: we
incorporate this information by matrix multiplication between the vector of individual
asset VaRs and the correlation matrix of the portfolio. By extension, reducing the
correlations within a portfolio reduces its VaR. There are various ways to achieve this,
for instance simply choosing uncorrelated assets, or, as we propose here, reducing
correlation through diversification.
To reduce correlation by diversification, we begin by obtaining one year of daily closing
prices for 12 randomly chosen FTSE 100 stocks and transform the data into log returns.
In Matlab, we combine the stocks into portfolios of varying size (two stocks, five stocks,
and so on). We then use our function from Figure 5 as a base, extending it into the code
contained within Figure 8, to calculate the VaR for
each of these portfolios. Initially, we begin with a $12,000,000 total investment in the
portfolio, before equally weighting each investment as assets are subsequently added.
FIGURE 8: VaR FOR MULTI-ASSET PORTFOLIO
position1 = [12000000];
asset1 = [Aviva];
VAR1 = pvar(asset1,position1);
position2 = [6000000 6000000];
asset2 = [Aviva Barclays];
VAR2 = pvar(asset2,position2);
asset3 = [RoyalMail ITV BP];
position3 = [4000000 4000000 4000000];
VAR3 = pvar(asset3,position3);
position4 = [3000000 3000000 3000000 3000000];
asset4 = [Aviva Barclays BLT BP];
VAR4 = pvar(asset4,position4);
asset6 = [Aviva Barclays BLT BP BT CocaCola];
position6 = [2000000 2000000 2000000 2000000 2000000 2000000];
VAR6 = pvar(asset6,position6);
position10 = [1200000 1200000 1200000 1200000 1200000 1200000 1200000 1200000 1200000 1200000];
asset10 = [Aviva Barclays BLT BP BT CocaCola Diago ImperialTobacco ITV Next];
VAR10 = pvar(asset10,position10);
asset12 = [Aviva Barclays BLT BP BT CocaCola Diago ImperialTobacco ITV Next RoyalMail Vodafone];
position12 = [1000000 1000000 1000000 1000000 1000000 1000000 1000000 1000000 1000000 1000000 1000000 1000000];
VAR12 = pvar(asset12,position12);
VAR = [VAR1 VAR2 VAR3 VAR4 VAR6 VAR10 VAR12];
X = [1 2 3 4 6 10 12];
The results of this analysis are demonstrated graphically within Figure 9 below.
FIGURE 9: GRAPHICAL RESULTS OF VaR FOR MULTI-ASSET PORTFOLIO
As expected, the VaR of the portfolios decreased as more and more stocks were included.
If we continued this process indefinitely, we would expect the VaR to level off eventually,
as it is impossible to diversify away all risk. Examining Figure 9, we see a steady decrease
in VaR for the majority of the portfolios constructed. However, a spike exists when four
assets make up the portfolio, potentially due to the addition of an asset with very high
volatility, which increases the VaR.
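This diversification pattern can be reproduced with simulated data. The Python sketch below (using hypothetical correlated returns rather than our FTSE data, so the figures it prints are illustrative only) splits a fixed $12,000,000 equally across a growing number of assets and recomputes the delta-normal portfolio VaR:

```python
import numpy as np

def portfolio_var(returns, position, confidence=2.33):
    """Delta-normal portfolio VaR; atleast_2d handles the one-asset case."""
    cm = np.atleast_2d(np.corrcoef(returns, rowvar=False))
    vol = returns.std(axis=0)
    var = confidence * vol * np.asarray(position)
    return float(np.sqrt(var @ cm @ var))

rng = np.random.default_rng(1)
total = 12_000_000.0

# Hypothetical returns: a common market factor plus idiosyncratic noise,
# so pairwise correlations sit below one and diversification can bite
market = rng.normal(0.0, 0.01, size=(250, 1))
idio = rng.normal(0.0, 0.01, size=(250, 12))
rets = market + idio

for n in [1, 2, 3, 4, 6, 10, 12]:
    position = [total / n] * n
    print(n, round(portfolio_var(rets[:, :n], position)))
```

As in Figure 9, the VaR falls as assets are added but cannot fall below the level set by the common factor, which no amount of diversification removes.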
7. Conclusion
In conclusion, the aim of this report was to investigate and observe the VaR of a linear
asset portfolio. We began with a short history of VaR and its origins, finding that VaR
stems from both an internal and an external need for risk management, triggered largely
by the Basel Accords. We then examined the theoretical studies behind VaR itself, as
well as its variations, uses and limitations in relation to non-linear portfolios. Having
designed our own Matlab functions to calculate the VaR of both single-asset and
multi-asset portfolios, we observed a significant relationship between the correlation of
the assets in a portfolio and its VaR. Finally, we were able to reduce the VaR of our
sample portfolio by diversifying it and including more assets.