
Mark-to-Future

A framework for measuring risk and reward

Ron S. Dembo, Andrew R. Aziz, Dan Rosen and Michael Zerbs

May 2000

Algorithmics Publications
Mark-to-Future
Technical Document

Editor
Judith M. Farvolden

Editorial Assistance
Lynn Coulthard, Christine Farmery, Marguerite Martindale

Editorial Board
Andrew Aziz
Judith M. Farvolden
Dan Rosen
Stephen A. Ross
Michael Zerbs

Layout
Tammy McCausland

http://www.mark-to-future.com

© 2000 Algorithmics Incorporated (“Algo”). All rights reserved. Errors and omissions excepted. Permission to make digital/hard copy of part or all of this work for personal or classroom use may be granted without fee provided that no copy is made, used or distributed for any commercial purpose whatsoever and provided that Algo’s prior written consent has been obtained, and that Algo’s copyright notice, the title of the publication and its date appear on any such copy.

To request permission to use part or all of this work contact:
Algorithmics Incorporated
185 Spadina Avenue
Toronto, Ontario
Canada M5T 2C6
Tel: (416) 217-1500
Fax: (416) 971-6100

There are no representations or warranties, express or implied, as to the applicability for any particular purpose of any of the material(s) contained herein, and Algo accepts no liability for any loss or damages, consequential or otherwise, arising from any use of the said material(s).

Editorial

I suspect that the first trade-off between return and risk occurred when a caveman had to decide whether to spear a mastodon or run. The relevant scenarios probably sped instinctively through the neurons. Stand and throw the spear. If successful, have a fine feast; if not, get trampled. Or, run and go hungry but live to hunt another day.

We naturally deal with risky situations by contemplating the outcomes and musing about possible scenarios. Oddly, though, as we have grown more sophisticated in the world of financial risk control, we have often left this intuitive and common sense approach behind us in favour of technically more formal approaches such as “mean-variance.” There is nothing inherently wrong with these formal ways of analyzing risky situations, but it is wrong to think that they have the same universal applicability as the scenario approach. They are, after all, just special cases of the caveman’s scenario analysis.

Mark-to-Future is a full-scale realization of the scenario approach. It addresses risk and return by examining possible futures, i.e., scenarios, and by cataloguing their effects on financial positions and portfolios. It is based on the most fundamental of economic models: the state space approach to uncertainty. In this view of the world, a state of nature is a realization of uncertainty, i.e., a possible scenario. In each state of nature, assets and portfolios are marked-to-market along the future scenario, whence Mark-to-Future. Risk and reward analysis consists of tabulating all of the financial consequences of each state of nature and then making decisions by trading off the benefits in some scenarios against the losses in others.

While this is easy to grasp conceptually, it is a Herculean task to make it practical. First and foremost, we have to identify the economic drivers of future valuations. Then we have to generate a meaningful set of scenarios that spans the relevant future possibilities for those drivers. Along each scenario, assets must be priced, and they must be priced with a full understanding of all of their cash flows, of liquidity and trading issues and of the policy decisions that will be made as the scenario unfolds. Fortunately, there is a major simplification—the Mark-to-Future Cube—that allows these computational tasks to be undertaken in an efficient and easily decentralized fashion.

This book, the result of 10 years of effort by the men and women of Algorithmics, describes the practical achievements that have taken their vision, spearheaded by Ron Dembo, and turned it into a working reality that is being implemented in the marketplace today. This work is what risk monitoring and measurement in financial markets should be about and now, thanks to them, it is what risk management can actually be. The next ten years will see this approach become the standard for financial risk monitoring, analysis and regulation.

Stephen A. Ross
Guest Editor
With Thanks
The editors would like to acknowledge the individuals listed here for
their contributions. Our sincere thanks for their diligent efforts, which made the Mark-to-Future Technical Document possible.

Internal Referees

Nisso Bucay
Olivier Croissant
Ben De Prisco
Neil Dodgson
Joan den Haan
Alexander Kreinin
Asif Lakhany
Helmut Mausser
Jivantha Mendes
Leonid Merkoulovitch
Savita Verma
Hao Wang

Contributors

Ben De Prisco
Nisso Bucay
Olivier Croissant
Doug Gardner
Alexander Kreinin
Helmut Mausser
Savita Verma
Hao Wang
Preface
Ten years ago, Algorithmics was formed in response to the complex
issues surrounding risk management in banking. At that time, even the
most advanced financial institutions associated risk management with
instantaneous hedging of an options book. We recognized then that to
proactively measure and manage risk and reward, one needed to know
the total exposure of the institution across all of its global activities.
This gave birth to what is today referred to as enterprise risk
management or ERM. Since then, our software has helped to transform
the way in which approximately 100 banks, asset managers, insurance
companies and corporations, in 20 countries, are able to measure their
risk and manage their capital. At the heart of our software solution is
the Mark-to-Future methodology.
We have decided to make Mark-to-Future generally available as a
proposed new standard for risk/reward measurement. This innovative
approach to risk measurement significantly extends current
methodologies. It conforms not only to existing regulatory
requirements, but also to proposed regulatory changes. It responds
perfectly to the calls by regulators for a comprehensive framework and
not “formulaic approaches to risk” (Greenspan 1999). As the first truly
forward-looking risk/reward framework, we believe Mark-to-Future has
the potential to change the industry by profoundly affecting the way
risk management and capital allocation are practised in financial
institutions. Most importantly, since Mark-to-Future is a generic
framework and not based on a single formula or method, we believe it
will evolve with risk management practice.
Mark-to-Future allows for portfolios that may change over time and
under differing scenarios. By taking into account the effects of
changing portfolios at future points in time, a more realistic assessment
of risk is possible. Mark-to-Future provides a natural basis for linking
market, credit and liquidity risk. It also provides a unified framework
(the Put/Call Efficient Frontier) for calculating the risk/reward
trade-off of any combination of these risks.
In a Mark-to-Future world, the fundamental input is scenarios and the
emphasis is on instruments, not portfolios. Since the framework is
additive, the Mark-to-Future of a portfolio is simply the same
combination of each instrument's Mark-to-Future, regardless of the
nature of the security. Accordingly, once a decision has been made
with respect to the future scenarios and the instruments one is likely to
trade, the Mark-to-Future values of these instruments may be
computed well before the actual portfolio composition has been
determined. This makes marginal, near-time risk/reward
measurement feasible, even for large institutions. With current
simulation methods this is impossible, since prior to a risk calculation
one needs to assemble all portfolios centrally, which is extremely
time-consuming and fraught with difficulty.
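The additivity described above can be made concrete in a few lines. The sketch below, in NumPy, is illustrative only: the cube contents, array shapes and holdings are invented placeholders, not output of the published methodology.

```python
import numpy as np

# Stand-in for a pre-computed MtF Cube: N instruments x S scenarios x T time steps.
# In practice these values come from the simulation (pre-Cube) stage;
# here they are random placeholders.
rng = np.random.default_rng(0)
N, S, T = 3, 1000, 4
mtf_cube = rng.normal(100.0, 10.0, size=(N, S, T))

# Portfolio composition can be decided AFTER the cube has been computed.
holdings = np.array([250.0, -100.0, 40.0])  # position in each instrument

# Portfolio MtF table (S x T): a linear combination of instrument MtF tables.
portfolio_mtf = np.tensordot(holdings, mtf_cube, axes=(0, 0))

# A second portfolio reuses the same cube with no further simulation.
other_mtf = np.tensordot(np.array([0.0, 50.0, 10.0]), mtf_cube, axes=(0, 0))
```

Marginal (pre-deal) analysis follows the same pattern: a candidate position simply adds one more term to the combination, with no re-simulation.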
Most importantly, we believe that Mark-to-Future will break down significant barriers to the risk service bureau business and make risk management outsourcing a reality, by making it possible for institutions to obtain risk management services without divulging their portfolios. This will level the playing field and put smaller institutions on an equal footing with even the biggest of banks. The same risk management analytics, formerly very expensive and available to only the largest financial institutions, will be easily accessible. Ultimately, this will lead to far more extensive and better risk management worldwide.

Mark-to-Future will also revolutionize application development and risk architecture. The additive nature of Mark-to-Future makes truly decentralized risk architecture possible. Mark-to-Future eliminates the need for monolithic applications; instead, software can be created to meet the specific needs of each and every business unit within financial institutions. These “thin client” applications will be network-enabled and will access pre-computed Mark-to-Future data. Because applications can be built quickly and easily and do not require a large investment, they may be modified quickly to adapt to an ever-changing risk management landscape.

This document and the implementation of the ideas herein are the result of a collaborative undertaking, over a 10-year period, by an extremely knowledgeable, dedicated and talented workforce at Algorithmics. In particular, I would like to thank Bob Boettcher, Alex Kreinin, Savita Verma, Olivier Croissant, Ben De Prisco, Judy Farvolden, Helmut Mausser, Doug Gardner, Michael Durland and David Penny for their contributions to material in this book and to the methodology itself.

An earlier version of this document appeared in Dembo et al. (1999) and in summarized form in Dembo (1998a, 1998b). We have also presented the ideas herein at industrial conferences over the years. Many of our colleagues, both clients and non-clients, have provided invaluable comments on this proposed standard. In particular, I would like to acknowledge the feedback from Carol Alexander, Jean-Louis Bravard, Michael Durland, Robert Fiedler, Andrew Freeman, Arthur Geoffrion, Fabiano Gobbo, André Horovitz, Chuck Lucas, Robert Mark, Gary Nan Tie, Ariel Salama, Charles Smithson, Lawrence Tabb, Debbie Williams and Rudi Zagst.

I would also like to extend my gratitude to Stephen Ross for his advice, insight and detailed comments, which have resulted in significant improvements to this manuscript. Stephen worked tirelessly with the Algorithmics research team to refine the presentation and ensure that the material was consistent, accurate and appropriate for the target audience.

The Mark-to-Future framework is the latest milestone in an incredible journey to understand risk and reward. It is not abstract theory. The ideas presented here have been tried and tested and are available in our products today. We anticipate that this framework will evolve and grow over time and consequently welcome and look forward to your feedback. Updates to the methodology, as well as discussions, critiques and commentary, will be posted on www.mark-to-future.com.

Ron S. Dembo
Toronto, April 2000

What is Mark-to-Future?

A new, adaptable, multi-step, dynamic simulation framework that integrates disparate sources and measures of risk and reward. Algorithmics has proposed Mark-to-Future, or MtF, as a standard for evolving financial risk/reward measurement.

What outstanding problems does MtF address?

• Measures the risk/reward of portfolios that are dynamic in nature.
• Captures the impact of settlement, maturity and trading/investment strategies.
• Provides a framework for computing liquidity risk.
• Captures the interaction of market, credit and liquidity risk and reward.
• Makes simulation-based pre-deal risk analysis a reality for large institutions.
• Computes marginal market and counterparty credit risk in near time.
• Measures accurately risk/reward for all types of financial instruments.
• Addresses the needs of large and small institutions in a consistent manner.
• Breaks the major barrier to risk services—it is not necessary to disclose holdings to outsource risk calculations.

What are the business benefits of MtF?

• Consolidates market risk, credit risk, liquidity risk and ALM functions in one consistent framework for efficient capital allocation and risk management.
• Enables truly distributed risk management.
• Provides a common language for technical and non-technical risk managers.
• Adapts to changes as risk management needs evolve.
Table of Contents

A Call for a Framework ....................................................................... 1


We introduce the Mark-to-Future methodology and describe how this unifying framework
advances the standard of risk management practice and oversight.

Step 1: Defining Scenarios and Time Steps ..........................................7


The explicit choice of scenarios is the key input to a Mark-to-Future analysis. They directly
determine the future distributions of portfolio MtF values, the dynamics of portfolio
strategies, the liquidity in the market and the creditworthiness of counterparties and
issuers.

Step 2: Defining the Basis Instruments ............................................... 33


The choice of basis instruments to be simulated in the pre-Cube stage of the Mark-to-
Future framework depends on the overall business application and may involve trade-offs
between the magnitude of data storage and pricing accuracy.

Step 3: Simulating to Produce the MtF Cube ....................................... 39


Once the scenarios and time steps have been defined, the MtF Cube is generated by
simulating the MtF values over scenarios and time steps.

Step 4: Producing the Portfolio MtF Table ............................................ 41


Any portfolio or portfolio regime can be represented by mapping the MtF Cube into static
or dynamically changing portfolio holdings. The regimes may be predetermined or
conditional upon the scenario-based MtF values. Conditional regimes are used to capture
liquidity risk.

Step 5: Producing the Desired Risk/Reward Measures ........................... 55


The actual risk and reward measures chosen to characterize future uncertainty can be
arbitrarily defined and incorporated strictly in the post-Cube stage.

Step 6: Advanced Mark-to-Future Applications ..................................... 63


MtF Cubes may serve as input for applications more complex than calculating simple
risk/reward measures. Portfolio or instrument MtF tables may be used as input to an
integrated market and credit risk framework and scenario-based optimization tools.

References ........................................................................................ 75
Notation ...........................................................................................79
ARQ Review ...................................................................................... 81
List of Figures

Figure A.1: Representative MtF Cube ............................................................................. 4


Figure A.2: Schema of the MtF methodology ................................................................ 5
Figure 2.1: Mapping sequence of risk factor through to portfolio ................................ 34
Figure 2.2: Mapping sequence A .................................................................................. 34
Figure 2.3: Mapping sequence B .................................................................................. 35
Figure 2.4: Mapping sequence C .................................................................................. 35
Figure 2.5: MtF values of all-in t = 2 zero coupon instrument across scenarios .......... 36
Figure 2.6: MtF values of all-in t = 0 zero coupon instrument across scenarios .......... 37
Figure 2.7: MtF values of an all-in t = 4 zero coupon instrument across scenarios ..... 37
Figure 3.1: Generation of the Basis MtF Cube ............................................................. 39
Figure 4.1: Portfolio mapping sequence ....................................................................... 42
Figure 4.2: Mapping of a bond into a portfolio liquidation strategy ............................ 42
Figure 4.3: Mapping of products for an attribution analysis ........................................ 46
Figure 4.4: Mapping Treasury bills into a portfolio roll-over strategy ......................... 48
Figure 4.5: Mapping financial products into a portfolio delta-hedging strategy .......... 49
Figure 4.6: Portfolio mapping sequence with abstract basis ......................................... 49
Figure 4.7: Mapping zero coupon basis instruments into a swap ................................. 50
Figure 4.8: Mapping zero coupon basis instruments into zero bond ............................ 51
Figure 5.1: Mark-to-market of the portfolio over a range of DAX values ................... 57
Figure 5.2: Positive drift associated with the portfolio ................................................. 57
Figure 5.3: Changes in portfolio MtM values across Monte Carlo scenarios .............. 58
Figure 5.4: Portfolio MtF values over 100 scenario paths ............................................ 58
Figure 5.5: Dynamic MtF values over 100 scenario paths ........................................... 59
Figure 5.6: Transforming portfolio MtF values ............................................................ 60
Figure 6.1: Integrated market and credit risk in the MtF framework ........................... 66
Figure 6.2: Efficient frontier ......................................................................................... 72
List of Tables

Table 1.1: MtF scenarios for various applications .........................................................11


Table 2.1: MtF values of all-in stock instrument ...........................................................36
Table 2.2: MtF values of all-in t = 2 zero coupon instrument ......................................37
Table 5.1: Selected market risk measures ......................................................................56
Table 5.2: Selected liquidity risk measure .....................................................................56
Table 5.3: Selected reward measures .............................................................................57
Table 5.4: Transformation of MtF values to credit exposure measures .......................59
Table 5.5: Selected credit risk measures ........................................................................61
Table 5.6: Selected performance measures ....................................................................62
Table N.1: Indices ..........................................................................................................79
Table N.2: Variables and parameters .............................................................................79
Table N.3: Functions ......................................................................................................80
Table N.4: Notation for Step 5 .......................................................................................80
Table N.5: Coefficients ..................................................................................................80
A Call for a Framework
In a 1999 address, Alan Greenspan, Chairman of the US Federal Reserve Board
(1999), questioned the suitability of formulaic approaches for the assessment of risk
and espoused the development of a framework that is adaptable to the inevitable
changes in supervision and regulation:

We are striving for a framework whose underlying goals and broad strategies can remain
relatively fixed, but within which changes in application can be made as both bankers and
supervisors learn more, as banking practices change, and as individual banks grow and
change their operations and risk-control techniques. Operationally, this means that we
should not view innovations in supervision and regulation as one-off events. Rather, the
underlying framework needs to be flexible and to embody a workable process by which
modest improvements in supervision and regulation at the outset can be adjusted and further
enhanced over time as experience and operational feasibility dictate. In particular, we
should avoid mechanical or formulaic approaches that, whether intentionally or not, effectively
‘lock’ us into particular technologies long after they become outmoded. We should be
planning for the long pull, not developing near-term quick fixes. It is the framework that
we must get right. The application might initially be bare-boned but over time become
more sophisticated.

Governor Laurence H. Meyer of the US Federal Reserve Board (1999) echoed the views of many regulators who vigorously advocate increasing both the scale and the scope of public disclosure of information for various risk categories, most notably credit exposure and credit concentration. Governor Meyer identified internal bank systems as the first lines of defence in the prevention of undue risk-taking, and was quick to point out that supervision and regulation should not duplicate the efforts of increasingly sophisticated, internal best-practice risk management systems. Notwithstanding these sentiments, coupling minimum capital regulation to the internal risk-profiling system of a bank is a large and daunting task.

Our goal at Algorithmics in writing this book is to describe a standard framework for simulation-based risk management that links together market, credit and liquidity risks. Mark-to-Future is an open, flexible and extensible framework that advances best-practice risk management as advocated by Meyer and systematically satisfies the requirements set out by Greenspan. It is designed to accommodate evolving standards and, as a framework, does not “lock” into any particular technology so as not to become outmoded as technology changes. As Greenspan contends, risk management is all about the fundamentals: “It is the framework that we must get right.” In what follows, we introduce the Mark-to-Future methodology and describe how this unifying framework advances the standard of risk management practice and oversight.

Mark-to-Future is the product of over 10 years of research by Algorithmics’ financial and software engineers. However, Mark-to-Future is not just a theory—it is the basis of Algorithmics’ risk management solution, AlgoSuite. AlgoSuite has helped transform the way more than 100 banks, asset managers and corporations in 18 countries measure their risk and manage their capital. Today, as the leading software provider with the most experienced team in the industry, Algorithmics continues to develop and market enterprise risk management solutions that
address the ever-increasing complexity of financial markets and growing regulatory pressures.

Risk and reward in an uncertain future

At the heart of every investment decision is the question, “What will be the value of a given portfolio at some future time horizon?” The uncertainty inherent in virtually all investment choices implies that this question can only be effectively addressed by assessing a range of possible future outcomes. The decision to invest in a given security is made on the basis of some trade-off between the security’s contribution to overall risk and reward.

Risk and reward are typically quantified by measures computed from the distribution of future portfolio values. For example, risk is often measured by a statistic or descriptor such as variance, Value-at-Risk (VaR), worst case or regret, while reward is often measured by a statistic such as mean return, expected profit or expected upside.

Traditional approaches to quantifying this trade-off are typically constrained by narrow assumptions, implicit or explicit, regarding the shape of the distribution and the choice of appropriate statistics or descriptors that characterize the distribution. In most cases, these assumptions are inextricably bound together in the formulation of the model because a given statistic serves as the key information input. As such, the choice of a particular risk/reward measure invariably precludes arbitrary choices of portfolio realizations.

A case in point is the familiar mean-variance framework, in which the use of standard deviation as a risk measure is tightly bound to the assumption that future portfolio values are normally distributed. Unfortunately, such traditional approaches are awkward at best and, at worst, begin to unravel when the underlying assumptions are violated.

For example, the distribution of future values for a “zero-cost” spread position is clearly not normal, and the use of return as a reward measure is meaningless when the current mark-to-market value is zero. As another example, the estimation of credit risk across a group of counterparties or sectors cannot be based on a measure that assumes joint normality of credit losses; losses due to default and credit migration are usually highly skewed.

What is needed, then, is an approach that not only deals robustly with the full range of future possibilities, but is also backward compatible with older technologies such as mean-variance approaches.

Mark-to-Future focuses on simulated future scenarios. In a simulation-based framework, scenarios completely define the possible future realizations of value. Only through simulation can the choice of risk/reward measures be fully decoupled from the estimation of future outcomes. Bringing scenarios to the forefront is the key to designing a flexible and extensible risk/reward framework. Scenarios, effectively, become the language of risk and reward.

In addition, scenarios provide a natural linkage between risk factors that traditionally have been treated as separate “islands” in the risk management landscape. For example, the integration of market and credit risk can be achieved by defining the joint evolution of market risk factors and credit drivers. Each individual scenario represents a given realization of the systemic risk factors, while idiosyncratic risks may be modeled as realizations conditionally independent upon the occurrence of that scenario.

What is Mark-to-Future?

At any point in time, the levels of a collection of risk factors completely determine the mark-to-market value of a portfolio. Scenarios on these risk factors determine the distribution of possible mark-to-market values. Scenarios on the evolution of these risk factors determine the distribution of possible Mark-to-Future values (MtF values) through time. Because scenarios capture future uncertainty as time unfolds, and because Mark-to-Future has scenarios as its key information input, the framework enables the calculation of future mark-to-market values that capture future uncertainty across scenarios and time steps.
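In this scenario-centred view, any risk or reward statistic is just post-processing of the simulated distribution of MtF values. A minimal sketch of a few of the measures named above (expected profit, VaR, worst case, expected upside), using invented numbers rather than any model output:

```python
import numpy as np

# Simulated portfolio MtF values at a single horizon, one per scenario.
# These numbers are illustrative placeholders.
rng = np.random.default_rng(1)
mtf_values = rng.normal(1_000_000.0, 50_000.0, size=10_000)

current_value = 1_000_000.0
pnl = mtf_values - current_value          # profit/loss per scenario

mean_profit = pnl.mean()                  # reward: expected profit
var_95 = -np.quantile(pnl, 0.05)          # risk: 95% Value-at-Risk (a loss quantile)
worst_case = -pnl.min()                   # risk: worst simulated loss
expected_upside = pnl[pnl > 0].mean()     # reward: mean gain over winning scenarios
```

Because the measures are decoupled from the simulation, swapping VaR for, say, a regret or expected-shortfall statistic does not require regenerating the scenarios.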
Mark-to-Future is essentially about two things:

• Mark-to-Future is a robust and forward-looking framework that integrates disparate sources of risk. By explicitly incorporating the passage of time, the evolution of scenarios over time, and the dynamics of portfolio holdings over time, Mark-to-Future provides a flexible and unifying platform for assessing future uncertainty.

• Mark-to-Future is an extensible risk architecture that can be leveraged within a single organization and across several organizations. Mark-to-Future enables the decoupling of the computationally intensive simulation stage (the risk service) from the post-processing risk/reward assessment stage (multiple risk clients). It also accommodates differing views of what the future may bring, and different models for pricing assets and positions, now and in the future.

A framework for risk and reward

As a risk framework for producing simulation-based risk and reward measures, Mark-to-Future provides the following benefits:

• Mark-to-Future is an intuitive framework. Scenarios are the drivers of all future uncertainty; they are the language of risk. Individual MtF values are produced as a direct function of a given scenario whose characteristics can be explained in a straightforward manner. Individuals of very different levels of sophistication can contribute to risk/reward discussions through the focus on plausible scenarios.

• Scenarios on risk factors define distributions of MtF values. As individual risk factors can evolve jointly (and arbitrarily) over time, Mark-to-Future can capture the implied relationships (including correlations) among disparate risk factors over multiple time steps. Thus, market, credit and liquidity risks may be integrated within a common framework.

• The realization of MtF values over individual scenarios and time steps determines any risk or reward measure. Most importantly, as we will see, the framework is strictly linear across position, scenario and time dimensions. The addition of a new position, scenario or time step requires only the simulation of the new MtF values, which are then appended to the previously computed results. Previously simulated MtF results need not be recalculated; only the calculation of the risk or reward measure need be repeated.

• Mark-to-Future produces risk and reward measures that explicitly capture the passage of time. Thus, challenging issues such as portfolio path dependency, settlement and reinvestment, dynamic rebalancing and thin market effects can be addressed effectively.

• Mark-to-Future enables multiple portfolio regimes (strategies) to be evaluated. In addition to the analysis of static portfolios, the MtF Cube makes it possible to assess the risk/reward of a portfolio with holdings that change dynamically over scenarios and time.

An architecture for risk and reward

As a risk architecture for the efficient delivery of risk and reward measures, Mark-to-Future provides the following benefits:

• MtF values need only be computed once. Thus, Mark-to-Future is efficient and the results can be leveraged to support multiple applications and/or risk clients.

• Mark-to-Future can be implemented globally across an organization. The generation of MtF values serves as the common input to many risk/reward applications at different levels of the organization, from desk-level trader applications to enterprise-wide applications.

• Mark-to-Future decouples the computationally intensive simulation stage from the post-processing stage. A service bureau network can be established over which risk services can be distributed by a service provider to clients who need not divulge their holdings.

• Mark-to-Future is an open and extensible framework. The framework enables the incorporation of new scenario generation techniques, new pricing algorithms and new
post-processing applications. The framework can therefore be extended to accommodate new business lines and evolving risk management practice. In no manner is it linked or committed to any one particular risk/reward, pricing or scenario generation methodology.

Mark-to-Future methodology

At the core of the MtF framework is the generation of a three-dimensional MtF Cube. The MtF Cube is built in steps.

First, a set of scenarios is chosen. A scenario is a complete description of the evolution of key risk factors over time. In the second step, a MtF table is generated for a given financial instrument. Each cell of the MtF table contains the computed MtF value for that financial instrument under a given scenario at a specified time step. A MtF Cube consists of a set of MtF tables, one for each financial instrument of interest. Figure A.1 illustrates a representative MtF Cube.

[Figure A.1: Representative MtF Cube — a cube with dimensions N instruments × S scenarios × T time steps]

In certain applications, a cell of the MtF Cube may contain other measures in addition to its MtF value, such as an instrument’s MtF delta or MtF duration. In the general case, each cell of a MtF Cube contains a vector of risk-factor dependent measures for a given instrument under a given scenario and time step. In some applications, the vector may also contain a set of risk-factor dependent MtF cash flows for each scenario and time step. For ease of exposition, however, we focus primarily on the typical case in which each cell contains only the instrument’s MtF value.

Knowledge of portfolio holdings is not required to generate a MtF Cube: a single MtF Cube accommodates the risk/reward assessment of multiple portfolios simultaneously. A MtF Cube provides a pre-computed basis that maps into all portfolios of financial products. Since the MtF Cube contains all of the necessary information about the values of individual instruments, a portfolio MtF table can be created simply as a combination of those basis instruments. All risk/reward analyses and portfolio dynamics for any set of holdings are, therefore, derived by post-processing the contents of the MtF Cube. For example, the risk/reward assessment of a portfolio regime such as a roll-over strategy or an immunization strategy is captured strictly through the mapping of the MtF Cube into dynamically rebalanced positions.

The MtF methodology for risk/reward assessment is summarized by the following six steps, each of which can be explicitly configured as an independent component of the overall process.

The first three steps build the MtF Cube:

1. Define the scenario paths and time steps.
2. Define the basis instruments.
3. Simulate the instruments over scenarios and time steps to generate a MtF Cube.

The next three steps apply the MtF Cube:

4. Map the MtF Cube into portfolios to produce a portfolio MtF table.
5. Aggregate across dimensions of the portfolio MtF table to produce risk/reward measures.
6. Incorporate portfolio MtF tables into advanced applications.

The simulation of the MtF Cube in Step 1 to Step 3 represents the only computationally intensive stage of the process and, significantly, need be performed only once. These steps represent the pre-Cube stage of MtF processing. In contrast, Step 4 to Step 6 represent post-Cube processing.
MtF value. processing exercises, which can be performed
with minimal additional processing (Step 4 and
Key to the MtF framework is the premise that Step 5) or slightly more complex processing
A call for a framework 5

(Step 6). These steps represent the post-Cube stage of MtF processing. Figure A.2 provides a schema illustrating the six steps of the MtF methodology.

Figure A.2: Schema of the MtF methodology (pre-Cube stage: Step 1, define the scenarios and time steps; Step 2, define the basis instruments; Step 3, simulate the instruments over the scenarios and time steps to build the MtF Cube; post-Cube stage: Step 4, map the MtF Cube into portfolios/portfolio regimes; Step 5, aggregate portfolio MtF values to produce risk/reward statistics such as VaR, Regret and RAROC; Step 6, incorporate portfolio MtF values into other applications such as optimal replication and portfolio credit loss)

The decoupling of the post-Cube stage from the pre-Cube stage is the key architectural benefit of the Mark-to-Future framework. A single risk service may generate a MtF Cube (pre-Cube) that can be distributed to multiple risk clients (post-Cube) for a variety of customized business applications. This generates leverage as a common risk/reward framework, and can be widely distributed throughout the organization as well as to external organizations for user-specific analyses.

The six steps of Mark-to-Future

The objective of this document is to provide a step-by-step description of the fundamentals of the MtF framework and to demonstrate why it represents a standard for simulation-based risk/reward management. Mark-to-Future is a framework designed not merely to measure risk and reward but, significantly, to manage the trade-off of risk and reward. The remainder of this document is organized as follows.

Step 1 discusses the definition of scenarios. In the MtF framework, scenarios represent the joint evolution of risk factors through time and are, thus, the ultimate determinant of future uncertainty. The explicit choice of scenarios is the key input to any analysis. Accordingly, scenarios directly determine the future distributions of portfolio MtF values, the dynamics of portfolio strategies, the liquidity in the market and the creditworthiness of counterparties and issuers. This step discusses scenarios in risk management, their importance and various methodologies used to generate them.

Step 2 discusses the definition of basis instruments. Portfolios consist of positions in a number of financial products, both exchange traded and over-the-counter (OTC). The MtF

Cube is the package of MtF tables, each corresponding to an individual basis instrument. A basis instrument may represent an actual financial product or an abstract instrument. As the number of OTC products is virtually unlimited, it is often possible to reduce substantially the number of basis instruments required by representing the MtF values of OTC products as a function of the MtF values of the abstract instruments. This step discusses the issues surrounding the selection of basis instruments to be contained in the MtF Cube.

Step 3 discusses the generation of the MtF Cube. The MtF Cube consists of a set of MtF tables, each associated with a given basis instrument. The cells of a MtF table contain the MtF values of that basis instrument as simulated over a set of scenarios and a number of time steps. This step discusses the relationship between risk factors, scenario paths and pricing functions for the simulation of MtF values.

Step 4 discusses the mapping of the MtF Cube into portfolios and portfolio strategies. From the MtF Cube, multiple portfolio MtF tables can be generated as functions of the MtF tables associated with each basis instrument. Key to the MtF framework is the premise that a MtF Cube is generated independently of portfolio holdings. Any portfolio or portfolio regime can be represented by mapping the MtF Cube into static or dynamically changing portfolio holdings. This step discusses how portfolio MtF tables are produced from a single MtF Cube.

Step 5 discusses the estimation of risk/reward measures derived from the distribution of portfolio MtF values. The portfolio MtF table resulting from the mapping of the MtF Cube into a given portfolio or portfolio strategy contains a full description of future uncertainty. Each cell of the portfolio MtF table contains a portfolio MtF value for a given scenario and time step. The actual risk and reward measures chosen to characterize this uncertainty can be arbitrarily defined and incorporated strictly as post-processing functionality in the post-Cube stage.

Step 6 discusses more advanced post-processing applications using the MtF Cube. MtF Cubes may serve as input for applications more complex than calculating simple risk/reward measures. The properties of linearity and conditional independence on each scenario can be used to obtain computationally efficient methodologies. For example, conditional independence within a particular scenario is a powerful tool that allows the MtF framework to incorporate effectively processes such as joint counterparty migration. In addition, portfolio or instrument MtF tables may be used as input to a wide variety of scenario-based risk management and portfolio optimization applications.

The notation used in this book is summarized in the Notation chapter (page 79).
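The six steps lend themselves to a compact numerical sketch. The fragment below is purely illustrative and not the implementation described in this document: the "pricing" is a stand-in random walk and the positions are invented. It shows only how one pre-computed MtF Cube (Steps 1 to 3) can be mapped into several portfolios and aggregated into a risk statistic (Steps 4 and 5) without any re-simulation.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Steps 1-3 (pre-Cube): simulate N basis instruments over S scenarios
# and T time steps. The "pricing" here is a placeholder random walk
# around par; a real implementation would apply pricing functions to
# simulated risk factor paths.
N, S, T = 3, 1000, 10
mtf_cube = 100.0 + rng.normal(0.0, 1.0, size=(N, S, T)).cumsum(axis=2)

# Step 4 (post-Cube): map the same cube into two portfolios as linear
# combinations of the basis instruments -- no re-simulation needed.
positions_a = np.array([10.0, -5.0, 2.0])   # hypothetical holdings
positions_b = np.array([0.0, 8.0, 1.0])
table_a = np.tensordot(positions_a, mtf_cube, axes=1)  # S x T table
table_b = np.tensordot(positions_b, mtf_cube, axes=1)

# Step 5: aggregate across scenarios at the horizon, e.g., a 99%
# value-at-risk of the change in portfolio value.
pnl_a = table_a[:, -1] - table_a[:, 0]
var_99 = -np.percentile(pnl_a, 1)
print(table_a.shape, round(var_99, 2))
```

Adding a new portfolio or position is then only a linear operation on the existing cube, which is the conditional-linearity property exploited later in this document.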
Step 1: Defining Scenarios and Time Steps
Scenarios represent the joint evolution of risk factors through time and are, thus,
comprehensive descriptors of future uncertainty. They directly determine the future
distributions of portfolio MtF values, the dynamics of portfolio strategies, the liquidity
in the market and the creditworthiness of counterparties and issuers. The explicit
choice of scenarios is the key input to a MtF analysis. This chapter discusses scenarios
in risk management, their importance and various methodologies used to generate
them.
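As a concrete, entirely hypothetical preview of one generation technique discussed in this chapter, the sketch below draws Monte Carlo scenario paths for two risk factors from a multivariate normal model of daily log returns. The factor count, covariance and initial levels are invented for illustration; historical sampling, bootstrapping or subjective shifts could fill the same array.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative setup: two risk factors (say, an equity index and an
# FX rate), 1,000 scenarios and 10 daily time steps.
n_factors, n_scenarios, n_steps = 2, 1000, 10
cov = np.array([[1.0e-4, 0.5e-4],
                [0.5e-4, 4.0e-4]])  # assumed daily covariance of log returns

# Draw correlated daily log returns and cumulate them into paths:
# scenario j at time step t holds the joint risk factor levels.
log_returns = rng.multivariate_normal(
    mean=np.zeros(n_factors), cov=cov, size=(n_scenarios, n_steps))
initial_levels = np.array([1300.0, 1.45])  # today's levels (t = 0)
paths = initial_levels * np.exp(log_returns.cumsum(axis=1))

print(paths.shape)  # (scenarios, time steps, risk factors)
```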

The quality of a risk management analysis depends on the ability to generate relevant, forward-looking scenarios that properly represent the future. In the MtF framework, at a point in time, the distribution of underlying risk factors such as interest rates, equity indices, commodity prices, foreign exchange rates or macroeconomic variables is defined by the set of scenarios explicitly chosen. Thus, any desired future risk factor distribution can be incorporated directly into the analysis in a manner that is not constrained by a specific summary statistic or a single scenario generation technique. Future distributions of portfolio outcomes are then ultimately driven by future distributions of underlying risk factors. The transparent integration of various seemingly different risks, such as market, credit and liquidity risk, can be accomplished through the explicit introduction of scenarios which include the factors that are the sources of these risks.

Traditional risk/reward methodologies are based on simplifications, often parametric, of the description of the risk factor evolution that are necessary to achieve mathematical tractability. Simplifying assumptions are made despite the fact that financial risk managers now widely recognize the limitations of such techniques. Scenario-based risk management analyses overcome these limitations, but create a need for relevant scenarios. Accordingly, in the last decade there has been extensive research into techniques for generating realistic and computationally efficient scenarios for risk management. Many of these techniques are largely derived from major advancements in scenario generation in both the physical and economic sciences, starting with the development over half a century ago of tools such as statistical bootstrapping and the Monte Carlo method.

This chapter defines scenarios formally, discusses their merits, provides an overview of the characteristics of scenarios for different risk analyses and, finally, considers the main techniques used to generate scenarios including history, scenario proxies, scenario bootstrapping, subjective opinions and model-based methods.

What are scenarios?

A scenario is the basic descriptor of the evolution of the state-of-the-world over time. It gives a joint realization of all the relevant

financial and economic risk factors at a discrete set of times in the future. The period of the analysis is [0, T], where today is time t = 0 and the time horizon is t = T. A MtF table is constructed by valuing a financial instrument across each of the j = 1, …, S scenarios over the t = 1, …, T time steps. The MtF value, m_ijt, of instrument i under scenario j at time step t is a function f(•) of the levels of specified risk factors:

m_ijt = f(u_1jt, u_2jt, u_3jt, …, u_Kjt)

where u_kjt (k = 1, …, K) is the level of risk factor k under scenario j at time step t. Thus, the realization of risk factor levels over time completely defines the future realizations of instrument and portfolio values through time. In advanced applications, the scenarios may describe only the systemic realizations of instrument values. This is further described in Step 6.

The current state-of-the-world is the point of departure of every scenario. Today, the current market and economic data determine, for example:

• the mark-to-market value of all instruments in a portfolio
• the cash flows to be paid, received and reinvested
• the bid-ask spreads
• the credit quality of all obligors (issuers, counterparties, accounts) in the portfolio.

In principle, scenarios can be generated directly on the prices of all securities (accounting for the sizes of trades), the credit state of given obligors and the composition of the portfolio, etc. This would be impractical, however, for a number of reasons. For example, to generate the scenarios, correlations between all instruments are needed, and each time a new instrument is added to the portfolio, its correlation to all other instruments must be computed. Furthermore, it is over-restrictive because the characteristics of instruments change through time and it is unclear how we would construct their joint evolution from previous historical data.

In contrast, by generating scenarios on systemic risk factors, the estimation of models is more robust and the use of historical simulation is straightforward, as is obtaining the MtF value of a new instrument that is completely consistent with the prices of the existing instruments. Using this approach, it is not necessary to define explicitly the correlations among all instruments, since their co-dependence is defined implicitly by the scenarios.

Scenario selection differs considerably from forecasting. A forecast is a prediction that a single scenario will occur, and its accuracy is therefore crucial. However, no one is able to predict accurately specific financial events in the future. As Nobel Laureate Paul Samuelson observed, "analysts successfully predicted 12 out of the last four major recessions" (Samuelson 1991).

The goal in selecting scenarios in the MtF framework is to span the range of future possible events, and not necessarily to forecast that any of these events will actually occur. Thus, in a given risk analysis, we assume the existence of a set of scenarios that may occur in the future. If the scenario set is rich enough, one of these events will actually occur, but at the start of the period, there is uncertainty as to which one it will be.

It is also assumed that we have a perception of the likelihood of these scenarios. The likelihood, or probability, p_j (j = 1, 2, …, S), of a given scenario j is not needed to compute MtF values. Probabilities are required only at a later stage when risk/reward statistics are computed. In addition, the choice of scenarios need not necessarily depend on the probability assigned to them. This separation of probabilities from the scenarios themselves allows MtF values to be computed once and subsequently used as inputs into many different risk analyses.

Scenarios must span a wide range of possible outcomes and they must also extend over a horizon, or multiple horizons, of appropriate length. The length of the horizons varies with the problem. While the appropriate horizon for a market risk analysis might be one to 10 days (or longer depending on the liquidity of the position), estimating counterparty exposure profiles may require multiple time horizons over 10 years or more.
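The relation m_ijt = f(u_1jt, …, u_Kjt) can be sketched for a single instrument. In this hypothetical example the only risk factor (K = 1) is a short rate simulated as a simple random walk, and f discounts a two-year zero-coupon bond; any pricing model and any scenario generator from this chapter could be substituted.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Risk factor u_jt: one short rate per scenario j and monthly time
# step t, here an illustrative random walk around 5%.
S, T, dt = 500, 12, 1.0 / 12.0
rates = 0.05 + rng.normal(0.0, 0.002, size=(S, T)).cumsum(axis=1)

# Pricing function f: value of a zero-coupon bond maturing in two
# years, discounted continuously at the simulated rate. Evaluating f
# cell by cell yields the instrument's MtF table m_jt.
maturity = 2.0
times = dt * np.arange(1, T + 1)          # t = 1, ..., T in years
mtf_table = np.exp(-rates * (maturity - times))

print(mtf_table.shape)
```

A full MtF Cube is then simply a stack of such tables, one per basis instrument.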

Why use scenarios?

Using a scenario-based approach to risk/reward assessment offers many advantages.

1. Accurate description of the future. A wealth of techniques can be used to generate scenarios from history, advanced models or subjective views of the world. A risk management framework based on scenarios overcomes the limitation of standard analytical methodologies that may require onerous simplifications of the description of the risk factor distributions for mathematical tractability.

2. Transparency and communication. Using scenarios to describe the future possible states-of-the-world makes the risk management process transparent, descriptive and manageable. A thorough understanding of scenarios is essential for good risk/reward management. In order to interpret a risk measure, it is important to understand the nature of the scenarios used to produce it. Furthermore, the end-users of the risk information should help sanction the scenarios. Risk managers understand scenarios, and their market experience and intuition are key to defining them. Scenarios allow individuals from widely different backgrounds and perspectives to engage in meaningful discussions on risk.

3. Integration of risks. Regulators are calling for an integrated framework for measuring market, credit and liquidity risk. These disparate sources of risk are linked quite naturally through the scenarios of the MtF framework. The integration occurs at the scenario level. Scenarios on simultaneous changes in all systemic factors affecting market, credit and liquidity states naturally provide correlated, consistent MtF values from which risk measures that link these sources of risk can be calculated. Various integrated measures of risk and return can then be obtained, simplifying the integration of risk/reward measurement for capital allocation purposes.

4. Computational efficiency. Scenario-based risk analyses are computationally intensive, but scenarios can be used efficiently in the MtF framework, which exploits some simple, yet powerful, mathematical properties:

• Conditional linearity. Conditional on a scenario, the value of a portfolio is a linear function of both the value of the instruments and the number of positions held in each instrument. This facilitates the fast pricing of a large number of portfolios simultaneously, as is necessary, for example, in applications that distribute risk management reports to various users. This property can also be exploited in the implementation of real-time incremental analysis of new deals, dynamic portfolio strategies and portfolio optimization.

• Conditional credit independence. Conditional on a given scenario on all systemic factors, credit events are independent. This property can be effectively exploited to integrate market and credit risk and to compute portfolio credit capital. This is generally more efficient than a direct simulation of credit events.

• Conditional independence of specific risks. Conditional on a scenario on all systemic factors, equity prices are independent. This can be used effectively, for example, to evaluate the risk of equity portfolios and determine specific risk. Conditional independence is indeed a general property of multi-factor models commonly used to price corporate bonds and other securities.

Scenarios in risk management applications

Scenarios are the critical elements of a MtF risk analysis. Hence, they must span a wide range of possible outcomes. In most implementations, a distinction is made between extreme scenarios that are used for stress testing and scenarios based on "typical" market conditions. However, both types of scenarios have equivalent status

(but not necessarily weights) in the MtF framework. All scenarios are part of the same consistent MtF Cube. A distinction is made only at the post-Cube stage of MtF processing.

The choice of scenarios typically depends upon the risk management application. Table 1.1 presents a summary of scenario requirements (type of scenario, risk factors, horizon and number of time steps) that depend on the risk management area and desired application. These are general guidelines on the characteristics of the scenarios required, and not necessarily the current "best practices." In most applications today, rigorous scenario-based risk management is still not the norm.

In general, scenarios in market risk applications are described by financial market risk factors that include various zero coupon term structures, equity indices, exchange rates, commodity prices and implied volatilities. Typically, a market risk analysis is performed over a single, short horizon (one to 10 days). Term structures and volatility surfaces are modeled with a large number of factors (e.g., the RiskMetrics dataset contains 14 nodes per term structure). Statistical analysis, such as VaR, requires the description of the joint distribution of the factors at the given horizon. Both historical and Monte Carlo scenarios are widely used. Scenarios for asset management applications are similar but extend over longer horizons, ranging generally from one month to one year, and may include multi-time step simulations. Accordingly, asset managers also customarily use factor models to reduce the factors in their analysis to a more manageable number.

In contrast, asset liability management (ALM) and credit risk analyses require the generation of scenarios over multiple steps spanning long horizons, sometimes extending to the maturity of all outstanding positions. An ALM analysis is concerned mainly with scenarios describing the evolution of interest rate risk factors (and perhaps foreign exchange risk factors) affecting the cash flows of trading and banking positions. Since the values of derivatives are extremely sensitive to the market conditions, the analysis of counterparty credit exposures in a derivatives trading book requires scenarios on all market factors affecting their value over time (Aziz and Charupat 1998). Portfolio credit risk models, on the other hand, are concerned with the joint defaults and credit migrations of all obligors, and hence require scenarios on the credit drivers that influence obligor creditworthiness. For example, CreditMetrics (J.P. Morgan 1997) defines country, region and sector indices as credit drivers for the asset value and credit quality of each obligor.

Integration of market and credit risk through scenarios

It is still common practice to treat market and credit risk separately. This separation has a major impact when measuring counterparty exposures, portfolio credit risk and specific risk for bonds. This is further described in Step 6.

Ultimately, a comprehensive framework requires the full integration of market and credit risk. In the Mark-to-Future framework, the integration of market and credit risk is achieved through the scenarios, which explicitly define the joint evolution of market risk factors and credit drivers. Market factors drive the prices of securities, and hence exposures, whereas credit drivers are systemic factors that drive the creditworthiness of obligors in the portfolio. Factors are general and can be microeconomic, macroeconomic, economic and financial.

Consider, for example, the counterparty exposure to a position in a corporate bond. The present value of the bond is a function of the Treasury curve and an additional credit spread that compensates for the default risk of the corporate issuer. The credit spread, in turn, depends on the credit quality of the corporate issuer. If a credit downgrade occurs, cash flows must be valued using a higher credit spread. Market and credit risk are consistently linked by incorporating in the scenarios all the systemic information on the interrelated outcomes for the issuer's credit rating and for the Treasury curve. An illustration of integrated market and credit scenarios based on the integrated model of Iscoe et al. (1999) is presented on page 68.

Scenarios and liquidity risk

Although the cost of liquidity is usually reflected in bid-offer spreads, liquidity risk can also be
Market risk. Factors (50–1,000): IR (government, spreads), foreign exchange, equity, commodities.
• Statistical (VaR, RAROC, risk contributions): horizon 1–10d, 1 time step; historical scenarios (100–500) or MC with normal and fat-tailed distributions (1,000–10,000).
• Stress testing (worst-case, what-if): horizon 1–30d, 1–10 time steps; extreme scenarios: historical "crashes", subjective (5–50).
• Sensitivity analysis (deltas, etc.): horizon 0–1d, 1 time step; shifts in a small number of factors (10–100).

Asset/fund management. Factors (20–500): IR (government, spreads), foreign exchange, equity, commodities.
• Statistical (VaR, risk contributions, risk-adjusted returns): horizon 30d–1y, 1–10 time steps; historical or bootstrapping (100–500), MC with multi-factor processes (1,000–10,000).
• Stress testing (worst-case, what-if): horizon 30d–1y, 1–10 time steps; extreme scenarios: historical "crashes", subjective (5–20).

Asset liability management. Factors (20–100): IR (government, spreads), foreign exchange.
• VaR (over time), cash flow at risk, earnings at risk: horizon 6m–30y, 20–100 time steps; historical or bootstrapping (100–500), MC with multi-factor processes (500–5,000).
• Stress testing: horizon 6m–30y, 20–1,000 time steps; extreme scenarios: historical "crashes", subjective (5–20).

Counterparty credit exposures. Factors (50–100): IR (government, spreads), foreign exchange, equity, commodities.
• Statistical (expected and "VaR" exposures): horizon 1–30y, 10–40 time steps; MC with multi-factor processes (1,000–5,000).
• Stress testing: horizon 1–30y, 10–5,000 time steps; extreme scenarios: historical "crashes", subjective, scenario banding (10–50).

Portfolio credit risk. Factors (50–200, systemic market and credit factors): IR (government, spreads), foreign exchange, equity, commodities, macroeconomic factors.
• Capital (expected losses, Credit VaR, RAROC, risk contributions): horizon 1–10y, 1–30 time steps; MC with multi-factor processes (1,000–5,000).
• Stress testing: horizon 1–10y, 1–30 time steps; extreme scenarios: historical "crashes", subjective (5–20).

Table 1.1: MtF scenarios for various applications

modeled indirectly by designing joint scenarios on daily trading volumes and market risk factors such as price and volatility (see, for example, Yung 1999a). In such a model, portfolio positions are liquidated conditional on the simulated outcomes for these risk factors in each scenario. For example, when simulated trading volumes are high, positions are liquidated faster than when trading volumes are low. If portfolio holdings are conditional on scenario outcomes, it must be possible to simulate the value of portfolios that evolve over time. An example of simulated trading volumes is presented on page 44.

Joint scenarios on market and liquidity risk for equity portfolios might include equity indices and specific risk factors, which capture market risk, as well as bid-ask spreads and trading volumes, which determine liquidity risk. Joint scenarios on market, credit and liquidity risk for emerging market bonds, for example, may cover US Treasury rates, credit risk drivers such as an emerging market index, sovereign spreads and bid-ask spreads.

Scenario generation methodologies

There are many ways to generate scenarios for stress testing or statistical risk measurement. Systematic scenario generation methods can be broadly classified into historical and model-based methods. Historical scenario generation methods are sometimes referred to as non-parametric methods, while model-based methods are referred to as parametric methods. In addition to these systematic methods, risk managers also customarily use ad hoc or subjective methods, which are not based directly on a model or history but rather on experience, intuition or a specific need. Examples of these are an opinion that the NASDAQ will plummet 30%, a "what-if" analysis to investigate the impact of a 100 basis point move in a GBP interest rate book, and a sensitivity analysis of a position performed by shocking each node in an interest rate curve separately by one basis point.

Whichever method is chosen, with careful analysis and a broad view on where to seek pertinent information, it is generally possible to generate reasonable, forward-looking scenarios that span the range of possible future states-of-the-world over the time horizon relevant for risk/reward measurement. Ideally, the scenarios in a risk analysis should satisfy three principles:

1. Scenarios must "cover" all relevant past history. As Mark Twain once said: "history does not repeat itself, but it rhymes." Historical scenarios satisfy this principle directly. For model-based scenarios, this means that history must be used explicitly to estimate the parameters of the model. Of course, the process of determining the relevant history is largely subjective. The appropriateness of using one or five years of historical data (either in a historical analysis or to calibrate a model) depends on the application and on the analyst's opinion of what periods of history are relevant. Although theories can be applied to determine the amount of data required for robust estimation or to exclude (include) outliers, in the end there is no simple formula to determine the relevance of history for the future and, hence, the subjective assumptions must be made as explicitly as possible.

2. Scenarios should also account for events that have not occurred in the past, but which may be plausible under the current circumstances. In the case of historical scenarios, this requires the use of history in creative ways. First, the history of each risk factor can be augmented by including more distant events from the risk factor's own history than would be a part of a standard, fixed calibration period. Second, the history of one risk factor can become a proxy for scenarios on another factor. For model-based scenarios, certain parameters of a model account for current information that is not rooted in history, such as a new political development or special market conditions.

3. Scenario generation methods must be validated, whenever possible, out of sample. As an example, in the case of a market risk analysis, the validity of the scenario set generated yesterday can be tested by

comparing today’s mark-to-market values mean that a 6-sigma crash is a very unlikely
with yesterday’s Mark-to-Future values. By historical event?
repeating this exercise over an extended
To answer this question, assume that daily
period, it is possible to determine the
returns are normally distributed and
performance of various methods. In the risk
independent. The probability of a single 6-sigma
management literature, this is commonly
referred to as backtesting, and has become an event on any given day is about 10-9, significantly
important regulatory requirement. There are less than 0.04%. The probability of one event in
various statistical backtesting methods (Basle 84 years is then about 0.002%.; the probability of
Committee on Banking Supervision 1997, nine 6-sigma events occurring in 84 years is
Lopez 1999). Note that in the case of longer about 10-43. Whereas according to history, 6-
simulation horizons, out-of-sample testing sigma events are rare, a simple parametric model
becomes more difficult, but it is still possible grossly underestimates their probability.
to apply advanced statistical methods (Lopez
and Saidenberg 2000) and to perform some In established markets where no fundamental
checks and balances. Finally, in practice it is clearly not possible to perform out-of-sample validation of extreme scenarios for stress testing in isolation.

In the following sections we review the basic principles of historical and model-based methods.

Historical scenario generation methods

Historical time series over long periods reflect a very wide range of scenarios. Since these events have occurred previously in the market, they are certainly candidates for possible future events. However, significant events occur infrequently and a long history may be required to produce a set of scenarios that spans the possible range of outcomes. The longer the time series, the wider the range of possible outcomes that can be taken into account.

Neglecting the possibility of a crash scenario is clearly not prudent in a comprehensive risk analysis. Yet, if only recent history is considered, or if the model is restricted to normal distributions that do not include fat tails, this possibility will probably be excluded. In the MtF framework, a crash scenario can be incorporated explicitly. Risk analyses can include both historical scenarios to account for typical events and stress test scenarios to account for atypical events.

For example, in the last 84 years, there have been nine crashes in the Dow Jones Industrial Index that exceeded six standard deviations (6-sigma) in a single day (based upon the previous 250 business days). From historical experience, the actual likelihood of a crash of that magnitude occurring on any given day is 0.04%. Does this likelihood apply equally well to tomorrow? If no fundamental shifts in the underlying structure have recently occurred, a long history may be an adequate source of scenarios describing typical market conditions. However, shifts can and do occur, and the future is sure to contain events outside this range. When history is the sole source of scenarios, it is implicitly assumed that between now and the horizon there will be no fundamental shift in the underlying forces in the market, other than those already observed during the historical period chosen. This precludes the incorporation of new views that could be extracted from other events or by other techniques.

Consider a second example. At the end of March 1998, the Canadian dollar was trading at approximately 0.708 USD. The Canadian dollar had last touched 0.700 USD in 1994. A risk analysis of a portfolio with Canadian dollar exposure based on a standard historical time series would have provided a false sense of security to the holders of that portfolio. In fact, the drop in the Canadian dollar from 0.708 to 0.633 USD that occurred between March and August 1998 represented a three standard deviation move with about a 0.02% likelihood of occurring. (The standard deviation of daily log returns with zero mean is calculated based upon one year of daily observations between March 1997 and March 1998 and scaled to a five-month horizon by the square root of time.)

Beyond that, how would we understand the risk or return of a stock that has not yet been issued, or of the euro before it became a traded currency? How can we measure the risk of Internet stocks when there is insufficient history in this market on which to base historical scenarios?
14 Step 1: Defining scenarios and time steps
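The square-root-of-time calculation behind the Canadian dollar example can be sketched in a few lines of Python. The daily volatility figure below is a hypothetical stand-in (the document does not report the value estimated from the March 1997 to March 1998 data); the spot rates and horizon follow the example above.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

sigma_daily = 0.0031    # assumed daily volatility of CAD/USD log returns
horizon_days = 105      # roughly five months of business days

# Scale the daily standard deviation to the horizon by the square root of time.
sigma_horizon = sigma_daily * math.sqrt(horizon_days)

# Log return of the observed drop from 0.708 to 0.633 USD.
move = math.log(0.633 / 0.708)

n_sigmas = move / sigma_horizon
likelihood = normal_cdf(n_sigmas)   # chance of a drop at least this large
print(f"{abs(n_sigmas):.1f} sigma move, likelihood {likelihood:.3%}")
```

With this assumed volatility the observed drop works out to roughly three and a half standard deviations, in line with the likelihood of a few hundredths of a percent quoted above.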
In situations such as these we must go beyond the history of the risk factors in order to generate future events that reasonably span the range of possible outcomes. One way to add to the history of a market is to use the history for similar situations in different markets as a proxy.

During World War II, a team of three eminent professors was asked to make long-range weather forecasts for the Air Force. It did not take them long to realize that they had no idea how to go about this, but one of them remembered that each year the Farmer's Almanac published detailed and somewhat reliable predictions of the weather for the upcoming year. Unfortunately, when the professors asked how the Almanac made the weather prediction, the publisher refused to say!

Rebuffed but inspired, the professors set about to reverse engineer the process and find out how the Almanac's predictions were made. After some effort, they were convinced they had figured it out: to generate a forecast for, say, the year 2001, the summer of 1954 was combined with the fall of 1973, the winter of 1937 and the spring of 1982. By combining events that had actually occurred they were able to generate a plausible forecast for 2001. Bootstrapping was born when numerous such forecasts were generated randomly, creating a scenario set of weather forecasts.

Scenario proxies and bootstrapping

If standard historical methods would not have captured the exchange risk in the Canadian dollar in March 1998, what methods might have been used to determine the six-month risk in the CAD/USD exchange rate? Novel scenarios on future Canadian dollar rates for August and September 1998 could have been extracted from an AUD/USD time series. Using the AUD/USD exchange rate as a proxy for the scenarios on the CAD/USD rate could have been justified on the basis of perceived similarities between the Australian and Canadian macroeconomic conditions at that time. Several times over the last decade, for example in 1987, 1989 and 1993, the exchange rate on the Australian dollar dropped by 10% or more within a short number of months. Scenarios on the AUD/USD exchange rate would have provided a sensible forward-looking view of future Canadian dollar behaviour, including a Canadian dollar scenario describing a drop to 0.633 USD or below.

A useful way to address the sentiment or economic environment of a given market is to look for history in a different market or region that may have gone through similar situations in the past. We then take these conditions and apply them today as scenarios on the desired market. Such scenarios are generally called proxies.

Consider Brazil as another example. Early in 1999, market risk managers in Brazil asked, "How can we calculate Value-at-Risk figures for our portfolios when we don't have any historic data?" (Locke 1999). One possibility would have been to consider the market turbulence caused by the devaluation of the ruble in August 1998 as a proxy for how events in Brazil might unfold should the real be allowed to float (Yung 1999b).

Bootstrapping can be performed in more sophisticated ways to obtain conditional scenarios. For example, high volatility scenarios can be generated by sampling repeatedly from a history filtered to exclude periods of low volatility.

Bootstrapping can also be used in conjunction with scenario proxies. Previous markets that have enjoyed exceptional growth, such as Nokia A stock (an emerging industry) and the gold rush in the 1970s, can serve as proxies for upside scenarios to determine the risk in a position in a new Internet stock (Yung 1999c). The 1929 stock market crash that precipitated the Great Depression and Black Monday of October 1987 can serve as proxies for downside scenarios. Bootstrapping can then be used to scramble and rearrange daily returns from the applicable proxy periods. Further randomization can be achieved by randomly selecting the starting point for a scenario series.

Model-based scenario generation

Research in computational finance over the last decade has resulted in powerful models for scenario generation applicable to risk management. By adding theory and structure to
Proxies—Brazil's Real Crisis

In January 1999, US-based investors were keenly interested in understanding how events in Brazil might unfold should the real be allowed to float: devaluation of the real by the central bank would cause the rates on Brazilian Brady bonds to soar. Just as the Russian ruble devaluation was the catalyst for market turbulence in August 1998, might pressure building up behind the real have a similar impact in January 1999?

Parallels to Russia

In August 1998, Russia announced the devaluation of its currency and temporary default on its government debt. Russian stocks fell by more than 35% while the ruble tumbled by more than 50%. Yields on emerging market debt soared, including those on Brazilian Brady bonds, while those on US Treasury bonds reached historic lows. Spreads on the Russian Ministry of Finance Venesheconombank bonds doubled from their usual norm and the US Treasury zero curve dropped by more than 6% from August 14 to 28. The flight to quality that commenced as investors rebalanced their portfolios caused liquidity in emerging markets to evaporate. Once confidence in Russia waned, capital fled and financial markets were feverish. Credit spreads widened, equity markets declined, volatility increased and bid-offer spreads on emerging market debt widened precipitously.

[Figure: Changes in key rates (August 14–28, 1998): devaluation of the ruble; Brady spreads; US 30-year Treasury rate]

History repeats itself

Using the period of the ruble devaluation as a proxy for current market scenarios, the two weeks from August 14 to 28, 1998 become two weeks in January 1999: the US Treasury zero rate and the Brazilian Brady bond spread for those 10 business days are mapped to the US rate and the Brazilian Brady spread, respectively, from January 11 to January 25.

    RUSSIAN CRISIS                           BRAZILIAN CRISIS
    US Treasury zero rate (August)           US Treasury zero rate (January)
    Brazilian Brady bond spreads (August)    Brazilian Brady bond spreads (January)

    Mapping the ruble to the real

Compared to the 95% VaR estimate, the extreme event proxied by the Russian scenario confirms suspicions of the magnitude of the potential risk—it forecasts a 57% decline in portfolio value over this two-week period. This estimate foreshadows the actual behaviour of the Brady portfolio.

[Figure: Evolution of portfolio value (January 11–26, 1999): actual vs. Russian stress test vs. VaR, with annotations for the devaluation of the ruble, Russia's central bank yielding support of the ruble, and the announcement of the details of Russia's debt-restructuring program]

Brazil experienced a de facto devaluation of its currency on January 12 when the central bank stopped defending the real. On January 15, 1999, the Brazilian government let the real float and raised interest rates; investors' relief at the government's responsiveness to the crisis is evident from the actual behaviour of the Brady portfolio. This sentiment was comparable to that felt by investors when Russia announced the details of its debt-restructuring program on August 24, 1998.

[Figure: Impact of government intervention (January 15–26, 1999): actual vs. Russian stress test]
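The mapping described in the sidebar, where each day of the Russian crisis becomes a day of the Brazilian scenario, amounts to replaying observed proxy-period changes on top of today's levels. A minimal sketch, with fabricated numbers standing in for the actual August 1998 data:

```python
def apply_proxy(current_level, proxy_daily_changes):
    """Evolve a risk factor along a proxy scenario of observed daily changes."""
    path = [current_level]
    for change in proxy_daily_changes:
        path.append(path[-1] + change)
    return path

# Fabricated daily Brady-spread changes (in basis points) standing in for
# the moves observed during the August 14-28, 1998 proxy window.
ruble_crisis_moves = [40.0, 120.0, -30.0, 200.0, 85.0, -50.0, 150.0, 60.0, 90.0, 110.0]

# Replay them from an assumed January 1999 Brazilian Brady spread level.
scenario_path = apply_proxy(700.0, ruble_crisis_moves)
```

The same replay would be applied to the US Treasury zero rate, so that the two factors move together exactly as they did during the proxy window.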
the way scenarios are generated, these models allow us to

• extend what has been observed, using all the historical and "current event" information available
• generate a large number of scenarios in the sample that can lead to reliable statistical risk measures
• move from a mere description of what could happen to an explicit explanation of correlations and causality that explains the sources of risk
• construct tools for forecasting and for the computation of conditional events.

Models of the evolution of financial risk factors are used for both pricing of derivative securities and risk management. However, the models traditionally built for derivatives pricing may not be appropriate for risk management for various reasons:

• Risk management scenarios must use the true probabilities of events and are calibrated using history and current event information. In contrast, pricing models generally use what are commonly called the risk-neutral probabilities. This is a mathematical convenience that allows the direct incorporation of a no-arbitrage condition while expressing the price of a derivative as an expectation. These models are calibrated to fit currently available prices of traded securities.
• Risk management scenario generation models are generally high dimensional. Since the emphasis is on understanding the joint behaviour of many instruments in a portfolio, they must incorporate the relationships between all factors. Pricing models, on the other hand, generally use a small number of factors; for example, most implementations of term structure models use one or two factors. Also, since they are used to price single securities, these models usually do not incorporate the relationships to other factors.
• Risk management focuses generally on the tails of portfolio distributions; this is where the action is for a risk manager! Hence, a strong emphasis on modeling tails is required. Pricing derivatives, on the other hand, requires the computation of expectations of distributions and may be less concerned with extreme events.

As with pricing models, however, risk management scenario generation models must be financially consistent and build on existing mathematical and financial theory. For example, a model for interest rate scenarios that permits negative forward rates would permit static arbitrage and could result in unreasonable MtF values for certain securities.

A scenario generation model consists of three separate parts:

1. The risk-factor evolution model is a mathematical model that describes the joint evolution of the risk factors in the future. The model is chosen for its mathematical tractability, the underlying financial theory, its asymptotic properties and its ability to explain causality, as well as other scientific and aesthetic properties of good modeling.

2. The model calibration method refers to the methodology used to estimate the parameters of the model. Generally, models are calibrated using historical information, incorporating current events and using the experience of the modeler.

3. The sampling methodology is used to obtain random samples from a model. The objective of scenario generation in risk management is not to provide a single forecast of the risk factor levels in the future. Rather, we require methodologies to sample various scenarios, generated according to the evolution model, and assign probabilities to them. Thus, at the heart of model-based scenarios is the generation of random numbers. Note that, in addition to sampling randomly from a model, it is possible also to sample specific likely scenarios, extreme events and conditional scenarios on the realization of a small number of variables.

The most general method for sampling a large number of random scenarios from a model is the Monte Carlo method. Monte Carlo simulation is
Proxies and Bootstrapping—Amazon.com

The impressive 34% return of the S&P 500 since January 1998 pales in comparison to the performance of Internet stocks. The AMEX Internet Index, which tracks 50 such stocks trading on the New York Stock Exchange, rose by more than 200% during the same period. Gains experienced by blue-chip Internet stocks were even more significant. Overall, share values of companies like Amazon.com, Yahoo!, eBay and AOL have grown by more than 1,000% since the start of 1998. Internet stocks have experienced exceptional growth. A position of five million shares in Amazon.com purchased in early 1998 was worth only 65 million USD; a year later that position was worth 700 million USD.

[Figure: AMEX Internet Index vs. S&P 500 (Jan. 1998–Mar. 1999)]

Have previous markets enjoyed growth similar to Internet stocks today? When Nokia first entered the telecommunications industry in the US, its stock enjoyed growth akin to what Internet stocks are experiencing today: the Nokia A share trading on the NYSE has risen by more than 20 times its value since January 1994. The gold rush in the late 1970s also has uncanny similarities: when the US Federal Reserve Board reduced interest rates in response to a threatened debt default by Mexico in 1979, the price of gold took off, rising 300% in six months.

[Figure: Prices of Nokia A and gold vs. AMEX Internet Index: (a) Nokia A shares (Jan. 1994–Mar. 1999) vs. AMEX Internet Index (Mar. 1998–Mar. 1999); (b) price of gold (Jun. 1979–Mar. 1999) vs. AMEX Internet Index (Mar. 1998–Mar. 1999)]
[Figure: Various stock market crashes: 1929 Stock Market Crash (Dow Jones Industrial 1925–1935); 1980 Gold Crash (gold prices 1973–1999); Black Monday of October 1987 (Dow Jones Industrial 1985–1990); Japanese Stock Market Crash in 1990 (Nikkei 225 1981–1999)]

Mark the position to future

The three-year period following each historic market peak serves as the proxy for downside risk, while the one-year period that precedes each historic market peak serves as the proxy for further growth scenarios. Downside scenarios are bootstrapped using daily returns generated from the downside proxies, upside scenarios are bootstrapped using daily returns generated from the upside proxies, and volatility scenarios are bootstrapped by combining returns from the upside and downside proxies. This process creates a set of 200 scenarios over a horizon of 100 days.

Assuming a hold strategy, the Mark-to-Future values of one share of Amazon.com indicate that the stock can potentially gain 120% in the next 100 days. However, it can also lose more than 85% over the same period. This translates into a loss of more than 587 million USD in just five months on a position of five million shares!

Would a trading strategy mitigate risk? Five million shares represent 10% of Amazon.com's total market float. The average daily volume for Amazon.com is only seven million shares; it is common for daily trading volume to be as low as 10% of average volume. This lack of liquidity poses additional concerns—instantaneous disposition of the entire holdings is not likely. A stress test model must forecast the risk specific to the Amazon.com shares, as well as the risks of trading in this illiquid market.
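The scenario construction described above can be sketched as follows. The proxy return pools, spot price and random seed are fabricated stand-ins; the real exercise draws its pools from the historical crash and growth periods shown in the figures.

```python
import math
import random

def bootstrap_paths(returns_pool, horizon, n_paths, rng):
    """Resample daily returns (with replacement) to build scenario paths."""
    return [[rng.choice(returns_pool) for _ in range(horizon)]
            for _ in range(n_paths)]

rng = random.Random(5)

# Fabricated daily log returns standing in for the proxy periods.
upside_pool = [0.01, 0.03, -0.005, 0.02, 0.015]      # pre-peak year
downside_pool = [-0.02, -0.05, 0.01, -0.03, -0.01]   # post-peak three years

paths = (bootstrap_paths(upside_pool, 100, 100, rng) +
         bootstrap_paths(downside_pool, 100, 100, rng))   # 200 scenarios

# Mark-to-Future values of one Amazon.com share (assumed spot of 140 USD)
# at the 100-day horizon, one value per scenario.
mtf = [140.0 * math.exp(sum(p)) for p in paths]
```

Combining returns from both pools in a single path, as the sidebar does for its volatility scenarios, is a straightforward variation.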
Two bid/ask spread curves depict normal and extreme liquidity risks. The first curve portrays limited liquidity risks by assuming that sales can be transacted without moving market prices by more than 20 basis points. The second curve portrays a scenario of high liquidity risk corresponding to a discount of 20% to the market price of Amazon.com for a trading volume of 700,000 shares.

[Figure: The impact of liquidity (the next 100 days)]

To account for a light trading day in which daily volume is limited to 700,000 shares, the strategy is conditioned on the movement of the Amazon.com share price. Whenever Amazon.com falls more than 10% below its current value, the position size is decreased by 250,000 shares. When the price drops more than 20% below its current value, the disposition of shares is increased to 500,000. For simplicity, proceeds from the sale remain as uninvested cash.

The dynamic trading strategy yields benefits. Under the worst downside scenario, the hold strategy forecasts a potential loss of 85% over the next 100 days. By locking in some of the gains, potential losses are limited to 28% under normal liquidity conditions and 38% in a highly illiquid market.
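One possible reading of the disposition rule, sketched below. The per-day evaluation of the thresholds and the handling of repeated triggers are our own assumptions, since the text does not spell them out, and the price path is fabricated.

```python
def dynamic_strategy(path, shares=5_000_000):
    """Threshold-based disposition rule: sell 250,000 shares on days when the
    price is more than 10% below its starting level, 500,000 when it is more
    than 20% below; proceeds remain as uninvested cash."""
    start, cash = path[0], 0.0
    for price in path[1:]:
        drawdown = 1.0 - price / start
        if drawdown > 0.20:
            sell = min(500_000, shares)
        elif drawdown > 0.10:
            sell = min(250_000, shares)
        else:
            sell = 0
        shares -= sell
        cash += sell * price
    return shares, cash + shares * path[-1]   # remaining shares, total value

# One hypothetical downside path for the Amazon.com share price.
shares_left, value = dynamic_strategy([100.0, 95.0, 88.0, 79.0])
```

On a falling path the early sales lock in value, which is exactly the effect the sidebar reports: losses limited relative to the hold strategy.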
the most widely used methodology but, generally, it is not computationally efficient. For example, it is not uncommon for the portfolio of a bank or an insurance company to contain several hundred thousand positions, including substantial volumes of derivative products such as swaps, caps and floors, swaptions and mortgage-backed securities. A full Monte Carlo simulation of a large and complex portfolio containing such instruments is computationally expensive, and may not even be achievable within a reasonable time.

The introduction of computers in the late 1940s stimulated the development of mathematical methods that employ random numbers. A mathematician at Los Alamos, Stanislav Ulam, was one of the first to realize that they would allow computations on a sufficiently large scale to be useful for engineers and scientists designing nuclear bombs. In 1947, Ulam and Princeton mathematician John von Neumann proposed the use of computers to apply a statistical approach to physics calculations. The story goes that the name "Monte Carlo" was given because "Ulam had an uncle who would often borrow money from relatives because he just had to go to Monte Carlo to visit their famous roulette wheels" (Peterson 1998). The name is now commonly used to refer to any mathematical method that employs statistical sampling and random numbers.

Various computational methodologies have been developed to make the Monte Carlo method, which uses pseudo-random numbers, more efficient for various applications. These methodologies, commonly referred to as variance reduction methods, include antithetic variates, control variates, importance sampling, stratified sampling and quasi Monte Carlo methods.

Most of the financial literature has focused on the application of variance reduction methods for pricing (e.g., Boyle 1977, Boyle et al. 1997, Ackworth et al. 1997, Broadie and Glasserman 1997, Caflisch et al. 1997, Paskov and Traub 1995, Schoenmakers and Heemink 1997). Some of these tools do not provide advantages for risk management in practice because of the high dimensionality of the problems and the required focus on the tails rather than on the expected value. However, several methods have proven advantageous. Applications of stratified sampling, control variates, importance sampling and quasi Monte Carlo methods have demonstrated performance improvements in a number of risk management applications (e.g., Jamshidian and Zhu 1997, Shaw 1997, Kreinin et al. 1998a, 1998b, Cardenas et al. 1999). For example, Kreinin et al. (1998a, 1998b) show that quasi Monte Carlo methods, in conjunction with principal component analysis, compute VaR an order of magnitude faster than standard Monte Carlo simulation. An example of quasi Monte Carlo methods coupled with principal component analysis is given in the sidebar.

Risk factor evolution models

If the simulation requires only one step, as may be the case in many market risk and portfolio management applications, then risk factors can be modeled as random variables. The model describes the future distribution of those random variables at the horizon.

A model of the joint distribution of the random variables must realistically describe

• the marginal distribution of each variable
• the co-dependence structure of all variables.

An efficient and flexible method of modeling the joint distribution of a large number of financial variables is to model the marginal distribution of each variable separately and then construct a model of the co-dependence structure (Carillo et al. 2000).

Traditional applications assume that the joint distribution of financial returns is described by a multi-variate normal or log-normal distribution. This means that the marginal distribution of all variables is normal and the co-dependence structure is defined in its entirety by the correlation matrix of all the variables. These distributions are also easy to estimate from historical data. Some advanced estimation methods that have proven quite popular in practice include exponentially weighted moving averages (Longerstaey and Zangari 1996) and GARCH (see, for example, Bollerslev 1986 and Engle and Mezrich 1995).
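A sketch of both modeling steps mentioned above: an exponentially weighted moving average of squared returns for the marginal volatility, and the 2x2 Cholesky factor of a correlation matrix to impose co-dependence on independent normal draws. The decay factor 0.94 is the usual RiskMetrics choice; the return series and the correlation are fabricated.

```python
import math
import random

def ewma_variance(returns, lam=0.94):
    """Exponentially weighted moving average of squared returns."""
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1.0 - lam) * r * r
    return var

def correlated_normals(rho, n, seed=7):
    """Sample n pairs of standard normals with correlation rho via the
    2x2 Cholesky factor [[1, 0], [rho, sqrt(1 - rho^2)]]."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        pairs.append((z1, rho * z1 + math.sqrt(1.0 - rho * rho) * z2))
    return pairs

returns = [0.002, -0.004, 0.001, -0.007, 0.003]   # fabricated daily returns
sigma = math.sqrt(ewma_variance(returns))
pairs = correlated_normals(rho=0.6, n=10_000)
```

Scaling each correlated draw by its factor's marginal volatility then yields one joint scenario on the two risk factors; a full covariance matrix is handled the same way with a general Cholesky factorization.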
Monte Carlo and Quasi Monte Carlo Methods

Standard Monte Carlo (MC) methods draw random samples from the distribution using a pseudo-random number generator. By contrast, quasi Monte Carlo (QMC) methods use deterministic points generated from mathematical vector sequences called low discrepancy sequences (LDS). The idea behind QMC techniques is that by choosing points in the risk factor space more evenly, the number of scenarios necessary to achieve a desired level of accuracy in pricing and risk calculations is reduced.

QMC methods have a strong intuitive appeal for risk management problems. Low discrepancy sequences specifically attempt to cover the space of risk factors "evenly," thus avoiding the clustering usually associated with pseudo-random sampling. Sampling evenly seems to be a desirable property not only when estimating the average of a distribution, as required for options pricing, but also when searching for exceptional cases in the tails.

[Figure: Pseudo-random points vs. Sobol sequences: (a) two-dimensional pseudo-random points; (b) two-dimensional Sobol sequences]

Low discrepancy sequences

If d is the dimension of the space of independent risk factors, then sampling methods generally sample from a unit d-dimensional hypercube. Whereas MC methods are based on points randomly generated from the hypercube, QMC methods are based on deterministic sequences of points whose elements evenly cover the hypercube. The measure of how evenly a sequence of points covers the region is called discrepancy; the more evenly the points are distributed in the region, the lower the discrepancy. Hence, these sequences have been termed low discrepancy sequences.

Although the idea behind LDS is simple, the mathematical theory and the algorithms for generating the sequences are far from trivial. A detailed explanation of the solutions and algorithms is found in Sobol (1967) and Niederreiter (1992).

A sequence would cover the hypercube uniformly if the number of points in any possible subset of the hypercube were proportional to the volume of the subset. Discrepancy measures the worst deviation between the volume of the subset and the fraction of the number of points in the subset
over the total number of points, over all possible subsets.

Sobol points cover the hypercube more uniformly than do the randomly generated points. Thus, the discrepancy of the Sobol points is lower than that of the pseudo-random points.

Advantages and disadvantages

The main advantages of standard MC methods are

• they are generally applicable to all problems
• their rate of convergence is independent of the dimensionality of the risk factor space
• they are very popular and their properties are well known
• they yield probabilistic errors and a priori bounds on VaR estimates.

The main disadvantages of MC methods are

• they use pseudo-random number generators that tend to generate clusters of points
• they do not explicitly exploit particular features of the problems
• their rate of convergence is slow.

The advantages of QMC methods are

• they are based on sampling techniques that generate points evenly within the region and avoid the clustering generally associated with MC methods
• they have been well-tested in the non-financial and financial literature (for derivatives pricing).

The main disadvantages of QMC methods are

• their lack of generality when compared to MC methods means that their effectiveness may be largely dependent on the problem, and extensive testing is required
• they do not yield probabilistic errors or a priori bounds on VaR estimates
• their rate of convergence depends on the dimensionality of the risk factor space, d
• they may be inefficient for problems in very large dimensions.
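The Sobol construction used in the figures is intricate, but the flavour of a low discrepancy sequence can be shown with the simpler Halton sequence (van der Corput sequences in bases 2 and 3), which also fills the unit square evenly. This is an illustrative substitute for, not an implementation of, the Sobol sequences discussed here.

```python
import random

def van_der_corput(n, base):
    """n-th element of the van der Corput sequence in the given base:
    reflect the base-b digits of n about the radix point."""
    q, denom = 0.0, 1.0
    while n:
        n, digit = divmod(n, base)
        denom *= base
        q += digit / denom
    return q

def halton_2d(count):
    """First `count` points of the 2-D Halton low discrepancy sequence."""
    return [(van_der_corput(i, 2), van_der_corput(i, 3))
            for i in range(1, count + 1)]

rng = random.Random(0)
pseudo = [(rng.random(), rng.random()) for _ in range(1024)]
quasi = halton_2d(1024)

# Fraction of points falling in the subsquare [0, 0.5) x [0, 0.5);
# for an even covering this should be close to its volume, 0.25.
def fraction_in_corner(points):
    return sum(1 for x, y in points if x < 0.5 and y < 0.5) / len(points)
```

Comparing `fraction_in_corner(quasi)` with `fraction_in_corner(pseudo)` over many subsets is, in miniature, what the discrepancy measure formalizes.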
Quasi Monte Carlo VaR

Quasi Monte Carlo (QMC) methods have proven to be useful tools both to compute the distribution of future portfolio values and to measure VaR, resulting in computational savings when compared to standard Monte Carlo (MC) methods based on pseudo-random sampling.

To illustrate this, we compare the performance of the MC and QMC methods to measure VaR for a simple multi-currency test portfolio. Details of the methodology are given in Kreinin et al. (1998a, 1998b). The accuracy of the estimations is measured with respect to the "true" UP&L distribution of the portfolio, which is computed with negligible error using a very large number of MC scenarios.

The test portfolio is part of a suite of benchmark test data (Marshall and Seigel 1996). The mark-to-market of the portfolio is 357.3 million USD. It contains 14 positions in fixed-rate government bonds with maturities ranging from 182 days to 10 years in five currencies: USD, DEM, FRF, ITL and JPY. The zero curves in each currency are modeled using 16 node points, except for the JPY curve, which has 15 nodes. Market data is as of September 26, 1997. We assume that the vector of 83 risk factors is log-normal with zero mean and covariance matrix as published by RiskMetrics on that day (Longerstaey and Zangari 1996).

The portfolio "true" one-day VaR is a function of the VaR level, α. The true VaR is calculated using a very large number of MC scenarios (over 800,000). Hence the sampling errors are negligible. From these results, the one-day 95% VaR is 4.81 million USD.

[Figure: VaR vs. level of confidence]

To assess the speed-ups possible with QMC simulation, we compare the number of scenarios required to achieve a given accuracy in VaR with MC and QMC methods. Since MC scenarios are random, the number of scenarios required to be within, for instance, a 2% sampling error from the true VaR is itself a random number. One random simulation may achieve 2% accuracy in 5,000 scenarios while a second random simulation might take 5,500 scenarios.

Hence we can talk, for example, of the expected number of scenarios, or of the number of scenarios required to be 95% certain that we are within a 2% sampling error. These numbers are determined over many different MC simulations. In contrast, the QMC method is deterministic and there is no notion of confidence levels on the number of scenarios.

For a 2% VaR error, the effective number of QMC scenarios is 2,048 and the speed-up is 6.7 times (compared to the MC simulation at a 95% confidence level). That is, the QMC method requires almost seven times fewer scenarios to compute a 95% VaR with a 2% sampling error, compared to the MC method at a 95% confidence level. Five out of 100 times, a MC simulation with roughly seven times more scenarios will yield a worse VaR estimate than can be achieved by the QMC method.

The QMC method becomes less effective as a lower accuracy is required. For example, within a 5% sampling error, the speed-up is only 2.1 times.

Speed-up of QMC over MC for the test portfolio:

    Confidence level, α    Error ε = 2%    Error ε = 5%
    95%                    6.7             2.1
    99%                    9.4             2.7

[Figure: Sobol vs. mean N and 95% N for 2% error]

As this example shows, however, the QMC method provides substantial speed-ups for VaR calculations compared to MC methods. Although the QMC method generally outperforms the MC method, it provides greater benefits when higher accuracy and a higher VaR percentile are required. For example, the speed-up for a VaR calculation with 2% accuracy increases from 6.7 to 9.4 to achieve a 99% confidence level as opposed to a 95% level in the tails. This is only one example, but the results are fairly characteristic of practical applications to real portfolios.

Increased performance with PCA tools

Although the advantages of QMC methods are clear for problems of moderate dimensionality, their benefit to high dimensional problems is a topic of active academic debate (Caflisch et al. 1997). Low discrepancy sequences usually work better in low dimensional spaces because their discrepancy depends on the dimension of the space. Hence, there is a strong motivation to reduce the dimensionality of the risk factor space before sampling.

Kreinin et al. (1998b) demonstrate that principal component analysis can be used to reduce the dimensionality of the problem. They find that by using portfolio principal components, the speed-ups can be up to 27 times faster for the same example!
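However the scenarios are generated, MC or QMC, the VaR at level α is read off the simulated P&L distribution as the loss at its (1 − α) quantile. A minimal sketch with a fabricated normal P&L sample (the 4.81 million USD figure above comes from the actual test portfolio, not from this toy):

```python
import random

def value_at_risk(pnl_samples, alpha=0.95):
    """VaR at confidence alpha: the loss exceeded in only (1 - alpha)
    of the scenarios, reported as a positive number for a loss."""
    ordered = sorted(pnl_samples)
    index = int((1.0 - alpha) * len(ordered))
    return -ordered[index]

rng = random.Random(1)
pnl = [rng.gauss(0.0, 1_000_000.0) for _ in range(50_000)]  # fabricated P&L, USD
var_95 = value_at_risk(pnl, alpha=0.95)
```

The sampling-error experiments in the sidebar amount to repeating this calculation over many independent scenario sets and watching how `var_95` fluctuates around the true quantile.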
Distribution Models With Fat Tails

Commonly used pricing and risk measurement models assume, for simplicity and mathematical tractability, that the returns of financial risk factors follow normal distributions, even though the main objective of such models is to measure the possible losses in the tails.

We illustrate two practical models with fat tails and compare the results to the historical observed returns and a simple normal model, using the JPY/USD exchange rate as our example.

Historical distribution and normal model

First, consider the historical daily log-returns of the JPY/USD exchange rate from January 26, 1981 to January 20, 2000. For a US investor with 100 USD = 10,547 JPY, a decrease (increase) of one standard deviation in the exchange rate results in a gain (loss) of 0.73 (0.72) USD. A historical simulation results in values of P&L in the range [–8.12, 4.01] USD.

[Figure: Historical returns on JPY/USD and normal distribution]

The normal distribution closely matches the first two moments. Clearly, the historical distribution has fatter-than-normal tails, which is confirmed by the large kurtosis. The largest negative return in the sample corresponds to a move of more than 10 standard deviations, whereas the highest positive return is almost six standard deviations. Under the normal model, the probability of a six-standard deviation move is slightly less than 10⁻⁹, and the chance of a 10-standard deviation move is less than one over the number of atoms in the entire universe!

For the US investor, the model with a normal distribution predicts a VaR at the 99% confidence level of 1.69 USD, while the historical simulation VaR is 2.18 USD.

Mixtures of normals and extreme value distributions

Next, consider two models that better describe the fat tails.

The mixture of normals model (MN) assumes that with probability p1 the distribution is normal with mean µ1 and variance σ1², and with probability p2 = 1 – p1 the distribution is normal with mean µ2 and variance σ2². The parameters can be estimated from the historical data.

Besides fitting the historical observations better, the MN model can be interpreted as the mixture of a "normal" economic regime, which occurs most often, and a "stressed" regime of high volatility, which occurs infrequently. Based on the data in this example, the model suggests a "normal market," with annualized volatility of slightly less than 7%, that prevails about 82% of the time and a "highly volatile market," with a volatility of almost 24%, which occurs almost 18% of the time.

While the MN model is a mixture model, the second model is a sum model. Roughly speaking, the normal plus Weibull model (N+W) fits a normal distribution to the "central part" of the returns histogram and a Weibull distribution to the "tails." The model estimates from the data the point at which the tails begin and the central part ends (e.g., at the 95th percentile).

Looking more closely at the left tail of the distributions, we observe that the normal model assigns virtually no probability to returns of –3% and lower, which

As there were no historical


returns beyond –7.8% during the
observation period, it is difficult
to assign any likelihood to events
beyond that point. Extreme
value theory allows us to explore
extreme events that have not yet
occurred by asymptotically
approximating the tail. For
example, under the N+W
model the likelihood of the log-
returns being –8% or lower is
about 0.1%. A rare event
indeed, but not a zero-
probability event.
In this example, the difference
Mixture of normals and normal plus Weibull distributions for JPY/USD
between the models is not great
occurred historically about eight standard deviations) has at the 95% level, the normal
three times every 1,000 days. a likelihood of 0.4%, which model grossly underestimates the
Both the MN and N+W corresponds to about once 99% VaR and the difference in
models give a reasonable every 17 months (and not the 99.9% VaR in all models is
weight to such an event (0.4% once every six trillion years as considerable.
and 0.94%, respectively). The predicted by a normal model!).
MN model assigns heavier
weights in the tail to log-
returns of up to about –5%
(for example, it assigns Confidence level
likelihoods of .07% and .01%
95% 99% 99.9%
to log-returns of –4% and –
5%, respectively). Under the Normal 1.21 1.69 2.2
normal model a move of –4
occurs about once every Historical 1.22 2.20 3.75
200,000 years (as opposed to Mixture of normals 1.01 2.33 3.79
about once every six years as
predicted by MN). Normal + Weibull 1.25 2.03 5.06

Note that on two occasions VaR for a US investor using different models
the returns from the
observation period fall below
–5%. Although the MN model
captures the tail fairly well, it
may still underestimate the
probability of an extreme
event because the marginal
distribution in the tail is still
approximately normal. In
contrast to the MN model, the
N+W model gives about eight
times and almost 50 times
higher probability to moves of
–4% and –5%, respectively.
Under the N+W model, a
move of –6% (which
corresponds to more than Tails of the distributions
Step 1: Defining scenarios and time steps 27
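The MN model lends itself to a direct sampling sketch. The parameters below are illustrative round numbers taken from the example (zero means, regime weights of 82%/18%, and daily volatilities scaled from the 7% and 24% annualized figures); they are not the calibrated estimates:

```python
import numpy as np

def sample_mixture_of_normals(n, p1, mu1, sigma1, mu2, sigma2, rng=None):
    """Draw n log-returns from a two-regime mixture of normals:
    N(mu1, sigma1^2) with probability p1, else N(mu2, sigma2^2)."""
    rng = rng or np.random.default_rng(0)
    in_regime_1 = rng.random(n) < p1
    return np.where(in_regime_1,
                    rng.normal(mu1, sigma1, n),
                    rng.normal(mu2, sigma2, n))

def var_from_samples(pnl, confidence):
    """VaR as the loss at the given confidence level (a positive number)."""
    return -np.percentile(pnl, 100.0 * (1.0 - confidence))

# Illustrative round numbers, not calibrated estimates:
# 'normal' regime ~82% of the time, 'stressed' regime ~18%;
# daily volatility = annualized volatility / sqrt(252).
s1 = 0.07 / np.sqrt(252)
s2 = 0.24 / np.sqrt(252)
returns = sample_mixture_of_normals(500_000, 0.82, 0.0, s1, 0.0, s2)
pnl = 100.0 * returns                  # P&L on a 100 USD position
var99 = var_from_samples(pnl, 0.99)
```

With these assumed parameters, the simulated 99% VaR comes out of the same order as the mixture-of-normals figure reported in the table, and well above what a single normal with the same overall variance would predict; the calibrated model in the text differs in detail.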

However, it is widely recognized that, in many cases, the normal distribution does not describe financial returns accurately and that their joint behaviour is also more complex than can be described by the correlations. To address this problem, a number of models have been developed and tested that capture the nature of financial returns better and describe the tails of their distributions more accurately. These models include multi-factor t-distributions, mixtures of normals, lambda distributions, distributions from extreme value theory summed with the normal distribution and non-parametric marginal distributions (see, for example, Hull and White 1998a, Shaw 1997, Carillo et al. 2000, Embrechts et al. 1998). Distribution models that capture fat tails can be effectively incorporated into a MtF simulation framework and lead to more accurate risk measurement and management (see page 28). In summary, these models

• permit more accurate risk measurement and management than commonly used normal distributions

• provide a better statistical description of the tails of the distributions and of extreme events with low probability than pure historical simulation methods.

Most traditional applications use a correlation matrix to describe the co-dependence structure of the risk factors. Correlations fully describe the co-dependence structure of random variables when the distribution is normal or, more generally, when the distribution is elliptic. (Another example of an elliptic distribution is the t-distribution.) However, when the distribution is not elliptic, there are better descriptors of the co-dependence, such as rank correlations and copulas (Embrechts et al. 1999). Copulas—probably the most general way of modeling the joint behaviour of random variables—have been well-researched in recent years. A full description of these tools is beyond the scope of this document.

In contrast to the single-step problem, a multi-step MtF analysis is required when measuring counterparty credit exposures, for ALM, when analyzing dynamic portfolios, etc. (see Table 1.1 on page 11). In this case, the risk factors are modeled as random processes which describe the evolution of a random variable through time. Models of random processes have been widely used in finance and, specifically, in derivatives pricing. For example, the Black-Scholes model for option pricing (Black and Scholes 1973) assumes that stock prices follow a process called geometric Brownian motion (GBM).

As with normal distributions in the single-step case, GBM models are commonly used for both pricing and risk management because of their simplicity and mathematical tractability, though they do not describe real processes accurately. To describe the evolution of interest rates, more sophisticated models that account for mean reversion are used. In mean-reverting models, when the level of an interest rate becomes too high or too low, a force pulls it back to its mean, or equilibrium, level. Models with mean reversion have been used widely for pricing interest rate derivatives (e.g., Cox et al. 1985, Hull and White 1990, Black and Karasinski 1991, Black et al. 1990). For risk management purposes, multi-factor models are generally required to capture the joint evolution of term structures (see page 28).

In general, different types of assets have different characteristics and hence a combination of processes is required to model their evolution into the future accurately. For example, in addition to GBM and mean-reverting diffusion processes, one can add processes with jumps and (reflecting) barriers. For example, a complex model is required to simulate electricity spot prices (see page 30).

The models discussed above are generally described in continuous time, and the equations used to characterize them are called stochastic differential equations (e.g., Karatzas and Shreve 1994). There are also discrete time models, mostly from the econometrics literature (e.g., Kim et al. 1999). These models are commonly used to include auto-correlations and other non-Markovian structures.

Continuous time models have been used in finance for the last 30 years (Sundaresan 2000). They are generally mathematically tractable, and
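As an illustration of the copula idea, a minimal Gaussian-copula sampler with non-parametric marginals can be sketched as follows. Everything numeric here is a hypothetical assumption (made-up "historical" series and correlation); in practice the marginals would come from real data or a fat-tailed parametric fit:

```python
import numpy as np
from math import erf, sqrt

def gaussian_copula_sample(n, corr, marginal_ppfs, rng=None):
    """Draw joint samples whose co-dependence is a Gaussian copula and
    whose marginals are arbitrary inverse CDFs: correlated standard
    normals are mapped to uniforms through the normal CDF, then
    through each marginal's inverse CDF."""
    rng = rng or np.random.default_rng(0)
    L = np.linalg.cholesky(corr)                 # corr must be positive definite
    z = rng.standard_normal((n, corr.shape[0])) @ L.T
    u = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))   # normal CDF
    return np.column_stack([ppf(u[:, i]) for i, ppf in enumerate(marginal_ppfs)])

def empirical_ppf(history):
    """Inverse CDF built from sorted historical returns
    (a non-parametric marginal)."""
    srt = np.sort(history)
    return lambda u: srt[np.minimum((u * len(srt)).astype(int), len(srt) - 1)]

# Hypothetical 'historical' return series for two risk factors.
rng = np.random.default_rng(1)
hist_a = 0.010 * rng.standard_t(3, size=5000)    # fat-tailed factor
hist_b = 0.005 * rng.standard_normal(5000)
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
samples = gaussian_copula_sample(20_000, corr,
                                 [empirical_ppf(hist_a), empirical_ppf(hist_b)])
```

The design point is the separation the text describes: the copula carries the co-dependence, while each marginal keeps its own (possibly fat-tailed) shape.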
Multi-factor Statistical Term Structure Model

A statistical model with mean reversion for the evolution of the term structure of interest rates produces realistic future scenarios over long horizons (Reimers and Zerbs 1999). Scenarios over long horizons are realistic when observed rates, prices and exposures over time are consistent with the predicted distributions of the model.

The model

We assume that the term structure of interest rates is described by n rates and that joint movements of the logarithms of the n rates occur only within a k-dimensional subset of the n-dimensional directions, where k is smaller than n and the n directions are independent.

Let the term structure of interest rates be defined by a set of discrete rates, r_i, i = 1, 2, …, n, and let y_i = log(r_i). We assume that as time evolves, each rate r_i reverts to a target value r_i^∞, which remains constant over time.

We assume there are k independent state variables, x_1, x_2, …, x_k, k < n, that explain the movements of the log rates, y_i. Each state variable follows an Ornstein-Uhlenbeck process (Karatzas and Shreve 1994):

  dx_j = –a_j x_j dt + σ_j dz_j,   j = 1, 2, …, k

where a_j specifies the mean-reversion speed, σ_j specifies the instantaneous volatility and dz_j represents a Brownian motion. The volatilities and the mean-reversion rates are constant through time.

The log rates are reconstructed by

  y_i = y_i^∞ + Σ_{j=1}^{k} b_ij x_j,   i = 1, 2, …, n

where B = [b_ij]. The columns, B_j, of the matrix B are orthonormal vectors. Here, the columns of B are the first k principal components of the changes in log rates.

The individual rates are given by

  r_i = e^{y_i}

The model is calibrated using historical data. The calibration procedure obtains the target structure r_i^∞, i = 1, …, n; the state variables x_j, j = 1, 2, …, k (through the calculation of the matrix B); and, for each state variable x_j, the mean-reversion speed a_j and instantaneous volatility σ_j.

Example

The model is calibrated to US Constant Maturity Treasury yields (US Federal Reserve Board 1999) in a historical calibration period (January 1, 1984 to December 31, 1990). A second (non-overlapping) historical period (January 3, 1991 to December 31, 1998) is designated as the out-of-sample testing period.

Principal components of daily log rates

The test portfolio, held by a single counterparty, comprises US denominated at-the-money fixed-floating swaps (tenors from one to 10 years), a swaption and several long-term caps and floors. These positions are designed to be sensitive to very large interest rate movements.

Calibration

The first “shift” principal component explains 93% of the total variance; together, the first three components explain 99.9% of the total variance of the daily log rates. The target values for the testing period, y_i^∞, are calculated as the sample means of the log rates over the calibration period. The estimates of annualized mean-reversion rates for the first three principal components are 0.001, 0.066 and 2.120, respectively.

Out-of-sample testing

The central 95% inclusion envelope is defined by the curves that link the 2.5th and 97.5th percentiles of the distribution of the interest rates at each point in time.

Historic rates plus lower bound of the 95% envelope for 30-year rate

The 30-year rate lies outside the inclusion envelope only during two short periods. For all nine rates, the historical data falls outside the 95% inclusion envelope for the calibrated model on 7.7% of days observed, suggesting that the scenarios generated by the model are realistic.

To compare the potential exposures of the test portfolio based on these scenarios with the historical exposures realized in the validation period, one thousand scenario paths over seven years are generated, with quarterly time steps in the first year and semi-annual time steps thereafter. In each scenario and at each time step, the MtF value of each position is computed and aggregated at the portfolio level. The maximum 97.5% portfolio exposure doubles from 10 million USD at the beginning of the simulation to more than 20 million USD near the end of the simulation. The estimated potential exposure of the portfolio consistently dominates its actual exposure. However, in the short term, the value of the portfolio is slightly outside of the 95% envelope.

Note that the out-of-sample testing period covers a dramatic fall in interest rates between 1991 and 1992 and a dramatic increase in rates between 1994 and 1995. In spite of this, these results offer evidence that the model generates realistic estimates of worst-case exposures.

Portfolio Mark-to-Future values
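The simulation step of this model can be sketched directly from the equations above. Every number here is a toy assumption (three rates, two factors, made-up targets, loadings, speeds and volatilities, not the calibrated estimates); the code uses the exact one-step Ornstein-Uhlenbeck transition rather than a Euler step:

```python
import numpy as np

def simulate_term_structure(y_inf, B, a, sigma, n_steps, dt, rng=None):
    """One scenario path of n rates driven by k Ornstein-Uhlenbeck
    state variables: dx_j = -a_j x_j dt + sigma_j dz_j, with
    y = y_inf + B x and r = exp(y). Exact OU transition:
    x(t+dt) = x(t) e^{-a dt} + sigma sqrt((1 - e^{-2 a dt})/(2a)) eps."""
    rng = rng or np.random.default_rng(0)
    decay = np.exp(-a * dt)
    step_sd = sigma * np.sqrt((1.0 - np.exp(-2.0 * a * dt)) / (2.0 * a))
    x = np.zeros(len(a))
    path = []
    for _ in range(n_steps):
        x = decay * x + step_sd * rng.standard_normal(len(a))
        path.append(np.exp(y_inf + B @ x))
    return np.array(path)                # shape (n_steps, n)

# Toy inputs: n = 3 rates, k = 2 factors; B has orthonormal columns.
y_inf = np.log(np.array([0.050, 0.055, 0.060]))   # target rates r_i^inf
B = np.array([[0.577,  0.707],
              [0.577,  0.000],
              [0.577, -0.707]])
a = np.array([0.05, 2.0])            # mean-reversion speeds per factor
sigma = np.array([0.10, 0.15])       # instantaneous factor volatilities
rates = simulate_term_structure(y_inf, B, a, sigma, n_steps=252, dt=1/252)
```

Repeating this over S scenarios and revaluing the portfolio at each time step yields the exposure profiles described above.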

A Model to Generate Electricity Price Scenarios

The electricity market is characterized by constraints on the supply side and pronounced cyclical and seasonal patterns on the demand side. Yet demand for electricity, while highly variable, is relatively inelastic: it responds very slowly, if at all, to even large price increases.

Quick adjustments in the supply of electricity are prevented by the long lead time required to start up and shut down some types of generating equipment. Nor can electricity be stored for later use.

The dependence on transmission lines to convey electricity further limits the available supply of electricity between regions. One region can transfer electricity to another region with which it is connected, but an excess of electricity cannot be rerouted along lines that are already working at capacity. Therefore, electricity markets are regional.

An imbalance of supply and demand within connected regions results in spot price differentials between regions. The differentials can vary depending on the day of the week and the time of day, as well as on the season and the weather.

Daily electricity spot prices, February 6, 1997–June 9, 1999

All of these characteristics of the electricity market result in spot prices that are highly volatile, that show sudden, short-lived large jumps—particularly in the summer months—and that exhibit seasonality and a very strong mean-reverting behaviour. Further, the volatility itself can vary—from being moderately high at lower price levels to extremely high at higher price levels. The volatility is more evident in the logarithm of daily spot prices.

Logarithm of daily electricity spot prices, February 6, 1997–June 9, 1999

A model for spot pricing of electricity

Scenarios of electricity spot prices are generated by specifying a process for the change in electricity spot prices and simulating it over time.

Let S_t denote the spot price of electricity at time t. The process that governs the change in the logarithm of the spot price over a small time interval Δt is

  Δ(ln(S_t)) = b(t, S_t)(ln(S_∞) – ln(S_t)) Δt + σ(t, S_t) ΔW_t + J Δq_t

where S_∞ is the long-term reversion level for the daily spot price, b(t, S_t) is the reversion rate, W_t denotes the Brownian motion, q_t is a Poisson process with intensity λ(t, S_t), σ(t, S_t) denotes the volatility of ln(S_t) and J is a positive, real-valued random variable. The three right-hand side terms in the process represent, in order, a mean-reverting drift component, a continuous diffusion component and a jump component.

By making the process parameters dependent on time and spot price level, the observed seasonal effects are accounted for. Additional features of the market are accommodated in the model by imposing appropriate constraints, such as reflecting upward and downward barriers.

Electricity price scenarios

Maximum likelihood estimates of the process parameters are obtained and scenarios for spot prices generated using the model and the 600 days of daily spot price data.

Three sample paths of the logarithm of daily electricity spot prices

Observation of three sample paths indicates that the simulation captures the distribution of the price jumps and their observed magnitudes reasonably well.
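A minimal Euler discretization of this process can be sketched as follows. Constant parameters and an exponential jump size are simplifying assumptions of this sketch (the model in the text makes b, σ and λ functions of time and price level, and adds barriers), and all numbers are hypothetical:

```python
import numpy as np

def simulate_log_spot(ln_s0, ln_s_inf, b, sigma, lam, jump, n_steps, dt, rng=None):
    """Euler discretization of the mean-reverting jump process for the
    log spot price: d(ln S) = b (ln S_inf - ln S) dt + sigma dW + J dq.
    b, sigma and lam are held constant here for simplicity."""
    rng = rng or np.random.default_rng(0)
    path = np.empty(n_steps + 1)
    path[0] = ln_s0
    for t in range(n_steps):
        d_ln_s = b * (ln_s_inf - path[t]) * dt \
                 + sigma * np.sqrt(dt) * rng.standard_normal()
        if rng.random() < lam * dt:      # a jump arrives (Poisson)
            d_ln_s += jump(rng)          # J: positive random jump size
        path[t + 1] = path[t] + d_ln_s
    return path

# Hypothetical parameters: strong mean reversion pulls spikes back down.
path = simulate_log_spot(
    ln_s0=np.log(25.0), ln_s_inf=np.log(25.0), b=50.0, sigma=1.5,
    lam=5.0, jump=lambda rng: rng.exponential(1.0),
    n_steps=600, dt=1.0 / 365.0)
```

The interaction of the jump term and the strong mean-reverting drift is what produces the short-lived spikes visible in the sample paths.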

considerable ongoing academic research is being conducted to understand the theoretical properties of the models and their econometric estimation in practice. Once they are defined and calibrated, the implementation of continuous time models in the MtF framework requires the discretization of the stochastic differential equations to the desired time steps.

Mixed parametric and non-parametric models

Some scenario generation models retain aspects of both parametric and non-parametric models. These models can be viewed as bringing an explicit model of randomness or model prediction into historical scenarios or, alternatively, as bringing some explicit, non-parametric, historical information into a parametric model.

For example, Hull and White (1998b) present a methodology that adjusts historical scenarios by the conditional volatility of the time series, computed using exponentially weighted moving averages or GARCH methods. This method retains the fat-tail characteristics of the historical data, while using a better forecast of the volatility of the time series than that given by the sample volatility. Another example is the use of density kernel estimators (Ahn et al. 2000) to describe the marginal distributions of risk factors. Sampling from this type of function is equivalent to sampling from history and then adding Gaussian noise of a predefined variance.

Summary

Scenarios are the heart of a MtF risk management strategy because the quality of a risk management analysis depends on the ability to generate relevant, forward-looking scenarios that properly represent the future. Moreover, seemingly disparate risks, such as market, credit and liquidity risks, are naturally integrated through the scenarios used in a risk management analysis. Finally, integration of various institutional functions and businesses is also readily achieved when the players speak the “same language”; i.e., share the same scenarios. In this chapter, we have further advanced the idea that scenarios are the language of risk.

Research in computational finance over the last decade has created powerful historical and model-based scenario generation techniques applicable to risk management. In order for these techniques to be practical, all current economic and subjective information must be applied creatively to generate forward-looking scenarios consistent with current conditions. In the next chapters, we show how accurate simulation produces the values that populate the MtF Cube. This simulation must account for all the events that affect the future value of the underlying instruments and for the dynamic portfolio strategies that model changes to the positions in the portfolios.
Step 2: Defining the Basis
Instruments
Portfolios consist of positions in a number of financial products, both exchange-traded
and over-the-counter (OTC). The MtF Cube is the package of MtF tables, each
corresponding to an individual basis instrument. A basis instrument may represent an
actual financial product or a synthetic instrument. As the number of OTC products may
be virtually unlimited, the number of basis instruments required can often be
substantially reduced by representing the MtF values of OTC products as functions of
the MtF values of a smaller set of synthetic or abstract instruments. This chapter
discusses the issues surrounding the selection of basis instruments used to build the
MtF Cube.

The MtF Cube consists of a set of N MtF tables—one for each instrument—generated by simulating a given basis instrument over S scenarios and T time steps. Basis instruments may represent actual financial products such as stocks and bonds, synthetic instruments such as a series of zero coupon bonds or, in some cases, abstract indices such as the inputs to a multi-factor model.

The choice of basis instruments to be simulated in the pre-Cube stage of the MtF framework depends on the overall business application and may involve trade-offs between the magnitude of data storage (the dimensions of the MtF Cube) and pricing accuracy. Nonetheless, the dimensions of the MtF Cube can often be significantly reduced with little or no loss of pricing accuracy by the judicious choice of basis instruments.

In most applications, the basis instruments consist of a mixture of actual products representing exchange-traded securities (or OTC securities held in inventory) and synthetic instruments which may be mapped into a large volume of OTC securities. An understanding of how risk factors have an impact on product valuation aids in the selection of basis instruments.

The MtF value, m_ijt, of basis instrument i, (i = 1, …, N), under scenario j at time step t is a function, f(•), of the levels of specified risk factors and can be represented as

  m_ijt = f(u_1jt, u_2jt, u_3jt, …, u_Kjt)

where u_kjt, (k = 1, …, K), is the level of risk factor k under scenario j at time step t.

When the basis instrument represents a synthetic instrument or an abstract index, it is often a function of a single risk factor k, m_ijt = f(u_kjt). In this case, the basis instrument itself can be mapped one-to-one to a risk factor.

The key benefit derived from the use of basis instruments is the ability to pre-compute most of the results of the computations required to calculate portfolio MtF values. The MtF value of a portfolio containing a single financial product is determined by a post-processing function, g(•), applied to the pre-computed MtF values of the basis instruments. In general, when the financial product is itself defined as a basis instrument in
34 Step 2: Defining the basis instruments

the MtF Cube, the MtF value of the portfolio is simply the MtF value of the basis instrument (and g(•) is simply an identity). In the case when the financial product is not an instrument defined in the MtF Cube, the MtF value of the portfolio is calculated as a function, g(•), of the MtF values of one or more basis instruments.

As illustrated by the schema in Figure 2.1, a mapping sequence defines the functional relationship between risk factors, basis instruments, financial products and portfolios.

RISK FACTOR → f(•) → BASIS INSTRUMENT → g(•) → PRODUCT → PORTFOLIO

Figure 2.1: Mapping sequence of risk factor through to portfolio

In order to select the appropriate basis instruments to be contained in the MtF Cube, a mapping sequence for each financial product must be defined. The next sections describe some of the modeling issues involved in defining the mapping sequence from risk factor to financial product. In Step 4, the post-processing functions g(•) are applied to basis instrument MtF values to produce portfolio MtF values.

Modeling considerations

The choice of basis instruments depends upon the mapping sequences that result from the modeling choices made prior to generating the MtF Cube. As an example, consider mapping sequence A, illustrated in Figure 2.2, chosen to determine MtF values for a non-dividend paying stock and a forward contract (expiring at time t) on that stock.

In this mapping sequence, the MtF value of the stock under a scenario j at time step t is simply that of the stock basis instrument. The MtF value of the stock basis instrument is modeled as a simple realization of the stock risk factor under each scenario and time step. The MtF value of the forward contract is a straightforward function of the MtF values of two basis instruments: the stock and a zero coupon bond.

Consider a second mapping sequence, illustrated in Figure 2.3. In this case, the stock is valued by a generic N-factor equity model which determines the MtF value of the stock as a function of its factor loadings with respect to the individual equity factors. For example, one equity factor could be a market index; others could be industry factors.

Under this sequence, the MtF value of the stock is, once again, simply that of the stock basis instrument, but now the MtF value of the basis instrument is based upon the N-factor equity model. The difference in modeling between sequence A and sequence B occurs strictly in the pre-Cube stage.

Finally, consider a third mapping sequence, illustrated in Figure 2.4. In this case, the basis

RISK FACTOR BASIS INSTRUMENT FINANCIAL PRODUCT

Stock Price Stock Stock

IR Curve (node t) Zero Bond Forward

Figure 2.2: Mapping sequence A
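Under mapping sequence A, the post-processing step for the forward can be sketched as a tiny g(•). The functional form S − K·d is an assumed textbook value for a long forward struck at K, and the inputs are made-up numbers:

```python
def forward_mtf(stock_mtf, zero_bond_mtf, strike):
    """g(.): MtF value of a long forward on the stock, struck at K and
    expiring at the zero coupon bond's maturity, computed from the MtF
    values of its two basis instruments."""
    return stock_mtf - strike * zero_bond_mtf

# One cell (scenario j, time step t) of the product's MtF table.
v = forward_mtf(stock_mtf=105.0, zero_bond_mtf=0.97, strike=100.0)
```

Because g(•) is applied post-Cube, any number of forwards on the same stock reuse the two pre-computed basis-instrument tables.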


RISK FACTOR BASIS INSTRUMENT FINANCIAL PRODUCT

Equity Factor 1 Stock Stock

Equity Factor n

IR Curve (node i) Zero Bond Forward

Figure 2.3: Mapping sequence B

instruments defining the MtF Cube are the abstract equity factors themselves. The MtF value of the stock must, therefore, be determined in a secondary post-processing step as a function of the MtF values of the equity factor basis instruments.

In this sequence, the MtF value of the stock is determined as a function of the MtF values of the N equity-factor basis instruments. Unlike sequences A and B, sequence C involves a modeling change in the post-Cube stage as well as in the pre-Cube stage. In other words, the post-Cube mapping of basis instruments into portfolios in sequence C requires the mapping of basis instruments into individual financial products in an intermediate step.

The benefit of the intermediate step is that the MtF Cube in this case need never incorporate more than the N MtF tables associated with each of the N basis instruments. This is true because the MtF values of all stocks are functions of the MtF values of the same N basis instruments, regardless of the number of stocks in a portfolio.

While it is interesting to compare alternative modeling techniques, the choice of mapping sequence often involves more pragmatic trade-offs. For example, mapping sequence A may be preferred when full explanatory power is desired for future equity distributions (perhaps based upon historical realizations of stock prices). However, this approach may prove deficient when newly issued securities (that possess no price history) are modeled.

In contrast, mapping sequence C may be preferred for the purpose of minimizing the dimensions of the MtF Cube as well as for valuing equities with no previous price history. However, this approach may be less appropriate when it is important to consider the specific risks associated with individual equities. Note that specific risks can still be addressed in this particular mapping sequence through more complex post-processing that directly incorporates the specific risk of the equity as an additional input. The modeling of specific risk is discussed in Step 5.

Mapping sequence B may emerge as the preferred alternative in this case. Specific risk and newly issued equities may be addressed easily using this approach, since each security is simulated uniquely using the factor model, thereby allowing for the

RISK FACTOR BASIS INSTRUMENT FINANCIAL PRODUCT

Equity Factor 1 Equity Factor 1 Stock

Equity Factor n Equity Factor n

IR Curve (node i) Zero Bond Forward

Figure 2.4: Mapping sequence C



Table 2.1: MtF values of all-in stock instrument

t=0 t=1 t=2 T=3

Stock instrument s0 s1 s2 s3

Settlement account 0 c1 c1f1,2 c3+c1f1,2 f2,3

All-in instrument s0 s1+ c1 s2+ c1f1,2 s3+ c3 + c1f1,2 f2,3

straightforward incorporation of specific risk. The trade-off is that this sequence has implications for the size of the MtF Cube, which now must incorporate MtF tables for each individual equity.

Settlement and reinvestment considerations

The choice of appropriate basis instruments is also influenced by the consideration of cash flow settlement and reinvestment over time. In order to capture the distribution over time of the total return of a financial product, any cash flow dispersal must also be incorporated in the pre-computed MtF Cube. This can only be achieved by defining an all-in basis instrument that consists of a standard basis instrument and a settlement account in which any cash settled by the basis instrument through time is reinvested. The all-in instrument captures the total return of a financial product through time.

As an example, consider a stock that pays dividends at time steps t = 1 and t = 3. An appropriate risk/reward assessment of this security at the time horizon, T = 3, requires the inclusion of any dividends settled (and reinvested) prior to that time. An all-in stock instrument represents a portfolio consisting of the basis stock instrument along with a settlement account that captures and reinvests dividends paid at time steps t = 1 and t = 3. Table 2.1 summarizes the MtF values of the all-in stock basis instrument across a given scenario path over the period [t = 0, T = 3]. The MtF value of the all-in instrument is the sum of the MtF values of the standard instrument and the cash account.

As a second example, consider an all-in zero coupon instrument consisting of a standard zero coupon bond (with a notional of one and maturity at t = 2) along with a settlement account which captures and reinvests the notional amount paid at the maturity date. Table 2.2 summarizes the MtF values of the all-in basis instrument across a given scenario path for the time horizon, T = 4. The MtF value of the all-in instrument is the sum of the MtF values of the standard instrument and the cash account.

Figure 2.5 provides an illustration of MtF values for a one-unit investment in the all-in zero coupon instrument maturing at t = 2, simulated across S = 100 scenario paths over the time horizon, T = 4.

Figure 2.5: MtF values of all-in t = 2 zero coupon instrument across scenarios

In comparison, Figure 2.6 and Figure 2.7 illustrate the MtF values simulated across S = 100 scenarios over the time horizon, T = 4, of one-unit investments in two additional all-in zero coupon instruments (with maturities of t = 0 and t = 4, respectively).

Note that the t = 4 all-in instrument poses only price risk over the time period 0 ≤ t < 4, during which its MtF value is sensitive to interest rate

Table 2.2: MtF values of all-in t = 2 zero coupon instrument

t=0 t=1 t= 2 t=3 T=4

Zero coupon bond d0,2 d1,2 1 0 0

Settlement account 0 0 0 f2,3 f2,3 f3,4

All-in instrument d0,2 d1,2 1 f2,3 f2,3 f3,4

changes. At its t = 4 maturity date, when the notional is settled, it represents a riskless position. In contrast, the t = 0 all-in instrument is riskless at its t = 0 maturity date, but poses reinvestment risk over the duration of the T = 4 time horizon, since the notional amount is reinvested. The t = 2 all-in instrument has a changing risk profile over the T = 4 time horizon; it possesses price risk over the period t < 2, is riskless at t = 2 and poses reinvestment risk over the period 2 < t ≤ 4.

Figure 2.6: MtF values of all-in t = 0 zero coupon instrument across scenarios

Figure 2.7: MtF values of all-in t = 4 zero coupon instrument across scenarios

Explicitly incorporating the passage of time in a simulation creates some modeling challenges. Path dependency, settlement, reinvestment and early exercise are all features of product valuation that can be appropriately addressed in the generation of the MtF Cube. The above examples illustrate the significance of considering the impact of settlement and reinvestment through time on the modeling of even the simplest instruments.
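The settlement-account logic of Table 2.2 can be sketched for one scenario path. The discount factors and one-period reinvestment rates below are hypothetical path values, not figures from the document:

```python
def all_in_zero_coupon(d, f, maturity, horizon):
    """MtF values of an all-in zero coupon instrument (bond plus
    settlement account) along one scenario path. d[t] is the bond's
    MtF price before maturity; f[t] is the realized gross one-period
    reinvestment return from t to t+1 on the settlement account."""
    values, account = [], 0.0
    for t in range(horizon + 1):
        if t < maturity:
            values.append(d[t])      # price risk only
        elif t == maturity:
            account = 1.0            # notional settles into the account
            values.append(account)
        else:
            account *= f[t - 1]      # reinvestment risk only
            values.append(account)
    return values

# Hypothetical path values for the t = 2 instrument of Table 2.2.
d = {0: 0.91, 1: 0.95}
f = {2: 1.05, 3: 1.04}
values = all_in_zero_coupon(d, f, maturity=2, horizon=4)
```

The three branches correspond directly to the changing risk profile described above: price risk before maturity, a riskless value of one at maturity, and reinvestment risk afterwards.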
Step 3: Simulating to Produce
the MtF Cube
The MtF Cube consists of a set of N MtF tables each associated with a given basis
instrument. The cells of a MtF table contain the MtF values of the basis instrument
simulated over S scenarios and T time steps. This chapter discusses the relationship
between the risk factors, scenario paths and pricing functions used to simulate MtF
values.

Figure 3.1: Generation of the Basis MtF Cube


40 Step 3: Simulating to produce the MtF Cube

Once the scenarios and time steps have been defined, the cell content of a MtF table is generated by a pricing model that is attached to the applicable basis instrument. A pricing model represents a string of pricing functions associated with specified output attributes; pricing functions produce MtF values as well as other measures such as MtF deltas and MtF durations. The simulation of a basis instrument over the defined scenarios and time steps produces MtF measures for the specified output attributes that are functions of the realized risk factor levels.

The MtF table associated with basis instrument i, Mi, is of dimension S x T, where S is the number of scenarios and T is the number of time steps. The cells of Mi are populated by simulating the values of basis instrument i over each of the j = 1,…,S scenarios and t = 1,…,T time steps. The MtF Cube, M, consists of the N MtF tables associated with the i = 1,…,N basis instruments. Figure 3.1 illustrates the process for generating a Basis MtF Cube.
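As a rough sketch of this step, the MtF Cube can be viewed as an N x S x T array of simulated values. The pricing function, flat-rate scenarios, maturities and time steps below are illustrative assumptions, not the framework's actual pricing models or scenario sets:

```python
import math

# Hypothetical sketch: a Basis MtF Cube for N = 3 zero coupon bonds under
# S = 4 flat-rate scenarios and T = 3 time steps.

def zero_price(rate, term):
    """Unit-notional zero coupon bond price under continuous compounding.
    At or after maturity the value is held at 1 (reinvestment is ignored here;
    the all-in instruments discussed earlier would reinvest the proceeds)."""
    return math.exp(-rate * max(term, 0.0))

maturities = [1.0, 3.0, 5.0]                # N = 3 basis instruments
scenario_rates = [0.03, 0.04, 0.05, 0.06]   # S = 4 scenarios (flat curves)
time_steps = [0.0, 0.5, 1.0]                # T = 3 time steps

# M[i][j][t]: MtF value of instrument i in scenario j at time step t;
# each M[i] is one S x T MtF table of the Cube.
M = [[[zero_price(r, m - t) for t in time_steps]
      for r in scenario_rates]
     for m in maturities]

assert len(M) == 3 and len(M[0]) == 4 and len(M[0][0]) == 3
```

Each M[i] corresponds to one MtF table Mi; stacking the N tables gives the Cube described above.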
Step 4: Producing the Portfolio
MtF Table
Key to the MtF framework is the premise that a MtF Cube is generated independently
of portfolio holdings. Any portfolio or portfolio regime (strategy) can be represented by
mapping the MtF Cube into static or dynamically changing portfolio holdings. The
regimes may be predetermined, such as a buy-and-hold strategy, or they may be
conditional upon the scenario-based MtF values, such as a delta-hedging strategy.
Conditional regimes are used to capture liquidity risk. This chapter discusses how
multiple portfolio MtF tables are produced from a single MtF Cube.

In Step 4 of the MtF framework, the MtF Cube is mapped into multiple portfolios or portfolio regimes over time. The result of this mapping is the creation of a portfolio MtF table containing the MtF values of a portfolio regime simulated across scenarios and time steps. The mapping incorporates the holdings in the instruments, which may be dynamic across scenarios and through time. Thus, a portfolio MtF table captures the MtF values of a portfolio regime—such as a roll-over strategy or an immunization strategy—in a manner similar to that used to capture the MtF values of a static portfolio. This mapping is completely independent of the generation of the MtF Cube and occurs strictly as a post-processing step in the post-Cube stage.

A portfolio MtF table, MR, associated with portfolio regime R has dimensions S x T, where S is the number of scenarios and T is the number of time steps. Each cell in the table contains the MtF value, mRjt, of portfolio regime R under scenario j at time step t. A value in the portfolio MtF table is a function g(•) of the values in the instrument MtF tables contained in the MtF Cube. In the case where actual financial products are contained in the MtF Cube, the values in the portfolio MtF table are a linear combination of the values in the instrument MtF tables, weighted by portfolio holdings in each scenario and time step. In this case, the portfolio MtF value in scenario j and time step t is

mRjt = Σ(i = 1,…,N) xRijt ⋅ mijt

where xRijt is the holding of instrument i under portfolio regime R for scenario j at time step t. The positions, xRijt, are constant across all scenarios and time steps only for a static buy-and-hold portfolio regime. In the general case, position size may be time and scenario dependent, thus enabling the risk/reward assessment of dynamically changing portfolios. The fourth step of the MtF framework maps basis instruments into the portfolio regimes, as illustrated by the mapping sequence in Figure 4.1. For now, we assume the basis instruments are actual financial products.
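For actual financial products, this weighted sum is straightforward to compute. The following sketch, with made-up cube values and holdings, shows mRjt = Σi xRijt ⋅ mijt for a static regime:

```python
# Sketch of Step 4 for actual financial products: the portfolio MtF value
# in scenario j and time step t is the holdings-weighted sum over the
# instrument MtF tables. The cube and holdings below are illustrative.

def portfolio_mtf(cube, holdings):
    """cube[i][j][t]: instrument MtF values; holdings[i][j][t]: regime positions.
    Returns the S x T portfolio MtF table m[j][t] = sum_i x[i][j][t] * m[i][j][t]."""
    n_inst = len(cube)
    n_scen = len(cube[0])
    n_time = len(cube[0][0])
    return [[sum(holdings[i][j][t] * cube[i][j][t] for i in range(n_inst))
             for t in range(n_time)]
            for j in range(n_scen)]

# A static buy-and-hold regime is the special case of constant holdings.
cube = [[[100.0, 101.0], [100.0, 99.0]],      # instrument 1: S = 2, T = 2
        [[50.0, 52.0], [50.0, 47.0]]]         # instrument 2
static = [[[2.0, 2.0], [2.0, 2.0]],           # 2 units of instrument 1
          [[1.0, 1.0], [1.0, 1.0]]]           # 1 unit of instrument 2
table = portfolio_mtf(cube, static)
assert table[0] == [250.0, 254.0] and table[1] == [250.0, 245.0]
```

A dynamic regime would simply supply scenario- and time-dependent holdings to the same function.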

Figure 4.1: Portfolio mapping sequence (financial products map into portfolio regimes)

It is useful to identify two general types of strategies for mapping the MtF Cube into portfolio holdings: predetermined regimes and conditional regimes.

Predetermined regimes

In a predetermined regime, portfolio strategies are independent of the contents of the MtF Cube, but may, nonetheless, change over time in a predetermined fashion. Under these regimes, the values in a portfolio MtF table are a straightforward linear combination of fixed quantities of the values of financial products. A static buy-and-hold strategy is simply a special case of a predetermined regime describing a schedule of portfolio holdings that remains constant for all scenarios and time steps.

As a slightly more complex example, consider the modeling of liquidity risk. In a standard approach, the change in mark-to-market value—a portfolio's unrealized profit and loss—serves as a proxy for its realized profit and loss or the actual liquidation of portfolio holdings. A liquidity-adjusted risk measure is often estimated by applying an ad hoc addition to a non-adjusted risk measure. The liquidity risk measure assessed in a standard risk framework neither incorporates the explicit passage of time nor allows for dynamically changing portfolio holdings. In contrast, the MtF framework allows for the explicit liquidation of holdings through the specification of a regime that liquidates the portfolio over a given time horizon.

For example, consider a position in a single bond. Differing portfolio liquidation assumptions can be assessed by the application of different predetermined portfolio regimes. The portfolio mapping sequence associated with two bond liquidation regimes is illustrated in Figure 4.2.

First, assume that the given bond position may be fully liquidated in a single day. In this case, the distribution of portfolio MtF values over the first time step represents the one-day market risk of the bond position, assuming complete liquidation on that date. A predetermined portfolio regime captures this liquidation strategy through a schedule that reduces the holdings in the bond from 100% to 0% in one day.

Next, assume that the bond position must be liquidated proportionately over a 10-day time horizon. The effect of illiquidity is to lock in exposure to market risk until the position is closed. A full 10-day distribution of MtF values must be considered in order to capture this impact. A predetermined portfolio regime captures this liquidation strategy through a schedule that reduces the holdings in the bond at a rate of 10% each day for 10 days (see page 43). Note that by dynamically mapping the individual bonds into appropriate liquidation regimes for a common 10-day time horizon, the risk of both liquid and illiquid bonds may now be assessed within a consistent framework.

A liquidation risk measure must often account for the reinvestment of any cash settlement in

Figure 4.2: Mapping of a bond into a portfolio liquidation strategy (one bond maps into a liquid and an illiquid liquidation regime)



Bond Liquidation Regimes

Consider a single bond (i = 1) with a MtF value of m1jt. Under portfolio regime A, the bond is assumed to be liquid; it can be 100% liquidated by t = 1. In this case, the portfolio MtF value at t < 1 is simply

mAjt = xA1jt ⋅ m1jt = 1 ⋅ m1jt = m1jt    for t < 1

and after full liquidation at t ≥ 1

mAjt = xA1jt ⋅ m1jt = 0 ⋅ m1jt = 0    for t ≥ 1

Under portfolio regime B, the bond is assumed to be illiquid; it can only be liquidated by 25% over each time step (t = 1,…,4). In this case, the portfolio MtF value from 0 ≤ t ≤ 4 can be represented as

mBjt = xB1jt ⋅ m1jt = (1 – 0.25 ⋅ t) ⋅ x1j0 ⋅ m1jt    for 0 ≤ t ≤ 4

and after full liquidation at t > 4

mBjt = xB1jt ⋅ m1jt = 0 ⋅ m1jt = 0    for t > 4

The instrument MtF table (values m1jt at t = 0, 2, 4 across scenarios) maps into the Regime A MtF table (percentage liquidated: 0% at t = 0, 100% at t = 2 and t = 4) and the Regime B MtF table (0% at t = 0, 50% at t = 2, 100% at t = 4).

Liquidation regimes
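The two regimes in the sidebar can be sketched directly as holdings schedules applied to an instrument MtF table. The bond MtF values below are illustrative, and only the remaining position value is tracked (settled cash and its reinvestment, discussed in the text, are left out):

```python
# Sketch of the two predetermined liquidation regimes applied to a single
# bond with illustrative MtF values m1[j][t] over t = 0,...,4.

def holdings_regime_A(t):
    """Liquid bond: fully liquidated by t = 1."""
    return 1.0 if t < 1 else 0.0

def holdings_regime_B(t, x0=1.0):
    """Illiquid bond: liquidated 25% per time step over t = 1,...,4."""
    return (1.0 - 0.25 * t) * x0 if t <= 4 else 0.0

m1 = [[100.0, 101.0, 102.0, 101.5, 103.0]]    # one scenario, t = 0,...,4
mA = [[holdings_regime_A(t) * m1[j][t] for t in range(5)] for j in range(1)]
mB = [[holdings_regime_B(t) * m1[j][t] for t in range(5)] for j in range(1)]

assert mA[0] == [100.0, 0.0, 0.0, 0.0, 0.0]
assert mB[0] == [100.0, 75.75, 51.0, 25.375, 0.0]
```

Regime B keeps the position exposed to market risk until t = 4, which is exactly the lock-in effect of illiquidity described above.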

Equity Liquidation Regimes—Gold Fields

It is May 3, 1999. The trend in the Gold Fields share price has been downward over the last two years. What is the Value-at-Risk of a position of 300,000 shares of Gold Fields, the second largest gold producer in South Africa? Standard Value-at-Risk analysis at a 95% confidence level indicates a probable loss of 5%, assuming that the entire position can be liquidated in the next 24 hours. Given that the average daily trading volume of Gold Fields on the Johannesburg Stock Exchange is only 175,000 shares, portfolio Value-at-Risk is increased due to reduced liquidity—but by how much?

A Mark-to-Future model

Scenarios on Gold Fields' market price and daily trading volume describe the dynamic nature of liquidity risk. Three sets of scenarios applicable to varying degrees of liquidity are matched with different position liquidation regimes to produce realistic estimates of the position risk.

1. Market Risk Scenario Set: First, a set of 100 one-day shock scenarios that take no account of the impact of liquidity is generated.

2. High Volatility Scenario Set: In general, as market volatility increases, liquidity evaporates. Five scenarios at multiple time horizons are generated based on the trading volumes in the five most volatile periods in Gold Fields' share price history.

3. Low Trading Volume Scenario Set: The impact of extremely low trading volumes is captured by bootstrapping 10 scenarios at multiple time horizons based on the 250 lowest trading volumes from Gold Fields' recent history.

Gold Fields: daily volume

The liquidation strategies

The cost of liquidity and the time to liquidate are modeled by liquidation strategies. A strategy describes how the position is unwound, with proceeds from the disposition settled into cash.

The fall of Gold Fields (Share price = 100 on April 14, 1997)

Category | Simulation | Scenario | Liquidation Strategy
Infinite Liquidity | Market risk only. No impact on liquidity risk. | Market Risk Scenario | Instantaneous Liquidation
Average Liquidity | Market risk with a reasonable amount of liquidity in the market. | Market Risk Scenario | Unconditional Liquidation
Limited Liquidity | Market risk with reduced liquidity, simulating the impact of liquidity under volatile markets. | Market Risk Scenario and High Volatility Scenario | Conditional Liquidation
Negligible Liquidity | Market risk under the worst case liquidity scenarios, simulating the impact of liquidity under extreme situations. | Market Risk Scenario and Low Trading Volume Scenario | Conditional Liquidation

Scenarios and liquidation strategies

1. Instantaneous Liquidation: This strategy assumes that infinite liquidity exists in the market. All 300,000 shares of Gold Fields are liquidated at the quoted market price in one day.

2. Unconditional Liquidation: 10% of the Gold Fields shares are sold each day over the next 10 days at the quoted market price.

3. Conditional Liquidation: Sale of Gold Fields is restricted to 20% of each day's trading volume, assuming that, at this rate, shares can be sold at the quoted market price.

Matching scenarios and strategies

Scenarios and liquidation strategies are specified for each of the four liquidity categories: infinite, average, limited and negligible (low) liquidity.

The model at work

When the liquidation strategies are applied and the portfolio is Marked-to-Future, each level of liquidity yields significantly different results. As liquidity is reduced, the liquidation period lengthens and the portfolio Value-at-Risk, as a percentage of portfolio value, increases. Market risk remains unchanged since the market risk scenario does not capture liquidity risk.

How much risk can be attributed to liquidity and how long would it take to liquidate the entire position? By graphing the liquidity-adjusted results over time, a liquidation time horizon can be inferred for a particular Value-at-Risk or vice versa.

Liquidity-adjusted Value-at-Risk
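The conditional liquidation strategy can be sketched as a volume-capped unwind, in which the liquidation horizon depends on the scenario's simulated volumes. The volume path below is a hypothetical low-volume scenario, not Gold Fields data:

```python
# Sketch of conditional liquidation: daily sales are capped at 20% of that
# scenario's simulated trading volume, so the time to close the position
# varies by scenario. Volumes are illustrative.

def conditional_liquidation(position, volumes, cap=0.20):
    """Return the remaining shares after each day, selling at most cap * volume."""
    remaining, path = position, []
    for v in volumes:
        remaining = max(0.0, remaining - cap * v)
        path.append(remaining)
    return path

# A hypothetical volume scenario for a 300,000-share position.
volumes = [175_000, 150_000, 120_000, 175_000, 175_000, 175_000, 175_000]
path = conditional_liquidation(300_000, volumes)

assert abs(path[0] - 265_000) < 1e-6   # day 1: sell 20% of 175,000 = 35,000
assert abs(path[-1] - 71_000) < 1e-6   # position still open after 7 days
```

Running the same schedule against each volume scenario in the Cube yields the scenario-dependent liquidation horizons that the liquidity-adjusted Value-at-Risk graph summarizes.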

Figure 4.3: Mapping of products for an attribution analysis (products 1 to n map into previous-day holdings and current holdings)

addition to the bond liquidation mechanism just described. As the settlement amount depends upon the bond MtF values under each scenario and time step, this represents an example of a conditional regime; conditional regimes are described in the next section.

As a second example of a predetermined regime, consider an attribution analysis for a portfolio over a given time period (see page 47). Standard risk/reward frameworks presume that the portfolio holdings do not change; the impact on the portfolio profit and loss due to market risk factor changes over this period is captured through the appropriate selection of scenarios for the market risk factor levels of the current and previous periods. In addition to these factors, however, in the MtF framework, the profit and loss impact due to time decay can be captured by explicitly incorporating the passage of time, and the impact due to position change by applying different portfolio regimes. The portfolio mapping sequence associated with an attribution analysis is illustrated in Figure 4.3.

Conditional regimes

In a conditional regime, the portfolio strategies are dependent on the contents of the MtF Cube, M, or values that may be derived from the MtF Cube. Under these regimes, the quantity of holdings is a function of the MtF values for given scenarios and time steps. The portfolio holding, xRijt, of instrument i under regime R for scenario j at time step t is defined by the function

xRijt = g(M) = g(f(u1jt, u2jt, u3jt, …, uKjt))

where g(•) represents a function that maps the contents of the MtF Cube into the portfolio holdings appropriate for each scenario and time step. Notice that portfolio holdings are a function of the MtF values contained in the MtF Cube which, in turn, are a deterministic function of the risk factor levels realized under each scenario and time step.

As an example, consider a straightforward Treasury bill roll-over strategy. At the maturity of a Treasury bill currently held in a portfolio, the proceeds from its notional amount are reinvested in a second Treasury bill. The actual quantity of the second Treasury bill purchased is a function of its MtF value at the maturity date of the current Treasury bill. At the maturity of the second Treasury bill, the proceeds are reinvested into a third Treasury bill in an amount that is a function of its MtF value at that second maturity date. The portfolio mapping sequence is illustrated in Figure 4.4.

Each scenario path through time dictates a different total return for the roll-over strategy as differing quantities of each Treasury bill will be purchased (see page 48).

As a second example, consider a delta-hedging regime applied to a portfolio consisting of an equity option, the underlying stock and a cash account. At specified dates, the portfolio is dynamically rebalanced to delta hedge the option. This is achieved by acquiring holdings in the stock (funded by the cash account) sufficient

Attribution Analysis

Consider the impact of risk factor and position changes between the previous day (t = 0) and today (t = 1) on a portfolio containing a single financial product (i = 1) with MtF values of m1jt.

In addition to capturing the impact on profit and loss due to market risk factors, the MtF framework enables the capture of the impact on profit and loss due to time decay and position change. Time decay is modeled by explicitly incorporating the passage of time, while position change is modeled by applying different portfolio regimes. In this example, two scenarios are utilized: the risk factor levels from the previous day (j = 0) and the risk factor levels from today (j = 1).

Individual portfolio MtF values are calculated based on the mapping of two position schedules associated with the change in position from t = 0 until t = 1. Portfolio regime A consists of a static portfolio based strictly on the previous day's positions, or xA1jt = xA1, for all j and t. Portfolio regime B consists of a static portfolio based strictly on today's positions, or xB1jt = xB1, for all j and t.

Attribution factor | Portfolio regime | Portfolio MtF value
Previous day mark-to-market | A | mA00 = xA1 ⋅ m100
Change due to market factors | A | mA10 = xA1 ⋅ m110
Change due to time decay | A | mA01 = xA1 ⋅ m101
Change due to position change | B | mB00 = xB1 ⋅ m100
Total profit and loss attribution | B | mB11 = xB1 ⋅ m111

Portfolio MtF values for an attribution analysis

The mark-to-market value from the previous day, as well as the change in mark-to-market value due to changes in risk factors, time decay and position change, are appropriate for attribution or backtesting analysis. The impact on profit and loss of the attribution analysis is the difference between the portfolio MtF values under the two regimes and the portfolio mark-to-market value of the previous day.

Attribution analysis schema (instrument MtF values m1jt under yesterday's and today's risk factors map, via yesterday's and today's positions, into the Regime A and Regime B MtF tables)

Treasury Bill Roll-over Regime

At the t = 1 maturity of a current Treasury bill, the proceeds of notional n are reinvested in a second Treasury bill that matures at t = 3. At the t = 3 maturity of the second Treasury bill, the proceeds are reinvested into a third Treasury bill maturing at t = 5. While the roll-over strategy is known at each roll-over date, the actual holdings in the three Treasury bills (i = 1, 2, 3) are a function of their MtF values (m1jt, m2jt, m3jt).

Prior to the first roll-over date (t < 1), the MtF value of the portfolio regime can be represented by the linear function

mRjt = xR1jt ⋅ m1jt = 1 ⋅ m1jt = m1jt    for t < 1

From the first roll-over date (t = 1) until just prior to the second roll-over date (t < 3), the MtF value of the dynamically rebalanced portfolio can be represented by the non-linear function

mRjt = xR2jt ⋅ m2jt = (n / m2j1) ⋅ m2jt    for 1 ≤ t < 3

From the second roll-over date (t = 3) onward, the MtF value of the portfolio can be represented as the non-linear function

mRjt = xR3jt ⋅ m3jt = (n / (m2j1 ⋅ m3j3)) ⋅ m3jt    for t ≥ 3

The second equation illustrates that at the first reset date (t = 1) the position in the first Treasury bill rolls into a second Treasury bill (with maturity at the next roll-over date, t = 3) in an amount determined by the MtF value of the second Treasury bill, m2j1, under each appropriate scenario. The third equation illustrates that the amount of dynamic rebalancing at the next roll-over date (t = 3) depends on the MtF value of the third Treasury bill at that date.
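A minimal sketch of the roll-over regime, under the assumption that the first bill carries the notional of 100 while the second and third are unit-notional zeros; all prices are illustrative:

```python
# Sketch of the Treasury bill roll-over regime: holdings in bills i = 1, 2, 3
# are conditional on the MtF values observed at the roll-over dates t = 1, 3.

def rollover_mtf(m1, m2, m3, t, j, n=100.0):
    """Portfolio MtF value m_Rjt for scenario j at time step t."""
    if t < 1:                          # still holding the first bill
        return m1[j][t]
    if t < 3:                          # notional n reinvested at price m2[j][1]
        return (n / m2[j][1]) * m2[j][t]
    # proceeds n / m2[j][1] reinvested again at price m3[j][3]
    return (n / (m2[j][1] * m3[j][3])) * m3[j][t]

# One illustrative scenario path.
m1 = [[97.0, 100.0]]                         # first bill (notional 100), t = 0, 1
m2 = [[0.93, 0.96, 0.98, 1.00]]              # unit zero maturing t = 3
m3 = [[0.90, 0.91, 0.93, 0.95, 0.97, 1.00]]  # unit zero maturing t = 5

v1 = rollover_mtf(m1, m2, m3, t=2, j=0)      # (100 / 0.96) * 0.98
v2 = rollover_mtf(m1, m2, m3, t=5, j=0)      # 100 / (0.96 * 0.95)
assert abs(v1 - 100.0 / 0.96 * 0.98) < 1e-9
assert abs(v2 - 100.0 / (0.96 * 0.95)) < 1e-9
```

Evaluating this function over every scenario and time step fills the portfolio MtF table for the roll-over regime; the scenario dependence enters only through the observed prices m2j1 and m3j3.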

to make the overall portfolio delta neutral. The portfolio mapping sequence for this portfolio regime is illustrated in Figure 4.5.

Position holdings change as a function of the MtF values and the MtF deltas of each product. To execute a delta-hedging strategy, the MtF Cube must contain other MtF measures in addition to MtF values. Specifically, the MtF deltas for each of the financial products are required. The impact of delta hedging on the risk and reward can be captured through a conditional portfolio regime (see page 50).

Additionally, changes in the MtF values of the cash account reveal the funding requirements in

Figure 4.4: Mapping Treasury bills into a portfolio roll-over strategy (the t = 1, t = 3 and t = 5 Treasury bills map into the roll-over regime)



Figure 4.5: Mapping financial products into a portfolio delta-hedging strategy (equity option, underlying equity and cash account map into the delta-hedging regime)

each scenario, thus determining the associated funding and liquidity risk. Finally, the effectiveness of different trading regimes, such as delta and delta-gamma hedging strategies, can be easily compared.

A conditional strategy is directly dependent on the contents of the MtF Cube and, thus, is a deterministic function of the scenarios that underlie it. A complex multi-stage, multi-period stochastic problem has become a single-stage, multi-period problem that can be readily analyzed.

Mapping abstract basis instruments into portfolios

If the full set of financial products of interest is not contained in the pre-computed MtF Cube, an intermediate mapping is required to produce the portfolio MtF values. This intermediate mapping step determines a MtF value for a given financial product as a function (linear or non-linear) of the MtF values of the basis instruments contained in the MtF Cube. The MtF value of a portfolio containing this product is then simply a function of the product MtF value, which itself is a function of the MtF values of the basis instruments. The mapping sequence in this case includes the intermediate step, as illustrated in Figure 4.6.

There are two primary benefits to incorporating this intermediate step. The first is that all financial products of interest may not be known a priori; this step provides a mechanism for evaluating them in the post-Cube stage rather than requiring additional simulation in the pre-Cube stage. The second benefit is that the judicious choice of basis instruments and corresponding mapping rules can significantly reduce the dimensions of the MtF Cube and, accordingly, the computational requirements in the pre-Cube stage.

Consider a financial institution with an inventory of 100,000 vanilla swaps (of a common currency) and the desire to produce a risk report by simulating across 1,000 scenarios and over 10 time steps. Without this intermediate mapping step, the dimensions of the required MtF Cube are 100,000 instruments x 1,000 scenarios x 10 time steps. The number of cells contained in the MtF Cube and, hence, the number of revaluations required is one billion.

The inclusion of an intermediate step that maps abstract basis instruments into the 100,000 swaps reduces the computational requirements of this exercise significantly. If we assume that the swaps pay cash flows on 5,000 different cash flow dates (corresponding to daily payments over the next 20 years), then the number of basis instruments can be reduced to 5,000 without loss of accuracy.

Figure 4.6: Portfolio mapping sequence with abstract basis instruments (basis instruments map into products, which map into portfolio regimes)
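The cube-size arithmetic above, together with the idea that a product MtF value is simply a function of basis-instrument MtF values, can be sketched as follows (the coupon bond and its weights are hypothetical):

```python
# Sketch of the intermediate mapping step: products are functions of basis
# instruments, so only the basis instruments need to be simulated.

def product_mtf(basis_mtf, weights):
    """Linear mapping of basis-instrument MtF values (one (j, t) cell each)
    into a product MtF value: sum_i w_i * m_i."""
    return sum(w * m for w, m in zip(weights, basis_mtf))

# Cube-size arithmetic from the text: mapping 100,000 swaps onto 5,000
# cash flow dates cuts revaluations from 1 billion to 50 million.
assert 100_000 * 1_000 * 10 == 1_000_000_000
assert 5_000 * 1_000 * 10 == 50_000_000
assert 1_000_000_000 // 50_000_000 == 20

# A hypothetical 5% annual coupon bond on three zero coupon basis instruments.
zeros = [0.9704, 0.8869, 0.7788]           # illustrative basis MtF values
bond = product_mtf(zeros, [5.0, 5.0, 105.0])
assert abs(bond - (5.0 * 0.9704 + 5.0 * 0.8869 + 105.0 * 0.7788)) < 1e-9
```

The mapping is evaluated in the post-Cube stage, so a new product only requires new weights, not a new simulation.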



Delta-Hedging Regime

A delta-hedging regime is applied to a portfolio consisting of positions in an equity option (i = 1), the underlying equity (i = 2) and a cash account (i = 3). At t = 0 the portfolio contains a single position in the option and a position in the underlying equity such that the overall portfolio is initially delta neutral. The cash account at t = 0 has a position of zero. Under this conditional delta-hedging regime, at t = 2 and t = 4 the portfolio is dynamically rebalanced to delta hedge the option. This is achieved by acquiring enough holdings in the equity (funded by the cash account) so that the overall portfolio becomes delta neutral again.

The MtF deltas for each of the three financial products are defined as ∆ijt (i = 1, 2, 3).

Prior to the first rebalancing date (t = 2), the MtF value of the portfolio can be represented as

mRjt = xR1jt ⋅ m1jt + xR2jt ⋅ m2jt = m1jt – ∆1j0 ⋅ m2jt    for t ≤ 2

From the first rebalancing date (t = 2) until just prior to the second rebalancing date (t < 4), the MtF value of the dynamically rebalanced portfolio can be represented as

mRjt = xR1jt ⋅ m1jt + xR2jt ⋅ m2jt + xR3jt ⋅ m3jt = m1jt – ∆1j2 ⋅ m2jt + (∆1j2 – ∆1j0) ⋅ (m2j2 / m3j2) ⋅ m3jt    for 2 ≤ t < 4

From the second rebalancing date (t = 4) onward, the MtF value of the dynamically rebalanced portfolio can be represented as

mRjt = xR1jt ⋅ m1jt + xR2jt ⋅ m2jt + xR3jt ⋅ m3jt = m1jt – ∆1j4 ⋅ m2jt + ((∆1j2 – ∆1j0) ⋅ (m2j2 / m3j2) + (∆1j4 – ∆1j2) ⋅ (m2j4 / m3j4)) ⋅ m3jt    for t ≥ 4

The portfolio MtF values in this example not only enable the assessment of a delta-managed portfolio but, additionally, enable the assessment of funding risk by inspection of the cash account MtF values.
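The regime in the sidebar can be sketched as a function of illustrative MtF values and MtF deltas. The sign convention (the option hedged by a short equity position, with rebalancing trades settled through the cash account) follows the sidebar equations; all numbers are made up:

```python
# Sketch of the conditional delta-hedging regime: equity holdings are reset
# to the MtF delta at t = 2 and t = 4, with trades funded by the cash account.

def delta_hedge_mtf(m1, m2, m3, delta, t, j):
    """Portfolio MtF value under the delta-hedging regime, per scenario j."""
    if t <= 2:
        return m1[j][t] - delta[j][0] * m2[j][t]
    # units of the cash account bought at the t = 2 rebalance
    cash_units = (delta[j][2] - delta[j][0]) * m2[j][2] / m3[j][2]
    if t < 4:
        return m1[j][t] - delta[j][2] * m2[j][t] + cash_units * m3[j][t]
    # additional cash-account units bought at the t = 4 rebalance
    cash_units += (delta[j][4] - delta[j][2]) * m2[j][4] / m3[j][4]
    return m1[j][t] - delta[j][4] * m2[j][t] + cash_units * m3[j][t]

m1 = [[5.0, 5.5, 6.2, 6.0, 7.1]]            # option MtF values, t = 0,...,4
m2 = [[100.0, 101.0, 103.0, 102.0, 105.0]]  # underlying equity
m3 = [[1.00, 1.01, 1.02, 1.03, 1.04]]       # cash account
delta = [[0.50, None, 0.55, None, 0.60]]    # MtF deltas at the rebalance dates

v = delta_hedge_mtf(m1, m2, m3, delta, t=3, j=0)
assert abs(v - (6.0 - 0.55 * 102.0 + (0.55 - 0.50) * 103.0 / 1.02 * 1.03)) < 1e-9
```

Because the MtF deltas are read from the Cube, the whole multi-period hedging strategy remains a deterministic post-Cube computation.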

The dimensions of the MtF Cube are reduced to 5,000 instruments x 1,000 scenarios x 10 time steps, and the number of revaluations is reduced to 50 million, a decrease in the computational effort by a factor of 20.

As an example, a series of zero coupon all-in basis instruments can be mapped into an OTC interest rate swap. Figure 4.7 illustrates the sequence used to map the MtF values of these basis instruments into the MtF value of the swap.

Figure 4.7: Mapping zero coupon basis instruments into a swap (interest rate curve nodes at t = 1, 2, 3 drive the t = 1, 2, 3 zero bonds, which map into the swap)



Figure 4.8: Mapping zero coupon basis instruments into a zero bond (the t = 2 and t = 3 zero bonds interpolate a zero bond maturing at t = t*)

The MtF value of the fixed leg of the swap can be determined by a static linear function of the values of the zero coupon basis instruments, while the MtF value of the floating leg can be determined by a dynamic non-linear function of the values of the same basis instruments. The mapping of the basis instruments into the floating leg changes at each reset date and is a function of their MtF value under each scenario and time step. This effect can be captured through a conditional portfolio regime (see page 52).

If the cash flow dates of the swap do not coincide with the maturity dates of the basis instruments, a perfect mapping can be achieved by incorporating a second component in the intermediate mapping. The function of this additional mapping component is roughly equivalent to the interpolation method used to value cash flows that fall between two nodes on an interest rate term structure. The "interpolation" component of the mapping is illustrated in Figure 4.8.

Including the interpolation component in the mapping can further reduce the number of basis instruments required in the MtF Cube. Consider the 100,000 swaps described above. Assuming that the institution utilizes a 10-node term structure in the pricing of these swaps, the number of basis instruments can be reduced to 10 without loss of accuracy. The dimensions of the new MtF Cube are reduced to 10 instruments x 1,000 scenarios x 10 time steps. Correspondingly, the number of revaluations is reduced dramatically to 100,000, a decrease in the computational effort by a factor of 10,000.

Perfect mapping can still be achieved even if the swaps pay cash flows that do not fall on the term structure node points. As all independent interest rate risk factors are represented as basis instruments in the MtF Cube, the number of basis instruments need not exceed the number of nodes in the term structure (see page 53).

Mapping Basis Instruments into a Swap

Consider a portfolio containing a swap with maturity at t = 3, notional n, and a fixed rate, rf, corresponding to a term of t = 1. The MtF value of a portfolio A, containing the fixed leg of the swap, is a linear, static function of the zero coupon basis instruments (i = 1, 2, 3) associated with time steps t = 1, 2, 3

mAjt = g(m1jt, m2jt, m3jt) = rf ⋅ n ⋅ m1jt + rf ⋅ n ⋅ m2jt + (1 + rf) ⋅ n ⋅ m3jt

for all scenarios and time steps.

The floating leg mappings change at each reset date (in this example, at t = 1, 2) and are a function of the MtF value of the appropriate zero coupon basis instrument under each scenario. At time t = –1, the rate has already been preset at r–1.

Prior to the first reset date (t < 1), the MtF value of a second portfolio, B, containing the floating leg can be represented using the fixed notional representation of the swap in a linear function

mBjt = g(m1jt, m2jt, m3jt) = (1 + r–1) ⋅ n ⋅ m1jt    for t < 1

From the first reset date (t = 1) until just prior to the second reset date (t < 2), the MtF value of the portfolio containing the floating leg can be represented as the non-linear function

mBjt = g(m1jt, m2jt, m3jt) = r–1 ⋅ n ⋅ m1jt + (1 / m2j1) ⋅ n ⋅ m2jt    for 1 ≤ t < 2

The first term in this expression is the value of the reinvested first coupon payment at t = 1. The second term is the value of the remaining swap payments (again represented using the fixed notional representation). From the second reset date (t = 2) onward, the MtF value of the portfolio containing the floating leg can be represented as the non-linear function

mBjt = g(m1jt, m2jt, m3jt) = r–1 ⋅ n ⋅ m1jt + ((1 / m2j1) – 1) ⋅ n ⋅ m2jt + (1 / m3j2) ⋅ n ⋅ m3jt    for t ≥ 2

Note that the dynamic mapping of the basis instruments into the swap floating leg is equivalent to a strategy of rolling over zero coupon bonds at each reset date.

Interpolating Abstract Basis Instruments

Consider two zero coupon basis instruments (i = 1 and i = 2), with maturities of t = 2 and t = 3, respectively, that must be mapped into a single zero coupon bond that matures at 2 < t* < 3.

One possible mapping function is based on linear interpolation of a term structure at the level of the discount factor. In this case, the MtF value of a portfolio A containing the zero coupon bond is a static linear function of the basis instruments

mAjt = g(m1jt, m2jt) = ((t3 – t*) / (t3 – t2)) ⋅ m1jt + ((t* – t2) / (t3 – t2)) ⋅ m2jt

for all scenarios and time steps.

A second possible mapping function is based on linear interpolation of a term structure at the level of the continuously compounded zero rate. In this case, the MtF value of another portfolio B containing the zero coupon bond is a static non-linear function of the basis instruments

mBjt = g(m1jt, m2jt) = m1jt^(((t3 – t*) / (t3 – t2)) ⋅ (t* / t2)) ⋅ m2jt^(((t* – t2) / (t3 – t2)) ⋅ (t* / t3))

for all scenarios and time steps.

Assume the term structure of continuously compounded interest rates is defined by three independent nodes corresponding to terms one, three and five. The rates associated with the intermediate nodes (t = 2 and t = 4) are determined by linear interpolation. For example, the rate associated with term t = 2 is

0r2 = ((t3 – t2) / (t3 – t1)) ⋅ 0r1 + ((t2 – t1) / (t3 – t1)) ⋅ 0r3 = ((3 – 2) / (3 – 1)) ⋅ 0.03 + ((2 – 1) / (3 – 1)) ⋅ 0.04 = 0.035

Term structure of interest rates

term t | rate 0rt (%)
1 | 3.0
2 | 3.5
3 | 4.0
4 | 4.5
5 | 5.0

The value mB00 of a four-period bond with notional 100 USD and coupons of 10 USD payable every two periods is

mB00 = 10 ⋅ e^(–0r2 ⋅ t2) + 110 ⋅ e^(–0r4 ⋅ t4) = 10 ⋅ e^(–0.035 ⋅ 2) + 110 ⋅ e^(–0.045 ⋅ 4) = 101.20

A set of zero coupon basis instruments associated with the independent nodes of the term structure have values

m100 = e^(–0r1 ⋅ t1) = e^(–0.03 ⋅ 1) = 0.9704
m300 = e^(–0r3 ⋅ t3) = e^(–0.04 ⋅ 3) = 0.8869
m500 = e^(–0r5 ⋅ t5) = e^(–0.05 ⋅ 5) = 0.7788

In the first mapping approach, the term structure is linearly interpolated at the level of the discount factor. The value of the bond can therefore be determined as a function of the basis instrument values. In this case, the mapping is strictly a linear combination of the basis instruments. The advantage of a linear function, which limits the complexity of the mapping, comes at the expense of perfect replication; the bond is mispriced by 0.3%.

In the second mapping approach, the term structure is linearly interpolated at the level of the continuously compounded return. The value of the bond can be determined as another function of the basis instrument values.

mB00 = 10 ⋅ (((t3 – t2) / (t3 – t1)) ⋅ m100 + ((t2 – t1) / (t3 – t1)) ⋅ m300) + 110 ⋅ (((t5 – t4) / (t5 – t3)) ⋅ m300 + ((t4 – t3) / (t5 – t3)) ⋅ m500)
     = 10 ⋅ (((3 – 2) / (3 – 1)) ⋅ 0.9704 + ((2 – 1) / (3 – 1)) ⋅ 0.8869) + 110 ⋅ (((5 – 4) / (5 – 3)) ⋅ 0.8869 + ((4 – 3) / (5 – 3)) ⋅ 0.7788)
     = 100.90

First mapping approach

In this case, the mapping is a non-linear combination of the basis instruments. This mapping is more complex; however, perfect replication of the bond can be achieved.

mB00 = 10 ⋅ m100^(((t3 – t2) / (t3 – t1)) ⋅ (t2 / t1)) ⋅ m300^(((t2 – t1) / (t3 – t1)) ⋅ (t2 / t3)) + 110 ⋅ m300^(((t5 – t4) / (t5 – t3)) ⋅ (t4 / t3)) ⋅ m500^(((t4 – t3) / (t5 – t3)) ⋅ (t4 / t5))
     = 10 ⋅ 0.9704^(((3 – 2) / (3 – 1)) ⋅ (2 / 1)) ⋅ 0.8869^(((2 – 1) / (3 – 1)) ⋅ (2 / 3)) + 110 ⋅ 0.8869^(((5 – 4) / (5 – 3)) ⋅ (4 / 3)) ⋅ 0.7788^(((4 – 3) / (5 – 3)) ⋅ (4 / 5))
     = 101.20

Second mapping approach

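The two mapping approaches can be checked numerically. The sketch below (Python; the variable names are ours, not part of the MtF specification) reprices the four-period bond both ways from the term structure above:

```python
import math

# Term structure: continuously compounded zero rates by term (in periods)
rates = {1: 0.03, 2: 0.035, 3: 0.04, 4: 0.045, 5: 0.05}

# MtF values of the basis instruments: zero-coupon bonds with terms 1, 3 and 5
m = {t: math.exp(-rates[t] * t) for t in (1, 3, 5)}  # 0.9704, 0.8869, 0.7788

def linear_map(t_star, ta, tb):
    """First approach: linear interpolation at the level of the discount factor."""
    w = (tb - t_star) / (tb - ta)
    return w * m[ta] + (1.0 - w) * m[tb]

def rate_map(t_star, ta, tb):
    """Second approach: linear interpolation of the zero rate, equivalent to a
    geometric (non-linear) combination of the basis instrument values."""
    w = (tb - t_star) / (tb - ta)
    return m[ta] ** (w * t_star / ta) * m[tb] ** ((1.0 - w) * t_star / tb)

# Four-period bond: 10 USD coupon at t = 2, notional plus coupon (110 USD) at t = 4
price_linear = 10 * linear_map(2, 1, 3) + 110 * linear_map(4, 3, 5)
price_rate = 10 * rate_map(2, 1, 3) + 110 * rate_map(4, 3, 5)
print(f"{price_linear:.2f}")  # 100.90 (mispriced by about 0.3%)
print(f"{price_rate:.2f}")    # 101.20 (perfect replication)
```

The rate-interpolating map reproduces the exact price because each cash flow's discount factor e^(-r·t) is recovered exactly from the interpolated rate.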

Step 5: Producing the Desired Risk/Reward Measures
The portfolio MtF table resulting from the mapping of the MtF Cube into a given
portfolio or portfolio regime contains a full description of future uncertainty. Each cell
of the portfolio MtF table contains a portfolio MtF value for a given scenario and time
step. The actual risk and reward measures chosen to characterize this uncertainty can
be arbitrarily defined and incorporated strictly in the post-Cube stage. In this chapter,
we present a selection of market, credit and liquidity risk measures, as well as reward
and performance measures, that can be calculated in the post-Cube stage.

In Step 5, risk/reward measures as well as other quantitative portfolio analytics are applied to the portfolio MtF table. Four general categories of analytics may be applied to

• calculate distribution descriptors from the portfolio MtF table

• transform MtF values into other MtF measures

• trade off risk and reward

• incorporate specific risk.

A key construct underlying the calculation of the distribution descriptors associated with various risk/reward measures is that the underlying input for each is the same MtF Cube. Thus, the MtF methodology provides a unifying framework for the integration of market risk, credit risk and liquidity risk measures. A variety of summary statistics can be calculated by aggregating the MtF values, $m_{ijt}^R$, across the scenario and/or time step dimensions of a portfolio MtF table. Thus, in addition to standard measures that may not account for the passage of time, the MtF framework incorporates forward-looking measures based on future distributions of MtF values. In some cases, additional information, such as scenario probabilities or counterparty default probabilities, is required. In others, the MtF values must be transformed into other measures before being used to calculate risk/reward measures.

Calculating distribution descriptors

Typical post-processing applications based on untransformed MtF values calculate statistics characterizing the market risk and liquidity risk associated with particular portfolios or portfolio regimes. Selected measures associated with these two risk classes are summarized in Table 5.1 and Table 5.2. In some cases, the applications require the scenario weightings as defined by the S × 1 probability vector p, where each cell $p_j$, (j = 1, 2, …, S), contains the probability associated with each state j. The formulations are based on the notation summarized in Table N.4 (see page 80).

The MtF Cube can be used for the assessment of reward as well as for risk measurement. Table 5.3 provides selected reward measures applied as post-processing applications.
56 Step 5: Producing the desired risk/reward measures

Table 5.1: Selected market risk measures

Variance:
$$(\sigma_t^R)^2 = \sum_{j=1}^{S} p_j \left( m_{jt}^R - \sum_{j=1}^{S} p_j\, m_{jt}^R \right)^2$$

Value-at-Risk (confidence level α):
$$\mathrm{VaR}_t^R(\alpha):\quad \Pr\left\{ (m_{0t}^R - m_{jt}^R) \ge \mathrm{VaR}_t^R(\alpha) \right\} = 1 - \alpha$$

Expected shortfall (confidence level α):
$$E[S_{jt}^R](\alpha) = \frac{1}{1 - \alpha} \sum_{j=1}^{S} p_j \left( m_{0t}^R - m_{jt}^R - \mathrm{VaR}_t^R(\alpha) \right)^+$$

Regret (risk-preference adjusted expected downside):
$$\lambda E[D_t^R] = \lambda \sum_{j=1}^{S} p_j \left( m_{jt}^R - \tau_{jt} \right)^-$$

Put value (value of downside):
$$\mathrm{Put}_t^R = \left( \sum_{j=1}^{S} \rho_j \left( m_{jt}^R - \tau_{jt} \right)^- \right) \cdot m_{1jt}$$

Table 5.2: Selected liquidity risk measure

Liquidity-adjusted VaR (period t*, confidence level α):
$$\mathrm{VaR}_{t,t^*}^R(\alpha):\quad \Pr\left\{ (m_{0t^*}^R - m_{jt}^R) \ge \mathrm{VaR}_{t,t^*}^R(\alpha) \right\} = 1 - \alpha$$
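As an illustration of these post-Cube calculations, the sketch below computes the variance, VaR and expected-shortfall formulations of Table 5.1 from a single time-step column of a hypothetical portfolio MtF table; the scenario set and its distribution are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
S = 10_000                          # number of scenarios
p = np.full(S, 1.0 / S)             # scenario probability vector p (equally weighted here)
m0 = 100.0                          # current portfolio value m_0t
m = rng.normal(101.0, 5.0, S)       # portfolio MtF values m_jt at time t (illustrative)

mean = p @ m
variance = p @ (m - mean) ** 2      # variance formula of Table 5.1

# 95% VaR: loss threshold exceeded with probability 1 - alpha
alpha = 0.95
losses = m0 - m
var95 = np.quantile(losses, alpha)

# Expected shortfall as formulated in Table 5.1: probability-weighted losses
# in excess of VaR, scaled by 1/(1 - alpha)
es95 = (p @ np.maximum(losses - var95, 0.0)) / (1.0 - alpha)

print(f"mean={mean:.2f}  variance={variance:.2f}  VaR={var95:.2f}  ES={es95:.2f}")
```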

A detailed example

Consider a portfolio in which a leveraged exposure has been taken on the DAX index. The objective is to describe the risk profile of the portfolio based on a 30-day time horizon.

The overall position has a mark-to-market value of 5.0 million EUR and consists of the following products:

• a cash account of 4.0 million EUR

• a short "at-the-money" straddle (short positions in both a call and a put on the DAX)

• a long position on the DAX such that the overall position is delta neutral.

Figure 5.1 illustrates the values of the overall position and each of its components simulated over moves in the DAX ranging from a 10% decline to a 10% increase; zero represents the current level of the DAX.

Since the overall position is delta neutral, the value of the portfolio is relatively insensitive to local changes in the DAX. As such, a simple parametric approach to measuring VaR (based on delta-equivalent exposures) produces a measure that suggests negligible risk.

Table 5.3: Selected reward measures

Expected profit and loss (P&L):
$$E_{t^*}[\mathrm{P\&L}^R] = \sum_{j=1}^{S} p_j \left( m_{jt^*}^R - m_{0t}^R \right)$$

Expected return:
$$R_{t^*}^R = \sum_{j=1}^{S} \frac{p_j\, m_{jt^*}^R}{m_{0t}^R} - 1$$

Expected upside:
$$E[U_t^R] = \sum_{j=1}^{S} p_j \left( m_{jt}^R - \tau_{jt} \right)^+$$

Call value (value of upside):
$$\mathrm{Call}_t^R = \left( \sum_{j=1}^{S} \rho_j \left( m_{jt}^R - \tau_{jt} \right)^+ \right) \cdot m_{1jt}$$

Figure 5.1: Mark-to-market of the portfolio over a range of DAX values

A couple of key factors are missing from this analysis. One, of course, is that this position is highly non-linear. For small changes in the DAX, the change in value of the overall position is minor: a 2.5% DAX move (in either direction) results in a 1.5% loss. For larger changes in the DAX, however, the change in the value of the overall position is dramatic: a 10% DAX move (in either direction) results in a 40% loss. Clearly, a simple parametric VaR measure cannot adequately describe the risk inherent in this position.

Another observation from Figure 5.1 is that the portfolio loses value for all DAX moves, whether upward or downward. Therefore, there must be other enticements for investors who take on this position that are not captured by changes in the DAX alone.

A second key factor missing from this analysis, as illustrated in Figure 5.2, is the fact that the portfolio has positive drift associated with the passage of time. The time decay inherent in the short straddle is such that, if everything else remains constant, the overall position will gain value over time.

Figure 5.2: Positive drift associated with the portfolio

In addition, the value of the short straddle is highly sensitive to changes in the implied

volatilities associated with the options on the underlying DAX. As the call and put values are positively correlated with implied volatilities, a volatility increase will result in a decrease in the value of the short straddle and, hence, of the overall position. In contrast, a decrease in implied volatilities will result in an increase in the value of the overall position.

Since a parametric approach to estimating VaR for this position is completely inadequate, consider instead a standard Monte Carlo approach designed to capture the portfolio's inherent non-linearity. In this approach, the value of the position is simulated over 100 scenarios containing instantaneous shocks to the relevant risk factors.

In Figure 5.3a, the profit and loss of the overall position, corresponding to 100 Monte Carlo-generated scenarios on 30-day changes in the value of the DAX (based on a volatility estimated from a time series of DAX history), is displayed along the vertical axis. In Figure 5.3b, post-processing is applied to produce a histogram of profit and loss.

Figure 5.3: Changes in portfolio MtM values across Monte Carlo scenarios: a) P&L by scenario; b) Histogram of P&L

The mean and 95% VaR measures for the 30-day time horizon are calculated to be losses of 1 million EUR and 4.5 million EUR, respectively. This is not surprising. Inspection of Figure 5.3a reveals that every scenario results in a loss, and thus there appears to be risk with no possibility of reward. A standard Monte Carlo approach captures the non-linearity but misses the reward derived from the positive drift that accrues from the passage of time. In addition, standard Monte Carlo approaches typically do not account for the risk associated with changes in implied volatilities.

Within the MtF framework, the passage of time is captured explicitly by scenarios that now represent the evolution of risk factors through time and not merely instantaneous shocks. Consider the same position and the same set of 100 scenarios triggered not instantaneously, but evolving daily over a 30-day time horizon. Figure 5.4 illustrates the MtF values of the portfolio simulated daily over the 30-day time horizon.

Figure 5.4: Portfolio MtF values over 100 scenario paths

Each line represents the value of the position over a given scenario path through time. The paths of MtF values over time provide much greater insights into the dynamics of the position and, hence, its risk/reward profile.
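The path-wise behaviour described here can be reproduced in miniature. The sketch below simulates 100 daily scenario paths of an index and marks a short straddle to future along each path. Black-Scholes is used as a stand-in pricing model and every parameter is invented, so this is illustrative only and not calibrated to the DAX example in the text:

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def straddle_value(s, k, r, sigma, tau):
    """Black-Scholes value of a call plus a put with common strike (illustrative)."""
    if tau <= 0:
        return abs(s - k)
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * tau) / (sigma * math.sqrt(tau))
    d2 = d1 - sigma * math.sqrt(tau)
    call = s * norm_cdf(d1) - k * math.exp(-r * tau) * norm_cdf(d2)
    put = k * math.exp(-r * tau) * norm_cdf(-d2) - s * norm_cdf(-d1)
    return call + put

# Invented parameters: spot, strike, rate, volatility, option maturity (years)
s0, k, r, sigma, t_opt = 100.0, 100.0, 0.03, 0.25, 0.5
steps, dt = 30, 1.0 / 365.0           # 30 daily time steps

random.seed(7)
paths = []
for _ in range(100):                  # 100 scenario paths of the index
    s, path = s0, []
    for step in range(1, steps + 1):
        z = random.gauss(0.0, 1.0)
        s *= math.exp((r - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        # MtF value of the short straddle: time decay works in its favour
        path.append(-straddle_value(s, k, r, sigma, t_opt - step * dt))
    paths.append(path)
```

Holding the index at its current level, the straddle's value shrinks as maturity approaches, so the short position drifts upward; large index moves in either direction dominate the decay and produce the dramatic losses visible in the tails of the path distribution.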

Note that the positive drift of the portfolio is captured as most scenario paths result in a modest profit. A number of scenario paths, however, result in a loss that, in contrast, can be quite dramatic. This is the risk inherent in the portfolio if the positions are static over the 30-day time horizon. The mean and 95% VaR measures in 30 days are calculated to be a gain of 250,000 EUR and a loss of 4.0 million EUR, respectively. There is a 2.5% chance that the portfolio will be in ruin, having lost all of its initial value of 5 million EUR.

The MtF framework allows us to take this analysis yet further. Suppose that the portfolio positions are not static but rather that the portfolio is managed over the 30-day horizon. Consider a portfolio regime designed to maintain delta neutrality of the overall position by trading in the DAX and funding the trades by the cash account. Figure 5.5 illustrates the MtF values of the dynamically rebalanced portfolio simulated daily over the 30-day time horizon.

Figure 5.5: Dynamic MtF values over 100 scenario paths

With the imposition of the delta-neutral regime, it is now possible to assess the risks of a managed portfolio. Under this regime, the mean and 95% VaR measures in 30 days are calculated to be a gain of 175,000 EUR and a loss of 750,000 EUR, respectively. There is a 2.5% chance that the portfolio will lose 20% of its initial value but no chance that the portfolio will be in ruin. Analysis of the managed portfolio reveals that delta hedging constrains the potential downside risk of the portfolio considerably, but at the expense of reduced mean performance.

This analysis can be taken even further if we consider the funding and market liquidity issues that may arise under each scenario and time step as quantities of the DAX must be purchased. Inspection of the changes in the level of the cash account required to fund the strategy provides a means for assessing funding risk.

Transforming MtF values

In certain applications, it is necessary to transform the product or portfolio MtF values prior to the calculation of risk/reward measures. The transformation can be any arbitrary function (linear or non-linear) applied to the portfolio MtF value.

A function that takes the maximum of the portfolio MtF value and zero transforms the MtF value to a MtF measure of counterparty credit exposure. In addition, if the portfolio is composed of multiple positions, the transformation may also account for netting rules and other applicable migration provisions. Note that for the measurement of counterparty credit exposure, there is no need to account for total return, and therefore no need to model cash flow reinvestment through the definition of all-in instruments.

Table 5.4 summarizes the transformations of portfolio MtF values required to provide some example credit exposure measures. Note that these transformation functions are based solely on the information residing in the portfolio MtF table. To calculate actual exposure, a simple transformation is applied to the portfolio MtF value under each scenario and time step. In the slightly more complex case of potential exposure, the transformation is based on all portfolio MtF values over the t ≤ t* ≤ T time horizon for a given scenario.

Table 5.4: Transformation of MtF values to credit exposure measures

Actual exposure:
$$AE_{jt}^R = \max\left[ m_{jt}^R,\, 0 \right]$$

Potential exposure:
$$PE_{jt}^R = \max_{t < t^* \le T} \left[ m_{jt^*}^R - AE_{jt}^R,\, 0 \right]$$

Total exposure:
$$TE_{jt}^R = AE_{jt}^R + PE_{jt}^R$$

Figure 5.6a illustrates the portfolio MtF values of a position with exposure to a given counterparty across three scenarios and over 10 time steps.

Figure 5.6: Transforming portfolio MtF values: a) MtF values; b) MtF actual exposures; c) MtF potential exposures; d) MtF total exposures

Figure 5.6b illustrates the transformation of portfolio MtF values into counterparty MtF actual exposures across three scenarios and over 10 time steps. Figure 5.6c and Figure 5.6d illustrate the transformations of portfolio MtF values into counterparty MtF potential exposures and MtF total exposures, respectively, across three scenarios and over 10 time steps.

In other cases, additional information such as counterparty default probabilities and recovery rates may be required. This information can be incorporated statically or dynamically in the transformation in the algorithms embedded in the post-processing application. A summary of selected counterparty credit risk measures is presented in Table 5.5.
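The transformations of Table 5.4 are simple array operations on the portfolio MtF table. A numpy sketch over an invented 3-scenario, 10-step table:

```python
import numpy as np

# Portfolio MtF values m[j, t]: 3 scenarios by 10 time steps (illustrative numbers)
rng = np.random.default_rng(1)
m = rng.normal(0.0, 10.0, size=(3, 10))

# Actual exposure: AE[j, t] = max(m[j, t], 0)
AE = np.maximum(m, 0.0)

# Potential exposure: PE[j, t] = max over t < t* <= T of (m[j, t*] - AE[j, t]), floored at 0
S, T = m.shape
PE = np.zeros_like(m)
for t in range(T - 1):
    PE[:, t] = np.maximum((m[:, t + 1:] - AE[:, t:t + 1]).max(axis=1), 0.0)

# Total exposure: TE[j, t] = AE[j, t] + PE[j, t]
TE = AE + PE
```

At the final time step there are no future revaluations, so the potential exposure is zero and total exposure collapses to actual exposure.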

Table 5.5: Selected credit risk measures

Expected counterparty credit exposure:
$$E[TE_{jt}^R] = \sum_{j=1}^{S} p_j\, TE_{jt}^R$$

Expected counterparty credit loss:
$$L_t^R = \sum_{j=1}^{S} p_j \cdot \sum_{t^*=t}^{T} AE_{jt^*}^R \cdot p^R(t^* \mid j) \cdot \left( 1 - r^R(t^* \mid j) \right)$$

Expected cross-counterparty credit loss:
$$L_t = \sum_{j=1}^{S} p_j \cdot \sum_{R} \sum_{t^*=t}^{T} AE_{jt^*}^R \cdot p^R(t^* \mid j) \cdot \left( 1 - r^R(t^* \mid j) \right)$$
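Given conditional default probabilities and recovery rates per scenario and time step, the expected counterparty credit loss of Table 5.5 reduces to a probability-weighted sum. A sketch with invented inputs (flat default probability and recovery rate, for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)
S, T = 500, 12                                        # scenarios, time steps
p = np.full(S, 1.0 / S)                               # scenario probabilities p_j
AE = np.maximum(rng.normal(5.0, 3.0, (S, T)), 0.0)    # MtF actual exposures (illustrative)
pd_cond = np.full((S, T), 0.002)                      # conditional default probability p(t*|j)
rec = np.full((S, T), 0.40)                           # conditional recovery rate r(t*|j)

# Expected counterparty credit loss, as formulated in Table 5.5:
# L_t = sum_j p_j * sum_{t*} AE[j, t*] * p(t*|j) * (1 - r(t*|j))
L = float(p @ (AE * pd_cond * (1.0 - rec)).sum(axis=1))
```

In a genuinely integrated model, `pd_cond` and `rec` would vary across scenarios with the credit drivers, which is exactly how wrong-way effects enter this sum.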

Trading off risk and reward

Both risk and reward measures calculated in this step may span the scenario and time dimensions of a portfolio MtF table. Such an approach is ideal for the calculation of measures that are based upon the relative performance of a portfolio strategy with respect to a benchmark. Consider the measure expected downside, defined as the expected underperformance of a portfolio regime R with respect to the benchmark's performance across a range of scenarios (see Table 5.1). If the functions (a)^+ and (a)^- equal max(a, 0) and min(a, 0), respectively, then the expected downside at time t for portfolio regime R (with respect to a benchmark) is equal to

$$E[D_t^R] = \sum_{j=1}^{S} p_j \left( m_{jt}^R - \tau_{jt} \right)^-$$

where $\tau_{jt}$ represents the value of an arbitrary benchmark under scenario j at time t. Expected downside is an ideal measure for evaluating the performance of mutual funds with respect to target payoffs. Notice that this measure is identical to the expected payoff of a put option on the current value of the portfolio.

In addition, regret (risk-preference adjusted expected downside) can be traded off against the analogous expected upside measure, which is identical to the expected payoff of a call option on the current value of the portfolio (see Table 5.3), or

$$E[U_t^R] = \sum_{j=1}^{S} p_j \left( m_{jt}^R - \tau_{jt} \right)^+$$

which allows us to define the Put/Call Efficient Frontier. This frontier can be formulated as

$$U_t^* = E[U_t^R] - \lambda E[D_t^R]$$

where λ represents a coefficient of risk aversion.

Table 5.6 summarizes selected performance measures that can be implemented in post-processing applications.

Incorporating specific risk

Specific risk may be incorporated into an analysis by simply including the specific risk factors directly in the original scenarios. Examples of this approach include modeling individual equities as unique risk factors (see Figure 2.2 in Step 2) and including "spreads-over-yields" as risk factors for individual bonds. The disadvantage of this approach is that the total number of risk factors may increase dramatically, implying that the scenarios required to span the overall risk factor space may increase in number and complexity.

In traditional risk management approaches, this issue is often avoided by assuming that enough diversification exists in the portfolio such that specific risk is negligible. By adopting this assumption, specific risk factors may be ignored completely and excluded from the scenarios.

Table 5.6: Selected performance measures

Sharpe ratio:
$$S_t^R = \frac{R_t^R}{\sigma_t^R}$$

RAROC:
$$R_t^{*R}(\alpha) = \frac{R_t^R}{\mathrm{VaR}_t^R(\alpha)}$$

Put/Call Efficient Frontier:
$$U_t^* = E[U_t^R] - \lambda E[D_t^R]$$
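The benchmark-relative measures above are direct to compute from a column of the portfolio MtF table. The sketch below evaluates expected upside, expected downside and the Put/Call Efficient Frontier objective for an invented regime against a fixed benchmark; note that E[D_t] is non-positive under the (a)^- convention, so it is its magnitude that penalizes the objective (this sign handling is our reading of the formulation):

```python
import numpy as np

rng = np.random.default_rng(3)
S = 5000
p = np.full(S, 1.0 / S)                  # scenario probabilities
m = rng.normal(102.0, 8.0, S)            # portfolio MtF values under regime R (illustrative)
tau = np.full(S, 100.0)                  # benchmark values tau_jt (a fixed target here)

upside = p @ np.maximum(m - tau, 0.0)    # E[U_t]: expected payoff of a call on the portfolio
downside = p @ np.minimum(m - tau, 0.0)  # E[D_t]: expected downside, non-positive

lam = 2.0                                # risk-aversion coefficient lambda
u_star = upside - lam * abs(downside)    # Put/Call Efficient Frontier objective
```

Because (a)^+ and (a)^- partition a = m - tau, the two measures always sum to the expected outperformance of the benchmark, which is a convenient consistency check.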

Examples of this approach include modeling equities by their systemic risk factors alone (see Figure 2.4 in Step 2) or by holding spreads-over-yields constant for individual bonds.

The MtF framework offers a powerful alternative approach. Specific risk may be incorporated in the post-Cube stage rather than in the pre-Cube stage by taking advantage of the independence between systemic risk factors and specific, or idiosyncratic, risk factors. Given this property, the specific components of the MtF values of a set of securities are independent of each other, conditional upon the occurrence of a given scenario. This implies that the systemic component of a MtF value can be pre-computed and stored in the MtF Cube with the specific risk component incorporated strictly in the post-Cube stage, either analytically or through a secondary sampling exercise. Methodologies that may be applied in the post-Cube stage include the law of large numbers, the central limit theorem, probability generating functions and moment generating functions.

As an example, for a given equity pricing model, the specific risk component of an individual stock is independent of the levels of the systemic risk factors that, when combined, determine its MtF value. The pre-computed MtF Cube need only contain the systemic realizations of individual stocks under each scenario and time step. Conditional upon a given scenario, the specific risk for each stock can be incorporated in the post-Cube stage to produce appropriate risk/reward measures combining both systemic and specific risks.

As a second example, for a given credit risk model, the specific, or idiosyncratic, component that contributes to the creditworthiness of a counterparty is independent of the levels of the systemic credit drivers. The pre-computed MtF Cube contains the counterparty MtF credit exposures as well as the systemic realizations of a creditworthiness index under each scenario and time step. Conditional upon a given scenario, the specific risk component of the creditworthiness index can be incorporated in the post-Cube stage. This is further explained in Step 6.
Step 6: Advanced Mark-to-Future Applications
MtF Cubes may serve as input for applications more complex than calculating simple
risk/reward measures. The properties of linearity and conditional independence on
each scenario can be used to obtain computationally efficient methodologies. For
example, conditional independence within a particular scenario is a powerful tool that
allows the MtF framework to incorporate processes such as joint counterparty
migration effectively. In addition, portfolio or instrument MtF tables may be used as
input to a wide variety of scenario-based risk management and portfolio optimization
applications.

Financial institutions worldwide have devoted considerable effort to developing enterprise-wide systems that integrate financial information across their organizations to measure their institution's risk. Probabilistic measures, such as VaR, are now widely accepted by both financial institutions and regulators for assigning risk capital and monitoring risk. Since development efforts have been driven largely by regulatory and internal requirements to report risk numbers, the development of tools to understand and manage risk across the enterprise has generally lagged behind those designed to measure it.

Measuring risk is a passive activity; simply knowing the VaR does not provide much guidance for managing risk. In contrast, risk management is a dynamic endeavour, requiring tools that construct a comprehensive picture that integrates all types of risk as well as identify and reduce the sources of risk. Risk management tools should lead to an effective utilization of the wealth of financial products available in the markets to obtain the desired risk and reward profiles.

The Mark-to-Future framework enables the integration of various types of risks, such as market, credit and liquidity risk, and facilitates the construction of effective scenario-based risk management and optimization tools. In this section, we demonstrate these principles with two applications: an integrated market and credit risk framework and scenario-based optimization tools that trade off risk and reward efficiently.

Integrated market and credit risk

Credit risk modeling is one of the most important topics in risk management and finance today. The last decade has seen the development of models for pricing credit risky instruments and derivatives, for assessing the creditworthiness of obligors, for managing exposures of derivatives and for computing portfolio credit losses for bonds and loan portfolios.

However, common practice still treats market and credit risk separately. When measuring market risk, credit risk is commonly not taken into account; when measuring portfolio credit risk, the market is assumed to be constant. The two risks are then "added" in ad hoc ways, resulting in an incomplete picture of risk.

There are two types of credit risk measurement models: counterparty credit exposure models and
64 Step 6: Advanced applications

portfolio credit risk models. The integration of market and credit risk has a major impact in both cases.

Counterparty exposure models

Derivative desks traditionally manage credit risk by monitoring and placing limits on counterparty credit exposures. For derivative portfolios, counterparty exposure is generally defined as the economic loss that will be incurred on all outstanding transactions if a counterparty defaults, unadjusted for possible future recoveries. More generally, exposures may also be defined conditional not only on a counterparty default but also on a credit migration such as a downgrade. Counterparty exposure models measure and aggregate the exposures of all transactions with a given counterparty.

As explained in Step 5, the total exposure to a counterparty is defined as a single number which is the sum of the actual exposure, based on the mark-to-market of the portfolio, and the potential exposure, reflecting the changes of the counterparty exposure in the future. In the BIS regulatory model, the potential exposure is given by an add-on factor multiplying the notional of each transaction (Basle Committee on Banking Supervision 1988). Although simple to implement, the model has been widely criticized because it does not accurately account for the nature of these exposures in the future. Since exposures of derivatives such as swaps depend on the level of the market when default/migration occurs, models must capture not only the actual exposure to a counterparty at the time of the analysis but also its potential future changes.

Recently, more advanced methods based on Monte Carlo simulation (Aziz and Charupat 1998) have been implemented by financial institutions. The contingency of the market on derivative portfolios and credit risk can be captured explicitly by using a MtF framework to simulate counterparty portfolios through time over a wide range of scenarios. Furthermore, natural offsets, netting, collateral and various mitigation techniques used in practice can be modeled accurately in a MtF framework. The framework also lends itself to implementing systematic stress testing for exposures using techniques such as scenario banding (Cartolano and Verma 2000).

One of the strongest arguments for integrating market and credit risk is the desire to avoid "wrong-way" counterparty exposures. Wrong-way exposures occur when, in a given scenario, the market move increases the counterparty exposure and simultaneously weakens the counterparty's credit quality. Many analysts have identified wrong-way exposures as the cause of US commercial bank problems during the 1998 Asian crisis. Most methodologies do not measure counterparty exposures properly since they assume, either explicitly or implicitly, that interest rates and other financial factors and a counterparty's credit quality are independent.

Portfolio credit risk models

Counterparty credit risk models focus on risk at the counterparty level only; they do not attempt to capture portfolio effects such as the correlation between counterparty defaults and migrations. In contrast, portfolio credit risk models measure credit capital and are specifically designed to capture portfolio effects, specifically obligor correlations. Portfolio credit risk models popular in the industry include CreditMetrics (J.P. Morgan 1997), CreditRisk+ (Crédit Suisse Financial Products 1997), Credit Portfolio View (Wilson 1997a and 1997b) and KMV's Portfolio Manager (Kealhofer 1996). Although superficially they appear quite different (the models differ in their distributional assumptions, restrictions, calibration and solution), Gordy (1998) and Koyluoglu and Hickman (1998) show an underlying mathematical equivalence among these models. Furthermore, empirical work shows generally that all portfolio credit risk models yield similar results if the input data is consistent (Crouhy and Mark 1998, Gordy 1998).

A major limitation of portfolio credit risk models is the assumption that market risk factors, such as interest rates, are deterministic. Hence, they do not account for stochastic exposures. While this assumption has less consequence for portfolios of floating rate loans or bonds, it has great impact on derivatives such as swaps and options.

Ultimately, a comprehensive framework requires the full integration of market and credit risk.

An integrated market and credit risk framework

Iscoe et al. (1999) introduce a multi-step model based on the MtF framework to measure portfolio credit risk. The model integrates exposure simulation and portfolio credit risk methods. By explicitly modeling stochastic exposures, the model overcomes the major limitation of current portfolio models that do not properly account for the exposure caused by instruments with embedded derivatives. Furthermore, through the explicit modeling of default/migration probabilities conditional on the state of the economy, the framework also serves as the basis for estimating accurate wrong-way exposures.

The MtF framework is key to obtaining a computationally efficient methodology that is also consistent with the overall risk management strategy of the institution. First, the methodology can be seen as a secondary post-processing step applied to counterparty exposures which are transformed from the MtF Cube as discussed in Step 5. This can be the same post-processed MtF Cube that is used to manage counterparty credit risk limits. Second, we can exploit the powerful property of conditional independence also discussed in Step 5 to minimize the number of scenarios for which expensive portfolio valuations are calculated. Thus, advanced Monte Carlo or analytical techniques that take advantage of the problem structure can be applied.

Specifically, the model presented in Iscoe et al. is an improvement over current portfolio models in three main aspects:

• First, it defines explicitly the joint evolution of market risk factors and credit drivers through the realization of scenarios. Market factors drive the prices of securities and credit drivers are systemic factors that drive the creditworthiness of obligors in the portfolio. Factors are general and can be microeconomic, macroeconomic, economic and financial. The MtF values and credit drivers are stored in the MtF Cube; the MtF values are transformed to exposures.

• Second, through simulation, it models stochastic exposures (as do the counterparty credit exposure models) and conditional default/migration probabilities directly. In this sense, it constitutes an integration of counterparty exposure and portfolio credit risk models and effectively incorporates wrong-way exposures.

• Finally, it extends the Merton model of default (1974), as used, for example, in CreditMetrics, to multiple time steps. It explicitly solves for multi-step thresholds and conditional default and migration probabilities in a general simulation setting.

Figure 6.1 presents a schematic describing the implementation in the MtF framework of the five-part approach of Iscoe et al. The framework consists of five parts:

Part 1: Risk factors and scenarios. Scenarios are created over the analysis period using a model of the evolution of the relevant systemic risk factors. These factors may include both credit drivers and market factors. The integration of market and credit risk is achieved in the scenarios which explicitly define the joint evolution of market factors and credit drivers.

Part 2: Obligor exposures, recoveries and losses in a scenario. The amount that will be lost in the event of default or credit migration, as well as potential recoveries, is computed under each scenario. Based on the level of the market factors in a scenario at each point in time, MtF exposures for each counterparty are obtained accounting for netting, mitigation and collateral. Similarly, recovery rates in the event of default can be scenario dependent.

Part 3: The joint default/migration model. Default/migration probabilities vary as a result of changing economic conditions. An obligor's default/migration probabilities are conditioned on the scenario at each point in time. The relationship between its conditional probabilities and the scenario is obtained through an intermediate variable, called the obligor's creditworthiness index (CWI), which represents the financial health or value of the obligor.

Figure 6.1: Integrated market and credit risk in the MtF framework

Correlations among obligors are determined by the joint variation of conditional probabilities across scenarios. Thus, a default/migration model defines a functional relationship that maps the CWI and the unconditional default/migration probabilities into conditional default/migration probabilities for each scenario and time step.

Default/migration models can be econometric as in the logit model (Wilson 1997a, 1997b), structural as in the Merton model (J.P. Morgan 1997, Iscoe et al. 1999), reduced form or intensity based (Lando 1997 and 1998, Duffie and Singleton 1997, Jarrow and Turnbull 2000).

Part 4: Conditional portfolio loss distribution in a scenario. Conditional upon a scenario, obligor defaults and migrations are independent. In practice, the computation of conditional losses can be onerous. In the most general case, a Monte Carlo simulation can be applied to determine conditional portfolio losses. However, the observation that obligor defaults are independent permits the application of more
effective computational tools. Some of these techniques are described in Crédit Suisse (1997), Finger (1999) and Nagpal and Bahar (1999).

For example, for very large and homogeneous portfolios, the law of large numbers can be used to estimate conditional portfolio losses. As the number of obligors approaches infinity, the conditional loss distribution converges to the mean loss over that scenario; the conditional variance and higher moments become negligible. Other methods include the application of the central limit theorem (which assumes the number of obligors is large, but not necessarily as large as that required for the law of large numbers), the application of moment generating functions with numerical integration or the application of probability generating functions with a discretization of exposures.

Part 5: Aggregation of losses in all scenarios. Finally, the unconditional distribution of portfolio credit losses is obtained by averaging the conditional loss distributions over all possible scenarios.

In summary, the integration of market and credit risk is achieved in the scenarios, which define explicitly the joint evolution of market risk factors and credit drivers. The Mark-to-Future framework enables the integration of market and credit risk and an accurate, computationally efficient estimation of counterparty exposures and portfolio credit risk.

An example of integrated market and credit scenarios is presented on page 68.

Risk management tools and optimization applications

In addition to monitoring risk, an effective risk management function must help the firm understand the sources of its exposures, how market or portfolio changes affect its risk profile and how to obtain optimal trade-offs between risk and reward, within and across various business lines. A comprehensive risk manager's framework must also

• represent complex portfolios simply

• decompose risk by asset and/or risk factor

• explain the effect of new trades on portfolio risk

• explain the impact of positions in non-linear instruments and of non-normal risk factor distributions on portfolio risks

• explain complex, non-intuitive market views implicit in the portfolio as well as in the investment policy or market liquidity

• generate potential hedges and optimize portfolios.

The most widely used tools are based on extensions of the insights originally developed by Markowitz (1952) and Sharpe (1964) in modern portfolio theory. For example, Litterman (1996a, 1996b, 1997a, 1997b) describes a comprehensive set of analytical risk management tools, developed in close collaboration with the late Fisher Black and his colleagues at Goldman Sachs. These tools are based on a linear approximation of the portfolio to measure its risk and assume a joint (log)normal distribution of the underlying market risk factors, similar to the RiskMetrics VaR methodology (Longerstaey and Zangari 1996). Litterman further emphasized the dangers of managing risk using only linear approximations. However, in spite of their onerous assumptions, the insights provided by these tools are very powerful and constitute a solid conceptual basis for a risk management toolkit. (The reader is also referred to the related papers by Garman (1996, 1997) on marginal VaR and risk decomposition.)

Within the Mark-to-Future framework, these concepts can be extended to create a simulation-based risk management toolkit. The MtF simulation-based tools provide additional insights when the portfolio contains non-linearities, when the market distributions are not normal or when there are multiple horizons. Furthermore, they explicitly model discrete markets that are often observed in practice, where trading may be costly and liquidity limited. The methodology also naturally accommodates transaction costs,
Integrated Market and Credit Risk Scenarios

The integration of market and credit risk is achieved in scenarios which explicitly define the joint evolution of market risk factors and credit drivers (Iscoe et al. 1999). Market factors drive the prices of securities, and hence exposures, and credit drivers are systemic factors that govern the creditworthiness of obligors in the portfolio. Factors are general and can be microeconomic, macroeconomic, economic and financial.

We illustrate the construction of integrated market and credit risk scenarios with a simple example. Consider a single firm with whom we have contracted a set of USD interest rate swaps. The firm is rated BB. The default probability term structure for BB firms in time steps of one year is described by the cumulative default probabilities.

The financial health of the firm is driven by the Merton model of default, in which a firm defaults when its value crosses a given boundary. Put simply, default occurs when the asset levels of a firm fall below its liabilities; hence the firm cannot fulfill its obligations. The financial health of the firm is represented by its creditworthiness index (CWI), which may or may not be observed directly. Default occurs when the index falls below a specified boundary, called the default boundary.

[Figure: Merton model of default]

Consider two scenarios of the firm's CWI and its default boundary. In the first scenario, the firm defaults in three years, while in the second scenario, the firm remains solvent since the scenario does not intersect the boundary.

We assume that the CWI of the firm follows a geometric Brownian motion (GBM). For simplicity, the process is standardized to have a unit standard deviation and zero mean. Although the default boundary in the Merton model cannot be observed directly, it is possible to compute it from the default probability term structure and the assumption of GBM. The boundary corresponds to the boundary for a typical BB firm. The likelihood that the GBM process crosses the boundaries for the first time by each time point is represented by the cumulative probabilities.

[Figure: BB cumulative default probabilities]

Now assume that the creditworthiness index of the firm is driven by a single-factor linear model. The S&P 500 index is the systemic credit driver; the credit driver explains 5% of the total variance of the creditworthiness index.

To measure credit risk, we construct an integrated scenario set of the simultaneous realizations of the S&P 500 and the interest rate term structure. The scenario generation model is calibrated to March 3, 2000. We assume that interest rates follow a statistical mean-reversion process, similar, for example, to the multi-factor model described on page 28. The mean reversion speed is 0.1. Target rates are given by the forward rates. The S&P 500 credit driver follows a GBM with instantaneous drift of 5 × 10^–4 and volatility of 1.479. The instantaneous correlation between interest rates and the credit driver is about –14%.

[Figure: Ten scenarios on the S&P 500 index]

[Figure: Ten scenarios on the six-month US rate]

A scenario contains only systemic information about the economy. Since the firm's CWI also contains an idiosyncratic component (which accounts for 95% of its variance), it is not possible to know whether the firm has defaulted in a given scenario. However, we can determine whether its probabilities of default have increased or decreased conditional on the scenario. For example, the unconditional likelihood of default of a BB firm in five years is 9.6%. In scenario S9 the credit driver decreased about 1.2 standard deviations in five years. Since the obligor has a positive correlation to the driver, the health of the counterparty is expected to decrease and the likelihood of default to increase as the value of the index decreases. Conditional on scenario S9, the probability of default in five years is 11.4%. In general, conditional probabilities in a scenario can be calculated from the combination of the Merton model and the multi-factor model of the creditworthiness index.

[Figure: Ten scenarios on conditional default probabilities]

[Figure: Scenario S9 default probabilities]

Note that the scenarios on the S&P 500 index, the scenarios on the six-month rate and the scenarios on conditional default probabilities constitute a joint scenario set in which market and credit risks are completely integrated. For example, given the correlations, when interest rates increase, the S&P 500 level is more likely to fall and the default probabilities are more likely to increase. This is the case in scenario S9.

[Figure: Scenario S9 returns]

The systemic component of credit risk captured by the multi-factor model has a significant impact on results. Consider the conditional default probabilities of a second counterparty for which the credit driver explains 80% of the variance. For this counterparty, the five-year conditional default probability changes from 11.4% in scenario S9 to 44.4%. As can be seen, when the systemic component of credit risk is higher, the conditional default probabilities have higher volatility.

[Figure: Ten scenarios on conditional default probabilities for a second counterparty]
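The sidebar's conditional default probabilities can be approximated with a one-period Gaussian factor model. One caveat: the text uses a first-passage (barrier) model on a GBM creditworthiness index, so the sketch below will not reproduce the 11.4% and 44.4% figures exactly; it does reproduce the qualitative effect of raising the systemic share of CWI variance from 5% to 80%.

```python
# One-period simplification of the sidebar's conditional default probability:
# CWI = sqrt(R2)*Z + sqrt(1 - R2)*eps, default if CWI falls below the
# inverse-normal of the unconditional PD. This is an illustrative stand-in
# for the first-passage GBM model described in the text.
import math
from statistics import NormalDist

N = NormalDist()

def conditional_pd(pd_uncond, r_squared, z):
    """Default probability conditional on a credit-driver move z (in std devs)."""
    beta = math.sqrt(r_squared)
    threshold = N.inv_cdf(pd_uncond)
    return N.cdf((threshold - beta * z) / math.sqrt(1 - r_squared))

pd_5y = 0.096   # unconditional five-year BB default probability (from the text)
z_s9 = -1.2     # scenario S9: driver down about 1.2 standard deviations

weak = conditional_pd(pd_5y, 0.05, z_s9)    # driver explains 5% of CWI variance
strong = conditional_pd(pd_5y, 0.80, z_s9)  # driver explains 80% of CWI variance
```

As in the sidebar, the same adverse driver move hits the high-systemic-risk counterparty much harder, and its conditional probabilities are correspondingly more volatile across scenarios.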
liquidity and other specified user constraints, as well as investor preferences.

MtF risk management tools can be based on risk measures other than variance; they work well with risk measures such as VaR, expected shortfall and regret. In particular, these tools have proven very useful not only for market risk analysis, but also for credit risk, for which the exposure and loss distributions are generally far from normal (they are skewed and have fat tails), and for ALM, which requires multi-period stochastic decision making.

It is widely believed that simulation-based risk management tools are impractical because they require substantial additional computational work (Dowd 1998). In fact, little or no additional simulation is required to obtain risk management analytics using efficient computational methods implemented in the MtF framework.

An extensive literature demonstrates these points. The basic concepts of MtF-based optimization applications were first developed in Dembo (1991), Dembo (1992), Dembo and King (1992) and Dembo (1995). They were more recently extended by Dembo (1998a), Dembo (1998b), Dembo (1998c), Dembo and Freeman (1998), Dembo and Rosen (1999) and Dembo and Mausser (2000). These papers show that using an expected downside risk measure leads to linear programming problems which can be readily solved. Konno and Yamazaki (1991) also present applications of this type of risk measure in finance.

Rockafellar and Uryasev (2000) demonstrate that other measures such as conditional VaR (also known as expected shortfall) are also tractable and lead to linear programs. Applications of scenario optimization tools to credit risk are given in Mausser and Rosen (1999a, 1999b) and Anderson et al. (2000). While these efforts have largely focussed on one-period optimization problems, there is also a vast literature on multi-stage stochastic programming applications (see, for example, Carino and Ziemba (1998), Ziemba and Mulvey (1998) and the references therein).

Mausser and Rosen (1998, 1999c) demonstrate how the MtF framework can be used to create computationally efficient scenario-based risk management tools to measure marginal risk, risk contributions and triangular risk decomposition and to create trade risk profiles. Dembo and Rosen (1999) discuss the application of inverse problems in portfolio replication.

In what follows, we illustrate how these concepts can be applied to construct optimal portfolios that trade off risk and reward efficiently.

Portfolio optimization and risk/return efficient frontiers

Since knowledge of actual product holdings is not required to generate the MtF Cube, various risk management tools and portfolio optimization techniques can be applied in the post-processing stage. Such methodologies typically involve constructing a series of portfolios that are efficient. Efficient portfolios have the highest reward for a given level of risk or, equivalently, the lowest risk for a given level of reward. In all approaches, the key input is a single MtF Cube, while the specific optimization models are applied as post-processing applications.

Applications for these models include

• index tracking

• hedging and pricing

• asset allocation

• portfolio compression

• asset-liability management

• risk restructuring

• capital allocation.

The efficient frontier traces the optimal trade-off between risk and reward. Figure 6.2 illustrates a typical trade-off profile. By definition, no portfolio can lie above this frontier. Efficient portfolios are contained on the frontier. Portfolio A lies below the frontier and, thus, is inefficient because alternative portfolios can be constructed with either a lower level of risk for the same reward or a higher reward for the same risk.
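Scenario-based risk measures of the kind discussed above fall out of a vector of scenario P&Ls directly. A minimal sketch (the simulated P&L vector and the 95% confidence level are our assumptions):

```python
# Scenario-based VaR and expected shortfall from simulated portfolio P&L.
# The P&L vector stands in for one slice of an MtF Cube; data are illustrative.
import random

random.seed(42)
pnl = [random.gauss(0.0, 1000.0) for _ in range(10000)]  # P&L per scenario

def var_es(pnl, level=0.95):
    """Loss-based VaR and expected shortfall at the given confidence level."""
    losses = sorted(-x for x in pnl)   # losses in ascending order
    cutoff = int(level * len(losses))
    var = losses[cutoff]               # loss at the chosen percentile
    tail = losses[cutoff:]             # worst (1 - level) fraction of scenarios
    es = sum(tail) / len(tail)         # average loss beyond VaR
    return var, es

var95, es95 = var_es(pnl)
```

Because both statistics are computed directly from the scenario vector, the same routine applies unchanged to skewed or fat-tailed P&L distributions, such as those produced by credit portfolios.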
[Figure 6.2: Efficient frontier (reward plotted against risk; the inefficient portfolio A lies below the frontier)]

Regardless of the particular risk and reward measures used, a fundamental property of the risk/reward trade-off is that higher levels of reward always incur higher levels of risk. In most cases, the frontier is also concave, implying that the marginal increase in reward for an additional unit of risk decreases with the level of risk. Risk/reward models may be formulated in a dynamic context (multi-time step) or a static context (single time step). We focus on the static case.

The Markowitz efficient frontier is produced by minimizing portfolio variance subject to specified levels of mean portfolio return. In contrast to a model that relies solely on a mean-variance approach, the MtF framework allows much greater flexibility in defining the risk and reward dimensions. For example, the Put/Call Efficient Frontier, produced by maximizing a portfolio's expected upside subject to specified levels of expected downside, can only be formulated in an MtF framework.

The advantages inherent in optimization applications for risk/reward assessment in the MtF framework include:

• Risk factor return distributions need not be normal. In particular, return distributions may be skewed or have fat tails.

• Underlying risk factor processes may be quite general and may, for example, incorporate jumps.

• All instruments, including derivatives, can be handled in a consistent fashion.

• Instrument profits and losses can be used, rather than returns, which enables the incorporation of instruments with zero cost (e.g., futures).

• Bid-ask spreads are easily incorporated.

• The choice of risk and reward measures is unrestricted.

• The instruments used in the optimization need not be actual instruments but may, for example, represent the values realized by particular portfolio regimes.

Efficient frontiers are obtained as follows. For a level of risk k, the portfolio x with the maximum reward e_p(k) is determined by solving the following optimization problem:

   e_p(k) = Maximize_x:  Reward
            Subject to:  Risk ≤ k
                         x ∈ X

In this problem, reward and risk may be any of the measures calculated in Step 5 and the constraints x ∈ X represent all constraints on x. Solving this parametric optimization problem for all levels of k yields the entire efficient frontier.

The actual constraints depend on the particular optimization model or application. Common constraints include:

• lower and upper bounds on individual positions

• a maximum portfolio cost

• an upper bound on portfolio under-performance in any scenario

• group constraints (e.g., no more than a specified percentage in a given asset class, lower and upper bounds on portfolio duration or factor sensitivities).

On page 73, this general concept is applied to create the Put/Call Efficient Frontier (Dembo and Mausser 2000).
The Put/Call Efficient Frontier

In a Mark-to-Future world, a portfolio can be disaggregated into its Put Value and Call Value, corresponding to the expected amounts by which it respectively falls below or exceeds the value of a benchmark over a given time horizon, T. The Put/Call Efficient Frontier (Dembo and Mausser 2000) identifies the optimal trade-offs between the competing objectives of maximizing a portfolio's expected upside (Call Value) and minimizing its expected downside (Put Value). As such, it quantifies a portfolio's risk and reward in a simple, intuitive manner. Given a utility function, investors can select the efficient portfolio that is most desirable with respect to their individual risk preferences.

For risk-averse investors, the relevant Put/Call Efficient Frontier can be constructed using linear programming. In addition to being attractive from a computational standpoint, this makes it possible to quantify the costs of market illiquidity, to find possible (static) arbitrage opportunities and to obtain arbitrage-free prices for new securities in incomplete markets.

Let q_i be the current price of security i and denote its Mark-to-Future value in scenario j by m_ij. Suppose that a one dollar investment in the benchmark grows to r_j dollars in scenario j. Then the upside (u) and downside (d) of a portfolio containing x_i units of each security i satisfies

   u – d = (M^T – rq^T)x

Given a set of scenario probabilities p, we define the portfolio's Call Value to be p^T u and its Put Value to be p^T d.

Investors are affected by the finite liquidity of financial markets; typically, as the size of a trade increases, for example, so too does the investor's per unit cost. The Put/Call model accounts for market illiquidity by decomposing a security into a series of tranches with progressively higher costs. The sizes of the tranches are defined by their lower and upper bounds, x_L and x_U, respectively. The convexity of the resulting total cost function ensures that the tranches are filled in sequence.

[Figure: Total cost vs. quantity]

The Put/Call Efficient Frontier is defined by the solutions to

   v(k) = maximize:   Call Value                (1)
          subject to: Put Value ≤ k
                      liquidity constraints

for all k ≥ 0. Problem 1 rebalances a portfolio in order to obtain the largest possible expected upside while not incurring more than k units of expected downside. As k is increased from zero, the optimal solution values define the efficient frontier. Problem 1 can be formulated as a linear program:

   maximize_(x,u,d)  p^T u
   subject to:       p^T d ≤ k
                     u – d – (M^T – rq^T)x = 0
                     x ≥ x_L
                     x ≤ x_U
                     u ≥ 0
                     d ≥ 0
The Put/Call Efficient Frontier is concave and piecewise linear, and its slope (µ) equals the marginal Call Value per unit of Put Value. Intuitively, µ decreases with increasing k because the most attractive opportunities for trading off Call Value and Put Value are used up first (i.e., the most desirable securities attain their bounds).

Investors having a bi-linear utility function act in a manner to maximize

   expected utility = (Call Value) – λ(Put Value)

where λ ≥ 0 is the degree of risk aversion (an investor is said to be risk-averse if λ > 1). Given a utility function of this form, an investor will prefer an efficient portfolio for which the Call Value is as large as possible and the marginal trade-off between Call Value and Put Value is at least λ. Geometrically, this occurs at the point where a line with slope λ is tangent to the efficient frontier.

[Figure: Selecting an optimal portfolio]

In practice, there is typically an upper limit on the amount of risk that an investor is prepared to assume. Thus, investors are effectively restricted to portfolios on a truncated efficient frontier that extends from a Put Value of zero up to some level k_o. An investor with risk aversion µ_2 < λ < µ_1 and tolerance level k_o < k_A will select the portfolio that corresponds to point B, rather than A, on the efficient frontier.

The Put/Call model also provides insights into the prices of securities, given an investor's preferred efficient portfolio. For example, one can express the market price of a security as the sum of an infinite-liquidity price (i.e., the amount that an investor is willing to pay for the security in a perfectly liquid market) and a corresponding liquidity adjustment. This adjustment is positive (i.e., a premium), negative (i.e., a discount) or zero, for securities that are respectively at their lower bounds, upper bounds or between their bounds.

Using linear programming duality theory, one can derive benchmark-neutral probabilities ρ that result in an expected gain of zero, relative to the benchmark, for all securities at their infinite-liquidity prices. It can be shown that the infinite-liquidity price of a security equals its expected payoff over ρ, discounted at the rate r_o = r^T ρ. This concept of benchmark-neutral pricing is equivalent to utility-invariant pricing, which finds the price q_h at which an investor is indifferent to trading security h in an optimal portfolio. Benchmark-neutral prices exist even when markets are illiquid and/or incomplete (although the price does depend on the risk preferences of the investor).
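Selecting the optimal portfolio for the bi-linear utility reduces to scanning the frontier vertices for the one maximizing Call Value – λ(Put Value); since the utility is linear, the optimum sits at a vertex. The frontier vertices below are hypothetical:

```python
# Picking the tangency point on a piecewise-linear Put/Call frontier for a
# bi-linear utility (Call Value - lambda * Put Value). Frontier vertices are
# illustrative; for a linear utility the optimum lies at a vertex.
vertices = [  # (Put Value, Call Value) along a concave frontier
    (0.0, 0.00),
    (0.1, 0.40),   # segment slope mu_1 = 4.0
    (0.3, 0.80),   # segment slope mu_2 = 2.0
    (0.6, 1.10),   # segment slope mu_3 = 1.0
]

def optimal_vertex(lam, k_max=float("inf")):
    """Frontier vertex maximizing Call - lam * Put, subject to Put <= k_max."""
    feasible = [(put, call) for put, call in vertices if put <= k_max]
    return max(feasible, key=lambda pc: pc[1] - lam * pc[0])

choice = optimal_vertex(lam=1.5)             # tangency between slopes 2.0 and 1.0
capped = optimal_vertex(lam=1.5, k_max=0.2)  # risk tolerance truncates frontier
```

With λ = 1.5 the tangency falls at the vertex joining the segments of slope 2.0 and 1.0; imposing a risk tolerance of 0.2 truncates the frontier and pushes the choice to a lower-risk vertex, mirroring the point-B-versus-point-A discussion above.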
References

Ackworth, P., M. Broadie and P. Glasserman, 1997, “A comparison of some Monte Carlo and quasi Monte Carlo techniques for option pricing,” in Monte Carlo and Quasi Monte Carlo in Scientific Computing, N.Y.: Springer Verlag: 54–79.

Ahn, H., M. Bouabci and A. Penaud, 2000, “Tailor-made for tails,” Risk, 13(2): 95–98.

Anderson, F., H. Mausser, D. Rosen and S. Uryasev, 2000, “Credit risk optimization with conditional Value-at-Risk criterion,” Working Paper, University of Florida and Algorithmics Inc., forthcoming.

Aziz, J. and N. Charupat, 1998, “Calculating credit exposure and credit loss: a case study,” Algo Research Quarterly, 1(1): 31–46.

Basle Committee on Banking Supervision, 1988, International Convergence of Capital Measurements and Capital Standards, July, (http://www.bis.org).

Basle Committee on Banking Supervision, 1997, Explanatory note: modification of the Basle Capital Accord of July 1988, as amended in January 1996, (Accessed September 1999, http://www.bis.org).

Black, F. and M. Scholes, 1973, “The pricing of options and corporate liabilities,” Journal of Political Economy, 81 (May–June): 637–654.

Black, F., E. Derman and W. Toy, 1990, “A one-factor model of interest rates and its application to treasury bond options,” Financial Analysts Journal, Jan/Feb: 33–39.

Black, F. and P. Karasinski, 1991, “Bond and option pricing when short rates are lognormal,” Financial Analysts Journal, July/August: 52–59.

Bollerslev, T., 1986, “Generalized autoregressive conditional heteroscedasticity,” Journal of Econometrics, 31: 307–327.

Boyle, P., 1977, “Options: a Monte Carlo approach,” Journal of Financial Economics, 4(2): 323–338.

Boyle, P., M. Broadie and P. Glasserman, 1997, “Monte Carlo methods for security pricing,” Journal of Economic Dynamics and Control, 21: 1323–1352.

Broadie, M. and P. Glasserman, 1997, “Monte Carlo methods for pricing high-dimensional American options,” Global Derivatives ’97, Paris: April 1997.

Caflisch, R., W. Morokoff and A. Owen, 1997, “Valuation of mortgage-backed securities using Brownian bridges to reduce effective dimension,” Computational Finance, 1(1): 27–46.

Cardenas, J., E. Fruchard, J. Picron, C. Reyes, K. Walters and W. Yang, 1999, “Monte Carlo within a day,” Risk, 12(2): 55–59.

Carino, D. and W. Ziemba, 1998, “Formulation of the Russell-Yasuda Kasai financial planning model,” Operations Research, 46(4): 443–449.

Carillo, S., P. Fernandez, N. Hernandez and L. Seco, 2000, “Scenario generation techniques in Mark-to-Future analysis,” Working Paper, RiskLab Madrid, Universidad Autonoma de Madrid, and RiskLab Toronto, University of Toronto.

Cartolano, P. and S. Verma, 2000, “Using scenario banding to stress test counterparty credit exposures,” Algo Research Quarterly, 2(4): 27–36.
Cox, J., J. Ingersoll and S. Ross, 1985, “A theory of the term structure of interest rates,” Econometrica, 53(2): 385–407.

CreditMetrics: The Benchmark for Understanding Credit Risk, Technical Document, 1997, New York, N.Y.: J.P. Morgan Inc.

Crédit Suisse Financial Products, 1997, CreditRisk+: A Credit Risk Management Framework, New York, N.Y.

Crouhy, M. and R. Mark, 1998, “A comparative analysis of current credit risk models,” Paper presented at the conference Credit Modeling and Regulatory Implications, London, September 1998.

Dembo, R., 1991, “Scenario optimization,” Annals of Operations Research, 30: 63–80.

Dembo, R., 1992, “Scenario optimization,” US Patent Number: 5148365, dated September 15, 1992, (filed August 15, 1989).

Dembo, R. and A. King, 1992, “Tracking models and the optimal regret distribution in asset allocation,” Applied Stochastic Models and Data Analysis, 8: 151–157.

Dembo, R., 1995, Optimal Portfolio Replication, Research Paper Series 95–01 (24 Feb. 1997; revised 4 Feb. 1999).

Dembo, R., 1998a, “Mark-to-Future,” Morgan Stanley Dean Witter Global Equity and Derivatives Markets, March 11, 1998: 6–16.

Dembo, R., 1998b, “Mark-to-Future: a consistent firm-wide paradigm for measuring risk and return,” in Risk Management and Analysis, Vol. 1: Measuring and Modeling Financial Risk, Carol Alexander (Ed.), New York: John Wiley & Sons Ltd.

Dembo, R., 1998c, “Method and apparatus for optimal portfolio replication,” US Patent Number: 5799287, dated August 25, 1998, (filed May 24, 1994).

Dembo, R. and A. Freeman, 1998, Seeing Tomorrow, New York: John Wiley & Sons Inc.

Dembo, R. and D. Rosen, 1999, “The practice of portfolio replication: a practical overview of forward and inverse problems,” Annals of Operations Research, 85: 267–284.

Dembo, R., A. Aziz, B. Boettcher, J. Farvolden, A. Shaw and M. Zerbs, 1999, “Mark-to-Future: a comprehensive guide to a revolution in the evolution of risk,” Draft Release, Algorithmics Inc.

Dembo, R. and H. Mausser, 2000, “The Put/Call Efficient Frontier,” Algo Research Quarterly, 3(1): 13–25.

Dowd, K., 1998, “VaR by increments,” Enterprise Wide Risk Management Special Report, Risk: 31–32.

Duffie, D. and K. Singleton, 1997, “Modeling term structures of defaultable bonds,” Working Paper, Graduate School of Business, Stanford University, forthcoming in Review of Financial Studies.

Embrechts, P., S. Resnick and G. Samorodnitsky, 1998, “Living on the edge,” Risk, January: 96–100.

Embrechts, P., A. McNeil and D. Straumann, 1999, “Correlation and dependency in risk management: properties and pitfalls,” Working Paper, Dept. of Mathematics, ETHZ, Zurich, Switzerland.

Engle, R. and J. Mezrich, 1995, “Grappling with GARCH,” Risk, September: 112–117.

Finger, C., 1999, “Conditional approaches for CreditMetrics portfolio distributions,” CreditMetrics Monitor, April: 14–33.

Garman, M., 1996, “Improving on VaR,” Risk, 9(5): 61–63.

Garman, M., 1997, “Taking VaR to pieces,” Risk, 10(10): 70–71.

Gordy, M., 1998, “A comparative anatomy of credit risk models,” Federal Reserve Board, Finance and Economics Discussion Series, 1998: 47.

Greenspan, A., 1999, “The evolution of bank supervision,” Speech to the American Bankers Association, Phoenix, Arizona, October 11, 1999, (http://www.bog.frb.fed.us/boarddocs/speeches/1999/).

Hull, J. and A. White, 1998a, “Value at risk when daily changes in market variables are not normally distributed,” The Journal of Derivatives, Spring: 9–19.

Hull, J. and A. White, 1998b, “Incorporating volatility into the historical simulation method for VaR,” The Journal of Risk, 1(1): 5–20.
Hull, J. and A. White, 1990, “Pricing interest-rate derivative securities,” The Review of Financial Studies, 3(4): 573–592.

Iscoe, I., A. Kreinin and D. Rosen, 1999, “An integrated market and credit risk portfolio model,” Algo Research Quarterly, 2(1): 21–37.

Jamshidian, F. and V. Zhu, 1997, “Scenario simulation: theory and methodology,” Finance and Stochastics, 1(1): 43–67.

Jarrow, R. and S. Turnbull, 2000, “The intersection of market and credit risk,” Journal of Banking and Finance, 24: 271–299.

J.P. Morgan, 1996, (see Longerstaey and Zangari, 1996).

J.P. Morgan, 1997, (see CreditMetrics, 1997).

Karatzas, I. and S. Shreve, 1994, Brownian Motion and Stochastic Calculus, New York: Springer Verlag.

Kealhofer, S., 1996, “Managing default risk in portfolios of derivatives,” Derivative Credit Risk, London: Risk Publications, 49–63.

Kim, J., A. Malz and J. Mina, 1999, LongRun Technical Document, New York: RiskMetrics Group.

Konno, H. and H. Yamazaki, 1991, “Mean-absolute deviation portfolio optimization and its applications to Tokyo stock market,” Management Science, 37: 519–531.

Koyluoglu, H. and A. Hickman, 1998, “Reconcilable differences,” Risk, 11(10): 56–62.

Kreinin, A., L. Merkoulovitch, D. Rosen and M. Zerbs, 1998a, “Measuring portfolio risk using quasi Monte Carlo methods,” Algo Research Quarterly, 1(1): 17–26.

Kreinin, A., L. Merkoulovitch, D. Rosen and M. Zerbs, 1998b, “Principal component analysis in quasi Monte Carlo simulation,” Algo Research Quarterly, 1(2): 21–29.

Lando, D., 1997, Modeling Bonds and Derivatives with Default Risk, (M. Dempster and S. Pliska, Eds.), 369–393, Cambridge, U.K.: Cambridge University Press.

Lando, D., 1998, “On Cox processes and credit risky securities,” Review of Derivatives Research, 2: 99–120.

Litterman, R., 1996a, “Hot spots and hedges,” Risk Management Series, Goldman Sachs.

Litterman, R., 1996b, “Hot spots and hedges,” Journal of Portfolio Management, (Special Issue): 52–75.

Litterman, R., 1997a, “Hot spots and hedges (I),” Risk, 10(3): 42–45.

Litterman, R., 1997b, “Hot spots and hedges (II),” Risk, 10(5): 38–42.

Locke, J., 1999, “A fear of floating,” Latin Risk (Risk Special Report), July: 4–5.

Longerstaey, J. and P. Zangari, 1996, RiskMetrics Technical Document, 4th ed., New York: Morgan Guaranty Trust Co. (Available at http://www.riskmetrics.com/rm/index.cgi).

Lopez, J., 1999, “Regulatory evaluation of value-at-risk models,” The Journal of Risk, 1(2): 37–64.

Lopez, J. and M. Saidenberg, 2000, “Evaluating credit risk models,” Journal of Banking and Finance, 24(1/2): 151–165.

Markowitz, H., 1952, “Portfolio selection,” Journal of Finance, 7(1): 77–91.

Marshall, C. and M. Siegel, 1996, “Value-at-Risk: in search of a risk measurement standard,” Draft Document, Massachusetts Institute of Technology.

Mausser, H. and D. Rosen, 1998, “Beyond VaR: from measuring risk to managing risk,” Algo Research Quarterly, 1(2): 5–20.

Mausser, H. and D. Rosen, 1999a, “Applying scenario optimization to portfolio credit risk,” Algo Research Quarterly, 2(2): 19–33.

Mausser, H. and D. Rosen, 1999b, “Efficient risk/return frontiers for credit risk,” Algo Research Quarterly, 2(4): 35–48.

Mausser, H. and D. Rosen, 1999c, “Beyond VaR: triangular risk decomposition,” Algo Research Quarterly, 2(1): 31–43.

Merton, R., 1974, “On the pricing of corporate debt: the risk structure of interest rates,” Journal of Finance, 29: 449–470.

Meyer, L., 1999, “Implications of recent global financial crises for bank supervision and regulation,” in a speech before the International Finance Conference, Federal
Reserve Bank of Chicago, Chicago, Ill., October 1, 1999, (http://www.bog.frb.fed.us/boarddocs/speeches/1999/).

Nagpal, K. and R. Bahar, 1999, “An analytical approach for credit risk analysis under correlated defaults,” CreditMetrics Monitor, April: 51–74.

Niederreiter, H., 1992, Random Number Generation and Quasi Monte Carlo Methods, Philadelphia: SIAM.

Paskov, S. and J. Traub, 1995, “Faster valuation of financial derivatives,” Journal of Portfolio Management, Fall, 22(1): 113–120.

Peterson, I., 1998, The Jungles of Randomness, New York, N.Y.: John Wiley and Sons Inc.

Reimers, M. and M. Zerbs, 1999, “A multi-factor statistical model for interest rates,” Algo Research Quarterly, 2(3): 53–64.

RiskMetrics, 1996, (see Longerstaey and Zangari, 1996).

Rockafellar, R. and S. Uryasev, 2000, “Optimization of conditional Value-at-Risk,” The Journal of Risk, forthcoming.

Samuelson, P., 1991, personal communication to Stephen A. Ross.

Schoenmakers, J. and A. Heemink, 1997, “Fast valuation of financial derivatives,” Computational Finance, 1(1): 47–62.

Sharpe, W., 1964, “Capital asset prices: a theory of market equilibrium under conditions of

Shaw, J., 1997, “Beyond VaR and stress testing,” in VaR: Understanding and Applying Value-at-Risk, New York, N.Y.: RISK Publications, 211–224.

Sobol, I., 1967, “On the distribution of points in a cube and the approximate evaluation of integrals,” Computational Mathematics and Mathematical Physics, 7(4): 86–112.

Sundaresan, S., 2000, “Continuous-time methods in finance: a review and an assessment,” Journal of Finance, to appear in special issue of the American Finance Association 2000 proceedings.

United States Federal Reserve Board, 1999, (http://www.bog.frb.fed.us/Releases/H15/): select historical data for July 6, 1999.

Wilson, T., 1997a, “Portfolio credit risk I,” Risk, 10(9): 111–117.

Wilson, T., 1997b, “Portfolio credit risk II,” Risk, 10(10): 56–61.

Yung, E., 1999a, “Making a scene...,” Algo Research Quarterly, 2(3): 5–8.

Yung, E., 1999b, “Making a scene...,” Algo Research Quarterly, 2(1): 5–7.

Yung, E., 1999c, “Making a scene...,” Algo Research Quarterly, 2(2): 5–9.

Ziemba, W. and J. Mulvey (Eds.), 1998, Worldwide Asset and Liability Modeling, Cambridge, U.K.: Cambridge University Press,
risk,” Journal of Finance, 19(3): 425–442. Publications of the Newton Institute.
Notation

Table N.1: Indices

Symbol   Definition          Dimension
i        instrument index    1 x N
j        scenario index      1 x S
t        time step index     1 x T
k        risk factor index   1 x K

Table N.2: Variables and parameters

Symbol     Definition                                                Elements     Dimension
u          risk factor values                                        u_kjt        K x S x T
M          Mark-to-Future Cube                                       m_ijt        N x S x T
M_i        Mark-to-Future table for instrument i                     m_ijt        S x T
M^R        Mark-to-Future table for portfolio regime R               m_jt^R       S x T
p          statistical scenario probabilities                        p_j          1 x S
x^R        holdings in portfolio R                                   x_ijt^R      N x S x T
p^R(t*)    probability of default of counterparty portfolio R       p^R(t*_j)    1 x S
           at t*, t <= t* <= T
r^R(t*)    recovery rate (given default) of counterparty            r^R(t*_j)    1 x S
           portfolio R at t*, t <= t* <= T
ρ          risk-neutral probability of scenario j                    ρ_j          1 x S
τ          MtF value of a benchmark portfolio under scenario j       τ_jt         S x T
           at time t

Table N.3: Functions

Symbol   Definition
f()      function mapping risk factors into MtF values
g()      function mapping instrument MtF values into portfolio MtF values
(a)+     function returning the maximum of a and zero, max(a, 0)
(a)–     function returning the minimum of a and zero, min(a, 0)
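Read as a data structure, the notation above maps directly onto a three-dimensional array. The following sketch is illustrative only and is not part of the original document; the use of NumPy, the dimensions and the holdings values are all arbitrary assumptions. It shows how the cube M, the holdings x^R, the aggregation g() and the (a)+/(a)– operators fit together:

```python
import numpy as np

# Hypothetical dimensions: N instruments, S scenarios, T time steps.
N, S, T = 3, 5, 4
rng = np.random.default_rng(0)

# Mark-to-Future Cube M: one MtF value m_ijt per instrument i,
# scenario j and time step t (Table N.2).
M = rng.normal(100.0, 10.0, size=(N, S, T))

# Holdings x_i for a portfolio R, held constant across scenarios and
# time in this sketch (the general x_ijt^R varies over all three indices).
x = np.array([10.0, -5.0, 2.0])

# g(): portfolio MtF table m_jt^R as a linear aggregation over instruments.
M_R = np.einsum("i,ist->st", x, M)      # shape (S, T)

# (a)+ and (a)- (Table N.3) applied against a benchmark MtF table tau_jt.
tau = np.full((S, T), 1000.0)
upside = np.maximum(M_R - tau, 0.0)     # (m_jt^R - tau_jt)+
downside = np.minimum(M_R - tau, 0.0)   # (m_jt^R - tau_jt)-
```

Laying the cube out as (N, S, T) keeps both aggregations cheap: summing over i yields portfolio MtF tables, while statistics taken over j at a fixed t yield risk measures for that horizon.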

Table N.4: Notation for Step 5

Symbol   Definition                                          Elements     Dimension
s        random variable representing stock MtF value        s_t          1 x T
c        random variable representing cash dividend          c_t          1 x T
         MtF value
d        random variable representing discount factor        d_(t,t+x)    1 x T
f        random variable representing future value factor    f_(t,t+x)    1 x T

Table N.5: Coefficients

Symbol   Definition
α        one-sided confidence interval
λ        risk aversion coefficient
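As a hedged illustration of where these two coefficients enter a scenario-based calculation (a sketch only, not part of the original document; the losses, probabilities and use of NumPy are arbitrary assumptions):

```python
import numpy as np

# Hypothetical losses under S = 5 scenarios, with statistical
# probabilities p_j (Table N.2); all numbers are made up.
losses = np.array([-5.0, 2.0, 10.0, 25.0, 60.0])
p = np.array([0.25, 0.25, 0.25, 0.22, 0.03])   # sums to 1

# alpha: one-sided confidence level; a scenario-based VaR is taken
# here as the smallest loss whose cumulative probability reaches alpha.
alpha = 0.95
order = np.argsort(losses)
cum = np.cumsum(p[order])
var = float(losses[order][np.searchsorted(cum, alpha)])

# lambda: risk aversion coefficient trading expected loss against risk
# in a simple mean-risk objective.
lam = 0.5
expected_loss = float(p @ losses)
objective = expected_loss + lam * var
```

Here α picks a point in the simulated loss distribution and λ weights that risk figure against the expected outcome; both roles recur throughout the document's risk/reward discussion.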


ARQ Review
September 1998–March 2000

This section contains titles and abstracts of papers and columns published in our seven issues of the Algo Research Quarterly.

Algo Research Quarterly

Editor
Judith M. Farvolden

Assistant Editor
Marguerite Martindale

Editorial Board
Andrew Aziz
Judith M. Farvolden
John Nestor
Dan Rosen
Michael Zerbs

Circulation
Marguerite Mantha

Layout
Stephanie Collins

http://clients.algorithmics.com

Statement of Purpose: The Algo Research Quarterly provides solutions and information to clients based on our experience and expertise in enterprise risk methodologies.

© 2000 Algorithmics Incorporated (“Algo”). All rights reserved. Errors and omissions excepted. Permission to make digital/hard copy of part or all of this work for personal or classroom use may be granted without fee provided that no copy is made, used or distributed for any commercial purpose whatsoever and provided that Algo’s prior written consent has been obtained, and that Algo’s copyright notice, the title of the publication and its date appear on any such copy.

To request permission to use part or all of this work contact:

Algorithmics Incorporated
185 Spadina Avenue
Toronto, Ontario
Canada M5T 2C6
Tel: (416) 217-1500
Fax: (416) 971-6100

ISSN: 1488-0539

There are no representations or warranties, express or implied, as to the applicability for any particular purpose of any of the material(s) contained herein, and Algo accepts no liability for any loss or damages, consequential or otherwise, arising from any use of the said material(s).
In This Issue: Volume 1.1

Measuring Capital Charges for Specific Risk ......................................... 5


We compare specific risk capital charges based on the standardized approach and the
internal models approach. Both approaches are endorsed by the regulators. We find the
internal models approach leads to a lower specific risk capital charge for equity portfolios
and for bonds with shorter maturities and higher credit ratings.

Measuring Portfolio Risk Using Quasi Monte Carlo Methods ................... 17


We consider the application of Quasi Monte Carlo methods to risk measurement. A
scenario generation technique based on low discrepancy Sobol sequences is compared to
one based on pseudo-random number generation. We demonstrate that these methods
significantly improve the performance of the portfolio simulation and, therefore, reduce
the time to obtain reliable risk measurements.

Technology Trends ............................................................................. 27


David Penny, Algorithmics’ Chief Technology Officer, shares his thoughts on Java: what it
is, what promises it fulfills and what challenges lie ahead.

Calculating Credit Exposure and Credit Loss: A Case Study .................... 31


We report a case study of the estimation of credit exposure and credit loss of a sample
portfolio of derivative transactions. The results of an estimation performed using a Monte
Carlo simulation are compared to the amounts of exposure and loss obtained under the
method recommended by the Bank for International Settlements.

Algo Academy Notes .......................................................................... 47


Andrew Aziz discusses an optimization approach to valuation in complete markets that is
based on a fundamental assumption of investor behaviour: that investors prefer more
wealth to less.

RiskLab Corner .................................................................................. 55


A network of financial research labs enables academia and industry to work together to
investigate and develop solutions to critical problems faced by banks, financial
institutions and corporations. RiskLab Corner offers a glimpse of selected RiskLab
research activities.

Contributors ...................................................................................... 59

ALGO RESEARCH QUARTERLY, VOL. 1, NO. 1, SEPTEMBER 1998


In This Issue: Volume 1.2

Beyond VaR: From Measuring Risk to Managing Risk ............................ 5


Simulation-based tools for portfolio risk management are developed and applied to
examine the VaR contribution, marginal VaR and trade risk profiles of two example
portfolios.

Principal Component Analysis in Quasi Monte Carlo Simulation ............. 21


We propose a new approach for selecting the appropriate principal components based on
portfolio sensitivities and compare it to a standard approach.

Technology Trends ............................................................................. 31


David Penny, Algorithmics' CTO, discusses the benefits and risks of various software
architecture concepts, from client/server, to RPC, to CORBA/DCOM, to middleware.

Defining Operational Risk ................................................................... 37


Operational risk is considered in the context of the firm and losses are analyzed in terms
of their causes and the events that trigger them. Current definitions are discussed and a
concise definition is proposed.

Dimension Reduction by Asset Blocks ................................................. 43


We estimate portfolio VaR using the covariances of the most significant principal
components by asset blocks, and then stress test both the risk factors and the correlations.

Algo Academy Notes .......................................................................... 59


Andrew Aziz discusses how no-arbitrage pricing approaches are viable even when the total
number of states exceeds the number of independent securities.

RiskLab Corner .................................................................................. 71


A glimpse of selected research activities at the RiskLabs.

Contributors ...................................................................................... 75

ALGO RESEARCH QUARTERLY, VOL. 1, NO. 2, DECEMBER 1998


In This Issue: Volume 2.1

Making a Scene... .............................................................................. 5


We capture the Russian ruble crisis in a scenario set and apply it to a portfolio of Brazilian
Brady bonds to develop valuable insights into how market risk may evolve under extreme
conditions.

Credit Risk of an International Bond Portfolio: A Case Study .................. 9


We apply the CreditMetrics methodology to estimate the credit risk of a portfolio of long
dated corporate and sovereign bonds issued in emerging markets. We assess the sensitivity
of the loss distribution to various parameters.

Beyond VaR: Triangular Risk Decomposition ......................................... 31


This paper describes triangular risk decomposition for arbitrary distributions of changes in
value, consistent with the simulation-based approach for calculating the non-parametric
VaR. We examine a portfolio of foreign exchange contracts and discuss the strengths and
limitations of triangular decomposition.

Technology Trends ............................................................................. 45


David Penny, Algorithmics' CTO, discusses the strengths and weaknesses of several
database technologies from simple file systems to OODBs.

Lookback Over the Barrier .................................................................. 51


This paper discusses and prices a new type of option that combines features of both barrier
and lookback options. A static replication technique provides insights into the links
between barrier and lookback options and builds the framework for both static and
dynamic hedging of these options.

Algo Academy Notes .......................................................................... 65


Andrew Aziz discusses the Fundamental Theorem of Asset Pricing as it applies to the
pricing of securities whose value must incorporate the possibility of default and,
accordingly, recovery given default.

RiskLab Corner .................................................................................. 73


A glimpse of selected research activities at the RiskLabs.

Contributors ...................................................................................... 77

ALGO RESEARCH QUARTERLY, VOL. 2, NO. 1, MARCH 1999


In This Issue: Volume 2.2

Making a Scene... .............................................................................. 5


We design a stress test using both upside and downside extreme market scenarios and
apply it to Amazon.com to forecast the future of Internet stocks.

Six Rules for Mark-to-Future ............................................................... 11


A Mark-to-Future feature that presents six rules for forward-looking risk measurement.

Mark-to-Future in Practice................................................................... 15
A Mark-to-Future feature that provides a practical example of the six rules.

Applying Scenario Optimization to Portfolio Credit Risk ......................... 19


Since the distributions of credit losses are typically far from normal, standard techniques
developed for market risk are ineffective for managing credit risk. This paper presents
scenario-based tools that can restructure portfolios to reduce credit risk and optimize the
risk/return trade-off. Optimization models based on regret are particularly effective here.

Technology Trends ............................................................................. 35


David Penny discusses architectural considerations for Mark-to-Future.

Simulation of Fixed-income Portfolios Using Grids ................................ 41


This paper considers the application of low-dimensional grids to the simulation of complex
fixed-income portfolios. The grid simulation methodology is illustrated by a case study of
a callable bond portfolio to demonstrate that grids provide reasonably accurate non-
parametric VaR estimates while substantially reducing the computational time required.

Algo Academy Notes .......................................................................... 51


Andrew Aziz discusses the pricing of credit risky securities in a dynamically completed
market and the resulting relationship between the credit spread of a given term and the
risk-neutral probability of “survival” to that term.

RiskLab Corner .................................................................................. 61


In Print ............................................................................................. 65
Contributors ...................................................................................... 67

ALGO RESEARCH QUARTERLY, VOL. 2, NO. 2, JUNE 1999


In This Issue: Volume 2.3

Making a Scene... .............................................................................. 5


Scenario testing is an important component of liquidity risk models. We design a stress
test that extends the standard Value-at-Risk analysis to include the risk associated with
liquidity.

Liability Management Using Dynamic Portfolio Strategies ...................... 9


This paper couples dynamic portfolio strategies with scenario generation in a simulation
methodology to determine how the costs and risks of a liability portfolio evolve over time.
The design of a simplified government debt program is used to illustrate the benefits of
dynamic portfolio strategies.

An Integrated Market and Credit Risk Portfolio Model ........................... 21


We present a multi-step model to measure portfolio credit risk that integrates exposure
simulation and portfolio credit risk techniques. Thus, it overcomes the major limitation
currently shared by portfolio models with derivatives. The model is computationally
efficient because it combines a Mark-to-Future framework of counterparty exposures and
a conditional default probability framework.

Technology Trends ............................................................................. 39


Darek Okarmus uses a story format to examine different approaches in data modeling.

A Multi-factor Statistical Model for Interest Rates ................................. 53


A term structure model that produces realistic scenarios of future interest rates is critical
to the effective measurement of counterparty credit exposures. In this paper, we present
a statistical term structure model with mean reversion. A case study is used to explore the
calibration, application and out-of-sample testing of the model in practice.

Algo Academy Notes .......................................................................... 65


Andrew Aziz discusses the relationship between the pricing of credit risky securities and
the capital structure of the firm. Under this class of models, credit risky securities can be
priced with respect to the prices of equity and riskless debt.

RiskLab Corner .................................................................................. 73


Contributors ...................................................................................... 75

ALGO RESEARCH QUARTERLY, VOL. 2, NO. 3, SEPTEMBER 1999


In This Issue: Volume 2.4

Making a Scene... .............................................................................. 5


Is the IPO market really as rewarding as it appears? New issues are high-risk investments.
Stress testing, using the appropriate forward-looking scenarios, is essential to
understanding the risks and returns of the IPO market.

An Empirical Study of Mark-to-Future for Measuring Risk Intraday .......... 9


The Mark-to-Future (MtF) methodology offers an efficient framework for intraday risk
measurement. A criticism of this methodology is that intraday market movements may
render overnight results obsolete and, therefore, irrelevant for intraday decision making.
This criticism is addressed through an empirical study.

Technology Trends ............................................................................. 27


XML is a format for transferring data across the Web. Neil Bartlett discusses the benefits
that can be realized by adopting XML standards in dot-com, STP and component
architecture initiatives in financial services.

Efficient Risk/Return Frontiers for Credit Risk ....................................... 35


We construct efficient frontiers for relevant measures of credit risk and show that
minimum-variance portfolios are markedly inefficient with respect to expected shortfall,
maximum (percentile) losses and unexpected (percentile) losses. Risk measures such as
maximum losses and unexpected losses are intractable in an optimization sense. We use
a simple heuristic procedure to construct an approximate, or empirical, efficient frontier
for a portfolio of emerging markets bonds.

Algo Academy Notes .......................................................................... 49


Andrew Aziz develops a single period valuation framework that models the behaviour of
investors who prefer more to less and are willing to take actions to maximize expected
payoffs for a given level of risk. This behaviour is captured in an optimization model.

RiskLab Corner .................................................................................. 61


Contributors ...................................................................................... 63
The Year in Review ............................................................................ 65

ALGO RESEARCH QUARTERLY, VOL. 2, NO. 4, DECEMBER 1999


In This Issue: Volume 3.1

Making a Scene... .............................................................................. 5


John C. Parker investigates whether knowledge of an impending policy announcement by
the Federal Reserve Board is useful in estimating market volatility and, therefore, risk.

The Put/Call Efficient Frontier ............................................................. 13


In a Mark-to-Future world, a portfolio can be disaggregated into its Put Value and Call
Value, corresponding to the expected amounts by which it respectively falls below or
exceeds the value of a specified benchmark over a given time horizon. The Put/Call
Efficient Frontier optimally trades off a portfolio’s expected overperformance and
underperformance of the benchmark.

Using Scenario Banding to Stress Test Counterparty Credit Exposure ....... 27


We explore the use of scenario banding over multiple factors for systematic stress testing
of counterparty credit exposures over a large number of time steps. The results show that
scenario banding is useful for stress testing counterparty credit risk.

Technology Trends ............................................................................. 37


Neil Bartlett observes trends that indicate disk technology is obeying Moore’s Law and
concludes that disk capacity and speed will not constrain the accommodation of emerging
requirements for enterprise-wide risk management.

Applying Portfolio Credit Risk Models to Retail Portfolios ....................... 45


We present a simulation-based model to estimate the credit loss distribution of retail loan
portfolios and apply the model to a sample credit card portfolio of a North American
financial institution. In addition to measuring expected and unexpected losses, we
demonstrate how the model also allows risk to be decomposed into its various sources,
provides an understanding of concentrations and can be used to test how various
economic factors affect portfolio risk.

Algo Academy Notes .......................................................................... 75


Andrew Aziz demonstrates how tight bounds on the equilibrium price of a new security can
be found by incorporating the trade-off between risk and return as revealed by the prices
of securities in the market.

RiskLab Corner .................................................................................. 87


Contributors ...................................................................................... 91

ALGO RESEARCH QUARTERLY, VOL. 3, NO. 1, MARCH 2000
