
SAP White Paper - Basel II Pillar 2

SAP Capital Adequacy Solution (CAS)
SAP Bank Analyzer Solution Architecture
SRP and ICAAP Business Architecture
The Specific Requirements of Pillar 2

BASEL II PILLAR 2 - THE SUPERVISORY DIMENSION
THE SAP CAPITAL ADEQUACY SOLUTION (CAS)

This White Paper details how SAP can help clients solve the complex challenges of Basel II Pillar 2, the Supervisory Review Process (SRP) and the ICAAP. It aims to crystallise an approach to implementing Pillar 2 and to present a Solution Architecture to reflect that approach. It also discusses the specific challenges of the Pillar 2 requirement, including Interest Rate Risk and Liquidity Risk.


© Copyright 2006 SAP (UK) Limited. All rights reserved. No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP (UK) Limited (“SAP”). The information contained herein may be changed without prior notice. It must not be used other than for evaluation purposes only (and only by the authorized recipient to whom SAP has provided this publication) except with the prior written consent of SAP and then only on condition that SAP’s and any other copyright notices are included in such reproduction. No information as to the contents or subject matter of this publication or any part shall be given or communicated in any manner whatsoever to any third party without the prior written consent of SAP. Some software products marketed by SAP and its parent company, SAP AG, and its distributors contain proprietary software components of other software vendors. Microsoft, Windows, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation. IBM, DB2, DB2 Universal Database, OS/2, Parallel Sysplex, MVS/ESA, AIX, S/390, AS/400, OS/390, OS/400, iSeries, pSeries, xSeries, zSeries, z/OS, AFP, Intelligent Miner, WebSphere, Netfinity, Tivoli, and Informix are trademarks or registered trademarks of IBM Corporation in the United States and/or other countries. Oracle is a registered trademark of Oracle Corporation. UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc. HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology. Java is a registered trademark of Sun Microsystems, Inc. JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape. MaxDB is a trademark of MySQL AB, Sweden. SAP, R/3, mySAP, mySAP.com, xApps, xApp, SAP NetWeaver, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary. The furnishing of this document shall not be construed as an offer or as constituting a binding agreement on the part of SAP to enter into any relationship. These materials are subject to change without notice. These materials are provided by SAP and its affiliated companies (“SAP Group”) for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.


EXECUTIVE SUMMARY

Financial Institutions worldwide are now, at the beginning of 2006, rising to the challenge of Basel II (B2). All across the globe, banks and other financial institutions are implementing Basel II solutions in order to meet the Regulatory Capital and disclosure requirements of Basel II, as laid down in Pillars One and Three of the Second Basel Capital Accord. And for many financial institutions in Europe, large and small, it is SAP software which provides the solution to the challenge of Basel II. However, meeting the requirements of Pillars One (Regulatory Capital) and Three (Disclosure and Market Discipline) is only part of the challenge of Basel II. Basel II also contains a supervisory dimension in the form of Pillar Two (P2), and it is this, more than any other part of the Accord, which is proving to be the hardest part of the challenge. It is Pillar 2 which is currently defeating the efforts of vendors and financial institutions alike to find a solution. Pillar 2 is the hardest part of the B2 challenge to plan for and implement, mainly because its requirements are holistic right across the bank. For a time (during 2005) there were widespread doubts that Pillar 2 would operate as envisaged in the original Basel Accord. We at SAP UK and Ireland (UKI) know that Pillar 2 is here to stay: it is clear that Pillar 2 will be implemented in full, despite the belief of some of our customers at senior levels that P2 will be diluted by supervisors and regulators. If our clients accept the challenge of P2, it is where their investment in P1 and P3 comes together; P2 is where the financial institution can finally leverage real benefit from the extensive investment already made in Pillars 1 & 3.

‘Outsourcing’ Supervisory Techniques

Basel II Pillar 2 is, in effect, the "outsourcing" of the Supervisory Process back to the supervised institution. It is intended to improve risk management in institutions by asking them to undertake the type of supervisory modeling which, in the less complex past, was undertaken by the supervisors themselves. The ICAAP (the calculation of Economic Capital) is a process that begins with the quantification of the risks that a given institution faces over a given time period. The ICAAP then allocates capital by asset class and business unit against the institution's model of its risk exposure, looking counterfactually at potential scenarios of business conditions (the Stress Test). SAP UKI has developed this White Paper about P2 and the approach of SAP to solving this complex challenge for our clients. The objective of a P2 implementation is not just to meet the challenge of the Supervisory Review Process (SRP) but to provide the foundation and the tools by which Financial Institutions can manage their increasingly complex business more effectively. SAP does not underestimate the challenge of meeting regulatory and supervisory demands in this new banking world. However, in this white paper SAP aims to crystallize an approach to implementing P2 and to present a Solution Architecture to reflect that approach.


SAP Bank Analyzer

SAP Bank Analyzer already offers integrated tools and databases to support time series data and analysis as required by our Quantitative approach. SAP Bank Analyzer has an integrated Historical Database (HDB) to support the underlying time series necessary for a financial institution to develop its own Reduced Form (RF) representations of Risk Management scenarios and Economic Capital allocation. Within the framework provided by these tools, any additional analytics and data stores required for Pillar 2 can be developed. SAP understands that regulators and supervisors are taking advantage of a synergistic, or symbiotic, relationship between technology (processing and storage capacity) becoming ever more commoditized, and therefore cheaper for financial institutions to deploy, and the consequent advances in quantitative techniques and statistical functionality which can only be deployed and tested using advanced technology. That is why the B2 initiative is happening now. At the heart of the B2 initiative is the Economic Capital model, which is verified and supported by the Stress Test, i.e. the Pillar Two requirement. Advances in technology have now made it possible to implement Risk Management analytics and modeling derived from the latest mathematical approaches to value appraisal. It is by recognizing this development in the technology cycle that the Supervisors and Regulators (centered at BIS) have taken the opportunity to require Financial Institutions to implement these techniques holistically across the institution.

The State of the Art of Risk Management

Financial Market participants turn to the Structured Finance markets to gauge the state of the art in portfolio risk modeling. The continuing development of modeling techniques in Structured Finance markets has pushed forward current practice in modeling portfolio risk. In market risk, similar modeling techniques are deployed to those used in structured finance; in fact one could argue that there is today little distinction along the continuum between structured products and traded securities. A clearly defined consensus on "best practice" in the modeling of portfolio credit risk, however, is still lacking. An internal rating should be developed internally within the institution and should encompass all potential influences upon the risk to that institution of a given asset, position or portfolio. The Supervisors do not require every Bank to become a Ratings Agency or to have a Ratings department. What they do require is rigorous techniques of Quantitative Risk Management applied holistically across the institution, in a central business unit, governed by well defined processes and informing strategic decision making.


Quantitative Risk Management (QRM)

SAP has been guided by the philosophy of Quantitative Risk Management (QRM) in the development of this paper and in the design of our Solution Architecture for the Supervisory Review, the Stress Test and the ICAAP. QRM is different from what is known as Financial Mathematics, which is a 'Front Office' approach peering in detail into the tail of a distribution or the extremity of a data series. Financial Mathematics is essentially Black-Scholes in the front office, used for the trading and pricing of actual deals in derivative and other complex securities. Masters courses in QRM are now springing up all over Europe (e.g. Dublin, Vienna and Brussels). No one owns QRM as proprietary knowledge, and you do not have to be a rocket scientist to do it: you need exposure to financial mathematics, some econometrics and statistics, and possibly some insurance (actuarial) mathematics, since there is a convergence between Solvency 2 and Basel at the Pillar 2 point (particularly now that pension exposure is part of the Stress Test in the UK). QRM is interdisciplinary; it is a body of skills. In following the approach of QRM we at SAP have been guided by Professor Alexander J. McNeil of ETH Zurich. His textbook in this space, written with his colleagues Rudiger Frey and Paul Embrechts, is

Quantitative Risk Management, Concepts Techniques and Tools (PRINCETON SERIES IN FINANCE, 2005)

This has been our guide to clarify concepts and ensure precision in our exposition of the mathematics and quantitative techniques underlying the Solution Architecture presented in the next section. Professor McNeil has acted as a Consultant from a methodological assurance perspective during the completion of this paper.


Pillar 2 and Organizational Change

A new emphasis on the quantitative is coming to the fore, and with it a need for manpower and expertise in the risk management arena. This leads to organizational change. The techniques which provide the Economic Capital calculation in the ICAAP and which support the Stress Test have never before been deployed right across financial institutions. As we have said earlier, pockets or silos of expertise have existed in financial institutions where these techniques and tools are an integral part of the day to day process. Pillar 2 has introduced the need to deploy these tools right across the institution. Organizational processes, and by implication the human capital required to support both the organizational process and the advanced technology, are important too. P2 brings cultural change, and it is not surprising that some resistance to its implementation has been evident over the last twelve months. In the experience of SAP, where we have faced this challenge in large scale ERP implementations in different industries, it is the cultural, skill and attitudinal changes which are the biggest constraints to successful and rewarding implementations of this type.

The Benefits of Basel II

Much is written about the benefits of Basel II, but the claim of reduced capital ratios does not currently seem tenable. There is, however, one overriding benefit of the implementation of an appropriate and rigorous Pillar 2 approach. That benefit accrues to the Bank via a good supervisory relationship, which is predicated upon the institution meeting the supervisory requirement of having a rigorous and well understood P2 process and model. The existence of this process, and the supervisory relationship which flows from it, provides the institution with the freedom to undertake and plan management policy and strategy for:

The Balance Sheet,

Capital, and

Equity (dividend).

In fact a rigorous Capital Adequacy Solution is an essential tool to support this process.


THE STRUCTURE OF THIS DOCUMENT

A ‘White and Green’ Paper

This document is divided into two distinct parts. The first part is the 'White Paper' proper, in which the Solution Architecture of the SAP Capital Adequacy Solution (CAS) application framework to support the ICAAP, the Stress Test and Supervisory Review is presented. This solution is predicated upon the SAP Bank Analyzer product architecture. The solution architecture is designed to meet the business requirements elucidated in the second part. To achieve this, the Solution Architecture is also predicated upon the SAP Quantitative Approach to modeling Pillar 2, which is described in detail in the 'White Paper'. Inevitably some of this material is complex and could be regarded as 'difficult', but we have not been able to describe the requirement and solution comprehensively without some detailed reference to mathematical and statistical techniques, and we believe this will be an ongoing feature of Risk Management in the future. The second part is the 'Green Paper' section of this document, in which a discussion of the business requirements of each of the important Pillar 2 Risk Types is provided. This second section is the statement of requirements upon which our Solution Architecture is based. Thus the section is an exposition of the issues and techniques in each of the major specific risk sources in Pillar 2: securitization, market risk (including interest rate risk) and liquidity risk. Finally we conclude this paper with a brief glimpse over the horizon at Enterprise Risk Management (ERM), and we explain that a robust implementation of a P2 solution is a platform for development towards ERM, since some of the challenges of ERM are met in Pillar 2.


CONTENTS

EXECUTIVE SUMMARY
THE STRUCTURE OF THIS DOCUMENT
FOCUS ON PILLAR TWO
MODELLING PILLAR 2
SAP CAS SOLUTION ARCHITECTURE
THE CALCULUS OF STRUCTURED FINANCE
MARKET RISK - A P2 PERSPECTIVE
INTEREST RATE RISK
LIQUIDITY RISK
CONCLUSION
REFERENCES
ACKNOWLEDGEMENTS


FOCUS ON PILLAR TWO

PILLAR 2 & PILLAR 1
A WHITE PAPER ABOUT P2
REVIEW OF DISCUSSIONS OF PILLAR 2 04/05
THE BASEL II INITIATIVE IN CONTEXT
THE SYNERGY OF MATHEMATICS AND TECHNOLOGY
QUANTITATIVE TECHNIQUES AND ORGANIZATIONAL CHANGE
THE INADEQUACIES OF BASEL I
THE ON & OFF BALANCE SHEET DISTINCTION
BCBS AND CEBS - THE REQUIREMENT
PILLAR 2 - NECESSARY TO IMPROVE RISK MANAGEMENT
THE POSITION OF CEBS AT THE END OF 2005
THE STRESS TEST (ST)


PILLAR 2 & PILLAR 1

By late 2005 the "Basel II" (B2) initiative in Europe and the UK had been through a painful evolution. Pillar 1 and the related Pillar 3 reporting presentation have been relatively straightforward: by late 2005 most UK and European financial institutions had completed this part of the challenge. Regulatory Capital reporting predicated upon statistical descriptions of loan portfolio quality was by then relatively well understood. However, this part of the challenge has not only delivered new analytic techniques to financial institutions, it has also imposed significant data management, data design and systems specification obligations upon them. Pillar 2 has been considered in parallel with the P1 and P3 implementations over the last twelve months, and this is where the authentic issues have emerged. Pillar Two is the hardest part of the B2 challenge to plan for and implement, mainly because it is so complex and wide-ranging. For a time, a clear sense of doubt that Pillar Two would operate as envisaged in the original Basel Accord was evident in the commentary upon and discussions of Pillar Two in official regulatory forums and in published communications with the Basel Committee on Banking Supervision (BCBS). It is now clear that Pillar Two will be implemented in full. In the view of SAP, P2 is where earlier investment in P1 and P3 comes together and where the financial institution can finally leverage real benefit from the extensive investment in the supporting or outside pillars, One and Three.

A WHITE PAPER ABOUT P2

Therefore this is a White Paper about P2 and the SAP approach to solving this complex challenge for our clients. SAP has been working with several financial institutions through the B2 process and has significantly developed its understanding of and approach to P2 in the 2005/2006 period. In the view of SAP, the objective of a P2 implementation is not just to meet the challenge of the Supervisory Review Process (SRP) but to provide the foundation and the tools by which Financial Institutions can manage their increasingly complex business more effectively. SAP does not underestimate the challenge of meeting regulatory and supervisory demands in this new banking world. However, in this white paper SAP aims to crystallize an approach to implementing P2 and to present a Solution Architecture to reflect that approach.


Review of Discussions of Pillar 2 04/05

A simple review of some comments made in the public domain about P2 in the last twelve months will illustrate the recent environment. The following comments are taken from a public regulatory forum, but they are presented here as reflections of comments heard in boardrooms across Europe; they are not individually sourced or referenced.

"Members stated that in principle they did not agree with the requirement to carry out additional stress tests, however recognized that it was a Directive requirement and consideration needed to be given to decide how this would be applied in practice ..."

"Members felt that certain high level concepts and issues were outstanding and that there was a divergence of opinion between the regulator and industry thinking."

"It was recognized that a number of smaller firms were unclear about Pillar 2, which was understandable given that there was little detail on Pillar 2 in the CRD itself."

"One member wanted clarification on whether the approach in developing an ICAAP was a choice between a) Pillar 1 plus; 'plus' being additional capital required or b) substituting the Pillar 1 number with the firm's own economic capital number and seeking more information from firms on matters which concerned the regulator but which were not dealt with in the ECM (Economic Capital Model)." 1

Basel II is driving a new approach to Economic Capital modeling. It is not an everyday activity for most financial institutions, although there will be sections in every financial institution undertaking techniques analogous to those required for Economic Capital modeling - securitization, corporate finance, trading book middle office, economics unit or even in the Chief Executive’s Office.

1 These are extracts from the minutes of the FSA Pillar 2 standing committee meetings from the autumn and winter of 2005/6. They are available at: http://www.fsa.gov.uk/pages/Library/Other_publications/EU/minutes_of_industry_group_meetings/p2sg/index.shtml

SAP believes that what is required to fulfill the Supervisory aspects of Basel II in full is an integrated approach to Economic Capital Analysis and Modeling right across the Bank. To be clear on what we mean by the Economic Capital calculation (in terms of the Stress Test), we present below a depiction of the Economic Capital allocation process extracted from an important Bank for International Settlements (BIS) working group survey of this year. We have utilized this BIS working paper to guide us on what is envisaged in the Accord in terms of the implementation of new processes and systems, particularly in the area of the Economic Capital calculation in the ICAAP, based around the Stress Test.2 For brevity, in this white paper we refer to this paper as the Sorge paper, after its author.

2 http://www.bis.org/publ/work165.htm; Stress-testing financial systems: an overview of current methodologies, by Marco Sorge, Working Papers No. 165, December 2004.


The Basel II initiative in context

In September 2005, BIS published another significant working paper, a forward looking discussion of where regulation, supervision and disclosure are heading over the medium and long term. P2 is about financial stability; it is the lowest level building block of macro-prudential supervision from the perspective of the regulator/supervisor. Ultimately, the B2 Accord initiative aims to achieve the prudential supervision of the financial system. We present below some quotations from this BIS paper, intended to illustrate the thinking behind the Accord and to take a measure of why the B2 initiative is happening now.3

“Risk measurement technology has made enormous strides over the last 30 years or so. The ability to price options represented a major breakthrough (Black and Scholes (1973)). In practice, cheaper and more advanced technology has supported data processing of much greater mathematical complexity. Technological advances will continue to drive advances in risk management over the coming years. The limitations in information technology that lie at the origin of the existing gaps in risk management systems have constrained both the elaboration of the corresponding concepts and their practical implementation. The elaboration of the concepts naturally reflects the speed at which our understanding of fundamental valuation and risk measurement proceeds. …. there are still significant advances to be made in the aggregation of risks across apparently disparate categories, such as credit and liquidity risk. The application of conceptual breakthroughs to the day-to-day risk management of firms is challenging the capital investments in the IT systems required to measure risk, and the associated human capital.“

3 http://www.bis.org/publ/work180.pdf; Accounting, prudential regulation and financial stability: elements of a synthesis by Claudio Borio and Kostas Tsatsaronis

The Synergy of Mathematics and Technology

In summary, SAP understands that regulators and supervisors are taking advantage of a synergistic or symbiotic relationship between technology (processing and storage capacity) becoming ever more commoditized, and therefore cheaper for financial institutions to deploy, and the consequent advances in financial mathematics which can only be deployed and tested using advanced technology. That is why the B2 initiative is happening now. At the heart of the B2 initiative is the Economic Capital model, which is verified and supported by the Stress Test, i.e. the Pillar Two requirement. Borio and Tsatsaronis make one fundamental point:

"Implementation (of mathematical modeling) is [dependent upon] the cost of elaborating the information [i.e. application systems], as determined by the cost of computational capacity and of putting in place the necessary organizational processes..."

As Borio and Tsatsaronis suggest, ‘computational capacity’ is important because it can analyze the information underlying the ICAAP (the calculation of Economic Capital). Advances in technology have now made it possible to implement Risk Management Analytics and Modeling derived from the latest mathematical approaches to value appraisal. It is by recognizing this development in the technology cycle that the Supervisors and Regulators (centered at BIS) have taken this opportunity to require Financial Institutions to implement these techniques holistically across the institution.


Quantitative Techniques and Organizational Change

Any Financial Institution today has already deployed at least two distinct methodological approaches, and possibly more, in different business units. The two certain implementations will be some deployment of Structural Credit Risk models and some deployment of the Financial Mathematics of the Reduced Form in the trading room for the pricing of derivative instruments and transactions. When optimizing the choice of a methodology for the Stress Test (ST), it is essential to select an approach which allows these extant approaches to coexist with the ST methodology; P2 is not about contradicting or replacing existing operational techniques. SAP has taken this perspective in selecting the Quantitative Approach which underlies the Solution Architecture of the SAP Capital Adequacy Solution (CAS), which is designed to coexist and interoperate where possible with the Bank's existing methodologies and tools. There is a large set of possible mathematical techniques which could be deployed to support the Stress Test. The objective of SAP in designing a Solution Architecture to support the Stress Test (and thereby the ICAAP) is first to select the right mathematical approach, one which can be implemented pragmatically. The key is to select a mathematical approach which can integrate with all of the other approaches (possibly already implemented in systems) which the Bank may have deployed over the years. Quantitative Techniques which sustain the Economic Capital calculation and the Stress Test have never before been deployed holistically in financial institutions. As we have said earlier, pockets or silos of expertise have existed in financial institutions where these tools are an integral part of the day to day process. Pillar 2 has introduced the need to deploy these tools right across the institution.

Organizational processes, and by implication the human capital required to support both the organizational process and the advanced technology, are important too. This is cultural change, and it is not surprising that some resistance to its implementation has been evident over the last twelve months. In the experience of SAP, where we have faced this challenge in large scale ERP implementations in different industries, it is the cultural, skill and attitudinal changes which are the biggest constraints to successful and rewarding implementations of this type. Below we seek to explain and simplify what we believe to be a reasonable approach in terms of the quantitative techniques which we can support in our standard Bank Analyzer architecture. This methodology can then be deployed to assist our clients to comply with the requirements of the Supervisory Review Process by implementing a robust Economic Capital model supported by a comprehensive and realistic Stress Test.


The Inadequacies of Basel I

This section discusses current practice in what we shall describe as "Balance Sheet Modeling" (BSM), extended to cover "Modeling for Off Balance Sheet Instruments" (MOBI). The criticisms of the first Basel Accord were collectively the premise for designing the second. A summary of the weaknesses of the first Accord (Basel I) is presented below:

Insufficient risk discrimination: institutions could “game” the system

Basel I did not take adequate account of new instruments and techniques for hedging and risk mitigation, therefore

Capital required did not reflect a bank‘s true risk profile

It could be easily arbitraged (securitization)

It gave an incentive to take high quality assets off the balance sheet

Not much recognition of credit risk mitigation.

In summary, the first Basel Accord had become inadequate because banks had moved activity "off-balance sheet" and the RWA (Risk Weighted Assets) underpinning regulatory capital were therefore not derived from the totality of liabilities to which the Financial Institution was actually exposed. So, amongst other things, it is a primary objective of Basel II to encompass these off-balance sheet instruments in a holistic approach to institutional risk management, reporting and supervision.

We will argue below that Quantitative Techniques, and specifically Factor Modeling, are the optimal approach to the ICAAP, given the lead provided by BIS and some other European Central Banks. Let us now proceed to identify:

What these off-balance sheet instruments are, and

What methodologies are deployed in current practice to price them and to understand their risk profile.

In brief, the off-balance sheet instruments are (i) securitization of all types and (ii) credit derivatives of all types.


The On & Off Balance Sheet Distinction

The distinction between on- and off-balance sheet instruments (ONBSI and OFBSI) can be presented by the somewhat over-simplified depiction of a financial institution to the right. An ONBSI is a debt instrument which has recourse to the originating FI's other assets. An OFBSI is a debt instrument which only has recourse to the specific assets which it is financing or insuring; there is no recourse to the originator's other assets beyond the equity or collateral specified by the originator as relevant to that particular off-balance sheet contract. In summary, therefore, securitization activity transfers on-balance sheet exposures off balance sheet, using quantitative modeling to value and price that pool of originally on-balance sheet exposures as a single entity. Credit Derivatives (off-balance sheet instruments) provide hedging or protection of on-balance sheet positions in the Trading Book and on-balance sheet credit exposures in the Banking Book. This hedging or protection is against all types of financial risk: credit, market and interest rate. There is a gap right now in the academic canon as to whether any financial institution can hedge or protect itself against liquidity risk, but it is certain that B2 will trigger some defining work in this space soon. There is a view that the more sophisticated and complex these off-balance sheet instruments become, the less liquid the markets for them become and, in certain instances, the less liquid the markets for the instruments they protect become. If we are to develop a methodology to support the Stress Test and the ICAAP we must demystify the techniques (the financial mathematics) of the OFBSI, so that, if not the whole Bank, then at least the personnel who will constitute the Internal Capital Unit (who will conduct the ST and manage and maintain the models) can share in the techniques by which the OFBSI are managed and priced.
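To make the recourse distinction concrete, the short Python sketch below expresses the rule described above; the class and field names are our own, invented purely for illustration.

    from dataclasses import dataclass

    @dataclass
    class DebtInstrument:
        specific_assets: float     # value of the assets the instrument finances or insures
        pledged_collateral: float  # equity/collateral specified by the originator
        on_balance_sheet: bool     # True for an ONBSI, False for an OFBSI

    def recourse_to_originator(inst: DebtInstrument, loss: float, other_assets: float) -> float:
        """Portion of a loss that can be recovered from the originating institution."""
        if inst.on_balance_sheet:
            # ONBSI: recourse extends to the originator's other assets
            return min(loss, inst.specific_assets + inst.pledged_collateral + other_assets)
        # OFBSI: recourse stops at the specific assets plus the pledged collateral
        return min(loss, inst.specific_assets + inst.pledged_collateral)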


THE SUPERVISORY CONTEXT

BCBS AND CEBS – THE REQUIREMENT

The best exposition of the objectives, purpose and criteria of Pillar 2 is still, in late 2005, that from the Bank for International Settlements (we address the perspectives of CEBS and the European Central Banks below). As it is the objective of this paper to establish, from the SAP B2 experience, an approach and solution architecture for P2, we have set out the four BIS principles of P2, the implications of these principles and their salient points from a systems development and implementation project perspective (a Business Architecture). Deconstructing this analysis of the four principles of P2, we have set out the salient points necessary to plan and prepare the design of a Business Architecture to support P2 in a financial institution in the environment of today (see the diagram on the left). In summary, to fulfill P2 and meet the requirements of the SRP in full, a supervised financial institution needs to have in place a Formal Modeling and Risk Analysis system which is both comprehensive ("other risks") and forward looking ("consistent with strategy and business plan", "chosen target levels of capital"). It needs to be "relevant to the current operating environment" and it needs to be supported by, and embedded in, an appropriate and audited business process which "reaches up" to the Board and Senior Management to effect oversight and engaged risk management.


Pillar 2 - Necessary to improve Risk Management

The Supervisory Review Process (SRP, P2) is intended by BIS to be supportive of Pillar 1 in that it leads the institution's risk management process to consider dimensions of risk not considered in P1, e.g. large exposures and concentration risk, liquidity risk, interest rate risk and factors external to the bank, i.e. the Bank's economic context, which can only be appraised on a forward-looking basis. The intention of BIS behind P2 (in the opinion of SAP) is therefore to reinforce compliance with Pillar 1 requirements by ensuring that the P1 numbers are fully understood by the supervised institution's risk management. Extrapolating the implications of the default factors (from P1) via a well designed business process, predicated upon Formal Modeling and Risk Analysis, will foster improvements in risk management. It is on the basis of that modeled extrapolation that the Bank's risk management understands the implications of future credit decisions for the Bank's business plan, strategy and economic capital position. It is by the "outsourcing" of the supervisory process back to the supervised institution that P2 improves risk management in that institution. Essentially P2 asks the supervised institution (backed by the threat of sanctions, i.e. "moral suasion") to undertake the type of supervisory modeling which in the less complex past was undertaken by the supervisors themselves.

A recent paper by a senior BIS executive asks the question "Is Pillar 2 the most important pillar?"4 The rhetorical response is most illuminating: "Pillar 1 determines the minimum level of capital, not the optimal level of capital". The paper goes on to state that Pillar 2 "provides a positive correlation between the capital required to adequately address a bank's risks and the strength of its risk management process"; in other words, the quality of the Bank's "process" which meets the requirements of the SRP will itself determine the level of capital the Bank is required to retain. That "SRP-response" process is a complex one, made up of the conjunction of two sub-processes: formal modeling and risk analysis, supported by an audited and well defined 'top-down' organizational design.

4 Basel II: The Key Components and Challenges of Pillar 2, World Bank/IMF/Federal Reserve Seminar for Senior Bank Supervisors from Emerging Economies, Washington, D.C., 17 October 2005, Elizabeth Roberts, Director, Financial Stability Institute


The position of CEBS at the end of 2005

This exposition relies upon several reference discussions. The discussion below presents the approach of the Committee of European Banking Supervisors (CEBS) to the implementation of the general principles of the second Basel Accord.5,6 There is an evolving framework for the regulation and supervision of banks within the enlarged EU, of which CEBS is a part. In the EU, this evolving framework is trying to achieve (1) regulation that can adapt quickly to new market developments and practices, support integration and enhance EU competitiveness; and (2) strengthened cross-border and cross-sector co-operation amongst supervisory authorities and greater convergence of day-to-day supervisory practices and implementation. This is a difficult challenge. There are three levels to the EU framework of regulation and supervision: 1) legislative, 2) regulatory and 3) supervisory. CEBS is the Level 3 Committee. The main bulk of CEBS work relates to the revised international capital framework, Basel II, and its transposition into EU law. The CEBS approach to Pillar 2 is a risk-based approach to Supervisory Review. CEBS promotes the convergence of supervisory practice within Europe as the basis of closer cooperation. CEBS guidance will highlight the enhanced responsibility of management and board in capital management, and will maintain comparability and a global view.

5 Governance and Structure of European Finance after EU enlargement, 9 March 2005, Panel on Banking and Financial Integration in Europe: the evolving rules, José María Roldán, Chairman of the Committee of European Banking Supervisors

6 Implementation of the Supervisory Review Process (Pillar 2), London, 16 November 2005, Kerstin af Jochnick, Committee of European Banking Supervisors

As is argued by CEBS:

“Supervision is about judgement, not mechanics. Pillar 2 of the new Basel II capital framework is a classic example. On the one hand, some parts of the industry have effectively asked for more codification of pillar 2, to provide greater certainty and a level playing field. On the other hand, they argue that they do not want automatic capital add-ons and stress that pillar 2 is about judgment….. we also have to respect the subsidiarity principle. We should only do at centralized levels what should or could not be done at local levels. Supervisory tasks are best performed as close as possible to supervised entities, although an environment of increased cross-border and cross-sectoral activity requires arrangements to facilitate necessary convergence and information flows…….”

CEBS is trying to achieve further harmonization of supervisory reporting by publishing common frameworks for the reporting of the solvency ratio and of financial data for supervisory purposes. With reference to validation, the work is focused on the definition of common quantitative and qualitative criteria for the discrimination and calibration of rating systems, and for estimates of probabilities of default, losses given default and exposures at default. Furthermore, CEBS is developing minimum standards for the review of the methodologies applied by credit institutions and investment firms. Again, CEBS is not aiming at detailed guidance which would de facto set up new requirements; the objective is to clarify what the supervisor expects:

"Most risks are quantifiable, and institutions should be expected to devise methods for measuring them." 7

We interpret the intention of CEBS, in relation to setting standards for the SREP, as being to develop methods for those risks that can be measured (interest rate risk in the banking book, concentration risk in the credit portfolio and liquidity risk).

7 CEBS Consultation Paper Application of the Supervisory Review Process under Pillar 2 (CP03 revised), June 2005


The new capital adequacy framework is expected to evolve with best industry practice; it provides incentives for improving, refining and innovating in risk management. More than a static framework, it represents an evolutionary approach to banking supervision. The three pillars provide a kind of "triple protection",8 encompassing three complementary approaches that work together towards ensuring the capital adequacy of institutions. Pillar 2 is not about capital add-ons; it is about dialogue and about supervisors challenging the industry to promote more sophisticated risk management approaches. At SAP, we found at the end of 2005 that our clients were asking for more detail on P2; that is why we are presenting this paper, which explains the background to our view of P2 and our heuristic for meeting the requirements of the oncoming SRP at this time. Nothing we present in this paper will be inconsistent with what CEBS may articulate; we take our guidance from BIS, principally the BIS working papers.

8 The Benefits and Challenges of Implementation of Basel II in Europe, José María Roldán, 27 Sept 2005, CEBS


The ICAAP should be Forward Looking

The ICAAP must be embedded in the institution's business and organizational processes and should not be added on simply to "tick a box". The ICAAP is fully owned by the institution, and every institution must have an ICAAP. The ICAAP must have some common basic characteristics and elements. The supervised institution must instantiate in the ICAAP a process to relate capital to risks and a process to state strategic capital objectives.

“The ICAAP should take into account the institution's strategic plans and how they relate to macroeconomic factors”.

The institution should develop an internal strategy for maintaining capital levels which can incorporate factors such as loan growth expectations, future sources and uses of funds, and dividend policy. The institution should have an explicit, approved capital plan which states the institution's objectives and the time horizon for achieving those objectives. The plan should also lay out how the institution will comply with capital requirements in the future, any relevant limits related to capital, and a general contingency plan for dealing with divergences and unexpected events.

Institutions should conduct appropriate stress tests which take into account, for example, the risks specific to the jurisdiction(s) in which they operate and the particular stage of the business cycle. Institutions should analyze the impact that new legislation, the actions of competitors or other factors may have on their performance, in order to determine what changes in the environment they could sustain. Institutions will not be required to use formal economic capital (or other) models, although it is expected that more sophisticated institutions will elect to do so. The ICAAP may use the result produced by the regulatory Pillar 1 methodologies plus consideration of non Pillar 1 elements. In other words, to obtain a capital goal, institutions may take the Pillar 1 requirements and then assess Pillar 2 concepts that relate to Pillar 1 (such as concentration risk and securitization) and concepts that are not dealt with under Pillar 1 (such as interest rate risk).

“The ICAAP may be constructed of a ‘building block’ approach, using different methodologies for the different risk types (Pillar 1 and Pillar 2 risks) and then calculating a simple sum of the resulting capital requirements.“
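As an illustration of the 'building block' approach quoted above, the minimal Python sketch below simply sums per-risk-type capital figures; the risk-type names and the numbers are invented for the example and are not a prescribed taxonomy.

    # Building-block ICAAP: each risk type is measured with whatever methodology
    # suits it (Pillar 1 or Pillar 2), and the resulting capital figures are summed.
    pillar1_capital = {
        "credit_risk": 420.0,           # e.g. derived from the Pillar 1 RWA calculation
        "market_risk": 85.0,
        "operational_risk": 60.0,
    }
    pillar2_capital = {
        "concentration_risk": 35.0,                # Pillar 2 concept related to Pillar 1
        "interest_rate_risk_banking_book": 50.0,   # not dealt with under Pillar 1
        "liquidity_risk": 25.0,
    }
    internal_capital_goal = sum(pillar1_capital.values()) + sum(pillar2_capital.values())
    print(f"Internal capital goal: {internal_capital_goal:.1f}")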


The Stress Test (ST)

The key prerequisite of the process of Economic Capital calculation, consistently described in the BIS Accord papers, BIS working papers, CEBS papers and papers from the European Central Banks, is that the ICAAP (the Economic Capital allocation process) is predicated upon a Stress Test, which is itself constructed of several plausible scenarios plus a baseline. Several counterfactual measures of Economic Capital are therefore output from the Stress Test in addition to the baseline Economic Capital measure, where the baseline is defined (roughly speaking) as the scenario which describes the macroeconomic conditions prevailing at the current moment. The counterfactual scenarios, each predicated upon a different view of macroeconomic conditions in which certain assumptions are varied, leading to different conditions of hazard in the risk profile of defaultable exposures, are all variations on the baseline scenario. The foundation for the ICAAP is the baseline macroeconomic scenario and the resultant Economic Capital calculation, which may itself differ (sometimes significantly) from the Regulatory Capital calculation produced formulaically for the Banking Book and the Trading Book in the Pillar One RWA regulatory capital calculation. In other words, the Economic Capital number and the Regulatory Capital number do not reconcile, and it would be surprising if they did.

Stress Test Output

It is worth reminding ourselves of why we describe Stress Testing scenarios as counterfactual: they are not predictions. The macroeconomic "scenario", or "varied set of conditions", which we use as an explanatory tool for the heightened level of hazard in a model of Economic Capital is not a "forward looking statement" like a stockbroker's forecast or estimate. It is simply a "what if" analysis of an alternative level of Economic Capital should prevailing economic conditions alter or, more realistically, should one of the two conditions listed below occur (a simple sketch of such scenario definitions follows the list):

The economy accelerates faster than expected to a different position on a stable business cycle, or

The amplitude of that business cycle changes suddenly as a result of changes to exogenous factors such as Government Policy or geopolitical events.
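The short Python sketch below illustrates how a baseline and two such counterfactual variations of it might be recorded as inputs to the Economic Capital calculation; the factor names, shock sizes and the stand-in capital function are invented for illustration only.

    # Each scenario is a set of macroeconomic assumptions; the counterfactual
    # scenarios are variations on the baseline, not forecasts.
    baseline = {"gdp_growth": 0.022, "base_rate": 0.045, "equity_index_change": 0.00}
    scenarios = {
        "baseline": baseline,
        "faster_cycle": {**baseline, "gdp_growth": 0.035, "base_rate": 0.055},
        "exogenous_shock": {**baseline, "gdp_growth": -0.010, "equity_index_change": -0.30},
    }

    def economic_capital(scenario: dict) -> float:
        # Stand-in for the institution's own Economic Capital model, used only to
        # make the sketch runnable; the coefficients have no empirical basis.
        return 500.0 * (1.0 - 4.0 * scenario["gdp_growth"] - 0.5 * scenario["equity_index_change"])

    for name, s in scenarios.items():
        print(f"{name:16s} economic capital = {economic_capital(s):.1f}")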

Three types of Stress Test

In recent SAP experience with several Financial Institutions, the preparation of scenario definitions as the trigger to the Stress Test is being seen as a great challenge; this of course reflects the fact that this kind of modeling approach is new to Financial Institutions outside of the specialist business units. It is worth pointing out some salient features of Stress Test modeling at this point. There are essentially three distinct types of Stress Test:

I. Those driven by hypothetical events (the concept of the counterfactual scenario),

II. Stress Tests which are driven by the re-running of extreme conditions from the recent past against the current portfolios of both market and credit risk.

III. The Stress Test mechanism whereby one of the factors constituting the model of the Financial Institution is shocked (changed in a deleterious way) and the impact is assessed; this is known as a hazard shock (a minimal sketch of this type follows the list).
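A minimal Python sketch of the third type, the single-factor hazard shock, is given below; the portfolio figures and the factor sensitivity are invented, and the logistic rescaling is simply one plausible way of shocking a default probability.

    import math

    # Single-factor hazard shock: one factor in the institution's model is moved
    # in a deleterious direction and the impact on expected loss is assessed.
    portfolio_ead = 10_000.0   # exposure at default (illustrative)
    lgd = 0.45                 # loss given default (illustrative)
    baseline_pd = 0.02         # probability of default under baseline conditions

    def shocked_pd(pd: float, factor_shock: float, sensitivity: float = 1.5) -> float:
        # Shift the default probability on the logit scale in proportion to the shock.
        logit = math.log(pd / (1.0 - pd)) + sensitivity * factor_shock
        return 1.0 / (1.0 + math.exp(-logit))

    for shock in (0.0, 0.5, 1.0):   # 0 = baseline, 1.0 = severe deleterious move
        pd = shocked_pd(baseline_pd, shock)
        print(f"shock={shock:.1f}  PD={pd:.3%}  expected loss={portfolio_ead * lgd * pd:.1f}")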


The objective of Stress Test design is to produce plausible scenarios which will assess the response of balance sheet values to extreme conditions. It is the first of these types of Stress Test with which many Financial Institutions are commencing Stress Test design, and arguably this type of ST is the most difficult conceptually, the most complex to design and, in fact, the approach most exposed to a lack of plausibility. Predicating the ST upon events we have already experienced, and whose dynamics we understand, enhances plausibility; flexing a single factor to the limit of its likely tolerance within a reasonable time period also assists with plausibility. By contrast, developing an ST scenario from scratch, working through all of the interrelated factors and quantifying their relative movements, is a complex, time consuming and laborious task, even when one keeps the dimensionality of the factor model low. This latter process exposes the scenario development to errors and to the evaporation of plausibility as the interrelationships envisaged become more detailed and numerous. The diagram above right is the standard comparison of Economic Capital and Regulatory Capital presented by the FDIC (the Federal Deposit Insurance Corporation) in the US. The diagram below right is the paradigm process model of the Stress Test workflow which we at SAP have deployed through 2005; it is taken from an important BIS working paper, referenced below.9 There is another important paper in this series which you may also wish to consult; this is also referenced below.10

9 http://www.bis.org/publ/bppdf/bispap22t.pdf; Macro stress tests of UK banks, Glenn Hoggarth, Andrew Logan and Lea Zicchino, Bank of England.

10 http://www.bis.org/bcbs/events/rtf05Drehmann.pdf; A Market Based Macro Stress Test for the Corporate Credit Exposures of UK Banks, Mathias Drehmann, Bank of England, April 2005.


MODELLING PILLAR 2

QUANTITATIVE TECHNIQUES
THE SUPERVISORY EQUATIONS
A MODEL FOR A SINGLE FINANCIAL INSTITUTION
THE SRP AND THE SREP
STRESS TESTING METHODOLOGIES
FACTOR MODELING
FACTOR MODELS IN CONTEXT - CREDIT RISK
FACTOR MODELS IN MARKET RISK
FACTOR MODELS OF CREDIT AND MARKET RISK
FACTOR MODELS - ALIGNMENT WITH BIS AND CEBS
THE MATHS OF A FACTOR MODEL OF CREDIT RISK
A QUANTITATIVE METHODOLOGY
MODELING TECHNIQUES FOR THE STRESS TEST


QUANTITATIVE TECHNIQUES

If we agree that:

Regulators and supervisors are taking advantage of a symbiotic relationship between technology and the consequent advances in Quantitative Techniques due to ever cheaper technology, and

The premise of P2 is those "Quantitative Techniques".

If we agree on these two points, then we must explore in greater detail the implications of Quantitative Techniques for P2. Our starting point at SAP throughout 2005 has been the BIS working paper we refer to as the Sorge paper.2 We have also consulted the following 2005 Central Bank of Ireland working paper.11 The BIS working paper is seminal: it sets out the framework, parameters and scope of a Stress Test and thereby determines the methodology of an Economic Capital calculation. It opens with three important equations which we will use in this paper as the opening exposition of the mathematics of our Quantitative Approach.

In schematic form, and using the notation of the discussion which follows, the three equations are:

I. $E_t\left[Y_{t+1}\right] = f\left(Y_t, Y_{t-1}, \ldots;\; X_t, X_{t-1}, \ldots\right)$

II. $\Pr\left(Y_{t+1} \le y \mid \Omega_t\right) = F\left(y;\; Y_t, Y_{t-1}, \ldots, X_t, X_{t-1}, \ldots\right)$

III. $E\left[L_{t+1} \mid Y_{t+1} = \bar{y}\right] = g\left(\bar{y}\right)$

where $Y$ denotes the macroeconomic variable, $X$ the set of factors which influence it, $\Omega_t$ the information available at time $t$, $\bar{y}$ a distressed value of the macroeconomic variable, $L$ the losses in the financial system, and $g$ the institution-specific loss function built upon the portfolio weights discussed below.

11 “The Stress Testing of Irish Credit Institutions” by Andrew Mawdsley, Maurice McGuire and Nuala O’Donnell. Central Bank of Ireland, Financial Stability Report 2004

The Supervisory Equations

In simple terms, what these equations say is, firstly, that the expected value of a macroeconomic variable at some point in the future is a function both of the historic behaviour of that variable (through time in the past) and of the set of factors which influenced that variable over the same period. The second equation states that if we apply a statistical approach to the first equation, we can estimate the probability of our variable taking a given value in the future. Finally, the third equation (the macro stress test equation) presents a function describing what happens when an extreme event occurs: the distress function of the macroeconomic variable leads to a function describing losses in the financial system. The Sorge paper is focused on Macro Stress Testing from the point of view of Central Banks or Supervisors. This in no way contradicts the view that the three equations are the basic conceptual tools of a framework required to construct a Macro Stress Test in an individual financial institution and thereby produce a robust Economic Capital calculation and ICAAP. In the Sorge (BIS) function the factor g is particular to each financial institution; g incorporates the portfolio weights of the aggregated asset classes, in other words the portfolio asset allocation. These weights (this vector of weights) are dollar value numbers, and therefore the g function will provide the dollar value of economic capital. We would therefore expect that the supervisor will wish to see the capability of the institution to stress factor impacts on the g value, and will possibly give guidance on the stress levels (the 1/100 or 1/1000 event which equates to, for example, a 30% fall in the FTSE or a 4% rise in base rates). One should of course consider how likely these factor stresses are in terms of historical data; the rational approach is to treat all of the risk drivers in much the same way by defining a plausible 1/100 or 1/1000 event.
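The role of the weight vector g can be illustrated with the Python sketch below, in which invented loss-rate sensitivities per aggregated asset class are applied to dollar portfolio weights under a supervisor-style stress (a 30% fall in the FTSE combined with a 4% rise in base rates); none of the figures is an SAP or BIS calibration.

    # g: dollar portfolio weights by aggregated asset class (illustrative values)
    g = {"residential_mortgages": 6_000.0, "corporate_loans": 3_000.0, "equity_trading": 1_000.0}

    # Invented sensitivities of each asset class's loss rate to the two stressed factors.
    SENSITIVITY = {
        "residential_mortgages": (0.02, 1.50),   # (to FTSE fall, to base-rate rise)
        "corporate_loans":       (0.10, 0.80),
        "equity_trading":        (0.60, 0.20),
    }

    def loss_rate(asset_class: str, ftse_fall: float, rate_rise: float) -> float:
        s_ftse, s_rate = SENSITIVITY[asset_class]
        return s_ftse * ftse_fall + s_rate * rate_rise

    # A plausible 1/100 to 1/1000 event: a 30% fall in the FTSE and a 4% rise in base rates.
    stressed_ec = sum(w * loss_rate(a, ftse_fall=0.30, rate_rise=0.04) for a, w in g.items())
    print(f"Stressed economic capital (dollar value): {stressed_ec:.1f}")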


The difference between the equations presented in the BIS paper and those required by an individual financial institution is that the financial institution requires an equation (or set of equations) which predicts the expected value of a specific balance sheet variable (profits, provisions) given the expected values of a set of macroeconomic variables. The financial institution then requires a second (set of) equation(s) relating the probability of the out-turn values of balance sheet variables to the forward performance of that set of macroeconomic variables. Finally, and to support the stress test, we are then in a position to develop (in a manner analogous to equation III) an equation which describes how the particular institution's balance sheet will respond to various macroeconomic scenarios across varying levels of stress. This final equation we at SAP refer to as the Bank's Reaction Function.
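As an illustration of what we mean by a Reaction Function, the Python sketch below fits a balance sheet variable (a provisions ratio, with an invented quarterly history) to two macroeconomic variables and then evaluates the fitted equation under a stressed scenario. It is a sketch of the idea only, not a prescribed model.

    import numpy as np

    # Invented quarterly history: provisions ratio explained by GDP growth and base rate.
    gdp  = np.array([0.020, 0.025, 0.015, -0.005, -0.010, 0.005, 0.015, 0.022])
    rate = np.array([0.040, 0.042, 0.045,  0.050,  0.048, 0.045, 0.043, 0.045])
    prov = np.array([0.004, 0.003, 0.005,  0.011,  0.013, 0.009, 0.006, 0.004])

    # Fit prov = a + b1*gdp + b2*rate by ordinary least squares: the equation linking
    # a specific balance sheet variable to a set of macroeconomic variables.
    X = np.column_stack([np.ones_like(gdp), gdp, rate])
    coeffs, *_ = np.linalg.lstsq(X, prov, rcond=None)

    def reaction(gdp_growth: float, base_rate: float) -> float:
        # The Reaction Function: expected provisions ratio under a given macro scenario.
        return float(coeffs @ np.array([1.0, gdp_growth, base_rate]))

    print("baseline provisions ratio:", round(reaction(0.022, 0.045), 4))
    print("stressed provisions ratio:", round(reaction(-0.020, 0.085), 4))  # recession plus rate rise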

A Model for a single Financial Institution

For various important reasons we cannot simply generalize the supervisory equations presented by BIS, the principal reason being that not all financial institutions are the same, or even similar, and that not all financial institutions will respond in the same way, or even to the same set of macroeconomic variables. This is obvious. Perhaps, however, there could be a single large framework of Factor models from which each bank could select the appropriate subset of equations "off the shelf", as it were; again, it is not as simple as that. Before we explore the intricacies and history of quantitative modeling techniques, without which no exposition of this area would be comprehensive or complete, we must deal with the practical issue of the SRP and the SREP.

The SRP and the SREP

The SRP (Supervisory Review Process) is the process whereby the supervisor "audits" the P2 process in the institution: the Stress Test, the ICAAP, the Economic Capital calculation, the governance and conduct of that calculation, and so on. The SREP is, in basic terms, the name for that process from the perspective of the supervisor. Although the audit will consider the final number for the Economic Capital value proposed by the financial institution (FI), that will not be the primary focus. A primary focus of the SRP will be validation, i.e. model and process validation. The supervisor will review and validate the modeling process undertaken by the FI to ensure that:

a) The behaviour of the default factors in under stressed conditions is well understood by the FI modeling team, and

b) The FI modeling team has a well-developed model which links a macroeconomic scenario via default factors to the FI balance sheet, i.e. a Factor model.

This explains why there is no simple “off the shelf” answer for any FI in the P2 SRP space; each FI must build and understand an individual heuristic or model set which underpins its ICAAP.

Stress Testing Methodologies

As we have said, it is an over-simplification to view the Structural and Reduced Form (RF) approaches as opposing or contradictory methodologies. They are part of an evolutionary process in the development of methodologies to represent the expected performance of a balance sheet value with respect to a posited change in macroeconomic conditions. There is a great deal of academic literature in this space examining and re-examining the empirical and theoretical validity of the various approaches within the methodologies. The downside of the Merton-type Structural models was that they became “computationally expensive”. Firstly they had to be extended because of the restrictive, over-simplified assumptions of the initial model. Then the ‘completeness’ of the mathematics deployed meant that, with this general multivariate approach to modeling a call option, it can be difficult if not impossible to estimate Expected Returns, depending upon the distribution of errors. The correlations must also be estimated or computed from first principles every time the model is run. This applies to all so-called structural mathematical models, in all disciplines (e.g. biology, engineering etc.). So a computationally inexpensive algorithm for simulating correlated defaults became the objective of applied academic thinking. While most multi-name models require simulation, the need for accurate and fast computation of “Greeks” (the sensitivities of prices to underlying parameters) has pushed researchers to look for modeling alternatives; i.e. to exploit a low-dimensional factor structure and conditional independence to obtain semi-analytical solutions. This approach is known as Factor Modeling.12

12 Quantitative Risk Management: Concepts, Techniques, and Tools, Alexander J. McNeil, Rüdiger Frey, and Paul Embrechts
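To illustrate the conditional-independence idea referred to above, the sketch below uses the standard one-factor Gaussian threshold model, in which obligors default independently once the single systematic factor is fixed; the PD and correlation values are hypothetical.

```python
# One-factor Gaussian threshold model: conditional on the systematic factor F,
# obligors default independently with probability p(F). This conditional
# independence is what makes semi-analytical treatment possible.
# Parameter values below are hypothetical.
from scipy.stats import norm

pd_uncond = 0.02   # unconditional one-year PD (hypothetical)
rho = 0.15         # asset correlation with the systematic factor (hypothetical)

def conditional_pd(f, p=pd_uncond, rho=rho):
    """PD conditional on a realisation f of the standard-normal systematic factor."""
    return norm.cdf((norm.ppf(p) - rho ** 0.5 * f) / (1 - rho) ** 0.5)

# Stressing the factor to a 1/100 adverse realisation raises the conditional PD:
print("baseline PD       :", round(conditional_pd(0.0), 4))
print("1/100 stressed PD :", round(conditional_pd(norm.ppf(0.01)), 4))
```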

In reduced form models the time of default is modeled as exogenously defined; the assumption is that default is an unpredictable event governed by a hazard-rate process. Since in the Structural Model the default process can be endogenously derived, the structural model can be recast as a reduced form model, making the structural modeling approach a special case of the reduced form approach. The proprietary Structural Credit Risk Models of KMV or Credit Metrics, regularly cited in Basel II related papers, are inspired by the Nobel Prize winning Merton model, but they are easier to understand in their Reduced Form: the essentially stochastic nature of default can be understood more clearly in RF models in continuous time. The proprietary aspect of the Structural Credit Risk models is the manner in which they arrive at the levels of default for corporate exposures expected under current conditions, which flows from the large databases of corporate default maintained by the companies managing these tools. The structural model is therefore the explanation, or the story, of default, and the RF representation is the stochastic expression of how that default may behave in continuous time. The RF representation of KMV or Credit Metrics allows stress testing because factors can be stressed to behave extremely. The strength of the RF description of structural credit risk models is in simulation, statistical estimation and comparability with other credit models.
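As a minimal sketch of the reduced form view, the following simulates default times governed by a constant hazard rate; the intensity value is hypothetical, and a fuller model would make the intensity itself depend on factors.

```python
# Reduced-form sketch: the default time tau is exponentially distributed with a
# constant hazard rate (intensity) lam, so P(tau <= t) = 1 - exp(-lam * t).
# A richer model would make lam stochastic and factor-driven; lam here is hypothetical.
import numpy as np

lam = 0.03                       # hypothetical constant default intensity (per year)
rng = np.random.default_rng(42)
tau = rng.exponential(1.0 / lam, size=100_000)   # simulated default times (years)

horizon = 1.0
simulated_pd = np.mean(tau <= horizon)
analytic_pd = 1.0 - np.exp(-lam * horizon)
print(f"1-year PD: simulated {simulated_pd:.4f} vs analytic {analytic_pd:.4f}")
```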

Factor Modeling

The ST and ICAAP approach selected by SAP is called Factor Modeling. This is consistent with the approach to Stress Testing outlined by BIS in the paper by Marco Sorge. Essentially, the Structural models for Credit Risk implemented, for example, by Moody’s KMV are factor models of a complex and structural nature, but they can be understood through a single factor equation. The Reduced Form models in the Front Office of the Trading Book are Factor models too, designed for quick calculations to support valuation and pricing tools. Both are Factor models because they take the value we wish to understand on the left hand side of the equation and explain its possible values as a relationship between the factors which drive it, set out on the right hand side of the equation. Factor modeling is the appropriate, pragmatic and do-able methodological approach for the Stress Test; it provides methodological openness and is therefore consistent with the open architecture of SAP Bank Analyzer.

Factor Models in Context – Credit Risk

The use of Structural Models in Credit Risk does not align very well with the original deployment of the Reduced Form (RF) in econometrics. A Structural Credit Risk model can be viewed as a process of writing down how default occurs, whereas RF is a pragmatic approach to the intensity of default. The apparent distinction between RF and Structural is not a clear divide; they are perspectives, and an RF model can be expanded into a structural model. The deployment of the RF in credit risk is usually associated with Duffie and Singleton (D+S)13 and is an academic approach to modeling Credit Risk in continuous time. The conditional independence of default given factors could be called discrete reduced form modeling, or mathematical modeling of default in a discrete time framework. This is perfectly adequate for economic capital modeling and is a good framework for Stress Testing.

13 Credit Risk: Pricing, Measurement, and Management Darrell Duffie and Kenneth J. Singleton

A Factor Model in Pillar One

This approach aligns well with the Gordy paper on the SFA for capital calculation in securitization, which implements the asymptotic single risk factor (ASRF) framework, and with the more general approaches to Stress Testing described by the Sorge paper from BIS. The essence of the ICAAP is a factor model: default is driven by factors, usually macro, and either clearly identified or latent and largely unidentified but approximated where appropriate. Factor Modeling is the approach which allows common factors to drive risks (or hazard) in credit and market risk portfolios. On the credit side, factor modeling is a discrete time approach to modeling the impact of factors upon the portfolio.

Factor Models in Market Risk

A Factor Model is appropriate in Market Risk; Factor Models have been around in this space for a number of years. Some large Financial Institutions in Europe have deployed factor models to joint-model Market and Credit Risk (i.e. to move towards the aggregation of risk, and thus of economic capital) in a coherent and robust manner. In Market Risk or Trading Book modeling one can deploy a conventional econometric factor model for the trading portfolio. The objective of factor models is to have low dimensionality in the factors influencing the risk being detailed; here one can use vector autoregression (VAR, with an uppercase ‘A’, not to be confused with Value at Risk) for low-dimensional factors in this type of modeling. There is a key efficiency gain in the modeling activity from deploying the same approach to Market Risk modeling as for Credit Risk: some factors for the PD model and some macroeconomic drivers of the Market Risk model will be common. To use the factor model in a forward looking way you have to understand the dynamics of the macro factors. The appropriate time scale of the modeling is also important: one year is the canonical time scale for credit risk, while in market risk the time frame can be anything from one day down to intra-day or one tick (a second).
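The following sketch shows what a vector autoregression of two market-risk factors could look like using a generic statistical library; the factor series here are synthetic placeholders for the historical index returns and rate changes a bank would actually use.

```python
# Sketch of vector autoregression (VAR) for the joint dynamics of a small set of
# market-risk factors. The two factor series here are synthetic placeholders;
# in practice they would be historical index returns, rate changes, etc.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 120  # ten years of monthly observations (synthetic)
factors = pd.DataFrame({
    "equity_index_return": rng.normal(0.005, 0.04, n),
    "short_rate_change": rng.normal(0.000, 0.002, n),
})

model = VAR(factors)
results = model.fit(2)                                        # VAR(2): two lags of both factors
forecast = results.forecast(factors.values[-2:], steps=12)    # 12-month factor forecast
print(results.summary())
print(forecast[:3])
```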

Factor Models of Credit and Market Risk.

To aggregate the credit and market risk factor influence, you could optimally apply the same time period to both, for example one month. Keeping the factor dimension low and keeping the latent variable one-dimensional means that you can model credit risk and market risk analytically. The SRP is annual, therefore the time base of the Economic Capital calculation (for supervisory purposes) in the Trading Book (TB) is one year. Operationally, we know that no bank is going to maintain a static portfolio asset allocation over that year; the institution will react very rapidly to an adverse scenario, or to an indication of adversity. This makes economic capital calculation for the trading book difficult, or unrealistic, because in reality the TB does not stand still.

Factor Models – Alignment with BIS and CEBS

Factor Modeling is a methodology which encompasses both RF and Structural; the essence of credit portfolio risk modeling is the reduction of the dimensionality of the exposures so that they are attributable to a small number of factors. Factor Modeling is the optimal approach to the Stress Test and the ICAAP; it models portfolios simply and tractably. It is a methodology which can be deployed by any type of Financial Institution within the governance processes required by CEBS for P2, supported by the Solution Architecture of the SAP Capital Adequacy Solution. Multivariate factor modeling is the best possible approach; single factor modeling (like the ASRF of the SFA) is probably over-simplified, since using one factor means missing some of the factors driving the behaviour of pockets of the portfolio. The Gordy single factor approach may nevertheless be necessary where other complexities exist in the risk being modeled (e.g. the complex transaction structure of a securitized instrument).

The Maths of a Factor Model of Credit Risk

Below we present the archetype factor model for Credit Risk, prepared for SAP by Professor Alexander J. McNeil, cited above.

A Quantitative Methodology.

Before we present an exposition of the Solution Architecture of the SAP Capital Adequacy Solution (CAS), it is necessary, as part of a rigorous process, to define the Quantitative Approach which SAP has selected as optimal for the Stress Test and the ICAAP and upon which the Solution Architecture is therefore predicated. We have made reference to our Quantitative Risk Management (QRM) outlook and described our Factor Modelling methodology in detail in other sections of this document. When preparing to develop a solution to support the Stress Test (ST) and the ICAAP, a Financial Institution is faced with a bewildering array of modelling approaches, from the simple modelling of Customer Scoring for credit decisions to massively complex Macroeconomic models supporting strategy decisions based upon forecasts, and all points in between. It is possibly this bewildering choice matrix which led to the view last year that P2 might not happen. So it is important that we clearly present the logic by which our selection of Factor Modelling is supported by appropriate statistical functions developed in a robust Solution Architecture.

Modeling Techniques for the Stress Test

There is a large set of possible mathematical techniques which could be deployed to support the Stress Test. The objective of SAP in designing a Solution Architecture to support the Stress Test (and the ICAAP) is first to select the right mathematical approach which can be implemented pragmatically. The key is to select a mathematical approach which can integrate with all of the other approaches (possibly already implemented in systems) which the Bank may have deployed over the years. The right approach is called Factor Modelling. This is consistent with the approach to Stress Testing outlined by BIS in the paper by Marco Sorge. Factor modelling is the suitable, pragmatic methodology for the Stress Test; it provides methodological openness and is therefore consistent with the open architecture of SAP Bank Analyzer, which can integrate with proprietary 3rd party structural models and support simple Reduced Form modelling techniques as well. SAP has already completed integration with one vendor's structural product for Credit Risk.

SAP CAS SOLUTION ARCHITECTURE

STRATEGIC FUNCTION AND PRESSING NEED
SUMMARY OF THE P2 BUSINESS REQUIREMENTS
THE PROCESS ARCHITECTURE FOR PILLAR 2
PILLAR 2 REPORTS
A LOGICAL ARCHITECTURE FOR PILLAR 2
STATISTICAL MODELING AND ANALYTICS IN THE SAP CAPITAL ADEQUACY SOLUTION
THE SAP CAPITAL ADEQUACY SOLUTION FUNCTIONAL OVERVIEW
RISK TYPES AND THE USER INTERFACE
THE P2 MODELING CYCLE WITH THE SAP CAS
SAP BANK ANALYZER AND P2
SAP BANK ANALYZER & CREDIT RISK
THE SAP CAPITAL ADEQUACY SOLUTION FOR CURRENT SAP BANK ANALYZER CLIENTS
THE SAP CAPITAL ADEQUACY SOLUTION FOR NEW SAP BANK ANALYZER CLIENTS
THE BENEFITS OF IMPLEMENTING THE SAP CAS

STRATEGIC FUNCTION AND PRESSING NEED

The Pillar 2 requirements set out elsewhere in this White Paper call for a solution of unusual flexibility and exceptionally broad scope. On the one hand, the strategic nature of Pillar 2 calls for an architecture which will be able to support increasingly sophisticated analytics in the years ahead. Factor Models require the use of exogenous time series as driving factors from a potentially wide variety of sources (macro-economic time series, macro-economic relationships [term structure], industrial sector indices, market volumes etc.); the final set to be deployed in any Financial Institution will be decided by that institution’s personnel. Factor Models also involve the application of relatively advanced mathematical equations and statistical functions. The complexity and sophistication of these requirements is not to be underestimated, although great advances in both the mathematics and the software applications that support it have been made in recent years, and are still being made (e.g. QRM).

On the other hand, there is an urgent need now, in early 2006, for an architecture which is capable of making maximum reuse of existing data and components, from both SAP and other vendors, in order to allow banks which are currently without a Pillar 2 solution to ‘get over the line’ of supervisory approval when submitting their ICAAP, possibly around September of this year but dependent upon the timetable of the home supervisor. SAP proposes a solution which is intended to reconcile these competing requirements by being at once both strategic and pragmatic. The framework to achieve this is provided by SAP Bank Analyzer, a product architecture which contains both existing Basel II analytics which can be readily adapted for use in a Pillar 2 context, and the kind of time series modelling functions and data stores essential for implementing a Factor Model Reaction Function for a Financial Institution. Indeed, it is this Reaction Function development approach, underpinned by Factor Models, which gives the bank a solid foundation to evolve towards true Enterprise Risk Management (ERM) with SAP. True ERM is a common goal in Financial Services right now and an objective which has until now eluded even the most sophisticated institutions. In presenting the SAP solution to Pillar 2, we:

- Review the business requirements for Pillar 2;
- Set out the process architecture to support Pillar 2;
- Outline the reporting requirements for Pillar 2;
- Outline a logical architecture capable of supporting these requirements; and then
- Present the SAP Capital Adequacy Solution, based on the physical architecture of SAP Bank Analyzer, which we match against the template represented by the logical architecture.

Summary of the P2 Business Requirements

Based on the overview of the business architecture requirements set out elsewhere in this paper, a supervised financial institution must implement a system to meet the requirements of Pillar 2 in full. Such a system must be:

- comprehensive (“other risks”),
- forward looking (“consistent with strategy and business plan”, “chosen target levels of capital”),
- focussed on economic capital (“relates capital to the level of risk” etc.), and
- auditable to a high standard (“… audit to ensure process integrity”, requirements for model validation etc.).

In terms of the data, systems-relevant processes and reporting outputs expected of a Pillar 2 solution, these are neatly encapsulated in the following graphic published last year by the Bank for International Settlements:14

14 http://www.bis.org/publ/cgfs24.pdf; Stress testing at major financial institutions: survey results and practice, report by a working group established by the Committee on the Global Financial System, January 2005

From a system-based processes perspective, the overriding focus in Pillar 2 is on the Economic Capital (E.Cap) calculation and the Stress Test (ST). However, as set out elsewhere in this paper, the distinction between E.Cap calculation and stress testing relates not so much to the method of calculation as to the input parameters used:

When calculating E.Cap, the baseline assumptions and parameters are made as realistic as possible, so that the financial institution can evaluate its risk capital on as realistic a basis as possible.

The Stress Test, conversely, sets out to test the financial institution’s robustness by using macro-economic parameters which are possible but not expected (e.g. a significant rise in unemployment or interest rates, a large fall in the Dollar against the Euro and so on).

From a systems and process point of view therefore, the baseline E.Cap calculation can be viewed as just one of a number of ST scenarios, set apart from the rest only by the use of realistic rather than contrafactual macro-economic driving factors. The overall scope of the data required for Pillar 2 is covered in the central portion of the BIS diagram, i.e. the requirement is to cover not only credit, market and operational risks but other risks as well. The full breakdown of risks to be covered by the SAP solution is shown in the following Risk Distribution Matrix:

The challenges posed by this requirement summary table from a process and architectural standpoint are taken up in detail below; it is sufficient here to note the sheer diversity of requirements exposed by this analysis of Pillar 2. The Pillar 2 business requirement covers a number of categories of data which are not covered by Pillar 1. However, even those risks which are covered by Pillar 1 cannot be treated identically under Pillars 1 and 2, even for capital requirement calculations (the E.Cap calculation), putting aside for the moment the additional requirements of the Stress Test. This is because the regulatory assumptions which apply to Pillar 1 may well be unrealistic in the context of Pillar 2 calculations. For instance, the recovery rates specified for secured lending under Pillar 1 are assumed to be relatively high (40% plus), whereas in reality the recovery rates in some markets in the UK and Ireland are well known to be much lower, in some cases almost negligible. Finally, note that the categorisation of Basel II approaches as Standardised, Foundation IRB or Advanced IRB applies only to Pillars 1 and 3. When it comes to Pillar 2, it is up to the supervised institution to propose its own solution, free of the prescriptive regulations which apply to the other two Pillars.

The Process Architecture for Pillar 2

The Risk Distribution Matrix shows the diversity of risk types to be supported by the SAP CAS for Pillar 2. For each risk type, there is a separate calculation or set of calculations to be carried out, whether via a Factor Model which can express the increasing exposure of the Financial Institution to hazard (e.g. increasing PD in Credit Risk) or via the response of the aggregated supervisory view of the Trading portfolio to Extreme Events. However, the apparent complexity implied by the Risk Distribution Matrix should not divert attention from the fact that the basic process of building up a Reaction Function to cover these risk types is essentially rather simple. The overall process is an iterative one, in which models are run for each risk type in turn in order to relate external factors to internal history data and so derive the Factor equation, or set of equations, which describes the manner in which that risk type may respond to changing macroeconomic conditions. The resulting series of equations, covering all risk types, modelled via the Factor approach where appropriate and via simpler techniques where simple is all that is necessary, together constitute the Reaction Function for the individual financial institution. The simplicity of the process of building the Reaction Function derives from the fact that, although many different factors and types of data are involved in building the equations for the different risk types (e.g. default history for Credit Risk, volatility for Market Risk), the modelling process to be undertaken in each case is fundamentally the same: iterative refinement of the fit between external factors and internal bank data for each risk type.

Once the core Reaction Function equations have been created and calibrated, they can then be applied in the running of scenarios – both baseline (unstressed, assuming current conditions continue to prevail) and stressed – i.e. what is normally thought of as constituting the Stress Test. The stages involved in building the Reaction Function and running the Stress Test may be summarised under the following three headings:

1. Collate input data for the Stress Test
2. Define the Factor Model for the Risk Type
3. Run the Stress Test

1. Collate input data for the Stress Test

The inputs consist mainly though not exclusively of time series, and fall into two categories – i.e. external (data from central banks and commercial organisations) as opposed to internal (the financial institution’s own history data), as follows: External data is subdivided under two subheadings (all time series), i.e.

Macro-economic data such as GDP, unemployment, inflation and government spending; and

Market data including prices of tradable instruments and their volatility, market indices (FTSE, Dow Jones), yield curves (LIBOR, term structures and forward rates) and transaction volumes (liquidity risk).

Internal data similarly falls under two sub-headings:

Time series, including customer default history by exposure type, actual price history of traded instruments by supervisory category, their volatility if available, history of funding rate paid by maturity, also history of duration of illiquidity due to extreme events (e.g. 9/11/2001); and

Current static portfolio asset allocation and balance data such as exposures for both banking book and traded products.

Some of the external data can be obtained free from central banks. Other data is available commercially from Reuters, Thomson, the stock exchanges and other standard providers of historic reference data. Note that, in SAP Bank Analyzer, time series data is stored in the Historic Database (HDB), whereas the static data and balances are held in the main Source Data Layer (SDL, previously termed the FDB). The level at which internal data is aggregated will depend on the requirements of the particular financial institution. The regulators have specified only a high level of summarisation – for example, data must be separated out by instrument type (equity, fixed income, commodities and FX) in the trading book, but the lower level details are for each institution to decide based on its own needs and circumstances. Thus, an institution might opt to aggregate data per business unit, so as to leverage its Pillar 2 solution and calculate Economic Capital by business unit rather than at Group level. Similarly, an institution may decide, in the interest of convenience and operability, to aggregate its credit risk data by Pillar 1 categories such as risk class (shown here), even though this is not required by the regulations. This approach would have the advantage of giving maximum support for the reconciliation of Pillar 2 data with data for Pillar 1. In addition, for existing users of the Bank Analyzer solution for P1/P3, this approach facilitates complete reuse of the categorisation of the Banking Book from the P1/P3 solution in the P2 CAS platform. In principle, granular data can be used rather than aggregated data.

2. Definition of the Reaction Function

Having collated the input data as described, the next step is to identify, for each relevant risk type, the pairings of external and internal time series which will form the basis of the Stress Test. Examples:

For Credit Risk: GDP or unemployment vs. bank-internal default history for personal or mortgage exposures and possibly an industrial sector time series as a driving factor for corporate default.

For Market Risk: An appropriate standard Stock Exchange, Currency or Commodity Index and a price volatility measure, applied to the bank-internal asset allocation.

For Interest Rate Risk: A term structure expectation model plus an inflation expectation model applied to the internal asset and liability maturity bands and the total exposure of the bank therein.

For (Market) Liquidity Risk: market depth measures are the challenge: the existence of a reference market (distinguishing between a product whose price is and is not observable), the degree of uniqueness of the traded product, and market velocity (volume / number of trades) for each traded product.

For each risk type, an equation or set of equations must be defined to relate external to internal data. Regressions must be run iteratively to find the best fit between external historic shocks and internal responses (i.e. the best fit will give the appropriate time lag between driving factor (external and generally macro) and internal value). As part of this process, the level of error and white noise must be assessed e.g. by use of the Durbin-Watson statistic. A key part of the process just outlined will be to select the modelling methodology appropriate to the particular risk type being processed. Each risk type could potentially require a different methodology. For instance, where Credit Enhancement Risk would normally require asymptotic approximation (as used in the Supervisory Formula Approach to calculating Regulatory Capital for Securitized Instruments under Pillar 1), Liquidity Risk might be treated using vector autoregression.
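The iterative fitting loop described above can be sketched as follows; the unemployment and default-rate series are synthetic, and the lag search plus Durbin-Watson check stand in for the fuller diagnostics a real exercise would apply.

```python
# Sketch of the iterative fitting step: regress an internal series (default rate)
# on a lagged external factor (unemployment) for several candidate lags and keep
# the lag with the best fit, checking the Durbin-Watson statistic for autocorrelation.
# The series below are synthetic; a real exercise would use the bank's own history.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
n = 60
unemployment = 0.05 + 0.01 * np.sin(np.arange(n) / 6) + rng.normal(0, 0.002, n)
# internal default rate responds to unemployment with a 3-period lag (by construction)
default_rate = 0.01 + 0.4 * np.roll(unemployment, 3) + rng.normal(0, 0.001, n)

best = None
for lag in range(0, 7):
    y = default_rate[lag:]
    x = sm.add_constant(unemployment[: n - lag])
    fit = sm.OLS(y, x).fit()
    dw = durbin_watson(fit.resid)
    if best is None or fit.rsquared > best[1]:
        best = (lag, fit.rsquared, dw)

print(f"best lag={best[0]} periods, R^2={best[1]:.3f}, Durbin-Watson={best[2]:.2f}")
```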

An overview of the main choices available is given in the following table which is an extension of the Risk Distribution matrix (above) to detail all of the appropriate mathematical approaches, statistical functions and external and internal factors (summary level) envisaged by SAP as required to support initial Stress Testing and ICAAP construction in Europe in 2006/7.

It must be stressed that the process of defining and modelling the equations by risk type and finding the best fit between internal and external data is by no means mechanistic or automated. ‘Expert Judgement’ must be exercised by informed and skilled staff throughout (arguably it is a factor in every equation), both in defining how the stress scenarios are to be tested (e.g. the choice of exogenous variables) and in judging the best fit between internal and external data. The output from this process step is a set of equations, one for each risk type, which defines the Bank’s baseline position assuming the continuation of currently prevailing conditions. The set of equations, taken together, constitutes the Bank’s Reaction Function. Running the full set of equations produces the Bank’s Economic Capital under current economic conditions (the Baseline Scenario). The process of defining the Reaction Function is likely to take place at infrequent intervals, e.g. quarterly.

3. Run the Stress Test

With input data collated and the Reaction Function defined, the final stage in the process is to run the Stress Test itself, in two steps:

For convenience and completeness, the first step is to rerun the Baseline Scenario, i.e. the scenario which defines the Bank’s Economic Capital under current (unstressed) conditions.

This is then followed by running the stressed scenarios – what is usually thought of as constituting the Stress Test.

This process will run more frequently than the preceding process to define the baseline; for instance, the Stress Test would run on an ad hoc basis, as required by the supervisor, as well as being run periodically, perhaps monthly, to meet the needs of senior executives. A high level summary of the end to end process of defining the Reaction Function and running the Stress Test is given in the graphic below.

Pillar 2 Reports

Pillar 2 reporting requirements will fall under at least three headings:

- Reports to support definition of the Reaction Function
- Stress Test reports
- Audit reports

The distinction between reports for Reaction Function definition and those for the Stress Test calls for further explanation. When running the Stress Test proper, the models used will in principle be the same physical models as those used when defining the Reaction Function. However, the respective reporting outputs are likely to be different in each case.

When defining the Reaction Function, the analysts will be looking at a detailed level, and may well require graphic representations of external and internal data outputs.

When running the Stress Test proper, conversely, the target audience will be senior executives and the Supervisor himself. This audience is likely to require a much more succinct, high level summary view of the Economic Capital position under stressed conditions.

1. Reports to support the Reaction Function

When defining the Reaction Function, the detailed information required by analysts will include estimates of the lag between a driving macro factor and its impact on the internal data series (which will usually be held stable for stress purposes), and measures of error (kurtosis, skew, the Durbin-Watson statistic). Reports to display the underlying time series data will also be required. These could be either conventional reports or graphics.

2. Stress Test Reports

Stress Test outputs for senior executives and the supervisor will be highly summarised. The main outputs will be list reports summarising Economic Capital by risk type per scenario. It is likely that indications of the methodology used and the error percentage will be relevant here. Supporting detail in the form of list reports will clearly also be required. This can be expected to include breakdowns by product category, business unit and other dimensions, along with extra key figure information such as business volume, variance of stressed scenarios against the baseline scenario and variance of economic capital against regulatory capital (Pillar 1) for those risk types where this is relevant (i.e. credit risk, market risk and operational risk).

3. An Executive Cockpit

There will be a (developing) demand for graphical reports showing comparisons of one or more stressed scenarios with the baseline. Evidence that this will be required is provided specifically by a paper from the Irish Central Bank (CBFSAI)15 which gives illustrations of the expected output from a stress test. This is summarised in the following graphic, where list reports showing the underlying macro-economic data over three years (the current year and two forecast years) are supplemented by graphical representations of the results of the Stress Test:

Stress Test reports, whether list-based or graphical, will be produced from SAP Business Intelligence (BI).

15 Mawdsley, A. et al., ‘The Stress Testing of Irish Credit Institutions’, Central Bank and Financial Services Authority of Ireland (CBFSAI), 2003.

4. Audit Reports

In addition to the executive reports and analyst reports outlined in the two preceding sections, there will be a need for operational reports for reconciliation and other audit purposes such as data privacy. As far as the reports on data privacy are concerned, these are generic reports provided as standard by SAP. The requirement for reconciliation reports will vary depending on whether or not the client is using SAP Bank Analyzer for Pillar 1 as well as Pillar 2. Where this is the case, the underlying data for the two pillars will be identical, and therefore all that will be required is detailed reports, e.g. with drill-down capability, to trace differences between Regulatory and Economic Capital at whatever level of detail is required by the client. Where the underlying data for Pillar 1 is held on a non-SAP system, conversely, there will be a need for reconciliation reports between SAP and non-SAP systems.

A Logical Architecture for Pillar 2

The process and reporting requirements reviewed in the preceding sections call for an architecture based on the following fundamental building blocks:

- Granular & extensible Basel II compliant data model
- Preconfigured data mart providing ‘out of the box’ implementation of the data model
- Internal analytics (potentially including 3rd party statistical method libraries)
- Ability to integrate external (3rd party) analytics & statistical modelling tools if required
- Communications services e.g. to integrate 3rd party models (Service Oriented Architecture or SOA)
- Aggregated results data layer to supply data for the presentation layer and also e.g. to integrate results from 3rd party models and applications
- Modern web-based / graphical presentation layer for executive reporting
- Time series indexed database e.g. for regressions, supporting macro economic and internal Bank risk management data
- ETL layer to support extraction of data from upstream data sources, including transactional source systems & data warehouses

We expand briefly on the rationale behind each of these statements in the sections immediately below. An initial, logical view of the suggested architecture is shown in the following graphic:

1. Granular & extensible Basel II compliant Data Model

The data model must be Basel II compliant, and must support both granular and aggregated information including:

- Individual banking contracts or portfolios of contracts (for maturity data, identification of counterparty, matching to collateral and so on);
- individual trading contracts or portfolios of contracts (down to individual leg level on swaps for market risk assessments etc.);
- balances and cash flows for recovery modelling and so on;
- counterparty information;
- time series inputs (contract history data for forward looking analysis etc.);
- market data (e.g. LIBOR rates including forward rates, price data etc.); and
- macro-economic data for the Stress Test.

It is critically important that the data model should not merely be Basel II compliant as currently understood (Pillar 1 / Pillar 3) but that it should be easily extensible to meet future changes in requirements. Otherwise any solution designed in mid 2006 will be obsolete before it is even implemented. SAP Bank Analyzer, and thus the SAP CAS, provides this type of platform.

2. Preconfigured Data Mart for Base Data

The data model for granular or aggregated data must be implemented in an appropriate data mart. The time pressures facing banks in 2006 (ICAAP due in September 2006) make it imperative that the data model should be delivered as pre-prepared templates already implemented in the data mart and ready for final configuration. To provide only a logical view, a mere concept still awaiting realisation, would be of little help in the current situation. The data sources providing data for this data mart are likely to be many and varied, possibly including data warehouses and existing Basel II P1 / P3 data marts as well as transactional front office systems. It is not necessarily the case that the analytics will reside physically in the same environment as the granular data mart which provides the data to be analysed or the results data store to which the outputs are written; in the increasingly open architectures supported today by systems built on the principles of Service Oriented Architecture (SOA), such as SAP Bank Analyzer, any such assumption of physical co-location is outmoded. The availability of numerous specialist calculation engines and modelling tools makes it highly likely that 3rd party models will have to be accessed via SOA from the main Pillar 2 environment.

3. Aggregated Results Data Layer

The data output by the analytics must be written out to an aggregated results data repository. This could be, but need not necessarily be, physically realised in the same environment as the reporting layer (see below). The aggregated data will in some instances consist of time series data. The aggregated results data layer could in principle serve as a data source for the analytics, so the fact that this layer is shown in our logical architecture graphic as sitting above the analytical layer does not imply that the data flow between the two is always in one direction.

4. Presentation Layer for Executive Reporting

As set out in the section on reporting requirements above, Pillar 2 output data will require appropriate graphical tools as well as the kind of reporting flexibility associated with OLAP reporting components. So the presentation layer referred to is likely to consist of one or more specialised reporting tools which extract data from an underlying aggregated results data layer.

5. Time Series Indexed Database

Support for time series data and analysis is absolutely indispensable, given that Factor equations extrapolate historic data to create forward-looking models. Macro-economic assumptions, required for both Economic Capital calculations and the Stress Test, can be viewed as market data from a systems point of view. Their accommodation in the architecture does not therefore pose any special problems. Internal bank risk management data certainly does pose system issues in one respect: that of data security and privacy. The sheer sensitivity of the data involved makes paramount the need for the tightest possible security. It is therefore essential that the security design adopted is robust and well proven in many existing implementations.

6. ETL layer

This layer is necessary to support extraction of data from upstream data sources, be they transactional source systems or data warehouse environments. Conceptually, this is one layer. In practice, there is hardly ever an implementation where a single middleware tool provides a single physical layer corresponding to the layer shown in the logical view. The source systems feeding the ETL could well be data warehouses or existing Basel II data marts as well as customer-facing transactional operative product systems. For banks with an existing Basel II P1 / P3 solution, it would be a default assumption that the Pillar 2 ETL feeds data into a P2 solution positioned as an extra layer above the solution for P1 and P3.

7. Analytics and Statistical Modeling

The analytics must handle the calculations required to support Factor Modelling, define the Reaction Function and run the Stress Test. The range of statistical modelling methodologies to be supported is potentially quite extensive, including moving average, time series regression, missing value interpolation and multivariate time series regression; vector autoregression and asymptotic approximation are further important examples. The table below illustrates the main statistical functions and the risk types to which they could potentially apply.

Statistical Modeling and Analytics in the SAP Capital Adequacy Solution.

1. Multiple or Multivariate Linear Regression (MLR)

Multiple (or Multivariate) Linear Regression (MLR) is one of the standard workhorses of statistical functionality (denoted ‘std’ in the STAT Function column in the table above). Most of the more complex ‘fin’ (financial) statistical functions can be generalised from the standard functions. MLR is standard SAP functionality which supports Factor Modelling. All of the screenshots below are from the standard SAP MLR function library included as part of the SAP CAS. The objective of MLR modelling is to assess how the development of one (dependent) variable can be explained by several (independent) variables and a constant value:

Dependent Var. = const + coef1 * var1 + coef2 * var2 + ... + error

For a causal analysis MLR does the final calculation of the regression coefficients and the constant. Typical variables include:

- Default History, Instrument Price Volatility
- GDP, inflation, unemployment, Stock Exchange Index
- Industrial Sector Index, Company Rating, Personal demographics

MLR is usually deployed in a three-phase workflow:

Specification: The model building activity, understanding the business, identifying the independent variables.

Estimation and Simulation: Collecting the data and fitting the model to it. This process identifies the relationship between the independent variables and the dependent variable based on historical data, and models the effect (e.g. how one independent variable, or a combination of them, drives the dependent variable). This phase also involves testing the model and checking plausibility.

Prediction & Forecasting (for the Stress Test, modelling hypothetical scenarios). In this procedure, the user produces a forecast for the independent variables and then forecasts the dependent variable (based, perhaps, on different scenarios for the independent variables).

Historical data for all independent variables has to be included in the MLR model. The objective of the modelling procedure undertaken by the analyst using the MLR tool is to ascertain precisely (with as low an error coefficient as possible) which variables influence the development of the resultant in which that analyst is interested (e.g. credit default histories and GNP in the case of credit risk); further, the analyst is seeking to establish the manner in which the driving factor determines the explained variable, in particular in terms of Time Lag. There is a view that substantial experience is required for modelling causal effects exhibited through lags in MLR models, but it is the experience of SAP in the deployment of MLR modelling that an analyst can engage in the feedback workflow quite quickly. The statistical challenges are the existence of autocorrelation and multicollinearity in the time series. The first key measure of MLR model quality is the coefficient of determination, R² (R squared), or “Goodness of Fit”: a summary measure telling us how well the variables (drivers) explain the dependent variable. Its range is 0 to 1.

The second key measurement is the t-Statistic, which tells us whether or not an independent variable is correlated with the dependent variable; the sign (+ or -) tells us in what direction. The Durbin-Watson statistic measures autocorrelation (this is to be avoided since it obfuscates and casts doubt upon your results): the errors from the forecasted history show a pattern, which is bad because the forecast projected into the past is consistently above or below the history. The Durbin-Watson statistic has a range of 0 to 4; the acceptable range is 1.5 to 2.5. The related Durbin-h statistic does a similar job: if Durbin-h > 1.96, this is bad (you have autocorrelation). The final challenge of MLR modelling is multicollinearity, which is the existence of a relationship between two or more of the independent variables themselves in the model. All of the good measurement rules above then become unreliable. There is no true measurement coefficient for this, only symptoms, which are a high R², a low t-statistic, or a coefficient with the wrong sign. The only true way of testing for multicollinearity is to regress each independent variable against the others and then look at the R² coefficient; if R² is near 1, there is a relationship between the independent variables. For completeness we present a graphical user interface (for a non-financial MLR model) in the column on the left. This is the standard SAP MLR module graphical user interface.
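For illustration, the sketch below reproduces the fit and the diagnostics just described (R², t-statistics, Durbin-Watson and the pairwise regression check for multicollinearity) using a generic statistics library on synthetic data; it is not the SAP MLR function library itself.

```python
# Illustrative MLR fit and diagnostics (R-squared, t-statistics, Durbin-Watson,
# and a pairwise-R-squared check for multicollinearity), using generic
# statsmodels OLS on synthetic data - not the SAP MLR function library.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(7)
n = 80
gdp_growth = rng.normal(0.02, 0.01, n)
unemployment = 0.08 - 0.8 * gdp_growth + rng.normal(0, 0.004, n)   # deliberately collinear with GDP
sector_index = rng.normal(0.0, 0.05, n)
default_rate = 0.015 - 0.3 * gdp_growth + 0.2 * unemployment + rng.normal(0, 0.002, n)

X = pd.DataFrame({"gdp_growth": gdp_growth, "unemployment": unemployment, "sector_index": sector_index})
fit = sm.OLS(default_rate, sm.add_constant(X)).fit()

print("R-squared     :", round(fit.rsquared, 3))
print("t-statistics  :", fit.tvalues.round(2).to_dict())
print("Durbin-Watson :", round(durbin_watson(fit.resid), 2))

# Multicollinearity symptom check: regress each independent variable on the others
for col in X.columns:
    others = sm.add_constant(X.drop(columns=col))
    r2 = sm.OLS(X[col], others).fit().rsquared
    print(f"R-squared of {col} on the other drivers: {r2:.2f}")
```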

2. Statistical Inference in Market (& P2) Risk

MLR is an implementation of a general statistical inference technique appropriate for all sorts of applications; SAP has deployed MLR in the Supply Chain Demand Planning domain for many years. In some instances specific types of MLR model are most appropriate to the specific type of data being modelled, and statistical inference techniques are developing all the time. This is a similar optimisation challenge, in addressing the Stress Test, to the one we faced when selecting the optimal mathematical approach earlier in this White Paper. It is perhaps surprising that Market Risk is simpler to model than Credit Risk; this is because we can generally model Market Risk continuously, without the discrete aspects of credit risk modelling caused by defaults. We have described some different approaches to the aspects of market risk in the table above relating STAT Functions to Risk Types; specifically, we have stated that aspects of Market Risk are supported by ARMA and VARMA, EVT, Volatility Modelling, Copula and DMM. For the functional requirement underlying these statistical functions see other sections of this paper; it is sufficient to note here that these “generalisations” of the MLR framework are the precise response to the business challenge presented by the risk types in terms of P2 modelling. These specific techniques are not radically new types of functionality; they are predicated upon the MLR method.

3. Operational versus Supervisory E.Cap in the TB

The operational Trading Books (and thus the computation of Economic Capital in the TB) operate on very short time bases. In the Middle Office, rapid calculation of potential loss and risk exposure will be executed almost instantaneously on a position basis and with a portfolio view. We need to underline here that the Economic Capital computed for the SRP will be ‘unrealistic’, or at least distinct from operational Economic Capital, because of (1) the different time base of the economic capital computation for supervisory purposes (quarterly, monthly) versus operational purposes (daily, weekly) and (2) the aggregated view of Economic Capital required to be computed for the SRP. BIS requires, and CEBS repeats the requirement, that Economic Capital in the Trading Book should be “aggregated up” to four aggregate classes: Equity, Fixed Interest, Commodity and FX. This is a very different capital calculation process from the operational one, inevitably leading to a requirement for functionality, database management and user skills distinct from those in the operational middle office. There is a challenge to the modelling requirement for P2 posed by the aggregate nature of the supervisory classes of market risk. We can get around that by using an index appropriate to the aggregated class, e.g. the FTSE or the DAX for Equity; the FTSE-Gilts Index or the Euro bund option from Eurex for Fixed Interest; the Aluminium index from the LME, Crude Oil from Nymex or the UKPX Spot Index for electricity for Commodities; and maybe the Euro spot/forward rate from Reuters for the FX class. Here we are effectively modelling the asset allocation of the aggregated portfolio forward into the contrafactual, and we can use the simple Sorge equation from the BIS paper on Stress Testing (a simple low-dimensional Factor Model) wherein we make the evaluand Economic Capital. We are relying here on the overwhelming evidence collected since Merton and the CAPM that, empirically, index returns explain individual returns (i.e. you can't beat the market).
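A minimal sketch of this index-based treatment of the four aggregated classes follows; exposures, betas and stressed index moves are all hypothetical.

```python
# Sketch of the index-based approach to the four aggregated supervisory classes:
# each class is represented by a proxy index, the class exposure has an estimated
# beta to that index, and a stressed index move translates into a stressed loss.
# Exposures, betas and shocks below are hypothetical.
exposures = {"equity": 300e6, "fixed_interest": 700e6, "commodity": 80e6, "fx": 120e6}
betas     = {"equity": 1.1,   "fixed_interest": 0.9,   "commodity": 1.0,  "fx": 1.0}
# stressed proxy-index moves (e.g. FTSE, gilt index, LME aluminium, EUR spot)
index_shock = {"equity": -0.30, "fixed_interest": -0.06, "commodity": -0.20, "fx": -0.10}

stressed_loss = {k: -exposures[k] * betas[k] * index_shock[k] for k in exposures}
print({k: f"{v:,.0f}" for k, v in stressed_loss.items()})
print("Aggregated trading book stressed loss:", f"{sum(stressed_loss.values()):,.0f}")
```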

4. Market Risk: Regulatory Capital & Supervision

It seems likely that the supervisor will have a close focus upon the calculation of Economic Capital in the Trading Book (TB), and that may be a consequence of some well known deficiencies in the methodology prescribed for the calculation of Regulatory Capital in the TB. The regulatory capital calculation assumes that returns in the trading book are normally distributed and is based on a ten day holding period at a 99% confidence interval, multiplied by a factor of three. All the empirical work on returns from securities traded on stock exchanges worldwide over a very long period suggests that for a ten day holding period one would reject the normal distribution. The prescriptive nature of the P1 Regulatory Capital calculation means that most Financial Institutions use the delta normal or variance-covariance approaches (founded on the normal distribution) to calculate Regulatory Capital in the TB. Empirical evidence suggests that the normal distribution underestimates economic capital in the trading book because trading book returns are generally skewed; the Gaussian normal fails over the ten day holding period because it underestimates the tail, underestimates extreme factor moves and fails to estimate joint extreme moves. Thus the supervisor is likely to focus more on the economic capital approach in the TB, optimally Historic Simulation or EVT.
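The contrast can be illustrated with the following sketch, which compares the prescriptive normal 99% / ten-day VaR (with the multiplier of three) against a historical-simulation quantile computed on synthetic, fat-tailed returns; all figures are illustrative.

```python
# Sketch comparing the prescriptive 'normal' 99% / 10-day VaR (x3 multiplier) with
# a historical-simulation quantile on fat-tailed (Student-t) synthetic returns,
# illustrating why the Gaussian assumption tends to understate tail losses.
import numpy as np

rng = np.random.default_rng(3)
daily_returns = 0.012 * rng.standard_t(df=4, size=2500)   # synthetic, fat-tailed daily returns
position = 100e6

sigma = daily_returns.std()
z99 = 2.326                                  # 99% standard-normal quantile
var_normal_10d = position * z99 * sigma * np.sqrt(10)

# Historical simulation: empirical 1% quantile of overlapping 10-day returns
ten_day = np.convolve(daily_returns, np.ones(10), mode="valid")
var_hist_10d = -position * np.quantile(ten_day, 0.01)

print(f"Normal 99%/10d VaR     : {var_normal_10d:,.0f}  (x3 = {3 * var_normal_10d:,.0f})")
print(f"Hist. sim. 99%/10d VaR : {var_hist_10d:,.0f}")
```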

5. Analytic Modeling of Credit Risk

Factor models are implemented statistically by Bernoulli mixture models (Bernoulli = binary; credit default requires this essentially binary structure, since a credit default event can be understood analogously to a “death event” in the dynamics of database modelling). Factors make the Bernoulli variables dependent: a mixture model places probabilities on otherwise independent variables in order to induce dependence between them. This sort of analytical approach must be a capability of the application architecture if it is to support financial or economic stress testing of credit risk portfolios. A factor model is fundamental whether the calculation of economic returns proceeds analytically (via formulas) or by simulation; in the latter case the Factor Modelling approach provides the underlying analytics for the Monte Carlo simulation methodology. For Credit Risk the external factors drive default probability: in Credit Risk models, Default Probability is driven by observed external factors which are systematic. There are also unobserved systematic factors. Then there are non-systematic, idiosyncratic risks particular to each Credit Risk type (Personal, Corporate, Secured or Unsecured). So the risk Factor Model is implemented by a regression equation in three, or possibly four, variables (the descriptive or driving factors). The formulas for these equations relate the conditional probability of default to the external factors through these three components.
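A minimal Monte Carlo sketch of a one-factor Bernoulli mixture model follows; the homogeneous portfolio parameters are hypothetical, and the point is simply that conditional independence given the factor, plus mixing over the factor, yields a dependent loss distribution from which a capital quantile can be read.

```python
# Sketch of a one-factor Bernoulli mixture model: conditional on the systematic
# factor, obligors default independently; mixing over the factor induces default
# dependence. Monte Carlo gives the portfolio loss distribution and a capital
# quantile. Portfolio parameters are hypothetical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
n_obligors, n_sims = 500, 10_000
pd_uncond, rho, exposure, lgd = 0.02, 0.15, 1.0, 0.45   # hypothetical, homogeneous portfolio

factor = rng.standard_normal(n_sims)                            # systematic factor per scenario
cond_pd = norm.cdf((norm.ppf(pd_uncond) - np.sqrt(rho) * factor[:, None]) / np.sqrt(1 - rho))
defaults = rng.random((n_sims, n_obligors)) < cond_pd           # conditionally independent Bernoullis
losses = defaults.sum(axis=1) * exposure * lgd                  # portfolio loss per scenario

expected_loss = losses.mean()
loss_999 = np.quantile(losses, 0.999)
print(f"Expected loss {expected_loss:.1f}, 99.9% loss {loss_999:.1f}, "
      f"economic capital {loss_999 - expected_loss:.1f}")
```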

6. Recovery Rates, Portfolio Risk and Expert Judgement

Once default probability is well defined and modelled in the regression function, the recovery rate (1 − LGD) comes into focus. LGD also has a systematic part driven by factors and an idiosyncratic part (noise). The same or different macro factors could drive PD and LGD: in mortgages these variables are partially independent, in that although it is a recession which probably causes the initial personal default, the manner in which the value of the collateral performs thereafter is likely to be quite independent of the personal recessionary impact. This is the application of theory, or expert judgement, to the data; the primary initial focus must be the manner in which the factor model fits the observed external factors to the internal data series. General factor choice will be a function of the iterative process of modelling and lagging. In all cases additional latent variables should be low-dimensional, and preferably one-dimensional in the Credit Risk instance. Portfolio Credit Risk (sometimes called K-VaR) is given by the product of PD and LGD. This is currently the leading edge of thinking in the Credit Risk space, and developments in the mathematics and statistical techniques are happening fast. It is worth noting that the relationship between PD and systematic macro factors (particularly the demographic factors of the person) is the general specification of standard credit scoring models.

7. The Inference Logic for Credit Risk

For credit risk hazard inference we can show a three step process whereby we traverse from the factor mathematics to the appropriate statistical functionality, and then apply an algebra which implements, with reference to the available data, the model conditions we require for the Stress Test. The factor mathematics of Credit Risk are given by Professor McNeil’s equation presented earlier:

The next step is then to express that formulation more simply in terms of an algebra which refers to the data sets available to us right now; this would be along the over-simplified lines described below:

Multiple Linear Regression, as we have said, is a simple linear model; one of its limitations is therefore that it cannot be used for binary phenomena (default or no default). With a simple generalisation you can make an MLR model cope with binary data; this gives you a GLM (a Generalised Linear Model). Credit Scoring Models are an instance of GLM, also called logistic regression. One more generalisation gets you a GLMM, which copes with idiosyncratic risk. GLMM stands for Generalised Linear Mixed Model and can also be called GLME (Generalised Linear Mixed Effect model); these are more and more being called for in response to B2. The GLME framework will support a stress test based upon rating transition (an ordered categorical response model can be expressed as a GLME); however, you do need a GLMM to do Stress Testing of Credit Risk, i.e. of the PD driven by factor conditions (scenarios). The arguments of the GLMM function are presented at the top of this column.16

16 http://rss.acs.unt.edu/Rdoc/library/repeated/html/glmm.html Thanks to Rich Herrington, MS, PhD, University of North Texas.
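As an illustration of the GLM step (though not of the GLMM random effects, and not the R routine referenced in the footnote), the sketch below fits a binomial GLM, i.e. a logistic regression, of a synthetic default indicator on two macro drivers and then reads off a PD under a hypothetical stressed scenario.

```python
# Sketch of the GLM step: a logistic regression (binomial GLM) of a binary default
# indicator on macro drivers, the simplest generalisation of MLR to binary data.
# A GLMM would additionally carry a random (latent) effect; data here are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 5000
gdp_growth = rng.normal(0.02, 0.015, n)
unemployment = rng.normal(0.06, 0.01, n)
# synthetic default process: log-odds fall with GDP growth, rise with unemployment
logit = -3.5 - 30 * gdp_growth + 15 * unemployment
default = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(pd.DataFrame({"gdp_growth": gdp_growth, "unemployment": unemployment}))
glm = sm.GLM(default, X, family=sm.families.Binomial()).fit()
print(glm.summary())

# Stressed scenario: PD implied by a recessionary macro out-turn (hypothetical values)
stress = pd.DataFrame({"const": [1.0], "gdp_growth": [-0.02], "unemployment": [0.10]})
print("stressed PD:", float(np.asarray(glm.predict(stress))[0]))
```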

8. High Level Data Sets for the Factor Variables

Throughout this document we have given numerous examples of driving factors for the factor model of the Stress Test, for both Credit and Market risk; in the sections of this paper describing the business requirements of P2 we use more detailed examples of the functional data sets appropriate to each risk type. However, it is appropriate to illustrate briefly here what, at a high level, the example factors we have discussed are. In the simple table below we have therefore summarised all of the data time series to which we have referred throughout this section of the paper. We have classified them as essentially internal or external time series, and we have further clearly classified those data series which are specifically related to Market Risk. This is a simple, illustrative exposition of the nature of the factors which drive the Stress Test model and the factors which require explanation (the independent and dependent variables), and thus of the data sets of time series which a financial institution should be preparing to collect. The SAP Bank Analyzer HDB component (as described earlier) is the optimal historic database platform; particular attention must be paid to the historization of the parameters used in the factor modeling process for P2. The HDB stores data in a time-based way. This means that it is a central store for information related to hazard-stressed data, and is optimized for time-based evaluations. The HDB provides a stable infrastructure which ensures that histories can be created, even for long time series. Since banks’ in-house models, and the related data requirements, vary from one institution to the next, the HDB has an open architecture. You can therefore adjust the database to meet your individual requirements, to ensure that data is retained for all necessary parameters. The HDB offers a large number of enhancements to enable future Basel II requirements to be met.

The table is not a systematic and comprehensive statement of appropriate factors; for that we refer our customers to the ideas in the Dynamic Stochastic General Equilibrium (DSGE) academic sphere or to the services of specialist data vendors. This table is simply a structured summary of the data examples we have presented during our discussion here.
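As a purely hypothetical illustration of the kind of classification and time-based storage just described (this is not the Bank Analyzer HDB API or data model), the sketch below tags a handful of the example series from this section as internal or external, flags the market-risk series, and lays out a quarterly, time-indexed store for them.

```python
# Hypothetical sketch of a time-indexed factor store; names and classification
# reflect the examples discussed in this section, not an SAP data model.
import pandas as pd

catalogue = pd.DataFrame(
    [("internal_default_rate", "internal", False),
     ("internal_recovery_rate", "internal", False),
     ("unemployment_rate", "external", False),
     ("gdp_growth", "external", False),
     ("usd_eur_fx_rate", "external", True),
     ("swap_curve_5y", "external", True)],
    columns=["series", "source", "market_risk"])

quarters = pd.period_range("2001Q1", "2005Q4", freq="Q")
store = pd.DataFrame(index=quarters, columns=catalogue["series"], dtype="float64")

print(catalogue)
print(store.shape)   # (20 quarters, 6 historized series)
```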


THE SAP CAPITAL ADEQUACY SOLUTION Functional Overview

The SAP Capital Adequacy Solution delivers the following components:

- Reporting Layer providing web and MS Excel reports plus graphical tools for Stress Test analysis
- Flexible Results Data Layer
- Functions to execute the Stress Test for Credit & Market Risk, Interest Rate Risk, Liquidity Risk and other risk types
- Support for 3rd party / internal models, e.g. for Credit VaR calculation – these are supported by Service Oriented Architecture
- Preconfigured and flexible Basel II compliant data model for use with granular or aggregated data as required
- Time series Historic Data Base providing preloaded sample macro economic data and market histories as standard

The package specifically includes the following elements which are required for Pillar 2 processing:

- User friendly user interface for use during initial modelling, e.g. graphics to show the best fit between external and internal time series data
- Predefined executive reports for Stress Test scenarios
- Graphical display of Stress Test results
- Statistical modelling toolbox for the development of Factor Models specific to a bank's own particular requirements
- Extensible data models for base data (contract / exposure), results data and time series, to be configured to meet the requirements of a bank
- An upload tool which can easily capture Bank internal data series from Microsoft Excel or other proprietary (trading) or functional business systems

Risk Types and the User Interface

With SAP Capital Adequacy Solution, all of the risk types handled via Factor models are processed in the way described under the section on process architecture, i.e.

First, collate the base data (both static and time series) required for the various risk types to be modelled, storing the data in the appropriate data stores – Historical Data Base (HDB) for time series including macro and other external data, Source Data Layer (SDL) for static data. Existing Bank Analyzer customers will already hold much of what is required, while the SAP Capital Adequacy Solution provides additional functionality for macro-economic and other external data.

The user is simply required to select a methodology for each risk type and then run an iterative series of regressions to fit the external time series to internal Bank time series data. If the internal history data available is of poor quality, this should be taken into account when selecting a methodology appropriate to the risk type; for example, missing value interpolation may be needed for credit risk if the internal data is patchy and uneven.
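A minimal sketch of this fitting step is given below, with invented data: a patchy internal default-rate series is repaired by missing value interpolation and then regressed (OLS) on two external series. In practice the choice of factors, lags and regression technique would be iterated per risk type, as described above.

```python
# Minimal sketch, invented data: interpolate a patchy internal series, then fit it
# to external (macro) time series with OLS.
import numpy as np
import pandas as pd
import statsmodels.api as sm

quarters = pd.period_range("1996Q1", "2005Q4", freq="Q")
rng = np.random.default_rng(1)
external = pd.DataFrame({
    "unemployment": 5.0 + 0.1 * rng.normal(0, 0.5, len(quarters)).cumsum(),
    "short_rate":   4.0 + 0.1 * rng.normal(0, 0.3, len(quarters)).cumsum(),
}, index=quarters)

# Patchy internal default-rate history (~20% of quarters missing)
internal = 0.01 + 0.004 * external["unemployment"] + rng.normal(0, 0.002, len(quarters))
internal[rng.random(len(quarters)) < 0.2] = np.nan
internal = internal.interpolate().bfill()       # missing value interpolation

fit = sm.OLS(internal, sm.add_constant(external)).fit()
print(fit.params)                               # fitted link, internal vs external
```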


The P2 modeling cycle with the SAP CAS

The process envisaged by SAP to set the baseline and support the commencement of Stress Testing is described below. Here we must emphasise that we have designed a Solution Architecture to support the Supervisory Review Process, a key part of which is the Bank's own internal governance of, and thus responsibility for, the conduct of the modelling itself. We cannot do the Factor modelling ourselves: we can provide the framework solution and advise on its deployment, but under the supervisory constraints we cannot conduct the modelling on behalf of our customer.

By running regressions for each risk type handled via Factor equations, you build up a set of equations which together constitute the Bank’s Reaction Function.

Once all necessary equations are defined, run the Baseline Scenario, which is similar to a stress test but which lacks the stressed, counterfactual, extreme assumptions associated with stress testing proper. In the process of running the Baseline Scenario, data will be aggregated across the whole bank so that a holistic view of Economic Capital is calculated and can then be allocated.

With the Baseline Scenario established as a benchmark, the stressed scenarios making up the Stress Test proper can then be executed.
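Schematically, and with invented coefficients, the cycle just described can be pictured as below: a "reaction function" of fitted equations maps macro factors to risk parameters, the Baseline Scenario evaluates it at current factor values, and the stressed scenarios evaluate it at counterfactual ones.

```python
# Schematic sketch with invented coefficients; the real reaction function would be
# the set of equations fitted by the bank itself, as emphasised above.
import numpy as np

def reaction_function(unemployment, gdp_growth):
    """Invented fitted equations linking macro factors to PD and LGD."""
    pd_ = 1.0 / (1.0 + np.exp(-(-3.0 + 0.4 * unemployment - 0.3 * gdp_growth)))
    lgd = min(1.0, 0.35 + 0.02 * max(unemployment - 5.0, 0.0))
    return pd_, lgd

scenarios = {
    "baseline":         {"unemployment": 5.0,  "gdp_growth": 2.0},
    "mild recession":   {"unemployment": 8.0,  "gdp_growth": 0.0},
    "severe recession": {"unemployment": 12.0, "gdp_growth": -3.0},
}
for name, factors in scenarios.items():
    pd_, lgd = reaction_function(**factors)
    print(f"{name:16s} PD = {pd_:.2%}  LGD = {lgd:.0%}")
```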

The SAP Capital Adequacy Solution supports both in-house and 3rd party modelling functionality, whether these are external modelling applications or 3rd party libraries of modelling functions housed within SAP Bank Analyzer and called by the SAP CAS. The Bank or other financial institution will be required to define the econometric models and data sets appropriate to its own needs and circumstances, using the tools and methods provided by the SAP Capital Adequacy Solution.


SAP Bank Analyzer and P2

1. Overview of SAP Bank Analyzer

SAP Bank Analyzer is the strategic SAP solution to the requirements of a bank for improved analysis and reporting in both the Risk and Finance functions. It is currently being implemented in 19 countries at 44 different customers, 28 of which are already live. Architecturally, it consists of:

an underlying data mart for granular data at contract or portfolio level, giving contract balances, counterparty information and so on (the Source Data Layer or SDL);

an increasing range of analytical tools for both risk and finance data (Basel II Analyzer for Pillar 1 / Pillar 3 credit risk, Accounting for Financial Instruments (AFI) for finance etc.);

a results layer for aggregated data such as Pillar 1 regulatory capital (the Results Data Layer or RDL); and

a parallel structure of underlying database, analytics and results database for time-series data (the Historical Database or HDB).

OLAP reporting for SAP Bank Analyzer is provided by SAP Business Warehouse (BW). Integration between Bank Analyzer and BW is provided as standard. BW provides graphical reporting capabilities as well as conventional user interfaces (MS Excel, web-based reporting). For finance, there is also an accounting sub-ledger which provides standard integration with the mySAP ERP General Ledger. Integration with upstream source systems is provided by standard data load methods and error processing functionality. An ETL is provided as an integral part of the solution from Bank Analyzer release 5.0 onwards (due for release H1 2006). However, Bank Analyzer can also be used with a variety of different ETL tools such as IBM WebSphere, and Ascential DataStage.

SAP Bank Analyzer is constructed on the SAP NetWeaver platform and so benefits from the full Service Oriented Architecture capabilities which this provides. SAP Bank Analyzer is delivered with a fully implemented object-based data model with templates covering a wide range of business objects including:

- Banking contracts (loans, current accounts, leases, …)
- Trading book contracts (swaps, options, etc.)
- Exposures, balances and bank asset allocation structure
- Counterparties
- Collateral objects
- Market data

The object-oriented nature of these objects makes them easily extensible to cater for new requirements.

An overview of SAP Bank Analyzer is given in the graphic above.


2. Matching SAP Bank Analyzer to Pillar 2

A brief comparison of the Pillar 2 requirements put forward above with SAP Bank Analyzer shows that, at a high level, the requirements for a platform on which to base a Pillar 2 solution are met almost completely by Bank Analyzer.

The main enhancements to the existing Pillar 1 / Pillar 3 solution needed to meet the requirements of Pillar 2 are:

- Addition of extra attributes to existing base data business objects (contract etc.) where necessary
- Parameterisation of existing Pillar 1 / Pillar 3 calculation functions where necessary
- Addition of extra object templates for macro economic data
- Development of the Factor Model and related analytics within the context of the HDB
- Enhancements to the HDB results database to store Pillar 2 outputs
- Reporting enhancements to BW

These enhancements are illustrated in the graphic above.


SAP Bank Analyzer & Credit Risk

SAP Bank Analyzer already provides a complete framework for calculation of the credit risk component of Regulatory Capital (RC). Specifically, it provides:

A source data layer (the SDL) for all categories of base data required as input to credit risk calculations (templates for all types of contract data, balance and cash flow data, business partner data, collateral and so on);

Regulatory assumptions concerning PD, EAD, Credit Conversion Factor (CCF), haircuts, risk weight and so on;

Pre-programmed functions to apply the regulatory assumptions in order to calculate RWA and regulatory capital; and

A results data layer (the RDL) to which both intermediate calculations and final results can be written.

Most of these elements can be reused as part of a solution architecture for providing the credit risk component of Economic Capital (EC). Clearly, this applies to both the source and results data layers (SDL and RDL respectively). As far as the calculation of EC for credit risk is concerned, it is also possible to reuse the pre-programmed functions used for RWA and regulatory capital calculations, since these are parameterised for Pillar 2 to allow the use of assumptions specific to each individual bank in place of the assumptions and factors prescribed in the Second Basel Accord for RC calculation.

To illustrate what is meant by parameterising the RC calculation functions for use in EC calculation, take for instance the derivation of PD for retail customers under IRB for Pillar 1. Under this approach, data for retail customers is aggregated by several criteria e.g. maturity, and borrower characteristics such as (for mortgages) whether owner-occupier, type of property involved and so on. Based on this segmentation of the data, a pre-determined PD is assigned to each retail segment. This approach is sufficient for Regulatory Capital calculation. However, to flex these calculations so that the functions which support them could be used for Economic Capital, it is necessary to provide a realistic measure of PD (and EAD, risk weight .....) appropriate to a particular bank’s default history and circumstances. Furthermore, to support the demands of the Stress Test for credit risk, it is necessary to predict the change in PD which would occur in response to extreme macro-economic shocks such as a large rise in unemployment or fall in the value of the US Dollar. Using the Factor models supported by SAP Capital Adequacy Solution, a Bank is now able to model the impact of macro economic changes on PD and other factors relevant to credit risk calculation. Specifically, the Bank can:

Predict the expected value of a specific data item (in this case, PD for a given customer segment) given the expected value of a set of macro economic variables (unemployment, Dollar / Euro exchange rate) based on historic data; and

Calculate the likely outturn of that data item (e.g. PD) in future given changed macro economic circumstances.

The stressed data can then be fed through the existing Credit Risk analyzer and stored in the Results Data Layer alongside data of all other risk types ready for further aggregation and reporting.
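As a hedged illustration of what "feeding the stressed data through the existing Credit Risk analyzer" amounts to numerically, the sketch below pushes a baseline and a stressed PD through the Basel II retail-mortgage IRB risk-weight function (asset correlation 0.15, no maturity adjustment). The PDs, LGD and EAD are invented; this is the public IRB formula, not SAP code.

```python
# Illustration only: Basel II IRB capital for a residential mortgage pool under a
# baseline and a stressed PD. Inputs are invented; this is not SAP code.
from scipy.stats import norm

def irb_mortgage_capital(pd_, lgd, ead, r=0.15, q=0.999):
    """Return (K, RWA): IRB capital requirement and risk-weighted assets."""
    cond_pd = norm.cdf((norm.ppf(pd_) + (r ** 0.5) * norm.ppf(q)) / (1.0 - r) ** 0.5)
    k = lgd * cond_pd - lgd * pd_        # unexpected-loss capital requirement
    return k, k * 12.5 * ead

for label, pd_ in [("baseline PD 1.0%", 0.010), ("stressed PD 3.5%", 0.035)]:
    k, rwa = irb_mortgage_capital(pd_, lgd=0.25, ead=100_000_000)
    print(f"{label}: K = {k:.2%} of EAD, RWA = {rwa:,.0f}")
```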


The SAP Capital Adequacy Solution for current SAP Bank Analyzer clients

For existing SAP Bank Analyzer users, implementing Pillar 2 with SAP Capital Adequacy Solution offers the great advantage that their existing platform can be leveraged to provide a Pillar 2 solution at low cost and risk. The synergies will allow the user to:

- reuse the existing data (e.g. exposures) which is already contained in the Source Data Layer (SDL), with only incremental extensions to the extensible data model where necessary;
- eliminate the need to reconcile underlying Pillar 2 exposures with the corresponding data for Pillar 1;
- minimise the need for staff training on a new environment;
- leverage the Basel II knowledge of the bank's SAP staff to implement Pillar 2; and
- extract added value from the existing SAP platform instead of incurring the costs associated with a separate solution.

The SAP Capital Adequacy Solution for new SAP Bank Analyzer clients

Although clients who already possess an SAP Bank Analyzer installation will benefit from the synergies just outlined, the SAP Capital Adequacy Solution is an appropriate fit to the needs of clients who are new to SAP Bank Analyzer. These clients require a solution which can be implemented as a thin ‘supervisory layer’ above their existing systems, including non SAP solutions for Basel II Pillars 1 and 3. The key features of the solution required by these clients include:

scalable architecture which can be implemented on a small scale and at low cost initially, for expansion later if required, with a very wide range of databases and operating systems to match the existing systems landscape;

predefined data models with templates which can be populated simply with highly aggregated data for ease of implementation;

the possibility to reuse existing data, e.g. Basel II Pillar 1 data such as LGD and PD, within their Pillar 2 solution;

flexibility around modelling tools, so that existing non SAP models and modelling applications can be integrated cheaply and quickly into the systems framework for Pillar 2;

flexibility around existing risk environments, so that data from existing applications (e.g. for Credit Risk) can be integrated into the overall Pillar 2 solution if desired;

time series data with sample macro and market data already in place to serve as a guideline;

sophisticated reporting tools (web, graphics, MS Excel) to meet the needs of senior executives and supervisors as well as analysts and other operational staff;

industrial strength, proven security concept to guarantee the security and privacy of highly sensitive data;

above all, the flexibility and practicality of approach which guarantees compliance with Pillar 2 in the short term, without sacrificing the long term objective of building a more sophisticated solution going forward.



[Architecture graphic: SAP Bank Analyzer & The Capital Adequacy Solution. Callouts: built on the SAP Bank Analyzer platform; Reporting Layer provides web reports plus graphical tools for Stress Test analytics; flexible Results Data Layer; functions to apply Stress Test hazard factors for Credit & Market Risk, IRR & Liquidity Risk; Factor Models support the implementation of the Stress Test; 3rd party models supported by Service Oriented Architecture; preconfigured and flexible Basel II data model; time series Historic Database provides support for macro economic data and market histories as standard. Component labels: Financial Product Valuation; Reporting and Analytics; Modeling; Default Factors; Factor Models; HDB time series; 3rd party models; COREP, RWA, ICAAP; Results Data Layer; Source Data Layer; Credit Risk; Stress Test; Off Balance Sheet; Market Risk. Caption: "The SAP Capital Adequacy Solution is a new package to help banks 'over the line' on Pillar 2".]

The Benefits of implementing the SAP CAS

The SAP Capital Adequacy Solution offers all these possibilities to clients who are new to SAP. But with SAP, the benefits go well beyond the immediate system features, to encompass:

the expert support from SAP Consulting which guarantees that the solution will be implemented to the highest professional standards;

professional advice from SAP and its extensive range of partners on business issues; and

a full range of support and education services to ensure that your staff receive full training and that your system is maintained properly, e.g. on a 24 by 7 basis;

Above all, with SAP, you have the peace of mind of knowing that your Capital Adequacy Solution is underwritten by the world's leading provider of business software. With SAP CAS the future of Supervision is available now! The integration of the SAP Capital Adequacy Solution and SAP Bank Analyzer is presented in the high level architecture graphic above.


THE CALCULUS OF STRUCTURED FINANCE

INTRODUCTION TO STRUCTURED FINANCE
OFF BALANCE SHEET SECURITIZATION
THE CALCULUS OF SECURITIZATION
THE REAL-WORLD COMPLEXITY OF STRUCTURED FINANCE
CONSTRAINTS IN THE METHODOLOGY PHASE OF INSTRUMENT STRUCTURE
SECURITIZATION METHODOLOGY AND TRANSACTION STRUCTURE
CREDIT ENHANCEMENT (CE) & SUBORDINATION
THE WATERFALL CASH-FLOW
LESSONS FROM THE CAPITAL TREATMENT OF STRUCTURED INSTRUMENTS
THE REQUIREMENT TO MODEL UNCERTAINTY
STRUCTURED INSTRUMENTS AND THE STRESS TEST


INTRODUCTION TO STRUCTURED FINANCE

Securitisation is the term used to describe the process of issuing securities backed by the cash flows from a pool of underlying assets. These assets can be of various types: residential mortgages (RMBS), commercial mortgages (CMBS), leases, credit card receivables and other receivable cash flow sources such as car loans; the latter set are usually described under the generic heading of Asset Backed Securities (ABS). Structured finance instruments can be defined in terms of three distinct characteristics: (1) pooling of assets (either cash-based or synthetically created); (2) de-linking of the credit risk of the collateral asset pool from the credit risk of the originator, usually through the transfer of the underlying assets to a finite-life, standalone Special Purpose Vehicle (SPV), otherwise known as a Special Purpose Entity (SPE); and (3) tranching of the liabilities that are backed by the asset pool. The tranching of the "structure" of payouts by the SPE is the characteristic that sets Structured Finance apart from other types of financial instrument manufacturing in financial institutions today. It should also be noted that Structured Finance is now a key source of funding for Financial Institutions in the current climate of large personal sector indebtedness in developed countries, and is also the key channel of disintermediation in financial markets.

A key aspect of the tranching process is the ability to create one or more classes of securities whose rating is higher than the average rating of the underlying collateral asset pool, or to generate rated securities from a pool of unrated assets. This is accomplished through the use of credit support (usually known as Credit Enhancement (CE)) structured into the issued instruments to create securities with a risk-return profile enhanced over that of the underlying asset pool. Typically, securitizations are used for pools of homogeneous assets that have relatively predictable cash flows. Common asset classes are residential mortgages, car finance loans and leases, credit card receivables, loans and bonds. The ability of the asset pool to meet the obligations to the note holders is assessed largely on projected cash flows of the assets under various scenarios. These scenarios are illustrated in the offer documents using key assumptions for prepayments and credit losses, which occur under projected expectations of defaults and recovery rates. In this way the structured finance instrument is a microcosmic model of the stress test required holistically for the whole Financial Institution.

Off Balance Sheet Securitization

In an Off-Balance Sheet Securitization transaction, the arranger of the securitization (who may be the holder of the assets serving as collateral, or may be an investment bank (a sponsor bank) acting on behalf of one or many institutions who hold the collateral and act in syndicate) creates a Special Purpose Vehicle (SPV). This is a separate legal entity that exists only for the purpose of the securitization and whose operations are limited to the acquisition and servicing of specific assets. The SPV purchases the assets through what is referred to as a "true sale" or an assignment of the assets. An SPV is designed to be "bankruptcy remote" from the asset seller and from the arranger of the securitization. Thus the failure of the asset seller does not affect the SPV that has purchased the assets, and the seller's creditors will not have access to the SPV's assets.


The Calculus of Securitization

It is in securitization, in the calculation of regulatory capital appropriate to securitized portions of the Bank's balance sheet, that we first glimpse approaches from Factor Modeling being deployed in the Basel Accords; this is a Pillar One (P1) deployment. It is necessary to review this deployment of Factor Modeling in the Pillar One treatment of securitized exposures so that we can learn lessons from it for the Economic Capital (E.Cap) calculation appropriate to securitized exposures, and further for what might be the optimal approach to stress testing, and thus to scenario-based calculations of economic capital, for the whole Bank portfolio. In the P1 Regulatory Capital calculation for securitized instruments, Basel II does not recognize internal ratings determined by banks for securitization exposures. IRB banks may only use public ratings by rating agencies for rated tranches (the Ratings-Based Approach (RBA)), or where a rating is implied by the next senior tranche, or a supervisory formula (the Supervisory Formula Approach (SFA)) for unrated tranches.

Recognition of internal assessments (the Internal Assessment Approach, IAA) does occur in the very specific case of liquidity lines or credit enhancements extended by sponsor banks to ABCP conduits. This is subject to stringent conditions to be met by the Bank, and is an empowerment which may be removed by the supervisor. It is an open issue whether these 'stringent conditions' upon the use of an IAA may be relaxed as we go forward with B2 (towards B2.5 or B3?). If an exposure is unrated and the RBA, IAA and SFA are unavailable, the Bank may use a further, exceptional "look through" approach, with regulatory consent, on a temporary basis for liquidity facilities. Otherwise, the bank must deduct the unrated exposure from capital. Right now, the lowest risk weights will be achieved for senior tranches backed by granular pools (residential mortgages or retail assets), i.e. RBA tranches.
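Read as a decision rule, the hierarchy just described might be sketched as follows; this is our reading of the paragraph above rather than a restatement of the Accord text, and the conditions are simplified.

```python
# Simplified sketch of the P1 approach hierarchy for a securitisation exposure
# held by an IRB bank, as read from the paragraph above (not the Accord text).
def securitisation_approach(rated_or_inferred: bool,
                            iaa_approved_abcp: bool,
                            sfa_inputs_available: bool,
                            look_through_consented: bool) -> str:
    if rated_or_inferred:
        return "RBA (ratings-based approach)"
    if iaa_approved_abcp:
        return "IAA (internal assessment approach, ABCP programmes only)"
    if sfa_inputs_available:
        return "SFA (supervisory formula approach)"
    if look_through_consented:
        return "Temporary look-through treatment (liquidity facilities)"
    return "Deduction from capital"

print(securitisation_approach(False, False, True, False))   # -> SFA
```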


The relationship between Regulatory Capital and E.Cap in Securitization

Regulatory Capital requirements for securitization exposures continue to evolve, with the ultimate goal of truly risk-based capital calculations. The objective of Basel II, in the particular aspect of the SFA and uniquely with regard to securitization, is to bring us closer to risk-adjusted capital than we have ever been. The basic premise is that the SFA calculates the amount of credit enhancement that the regulators believe is sufficient to cover the underlying credit risk of a given pool of assets. This level is then compared with the credit enhancement in the transaction. Transactions that do not have a comfortable margin of excess credit enhancement over the minimum required will be subject to a significant capital requirement. Thus the SFA approach is a model of the economic capital in the structured instrument. At this moment in time the RBA is the "senior" methodology for the calculation of capital for a securitized instrument, but it is the SFA which calculates the 'internal rating' of the securitized instrument and is closest to the 'spirit' of the Basel Accords; it is from the SFA that we can generalize the most important lessons in our definition of the optimal approach to P2 Economic Capital calculation.

The Real-World complexity of Structured Finance

Pooling and tranching, while being key sources of value in structured finance, are also the main factors behind what might be called the “complexity” of these instruments. As far as pooling is concerned, evaluation of risk and return of a structured finance security necessitates modeling the loss distribution of the underlying asset pool, which may be complicated when the pool consists of a small number of heterogeneous assets. However, as tranching adds an extra layer of analytical complexity, the evaluation of a structured finance instrument (in other words, a tranche) cannot be confined to analyzing asset pool loss. It is also necessary to model the distribution of cash flows from the asset pool to the tranches; that is, to evaluate the specific structural features of the instrument. These features, defined via covenants, may entail sets of rules for the allocation of principal and interest payments received from the collateral pool and for the redirection of these cash flows in the case of stress situations, in addition to specifying the rights and duties of various third parties involved in the transaction.


Constraints in the Methodology Phase of Instrument Structure

The limitations of the Discounted Cash Flow (DCF) approach to valuation are nowadays well recognized, and there is a large and growing literature on optimal methods of modeling uncertainty. DCF cannot encompass all of the important factors driving collateral performance; lacking a market value factor, the DCF model cannot account for the volatility in valuation expectations driven by interest rate or term structure shifts, for example. This means that the cash flow model of the receivables for a securitized instrument, although a fundamental mechanic of the valuation appraisal of the instrument, is a necessary but not, in isolation, sufficient condition of the quantification of the economic capital requirement of the instrument, in particular in stressed conditions. To progress from necessity to sufficiency in the Economic Capital calculation process for a securitized instrument, the next step in the methodological evolution must be some kind of Economic Capital Valuation Methodology (ECVM), which in principle is equivalent to Risk Adjusted Rate of Return on Capital (RARORC) modeling; the key here is "risk adjustment" and "economic valuation", whereby all assets and liabilities are market valued (cash equivalent). An ECVM considers the estimated magnitude and timing of future cash flows, where economic income is the change in economic value over a period of time. Put very simply, in the language of the DCF approach we are considering a forward-looking NPV income formulation (with risk adjustment). This methodological enhancement (from necessity to sufficiency) entails a recommendation to deploy Factor Modeling for the analytics of E.Cap in securitized instruments in the first instance, and thus, by extrapolation from the SPV microcosm, for the holistic appraisal of E.Cap in the whole bank. The factor modeling approach is entirely consistent with the Sorge (BIS) methodology for Stress Testing generally and with the specifics of the Asymptotic Single Risk Factor (ASRF) methodology of Professor Michael Gordy, which is the intellectual insight underlying the SFA.
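A toy numerical reading of the "forward-looking NPV with risk adjustment" idea is sketched below: survival-weighted expected cash flows (with a recovery on default) are discounted instead of the contractual cash flows a plain DCF would use. All figures are invented and the default mechanics are deliberately crude.

```python
# Toy sketch, invented figures: plain DCF versus a risk-adjusted, forward-looking
# NPV in which each period's cash flow is weighted by survival and default outcomes.
def risk_adjusted_npv(cash_flows, annual_pd, lgd, discount_rate):
    npv, survival = 0.0, 1.0
    for t, cf in enumerate(cash_flows, start=1):
        expected_cf = survival * ((1 - annual_pd) * cf + annual_pd * cf * (1 - lgd))
        npv += expected_cf / (1 + discount_rate) ** t
        survival *= (1 - annual_pd)
    return npv

contractual = [100.0] * 5          # five equal annual receivable cash flows
print("Plain DCF:         ", round(risk_adjusted_npv(contractual, 0.00, 0.00, 0.05), 2))
print("Risk-adjusted NPV: ", round(risk_adjusted_npv(contractual, 0.03, 0.45, 0.05), 2))
```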

Securitization Methodology and Transaction Structure

The securitization process is nearly identical for all types of transaction, regardless of the asset class (mortgage loans, leasing claims, etc.) and the configuration of the transaction (true sale or synthetic). The initial design of a securitization transaction is a two-step workflow in a feedback loop: Transaction Pool selection and Transaction Structure definition; the two phases have been defined as the "methodology phase" and the "structure phase". The two steps loop iteratively until the optimum transaction structure, predicated upon the optimum collateral pool, is defined. The cog which links these two wheels is the cash flow (DCF) model of the transaction; in this sense the basic accounting DCF is the necessary condition of the structure of such an instrument. There are a surprising number of factors to be considered in transaction structuring, and unless these factors are precisely modeled the resulting capital efficiency of any particular asset pool can be sub-optimal, causing the economics to be wrong and potentially costing the originator or arranger a great deal of money. The step of selecting the underlying exposures to be securitized and the 'design of a portfolio asset pool' is currently the most time-consuming segment, and consequently a bottleneck, in all portfolio management efforts in securitization.


Optimizing the Securitized Instrument Structure

Oversimplifying the process: one starts with the details of the universe of underlying exposures which may potentially be securitized and knowledge of the rating criteria for the particular transaction structure envisaged. A capital structure is then prototyped that is saleable; market intelligence is essential to this process, assisting the pre-sales activity which arranges the investor syndicate at the transaction structure stage. The Over-Collateralization (O/C) and Interest Coverage (I/C) levels are then determined. O/C and I/C mechanisms divert cash flows from the underlying pools which would otherwise have supported subordinate tranches to the more senior tranches. The next challenge is whether or not the envisaged instrument structure will meet the rating agency conditions for the envisaged rating of the instrument. If not, then the feedback loop returns to the capital structure.
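The O/C and I/C mechanics can be pictured with a trivially simple sketch (figures invented): each test compares what the collateral provides with what the notes being protected require, and a breach diverts cash from the subordinate tranches to the senior notes.

```python
# Trivial sketch of over-collateralisation (O/C) and interest coverage (I/C) tests
# for a senior tranche; all figures are invented for illustration.
def oc_ratio(collateral_par, senior_notes_par):
    return collateral_par / senior_notes_par

def ic_ratio(interest_collections, senior_interest_due):
    return interest_collections / senior_interest_due

oc = oc_ratio(collateral_par=120_000_000, senior_notes_par=100_000_000)
ic = ic_ratio(interest_collections=7_200_000, senior_interest_due=5_000_000)
print(f"O/C = {oc:.2f}x, I/C = {ic:.2f}x")
# If either ratio falls below its trigger level, cash flow that would otherwise
# support the subordinate tranches is diverted to the senior notes.
```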

Thus, the development of instrument structure can be an art as well as a science; there are many "moving parts" to adjust. More than one capital structure is possible during the methodology phase, the optimum structure being defined by an optimization across three variables: the underlying pool universe, market demand and rating agency criteria.

It is the cash flow model which is (or has been) the framework within which the optimization of these variables has been tested. From a Risk Management perspective, the most important factor affecting the performance of a securitized instrument is the total loss exposure of the collateral portfolio over the life of the instrument due to correlated defaults among the collateral. Each tranche can withstand a characteristic level of loss on the collateral pool before it fails to receive its promised interest and principal payments. Performance is also greatly affected by the timing of the defaults, particularly for the equity tranche (the lowest in the seniority hierarchy and, right now, the focus of much attention as a consequence of B2). Other risk factors are interest rates, prepayment rates of collateral, and recovery rates on defaulted collateral. There is also price risk, i.e. the change in value of the collateral due to market perceptions of changes in the underlying credit quality; note here that those market perceptions are predicated upon secondary market participants' deployment of factor modeling techniques (whether factor, structural or reduced form) to appraise the relationship between the type of collateral supporting the instrument and the likely development of macroeconomic factors.


Credit Enhancement (CE) & Subordination

Credit Enhancement is the structural facet of securitized instruments which facilitates higher quality debt (e.g. AAA) being issued on the cash flow basis of lower rated underlying collateral. O/C is the aspect of CE which is the key aspect of the E.Cap of a securitized instrument: O/C is the extent to which the par value of the collateral backing the instrument is greater than that of the rated liabilities, as a whole or for a tranche. Subordination refers to the structural priority of payments. Senior notes on the liability side of the transaction have a prior claim on the cash flows of the instrument, and the subordinated notes are therefore entitled to the residual proceeds generated by the collateral pools after the payment of expenses and debt servicing on the more senior tranches. Thus the more subordinated tranches are known as equity tranches, since the holder has close to an equity stake in the deal, with the most subordinate tranche being called the First Loss Provision (FLP). The FLP has until recently normally been retained by the originating house and in that regard can be integrated into a robust and comprehensive ECVM model of E.Cap for a securitized instrument. Now, markets are growing up in the trading of equity tranches, where hedge funds are the key suppliers of liquidity. As J P Morgan point out:

“Under Basel 1, banks were able to reduce regulatory capital held for various assets through securitization…..The capital reduction was achieved without significantly altering the bank’s true risk position, since most originating banks retained the most subordinate risks (excess spread and the reserve fund). …. Originators will seek a rating for residual risks when possible. Banks will seek to structure rated tranches for risks that were historically unrated and retained. The goal is quite clear: minimize the size of any unrated exposures. Even small incremental rating improvements will dramatically improve capital requirement at the bottom of the structure.”18


The Waterfall Cash-Flow

The priority of payments (also known as the ‘waterfall’) is the sequence in which payments must be made to the holders of the various levels of priority or subordination of note classes and to other parties to the transaction. The payments are usually separated into collections from interest and collections from principal.

In the waterfall, the SPV administration fees are paid first, followed by the interest due to the senior tranche. The tranches are paid in the seniority order of priority.

Returns and risk increase as tranches move down the seniority hierarchy. Instruments with considerable "excess spread" (the amount of cash produced by the collateral pools over and above payments to the senior tranches) have equity tranches which may provide considerable price upside (and, if retained by the originator, are of considerable assistance in reducing E.Cap exposure). As defaults occur in the collateral, the amount of excess spread decreases and the equity tranche soaks up the losses first.
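A much-simplified sketch of this loss absorption is given below: pool losses are allocated from the most subordinate tranche upwards until exhausted. Tranche sizes and the loss figure are invented, and real waterfalls also allocate interest and principal collections, triggers and fees, which are omitted here.

```python
# Much-simplified sketch: allocate an invented pool loss from the most subordinate
# tranche upwards. Real waterfalls also handle interest/principal, triggers, fees.
def allocate_losses(tranches, pool_loss):
    """tranches: (name, size) pairs ordered from most subordinate to most senior."""
    remaining, allocation = pool_loss, []
    for name, size in tranches:
        loss = min(size, remaining)
        allocation.append((name, loss, loss / size))
        remaining -= loss
    return allocation

structure = [("equity / FLP", 3_000_000), ("mezzanine", 7_000_000), ("senior AAA", 90_000_000)]
for name, loss, fraction in allocate_losses(structure, pool_loss=6_000_000):
    print(f"{name:13s} loss = {loss:>10,.0f} ({fraction:.0%} of tranche)")
```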


Lessons from the Capital Treatment of Structured Instruments

The first key lesson is that the Ratings Agencies did not design their rating processes to support the calculation of a single regulatory capital number in a single year for a given securitization transaction. Their robust and rigorous rating evaluation processes have developed over many years on the back of massive databases of default data and a continuous dialogue with the academic community, leading the rating agencies to deploy up-to-date academic techniques in their rating processes. Essentially the rating agencies are appraising the 'economic capital' in a securitized transaction for sufficiency and, on the basis of that economic capital (which is proportionate to the analyzed and computed risk in the transaction), applying a rating to the tranche and the transaction. This of course is the obverse of the logic deployed by the originating financial institution when optimizing E.Cap (considered here as O/C + I/C + CE + FLP) to achieve a target rating. The relationship between the originator and the rating agency is a dialogue. So, crucially, the ratings agencies provide a rating for the tranches of each structured instrument, and for the instrument as a whole, based upon their appraisal of the economic risk profile of that instrument. Right now, for P1 regulatory capital calculation for issued securitizations, as we have said above, IRB banks can only use the RBA for the calculation of regulatory capital for securitized instruments' rated tranches. The RBA should be deployed where a tranche rating is available or can be inferred from an existing rating. For unrated tranches IRB banks use the SFA to calculate regulatory capital. The SFA thus calculates a genuine internal rating for the tranche, whereas the rated tranches have regulatory capital appraised on an external rating basis. If we believe that the spirit of the new supervisory and regulatory framework is a progression towards real internal economic appraisal of capital, then it is in the SFA that we will discover the paradigm for the calculation of E.Cap and a methodology for the Stress Test.

Uncertainty in Loss Prioritization

The Supervisory Formula Approach (SFA) is a deployment of the "uncertainty in loss prioritization" (ULP) model of Gordy and Jones (2003). The Gordy model is an econometric factor model, initiated by the requirements of the second Basel Accord. It is a relatively complex econometric model for those not familiar with this type of approach, and it has recently been simplified into the "Simplified Supervisory Formula" (SSF). The most recent paper on this model presents, in Professor Gordy's view, the most

“….flexible mathematical description of the model (i.e., the continuous Dirichlet process in place of the finite-dimensional Dirichlet distribution).”

Gordy describes his model as;

“A general methodology for simplifying computation of capital charges” 17

So, for securitizations, regulatory capital and economic capital have been conflated by the methodological approach adopted by the Basel Committee for unrated tranches, where an internal rating is necessitated; there really is no distinction between regulatory and economic capital if the SFA is deployed. There is no issue of "additivity" here. Therefore, in this White Paper focused upon the P2 Stress Test and the Economic Capital calculation, we should take as a framework the Gordy and Jones model for calculating the economic capital of a securitization. We need to elucidate the underlying mathematical logic of the SFA prior to presenting an approach to the calculation of the stressed economic capital of issued securitizations.

17 Model Foundations for the Supervisory Formula Approach, Michael B. Gordy, Board of Governors of the Federal Reserve System, July 2004


The Asymptotic Single Risk Factor Framework (ASRF)

The Gordy approach is an evolutionary step forward in valuation methodology. Gordy argues the necessity of a ULP (Uncertainty in Loss Prioritization) approach, which reflects the uncertainty of accounting information about a securitized tranche, and the necessity of an econometric approach to the calculation of the economic capital required to support the instrument. The Gordy model is implemented within an Asymptotic Single Risk Factor Framework (ASRF); the term asymptotic means approaching a value or curve arbitrarily closely (i.e., as a limit is taken). The IRB approach is an ASRF because Economic Capital is set to cover total mark-to-market credit losses over a one-year horizon with probability q, and it is assumed that a single, common systematic risk factor drives all dependence across credit losses in the portfolio. This is a Factor Modeling approach to Economic Capital; thus, by deploying the Factor Model for the regulatory capital of unrated tranches in issued securitizations, is not BIS guiding us towards a factor modeling approach for E.Cap more generally? This discussion may seem academic and irrelevant to the practical considerations of the business of securitization, but it is not; as Gordy comments: "The details of the contractual cash flow waterfall are material but unobservable parameters in the true model of the securitization. From the perspective of the econometrician (in our case, the regulator), such parameters act as sources of random error that must be "integrated out" rather than ignored"
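For reference, the single-factor mechanics alluded to here can be written compactly. This is the standard Vasicek/ASRF conditional default probability that underlies the IRB formula (quoted as general background, not taken from the Gordy paper itself), with rho the asset correlation, Z the common systematic factor and q the confidence level:

\[
p(Z) = \Phi\!\left(\frac{\Phi^{-1}(PD) - \sqrt{\rho}\,Z}{\sqrt{1-\rho}}\right),
\qquad
K_q = LGD \cdot \Phi\!\left(\frac{\Phi^{-1}(PD) + \sqrt{\rho}\,\Phi^{-1}(q)}{\sqrt{1-\rho}}\right) - LGD \cdot PD
\]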

The Requirement to Model Uncertainty

Gordy introduces one crucial assumption to the ASRF as a premise for the Uncertainty in Loss Prioritization (ULP) method of appraising economic capital in securitization. This assumption is fundamental to the Gordy approach (the SFA):

“It is assumed that the bank's credit portfolio is infinitely fine-grained in the sense that any single obligor represents a negligible share of the portfolio's total exposure”

The SFA is an implementation of the ULP model. From an operational perspective, the only significant challenge in implementing the ULP model lies in obtaining the pool characteristics: n, Kirb and E(LGD). Securitization transactions may therefore differ in legal form from other types of defaultable exposure, but they do not differ in the analytics of their risk. The Gordy critique of the accounting approach to securitization transactions is important; it does not relate narrowly to securitization transactions alone, but is an instance of a general view, current in both academic and commercial work, that the analysis and quantification of capital should move from an accounting basis to an econometric one, where uncertainty and error are acceptable parts of the process. Essentially, accounting certainty is a chimera. Models which rely on a precise quantification of a firm's assets and (in particular) liabilities are seriously "punctured", or appear flawed, when they rely upon a (sometimes implicit) assumption that accounting information is perfect.


The ULP model adds a stochastic element to the distribution of loss across tranche investors. The credit enhancement level determines ex-ante expected protection against losses in the pool, but the actual protection conferred by the prioritized waterfall is random and realized only at the horizon. J P Morgan point out in a recent important paper in the Structured Finance domain18;

“The output of each Basel 2 formula (which varies by asset type) is the regulatory capital requirement, and is sized to cover only the unexpected losses, since banks are expected to provision for expected losses. …Unexpected losses, under Basel 2, are modeled as a function of a systematic risk factor (a macroeconomic variable). The capital requirement, consequently, also depends on a correlation input (R) which captures the relationship between unexpected loss and the systematic risk factor. Basel 2 formulas assume that idiosyncratic risks are eliminated by diversification. However, individual regulators may adjust the total capital charge for lumpy (less diversified) or highly correlated pools. Adjustments made by regulators to capital requirements fall under “Pillar 2”of the Accord, which relates to (supervisory) oversight.”

The Gordy paradigm (ULP within ASRF) is the key framework for the stress testing of E.Cap in Off Balance Sheet Structured Finance, and the Factor Modeling approach, by extension, is the right approach to the ICAAP generally.

18 Basel 2 and Securitisation A Paradigm Shift, J P Morgan, European Securitized Products Research, 30 January 2006

Structured Instruments and the Stress Test

To use J P Morgan again for a practical perspective on the Stress Test as it relates to E.Cap in securitized instruments:

“We would highlight that levered returns are highly sensitive to any assumptions about maximum permitted leverage, funding costs and asset spreads. Furthermore, mark-to-market risk may be substantial when using the maximum permitted leverage for each funding option”.18

We should remember some of the key features and stipulations of E.Cap calculations in the BIS Accord literature, BIS working papers, CEBS material and other material from the European central banks. The ICAAP (the E.Cap capital allocation process) is predicated upon a Stress Test (ST) which is itself constructed of several plausible scenarios plus a baseline; so there are several counterfactual measures of E.Cap output from the ST plus the baseline E.Cap measure, where the baseline is defined (roughly speaking) as the scenario which describes the macroeconomic conditions prevailing at the current moment. The counterfactual scenarios, which are predicated upon different views of how macroeconomic conditions might look if certain assumptions were varied, leading to different assumptions of hazard in the risk profile of defaultable exposures, are all variations on the baseline theme; but the key foundation for the ICAAP is the baseline E.Cap calculation, which itself may vary (sometimes significantly) from the Regulatory Capital calculation produced formulaically in the Banking Book and the Trading Book and filtered by constraint and assumption mechanics that level the regulatory playing field in a manner driven neither by economics nor by mathematics.


In Securitization, it's different!

In securitization, at the baseline, the E.Cap and Regulatory Capital numbers are identical if evaluated internally (they have been conflated methodologically, as argued above); the challenge in securitization is to provide stressed measures of E.Cap for securitization issuance (and investment), and this is more complex than in the Banking Book or the Trading Book simply by virtue of the instrument complexity. Of course, implicit in our approach here to E.Cap in securitized instruments is that a calculation of E.Cap for the issued instrument using the ASRF approach is a good benchmark and comparator for the Regulatory Capital number produced largely by the RBA.


MARKET RISK A P2 PERSPECTIVE

MARKET RISK A P2 PERSPECTIVE
DEFINITIONS OF MARKET RISK
MARKET RISK LOCUS IN THE FRONT OFFICE
THE TRADING BOOK DEFINED BY INTENT
THE BCBS & IOSCO JOINT GROUP
FAIR VALUE OF POSITIONS IN THE TRADING BOOK
TRADING POSITIONS FOR WHICH NO ACTIVE REFERENCE MARKET EXISTS
CREDIT DERIVATIVES HEDGE NON-TRADING POSITIONS
THE DISTINCTION OF THE TRADING AND BANKING BOOKS
CREDIT DEFAULT SWAPS
VARIANTS OF CREDIT DERIVATIVE; SYNTHETIC TRANSACTIONS
ISDA (THE INTERNATIONAL SWAPS AND DERIVATIVES ASSOCIATION)
OPERATIONAL RISKS IN CREDIT DERIVATIVES
CONDITIONS UPON IRB IN THE TRADING BOOK


DEFINITIONS OF MARKET RISK

When focused upon supervisory and specifically regulatory requirements upon financial institutions it is easy to forget that risk is the basis of banking. In that context Market Risk is conventionally defined as:

“The risk that the value of 'on' or 'off' balance sheet positions will be adversely affected by movements in equity and interest rate markets, currency exchange rates and commodity prices". 19

Market Risk is only one of the many Financial Risks confronting a Financial Institution today. The set of Financial Risks includes the following risk types: Credit, Operational, Regulatory, Governance (Compliance), Reputation and Model risk. Market Risk is different. It has a specific locus in the trading book and can be broken out into a set of sub-risks which clearly constitute the concept of Market Risk: Interest Rate Risk, Liquidity Risk, Model Risk and Market Risk proper, where Market Risk proper means a risk to prices or capital values. Of course these risks also impact the Banking Book, but they are more obvious in the Trading Book. This clearer specification in the Trading Book means that modeling methodologies for these risks are more clearly defined in the literature of Trading Books than in general Credit Risk modeling approaches. This point has been made elsewhere in this paper with regard to the impact of Basel II: off-balance sheet modeling techniques are the answer to Pillar Two questions about on-balance sheet risk management.

19 Framework for Supervisory Information about Derivatives and Trading Activities, Joint Report by the Basle Committee on Banking Supervision and the Technical Committee of the International Organization of Securities Commissions (“IOSCO”), September 1998; http://www.bis.org/publ/bcbs39.pdf

Credit Risk justifiably attracts most attention (non-trading (balance sheet) risk consumes more risk capital), but Market Risk also consumes a large amount of risk capital. In Market Risk the focus up to now has been on Risk Reporting rather than Risk Management; now, with Basel II, there is a greater focus on Trading Risk. Market risk is significantly different in nature from credit risk. Credit risk returns are skewed, with a long fat tail, whereas market returns are normally distributed (bell-shaped). Market risks are often highly correlated, but can be reduced through hedges and only slightly reduced through diversification; credit risk is less correlated and can be reduced through diversification. There are two broad types of Market Risk: directional risk and relative value risk. It can also be differentiated into two related risks: price risk and liquidity risk. A trader intentionally takes relative value risk when he expects the relative value of two market factors to change in a particular direction (i.e. the relative difference in value will become either smaller or larger). Price risk is the risk of a decrease in the market value of a portfolio. Liquidity risk refers to the risk of being unable to transact a desired volume of contracts at the current market price; this can happen if the size of the transaction is materially large.


Market Risk locus in the Front Office

Market Risk has its locus in the front office, where the risk is taken. The (Market) Risk Management function (usually located in the "Middle Office") is responsible for, inter alia: setting risk limits (such as Value at Risk) and defining market risk policy; monitoring market risk exposures taken by the various trading areas; acting as primary contact for the control functions, audit and regulators; ensuring that market risk is adequately considered in new product proposals; keeping abreast of models employed by the trading areas; and developing robust stress testing methodologies. The standard response of the Front Office to increases in risk in general, and to regulatory and supervisory risk in particular, is the innovative development of new products that are perceived as arbitraging away that risk. Therefore, despite the fact that the second Basel Accord was a response to regulatory arbitrage of the first, we should expect a third Basel Accord to come along shortly (please do not stop reading now!), since arbitrage in the face of increasing risk is the function of the front office. Indeed, in the 4th quarter of 2005, increased volumes of sales of collateralized loan obligations (CLOs) of medium-sized corporations were observed in the debt markets in London. According to the Financial Times,

“The banks say they are motivated by the desire to reduce the capital that regulators require them to hold. It makes sense to pursue this kind of securitization from both a funding and a capital perspective.” 20

20 “Corporate Debt Market Acts As A Barometer”, By Paul J Davies; Financial Times; Nov 22, 2005

The Trading Book Defined by Intent

A working definition of the trading book could be: "Positions in financial instruments and commodities held either with trading intent or in order to hedge other elements of the trading book or the banking book". There are some basic requirements for Trading Book treatment; the positions in the Trading Book must be:

- actively monitored and managed on a trading desk;
- marked to market at least daily;
- reported to senior management; and
- covered by a clearly documented trading strategy.

To be eligible for trading book capital treatment, financial instruments must either be free of any restrictive covenants on their tradability or able to be hedged completely. In addition, positions should be frequently and accurately valued, and the portfolio should be actively managed. This amounts to a collection of positions held entirely with trading intent. Intent is a difficult concept in jurisprudence, but the Bank for International Settlements uses an intent-based definition of the Trading Book, which it describes as consistent with that of the financial institutions it surveyed on the Trading Book in April 2005: "Assets held or positions managed, benefiting from short-term price variations, positions which are actively managed, marked-to-market valuation, and for which in general a clearly defined reference market provides liquidity" (this latter clause, as we shall see later, is an identifiable predicate of a trading book position or asset only in its breach).


The BCBS & IOSCO Joint Group

In January 2004, the Basel Committee on Banking Supervision (BCBS) and the International Organization of Securities Commissions (IOSCO) decided to set up a joint working group (hereafter referred to as 'the Joint Group'). The Joint Group would consider the issues that could potentially arise as a result of concerns about potential distortions between the banking book and trading book regimes. These could be magnified by the fact that, in some jurisdictions, the Revised Framework is going to apply to both banks and investment firms, whose activities are more focused on trading. The Joint Group identified three priority issues:

The treatment of counterparty credit risk arising from certain derivative and securities financing transactions.

Double-default effects on hedged exposures and the maturity adjustment for short-term transactions in the IRB approach.

Adaptation of Trading Book to the recent developments observed in trading activities.

BIS worked jointly with IOSCO to address six issues related to the above:

- Counterparty credit risk for OTC derivatives, repo-style and securities financing transactions.
- Cross-product netting arrangements.
- Double-default effects for covered transactions, in the trading and banking books.
- The short-term maturity adjustment in the IRB approach.
- The current trading book regime, with respect to specific risk.
- The design of the capital treatment for unsettled and failed transactions.

These are "rules" for banks. National authorities can apply them more widely: to investment firms, combined groups, and any firms subject to prudential banking or securities regulation. The BCBS released final Basel II rules revising Market Risk in the summer of 2005. In these regulations the BCBS indicates that market risk is to be measured and monitored at portfolio level and that there should be limits and controls in place: concentration limits (risk diversification) and other more specific controls (e.g. for individual risk types in particular books related to specific exposures). The main portfolio measure is Value at Risk (VaR), a predictive statistical measure of loss, applied to significant market risks in trading portfolios.21

21 The Application of Basel II to Trading Activities and the Treatment of Double Default Effects, April 2005; http://www.bis.org/publ/bcbs111.htm
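As a simple, hedged illustration of the portfolio measure referred to above, the sketch below computes a one-day 99% VaR on an invented P&L history in two common ways: parametric (variance-covariance, assuming normality) and historical simulation. It is not the BCBS specification of any particular VaR model.

```python
# Illustration only, invented P&L history: one-day 99% VaR, parametric vs historical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
daily_pnl = rng.normal(0.0, 250_000.0, 500)      # 500 invented daily P&L observations

confidence = 0.99
parametric_var = -(daily_pnl.mean() + norm.ppf(1 - confidence) * daily_pnl.std(ddof=1))
historical_var = -np.percentile(daily_pnl, (1 - confidence) * 100)

print(f"99% 1-day parametric VaR: {parametric_var:,.0f}")
print(f"99% 1-day historical VaR: {historical_var:,.0f}")
```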


Fair Value of Positions in the Trading Book

In the Trading Book, banks bring their valuation in line with the market, which means that they consider liquidating their positions or hedging out their derivative risks in a normal timeframe and in a normal market environment, without moving the markets. Therefore Fair Value is assessed under the assumption of “normality” prevailing. The normal distribution is suitable as a model of that assumption and the VaR technique is thus appropriate to assess fair value of exposure to risk at given confidence intervals. In terms of the practical mechanics of marking to market, firms then mark derivative positions to the mid-market price and make some adjustments to take into account close-out costs, illiquidity, credit spread and model risk. Firms apply various valuation adjustments or reserves, mostly on a portfolio level to gauge the illiquidity, concentration, pricing and model uncertainties. Reserves refer to pricing changes in the valuation away from fair value. Occasionally, for complex derivatives, reserves are calculated on a transactional basis. Therefore some of the risks which Basel II requires the bank to quantify are already measured in the fair value (VaR) approach to marking to market generally adopted by Banking firms in valuing their portfolios on a daily operational basis (i.e. liquidity, concentration and model risk). However this technique of marking to market to achieve fair value is predicated on the assumption that normality prevails (“ceteris paribus” in a weak sense). In Pillar 2, the assessment of Economic Capital has to be made in terms of the assumption of normality breaking down, i.e. extreme market conditions, entailing thereby that the standard assumptions about liquidity risk in particular do not pertain and an alternative approach is required.


Trading Positions for which no active reference market exists

In today’s trading book the key issue is “positions for which no active reference market exists”. The accounting definition of fair value (FASB 133) now includes the notion of a reference market. The transaction price is presumed to be fair value, but in certain instances (e.g. when the transaction did not take place in the reference market), the actual transaction price may be “rebutted”. The FASB guidance incorporates a most-advantageous-market concept in evaluating fair value at initial recognition. The most advantageous market, called the “reference market”, is the most advantageous market in which the entity would conduct the transaction. The presumption that the transaction price is the fair value of the asset or liability at initial recognition is rebutted only if the transaction took place in a less advantageous market. The treatment of the Fair Value of Derivative Instruments is a detailed and tortuous technical accounting space. Although all of the details are not relevant to a discussion of Market Risk in Basel II from a Pillar 2 perspective, a brief view of FASB 133 (the US accounting standard relevant here) reveals how far the accounting world has gone to deploy econometric techniques to achieve fair value in the area of derivative securities, and indeed how much further there is to go to stress this fair value. According to FASB 133, an estimate of fair value is valid only if one of four minimum reliability thresholds for the estimate is met. The threshold would be met if the estimate was based on any of the market inputs from Levels 1 – 4 of the FASB proposed fair value hierarchy:

Level 1 input reflects quoted prices in active markets for identical derivative instruments.

Level 2 inputs reflect quoted prices for similar derivative instruments or quoted prices for identical derivative instruments from markets that are not active.

Level 3 inputs are other than quoted prices that are directly observable for the derivative instrument, such as interest rates, yield curves, volatilities, and default rates.

Level 4 inputs are not directly observable for the derivative, but are corroborated by other market data through correlation or by other means.


The amount of positions held without an active reference market ranges between 0.2 percent and 28 percent of total trading book positions. On a net basis, the proportion can be up to 80-85 percent, reflecting the fact that positions with an active reference market are typically hedged more fully. The pie chart below is taken from the BCBS survey of Trading Books and shows the proportionate type of transaction for which no active reference market can be identified22.

22 Basel Committee on Banking Supervision, Trading Book Survey: A Summary of Responses, April 2005; http://www.bis.org/publ/bcbs112.pdf


Credit Derivatives hedge non-trading positions.

OTC (Over the Counter) Credit Derivatives represent a diverse and heterogeneous group of transactions, which are principally concerned with the isolation of credit risk as a separately traded market variable. The different products focus on structuring financial instruments to allow trading in this attribute in varied formats, allowing hedging or risk assumption by market participants. It is worth pointing out that in the OTC world, London is the most important market locus with nearly half the global OTC market, wherein CDS (Credit Default Swaps) are the most important product (more than half the market), followed by TRS (Total Return Swaps). Credit derivatives are OTC derivative financial instruments whose payoff depends on the credit quality of the issuer. This credit quality can be measured by the credit rating of the issuer or by the credit spread of its defaultable bonds over the yield of a comparable default-free bond. Economic theory tells us that market and credit risk are related to each other and are not separable. These financial instruments are traded over the counter and therefore entering into these contracts bears the risk of the default of the counterparty. To protect themselves against the risk of default in the Banking Book, institutions enter into transactions to buy that protection from a counterparty. Firms generally allocate internal capital for counterparty credit risk exposure on credit default swap transactions in which the firm is purchasing protection. The purchase of protection is known as hedging. Hedging positions within a portfolio are natural “risk offsets” (or “internal hedges”) for others (usually within the Banking Book). A financial hedge seeks to erect counterbalanced positions to keep particular classes of risks out of one's portfolio. So, for example, a Credit Default Swap hedges away the risk of default in a portfolio of mortgages. The CDS then sits in the trading book, but in a part of it where the motivation for holding the CDS is not necessarily “trading intent”.

When a corporation uses derivatives to hedge its risks, by definition the underlying position and the hedging position “have opposite directions”; that is, if we hedge a long USD position, our hedging position is short in USD. If the underlying position is in profit, the hedging position is in loss and vice versa! However, if one believes that risks are not always correctly priced, and that there are also other types of market imperfection (asymmetric information, for example), then under these assumptions risk management is an essential tool for gaining competitive advantage.

The distinction of the Trading and Banking Books.

Credit derivatives hedging banking book items are booked in the trading book because they are fair valued for accounting purposes. Inconsistencies between regulatory and accounting standards contribute to the blurring of the boundaries between trading book and banking book. Banks now have to hold non-trading regulatory capital books that are marked to market. Conversely, of course, banks hold loans in their trading book, mainly mortgage loans and distressed loans. These loans are held in order to be securitized and are risk-managed using market risk techniques (VaR, stress tests, limits, etc.). They are generally held for a longer holding period than traditional trading book items (e.g. up to six months, or more). At the discretion of national regulators, banks may employ a third tier of capital for the sole purpose of meeting a proportion of the capital requirements for market risks. Tier 3 capital is limited to 250% of the bank’s tier 1 capital that is required to support market risks; tier 2 capital may be substituted for tier 3 capital up to the same 250% limit, subject to the overall limits for tier 2 capital in the Basel II Accord. A bank must first calculate capital for credit risk, and only afterwards calculate the market risk requirement. The capital requirement is to be met on a continuous basis, at the close of each business day.


Credit Default Swaps.

Credit Default Swaps are derivative contracts in which one counterparty (the "Buyer" of protection) pays a premium to a second party (the "Seller" of protection) for taking on the credit risk of an issuer or a security (the "Reference Issuer"). If the Reference Issuer suffers a "credit event", generally an event of default, then the Seller of protection pays the loss on the Reference Security to the Buyer. In general the post-credit event payment will either be an agreement to purchase a bond issued by the specified credit at par, or a cash payment based on the market value of the Reference Security at the time of default. Determination of the premium paid is closely linked to asset swap levels on Reference Issuer bonds. Standard default swap documentation has been established by the International Swaps and Derivatives Association, Inc. (ISDA). The CDS has nearly half of the market share of all credit derivatives and has become the dominant financial instrument in the credit derivatives market. The risk replication of a CDO via a portfolio of CDS is one of the key market positions or sets of transactions and one of the major reasons why the CDS market remains more liquid than the markets for other credit derivative instruments. The CDS market is at the moment more liquid than the corresponding cash market for bonds, so that the CDS premium is the key piece of information for analyzing the creditworthiness of companies. It also reflects corporate credit quality better than credit spreads in the financial markets. In the BIS survey, however, several firms mentioned that liquidity is often questionable for credit derivatives. CDS are not static instruments that only perform in the event of default. They are dynamic, market-sensitive products whose mark-to-market performance is closely related to changes in credit spreads. In buying protection with a CDS, the investor essentially replicates the cash flow of a short position in the corporation’s debt. In the case of default, the investor is able to buy the defaulted debt for its recovery value in the market and sell it to its counterparty, the protection seller, for its face value.
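As a simplified illustration of the protection and premium legs just described, the following sketch ignores discounting, accrued premium and documentation details; the notional, spread and recovery rate are hypothetical.

```python
# Illustrative only: a simplified view of the two legs of a credit default swap.
def cds_protection_payoff(notional, recovery_rate):
    """Payment from protection seller to buyer on a credit event (cash settled)."""
    return notional * (1.0 - recovery_rate)

def cds_premium_paid(notional, spread_bps, years_until_default_or_maturity):
    """Total premium paid by the protection buyer, assuming annual payments."""
    return notional * (spread_bps / 10_000) * years_until_default_or_maturity

print(cds_protection_payoff(10_000_000, 0.40))     # 6,000,000 paid on default
print(cds_premium_paid(10_000_000, 150, 3))        # 450,000 of premium over 3 years
```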

Variants of Credit Derivative; synthetic transactions.

In the interest of brevity in this exposition of our approach to Pillar 2, we have focused upon the Credit Default Swap (CDS), which is the major traded instrument by volume in European Trading Books. There are others, which we will simply list here. All of these can be treated in a similar manner for the purposes of stress testing and economic capital allocation in the ICAAP. The standard response of the industry to the identification of new financial risk is product innovation. This is the response to what can be described as increased regulatory risk and supervisory risk as a consequence of the new Basel Accords! A brief depiction of the innovation of derivative instruments is the movement of risk off the balance sheet. Thus risk can be seen as shades along a spectrum from on-balance sheet exposure to default, to the purely off-balance sheet default risk of a securitized instrument. Innovation of derivative instruments is essentially a strategy towards the same goal. It should be noted that the “newness” of an innovative derivative product is directly proportional to its (lack of) liquidity; the converse being the risk management implication that very new and very innovative products do carry increased liquidity risk. The list of these innovated products along the spectrum towards off-balance sheet securitization is indicated by the following: Credit Linked Notes (CLN) and Synthetic Collateralized Loan Obligations. A synthetic transaction is difficult to define precisely, but the concept can be understood by considering a transaction whereby a securitized instrument is exposed to the underlying default synthetically, i.e. via the mechanism of writing CDS on a pool of underlying corporate exposures or other securitized instruments. Other Credit Derivative innovations include Total Return Swaps, Credit Linked Notes and Credit Spread transactions: credit spread forwards and credit spread options.


ISDA (The International Swaps and Derivatives Association)

A further key factor in the huge growth of Credit Derivatives is the standardized credit derivatives definitions introduced by ISDA. This project established a new and standardized ISDA confirmation form for credit derivatives, thereby simplifying the documentation and approval process of credit derivative transactions. Even more importantly, all transactions will now be based on a common set of definitions. Previously, credit derivatives transactions were based on ISDA documents introduced in early 1998, but different firms and countries had developed minor variations on the standard ISDA long form confirmation. This gave rise to potential legal issues, and as a result, some participants were selective about who they would deal with, and other conservative firms probably did not participate in the credit derivatives market at all. The new definitions, by eliminating this documentation/legal risk, are a significant factor in broadening the investor base for credit derivatives and improving market liquidity. The purpose of the ISDA Master Agreements is to make dealings with derivatives more efficient; they contain the ‘non-economic’ terms such as representations and warranties and default and termination events. Adherence to the ISDA master agreement leaves counterparties free to negotiate the ‘economic’ terms and allows contract wordings to be agreed and signed before risks are assumed. This structured and disciplined approach to the trading process should increase the liquidity of the credit derivatives market and reduce the exposure of participants to operational and legal risks.

Operational Risks in Credit Derivatives

One issue current at the end of 2005 was that of operational risks associated with the trading of credit derivatives. OTC derivatives had become slow and manually intensive to process through the back office, with some trades expiring unconfirmed. It had become apparent that there had been litigation costs resulting from non-execution of the ISDA Master Agreement annex. ISDA developed a Strategic Plan which highlighted operational inefficiencies associated with OTC derivatives trading and provided a blueprint for the adoption of more standardized, automated post-trade processes. On 17 February 2006 the Federal Reserve Bank of New York (Fed) announced that the world's major derivatives dealers had met initial targets for reducing the backlog of unconfirmed trades in the $12.4 trillion credit derivatives market. The regulator originally met with the banks in September 2005 to voice concerns about risk management practices in the rapidly growing credit derivatives market. Banks' back offices have struggled to keep up with the booming market, which according to ISDA figures surged at an annual growth rate of 128% to $12.43 trillion in mid-2005. In particular, the Fed was concerned by the high level of unsigned confirmations outstanding between counterparties. By February 2006 the market participants had met targets to reduce the number of confirmations outstanding for more than 30 days; as a group, a 54% reduction was achieved by the end of January. Firms have also signed up to the new ISDA electronic messaging protocol for trades.23

23 SOURCE: www.Finextra.com


Conditions upon IRB in the Trading Book

There are four conditions of acceptable regulatory and supervisory deployment of an Internal Ratings Based approach in the Trading Book:

The qualitative factors related to the governance of the risk management of the trading book in general and the implementation of stress testing;

The quantitative factors related to the implementation of appropriate quantitative methods for P1 calculations and a rigorous method of stress testing for the calculation of economic capital in P2;

The market risk factors which require that positions in the trading book are aggregated and summarized by the four instrument type classes described in the table above (Equity, Commodity, FX and Bonds);

The implementation of Stress Testing in a manner which takes account of the fact that the P1 methodology (VaR), being predicated upon the assumption of normality, is ineffectual at the measurement of the loss exposure faced by an institution during extreme events.

These four conditions are detailed below.

1. The Qualitative Factors

There must be an independent risk control unit which must demonstrate integration with day-to-day risk management of the bank;

There must be trading limits;

The risk management of the trading book must be overseen by the compliance function;

Stress-testing must be implemented, verified by back-testing, with process assurance subject to external validation and independent review in the risk control unit or model validation unit.

2. The Quantitative Factors

For Pillar One regulatory compliance, value-at-risk must be calculated daily (to the 99th percentile, one-tailed confidence interval, with an instantaneous price shock equivalent to a 10-day movement in prices). It is necessary to update data sets no less frequently than every three months. The regulations prescribe no particular type of model; it is simply stipulated that the models must accurately capture option risks. The institution has discretion to recognise empirical correlations within broad risk categories. Pillar One regulatory requirements insist upon daily calculation of the capital requirement (which is widely in place with VaR models anyway). Regulatory Capital will be arrived at via a multiplication factor based on the supervisor’s judgment of the quality of the bank’s risk management system (the number 3 is the most often quoted); a sketch of this calculation follows below.

3. Market Risk Factors

The Market Risk factors require that the institution has deployed a risk measurement system that models the yield curve (interest rate risk), foreign exchange exposures (foreign exchange risk), market movements in equities (equity risk) and convenience yield (commodities risk). The risk measurement system would be independent of the operational trading book day-to-day positions and would model risk on the basis of these instrument types.

We at SAP would expect most of our clients to further disaggregate these aggregates as a matter of course.
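As a sketch of the Pillar One quantitative mechanics above, the following illustrates how a general market risk charge is commonly derived from a daily VaR series and the supervisory multiplication factor; the VaR figures are hypothetical and the calculation is a simplification, not the SAP Bank Analyzer implementation.

```python
# Illustrative only: the general market risk charge taken as the greater of
# yesterday's VaR and the multiplier times the trailing 60-day average VaR.
# The daily VaR series below is a dummy, hypothetical input.
def market_risk_charge(var_history, multiplier=3.0):
    last_60 = var_history[-60:]
    return max(var_history[-1], multiplier * sum(last_60) / len(last_60))

daily_var = [8_500_000 + 50_000 * (i % 7) for i in range(90)]
print(market_risk_charge(daily_var))
```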


4. Stress Testing

Stress testing allows firms to look at the impact on their business of more extreme events, which are not generally captured by traditional risk management models such as Value at Risk (VaR). Good risk management practice suggests that firms should conduct some form of stress testing of the key risks within their business and that senior management should be fully engaged in the process. A key function of stress testing is to sensitize senior management to the concept of the ‘stressed risk appetite’. A well thought out and conducted stress test can help senior management to focus on whether it would be comfortable with the risk/return consequences of a set of extreme, but plausible, business conditions. If the likely outcomes are outside management’s stressed risk appetite, some adjustment to the business/risk profile of the firm may be warranted. This also provides an important link for effective capital allocation by senior management. A survey by the FSA found that stress testing awareness and competency is most developed within firms that have a short-dated trading book risk profile. However, in the majority of cases, stress testing models were backward looking, i.e. focused on historical rather than possible future events. There was surprisingly little evidence of firms having developed methodologies to examine the relatively straightforward and potentially important scenario of the failure of one or more major counterparties. 24

24 The FSA Discussion paper 05/2 is entitled 'Stress testing'. It was published in May 2005, http://www.fsa.gov.uk/pages/library/policy/dp/2005/05_02.shtml


INTEREST RATE RISK

B2 TREATMENT OF IRR
DEFINITIONS OF IRR
THE TERM STRUCTURE OF INTEREST RATES
APPLIED MODELS OF INTEREST RATE RISK
IR PORTFOLIO MANAGEMENT STRATEGIES
THE RE-PRICING & THE MATURITY MODEL
IR SENSITIVE ASSETS AND TRANSACTIONS
DURATION, MATURITY AND RE-PRICING
DURATION
DURATION: SPECIAL FOCUS IN MORTGAGES
MACAULAY DURATION
MODIFIED DURATION
DURATION – METHODOLOGIES FOR MONITORING IRR
IRR, SIMULATION AND STRESS TESTING
THE STANDARDIZED RATE SHOCK (BCBS)
INTEREST RATE DERIVATIVES & HULL AND WHITE
NO-ARBITRAGE TERM STRUCTURE MODELS


B2 TREATMENT OF IRR

Basel II treats interest rate risk (IRR) under the Second Pillar of Supervisory Review (rather than the First Pillar of Regulatory Capital) due to differences in the methods used by banks to handle this risk (specifically IRR in the Banking Book, or IRRBB). IRR is the risk that an institution will experience deterioration in its financial position as interest rates (IR) move over time, mainly if its assets and liabilities are of different maturities and priced off different interest rates. Given the treatment of IRRBB in Pillar 2, it is only in Pillar 2 that we can see the risk distribution of Financial Risk over the whole Bank. It may be described at a high level by the table below:


Interest Rate Risk (IRR) is generally sourced in the relevant Central Bank policy. It used to be the case that institutions tried to model the manner in which a Central Bank would respond to changes in macroeconomic conditions with interest rate shifts by building models of Central Bank Reaction Functions. Ironically, we at SAP are now advocating an alternative, appropriate response to Supervisory Review. We believe that the Financial Institution should have its own Reaction Function in terms of the manner in which its Economic Capital position will respond to changes in macroeconomic conditions, including Central Bank interest rate decisions (important in terms of IRR). Arguably the most important variable in any model of the economy is the rate of interest. This is because the rate of interest is the cog in the wheels of any model of the economy which relates the real economy (productivity, investment, consumption and property prices) with the financial economy (saving, asset prices in terms of stock and bond markets, government securities and the maintenance of levels of liquidity). It is in this space that most of the debates in economic theory and applied economics occurred in the 20th century, such as the Friedman versus Keynes(ian) debate and the consequent debate between Structural or Keynesian models of the economy and Reduced Form (RF) factoids! The positivist approaches to modeling the economy associated with Milton Friedman (and Anna Schwartz) were consequently developed, with important impact upon political economic policy in both the US and UK from the late 1970s through to current policy models. The Friedman versus Keynesian debate pivots upon Keynes’ Liquidity Preference theory of Money Demand, first established by John Maynard Keynes in The General Theory of Employment, Interest and Money (1936).

Definitions of IRR

A (simple) working definition of Interest Rate Risk (IRR) would be

“The risk that changes in interest rates may adversely affect an investment’s fair value”

The Basel Committee on Banking Supervision (BCBS) defines interest rate risk as 25

“The exposure of a bank's financial condition to adverse movements in interest rates. Changes in interest rates affect a bank's earnings by changing its net interest income and the level of other interest sensitive income and operating expenses. Changes in interest rates also affect the underlying value of the bank's assets, liabilities, and off-balance-sheet (OBS) instruments because the present value of future cash flows (and in some cases, the cash flows themselves) change when interest rates change.”

25 BCBS, Principles for the management and supervision of interest rate risk, July 2004; http://www.bis.org/publ/bcbs108.htm


THE TERM STRUCTURE OF INTEREST RATES

We cannot begin a discussion of the Term Structure of Interest Rates (TSIR) without addressing some fundamental principles of basic economic theory. Arguably we cannot present a comprehensive exposition of the TSIR without a detailed discussion of some fundamental aspects of both micro and macroeconomic theory. However, as we are focusing here upon the requirements of Basel II Pillar 2, we have neither the space nor the freedom to discuss these areas in appropriate depth. Suffice it to mention that when a concept such as the TSIR pivots on other concepts like inflation and productivity, a full exposition would be more detailed than the one given below. The TSIR and the views held by each financial institution about the forward paths of inflation and productivity are fundamental to the business of banking, the business of financial intermediation and the management of risk. We must therefore begin this exposition with the basic textbook definition of financial intermediation.

1. Financial Intermediation and Banking Practice

In financial markets buyers and sellers usually do not deal directly with one another. Intermediaries, such as banks, pension funds and unit trust companies, sit between them. These intermediaries provide financial asset maturity transformation. Banks borrow short-term and lend long-term, keeping enough reserves to meet the expected short-term liquidity needs of depositors. The establishment of the financial intermediation function or industry in an economy provides (so the theory goes) economies of scale in the process of matching individuals’ demand for and supply of credit as part of the management of their own individual business cycles. Pooling of deposits means that the transaction costs of a lending decision (risk evaluation) are only paid once. The Financial Institutions provide risk-spreading, whereby pooling of deposits also means that a depositor, or the purchaser of a unit trust, does not have to worry about the risk of any single borrower. By being spread, risk is reduced.

2. The Mechanics of the Term Structure

The term structure of interest rates is the relationship between time to maturity and interest rates for default-free, pure discount instruments. The term structure is sometimes called the “zero coupon yield curve” to distinguish it from the Treasury (or Gilts in the UK) yield curve, which is based on coupon bonds. Zero-coupon bonds are US Treasury securities which make no interest payments. “Zeros” normally have a time-to-maturity of less than one year. The “Term Structure” is the relationship between the yield (the internal rate of return) and the time-to-maturity for risk-free zero-coupon bonds of progressively greater maturities. The term structure is a series of interest rates corresponding to the yields on zeros of progressively greater maturities. A yield curve is a plot of yields to maturity as a function of maturity. For maturities up to one year, the yields are extracted from zero-coupon bonds. For longer maturities, the yields are those of on-the-run coupon bonds. On-the-run coupon bonds are bonds that have just been issued. Since they are issued at par, typically their coupon rate is very similar to their yield rate. We use the term structure to value Treasury bonds. We also use the term structure to value derivative securities on T-bonds. The Yield Curve is typically used to value corporate bonds, mortgages, and other securities that have a cash flow structure similar to that of coupon bonds. The shape of the yield curve is the same as the shape of the term structure. In other words, inferences about the state of the economy made using either the yield curve or the term structure will be identical. Pricing a coupon bond as a portfolio of discount bonds is equivalent to discounting the coupon bond’s cash flows at the term structure of yields of the zero coupon bonds, as illustrated in the sketch below. Any risk-free (default-free) coupon bond can be viewed as a portfolio of risk-free (default-free) discount bonds. Any risk-free cash flow should be discounted at the term structure of zero-coupon yields.
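As a minimal sketch of that equivalence, the following prices a hypothetical three-year 5% coupon bond off an assumed zero-coupon curve; the rates are illustrative only.

```python
# Illustrative only: pricing a default-free coupon bond as a portfolio of
# zero-coupon bonds, i.e. discounting each cash flow at the zero rate for its
# maturity. The zero curve below is hypothetical.
def price_from_zero_curve(cash_flows, zero_rates):
    """cash_flows and zero_rates are lists indexed by year (annual compounding)."""
    return sum(cf / (1 + z) ** (t + 1)
               for t, (cf, z) in enumerate(zip(cash_flows, zero_rates)))

# 3-year 5% annual coupon bond, face value 100, against a hypothetical zero curve
cash_flows = [5, 5, 105]
zero_rates = [0.040, 0.043, 0.045]
print(round(price_from_zero_curve(cash_flows, zero_rates), 2))   # ~101.4
```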


3. Yield Curves and the Term Structure

A Yield Curve is a graph of the (best-fit) GRY (Gross Redemption Yield) on (UK) Gilts against the unexpired term of the securities. Often a separate curve is drawn for high, medium and low coupon bonds, for example the FTSE Actuaries Government Securities (UK) yield indices. Other methods to overcome the problem of different coupon levels are generally too elaborate, e.g. the yield surface (see Duffie and Singleton). The Strip curve (or zero coupon curve or spot curve) is the yield curve when all gilts are zero coupon. Gilts are UK government securities, the equivalent of US Treasuries, and define the risk-free rate. The yield curve when all Gilts are zero coupon is also known as the term structure of interest rates.

The concept of the risk free rate and the term structure are key for all theoretical (and applied) considerations of interest rate risk management and indeed default risk management. 26

The Par yield curve is the coupon that would be required by the market for a gilt issued at par for the given term. The axes of the par yield curve are coupon value (y-axis) against term to redemption (x-axis). The par yield curve is generally upwardly sloping. There are a number of theories as to why that is the case.

26 Shane Whelan, University College Dublin (UCD)

4. Theories of the Term Structure of Interest Rates

There are several standard theories of the Term Structure, which are the theories of the Yield Curve. They can be described as follows:

The Expectations Theory. This approach can be described as a (crude) application of the REH (Rational Expectations Hypothesis), whereby forward rates equal expected future zero rates, or equivalently the yield curve reflects the market’s expectations of future short-term rates. On this basis long-term rates are a geometric average of current and expected short-term rates (a small numerical illustration follows the three theories below).

Market Segmentation or Preferred Habitat theory: Yields at each term to redemption are determined by supply and demand from investors with (nominal) liabilities of that term. In other words, different investors have different needs - banks at the short end, life insurance companies and pension funds at the long end. Price is a function of supply and demand in each segment. Short, medium and long rates are determined independently of each other. Different types of investors have specific needs in terms of maturity. Yield curve reflects the intersection of demand and supply of individual maturities.

Liquidity Preference Theory. Forward rates are higher than expected future zero rates; the theory analyses the upward-sloping yield curve as a function of future uncertainty. The explanation of the relationship is given by the position that a premium is required to hold long-term debt. In other words, investors as a class prefer liquid assets to illiquid ones. Longer dated stocks are less liquid than shorter dated stocks, hence yields should be higher on longer dated stocks.
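As a small numerical illustration of the Expectations Theory, the sketch below derives a three-year zero rate as the geometric average of a hypothetical path of expected one-year rates.

```python
# Illustrative only: under the (pure) Expectations Theory the n-year zero rate
# is the geometric average of the current and expected future one-year rates.
# The short-rate path below is hypothetical.
def long_rate_from_expected_shorts(expected_one_year_rates):
    n = len(expected_one_year_rates)
    growth = 1.0
    for r in expected_one_year_rates:
        growth *= (1 + r)
    return growth ** (1 / n) - 1

print(long_rate_from_expected_shorts([0.040, 0.045, 0.050]))  # ~0.045, the 3-year rate
```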


5. Liquidity Preference; Keynes

The Liquidity Preference Theory of interest rates states that

“...market rates of interest adjust to balance the supply and demand for money.”

The quantity of money supplied in the economy is fixed at whatever level the Central Bank decides to set it. The Money Demand is determined by several factors. However, the most important is the interest rate.

“People choose to hold money instead of other assets that offer higher rates of return because money can be used to buy goods and services.”

(i.e. a desire for liquidity or Liquidity Preference.) The primary opportunity cost of the convenience of holding money is the interest income that one surrenders when one holds cash or cheque account balances. An increase in the interest rate raises the cost of holding money and thus reduces the quantity of money balances people wish to hold. The general price level of all goods and services in the economy influences money demand and therefore interest rates: a higher price level raises money demand (i.e. a shift in the money demand curve). Increased money demand leads to a higher interest rate. Increased interest rates reduce the quantity of goods and services demanded.

6. Procyclicality

The tendency of the credit system to expand during booms and contract during slumps, the notion of procyclicality, was recently brought to the fore by the new Basel Accords (the subject of this paper). However, it was initially elucidated by Keynes in what he termed the “Liquidity Trap”, which remained a key challenge to banking academics and supervisors until challenged by James Tobin in his paper on wealth effects in 1958. As interest rates increase, the cost of borrowing and the return to saving are greater. Fewer households and firms borrow money, leading to a decrease in spending.

An increase in government purchases causes the interest rate to rise, and a higher interest rate tends to choke off the demand for goods and services. The reduction in demand that can result when a fiscal expansion raises the interest rate is called the ‘crowding-out effect’.

7. The Definition of Money

Money throughout history has been defined by its function: as a means of payment, store of value, unit of account or standard of deferred payment; it has been defined by Aristotle, Hume, Marx and Keynes (amongst others). The “Store of Value” concept was not seen as a unique attribute of money; the “Unit of Account” and “Standard of Deferred Payment” were seen as the same thing as “Store of Value” but with different time horizons. Marx viewed the unit of account aspect of money as the driver towards “accumulation, accumulation, accumulation”!! The view of Marx was that modern capitalist societies had destroyed the use-value of money (or exchange) and driven the function of money towards the accumulation of “surplus value”. Modern economists see through this (once) emotionally charged language, and surplus value is seen as precisely the maintenance function of investment markets and financial institutions. The function of the Basel Accords is to constrain our institutional structures from destroying or depleting that surplus value. What was to Marx a pejorative aspect of the classical economic system is now a crucial aspect of that system’s function and purpose. Today in practice, as the Central Bank accepts the role of lender-of-last-resort in order to maintain confidence in the banking system, the banks are not constrained by a given stock of reserves. They are, however, still subject to reserve requirements, and the central bank can influence the demand for reserves by manipulating short-term interest rates. But if the banks are prepared to pay the required interest rate to borrow reserves, then there is no limit on their credit creation.


8. Combining Theories of the Term Structure

The majority of investors want a positive real return, hence there is a demand for issued sovereign or corporate paper. The overall level and slope of the yield curve depends, nowadays in practical terms, on the inflation outlook of market participants and issuers. This view comes from the canon of Rational Expectations theories. Market participants want compensation for uncertainty, with an inflation risk premium in addition. This demand for premia is necessarily upward sloping with term. It is also generally accepted that there is a positive real interest rate related to real factors in the economy, usually associated in the late 20th century with productivity. This allows us to present (using Professor McNeil’s notation) a factor model for the Term Structure of Interest Rates which can integrate the Keynesian, Classical and Expectations theories of the term structure.


APPLIED MODELS OF INTEREST RATE RISK

One of the standard mathematical models of the Term Structure is the Cox, Ingersoll and Ross (CIR) mean-reverting, square-root diffusion process, which is given by the equation

$$dr = \alpha (R - r)\,dt + \sigma \sqrt{r}\,dz$$

where α = speed of reversion to the long-run mean interest rate, r = current short-term interest rate, R = long-run mean of the short-term interest rate, σ = volatility factor, and dz = the increment of a standard Wiener process (standard normal innovations).

The CIR model is a one-factor model of the term structure. This means there is only one factor determining the term structure of interest rates, namely the short rate, which follows the equation specified above. The CIR model (like the Hull and White model described below) is a Stochastic Volatility Model of the Term Structure of Interest Rates. Stochastic Volatility is a deterministic function of time and (pure) risk (which can be characterized as statistical “noise”). The CIR model is a mean-reverting process, meaning the short rate is pulled back towards its long-run mean; the square-root diffusion term shrinks the volatility as the rate approaches zero, which keeps the rate from becoming negative. The model is the basic underlying framework deployed and enhanced by Financial Mathematicians whenever a term structure model is needed as an input to another modeling challenge (time to default or derivative pricing, as in Duffie and Singleton or Hull and White, respectively). Being “mean reverting”, the model is exposed to autocorrelation errors in application and thus needs practical development; most often this is achieved by enhancing the model to a two-factor approach. In plain English this means that the single-factor model could overweight some scenarios and underweight others.

The CIR model is an Equilibrium model of the Term Structure, which means that the term structure of interest rates is viewed positivistically as an output given some assumptions about how an overall economic equilibrium is achieved. This means that some strong background assumptions are required for the CIR model – that the market is frictionless, there are no taxes or transaction costs, securities are perfectly divisible, the bond market is complete and there exists a discount bond for each maturity. This is an efficient markets hypothesis. The CIR model is usually ascribed a publication date of 1985, although it was certainly around and influencing academic thinking before that.
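A minimal sketch of the CIR process follows, using a simple Euler discretisation with hypothetical parameter values; practical implementations would use more careful discretisation and calibration.

```python
# Illustrative only: Euler discretisation of the CIR short-rate process
# dr = alpha*(R - r)dt + sigma*sqrt(r)dz, with hypothetical parameters.
# Flooring the rate at zero is a common practical fix for the discretisation.
import random

def simulate_cir_path(r0, alpha, R, sigma, dt=1/252, steps=252, seed=42):
    random.seed(seed)
    path = [r0]
    r = r0
    for _ in range(steps):
        dz = random.gauss(0.0, dt ** 0.5)            # Wiener increment
        r = max(r + alpha * (R - r) * dt + sigma * (r ** 0.5) * dz, 0.0)
        path.append(r)
    return path

path = simulate_cir_path(r0=0.03, alpha=0.5, R=0.05, sigma=0.08)
print(path[-1])   # simulated short rate after one year
```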



IR PORTFOLIO MANAGEMENT STRATEGIES

1. Passive Interest Rate Portfolio Management

Passive Bond Management takes no view on future movements in interest rates, and seeks to control the inherent risk of a portfolio of assets, both accruing and expecting returns based upon the term structure, and of liabilities with expected payments to be made as a function of the term structure. There are in general two strategies of passive interest rate management:

Indexing; or

Immunization, where immunization means to insulate the portfolio from interest rate risk by “Gap management.”

2. Active Interest Rate Portfolio Management

Active Bond Management attempts to beat the market by exploiting forecasts of interest rate changes using two types of strategy:

A Rate Anticipation Strategy, where the investment manager takes a view on the future course of the level of interest rates and constructs a portfolio that benefits most from interest rate moves in the anticipated direction; or

A Spread Anticipation Strategy, where the portfolio manager takes a view on how the yield curve spread will vary over time and constructs a portfolio accordingly. In this instance, no specific view is taken on how the level of rates will vary over time.

A third strategy, which is more akin to modern techniques of Stress Testing, is one focused upon Interest Rate Volatility Anticipation, where the portfolio manager takes a view on the future volatility of interest rates and constructs a portfolio accordingly; again, no specific view is taken on how the level of rates will vary over time.

3. The Relationship of Interest Rates and Bond Prices

An increase in interest rates causes bond prices to fall and a decrease in interest rates causes bond prices to rise. Longer maturity debt securities tend to be more volatile in price in response to underlying changes in the term structure as a consequence of an alteration in Central Bank base rates. For a given change in interest rates, the price of a longer term bond generally changes more than the price of a shorter term bond. Two bonds with the same term to maturity do not have the same interest-rate risk. A 10 year zero coupon bond makes all of its payments at the end of the term. A 10 year coupon bond makes payments before the maturity date. When interest rates rise, the prices of low coupon securities tend to fall faster than the prices of high coupon securities. Similarly, when interest rates decline, the prices of low coupon rate securities tend to rise faster than the prices of high coupon rate securities. Knowledge of the impact of varying coupon rates on security price volatility led to the development of a new index of maturity other than straight calendar time. The new measure permits analysts to construct a linear relationship between term to maturity and security price volatility, regardless of differing coupon rates. The new index of maturity is known as “Duration”. The term “Duration” has a special meaning in the context of bonds. It is a measurement of how long in years it takes for the price of a bond to be repaid by its internal cash flows. It is an important measure for investors to consider, as bonds with higher durations are more risky and have higher price volatility than bonds with lower durations. Throughout the life of the bond, the duration is continually decreasing as time to the bond’s maturity decreases.


Duration is important because Financial Institutions attempt to manage interest rate risk by balancing the duration of assets and liabilities. This is the locus of IRR, and it is in the modeling processes deployed here that the level of economic capital required for that risk will be identified.

4. The Relationship between Duration and Maturity

In computing the duration of bonds, we use the yield-to-maturity (the internal rate of return) to discount the cash flows. For Coupon bonds the duration is shorter than maturity. For Discount bonds (yield is greater than coupon), duration increases at a decreasing rate up to a point, after which it declines. For Par value bonds duration increases with maturity. For Premium bonds (yield is less than coupon) duration increases throughout, but at a lesser rate than with a par value bond.

Duration depends on yield-to-maturity. The higher the yield the shorter the duration, other things being equal (ceteris paribus). The higher the coupon rate, the lower the duration of the bond.

As interest rates increase, duration decreases and the bond becomes less sensitive to further rate changes; this is the inverse relation between YTM and duration noted above. There is a positive relation between term to maturity and duration, although duration increases at a decreasing rate with maturity.

THE RE-PRICING & THE MATURITY MODEL

The two older approaches to modeling IRR are the Maturity and Re-Pricing schedule approaches. In a recent paper, the Basel Committee on Banking Supervision (BCBS) describes these two approaches as the simplest techniques for measuring a bank's interest rate risk.25 The BCBS states that when a Financial Institution uses this approach, it should begin with a maturity/repricing schedule that distributes interest-sensitive assets, liabilities, and OBS positions into “time bands” according to their maturity (if fixed-rate) or time remaining to their next repricing (if floating-rate). When this approach is used to assess the interest rate risk of current earnings, it is typically referred to as gap analysis.

5. The Re-Pricing Model (or Approach)

The Re-Pricing or funding gap model is based on the book value of the assets. It contrasts with the market value-based maturity and duration models recommended by the Bank for International Settlements (BIS). In the Re-Pricing approach a book-value accounting cash-flow analysis of the Re-Pricing gap between the interest revenue earned on a Financial Institution’s assets and the interest paid on its liabilities over some particular period is undertaken. The institution then reports Re-Pricing gaps for assets and liabilities in the following maturity buckets:

One day

More than one day to three months

More than three months to six months

More than six months to twelve months

More than one year to five years

Over five years.


In the Re-Pricing approach rate sensitivity means time to Re-Pricing. The Re-Pricing gap is the difference between the rate sensitivity of each asset and the rate sensitivity of each liability; in other words, the Re-Pricing gap is the difference between those assets whose interest rates will be Re-Priced or changed over some future period (rate-sensitive assets) and liabilities whose interest rates will be Re-Priced or changed over some future period (rate-sensitive liabilities). There are several serious weaknesses of the Re-Pricing approach or “Model”. It ignores market value effects and off-balance sheet cash flows. It is over-aggregative - the distribution of assets and liabilities within individual buckets is not considered and mismatches within buckets can be substantial. It ignores the effects of runoffs (term to maturity is not fully modeled). Given that a Bank will, for example, continuously originate and retire consumer and mortgage loans, the impact of runoffs may be rate-sensitive.

6. The Maturity Model (or Approach)

The Maturity model explicitly incorporates market value effects. For fixed-income assets and liabilities, a rise/fall in interest rates leads to a fall/rise in market price. The longer the maturity, the greater the effect of interest rate changes on market price. The fall in the value of longer-term securities increases at a diminishing rate for a given increase in interest rates. The maturity of a portfolio of assets (liabilities) equals the weighted average of the maturities of the individual components of the portfolio. [In this way a portfolio of homogeneous mortgages may be viewed as a single bond]. Maturity matching is about immunizing the institution from exposure to the impacts of interest rate risk. In general, the longer the term to maturity, the greater the sensitivity to interest rate changes. The portfolio’s sensitivity to interest rate risk runs in the opposite direction to rates: a decrease in interest rates produces an increase in fair value.
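As an illustration of the Re-Pricing gap report described above, the sketch below uses hypothetical book values per time band and estimates the approximate one-year net interest income impact of a parallel rate move.

```python
# Illustrative only: a cumulative re-pricing gap report and the approximate
# one-year net interest income (NII) impact of a parallel rate move. The bucket
# amounts are hypothetical book values in currency millions.
buckets = ["1 day", ">1d-3m", ">3m-6m", ">6m-12m", ">1y-5y", ">5y"]
rate_sensitive_assets      = [20, 30, 70, 90, 40, 10]
rate_sensitive_liabilities = [30, 40, 85, 70, 30, 5]

cumulative = 0.0
for name, rsa, rsl in zip(buckets, rate_sensitive_assets, rate_sensitive_liabilities):
    gap = rsa - rsl
    cumulative += gap
    print(f"{name:>8}: gap {gap:+6.1f}  cumulative {cumulative:+6.1f}")

one_year_gap = sum(rate_sensitive_assets[:4]) - sum(rate_sensitive_liabilities[:4])
print("Approx. NII impact of +1% parallel shift:", one_year_gap * 0.01)
```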

IR SENSITIVE ASSETS AND TRANSACTIONS.

Examples of Rate-Sensitive Assets are:

Short-term consumer loans

Three-month Treasury bills, Re-Priced on maturity every 3 months

Six-month Treasury notes, Re-Priced on maturity every 6 months

30-year floating-rate mortgages, Re-Priced (rate reset) every 9 months

One example of a Highly (Interest) Sensitive Investment would be a debt investment with contract terms that make the investment’s fair value highly sensitive to interest rate changes, e.g. collateralized mortgage obligations, such as interest-only or residual tranches. Interest rate risk is generally heightened when the institution is undertaking fixed-to-variable swaps; this is known as Basis Risk, the risk that arises when the variable interest rates on a derivative and the associated bond are set on different bases (for example, the derivative is variable while the bond or homogenized portfolio is on a fixed-rate basis).

DURATION, MATURITY AND RE-PRICING

The BCBS recommends the use of a Duration approach in combination with a maturity/repricing schedule to provide a rough approximation of the change in a bank's economic value that would occur given a particular set of changes in market interest rates. Duration weights are based on estimates of the duration of the assets and liabilities that fall into each time band, where duration is a measure of the percentage change in the economic value of a position that will occur given a small change in the level of interest rates.
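A rough sketch of that duration-weighted calculation is given below; the time bands, net positions, duration weights and the 200 basis point shock are hypothetical assumptions, not the BCBS standardized weights.

```python
# Illustrative only: a duration-weighted estimate of the change in economic
# value for a 200 basis point parallel shock, in the spirit of the
# maturity/repricing schedule described above. All figures are hypothetical.
time_bands       = [">6m-12m", ">1y-5y", ">5y"]
net_positions    = [50.0, -20.0, 30.0]     # net (assets - liabilities), millions
duration_weights = [0.7, 2.8, 7.5]         # approximate modified durations, years

shock = 0.02                               # 200bp parallel shift
delta_ev = -sum(p * d * shock for p, d in zip(net_positions, duration_weights))
print(f"Approximate change in economic value: {delta_ev:+.2f}m")
```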


DURATION

Duration is the key measure of interest rate risk. Duration measures the average time taken by the security, on a discounted basis, to pay back the original investment. The longer the duration, the greater the risk. Duration is also the price elasticity, which is the percentage change in price for a percentage change in yield. Again, the greater the duration of a security, the greater the risk of that security. The longer the duration, the longer it takes on average to get your money back. Duration is calculated from the first derivative of the price of a bond with respect to the yield, and can be interpreted as the time before the “average” dollar is repaid; it is an essential tool in immunizing portfolios from interest rate risk. In order to immunize an overall position from interest rate movements, one should construct portfolios that minimize the “duration gap” of assets and liabilities. Alternative definitions of Duration are ‘the weighted average of term-to-maturity of discounted cash flows’ or ‘the point at which discounted cash flows balance’.

DURATION: SPECIAL FOCUS IN MORTGAGES

Duration in all its forms (Modified duration and Macaulay duration; see below) measures a mortgage's price sensitivity to rate movements, assuming the cash flows are held constant. This may in certain instances not be a good assumption for mortgage products, owing to prepayments, a problem normally confronted in the securitization of Mortgage Books. Durations are often quoted as a percentage of modified duration, where an Option-Adjusted Duration (OAD) model measures price sensitivity for small rate movements. This may not account for how securities actually trade; it is reliant on the existence of a prepayment model, as in a securitization waterfall cash flow. Empirical duration (EOAD) is obtained from a regression of actual price performance versus rates, where the regressor can be price or OAD versus rates, adjusted for volatility and the slope of the curve; the regression slope may asymptotically approximate the duration of the mortgage (or indeed the mortgage portfolio). Most mortgage agreements have a Prepayment Option. Any payments made by a borrower in excess of scheduled principal payments are called prepayments. The option is defined by the borrower's right to prepay all or part of the mortgage at any given time. The uncertainty for the mortgage holder which results is termed prepayment risk. The Prepayment Motivation may be complex. Prepayment may occur for one of several reasons, such as sale of the property, default or refinancing. Motivations beyond rational economic considerations play an important role in assessing prepayment risk. The Risk for the Mortgage Holder is an interest rate risk (re-investment risk): should the mortgage be fixed-rate, market risk arises as a result of prepayment if rates fall and coupons are above market rates. Liquidity risk occurs should the mortgage portfolio be securitized for debt issuance; prepayment implies the need to raise new financing or apply CE in the securitized portfolio.
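As an illustration of the empirical (regression-based) duration idea, the sketch below regresses percentage price changes on yield changes; the data points are invented for illustration and do not represent any actual mortgage pool.

```python
# Illustrative only: estimating an empirical duration for a mortgage pool by
# regressing observed percentage price changes on observed yield changes.
# The data points below are hypothetical.
import numpy as np

yield_changes = np.array([-0.0010, -0.0005, 0.0000, 0.0005, 0.0010, 0.0015])
pct_price_chg = np.array([ 0.0031,  0.0016, 0.0001, -0.0014, -0.0030, -0.0046])

slope, intercept = np.polyfit(yield_changes, pct_price_chg, 1)
empirical_duration = -slope          # % price change per unit yield change, sign flipped
print(round(empirical_duration, 2))  # ~3.1 years for this hypothetical pool
```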

Macaulay Duration

The formula usually used to calculate a bond’s basic duration is the Macaulay duration, which was created by Frederick Macaulay in 1938 but not commonly used until the 1970s. The Macaulay duration is calculated by adding the results of multiplying the present value of each cash flow by the time it is received, and dividing by the total price of the security. For a bond with price $P_b$, cash flows $C_n$ received at times $n = 1, \dots, N$ and yield $r$ per period, this can be written as

$$D = \frac{1}{P_b} \sum_{n=1}^{N} n \, C_n \, (1+r)^{-n}$$

The Macaulay Duration expresses the mathematical relationship whereby the prices of bonds with longer maturities drop more steeply with an increase of yield, which as described above is because bonds of longer maturity have longer Macaulay duration. When the coupon rate is lower than the yield, the duration first increases with maturity to some maximum value, then decreases to the asymptotic limit value.

Modified Duration

The Modified Duration, which is the measure most commonly in current use, is the Macaulay duration divided by $(1+r)$:

$$D_M = \frac{D}{1+r}$$

The negative of the modified duration gives the approximate percentage change in price for a given change in yield to maturity.
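The two measures can be computed directly from the formulas above; the bond in the sketch below (a three-year 5% annual-coupon bond at a 4.5% yield) is hypothetical.

```python
# Illustrative only: Macaulay and modified duration computed from the formulas
# above for a hypothetical bond.
def macaulay_duration(cash_flows, yield_per_period):
    pv = [cf / (1 + yield_per_period) ** (t + 1) for t, cf in enumerate(cash_flows)]
    price = sum(pv)
    return sum((t + 1) * v for t, v in enumerate(pv)) / price

def modified_duration(cash_flows, yield_per_period):
    return macaulay_duration(cash_flows, yield_per_period) / (1 + yield_per_period)

cash_flows = [5, 5, 105]
print(round(macaulay_duration(cash_flows, 0.045), 3))   # ~2.86 years
print(round(modified_duration(cash_flows, 0.045), 3))   # ~2.74
```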

Duration – Methodologies for monitoring IRR

The Methodology for the calculation of Duration is to prepare a weighted average of the durations of individual instruments or cash flows. When using a Duration methodology for IRR, the implicit assumption is of parallel shifts in the yield curve, since Duration is a linear way to measure a nonlinear risk. Duration may be regarded as a weighted average time to maturity using the relative present values of the cash flows as weights. It combines the effects of differences in coupon rates and differences in maturity.

1. Effective Duration

The interest rate risk of instruments with cash flows that depend on interest rates, such as callable/puttable bonds, mortgages or inverse floating rate notes, is given by Effective Duration. It is straightforward to implement with cash flows that are deterministic functions of interest rates (e.g. inverse floaters); however, implementation with more complicated instruments (e.g. mortgages) requires mathematical models of the term structure and of the cash flows. The assumptions underlying Macaulay and Modified Duration may be said to be unrealistic, e.g. that cash flows do not change with interest rates. This does not hold for Collateralized Mortgage Obligations or any other securitized instrument, callable bonds, or Profit and Loss Account liabilities which themselves vary with interest rates due to the inflation-interest rate correlation via expectations. Macaulay and Modified Duration also assume that interest rates shift in parallel fashion, which is unrealistic since, as we have already stated, short-term interest rates tend to be more volatile than longer-term rates. Effective Duration accommodates interest-sensitive cash flows and can be based on any term structure. Effective Duration allows for non-parallel interest rate shifts and can be used to value such assets as Collateralized Mortgage Obligations.


Two relationships used in the discussion below are the definition of convexity as the second derivative of price with respect to yield, and the modified-duration approximation of the price change for a small change in rates:

$$\text{Convexity} = \frac{1}{P} \frac{d^2 P}{dr^2} \qquad \qquad \Delta P \approx -D_M \, P_0 \, \Delta r$$

2. Duration Matching

Duration Matching involves hedging against interest rate risk by matching the durations of assets and liabilities. It provides protection against small parallel shifts in the zero curve. Duration matching applies a weighted average time to maturity, using the relative present values of the cash flows as weights. It combines the effects of differences in coupon rates and differences in maturity in models of a portfolio of interest-sensitive assets and liabilities, based on each instrument’s elasticity of bond price with respect to the interest rate. Effectively, Duration is a measure of the interest rate sensitivity or elasticity of a liability or asset. Immunizing the entire balance sheet need not be costly. Duration can be employed in combination with hedge positions to achieve that immunization, using a modeling framework similar to that of the Maturity model approach above.

3. Limits to Duration Modeling

Immunization is a dynamic process, since duration depends on the instantaneous rate R. The effects of large interest rate changes are therefore not accurately captured. As a result of Duration being a linear representation of a convex relationship, Duration works well locally, but when interest rate changes are large, duration does not give a good estimate of the price response. This has to do with the fact that the relationship between a bond’s price and its yield is convex, whereas duration implies a linear relationship. Convexity is easily explained as the ‘second derivative of price with respect to yield’. Modifying the duration estimate for convexity helps to better approximate the true relationship between price and yield. The longer the maturity, the greater the convexity, ceteris paribus. The higher the coupon rate, the greater the convexity; and the larger the interest rate change, the bigger the convexity factor. Modeling becomes more complex and Duration less effective if non-parallel shifts in the yield curve occur or are to be modeled. These constraints imply that if one is modeling high coupon rate environments or large interest rate changes, then the models of interest rate risk must be presented with higher levels of model risk when measuring the associated economic capital.

4. Convexity and Duration The duration measure is a linear approximation of a non-linear function. If there are large changes in R, the approximation is much less accurate. All fixed-income securities are convex. Convexity is desirable, but greater convexity causes larger errors in the duration-based estimate of price changes. Recall that duration involves only the first derivative of the price function. We can improve on the estimate using a Taylor expansion; in practice, the expansion rarely goes beyond second order (using the second derivative). Duration only works well for small changes in yield or price; the approach essentially produces a linear approximation without considering the convexity of the price-yield function. The convex price-yield relationship will differ among bonds or other cash flow streams depending on the coupon and maturity. The convexity of the price-yield relationship declines more slowly as the yield increases. The convexity correction adjusts the duration-based estimate of the percentage change in price for a given change in yield.

5. Volatility and Duration The higher a bond's modified duration measure, the more sensitive it will be to changes in the interest rate.
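The second-order Taylor correction described above can be illustrated with the minimal sketch below; the duration, convexity and rate shock values are illustrative assumptions only.

```python
# Minimal sketch: duration-only vs duration-plus-convexity price change estimates.
# Duration (d_mod), convexity (C) and the rate shocks are illustrative assumptions.

def price_change(price, d_mod, convexity, dr):
    """Approximate price changes via first- and second-order Taylor expansions."""
    first_order = -d_mod * price * dr                                 # duration-only estimate
    second_order = first_order + 0.5 * convexity * price * dr ** 2    # add the convexity term
    return first_order, second_order

if __name__ == "__main__":
    for shock in (0.0050, 0.0200):                                    # 50bp and 200bp parallel shocks
        lin, quad = price_change(price=100.0, d_mod=7.0, convexity=65.0, dr=shock)
        print(f"shock={shock:.2%}  duration-only={lin:+.2f}  with convexity={quad:+.2f}")
```

For the larger shock, the gap between the two estimates shows why a duration-only measure understates the true (convex) price response.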


6. Duration and Portfolio Immunization Portfolio immunization is an interest rate portfolio management strategy that tries to protect the expected yield from a security or portfolio of securities by acquiring securities whose duration equals the length of the institution's planned holding period. If the average duration of a portfolio equals the institution's desired holding period, the effect is to hold the institution's total return constant regardless of whether interest rates rise or fall: the institution earns identical total earnings whether interest rates go up or down. With duration set equal to the buyer's planned holding period, a fall (rise) in the reinvestment rate is completely offset by an increase (a decrease) in the bond's market price. At least, that is the strategy. Suppose an institution faces a series of cash obligations in the future and wishes to construct a portfolio of bonds that it will use to pay these obligations. The solution requires matching the duration as well as the present values of the portfolio and the future cash obligations. This process is called immunization (protection against changes in yield). By matching duration, the portfolio value and the present value of the cash obligations will respond identically (to a first order approximation) to a change in yield.
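A minimal numerical sketch of this matching condition follows: it solves for the value allocated to two hypothetical bonds so that the portfolio's present value and duration equal those of a single future obligation. All instruments, durations and amounts are illustrative assumptions.

```python
# Minimal sketch: two-bond immunization of a single future obligation.
# Solve x1 + x2 = PV of the obligation and the PV-weighted duration match.
# All bond durations, present values and the obligation are illustrative assumptions.

def immunize(pv_obl, dur_obl, dur1, dur2):
    """Return the value allocated to bond 1 and bond 2 (durations in years)."""
    # x1 + x2 = pv_obl  and  (x1*dur1 + x2*dur2) / pv_obl = dur_obl
    x1 = pv_obl * (dur_obl - dur2) / (dur1 - dur2)
    x2 = pv_obl - x1
    return x1, x2

if __name__ == "__main__":
    # Obligation with PV 1,000,000 and duration 6y; bonds with durations 3y and 9y.
    x1, x2 = immunize(pv_obl=1_000_000, dur_obl=6.0, dur1=3.0, dur2=9.0)
    print(f"invest {x1:,.0f} in the 3y-duration bond and {x2:,.0f} in the 9y-duration bond")
```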

7. Difficulties with the immunization procedure using Duration The principal difficulty flows from the problem of convexity and the linear nature of duration described above. It is necessary to rebalance, or re-immunize, the portfolio from time to time, since duration depends on yield. The immunization method assumes that all yields are equal (it is not realistic for bonds with different maturities to have the same yield). When the prevailing interest rate changes, it is unlikely that the yields on all bonds will change by the same amount. The paradigmatic answer here would be to deploy Convexity Matching instead of Duration Matching; this can be mathematically arduous, but without this approach one is carrying a priori model risk. It should also be remembered that interest rate changes also affect the value of off-balance sheet claims. A duration gap hedging strategy must include the effects on off-balance sheet items such as futures, options, swaps, caps, and other contingent claims, as well as issued securitized instruments.


IRR, Simulation and Stress Testing

In the BCBS paper on IRR Stress Testing 25 an exposure simulation is advocated which is described as typically involving detailed assessments of the potential effects of changes in interest rates on earnings and economic value by simulating the future path of interest rates and their impact on cash flows. In static simulations, the cash flows arising solely from the bank's current on- and off-balance sheet positions are assessed. In a dynamic simulation approach, the simulation builds in more detailed assumptions about the future course of interest rates and expected changes in a bank's business activity over that time. The BCBS paper goes on to state that a meaningful evaluation of the effect of stressful market conditions on the bank is necessary. Stress testing should be designed to provide information on the kinds of conditions under which the bank's strategies or positions would be most vulnerable, and thus may be tailored to the risk characteristics of the bank. Possible stress scenarios might include abrupt changes in the general level of interest rates, changes in the relationships among key market rates (i.e. basis risk), changes in the slope and the shape of the yield curve (i.e. yield curve risk), changes in the liquidity of key financial markets, or changes in the volatility of market rates.

The Standardized Rate Shock (BCBS)

For exposures in G10 currencies, either:

- an upward and downward 200 basis point parallel rate shock; or
- the 1st and 99th percentile of observed interest rate changes using a one-year (240 working days) holding period and a minimum of five years of observations.

For exposures in non-G10 currencies, either:

- a parallel rate shock substantially consistent with the 1st and 99th percentile of observed interest rate changes using a one-year (240 working days) holding period and a minimum of five years of observations for the particular non-G10 currency; or
- the 1st and 99th percentile of observed interest rate changes using a one-year (240 working days) holding period and a minimum of five years of observations.
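As a sketch of how the percentile-based shock could be derived from a historical series of the relevant rate (the series below is synthetic, purely for illustration):

```python
# Minimal sketch: 1st/99th percentile of one-year (240 working day) rate changes
# over at least five years of daily observations. The rate series is synthetic.
import numpy as np

def percentile_shock(rates, holding_days=240, lo=1, hi=99):
    """rates: daily series of a yield (in decimals). Returns (1st, 99th) percentile changes."""
    changes = rates[holding_days:] - rates[:-holding_days]   # overlapping one-year changes
    return np.percentile(changes, lo), np.percentile(changes, hi)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic six years of daily rates around 4%, purely for illustration.
    rates = 0.04 + np.cumsum(rng.normal(0.0, 0.0004, size=6 * 240))
    down, up = percentile_shock(rates)
    print(f"downward shock: {down:.2%}, upward shock: {up:.2%}")
```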

The relative simplicity of the 200 basis point parallel rate shock has the disadvantage of ignoring exposures that might be revealed through scenarios that include yield curve twists, inversions, and other relevant scenarios.


The Standardized Rate Shock – Calculation Steps 1. The first step is to offset the longs and shorts in each time band, resulting in a single short or long position in each time band.

2. The second step is to weight these resulting short and long positions by a factor that is designed to reflect the sensitivity of the positions in the different time bands to an assumed change in interest rates. The set of weighting factors for each time band is set out in Table 1 below. These factors are based on an assumed parallel shift of 200 basis points throughout the time spectrum, and on a proxy of modified duration of positions situated at the middle of each time band and yielding 5%.

3. The third step is to sum these resulting weighted positions, offsetting longs and shorts, leading to the net short- or long-weighted position of the banking book in the given currency.

4. The fourth step is to calculate the weighted position of the whole banking book by summing the net short- and long-weighted positions calculated for different currencies.

5. The fifth step is to relate the weighted position of the whole banking book to capital.
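A minimal sketch of steps 1 to 5 follows. The time bands and weighting factors below are illustrative placeholders only, not the actual Table 1 factors of the BCBS framework, and the positions and capital figure are invented.

```python
# Minimal sketch of the standardized rate shock calculation (steps 1-5).
# The weighting factors are illustrative placeholders, NOT the BCBS Table 1 values;
# the positions and the capital figure are invented for the example.

# Per currency: {time band: [long/short positions]}; positive = long, negative = short.
positions = {
    "EUR": {"0-1y": [120.0, -80.0], "1-5y": [60.0, -90.0], "5y+": [40.0, -10.0]},
    "USD": {"0-1y": [30.0, -50.0],  "1-5y": [20.0, -5.0],  "5y+": [-15.0, 5.0]},
}
# Illustrative weights ~ modified duration at the band midpoint x a 200bp shock.
weights = {"0-1y": 0.008, "1-5y": 0.055, "5y+": 0.130}
capital = 500.0

weighted_by_ccy = {}
for ccy, bands in positions.items():
    net_weighted = 0.0
    for band, legs in bands.items():
        net_position = sum(legs)                        # step 1: offset longs and shorts per band
        net_weighted += net_position * weights[band]    # steps 2-3: weight and sum across bands
    weighted_by_ccy[ccy] = net_weighted

total_weighted = sum(weighted_by_ccy.values())          # step 4: aggregate across currencies
print(f"weighted banking book position: {total_weighted:.2f}")
print(f"as a share of capital: {total_weighted / capital:.2%}")   # step 5: relate to capital
```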


Interest Rate Derivatives & Hull and White.

1. Some Basic Definitions - Interest Rate Derivatives The bond yield is the discount rate that makes the present value of the cash flows on the bond equal to the market price of the bond. The par yield for a certain maturity is the coupon rate that causes the bond price to equal its face value. The forward rate is the future zero rate implied by today's term structure of interest rates. A zero rate (or spot rate) for maturity T is the rate of interest earned on an investment that provides a payoff only at time T. To calculate the cash price of a bond we discount each cash flow at the appropriate zero rate. For an upward sloping yield curve: Fwd Rate > Zero Rate > Par Yield. For a downward sloping yield curve: Par Yield > Zero Rate > Fwd Rate.
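To make the definitions concrete, the sketch below prices a bond by discounting each cash flow at the corresponding zero rate and backs out a forward rate from two zero rates; the curve and the bond are illustrative assumptions, and continuous compounding is assumed for simplicity.

```python
# Minimal sketch: discounting at zero rates and implying a forward rate.
# Zero curve and bond cash flows are illustrative; continuous compounding assumed.
import math

zero_curve = {1: 0.040, 2: 0.043, 3: 0.045}   # maturity (years) -> zero rate

def bond_price(cash_flows):
    """cash_flows: {time in years: amount}. Discount each flow at its zero rate."""
    return sum(cf * math.exp(-zero_curve[t] * t) for t, cf in cash_flows.items())

def forward_rate(t1, t2):
    """Forward rate between t1 and t2 implied by the zero curve."""
    return (zero_curve[t2] * t2 - zero_curve[t1] * t1) / (t2 - t1)

if __name__ == "__main__":
    # Three-year 5% annual coupon bond, face value 100.
    price = bond_price({1: 5.0, 2: 5.0, 3: 105.0})
    print(f"bond price: {price:.2f}")
    print(f"2y->3y forward rate: {forward_rate(2, 3):.3%}")
```

With this upward sloping illustrative curve the implied forward rate exceeds the zero rates, consistent with the ordering stated above.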

2. An introduction and overview of Interest Rate Derivative Products A basic Security (i.e. not a derivative) is a piece of paper representing a promise. Derivative Securities, which derive their value from a basic security, include:

- Options;
- Forward Contracts (an agreement to buy or sell something at a future date for a set price, the forward price);
- Swaps, agreements between two counterparties to exchange cash flows in the future according to a prearranged formula;
- Variance and Volatility Swaps, where Variance is a measure of the uncertainty of a stock price and Volatility (standard deviation) is the square root of the variance (the amount of "noise", risk or variability in the stock price). Volatility swaps are forward contracts on future realized stock volatility, and Variance swaps are forward contracts on future realized stock variance.

Newer and more sophisticated derivative securities in the fixed interest markets which are usually deployed as derivatives of securitized instruments are Covariance and Correlation Swaps. Examples of standard interest rate derivative products used to hedge interest rate portfolios are:

- Exchange-Traded Interest Rate Options;
- Treasury bond futures and options (traded primarily on CBOT);
- Eurodollar futures and options (traded primarily in London).

More complex examples are:

- Embedded Bond Options or Callable bonds, where the issuer has an option to buy a bond back at the "call price", which may be a function of time; or
- Put-able bonds, where the bond-holder has an option to sell the bond back to the issuer, given some contractual conditions, usually triggered by interest rates breaching some level.

A cap provides a payment every time a specified floating rate, such as 6-month LIBOR, exceeds the agreed cap rate.

The change in the forward bond price is related to the change in the forward bond yield by D, where D is the (modified) duration of the forward bond at option maturity. This relationship implies the following approximation:

\[
\sigma_B \;\approx\; D\,y_0\,\sigma_y
\]

where σy is the yield volatility, σB is the price volatility and y0 is the initial forward yield. Often σy is quoted with the understanding that this relationship will be used to calculate σB (they are endogenous).
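As a purely illustrative calculation (the figures are assumptions, not market data): with a forward bond duration D = 5, an initial forward yield y0 = 6% and a quoted yield volatility σy = 20%, the implied price volatility would be σB ≈ 5 × 0.06 × 0.20 = 6%.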


No-Arbitrage Term Structure Models

No-arbitrage term structure models take the initial term structure as an input, adjusting time-varying parameters so that the initial term structure is exactly matched. This process is generally called calibration. No-arbitrage term structure models are usually associated with Hull and White (although not exclusively); the Hull and White approach allows mean reversion, as in CIR. The modern approach to term structure models for pricing interest rate derivatives follows some basic positivist rules:

- capture the "states of the world" with a limited number of state variables or factors;
- use realistic but simple stochastic processes to describe their dynamics;
- formulate a pricing kernel consistent with no arbitrage.

The concept of arbitrage has deep roots in finance, in particular the Modigliani and Miller capital structure invariance of 1958 (on which so much of modern quantitative financial engineering is built) and the Black-Scholes model of 1973. Assuming away arbitrage is one way to arrive at an Efficient Markets hypothesis. As arbitrageurs expand their positions to capture price-to-value discrepancies, portfolio risk rises faster than portfolio reward. We therefore need a no-arbitrage approach to risk in order to develop an initial model for pricing interest rate derivatives. Arbitrage activity is when, for example, a hedge fund takes a position in a stock (or the relevant corporate bond) and bids spreads tighter because it expects the issuing entity to be acquired or subject to some other market news which will impact its price. Markets tend to be efficient when three conditions are in place: agents are diverse, there is a proper aggregation mechanism, and incentives are in place. Markets tend to become inefficient when one (or more) of the three conditions is violated; the condition most likely to be violated is diversity.

In an equilibrium model (like CIR) today's term structure is an output; in a no-arbitrage model (like Hull and White) today's term structure is an input. The objective of the no-arbitrage model is to price interest rate and other derivatives accurately (including credit derivatives, where Hull and White has been successfully deployed).

The graphic above shows the growth in the use of interest rate derivative products in Ireland between 1999 and 2003; it is taken from a publication of the Central Bank of Ireland referenced below.27

27 Interest-Rate-Related Derivatives Growth at Credit Institutions in Ireland by David Doran, Financial Stability Report 2004


The Hull and White Model This is a very positivist view of markets; the market is what it is and behaves the way it does, whether or not we can explain it theoretically. It is therefore the standard for pricing interest rate derivatives, since it stands or falls on its predictive power. The short rate is driven by a linear stochastic differential equation:

\[
dr = \left[\,\theta(t) - a(t)\,r\,\right]dt + \sigma(t)\,dz
\]

where:

- θ(t) calibrates to the yield curve;
- σ(t) calibrates to caps or swaptions;
- a(t) can be set to zero to allow for parallel interest rate shocks.

The Hull and White model has many analytic results for bond prices and option prices. It deploys two volatility parameters, a and σ; it assumes that interest rates are normally distributed and that the standard deviation of a forward rate is a declining function of its maturity. Using the applied or positivist approach of Hull and White, the historical evidence is that interest rates are closer to normal than lognormal; the swaption skew recovers the market consensus and rejects lognormality. The Hull-White model has analytical tractability, accurate volatility calibration and ease of implementation. Prices will change even if the yield curve and volatilities are properly calibrated in your model; most changes are small to moderate in size, but they will undoubtedly occur. Constant vigilance and constant model development and calibration are therefore necessary for rigorous risk management.
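As an illustration of the dynamics only (not of any calibration used in the SAP solution), the sketch below simulates the Hull-White short rate with a simple Euler discretization; the constant parameter values are illustrative assumptions.

```python
# Minimal sketch: Euler simulation of the Hull-White short rate
#   dr = [theta(t) - a*r] dt + sigma dz
# with constant a, sigma and theta; all parameter values are illustrative assumptions.
import numpy as np

def simulate_hull_white(r0, a, sigma, theta, dt, steps, n_paths, seed=0):
    """Return an array of shape (n_paths, steps + 1) of simulated short rates."""
    rng = np.random.default_rng(seed)
    rates = np.empty((n_paths, steps + 1))
    rates[:, 0] = r0
    for i in range(steps):
        dz = rng.normal(0.0, np.sqrt(dt), size=n_paths)   # Brownian increments
        drift = (theta - a * rates[:, i]) * dt            # mean-reverting drift
        rates[:, i + 1] = rates[:, i] + drift + sigma * dz
    return rates

if __name__ == "__main__":
    paths = simulate_hull_white(r0=0.04, a=0.1, sigma=0.01, theta=0.004,
                                dt=1 / 252, steps=252, n_paths=10_000)
    print(f"mean one-year-ahead short rate: {paths[:, -1].mean():.4f}")
```

In a full calibration, theta(t) would be fitted to the initial yield curve and sigma(t) to cap or swaption prices, as described above.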


LIQUIDITY RISK

THE IMPORTANCE OF LIQUIDITY RISK
LIQUIDITY DISAPPEARANCE AND CONTAGION
TYPES OF LIQUIDITY RISK
THE JOINT FORUM DEFINITIONS OF TYPES OF LIQUIDITY RISK
THE BANK OF ENGLAND VIEW OF THE DEFINITION OF LIQUIDITY RISK
THE CUSTOMER FUNDING GAP
FUNDING AND MARKET LIQUIDITY
MARKET MICROSTRUCTURE
DISTRESS IS VANISHING LIQUIDITY
THREE PHASES OF DISTRESS
INSTITUTIONAL AND MARKET DISTRESS
SUPERVISORY RESPONSES: LOLR AND ELA
LOLR AND ELA


THE IMPORTANCE OF LIQUIDITY RISK.

Liquidity Risk (LR) is the last of the risk types we will deal with in this exposition of Market Risk from a Basel II Pillar 2 perspective; it is in fact the last topic of our paper. It is without question both the most difficult and the most opaque topic of which we will attempt a clear exposition in this paper. Further, it is the risk of which Risk Managers "dare not speak its name", being the one risk which is singularly blamed for all events of market failure and institutional collapse in the late 20th and 21st centuries. Lack of liquidity means that an otherwise solvent institution may fail if the Central Bank, as lender of last resort, will not, for other policy considerations, replace that lost liquidity at the moment of the institution's need. In a recent speech the Deputy Governor of the Bank of England noted:

“There are certain features of the prevailing financial and economic environment that give us pause for thought. Market prices are at historically unusual levels: real and nominal returns on risk free assets are low and credit spreads are tight, both in traditional and structured products. It is of course hard to say definitively the extent to which today’s markets are merely reflecting changed fundamentals. But it is quite possible that some investors have unwittingly taken on higher levels of risk in pursuit of what they would consider to be “normal” levels of return. And it is certainly prudent to plan for the possibility of a sharp reversion of prices to historically more normal levels (or even beyond them, given the tendency of markets to overshoot). There could be a period of impaired market liquidity during any such correction. One could imagine a number of potential catalysts for such a correction, ranging from a geo-political event to some form of major operational disruption.”28

28 Speech by Sir Andrew Large, Deputy Governor of the Bank of England, Financial Stability Managing Liquidity Risk In A Global System, the Fourteenth City of London Central Banking and Regulatory Conference, London, 28 November 2005. Published in the BIS Review 7/9/2005

Liquidity disappearance and Contagion

Mathematically it can be demonstrated that contagion is greatest for a small, definite number of interconnections between a distressed financial institution and its peers; for large, diversified interconnections contagion has less impact. In this way bank failure (or weakness) can have micro-systemic impacts upon borrowers and on other banks, arising through informational contagion, financial contagion and the impact of common shocks, i.e. weakness impacting upon sentiment. There are severe economic costs if liquidity provision to enterprises is disrupted and/or if solvent but illiquid providers of 'relationship credit' are forced to close. In this manner liquidity distress is transmitted through the transmission mechanism into the real economy. Below we discuss in detail the macroprudential responsibilities of Supervisors and Central Banks with regard to this problem. Banks are opaque and can suffer withdrawal of (wholesale) funding as a result of uncertainty about a bank's solvency. This may result in liquidation of illiquid assets and can lead to insolvency. In its Stability Review of June 2005, the Bank of England quotes the definitive description of the operation of contagion through the financial system:

Suppose that the volatility of a given asset rises sharply, the models will tell all the firms to sell. As all try to sell, liquidity dries up. As liquidity dries up, volatility spreads from one asset to another. Previously uncorrelated assets are now correlated in the general sell-off, enhanced by the model driven behaviour of other institutions caught up in the contagion. Whilst in normal times such models may encompass a wide range of behaviour, in extreme circumstances the models will encourage firms to act as a herd, charging toward the cliff edge together. 29

29 Persaud, A. (2000). Sending the Herd off the Cliff Edge: The disturbing interaction between herding and market-sensitive risk management practices. State Street Bank.


Liquidity Risk (LR) in the P2 Context Liquidity Risk is a confusing topic (from a supervisory or P2 perspective) because it is not clear whether this is a risk type to be treated qualitatively or quantitatively. In the initial months after the first Basel Accords were published, most European regulators discussed the challenge of Liquidity Risk in qualitative terms. Latterly, however, the emphasis has been on the need for regulated financial institutions (FI) to stress test this aspect of Market Risk. This stress testing requirement demands that Liquidity Risk be treated quantitatively, from the perspective of a methodological approach to capturing how the FI's exposure to this risk may fluctuate under extreme conditions.

Since the introduction of the Basel Accords a vast literature, both regulatory and academic, has developed on credit, market and operational risk. In sharp contrast almost no attention has been given to liquidity risk. 28

As we will discuss below, LR, although not difficult to define, has a complex meaning and can mean different things in different contexts. Its precise meaning can only be clear if we define LR in terms of what it is not and what it is interdependent upon. Once we are clear upon its meaning, we can clearly set out the challenge to the FI of quantifying its exposure to LR in P2 during the Stress Test, and thereby provisioning economic capital appropriately to meet the exposure to this risk. The graphic on the left illustrates euro area real money growth in recent experience; it is precisely referenced below.30

30 Recent Developments in Asset Prices and Liquidity in the Context of an Evolving Relationship, Frank Browne, David Cronin and Edward J. O’Brien, CBFSAI (Central Bank and Financial Services Authority of Ireland), Financial Stability Report 2005.

Types of Liquidity Risk

A conventional analysis of liquidity risk distinguishes between funding liquidity risk and market liquidity risk. Funding Liquidity Risk is the risk that the counterparties who provide the bank with short-term funding will withdraw or not roll over that funding, e.g. there will be a 'run on the banks' as depositors withdraw their funds. Market Liquidity Risk is the risk of a generalized disruption in asset markets that makes normally-liquid assets illiquid. The first is more important in the context of the maturity transformation that occurs in the banking book; the second is more important in the context of tradable assets in the trading book. 28


The Joint Forum Definitions of Types of Liquidity Risk

Paul Sharma, head of the FSA department which covers all prudential risks – market risk, credit risk, operational risk, insurance risk and so on, as well as liquidity risk – in his speech referred to above takes his lead from the work of the Joint Forum of the Basel Committee and its international regulatory counterparts in the securities and insurance fields, IOSCO and the IAIS, in framing a more comprehensive definition of LR. As Mr. Sharma points out, the Joint Forum (JF) is the senior international body. The JF sees liquidity risk in terms of adverse liquidity outcomes that arise from a combination of an external or non-liquidity trigger event and an internal vulnerability. An adverse liquidity outcome is:

- the inability of an institution to pay its liabilities as they fall due; or
- realizing a loss on the forced sale of assets to raise liquidity.

Continuing, Sharma argues that internal vulnerabilities to liquidity risk arise principally either because:

- assets are, in relative terms, less liquid than liabilities; or
- a bank has granted its counterparties significant optionality.

The Bank of England view of the definition of Liquidity Risk

An often repeated analysis of the inevitability of LR in a financial institution is described in the Bank of England’s Stability Review of December 2005:-

The role of banks and building societies as intermediaries, transforming deposits into illiquid loans, leaves them vulnerable to liquidity risk. Therefore, it is important to assess both market and funding liquidity to ensure that banks hold a sufficient stock of liquid assets to fulfill both expected and unexpected financial commitments as they arise. 31

In a recent speech by Sir Andrew Large, Deputy Governor of the Bank of England, a detailed description is given of the inevitable fragility of a financial institution, flowing from its role in intermediation:

The traditional route [to liquidity risk] arises from the banking system’s role in maturity transformation between short term deposits and long term loans. Managing this mismatch whilst maintaining the confidence of depositors is the essence of the business of banking. However, the presence of this maturity mismatch means that individual banks are by their nature fragile. The connections between banks, and the potential for doubts about one bank to spread to others, mean that the failure of one bank to manage its mismatch can potentially put at risk the financial system more widely.28

This Bank of England paper goes on to point out a key empirical relationship in the UK whereby the two types of liquidity risk, which could be seen as fundamentally different, are in fact, to a degree, identical.

31 Bank of England, Financial Stability Review: December 2005; The Financial Stability Conjuncture And Outlook.


The Customer Funding Gap

Major UK banks have seen the annual growth rate in lending to non-bank 'customers' outpace the corresponding growth in deposits from this sector. This has created a 'customer funding gap': the stock of lending to customers exceeds the stock of customer deposits. All the major UK banks now have a customer funding gap. The major UK banks have filled the customer funding gap by issuing debt securities, such as certificates of deposit (CDs), and by borrowing in the interbank market. Wholesale funding is typically more expensive, lumpier, and more volatile than retail funding. It is also generally short term and needs to be refinanced regularly. In times of market-wide stress, these liabilities may therefore pose liquidity risks. Banks continue to mitigate liquidity risk by diversifying their funding and issuing debt with lengthier maturities. They have issued both senior debt and covered bonds in the UK. The Bank of England Stability Review paper 31 above also states that financial institutions hold high-quality liquid assets to mitigate the liquidity risk inherent in both their on- and off-balance sheet activities. The regulatory minimum for major UK banks' holdings of liquid assets is determined by the sterling stock liquidity ratio (SSLR).

Funding and Market Liquidity

If liquidity risk is defined (the narrow, or funding risk, definition) as the risk that the bank will have insufficient funds to hand at a given time to deal with depositors' cash demands and day-to-day cash and regulatory requirements, and the source of the bank's liquidity is the issuance of debt instruments to fund an inherent funding gap, then any reduction in the liquidity of the market in that debt instrument is a threat to the bank's liquidity. It is in this manner that market liquidity and funding liquidity are two sides of the same coin. The Banque de France delivered an admonishing analysis of the use of debt markets to close the funding gap by French institutions in a recent Stability Review:

In a context of low interest rates, the relatively abundant liquidity conditions have spurred investors to seek high yield investments. Consequently, the French structured finance market recorded strong growth in 2004 and the amount of risk transfer transactions doubled compared with 2003. French banks made a significant contribution, in particular as asset managers, to the rebound in synthetic arbitrage CDO activity. French banking groups could be adversely affected by a possible downswing in those segments that have been particularly profitable over the past few years. This adjustment could cause liquidity to dry up on these markets (emerging financial markets, speculative grade corporate bonds, structured products, etc.) and damage the risk profile of banks' trading books.32

32 Banque de France Financial Stability Review No. 6 June 2005


A Liquid Market and Liquidity A market is liquid when buyers are broadly balanced by sellers. In a liquid market, the difference between buy and sell prices is small (e.g. bid-ask spreads are “tight”), the size of the transactions that can be absorbed without affecting prices is large (i.e. there is “depth”), the speed of execution is high (i.e. there is “immediacy”) and prices quickly return to “normal” after temporary order imbalances (i.e. there is “resilience”). For example the relative liquidity of the bond and CDS markets is very different. But the difference is not systematic across maturities. Typically, the CDS market is most liquid at the five-year maturity, followed by the three-year then 10- and one-year points. The bond market is usually liquid wherever on the yield curve the largest outstanding notional amounts lie. Not only is this likely to be at different maturities for different issuers, but it will shift over time for a given issuer as the bonds age.33 Typically, the higher the liquidity in the two markets, the more bonds/CDS maturities are quoted. The indicator used is simply the number of CDS prices minus the number of bonds quoted on any given day, a rising number indicating relatively more liquidity in the CDS market. A more liquid market is usually associated with a tight bid-ask spread. This is just one dimension of liquidity, but it has the clear advantage that it is the easiest to observe. The indicator used is relative spreads (CDS minus bonds), with rising numbers indicating lower relative liquidity in the CDS market.

33 What central banks can learn about default risk from credit markets, Ian W Marsh, Bank of England, BIS Papers No 12, http://www.bis.org/publ/bppdf/bispap12o.pdf

Econometric modeling indicates relative bond market liquidity at horizons between two and three years and in excess of five years. The CDS market is relatively more liquid below two years and around the five-year mark. OTC derivatives contracts tend to be less standardized than exchange traded contracts, which gives rise to difficulties in trading the contracts, particularly in volatile conditions. This creates a liquidity risk associated with these OTC derivatives. 27
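The two simple indicators described in the preceding paragraphs can be sketched as follows; the daily quote counts and spreads are invented inputs, not market observations.

```python
# Minimal sketch: two relative-liquidity indicators for the CDS and bonds of one issuer.
# The daily quote counts and bid-ask spreads below are invented, illustrative inputs.

def quote_count_indicator(n_cds_quotes, n_bond_quotes):
    """CDS maturities quoted minus bonds quoted; a rising number suggests relatively more CDS liquidity."""
    return n_cds_quotes - n_bond_quotes

def relative_spread_indicator(cds_bid_ask, bond_bid_ask):
    """CDS bid-ask minus bond bid-ask; a rising number suggests relatively less CDS liquidity."""
    return cds_bid_ask - bond_bid_ask

if __name__ == "__main__":
    print("quote-count indicator:", quote_count_indicator(n_cds_quotes=6, n_bond_quotes=4))
    print("relative spread indicator (bp):",
          relative_spread_indicator(cds_bid_ask=8.0, bond_bid_ask=5.0))
```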

Market Microstructure

The theoretical literature on market functioning and the determinants of market liquidity, has predominantly focused on the behaviour of markets under normal conditions. Thus, it has paid particular attention to one specific type of asymmetric information, where some traders are assumed to know more about the value of the asset traded than others. It has also addressed questions concerning the link between the release of pre-trade and post-trade information and market liquidity. Never before in history have markets remotely approached their current breadth, depth and richness, as exemplified by the extraordinary spectrum of financial instruments traded and the unprecedented volume of transactions. The recent development of credit derivatives, for instance, could yet herald a qualitative change in the way financial systems intermediate funds and allocate risks, and represents just the latest addition to a bewildering variety of derivative instruments. Similarly, daily turnover in markets nowadays amounts to huge multiples of GDP.34

34 Market distress and vanishing liquidity anatomy and policy options, BIS Working Papers No 158, by Claudio Borio, July 2004, http://www.bis.org/publ/work158.htm


Distress is vanishing Liquidity

At the heart of market distress is vanishing liquidity. Under stress, risk management practices, funding liquidity constraints and, in the most severe cases, concerns with counterparty risk become critical. The dynamics of distress are not so much the result of extraneous large unexpected untoward events ("shocks") that hit financial markets, as it were, from outside (i.e., that are "exogenous"). Rather, they often result from the collective behaviour of market participants, which sows the seeds of, and subsequently amplifies, the market turbulence. In this sense, risk is fundamentally "endogenous". 34 A market in distress is defined as one that experiences a sudden and substantial reduction in its liquidity. The graphic below illustrates the turning climate for euro area short interest rates right now. 30

Three Phases of Distress

Build Up. Balance sheets of institutions become overextended through the accumulation of risk exposure in relation to the ability to absorb that risk. In other words, they are preceded by the accumulation of "leverage" in relation to the corresponding market risk factor.

Eruption. Volatility rises and liquidity evaporates; the dynamics of market distress take on a life of their own. They are largely driven by the interaction of risk management systems, funding liquidity constraints and, possibly, heightened concerns with counterparty risk. Their net effect is to undermine either the ability or the willingness to trade. The surge in hedging and trading activity naturally generates large and highly variable demands on the cash flows ("cash or funding liquidity") required to complete transactions. Difficulties in raising the corresponding cash liquidity can further exacerbate market distress, by precipitating distress sales and closures of positions.

Aftermath. Severe dislocations leave a legacy of reduced liquidity in the market segments affected and higher liquidity premia in asset prices. Scars take time to heal, especially if market-makers experience severe losses and doubts arise about the profitability of trading strategies, or the validity of hedging practices, that were directly or indirectly providing liquidity to the market during the build-up phase.


Institutional and Market distress.

The processes that lead institutions to become overstretched, while at the same time masking the signs of rising risk, are self reinforcing. During the boom phase rising asset prices, loosening external financing constraints and profits feed on each other and disguise the overextension in balance sheets. The specific trigger and timing of the reversal is rather unpredictable, just as in the case of market distress. But when it comes, the processes go into reverse. If the system has not built up sufficient defences during the upswing, the subsequent contraction can result in serious strains on institutions. Liquidity constraints can add to the strains on solvency, by precipitating distress asset sales and the need to retrench from lending. In addition, difficulties in distinguishing sound from unsound banks, not least owing to the web of contractual relationships that ties them together, can help to generalize the liquidity withdrawal. And the process has certain self-fulfilling aspects to it: concerns about delaying the withdrawal of funds precipitate their withdrawal. In the case of markets, exactly the same factors are at work. A tightening of liquidity constraints and concerns with the creditworthiness of counterparties (credit risk) are precisely what underpins market distress. 34

Allowing a large maturity mismatch on a bank's books to go un-hedged could conceivably have adverse consequences. Suppose a sharp rise in interest rates occurred as a result of a sudden manifestation of inflation. The bank could, by virtue of a maturity mismatch, be committed to funding loans for a period of time into the future at the earlier, lower interest rate from more expensive deposits, which it must accept at the new, higher interest rate. This will have an adverse effect on the bank's profits and capital ratio and increase the likelihood of insolvency. This creates the risk that the bank in question could find itself in a position of poor liquidity, causing it to default on its payment obligations. Such a shock to an individual credit institution could propagate through the rest of the financial system through contagion, whereby other credit institutions suffer loss resulting from their claims on customers of the defaulting credit institution, or perhaps through interbank lending with that particular credit institution. This possible sequence of events would jeopardize the financial stability of the economy. 27


Supervisory Responses: LOLR and ELA.

A simple decision rule for Central Bank liquidity provision During a time of distress (severely constrained liquidity from a market perspective; we are not considering "runs on banks" here, in the sense of individual customers rushing to withdraw deposits), it may be the case that an institution's capital adequacy is no longer satisfied in the immediate term; the institution, in partnership with the supervisor, must then take steps to return to that capital adequacy position. It is interesting to note that in the French regulatory environment the Tier One Capital ratio is referred to as the Tier One Solvency ratio. An institution may be illiquid and solvent at the same time. In general there is a decision rule for the supervisory authorities over whether or not to offer assistance to an individual institution. This decision is based upon whether or not the institution can be regarded as having a positive net worth: if so, then the central bank can provide liquidity while the institution sells liquid and then illiquid assets to return to a position of maintaining capital adequacy. If the institution has a net worth of zero or less, then liquidation of the institution could be the consequence.
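The decision rule described above can be summarized in a short sketch; the net worth test and the responses are a simplified illustration of the logic only, not a statement of any supervisor's actual procedure.

```python
# Minimal sketch of the simple LOLR decision rule described above.
# An illustration of the logic only, not any supervisor's actual procedure.

def lolr_decision(net_worth: float, meets_liquidity_needs: bool) -> str:
    """Return the stylized supervisory response for a distressed institution."""
    if meets_liquidity_needs:
        return "no emergency assistance required"
    if net_worth > 0:
        # Solvent but illiquid: provide liquidity while the institution sells
        # liquid and then illiquid assets to restore capital adequacy.
        return "provide emergency liquidity assistance (ELA)"
    # Net worth zero or negative: liquidation may be the consequence.
    return "assistance withheld; liquidation may follow"

if __name__ == "__main__":
    print(lolr_decision(net_worth=250.0, meets_liquidity_needs=False))
    print(lolr_decision(net_worth=-40.0, meets_liquidity_needs=False))
```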

The Central Bank and the Market The following comment by Sir Andrew Large in his speech on Liquidity Risk referred above28 makes clear the nature of the relationship between the Central Bank as ‘Lender of Last Resort’ (LOLR) to both the distressed institution and to the market microstructure (the microprudential responsibility).

Under stressed conditions, and with the associated uncertainties, attempts by a perfectly sound bank to borrow unusually large amounts from the market, even against good quality collateral, have the potential to raise, or exacerbate, doubts about that bank’s solvency. But a solvent bank that is in need of liquidity can safely reveal its need to the central bank without precipitating a crisis in market confidence.

This goes to the heart of understanding Liquidity Risk where the liquidity of a financial instrument is understood at the limit as the biggest size of a trade that does not move the market price. If a trade of a bigger size is transacted, then that trade at the margin changes the market price. That marginal trade in conditions of distress may also impact market sentiment.


The Psychology and Systems of Market Distress Several episodes of "market distress" (from the stock market crash of 1987 to the autumn 1998 market turbulence in fixed income markets, via the 1990 turbulence in high-yield and related markets centered on Drexel Burnham Lambert) evidence a well understood process in which the failure of standard techniques and the fear factor drive pathology in market behaviour. During these episodes, market liquidity suddenly evaporated, as signaled by disorderly adjustments in asset prices, a sharp increase in the costs of executing transactions and, in the most acute cases, a "seizing up" of markets.

The physiology of markets is by now rather well understood; their pathology much less so. 34

Illiquidity, which prevents dynamic portfolio insurance, makes nonsense of every assumption in a VaR model, which is predicated upon the assumption of "normality". When normality breaks down, the use of VaR models only serves to exacerbate the herd behaviour driven by fear, which in turn exacerbates the market distress.

Stress Testing Distress When evaluating what might happen to a portfolio of instruments which is maintained to ensure institutional liquidity, during the stress test the institution should consider all exposures to risk as a result of vanishing liquidity as a mathematical function of time. The actual risk as a result of lack of liquidity might be to the capital value of an instrument, or to the portfolio as a consequence of derivative movements, but the source of the risk is market illiquidity. The appraisal to be made during the stress test is one of taking into account the extra time and cost that would be necessary to hedge out the position in an orderly fashion, or how much the bid/offer would move over time to close out the position.

It is necessary, by modeling from past events, to understand and thus define a reasonable liquidation period for each product. That period varies by firm (e.g. one firm may consider a 60-90 day period as a reasonable time period for certain types of products or trading strategies, while another may consider one week to be the norm). 22 A comprehensive Stress Testing Programme is an essential supplement to a VaR model. Stress testing involves subjecting trading portfolios to unexpected but possible shocks in market or political conditions. This enables an institution to evaluate its capacity to absorb potentially large losses and to identify steps that it can take to reduce its risk and conserve capital. The move towards more regular stress testing is in part being driven by the market risk capital requirements, whereby banks using internal models will be required to submit the results of their stress testing scenarios to the Supervisor on a quarterly basis.35 There are difficulties in Stress Testing for Liquidity Risk. For example, a full stress test of market risks requires detailed data on positions and contracts that are neither publicly disclosed nor subject to regulatory reporting. Moreover, surveys reveal that individual banks themselves are not yet able to integrate stress testing of market, credit and liquidity risks systematically. 36

35 Calculated Risk: How banks make sure they stay off the Barings path, by Neil Hereford, Senior Analyst, Market Risk, Reserve Bank of Australia.
36 Bank of England Financial Stability Review, June 2005, Stress testing as a tool for assessing systemic risks; Philip Bunn, Macroprudential Risks Division, Alastair Cunningham and Mathias Drehmann, Financial Industry and Regulation Division, Bank of England.
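One way the liquidation-horizon and bid/offer appraisal described above is often sketched is in the spirit of liquidity-adjusted VaR approaches; this is not a method prescribed by the BCBS or by this paper, and all inputs below are illustrative assumptions.

```python
# Minimal sketch: scaling a 1-day VaR to a liquidation horizon and adding a
# bid/offer (exit cost) term, in the spirit of liquidity-adjusted VaR approaches.
# All inputs are illustrative assumptions.
import math

def liquidity_adjusted_var(var_1d, liquidation_days, position_value, bid_offer_spread):
    """var_1d: 1-day VaR; bid_offer_spread: proportional round-trip spread."""
    horizon_var = var_1d * math.sqrt(liquidation_days)    # square-root-of-time scaling
    exit_cost = 0.5 * bid_offer_spread * position_value   # half-spread paid to liquidate
    return horizon_var + exit_cost

if __name__ == "__main__":
    lvar = liquidity_adjusted_var(var_1d=1_000_000, liquidation_days=10,
                                  position_value=50_000_000, bid_offer_spread=0.004)
    print(f"liquidity-adjusted VaR: {lvar:,.0f}")
```

Stressing the liquidation period and the assumed spread widening then shows how the exposure grows as market liquidity vanishes.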


The Macro and Micro Prudential Imperative The Supervisor and the Central Bank, which are generally one and the same thing but may be different (e.g. the FSA, which functions with a memorandum of understanding with the Bank of England), have a microprudential responsibility to institutions, which is to ensure that the institutions can play their role in the market in a proper manner, and a macroprudential responsibility to markets, which is to ensure that markets function in an orderly manner. Under Basel II it is the Pillar 2 process of SRP (the Supervisory Review Process) which is the manner in which Central Banks and Supervisors discharge this responsibility. As we have argued previously in this paper, under this new supervisory regime the supervisors have effectively outsourced the process of micro review, which they used to undertake themselves, back to the institutions which they supervise. It is on the basis of comprehensive completion of the requirements of the SRP that microprudential decisions in times of distress are likely to be taken by Supervisory Authorities in the future. All that has changed is the manner in which the supervisory analytics are undertaken. The microprudential approach focuses primarily on the perspective of individual institutions and tends to take the market risks that they face as largely independent of their individual behaviour (exogenous). Its macroprudential counterpart adopts a more system-wide perspective and tends to stress that those risks are to a considerable extent the result of the collective behaviour of institutions (endogenous). Microprudential efforts are very well advanced; macroprudential ones are just in their infancy.

The Microprudential Perspective Financial institutions, especially those acting as market-makers, should operate with sufficient safety cushions in terms of capital and liquidity so as to be able to absorb market strains without seeing their soundness endangered. They should not assume ex ante the existence of liquid markets in which to hedge and lay off risks. And they should have sufficient information about the market participants with whom they transact to be fully aware of the risks they incur in the process. The limitations of purely backward-looking and mechanical measures of risk, such as VaR, have become better appreciated. For instance, VaR outputs are nowadays used only as one source of information and not necessarily as binding constraints on positions, other than when regulatory minima become binding. Correspondingly, the use of stress-testing techniques has been strongly encouraged.


The Macroprudential Perspective Market distress will depend not so much on the risk profile of individual institutions but on the extent to which these institutions share similar exposures, i.e. on the correlation of exposures across them. Leading indicators of distress would seek to develop probabilistic statements about the likelihood of its emergence. However, except in the narrow and rather specific context of exchange rate crises, there is no extant work in this area. Conceptual and informational constraints remain daunting. None of the efforts at aggregate stress tests in Central Banks has so far been able to take feedback effects into account in a meaningful way. 34 Improving the safeguards against instability for a financial system that is larger and more interconnected, and where the endogenous component of risk is more prominent, naturally calls for a strengthening of the macroprudential orientation of prudential frameworks. After all, it is now well accepted that a system-wide perspective and a focus on the endogenous component of risk are precisely the distinguishing features of such an approach. This "macro" orientation requires a shift away from the notion that the stability of the system is simply a consequence of the soundness of its individual components. It involves the same shift in focus that a stock analyst is required to make in order to become a portfolio manager. In evaluating financial system vulnerabilities, a macroprudential approach would focus on the commonality in the risk exposures of the different segments of the financial system.37

37 Speech by Malcolm Knight, General Manager of the BIS, at the 25th SUERF Colloquium in Madrid, 14 October 2004. Markets and institutions: Managing the evolving financial risk http://www.bis.org/speeches/sp041014.htm

Macroprudence – Supervisory Convergence Both insurance supervisors and banking supervisors are becoming increasingly aware of the need to address risks also on a system-wide, sometimes referred to as the “macroprudential”, basis. Convergence and the greater ease with which risks can now be shifted across sectors, not least to exploit regulatory differences, have put a premium on this type of analysis, which focuses on systemic vulnerabilities. An obvious example of this new focus is the set of studies done on the impact of credit risk transfer instruments on the allocation of credit risk across the financial system, including work by the Joint Forum. Another example is the IAIS work under way aimed at improving transparency in the reinsurance sector - a sector which, owing to its potential concentration of risks and its strategic role, can be of major significance for systemic stability and is still largely unsupervised in some countries. More generally, participation of the IAIS in the Financial Stability Forum alongside other international regulators, international financial institutions, central banks and finance ministries has allowed them to take an active part in regular macroprudential assessments of the global financial system.38

38 Regulation and supervision in insurance and banking: Greater convergence, shared challenges Speech by Malcolm Knight, General Manager of the BIS, at the IAIS 11th Annual Conference in Amman, Jordan, 6 October 2004, http://www.bis.org/speeches/sp041006.htm


Hedge Funds and “Basel 2.5” In a recent Financial Stability Review the Banque de France discussed the appropriate locus of the French Banking System’s exposure to Hedge Funds in terms of “Basel 2.5”.

“In an environment of relatively abundant liquidity and extremely low interest rates, banks have continued to favour the highest yielding market segments and encouraged the development of complex financial products such as structured products (particularly CDOs). At the same time, credit institutions have increased their direct exposure vis-à-vis hedge funds, whose investment strategies are particularly risky given that they concentrate their positions on narrow markets with uncertain liquidity. Against this backdrop, the banking groups most active in these areas have been encouraged by the supervisory authorities to put in place appropriate risk monitoring systems and procedures (crisis scenarios, setting of limits, notably on an aggregated basis). These transactions generate significant liquidity risk and valuation difficulties. Moreover, the discussions within the framework of “Basel 2.5” favour the idea of outstandings linked to activities with hedge funds being recorded in the banking book (rather than the trading book), on account of their being relatively illiquid”. 39

LOLR and ELA.

The Supervisor (or the Central Bank, in times of distress they will function as one) is The Lender of Last Resort (LOLR) and thus the provider of ELA (Emergency Liquidity Assistance). We have presented above a simple decision rule which a Supervisor might deploy in making LOLR decisions which are decisions to provide ELA. This process could be viewed as the ‘end of the line’ of Supervisory Review and it is therefore appropriate to conclude our discussion of Pillar 2 with a brief examination of LOLR and ELA decision rules.

39 Banque de France Financial Stability Review No. 7 November 2005

The Central Bank is just one player in a broader safety net, along with, for example, the Supervisor and possibly the Department of Finance of the (local) central government; the respective roles do differ considerably. There is an incipient trend towards the loss of supervisory responsibilities for Central Banks per se; Central Banks therefore become one step further removed from sources of information. The changing nature of liquidity distress, from "bank runs" to "market runs", given that market distress can have systemic implications, has implications for the information needs of the LOLR (the Central Bank). Borio identifies three stylized types of ELA provision to institutions (of which the first two are relevant for our summary discussion):

Type 1 (“prudent micro-prudential”): ELA if and only if the institution is solvent but illiquid.

Type 2 (“prudent macro-prudential”): ELA as in Type 1, plus only if the institution presents a systemic risk to the domestic system of supervisory competence.

Prudential information is necessary for the LOLR to make ELA decisions in distress types 1 and 2, but it is not sufficient for type 2. Further information would be necessary on the system-wide distribution of exposures and the potential behavioural responses of other market participants to support the LOLR decision on provision of ELA. Note, however, the importance of Supervisory information in supporting the decision in either the macroprudential or the microprudential case of a requirement for ELA.40

40 How much financial strength information do central banks need? By Claudio Borio, Head of Research and Policy Analysis Bank for International Settlements, Basel, Presentation for Norges Bank Conference on “Banking Crisis Resolution – Theory and Policy” Oslo, 16-17 June 2005 Norway


CONCLUSION We have tried to present in this paper not only how to solve the problems of the challenge of the new regime of (Financial) Institutional Supervision, but also why this new regime is being enforced now and what it is that supervisors wish their clients to do. However, we do understand that our clients (who are effectively the supervisors' clients) are also asking themselves "but what do I get out of all of this?", and we agree that this is the right question to ask. We at SAP believe that the correct thing to do right now is to engage and commit to the challenge of the new supervisory process. In so doing, the financial institution can take the first step on the road towards fully integrated Enterprise Risk Management (ERM). That is the real pay-back from the investment in systems and organizational change necessary to undertake the requirements of P2 now. P2 is in any case taking steps towards ERM, in particular by requiring a Risk Management philosophy and organizational structure which takes a holistic view of risk across the institution, irrespective of the differences in practice between business units and geographical locations. That is why we have developed a quantitative approach and a product solution architecture which support common techniques for both credit risk and market risk, for both the trading book and the banking book. We have referred, sometimes lightly, to "Basel III" (some readers will be dismayed), but we believe there are clear indications that further enhancements of the supervisory requirements are the inevitable logical consequence of some of the compromises (e.g. "the factor of three") made to get B2 off the ground. We see this particularly in securitization, trading book E.Cap, liquidity risk modeling and the supervision of further product innovation off balance sheet. That is why we are confident that the right thing to do now is to deploy the SAP Capital Adequacy Solution. The graphic on the right illustrates the complexity vector towards integrated front-to-back Credit Risk beyond B2.

A step on the path towards true ERM. There is one other aspect to this conclusion, and that is the path towards ERM. It is challenging enough now to deploy the factor modeling technique to support holistic modeling of risk, but from a purist modeling perspective one can still see limitations to it. An example from the trading book (TB) will illustrate. The stress of E.Cap in the TB is a stress of the static asset allocation of the TB at the commencement of the period, to which plausible stress scenarios are applied (at an aggregated level). A professional trader will argue that this requirement is totally unrealistic (as would capital markets theory), since the efficient market will deploy every signal, every piece of data, to detect the onset of stress and adjust the asset allocation instantaneously. Quite right! This is true both theoretically and from an applied perspective. The same conditional logic applies to a rating applied by an ECAI (External Credit Assessment Institution, or Rating Agency) to a securitized instrument: the real value of that instrument is moving around daily, and only an internal rating computed frequently using a factor model can appraise E.Cap for that instrument realistically.


What is the answer? Well, it is simulation, isn't it? The basis of true ERM has to be the ability of the institution to truly simulate its own adaptive processes in response to market and macroeconomic events. To set the foundation for simulative capacity one first has to have factor analytics in place, because simulation of adaptive processes is a logical extension of the factor analytics, which in turn underpin that simulative, logical-time, dynamic modeling. One has to have the analytics in place before one can begin to simulate the adaptive logic of the enterprise through a multi-step process model. If you view the simulated model of the institution like a train set, then without the rails the train set has nothing to run on. The rails are the factor models you develop for P2. To get to true ERM, the SAP Capital Adequacy Solution and its optimized support for Factor Modeling is the right place to start.


REFERENCES

1. Extracts from the minutes of the FSA Pillar 2 standing committee meetings from the autumn and winter of 2005/6, available at http://www.fsa.gov.uk/pages/Library/Other_publications/EU/minutes_of_industry_group_meetings/p2sg/index.shtml

2. http://www.bis.org/publ/work165.htm; Stress-testing financial systems: an overview of current methodologies, by Marco Sorge, Working Papers No. 165, December 2004.

3. http://www.bis.org/publ/work180.pdf; Accounting, prudential regulation and financial stability: elements of a synthesis, by Claudio Borio and Kostas Tsatsaronis

4. Basel II: The Key Components and Challenges of Pillar 2, World Bank/IMF/Federal Reserve Seminar for Senior Bank Supervisors from Emerging Economies, Washington, D.C., 17 October 2005; Elizabeth Roberts, Director, Financial Stability Institute

5. Governance and Structure of European Finance after EU enlargement, 9 March 2005, Panel on Banking and Financial Integration in Europe: the evolving rules, José María Roldán, Chairman of the Committee of European Banking Supervisors

6. Implementation of Supervisory Review Process (Pillar 2), London, 16 November 2005; Kerstin af Jochnick, Committee of European Banking Supervisors

7. CEBS Consultation Paper Application of the Supervisory Review Process under Pillar 2 (CP03 revised), June 2005

8. The Benefits and Challenges of Implementation of Basel II in Europe; José María Roldán, 27 September 2005, CEBS

9. http://www.bis.org/publ/bppdf/bispap22t.pdf; Macro stress tests of UK banks, Glenn Hoggarth, Andrew Logan and Lea Zicchino; Bank of England.

10. http://www.bis.org/bcbs/events/rtf05Drehmann.pdf; A Market Based Macro Stress Test for the Corporate Credit Exposures of UK Banks, by Mathias Drehmann, Bank of England, April 2005

11. “The Stress Testing of Irish Credit Institutions” by Andrew Mawdsley, Maurice McGuire and Nuala O’Donnell. Central Bank of Ireland, Financial Stability Report 2004

12. Quantitative Risk Management: Concepts, Techniques, and Tools, Alexander J. McNeil, Rüdiger Frey, and Paul Embrechts

13. Credit Risk: Pricing, Measurement, and Management, Darrell Duffie and Kenneth J. Singleton

14. http://www.bis.org/publ/cgfs24.pdf; Stress testing at major financial institutions: survey results and practice, report by a working group established by the Committee on the Global Financial System, January 2005

15. http://rss.acs.unt.edu/Rdoc/library/repeated/html/glmm.html, Thanks to Rich Herrington, MS, PhD, University of North Texas.

16. Model Foundations for the Supervisory Formula Approach, Michael B. Gordy, Board of Governors of the Federal Reserve System, July 2004

17. Basel 2 and Securitisation: A Paradigm Shift, J P Morgan, European Securitized Products Research, 30 January 2006

18. Framework for Supervisory Information about Derivatives and Trading Activities, Joint Report by the Basle Committee on Banking Supervision and the Technical Committee of the International Organization of Securities Commissions (“IOSCO”), September 1998; http://www.bis.org/publ/bcbs39.pdf

19. “Corporate Debt Market Acts As A Barometer”, By Paul J Davies; Financial Times; Nov 22, 2005

20. The Application of Basel II to Trading Activities and the Treatment of Double Default Effects, April 2005; http://www.bis.org/publ/bcbs111.htm

21. Basel Committee on Banking Supervision, Trading Book Survey: A Summary of Responses, April 2005; http://www.bis.org/publ/bcbs112.pdf

22. SOURCE: www.Finextra.com

23. FSA Discussion Paper 05/2, 'Stress testing', published May 2005; http://www.fsa.gov.uk/pages/library/policy/dp/2005/05_02.shtml

24. BCBS, Principles for the management and supervision of interest rate risk, July 2004; http://www.bis.org/publ/bcbs108.htm

25. Shane Whelan, University College Dublin (UCD)

26. Interest-Rate-Related Derivatives Growth at Credit Institutions in Ireland, by David Doran, Financial Stability Report 2004, Central Bank and Financial Services Authority of Ireland

27. Speech by Sir Andrew Large, Deputy Governor of the Bank of England (Financial Stability), 'Managing Liquidity Risk in a Global System', the Fourteenth City of London Central Banking and Regulatory Conference, London, 28 November 2005. Published in the BIS Review 7/9/2005

28. Persaud, A. (2000). Sending the Herd off the Cliff Edge: The disturbing interaction between herding and market-sensitive risk management practices. State Street Bank.

29. Recent Developments in Asset Prices and Liquidity in the Context of an Evolving Relationship, Frank Browne, David Cronin and Edward J. O’Brien, CBFSAI (Central Bank and Financial Services Authority of Ireland), Financial Stability Report 2005.

30. Bank of England, Financial Stability Review: December 2005; The Financial Stability Conjuncture And Outlook.

31. Banque de France Financial Stability Review No. 6, June 2005

32. What central banks can learn about default risk from credit markets, Ian W Marsh, Bank of England, BIS Papers No 12; http://www.bis.org/publ/bppdf/bispap12o.pdf

33. Market distress and vanishing liquidity: anatomy and policy options, BIS Working Papers No 158, by Claudio Borio, July 2004; http://www.bis.org/publ/work158.htm

34. Calculated Risk: How banks make sure they stay off the Barings path, by Neil Hereford, Senior Analyst, Market Risk, Reserve Bank of Australia

35. Bank of England Financial Stability Review, June 2005

36. Stress testing as a tool for assessing systemic risks, by Philip Bunn, Macroprudential Risks Division, Alastair Cunningham and Mathias Drehmann, Financial Industry and Regulation Division, Bank of England


37. Speech by Malcolm Knight, General Manager of the BIS, at the 25th SUERF Colloquium in Madrid, 14 October 2004. Markets and institutions: Managing the evolving financial risk, http://www.bis.org/speeches/sp041014.htm

38. Regulation and supervision in insurance and banking: Greater convergence, shared challenges Speech by Malcolm Knight, General Manager of the BIS, at the IAIS 11th Annual Conference in Amman, Jordan, 6 October 2004, http://www.bis.org/speeches/sp041006.htm

39. Banque de France Financial Stability Review No. 7, November 2005

40. How much financial strength information do central banks need? By Claudio Borio, Head of Research and Policy Analysis, Bank for International Settlements, Basel; presentation for the Norges Bank Conference on "Banking Crisis Resolution – Theory and Policy", Oslo, 16-17 June 2005, Norway


ACKNOWLEDGEMENTS

SAP B2P2 Initiative: Design & Direction

Les Biscomb, Andy Milner, Nigel Trapp & Thorsten Zapf

Document QA and Methodology

Professor Alex J McNeil (ETH Zurich)

Document Review

Colin Burnside, Michael Adam, James Devern, Ronan Gavin and Fiona Lunny

Marketing & Graphics

Sarah-Jane Hamlyn, Paul Johnson, Mark Taylor and Mark Jago

Co-Author, SAP CAS Solution Architecture.

Dr. Nick Illingworth

Author

J A Morrison Dublin, April 2006


CONTENTS

EXECUTIVE SUMMARY .......... 3
'OUTSOURCING' SUPERVISORY TECHNIQUES .......... 3
SAP BANK ANALYZER .......... 4
THE STATE OF THE ART OF RISK MANAGEMENT .......... 4
QUANTITATIVE RISK MANAGEMENT (QRM) .......... 5
PILLAR 2 AND ORGANIZATIONAL CHANGE .......... 6
THE BENEFITS OF BASEL II .......... 6
THE STRUCTURE OF THIS DOCUMENT .......... 7
A 'WHITE AND GREEN' PAPER .......... 7
FOCUS ON PILLAR TWO .......... 9
PILLAR 2 & PILLAR 1 .......... 10
A WHITE PAPER ABOUT P2 .......... 10
REVIEW OF DISCUSSIONS OF PILLAR 2 04/05 .......... 11
THE BASEL II INITIATIVE IN CONTEXT .......... 12
THE SYNERGY OF MATHEMATICS AND TECHNOLOGY .......... 12
QUANTITATIVE TECHNIQUES AND ORGANIZATIONAL CHANGE .......... 13
THE INADEQUACIES OF BASEL I .......... 14
THE ON & OFF BALANCE SHEET DISTINCTION .......... 15
BCBS AND CEBS – THE REQUIREMENT .......... 16
PILLAR 2 - NECESSARY TO IMPROVE RISK MANAGEMENT .......... 17
THE POSITION OF CEBS AT THE END OF 2005 .......... 18
THE STRESS TEST (ST) .......... 21
MODELLING PILLAR 2 .......... 23
QUANTITATIVE TECHNIQUES .......... 24
THE SUPERVISORY EQUATIONS .......... 24
A MODEL FOR A SINGLE FINANCIAL INSTITUTION .......... 25
THE SRP AND THE SREP .......... 25
STRESS TESTING METHODOLOGIES .......... 27
FACTOR MODELING .......... 28
FACTOR MODELS IN CONTEXT – CREDIT RISK .......... 28
FACTOR MODELS IN MARKET RISK .......... 28
FACTOR MODELS OF CREDIT AND MARKET RISK .......... 29
FACTOR MODELS – ALIGNMENT WITH BIS AND CEBS .......... 29
THE MATHS OF A FACTOR MODEL OF CREDIT RISK .......... 30
A QUANTITATIVE METHODOLOGY .......... 31
MODELING TECHNIQUES FOR THE STRESS TEST .......... 31
SAP CAS SOLUTION ARCHITECTURE .......... 32
STRATEGIC FUNCTION AND PRESSING NEED .......... 33
SUMMARY OF THE P2 BUSINESS REQUIREMENTS .......... 34
THE PROCESS ARCHITECTURE FOR PILLAR 2 .......... 36
PILLAR 2 REPORTS .......... 41
A LOGICAL ARCHITECTURE FOR PILLAR 2 .......... 43
STATISTICAL MODELING AND ANALYTICS IN THE SAP CAPITAL ADEQUACY SOLUTION .......... 46
THE SAP CAPITAL ADEQUACY SOLUTION FUNCTIONAL OVERVIEW .......... 53
RISK TYPES AND THE USER INTERFACE .......... 53
THE P2 MODELING CYCLE WITH THE SAP CAS .......... 54
SAP BANK ANALYZER AND P2 .......... 55
SAP BANK ANALYZER & CREDIT RISK .......... 57
THE SAP CAPITAL ADEQUACY SOLUTION FOR CURRENT SAP BANK ANALYZER CLIENTS .......... 58
THE SAP CAPITAL ADEQUACY SOLUTION FOR NEW SAP BANK ANALYZER CLIENTS .......... 58
THE BENEFITS OF IMPLEMENTING THE SAP CAS .......... 59
THE CALCULUS OF STRUCTURED FINANCE .......... 60
INTRODUCTION TO STRUCTURED FINANCE .......... 61
OFF BALANCE SHEET SECURITIZATION .......... 61
THE CALCULUS OF SECURITIZATION .......... 62
THE REAL-WORLD COMPLEXITY OF STRUCTURED FINANCE .......... 63
CONSTRAINTS IN THE METHODOLOGY PHASE OF INSTRUMENT STRUCTURE .......... 64
SECURITIZATION METHODOLOGY AND TRANSACTION STRUCTURE .......... 64
CREDIT ENHANCEMENT (CE) & SUBORDINATION .......... 66
THE WATERFALL CASH-FLOW .......... 67
LESSONS FROM THE CAPITAL TREATMENT OF STRUCTURED INSTRUMENTS .......... 68
THE REQUIREMENT TO MODEL UNCERTAINTY .......... 69
STRUCTURED INSTRUMENTS AND THE STRESS TEST .......... 70
MARKET RISK A P2 PERSPECTIVE .......... 72
DEFINITIONS OF MARKET RISK .......... 73
MARKET RISK LOCUS IN THE FRONT OFFICE .......... 74
THE TRADING BOOK DEFINED BY INTENT .......... 74
THE BCBS & IOSCO JOINT GROUP .......... 75
FAIR VALUE OF POSITIONS IN THE TRADING BOOK .......... 76
TRADING POSITIONS FOR WHICH NO ACTIVE REFERENCE MARKET EXISTS .......... 77
CREDIT DERIVATIVES HEDGE NON-TRADING POSITIONS .......... 79
THE DISTINCTION OF THE TRADING AND BANKING BOOKS .......... 79
CREDIT DEFAULT SWAPS .......... 80
VARIANTS OF CREDIT DERIVATIVE; SYNTHETIC TRANSACTIONS .......... 80
ISDA (THE INTERNATIONAL SWAPS AND DERIVATIVES ASSOCIATION) .......... 81
OPERATIONAL RISKS IN CREDIT DERIVATIVES .......... 81
CONDITIONS UPON IRB IN THE TRADING BOOK .......... 82
INTEREST RATE RISK .......... 84
B2 TREATMENT OF IRR .......... 85
DEFINITIONS OF IRR .......... 86
THE TERM STRUCTURE OF INTEREST RATES .......... 87
APPLIED MODELS OF INTEREST RATE RISK .......... 91
IR PORTFOLIO MANAGEMENT STRATEGIES .......... 92
THE RE-PRICING & THE MATURITY MODEL .......... 93
IR SENSITIVE ASSETS AND TRANSACTIONS .......... 94
DURATION, MATURITY AND RE-PRICING .......... 94
DURATION .......... 95
DURATION: SPECIAL FOCUS IN MORTGAGES .......... 95
MACAULAY DURATION .......... 96
MODIFIED DURATION .......... 96
DURATION – METHODOLOGIES FOR MONITORING IRR .......... 96
IRR, SIMULATION AND STRESS TESTING .......... 99
THE STANDARDIZED RATE SHOCK (BCBS) .......... 99
INTEREST RATE DERIVATIVES & HULL AND WHITE .......... 101
NO-ARBITRAGE TERM STRUCTURE MODELS .......... 102
LIQUIDITY RISK .......... 104
THE IMPORTANCE OF LIQUIDITY RISK .......... 105
LIQUIDITY DISAPPEARANCE AND CONTAGION .......... 105
TYPES OF LIQUIDITY RISK .......... 106
THE JOINT FORUM DEFINITIONS OF TYPES OF LIQUIDITY RISK .......... 107
THE BANK OF ENGLAND VIEW OF THE DEFINITION OF LIQUIDITY RISK .......... 107
THE CUSTOMER FUNDING GAP .......... 108
FUNDING AND MARKET LIQUIDITY .......... 108
MARKET MICROSTRUCTURE .......... 109
DISTRESS IS VANISHING LIQUIDITY .......... 110
THREE PHASES OF DISTRESS .......... 110
INSTITUTIONAL AND MARKET DISTRESS .......... 111
SUPERVISORY RESPONSES: LOLR AND ELA .......... 112
LOLR AND ELA .......... 116
CONCLUSION .......... 117
REFERENCES .......... 119
ACKNOWLEDGEMENTS .......... 121
SAP B2P2 INITIATIVE: DESIGN & DIRECTION .......... 121
DOCUMENT QA AND METHODOLOGY .......... 121
DOCUMENT REVIEW .......... 121
MARKETING & GRAPHICS .......... 121
CO-AUTHOR, SAP CAS SOLUTION ARCHITECTURE .......... 121
AUTHOR .......... 121