
  • 1. Validation of Economic Capital Models: State of the Practice, Supervisory Expectations and Results from a Bank Study
    Michael Jacobs, Ph.D., CFA
    Senior Economist / Credit Risk Analysis Division, U.S. Office of the Comptroller of the Currency
    Risk Conference on Economic Capital, February 2010
    The views expressed herein are those of the author and do not necessarily represent the views of the Office of the Comptroller of the Currency or the Department of the Treasury.
  • 2. Outline
    - Introduction, Background and Motivation
    - Fitness for Use of Economic Capital (EC) Models
    - Providing Confidence Regarding EC Model Assumptions
    - Assessing the Value of Validation Methodologies
      - Qualitative Approaches
        - Use Testing
        - Data Quality Analysis
      - Quantitative Approaches
        - Validation of Inputs and Parameters
        - Model Replication and Benchmarking
        - Stress Testing
    - Technical Challenges in Testing the Accuracy of EC Models
      - The Tails of the Loss Distribution
      - Example: Alternative Models for Risk Aggregation
    - Effective Reporting of EC Model Outputs
      - Avoidance of Misuse and Misunderstanding of the EC Model
  • 3. Introduction, Background and Motivation
    - The validation of EC models is at a very preliminary stage
    - EC models can be complex, having many components, and it may not be immediately obvious that such a model works satisfactorily
    - Models may embody assumptions about relationships amongst, or behavior of, variables that may not always hold (e.g., under stress)
    - Validation can provide a degree of confidence that assumptions are appropriate, increasing the confidence of users in the model outputs
    - Additionally, validation can be useful in identifying the limitations of EC models (i.e., where embedded assumptions do not fit reality)
    - There exists a wide range of validation techniques, each providing evidence regarding only some of the desirable properties of a model
    - Such techniques are powerful in some areas (risk sensitivity) but not in others (accuracy - overall / absolute or in the tail of the distribution)
  • 4. Introduction, Background and Motivation (continued)
    - Used in combination, particularly with good controls and governance, a range of validation techniques can provide more substantial evidence for or against the performance of the model
    - There appears to be scope for the industry to improve the validation practices that shed light on the overall calibration of models, particularly in cases where assessment of overall capital is an important application of the model
  • 5. Fitness for Purpose of Economic Capital Models
    - In some cases the term validation is used exclusively to refer to statistical ex post validation (e.g., backtesting of a VaR model)
    - In other cases it is seen as a broader but still quantitative process that also incorporates evidence from the model development stage
    - Herein, "validation" is meant broadly, encompassing all the processes that provide evidence-based assessment of a model's fitness for purpose
    - This assessment might extend to the management and systems environment within which the model is operated
    - It is advisable that validation processes are designed alongside development of the models, rather than after development is complete
    - This interpretation of validation is consistent with the Basel Committee (2005) in relation to the Basel II Framework
      - However, that was phrased in terms of the IRB parameters and developed in the context of assessment of risk estimates for use in minimum capital requirements
      - Validation of an EC model differs from that of an IRB model, as the output is a distribution rather than a single predicted forecast against which actual outcomes may be compared
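As a concrete illustration of the narrow, statistical sense of validation mentioned above, the widely used Kupiec proportion-of-failures (POF) test compares the number of observed VaR exceptions with the number implied by the stated coverage level. A minimal sketch with illustrative figures (not from the presentation):

```python
# Kupiec POF backtest sketch: likelihood-ratio test of whether the observed
# VaR exception rate is consistent with the model's stated coverage.
import math

def kupiec_pof_statistic(n_obs, n_exceptions, coverage):
    """LR statistic for H0: exception rate == 1 - coverage.
    Asymptotically chi-squared with 1 degree of freedom under H0."""
    p = 1.0 - coverage                       # expected exception probability
    phat = n_exceptions / n_obs              # observed exception rate
    phat = min(max(phat, 1e-12), 1 - 1e-12)  # guard against log(0)
    log_l0 = (n_obs - n_exceptions) * math.log(1 - p) + n_exceptions * math.log(p)
    log_l1 = (n_obs - n_exceptions) * math.log(1 - phat) + n_exceptions * math.log(phat)
    return -2.0 * (log_l0 - log_l1)

# 250 trading days of a 99% VaR: ~2.5 exceptions expected, 3 observed.
lr = kupiec_pof_statistic(n_obs=250, n_exceptions=3, coverage=0.99)
reject = lr > 3.84  # 5% critical value of chi-squared(1)
```

With only 3 exceptions against roughly 2.5 expected, the statistic is small and the model is not rejected; a run of 15 exceptions over the same window would be.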
  • 6. Fitness for Purpose of EC Models (continued)
    - EC models are conceptually similar to VaR models, but several differences force validation methods to differ in practice from those used for VaR
      - Long time horizons, high confidence levels, and the scarcity of data
    - Full internal EC models are not used for Pillar 1 minimum capital requirements, so fitness for purpose needs to cover a range of uses
      - Most, and perhaps all, of these uses are internal to the firm in question
    - Note that EC models and regulatory capital serve different objectives and may reasonably differ in some details of implementation
    - BCBS's validation principle 1 refers to the predictive ability of credit rating systems, an emphasis on the performance of model forecasts
    - The natural evolution of this principle for EC is that validation is concerned with the predictive properties of those models
      - I.e., they embody forward-looking estimates of risk, and their validation involves assessing those estimates, so this related principle remains appropriate
    - Broadly interpreted, the validation processes set out herein all provide insight, in different ways, into the predictive ability of an EC model
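The contrast with market-risk VaR can be made concrete. The sketch below (illustrative parameters, not the bank study's model) computes economic capital as the 99.9% quantile of a simulated one-year credit loss distribution, less expected loss; note that only a handful of the 2,000 scenarios inform the tail estimate, which is exactly the data-scarcity problem the slide refers to:

```python
# Minimal sketch: economic capital as an extreme quantile of a simulated
# one-year credit loss distribution, less expected loss. Illustrative only.
import random
from statistics import NormalDist

random.seed(42)
norm = NormalDist()

def simulate_portfolio_losses(n_sims, n_obligors, pd_, lgd, ead, rho):
    """One-factor Gaussian model: an obligor defaults when its asset return,
    a mix of one systematic and one idiosyncratic normal factor, falls below
    the threshold implied by its unconditional default probability."""
    threshold = norm.inv_cdf(pd_)
    losses = []
    for _ in range(n_sims):
        z = random.gauss(0.0, 1.0)           # systematic factor for the year
        loss = 0.0
        for _ in range(n_obligors):
            eps = random.gauss(0.0, 1.0)     # idiosyncratic factor
            if (rho ** 0.5) * z + ((1 - rho) ** 0.5) * eps < threshold:
                loss += lgd * ead
        losses.append(loss)
    return losses

losses = sorted(simulate_portfolio_losses(2_000, 100, pd_=0.02, lgd=0.45,
                                          ead=1.0, rho=0.15))
var_999 = losses[int(0.999 * len(losses)) - 1]  # crude 99.9% quantile estimate
expected_loss = sum(losses) / len(losses)
economic_capital = var_999 - expected_loss      # capital above expected loss
```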
  • 7. Providing Confidence Regarding EC Model Assumptions
    - Properties of an EC model that can be assessed using powerful tools, and hence that are capable of robust assessment, include:
      - Integrity of model implementation
      - Grounding in historical experience
      - Sensitivity to risk and to the external environment
      - Good marginal properties
      - Rank ordering and relative quantification
    - Properties for which only weaker processes are available include:
      - Conceptual soundness
      - Degree to which the model is forward-looking
      - Absolute risk quantification
    - It is important to assess the power of individual tests and to acknowledge that views as to strength and weakness are likely to differ
  • 8. Providing Confidence Regarding EC Model Assumptions (cont.)
    - There is great difficulty in validating the conceptual soundness of an EC model due to the many untestable or hard-to-test assumptions made:
      - Family of statistical distributions for risk factors
      - Economic processes driving default or loss
      - Dependency structure among defaults or losses
      - Likely behavior of management or economic agents, and how these vary over time
    - Some EC models are risk aggregation models, in which estimates for individual risk categories are combined to generate a single risk figure
      - There may be no best or unique way to do this aggregation
    - Since many of these assumptions may be untestable, it may be impossible to be certain that a model is conceptually sound
    - While the conceptual underpinnings may appear coherent and plausible, they may in practice be no more than untested hypotheses
    - Opinions may reasonably differ about the strength or weakness of any particular process in respect of any given property
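One common, though by no means unique, way to perform the risk aggregation described above is the variance-covariance approach: standalone capital figures per risk type are combined through an assumed inter-risk correlation matrix, itself one of the hard-to-test assumptions the slide lists. A minimal sketch with invented figures and correlations:

```python
# Variance-covariance risk aggregation sketch: total capital = sqrt(c' R c),
# where c is the vector of standalone capitals and R an assumed inter-risk
# correlation matrix. All figures are illustrative.
import math

standalone = {"credit": 100.0, "market": 40.0, "operational": 30.0}
risks = list(standalone)
# Assumed correlations (symmetric, unit diagonal); a key untestable input.
corr = {
    ("credit", "credit"): 1.0, ("market", "market"): 1.0,
    ("operational", "operational"): 1.0,
    ("credit", "market"): 0.5, ("credit", "operational"): 0.2,
    ("market", "operational"): 0.1,
}

def rho(a, b):
    """Look up a correlation in either key order."""
    return corr.get((a, b)) or corr.get((b, a))

total_sq = sum(standalone[a] * standalone[b] * rho(a, b)
               for a in risks for b in risks)
diversified = math.sqrt(total_sq)
undiversified = sum(standalone.values())          # simple-sum alternative
diversification_benefit = undiversified - diversified
```

The diversified total is below the simple sum; how much below depends entirely on the assumed correlations, which is why the choice of aggregation method matters.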
  • 9. Validation of EC Models: Introduction to Range of Practice
    - While we will describe the types of validation processes that are in use or could be used, note that the list is not comprehensive
    - We do not suggest that all techniques should or could be used by banks
    - We wish to demonstrate that there is a wide range of techniques potentially covered by our broad definition of validation
    - This creates a layered approach: the more (fewer) of these layers that can be provided, the more (less) comfort that validation is able to provide evidence for or against the performance of the model
    - Each validation process provides evidence for (or against) only some of the desirable properties of a model
    - The list presented below moves from the more qualitative to the more quantitative validation processes, and the extent of use is briefly discussed
  • 10. Validation of EC Models: Range of Practice in Qualitative Approaches
    - The philosophy of the use test as incorporated into the Basel II Framework: if a bank is actually using its risk measurement system for internal purposes, then we can place more reliance on it
      - Applying the use test successfully will entail gaining a careful understanding of which model properties are being used and which are not
    - Banks tend to subject their models to some form of qualitative review process, which could entail:
      - Review of documentation or development work
      - Dialogue with model developers or model managers
      - Review and derivation of any formulae or algorithms
      - Comparison to other firms or with publicly available information
    - Qualitative review is best able to answer questions such as:
      - Does the model work in theory?
      - Does the model incorporate the right risk drivers?
      - Is any theory underpinning it conceptually well-founded?
      - Is the mathematics of the model right?
  • 11. Range of Practice in Qualitative Approaches to Validation (continued)
    - Extensive systems implementation testing is standard for production-level risk measurement systems prior to implementation
      - Such as user acceptance testing, checking of model code, etc.
      - These processes could be viewed as part of the overall validation effort, since they would assist in evaluating whether the model is implemented with integrity
    - Management oversight is the involvement of senior management in the validation process
      - E.g., reviewing output from the model and using the results in business decisions
      - Senior management should know how the model is used and how its outputs are interpreted
      - This should take account of the specific implementation framework adopted and the assumptions underlying the model and its parameterization
    - Data quality checks refer to the processes designed to provide assurance of the completeness, accuracy and appropriateness of the data used to develop, validate and operate the model
      - E.g., review of: data collection and storage, cleaning of data errors, the extent of proxy data, the processes needed to convert raw data into suitable model inputs, and verification of transaction data such as exposure levels
      - While not traditionally viewed by the industry as a form of validation, these checks increasingly form a major part of regulatory thinking
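A hypothetical sketch of the kind of automated data quality checks described above, applied to loan-level model inputs; the field names and rules are invented for illustration:

```python
# Hypothetical data quality checks on exposure records: completeness and
# range rules of the sort applied before data feed an EC model.
def check_record(rec):
    """Return a list of data quality issues found in one exposure record."""
    issues = []
    for field in ("obligor_id", "ead", "pd", "lgd"):   # completeness
        if rec.get(field) is None:
            issues.append(f"missing {field}")
    if rec.get("pd") is not None and not 0.0 <= rec["pd"] <= 1.0:
        issues.append("pd out of range [0, 1]")        # probability bounds
    if rec.get("lgd") is not None and not 0.0 <= rec["lgd"] <= 1.0:
        issues.append("lgd out of range [0, 1]")
    if rec.get("ead") is not None and rec["ead"] < 0:
        issues.append("negative ead")                  # exposure sanity check
    return issues

portfolio = [
    {"obligor_id": "A1", "ead": 1_000_000, "pd": 0.02, "lgd": 0.45},
    {"obligor_id": "A2", "ead": -5_000, "pd": 1.7, "lgd": 0.40},  # two defects
]
report = {rec["obligor_id"]: check_record(rec) for rec in portfolio}
```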
  • 12. Range of Practice in Qualitative Approaches to Validation (concluded)
    - As all models rest on premises of various kinds, varying in the degree to which they are obvious, we have examination of assumptions
    - Certain aspects of an EC model are 'built-in' and cannot be altered without fundamentally changing the model
    - To illustrate, these assumptions could be about:
      - Fixed model parameters (PDs, correlations or recovery rates)
      - Distributional assumptions (shape of tail distributions)
      - Behavior of senior management or of customers
    - Some banks go through a deliberate process of detailing the assumptions underpinning their models, including examination of:
      - Impact on model outputs
      - Limitations that the assumptions place on model usage and applicability
  • 13. Range of Practice in Quantitative Approaches to Validation: Inputs
    - A complete validation of an EC model would involve the inputs and parameters, both those that are statistically estimated and those that are assumed
      - Examples of estimated parameters are the main IRB parameters such as PD or LGD; an example of an assumed parameter is PD in a low-default portfolio
    - Techniques could include assessing parameters against:
      - Historical data, through replication of estimators
      - Outcomes over time, through backtesting
      - Market-implied parameters such as implied volatility or implied correlation
      - Materiality of model output to inputs and parameters, through sensitivity testing
    - This testing of input parameters could complement the examination of assumptions and sensitivity testing described previously
      - However, checking of model inputs is unlikely to be fully satisfactory since every model is based on underlying assumptions
    - The more sophisticated the model, the more susceptible it is to model error, and checking input parameters will not help here
      - However, model accuracy and appropriateness can be assessed, at least to some degree, using the processes described in this section
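The sensitivity testing mentioned above can be sketched by bumping one input and recording the change in capital. The example below uses the single-factor Vasicek quantile formula as a stand-in for a full EC model; all parameter values are illustrative:

```python
# Input sensitivity sketch: bump PD by 10% (relative) and measure the change
# in capital under the single-factor Vasicek model. Illustrative parameters.
import math
from statistics import NormalDist

norm = NormalDist()

def vasicek_capital(pd_, lgd, rho, conf=0.999):
    """Conditional default rate at the `conf` quantile of the systematic
    factor, times LGD, less expected loss (Vasicek single-factor formula)."""
    cond_pd = norm.cdf(
        (norm.inv_cdf(pd_) + math.sqrt(rho) * norm.inv_cdf(conf))
        / math.sqrt(1 - rho)
    )
    return lgd * (cond_pd - pd_)

base = vasicek_capital(pd_=0.010, lgd=0.45, rho=0.15)
bumped = vasicek_capital(pd_=0.011, lgd=0.45, rho=0.15)  # PD +10% relative
sensitivity = (bumped - base) / base                     # relative capital move
```

Repeating the bump across all inputs produces a ranking of which parameters the capital figure is most exposed to, which is the materiality question the bullet raises.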
  • 14. Range of Practice in Quantitative Validation: Model Replication
    - Model replication is a useful technique that attempts to replicate the EC model results obtained by the bank
    - This could use independently developed algorithms or data sources, but in practice replication might leverage a bank's existing processes
      - E.g., run the bank's algorithms on a different data-set, or vice versa, once either of these has been validated and is reliable
    - This technique, and the questions that often arise in implementing replication, can help identify whether:
      - Definitions and algorithms the bank claims to use are correctly understood by the staff who develop, maintain, operate and validate the model
      - The bank is using in practice the modeling framework that it purports to
      - Computer code is correct, efficient and well-documented
      - Data used in this validation are those used by the bank to obtain its results
    - However, this technique is rarely sufficient to validate models, and in practice there is little evidence of it being used by banks either for validation or to explore the degree of accuracy of their models
    - Note that replication simply by re-running a set of algorithms to produce an identical set of results would not be sufficient model validation due diligence
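A toy illustration of the independent-algorithm flavor of replication, with invented data: a separately implemented quantile estimator is run on the same inputs as the "production" routine and must reproduce its figure exactly. Merely re-running the production code, as the last bullet notes, would prove nothing:

```python
# Replication sketch: two independently coded estimators of the same quantity
# run on identical data must agree. Data and estimator choice are illustrative.
def quantile_production(losses, q):
    """Primary implementation: full sort, then index."""
    s = sorted(losses)
    return s[min(int(q * len(s)), len(s) - 1)]

def quantile_replica(losses, q):
    """Independent replica: repeated selection, no full sort."""
    k = min(int(q * len(losses)), len(losses) - 1)
    remaining = list(losses)
    for _ in range(k):                 # discard the k smallest values
        remaining.remove(min(remaining))
    return min(remaining)

data = [12.0, 3.5, 8.1, 20.4, 1.2, 15.9, 7.7, 9.3, 2.8, 11.0]
match = quantile_production(data, 0.9) == quantile_replica(data, 0.9)
```

A disagreement here would point to a coding or definitional error in one of the implementations, which is precisely the kind of issue the bullets above say replication surfaces.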
  • 15. Range of Practice in Quantitative Validation: Benchmarking
    - Benchmarking and hypothetical portfolio testing examine whether the model produces results comparable to a standard reference model, or compare models on a set of reference portfolios
      - E.g., benchmarking could be a comparison of an in-house EC model to other well-known or vendor models (after standardization of parameters)
    - Benchmarking is among the most commonly used forms of quantitative validation
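A minimal sketch of hypothetical portfolio benchmarking, under the assumption that the in-house model is a Monte Carlo simulation and the reference is the well-known asymptotic Vasicek formula: on a large, perfectly granular homogeneous portfolio the two should agree to within simulation noise. All parameters are illustrative:

```python
# Benchmarking sketch: Monte Carlo loss quantile vs. the asymptotic Vasicek
# reference value for an infinitely granular homogeneous portfolio (LGD = 100%).
import math
import random
from statistics import NormalDist

random.seed(7)
norm = NormalDist()
PD, RHO, CONF, N_SIMS = 0.02, 0.15, 0.99, 20_000

def vasicek_loss_rate(pd_, rho, conf):
    """Reference model: closed-form portfolio loss rate at the conf quantile."""
    return norm.cdf((norm.inv_cdf(pd_) + math.sqrt(rho) * norm.inv_cdf(conf))
                    / math.sqrt(1 - rho))

def mc_loss_rate(pd_, rho, conf, n_sims):
    """'In-house' model: simulate the systematic factor, take the conditional
    default rate per scenario, and read off the empirical conf quantile."""
    rates = sorted(
        norm.cdf((norm.inv_cdf(pd_) - math.sqrt(rho) * random.gauss(0.0, 1.0))
                 / math.sqrt(1 - rho))
        for _ in range(n_sims)
    )
    return rates[int(conf * n_sims) - 1]

reference = vasicek_loss_rate(PD, RHO, CONF)
benchmark = mc_loss_rate(PD, RHO, CONF, N_SIMS)
relative_gap = abs(benchmark - reference) / reference
```

On a real heterogeneous portfolio the two figures would differ for substantive reasons as well, and explaining the gap, rather than eliminating it, is the point of the exercise.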