Value-at-Risk: Strengths, Caveats and Considerations for Risk Managers and Regulators
Master Thesis by Bogdan Izmaylov
Supervisor: Thomas Berngruber
VaR has been a recommended measure since Basel II. This change from Basel I is a step toward more customized requirements, tailored to each institution according to its risk exposure. Thus institutions with a higher risk appetite have to set aside more capital than institutions with a lower risk appetite. Moreover, the BCBS allows the use of internal models for the estimation of VaR, so institutions may choose more or less conservative approaches to modelling VaR. An aggressive approach aims to minimize the daily capital charge (the amount of capital set aside) and thereby improve profitability. The downside of such an approach comes from the variable multiplier, which penalizes institutions whose models, for whatever reason, produce more VaR violations than allowed by the Accords.
Such specific requirements stimulate financial institutions to apply the best possible VaR models in order to free up capital and minimize holding costs.
An overly aggressive (low) VaR or a poor model can create too many violations, leading to a higher multiplier and an increased amount of required capital.
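The interaction between backtesting violations and the multiplier can be sketched as follows. The zone boundaries and add-ons follow the Basel traffic-light table for a 250-day backtest, and the 60-day averaging follows the Accord's market-risk capital formula; the numbers in the usage example below are purely illustrative:

```python
def basel_multiplier(violations: int) -> float:
    """Traffic-light multiplier for a 250-day backtest: 3.0 in the
    green zone (0-4 exceptions), graded add-ons in the yellow zone
    (5-9), and 4.0 in the red zone (10 or more)."""
    yellow_add_ons = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
    if violations <= 4:
        return 3.0
    if violations >= 10:
        return 4.0
    return 3.0 + yellow_add_ons[violations]


def capital_charge(var_yesterday: float, var_last_60_days: list[float],
                   violations: int) -> float:
    """Market-risk capital: the larger of yesterday's 10-day VaR and the
    multiplier times the average 10-day VaR over the last 60 days."""
    m = basel_multiplier(violations)
    return max(var_yesterday, m * sum(var_last_60_days) / len(var_last_60_days))
```

With an average 10-day VaR of 2.0 and a clean backtest the charge is 6.0, while ten violations push it to 8.0: the penalty that makes an overly aggressive model self-defeating.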
Several problems can be identified with the BCBS approach to measuring capital requirements using VaR.
First of all, the multiplier can be argued to be arbitrary. Jorion (2007) elaborates on this choice by reviewing the precision of the VaR estimate and concludes that a multiplier of 3 produces the maximum value of the estimate, implying that an institution holding such an amount of capital reserves would virtually never go bankrupt. In the light of the GFC, it is clear that there is something wrong with this logic. Such an approach to VaR calculation is valid for normally distributed returns, which are not a good approximation of reality in financial markets. This means that even if internal-model VaR calculations are made for non-normal returns, the BCBS requires the multiplication to be made with a method that assumes normality, so the final figure will neither be the maximum value of the estimate nor correspond to the level of risk implied by the BCBS.
Daníelsson and Zigrand (2006) found that there is no underestimation of VaR over a 10-day period, but only when the probability of crashes (extremes of the left tail of the distribution) is low.
Figure 9. Illustration of the relative error (the ratio VaR[η]/(√ηVaR)) from the use of the square-root of time rule, where 1/λ is the expected time to crash in years.
Source: Daníelsson & Zigrand (2006).
Second, because of the difficulty of calculating a high-precision 10-day VaR directly, the BCBS recommends the use of the square-root-of-time rule to scale the 1-day VaR to a 10-day horizon. But this approach is only valid for normally distributed returns; applying it to VaR calculated from non-normal returns will underestimate risk, which, at least in theory, the institutions would prefer, since it leads to less of their capital being locked in. Interestingly, the empirical findings of Pérignon, Deng and Wang (2008) suggest that banks in fact tend to overstate the VaR in their reports. While it can be argued that such overstatement can help mitigate the potential underestimation of losses if the VaR model understates risk on average, it creates unnecessary costs for the institution. Another potential problem is that when the reported VaR is consistently overstated, the model itself cannot be assessed properly: minor violations caused by model deficiencies can be masked by the overstatement, and these deficiencies could prevent the bank from foreseeing an increase in the risk level.
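The danger of the square-root-of-time rule can be illustrated with a small simulation in the spirit of Daníelsson and Zigrand's jump model: daily returns are normal plus a rare crash. All parameter values (volatility, crash size, crash probability) are illustrative assumptions, not estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, horizon = 1_000_000, 10
sigma = 0.01                    # assumed 1% daily diffusion volatility
p_crash, crash = 0.002, -0.15   # assumed rare jump: 0.2% chance of a -15% day

daily = rng.normal(0.0, sigma, n_days) + crash * rng.binomial(1, p_crash, n_days)

var_1d = -np.quantile(daily, 0.01)                # 1-day 99% VaR
windows = daily.reshape(-1, horizon).sum(axis=1)  # non-overlapping 10-day returns
var_10d = -np.quantile(windows, 0.01)             # simulated 10-day 99% VaR
scaled = np.sqrt(horizon) * var_1d                # square-root-of-time rule

# A 10-day window is ten times as likely to contain a crash, so the 1%
# quantile of the 10-day distribution includes it while the scaled 1-day
# figure does not; here the rule understates the true VaR badly.
print(f"sqrt-time VaR: {scaled:.3f}, simulated 10-day VaR: {var_10d:.3f}")
```

Under these assumptions the scaled figure is roughly half the simulated 10-day VaR, which is essentially the crash size itself.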
Third, as pointed out by Daníelsson (2011) in his corollary of Goodhart’s law, when VaR becomes a target, it ceases to be a good measure of market risk exposure. When a measure becomes a target, incentives for manipulation are created: financial institutions can start tailoring the VaR figures produced by their models to match the level of capital they are willing to set aside for the risks they are taking. Because VaR is only a quantile of the distribution, financial derivatives can be used to alter the payoff at a specific point in the distribution and produce a lower risk estimate. In the subsequent steps this problem is only made worse by the use of the square-root-of-time rule and the multiplier, both of which assume normality of returns.
An example of such manipulation would be writing a put option with a strike price just below the VaR and buying a put option with a strike price above the desired VaR value.
Figure 10. P&L distribution before and after VaR manipulation. Source: based on Daníelsson (2011).
This way the institution lowers its reported VaR, but decreases the expected profit at all confidence levels. There is really nothing that prevents banks from using such instruments, unless the RMP decides that this is not in line with the bank’s risk management policy. For an external observer there is no way to distinguish between an institution which has a genuinely low VaR (lower risk) and an institution which has manipulated its VaR (higher risk).
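The quantile-only nature of VaR that makes such manipulation possible can be demonstrated with a stylized simulation (a simplified variant of the option scheme above, not Daníelsson's exact construction; the premium and strike values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
pnl = rng.standard_t(4, 500_000)   # heavy-tailed daily P&L, stylized units

strike = np.quantile(pnl, 0.002)   # strike placed beyond the 1% VaR point
premium = 0.15                     # assumed premium earned on the written put

# Writing a deep out-of-the-money put lifts the whole distribution by the
# premium, while the extra losses land only beyond the strike -- outside
# the 1% quantile that VaR reports.
manipulated = pnl + premium - np.maximum(strike - pnl, 0.0)

var_before, var_after = -np.quantile(pnl, 0.01), -np.quantile(manipulated, 0.01)
tail_before, tail_after = -np.quantile(pnl, 0.001), -np.quantile(manipulated, 0.001)

print(f"reported 99% VaR: {var_before:.2f} -> {var_after:.2f}")
print(f"99.9% loss quantile: {tail_before:.2f} -> {tail_after:.2f}")
```

The reported VaR falls by roughly the premium, while the deeper tail deteriorates; only a tail-sensitive measure such as Expected Shortfall would flag the change.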
The biggest benefit of VaR as a tool recommended by the Basel Accords lies, as pointed out by Jorion (2007), not in the final result (VaR as a metric), but in the process of obtaining it (VaR as a measure). Institutions which have to calculate their VaR for reporting purposes have to face the risks in their portfolio and decide whether to take any measures to mitigate those risks. The process of calculating VaR also gives the RMPs considerable insight into the riskiness of individual exposures and how they contribute to the overall risk of the institution.
Since the measures of the first two Basel Accords have proven unable to ensure protection against crises, the third Accord of the BCBS has increased the capital requirements for financial institutions. The new regulations increase the amount of capital that institutions have to hold and decrease their leverage. Apart from the increase in institution-specific, risk-based ratios, more general requirements are introduced, such as the “mandatory conservation buffer” of 2.5% of common equity and the “discretionary counter-cyclical buffer” of up to 2.5%, which can be imposed by national regulators. Such changes in the regulations point towards requirements based less on institution-specific VaR and more on leverage. The transition to higher capital ratios is planned to be gradual in order to allow institutions to adjust: for example, the minimum capital requirements are planned to be phased in during 2014-2015 and the conservation buffer from 2016 onwards. The increases in required capital are clearly going to impact the profitability of financial institutions. Since the costs are likely to be transferred to the customers (Grosen, 2011), the new Accord is also expected to have a significant negative macroeconomic impact (Slovik & Cournède, 2011). It may seem that these costs are hard to justify, since there is no guarantee that the new requirements will prevent systemic failures; as Grosen (2010) writes, the Basel accords are “the Maginot line of the banking sector”. However, the costs may be justified if we factor in the long-term costs of crises, which can be measured as foregone GDP growth (Rangvid, 2010).
It may be reasonable to consider alternatives to the existing regulation which could improve the stability of the financial system. Instead of requirements on capital ratios, requirements to insure certain levels of risk could be implemented. Daníelsson (2011) argues in favour of such a system, in which banks would insure and re-insure other banks. Such a strategy deserves consideration, since the banking industry as a whole possesses enough capital to absorb the shocks. An area of future research could be to analyze how the aggregate cost of such insurance would compare to the costs of the Basel regulations. Using VaR is a more precise, more efficient way of regulating the industry than imposing arbitrary capital ratios. An important tool for regulation is stress-testing, especially when it combines different risk measures and stress scenarios (Plesner, 2012). Another option is to abandon capital requirements altogether and let the institutions be disciplined by the market. The role of the BCBS in such a case could be reduced to ensuring transparency: VaR and ES could be disclosed, among other risk measures, for each institution to provide an overview of its loss distribution. Better transparency could potentially have a greater effect on the long-term stability of the financial sector than increases in the amount of required capital.
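Disclosure of this kind is cheap to compute, since both measures fall out of the same historical sample. The sketch below (illustrative data, plain historical-simulation estimators) shows how ES exposes a heavier tail that the VaR figures alone barely hint at:

```python
import numpy as np

def var_es(pnl: np.ndarray, alpha: float = 0.01) -> tuple[float, float]:
    """Historical-simulation VaR and Expected Shortfall at level alpha,
    both reported as positive loss figures."""
    q = np.quantile(pnl, alpha)
    return -q, -pnl[pnl <= q].mean()

rng = np.random.default_rng(1)
thin = rng.normal(0.0, 1.0, 200_000)              # thin-tailed P&L
fat = rng.standard_t(3, 200_000) / np.sqrt(3.0)   # same variance, fatter tail

var_thin, es_thin = var_es(thin)
var_fat, es_fat = var_es(fat)
print(f"thin tail: 99% VaR = {var_thin:.2f}, 99% ES = {es_thin:.2f}")
print(f"fat tail:  99% VaR = {var_fat:.2f}, 99% ES = {es_fat:.2f}")
```

Side by side, the two numbers give an external observer a sense of the tail shape that neither conveys alone.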
This thesis has discussed Value-at-Risk in the context of the GFC and recent developments in the risk management field. The main focus was on identifying the caveats of using VaR and how, and to what degree, they can be mitigated by the existing methodology. The factors which could jeopardize the risk measure estimates are discussed together with the measures which can prevent this from happening.
The current VaR methodology is highly complex and offers managers many tools that can be used to estimate precise figures for the purpose of risk measurement and mitigation. Empirical studies have shown that VaR methods can produce highly accurate estimates when appropriate models are used. It is hard to overestimate the role the risk manager plays in the choice of methodology for VaR calculations. From the conducted analysis it is clear that the errors in VaR estimates which arise from the models themselves are much smaller than the errors from the application of inappropriate methods. Both the theoretical and the empirical literature support this view by providing a wide range of models which perform best in different market situations. And while we can easily “blame the models”, it is necessary first to assess the applicability of these models to the reality of the markets.
Another emphasis of this thesis is the failure of statistical measures to provide an adequate basis for regulation. While the benefits of using VaR make it an appropriate measure for highly customized capital requirements, the consequence of such regulation is a deterioration in the quality of VaR estimates caused by manipulation.
Overall, the focus of risk management should be set on the process of VaR calculation, the analysis of component risk measures, and their impact on aggregate risk. The most promising direction in this area is the use of simulations and the analysis of the impact of individual risk factors on overall risk in both financial and non-financial companies.
After looking into the mechanics of VaR calculation and its relation to decision-making processes in companies, it is clear that VaR is an important tool in the RMP’s toolbox. The principles of risk management also apply to the measurement process itself: a portfolio of risk measures should be used in order to capture the relevant information about risks. No matter how complex the model, it will not be able to capture all of this information in a single number or measure. Since this may be critical for the assessment of extreme tail events, VaR analysis should be complemented by simulations and other risk measures.
Future research should be directed at developing methods which would aid risk managers in their choice of VaR estimation method, since this is a critical step in the estimation process. Another potential research area is the analysis of stress tests using the extended variable ranges proposed by Taleb (2012) for his heuristic measure. For regulation purposes, the alternative of insurance instead of capital requirements for banks should be studied further in the context of its ability to prevent systemic crises and its macroeconomic impact in comparison to the current BCBS regulations.
Artzner, P., Delbaen, F., Eber, J., & Heath, D. (1999). Coherent measures of risk.
Mathematical Finance, 9(3), 203.
Belles-Sampera, J., Guillén, M., & Santolino, M. (2014). Beyond value-at-risk:
GlueVaR distortion risk measures. Risk Analysis, 34(1), 121-134.
BlackRock: The monolith and the markets. (2013). The Economist.
Daníelsson, J. (2002). The emperor has no clothes: Limits to risk modelling. Journal of Banking & Finance, 26(7), 1273.
Daníelsson, J. (2011). Financial risk forecasting: The theory and practice of forecasting market risk, with implementation in R and MATLAB. Chichester: John Wiley.
Daníelsson, J., & Zigrand, J. (2006). On time-scaling of risk and the square-root-of-time rule. Journal of Banking & Finance, 30(10), 2701-2713.
Dowd, K. (1998). Beyond value at risk: The new science of risk management.
Dowd, K., & Cotter, J. (2007). Evaluating the precision of estimators of quantile-based risk measures.
Chan, K. F., & Gray, P. (2006). Using extreme value theory to measure value-at-risk for daily electricity spot prices. International Journal of Forecasting, 22(2), 283-300.
Gendron, M., & Genest, C. (2009). The advent of copulas in finance. The European Journal of Finance, 15(7/8), 609.
Giannopoulos, K., & Tunaru, R. (2005). Coherent risk measurement under filtered historical simulation. Journal of Banking & Finance, 29(4), 979.
Grosen, A. (2010). Basel III - et fæstningsværk i bankreguleringens maginotlinje? Finans/Invest, (8), 2.
Grosen, A. (2011). Bankernes kasino-model under lup. Finans/Invest, (8/11), 2.
Guégan, D., & Tarrant, W. (2012). On the necessity of five risk measures. Annals of Finance, 8(4), 533.
Huang, J., Lee, K., Liang, H., & Lin, W. (2009). Estimating value at risk of portfolio by conditional copula-GARCH method. Insurance Mathematics and Economics, 45(3), 315.
Jorion, P. (2007). Value at Risk: The new benchmark for managing financial risk (3rd ed.). London: McGraw-Hill.
Roundtable: “The limits of VaR”. (1998). Derivatives Strategy.
Kothari, C. R. (2011). Research methodology: Methods and techniques (2nd ed.). New Delhi: New Age International Ltd.
Lin, C., & Shen, S. (2006). Can the student-t distribution provide accurate value at risk? The Journal of Risk Finance, 7(3), 292.
Malz, A. M. (2011). Financial risk management: Models, history, and institutions. Hoboken, N.J.: Wiley.