
Value-at-Risk: Strengths, Caveats and Considerations for Risk Managers and Regulators. Master Thesis by Bogdan Izmaylov. Supervisor: Thomas Berngruber.


Daníelsson (2002) examines the limitations of modern risk models for both risk management and regulation. In particular, he studies the precision of the models at different confidence levels and finds that GARCH and RiskMetrics models are the most precise at the 95% level, but that their performance deteriorates at 99%. The BCBS capital requirements based on VaR are criticized as arbitrary and ineffective. The author argues that when institutions target VaR in order to meet capital requirements, the measure risks being manipulated, which escalates losses in times of crisis. He also proposes a

corollary of Goodhart’s Law applied to VaR:

“A risk model breaks down when used for regulatory purposes.”

The scaling of VaR across different time horizons and distributions is addressed in another paper by Daníelsson and Zigrand (2006), where the authors conclude that the square-root-of-time rule leads to underestimation of risk. This bias is very small at the 10-day horizon (which coincides with the BCBS requirement), but increases at an increasing rate for longer horizons.
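The square-root-of-time rule discussed above can be sketched in a few lines. A minimal illustration (the 1-day VaR figure is a hypothetical input, not a value from the thesis):

```python
import numpy as np

def scale_var_sqrt_time(var_1d: float, horizon_days: int) -> float:
    """Scale a 1-day VaR to a longer horizon using the square-root-of-time rule.

    The rule assumes i.i.d. returns; per Danielsson and Zigrand (2006) it tends
    to understate risk, and the understatement grows with the horizon.
    """
    return var_1d * np.sqrt(horizon_days)

# Hypothetical 1-day 99% VaR of 2% of portfolio value, scaled to the 10-day BCBS horizon.
var_10d = scale_var_sqrt_time(0.02, 10)  # = 0.02 * sqrt(10)
```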

In his book on financial risk forecasting, Daníelsson (2011) combines the methodology of VaR with the versatility and processing power of modern software (R and MATLAB). The comprehensive guide to the implementation of risk

–  –  –

articles. The author stresses the importance of the assumptions behind each model and of critical interpretation of the results. For high-confidence VaR, EVT methodology is used to estimate VaR in the extreme regions of the distribution tails. This approach mitigates the underestimation of risk, since EVT captures the distribution of returns in the tails more accurately.

Endogenous prices in financial markets are singled out as cases where RMPs need to apply common sense and an intuitive understanding of risk measures in order to make rational decisions.

Nocera (2009), in his article “Risk Mismanagement”, presents an excellent discussion of the opinions of VaR opponents and advocates. The main arguments presented against VaR are that it cannot predict extreme risks and that it is only valid for distributions that are uncommon for financial returns. VaR proponents respond that the failures of risk management lay primarily in managerial decisions, not in the models.

In interviews with the author, risk management experts relate that the VaR measure has been manipulated by managers in order to create “asymmetric risk positions”, which increased losses that the models did not capture. VaR also failed to capture the effects of leverage. The author concludes that the latest crisis has once again shown the critical need for a better understanding of risk.

Guégan and Tarrant (2012) demonstrate theoretically that even combinations of up to four risk measures are insufficient to capture the risk of peculiar, though possible, loss distributions. Since the same measures can produce identical risk estimates for different distributions, the proposed solution for assessing a risk profile is to use 95% and 99% VaR and TCE in combination with Maximum Loss (ML). The authors suggest that banks should be required to submit multiple risk measures to the regulatory bodies.

–  –  –

for calculating VaR and the daily capital charges for authorized deposit-taking institutions. The study found that there is no single best model; rather, each model performs best in certain time periods (stages of the economic cycle). Before the GFC, GARCH performed best in terms of days with minimized capital charges.

The RiskMetrics model performs best at the beginning of the crisis, until September 2009. Exponential GARCH with shocks following the Student-t distribution performs best until the end of 2009. The authors conclude that institutions should use multiple risk models in order to minimize capital charges and thus the penalties from VaR violations.

In “BlackRock: The monolith and the markets” (2013), a concern about the widespread use of the same risk models throughout the financial markets is presented. The popularity of BlackRock’s risk-management system can be compared with the widespread use of VaR since the release of RiskMetrics and its implementation in the Basel accords. The main concern relates to the limitations of the models, as also pointed out by Daníelsson (2002), when they are used by increasingly large groups of people. Even systems far more sophisticated than VaR, like BlackRock’s “Aladdin”, raise concerns when used by a large group of market participants.

–  –  –

VaR. The ambition of the study is to provide an overview of the caveats and strengths of VaR in risk management practice, as well as to offer suggestions for mitigating possible issues with the use of this risk measure.

4.1 VaR as a measure for non-normally distributed returns.

The assumption of normality is crucial for some methods of calculating VaR. In this respect VaR is similar to many other financial models, since normality greatly simplifies

–  –  –

incomprehensibly complex. Since the normal distribution is described by only two moments – the mean and the standard deviation – the VaR calculation is simplest when the assumption of normality holds. But the methodology does not break down when returns are not normally distributed, even though VaR may then lose some desirable properties (Artzner et al., 1999). One of the main critiques of VaR is that it relies on parametric distributions, which can be a poor fit for market data. Daníelsson (2011) emphasizes in his

book the stylized facts about financial returns:

–  –  –

Figure 6. Daily S&P 500 returns 1989-2009 density plot and the normal distribution fit (red line).

Source: own calculations.

Figures 5 and 6 clearly show the differences between normally distributed returns and the distribution of the S&P 500 data. The real probability density is higher than normal near the mean of the distribution, lower than normal as the distance from the mean increases, and again higher than normal in the tails of the distribution, which is a very important fact for risk analysis.
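The two-moment simplicity of the normal assumption can be illustrated with a minimal parametric VaR sketch; the simulated returns and all parameter values below are hypothetical, chosen only for illustration:

```python
import numpy as np
from scipy import stats

def normal_var(returns: np.ndarray, alpha: float = 0.99) -> float:
    """Parametric VaR under normality: only the sample mean and standard
    deviation are needed. Returned as a positive loss fraction."""
    mu = returns.mean()
    sigma = returns.std(ddof=1)
    return -(mu + sigma * stats.norm.ppf(1 - alpha))

# Hypothetical daily returns; real data (e.g. S&P 500) would show fatter tails,
# which is exactly why this simple formula underestimates tail risk.
rng = np.random.default_rng(42)
simulated = rng.normal(loc=0.0005, scale=0.01, size=2000)
var_99 = normal_var(simulated)  # roughly 2.33 * sigma - mu for alpha = 0.99
```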

–  –  –

- Volatility estimation models have to be flexible enough to capture volatility clusters – high/low volatility has to be followed by

–  –  –

- Linear dependency can approximate the joint movements in the centre of the distributions, but it becomes progressively less precise for

–  –  –

The obvious choice and common practice for calculating VaR in the case of non-normally distributed returns is Historical Simulation (HS), a non-parametric VaR method. But this approach raises another question: whether it is realistic to assume that history repeats itself, and thus whether the past contains enough information to predict future events. There is also a trade-off between the size of the estimation window (the number of observations used to forecast VaR) and the speed at which the forecasts adjust to new information. The years of returns data before the first half of 2008 (before the GFC) are clearly a poor fit for forecasting VaR of returns in the second half of 2008. HS VaR is a relatively simple and precise tool when there are few or no structural changes in risk, as in the years before the GFC (Daníelsson, 2011).
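A minimal sketch of HS VaR under these caveats; the return series here is simulated, whereas in practice the estimation window would be actual portfolio returns:

```python
import numpy as np

def historical_var(returns: np.ndarray, alpha: float = 0.99) -> float:
    """Historical-simulation (non-parametric) VaR: the empirical (1 - alpha)
    quantile of past returns, reported as a positive loss fraction. No
    distributional assumption is made, but the estimate adjusts slowly to
    structural breaks in risk and cannot exceed the worst observed loss."""
    return -np.quantile(returns, 1 - alpha)

# Hypothetical fat-tailed daily returns standing in for an estimation window.
rng = np.random.default_rng(0)
window = 0.01 * rng.standard_t(df=4, size=1000)
var_99 = historical_var(window, alpha=0.99)
```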

To account for non-normality of returns while allowing for losses larger than any in the estimation window, either a parametric distribution of choice (e.g. Student-t) should be used, or the Generalized Pareto Distribution should be applied to model the distribution in the tails, while accounting for the skewness and kurtosis of the data.
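The first option can be sketched by fitting a Student-t distribution, so the VaR quantile is not capped by the worst loss in the window; the data and parameters here are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def student_t_var(returns: np.ndarray, alpha: float = 0.99) -> float:
    """Parametric VaR with a fitted Student-t distribution, accommodating the
    fat tails of financial returns and allowing losses beyond the sample."""
    df, loc, scale = stats.t.fit(returns)
    return -stats.t.ppf(1 - alpha, df, loc=loc, scale=scale)

# Hypothetical fat-tailed daily returns used as the estimation window.
rng = np.random.default_rng(1)
returns = 0.01 * rng.standard_t(df=5, size=2000)
var_99 = student_t_var(returns, alpha=0.99)
```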

In order to incorporate risk changes quickly and precisely into VaR forecasts,

–  –  –

1998; Jorion, 2007; Danielsson, 2011).

Nassim Taleb criticizes VaR methods for relying on past return data (HS) and on the normal distribution (parametric methods), concluding that VaR cannot predict extreme events (“Black Swans”) and is harmful because it creates a false sense of security and confidence among RMPs. The obvious counter-argument is that non-parametric and parametric methods should be used in the context of the market situation. This is where managerial discretion matters: it is up to the risk manager to choose the most appropriate method. The choice of distribution for calculating VaR is also important, and the normal distribution should not be used to simplify calculations at the cost of precision. Existing methods are sufficiently precise, provided the choice of method is justified by the data and market conditions (Daníelsson, 2011; Jorion, 2007). By definition, VaR gives the maximum loss at a given confidence level under normal market conditions (when markets are abnormal, VaR estimates are not valid).

The limitations of the models also have to be taken into account, and the manager’s understanding of these models and of the risks they forecast is crucial for the quality of VaR forecasts.

Given the reality of modern financial markets, it is unrealistic to assume normally distributed financial returns, which means that RMPs should use more sophisticated VaR methods, combined with an improved understanding of risk.

Using HS to capture non-normality should be done in light of the fact that such calculations do not provide sufficient information about future volatility (Fong Chan & Gray, 2006; Pérignon & Smith, 2010).

4.2 The quality of high confidence level VaR forecasts – 95%, 99% VaR and higher.

Due to the nature of VaR as a quantile risk measure, the precision of the estimates

–  –  –

which can be used to estimate VaR, as well as from the fact that the distribution of financial data has fat tails. The problem is even worse for the estimation of ES, since that measure is not itself a quantile but the mean of all the quantiles beyond α.

This is confirmed by the empirical findings of Giannopoulos and Tunaru (2005) and Yamai and Yoshiba (2005). There are several common ways of overcoming such limitations in practice:

–  –  –

4) Applying EVT to model the tails of the distribution (99%+ VaR)

Gathering more data can be problematic, and the trade-off between the length of the estimation window and the speed of adjustment to new information is an important concern. Daily VaR is a commonly calculated measure, and gathering more daily data may not be very challenging in most situations. But if weekly, 10-day or even monthly VaR needs to be calculated, it can be impossible to obtain enough observations to get sufficiently precise estimates.

If the available data are limited, bootstrapping can provide more precise estimates of VaR, ES and their standard errors without any assumptions about the distribution of returns. Observations are drawn randomly, with replacement, from the collected sample. The method still adjusts slowly to new information and relies on the assumption that history repeats itself, but it improves on HS by providing more precise estimates. Pritsker (2006) confirms that bootstrapping is superior to HS by analysing the returns of the exchange rates of 10 currencies against the USD.
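The bootstrap described above can be sketched as follows, resampling with replacement to obtain VaR and ES estimates together with their standard errors; the sample data and sizes are illustrative:

```python
import numpy as np

def bootstrap_var_es(returns, alpha=0.99, n_boot=500, seed=0):
    """Bootstrap VaR and ES: resample the observed returns with replacement
    (no distributional assumption), recompute both measures on each resample,
    and report the mean and standard error of each."""
    rng = np.random.default_rng(seed)
    n = len(returns)
    var_draws, es_draws = [], []
    for _ in range(n_boot):
        sample = rng.choice(returns, size=n, replace=True)
        var = -np.quantile(sample, 1 - alpha)
        tail_losses = -sample[sample <= -var]   # losses at or beyond VaR
        var_draws.append(var)
        es_draws.append(tail_losses.mean())
    var_draws, es_draws = np.array(var_draws), np.array(es_draws)
    return (var_draws.mean(), var_draws.std()), (es_draws.mean(), es_draws.std())

# Hypothetical limited sample of fat-tailed daily returns.
rng = np.random.default_rng(2)
returns = 0.01 * rng.standard_t(df=4, size=750)
(var_hat, var_se), (es_hat, es_se) = bootstrap_var_es(returns)
```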

If the distribution of returns can be approximated and defined in the simulation

–  –  –

generator, the number of simulations, and the quality of the transformation method. The transformation method is the way of converting the randomly generated numbers into random numbers from the distribution of interest.
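One standard transformation method is inverse-transform sampling: uniform draws are pushed through the inverse CDF of the target distribution. A sketch with an assumed Student-t target (the degrees of freedom and scale are hypothetical):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Step 1: raw uniform random numbers on [0, 1).
u = rng.uniform(size=100_000)

# Step 2: inverse transform - map each uniform draw through the Student-t
# inverse CDF (ppf) to obtain fat-tailed simulated returns.
t_draws = 0.01 * stats.t.ppf(u, df=4)

# Monte Carlo 99% VaR is then the empirical 1% quantile of the simulated returns.
mc_var_99 = -np.quantile(t_draws, 0.01)
```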

Since the number of calculations can be very high for a big financial institution, quasi-random (low-discrepancy) number generators can be used. These produce a deterministic sequence of “random” numbers that allows the simulation to converge faster and produce accurate results with fewer runs. Even with these tweaks, Monte Carlo (MC) remains the most computationally demanding method.

Another way to speed up simulations and VaR calculation is to use factor models.

Such models assume that changes in risk are driven by changes in one or a few risk factors. Each position is modelled according to its exposure to these risk factors, so only a handful of factors need to be simulated with MC.
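A sketch of factor-based MC along these lines; the exposures, weights and factor volatilities below are invented purely for illustration. Only the few factors are simulated, and positions are mapped to them through their exposures:

```python
import numpy as np

rng = np.random.default_rng(4)
n_factors, n_positions, n_sims = 3, 50, 20_000

betas = rng.uniform(-1, 1, size=(n_positions, n_factors))  # hypothetical factor exposures
weights = np.full(n_positions, 1.0 / n_positions)          # equal-weight portfolio
factor_vol = np.array([0.012, 0.008, 0.005])               # assumed daily factor volatilities

# Simulate only the 3 factors, not the 50 positions.
factor_draws = rng.normal(0.0, factor_vol, size=(n_sims, n_factors))
position_returns = factor_draws @ betas.T        # map factor moves to positions
portfolio_returns = position_returns @ weights   # aggregate to the portfolio

var_99 = -np.quantile(portfolio_returns, 0.01)
```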

For the calculation of high-precision VaR at confidence levels above 99%, EVT can be applied. It accounts for the shape of the tail of the distribution, usually producing more precise estimates. EVT should be used with caution, since the precision of the estimates is even more sensitive to the underlying distribution of returns than at lower confidence levels. Daníelsson (2011) suggests using EVT with sample sizes above 1000 and for confidence levels of 99.6% and higher. Lin and Shen (2006) also provide evidence of improved VaR performance when the tails are modelled with the Student-t distribution or EVT rather than the normal distribution, for confidence levels of 98.5% and higher. An easy approximation would be to use normal or Student-t tails up to the 95% confidence level, and Student-t at 99% or when the number of observations does not allow EVT to be implemented efficiently.
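A peaks-over-threshold sketch of EVT-based VaR, fitting a Generalized Pareto Distribution to losses beyond a high threshold and extrapolating the extreme quantile. The threshold choice and simulated data are assumptions for illustration, not Daníelsson's own implementation:

```python
import numpy as np
from scipy import stats

def evt_var(returns, alpha=0.999, tail_fraction=0.05):
    """Peaks-over-threshold EVT VaR: fit a GPD to exceedances over a high loss
    threshold u, then apply the standard POT quantile formula
    VaR = u + (beta / xi) * ((n / n_u * (1 - alpha)) ** (-xi) - 1)."""
    losses = -returns
    u = np.quantile(losses, 1 - tail_fraction)   # high loss threshold
    exceedances = losses[losses > u] - u         # amounts by which losses exceed u
    xi, _, beta = stats.genpareto.fit(exceedances, floc=0)
    n, n_u = len(losses), len(exceedances)
    return u + (beta / xi) * ((n / n_u * (1 - alpha)) ** (-xi) - 1)

# Hypothetical heavy-tailed sample, large enough for tail fitting.
rng = np.random.default_rng(5)
returns = 0.01 * rng.standard_t(df=3, size=5000)
var_999 = evt_var(returns, alpha=0.999)
```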

There is no method that is superior to the others in all situations (although MC can usually forecast the tails of return distributions better, at the cost of computation time),

–  –  –

the forecast and fundamental understanding of the models and nature of risk.

4.3 VaR as an aggregate measure of risk.


