# What is VaR? Value at Risk

IN PRINT ARCHIVE **CIR Summer 1999**


by Alan White

Value at risk (VaR) has been the flavor of the month in risk-management circles recently. The idea of a VaR-type calculation was originally stimulated by requests from senior management in financial institutions for a simpler risk report. ("Just one number -- I only want to see one number that tells me how bad things can get.") Much of the interest in VaR has been driven by the fact that under an agreement reached by the Bank for International Settlements, all bank regulators now require that banks calculate the VaR for their trading portfolios. The regulatory capital required for these portfolios is then based on the calculated VaR.1 Here we will provide a simple explanation of what VaR is and how it is calculated. Then we will discuss the difficulties that are likely to be encountered in implementing a VaR system. Finally, we will explore the circumstances under which VaR is useful and when it is not likely to be of much value.
The official definition of VaR is the level of loss that will be exceeded with a probability of only X% in some time period, T. For the determination of bank capital X is 1% and T is 10 days. Thus, if the VaR for some bank portfolio is $5 million, there is less than a one-percent chance that more than $5 million will be lost in the next 10 trading days. The VaR calculation is highly idealized and should be interpreted with caution. There are three ways that it is commonly misinterpreted:
The most recent 500 realized returns are sorted from lowest to highest (most negative to most positive) and the five worst returns (1% of all the returns considered) are applied to the current stock price to determine the losses that would occur if these returns occurred tomorrow. The least negative of these is taken to be the VaR, since no more than 1% of all the losses reached or exceeded this level in the last 500 days. The VaR calculation is shown in Table 2. The current VaR is $1.22.

The alternative method used to calculate VaR is known as the model-based approach. In this approach it is assumed that some statistical model is driving the returns on our stock. In order to calculate the VaR we first determine the parameters of the model from the historical returns and then use the model to predict the future returns that might occur. Usually the statistical models used are very simple. In our example we will assume that the daily stock returns are drawn from a normal distribution with a constant mean and variance. The mean and variance are estimated from the last 500 days of stock returns2 and then the 1-percentile point of the distribution is calculated. In our example the average of the last 500 returns is 0.073% and the standard deviation is 1.963%. For a normal distribution the 1-percentile level is about 2.33 standard deviations below the mean, in our case a return 4.49% below the mean return. Applying these values to the $30 stock price yields an expected stock price of $30.02, and a 4.49% decline of $1.35 to $28.67. This gives a VaR of $1.33 ($30.00 - $28.67). If we assume a mean return of zero, the expected price is $30, the 4.49% decline takes the price to $28.65, and the VaR is $1.35. The difference in the VaR calculated by the two methods arises from the differing assumptions underlying them. The assumption of normality of stock returns is reasonable but not completely accurate.
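The two calculations just described can be sketched in a few lines of Python. The $30 price and the 2.33-standard-deviation cutoff come from the text; the 500 returns below are randomly generated stand-ins for the article's historical data, so the printed figures will only be near, not equal to, the article's $1.22 and $1.33.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical history: 500 daily returns for a $30 stock (illustrative
# stand-ins for the article's data, drawn near its mean and std dev).
returns = rng.normal(loc=0.00073, scale=0.01963, size=500)
price = 30.00

# Historical simulation: apply each past return to today's price, sort the
# outcomes, and take the 5th-worst loss (the 1% point of 500 observations).
sorted_changes = np.sort(price * returns)     # most negative first
hist_var = -sorted_changes[4]                 # least negative of the 5 worst

# Model-based approach: estimate mean and std dev, then take the
# 1-percentile return, about 2.33 standard deviations below the mean.
mu, sigma = returns.mean(), returns.std(ddof=1)
worst_return = mu - 2.33 * sigma
model_var = price - price * (1 + worst_return)

print(f"historical-simulation VaR: ${hist_var:.2f}")
print(f"model-based VaR:           ${model_var:.2f}")
```

Both numbers answer the same question: the one-day loss that should be exceeded on only about 1% of days.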
Figure 1 shows the actual distribution of stock returns over the last 500 days and the normal probability distribution with the same mean and variance.

Each of the methods for calculating VaR has its advantages and disadvantages. The historical simulation approach is very simple and captures the exact range of outcomes that have been observed in the past. Usually there are a greater number of extreme events (unusually large gains and losses) than can be predicted by any theoretical model. However, the historical simulation approach is difficult to apply to VaR measured over longer time intervals. In order to compute a 10-day VaR we need a history of 10-day returns. If we want to have 500 observations we need 5,000 days, or about 20 years of data.3 While theoretical models are not very good at reflecting the most extreme gains and losses that can occur, they can easily be used to calculate the VaR over any time interval. In our example we assumed that the one-day returns were normally distributed with some mean and variance. If the market is efficient, sequential daily returns are independent of one another, so the 10-day return will also be normally distributed with a mean and variance equal to 10 times the daily mean and variance. If we assume that the mean return is zero, VaR is proportional to the standard deviation of the returns. In this case the 10-day VaR is √10 times the one-day VaR.4

There is one final variation that is often applied to both methodologies for computing VaR. As any market watcher knows, there are times when the markets are volatile and times when they are quiet. We expect more risk in volatile markets and should adjust the VaR calculation accordingly. In the model-based approach this is done by using a scheme to calculate variance that weights more recent observations more heavily, so that the computed variance is more reflective of current conditions and less reflective of the events of two years ago.
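The horizon scaling and the volatility-weighting idea above can both be sketched briefly. The $1.35 one-day VaR is the article's zero-mean figure; the decay factor of 0.94 and the sample returns are assumptions for illustration, not values from the article.

```python
import math

# 1. Horizon scaling: with a zero mean and independent daily returns,
#    the 10-day VaR is sqrt(10) times the one-day VaR.
one_day_var = 1.35                          # the article's zero-mean one-day VaR
ten_day_var = math.sqrt(10) * one_day_var   # about $4.27

# 2. Volatility weighting: an exponentially weighted moving average blends
#    yesterday's variance estimate with the square of today's return.
#    lam = 0.94 is an assumed decay factor (a common RiskMetrics-style choice).
lam = 0.94
var_est = 0.02 ** 2                         # start from 2% daily volatility
for r in [0.01, -0.03, 0.02, -0.04]:        # a quiet day, then noisier ones
    var_est = lam * var_est + (1 - lam) * r ** 2
vol_est = math.sqrt(var_est)

print(f"10-day VaR: ${ten_day_var:.2f}")
print(f"updated daily volatility: {vol_est:.2%}")
```

Because the recent large returns get the most weight, the volatility estimate rises toward current market conditions instead of averaging them away over two years of history.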
The schemes that are used vary from exponentially weighted moving averages, in which today's variance estimate is a weighted average of yesterday's variance estimate and the square of today's return,5 to complicated statistical models such as ARCH (auto-regressive conditional heteroskedasticity). The historical simulation approach is modified by doing a similar variance calculation and then scaling each historical observation up or down by the ratio of the currently estimated standard deviation to the estimated standard deviation that applied on the day the observation occurred.6 Modifying either approach in this way significantly improves the effectiveness of the VaR calculation.

## The Complications in Computing VaR

In the previous section we discussed the basic method for calculating VaR for a single share of stock. As can be seen, the ideas underlying the calculations and the basic methods used are extremely simple. In this section we will discuss some of the difficulties that occur when this basic approach is extended to real portfolios. These problems are generally related to data management issues for large portfolios.
The first complication that arises in computing VaR for a portfolio is tied to the amount of historical data that is required. Every addition to the portfolio requires a historic data series, either to carry out the historic simulation or from which to estimate a model. This means that for a realistic portfolio containing hundreds or thousands of assets, a huge amount of historical data has to be accumulated. Further, we must now consider the relationship between all of the assets in our portfolio. It is likely that losses on one asset will be offset to some extent by gains on another asset. In the historical simulation approach this is handled automatically. We apply all the returns that were observed t days ago to the current portfolio, for t = 1 to 500. The extent to which these returns either augmented or offset each other on any one day will be reflected in the computed results. For the model-based approach it is necessary to compute the covariance or correlation between every pair of assets. If we have n assets there will be n(n - 1) / 2 of these correlations. The actual estimation of these correlations is usually difficult. Often the correlation between asset returns is approximated by the use of some sort of index model of asset returns. In this framework there are one or more common factors that drive asset returns, and all the correlation between returns can be attributed to these common factors. This simplifies the correlation structure but only approximately captures the true relationships between assets.

For large institutions with portfolios spread around the globe it is a challenge to aggregate the information necessary to carry out the VaR calculation on a timely basis. As one risk manager said, "If I just knew our position in French bonds at the end of the day my job would be much easier." Associated with this is the problem of ensuring that all positions are accurately captured every day.
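In the model-based approach described above, the pairwise covariances fold into a single portfolio variance. Here is a minimal sketch with made-up dollar positions and a made-up daily covariance matrix; only the zero-mean, 2.33-standard-deviation structure comes from the article.

```python
import numpy as np

# Dollar holdings in three hypothetical assets. With n = 3 assets there are
# n*(n - 1)/2 = 3 distinct correlations hidden in the off-diagonal entries.
positions = np.array([1000.0, 2000.0, 1500.0])
daily_cov = np.array([              # illustrative daily return covariances
    [4.0e-4, 1.0e-4, 0.5e-4],
    [1.0e-4, 2.5e-4, 0.8e-4],
    [0.5e-4, 0.8e-4, 9.0e-4],
])

# Portfolio dollar standard deviation: sqrt(w' * Cov * w), where the
# off-diagonal terms capture how losses on one asset offset gains on another.
portfolio_std = np.sqrt(positions @ daily_cov @ positions)
var_99 = 2.33 * portfolio_std       # zero-mean 1% one-day VaR

print(f"1-day 99% portfolio VaR: ${var_99:.2f}")
```

Note that the diversification benefit appears automatically: if the off-diagonal covariances were larger, the portfolio VaR would exceed this figure even with the same individual asset variances.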
This is just a technological problem, one that is often related to older systems that are difficult to integrate into newer information systems. Nonetheless, it is a problem that occupies much time when a VaR system is being implemented.
Normally we do not worry too much about how much time it takes a computer to do some calculations. However, when a VaR system is implemented, calculation time and computer power become an issue. This is particularly the case for the historical simulation approach to calculating VaR. Consider the case in which it takes one minute to generate the mark-to-market value of your portfolio at the end of the day. This would not normally be considered a significant issue. However, if a historical simulation VaR system is implemented in which 500 different scenarios must be assessed, the computation time is now 500 minutes, or more than eight hours. If it takes only one second to generate the mark-to-market portfolio values, the VaR calculation will take about eight minutes. The only ways to improve computation time are to invest in more, faster hardware or to try simpler methods for calculating the portfolio value. As an illustration of the type of simplification that might be made, consider a portfolio of corporate bonds. One way of dealing with this portfolio would be to treat the bonds as we might treat equities, modeling how each bond price changes independently. This will capture all the credit quality characteristics of each bond quite accurately. However, the data available for corporate bonds may be sparse, so we may not have reliable statistical information about how all the bond prices might change. A simplification would be to use a simple term structure model to price the bonds. The term structure model might include 3-, 6-, and 12-month discount rates as well as 1-, 2-, 3-, 5-, and 10-year discount rates. By modeling how each of these eight rates can change and then computing the bond prices as the present value of coupon and principal payments, we can generate approximate bond values quite quickly. Unfortunately, introducing simplifications into the portfolio valuation calculation leads to problems of computational accuracy.
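The term-structure simplification amounts to repricing every bond off a small grid of rates. A sketch, assuming annually compounded zero rates; the rates and the bond are made up, and only four grid points are shown rather than the article's eight.

```python
# Hypothetical annually compounded zero rates at a few grid maturities
# (a real system would interpolate between its 8 grid points).
zero_rates = {1: 0.050, 2: 0.052, 3: 0.054, 5: 0.057}   # year -> rate

def bond_price(face: float, coupon_rate: float, years: int) -> float:
    """Price a bond as the present value of its coupon and principal payments."""
    price = 0.0
    for t in range(1, years + 1):
        cash_flow = face * coupon_rate + (face if t == years else 0.0)
        price += cash_flow / (1 + zero_rates[t]) ** t
    return price

print(f"3-year 6% bond: {bond_price(100.0, 0.06, 3):.2f}")
```

To run a historical simulation one would shock the handful of grid rates rather than each bond price, so revaluing thousands of bonds reduces to a few present-value sums per scenario.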
## Computational Accuracy

As with computation time, computational accuracy is not normally an issue that causes much concern. However, when VaR is being calculated it is sometimes a problem. Often institutions purchase external vendor systems for calculating VaR. These systems are very sophisticated tools, but it is often the case that the methods they implement for valuing the portfolio are not exactly the same as the methods used by the end-of-day profit-and-loss reporting system. A similar problem arises if some sort of simplification is used to value the portfolio. If the VaR calculation does not agree with the system of record, it is not clear that it is producing meaningful results. If the VaR system is consistent in its errors, then the worst-case change in P&L it forecasts is reliable. For example, if the VaR system always overstates the portfolio value by some set amount, it is overstating both the current value and the forecast value, so the forecast change in value is correct. More typically, however, there is a certain amount of randomness in the errors, caused by day-to-day changes in the portfolio composition. As a result, substantial effort must be made to determine whether the results are meaningful.
To understand when a VaR calculation is useful we must look at the assumptions underlying it. The premise underlying VaR is that there is a set of random variables that affect the value of our portfolio and that it is possible to determine the future statistical properties of these variables. In our first example our portfolio contained one share. In that case the random variable was the stock price itself. We estimated the statistical properties by examining how the stock price had behaved in the past. In the historical simulation case we assumed that the only possible future outcomes were those found in the last 500 observed returns. In the model implementation we assumed that we could use the last 500 observations to determine the mean and variance of future stock price changes. We further assumed that future changes would be drawn from a normal distribution.

VaR seems most useful as a risk measure in liquid markets. Invariably we end up assuming that the future will be like the past and that we can observe the past sufficiently clearly to determine its statistical properties. These assumptions seem to work well in liquid markets that are actively traded. For instance, we have enough observations to show that the statistical properties of stock prices do not change too much over time, and that the changes that occur seem in general to be fairly gradual. Thus, we can have reasonable confidence that recent stock price history allows us to forecast the near-term statistical behavior of stock prices.7 In liquid, active markets we can use VaR to measure how much risk or exposure we are willing to take on. It provides a reasonably good measure of the likely worst-case loss that we will face, and if this loss occurs we can use the market liquidity to terminate further losses. In less liquid markets VaR does not seem to be as useful a risk measure, for several reasons.
It is usually not possible to observe enough transactions to determine what the statistical properties of the market are. For instance, in the real estate market there are quite a few transactions, but liquidity is not high and each transaction is to some extent unique. For any particular real estate asset, not only do we not know how the price may vary over time, but we may be reasonably uncertain about the current value of the asset. Even if we knew the statistical properties of the market, the lack of liquidity means that it might take several months to sell our portfolio. In this case we would have to forecast our loss limits over this much greater time span. However, as we try to predict farther into the future our predictions become much more uncertain. The way prices behave may change slowly, but it does change. Three weeks from now the market may be much more or less volatile than it is now.

In summary, VaR seems to be most useful as a risk control for actively managed portfolios in liquid markets. In these circumstances it provides a plausible measure of how large the losses might be before mitigating action can be instigated. Even in these circumstances VaR is not an infallible risk measure. There are rare occasions when the nature of the market changes drastically in a very short space of time. The dramatic one-day decline of global equity markets in October 1987 is a good illustration of such an event. The risk of this type of event is not captured by VaR measures. Often, the VaR analysis is augmented by stress scenarios of this type to measure the catastrophic risks that a portfolio faces.
For a more in-depth discussion of the technical issues involved in computing value at risk, the following two sources are a good starting point: the J.P. Morgan RiskMetrics website, and *Value at Risk: The New Benchmark for Controlling Market Risk*, by Philippe Jorion, McGraw-Hill, 1997.

Alan White is a professor at the Joseph L. Rotman School of Management, University of Toronto.