Archive for the ‘Interesting External Papers’ Category

US Corporate Bond Spreads: Taxation Effects

Saturday, February 21st, 2009

An interesting paper by Edwin J. Elton, Martin J. Gruber, Deepak Agrawal & Christopher Mann, Explaining the Rate Spread on Corporate Bonds. The authors’ thesis is:

Spreads in rates between corporate and government bonds differ across rating classes and should be positive for each rating class for the following reasons:
1. Expected default loss—some corporate bonds will default and investors require a higher promised payment to compensate for the expected loss from defaults.
2. Tax premium—interest payments on corporate bonds are taxed at the state level whereas interest payments on government bonds are not.
3. Risk premium—The return on corporate bonds is riskier than the return on government bonds, and investors should require a premium for the higher risk. As we will show, this occurs because a large part of the risk on corporate bonds is systematic rather than diversifiable.

… and they conclude …

Several findings are of particular interest. The ratings of corporate bonds, whether provided by Moody’s or Standard and Poor’s, provide material information about spot rates. However, only a small part of the spread between corporate and treasuries and the difference in spreads on bonds with different ratings is explained by the expected default loss. For example, for 10-year A-rated industrials, expected loss from default accounts for only 17.8 percent of the spread.

Differential taxes are a more important influence on spreads. Taxes account for a significantly larger portion of the differential between corporate and treasuries than do expected losses. For example, for 10-year A-rated bonds, taxes accounted for 36.1 percent of the difference compared to the 17.8 percent accounted for by expected loss. State and local taxes are important because they are paid on the entire coupon of corporate bonds, not just on the difference in coupon between corporate and treasuries. Despite the importance of the state and local taxes in explaining return differentials, their impact has been ignored in almost all prior studies of corporate rates.

Even after we account for the impact of default and taxes, there still remains a large part of the differential between corporate and treasuries that remains unexplained. In the case of 10-year corporates, 46.17 percent of the difference is unexplained by taxes or expected default. We have shown that the vast majority of this difference is compensation for systematic risk and is affected by the same influences that affect systematic risks in the stock market. Making use of the Fama–French factors, we show that as much as 85 percent of that part of the spread that is not accounted for by taxes and expected default can be explained as a reward for bearing systematic risk.

The assumption embedded in their argument is that the marginal US corporate bond buyer is taxable.

The authors claim:

Because the marginal tax rate used to price bonds should be a weighted average of the active traders, we assume that a maximum marginal tax rate would be approximately the midpoint of the range of maximum state taxes, or 7.5 percent. In almost all states, state tax for financial institutions (the main holder of bonds) is paid on income subject to federal tax. Thus, if interest is subject to maximum state rates, it must also be subject to maximum federal tax, and we assume the maximum federal tax rate of 35 percent.
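The arithmetic behind that assumption is quick to sketch. Only the 7.5% and 35% rates and the 10-year A-rated decomposition come from the paper; treating state tax as deductible against federal tax follows the quote, and the code itself is merely illustrative:

```python
# State tax on corporate coupons is deductible against federal tax,
# so the effective extra tax rate on corporates vs. governments is
# t_state * (1 - t_federal).
t_state, t_federal = 0.075, 0.35
effective_state_rate = t_state * (1 - t_federal)
print(f"effective state tax drag: {effective_state_rate:.4%}")  # 4.8750%

# The paper's 10-year A-rated example: shares of the total spread.
shares = {"expected default loss": 0.178,
          "tax premium": 0.361,
          "systematic risk premium": 0.462}
assert abs(sum(shares.values()) - 1.0) < 0.01
```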

Why Banks Failed the Stress Test

Thursday, February 19th, 2009

Andrew G Haldane: Why banks failed the stress test – Speech by Mr Andrew G Haldane, Executive Director, Financial Stability, Bank of England, at the Marcus-Evans Conference on Stress-Testing, London, 9-10 February 2009.

It’s wonderful! We’ll start with sigma-rigging:

Back in August 2007, the Chief Financial Officer of Goldman Sachs, David Viniar, commented to the Financial Times:

“We are seeing things that were 25-standard deviation moves, several days in a row”

To provide some context, assuming a normal distribution, a 7.26-sigma daily loss would be expected to occur once every 13.7 billion or so years. That is roughly the estimated age of the universe.

A 25-sigma event would be expected to occur once every 6 x 10^124 lives of the universe. That is quite a lot of human histories. When I tried to calculate the probability of a 25-sigma event occurring on several successive days, the lights visibly dimmed over London and, in a scene reminiscent of that Little Britain sketch, the computer said “No”.
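Those recurrence times follow from the normal tail in a few lines. This is a sketch assuming i.i.d. normal daily returns and 365 calendar days per year; `years_between` is my own name for the helper:

```python
from math import sqrt, erfc

def years_between(sigma: float, days_per_year: int = 365) -> float:
    """Expected number of years between daily losses of at least
    `sigma` standard deviations, assuming i.i.d. normal daily returns."""
    p = 0.5 * erfc(sigma / sqrt(2.0))  # one-sided normal tail probability
    return 1.0 / (p * days_per_year)

# ~1.4e10 years: roughly the 13.7 billion Haldane quotes
print(f"7.26-sigma: every {years_between(7.26):.3g} years")
# ~6e124 universe lifetimes (at 1.37e10 years each)
print(f"25-sigma:   every {years_between(25.0) / 1.37e10:.3g} universe lifetimes")
```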

… and proceed to …

A few years ago, ahead of the present crisis, the Bank of England and the FSA commenced a series of seminars with financial firms, exploring their stress-testing practices. The first meeting of that group sticks in my mind. We had asked firms to tell us the sorts of stress which they routinely used for their stress-tests. A quick survey suggested these were very modest stresses. We asked why. Perhaps disaster myopia – disappointing, but perhaps unsurprising? Or network externalities – we understood how difficult these were to capture?

No. There was a much simpler explanation according to one of those present. There was absolutely no incentive for individuals or teams to run severe stress tests and show these to management. First, because if there were such a severe shock, they would very likely lose their bonus and possibly their jobs. Second, because in that event the authorities would have to step-in anyway to save a bank and others suffering a similar plight.

All of the other assembled bankers began subjecting their shoes to intense scrutiny.

You don’t build a career by telling your boss what he doesn’t want to hear. This is why regulatory capital charges must be progressive, so that larger firms are more conservatively capitalized than smaller.

Update, 2010-8-5: See also FTU.PR.A Provides 11-Sigma Update … but remember WFS.PR.A

Bank of Canada Releases Winter '08-'09 Review

Thursday, February 19th, 2009

The Bank of Canada has announced the release of the Bank of Canada Review of Winter 2008-2009.

Of most interest – to me! – was a paper on the value of information in the FX market, The Role of Dealers in Providing Interday Liquidity in the Canadian-Dollar Market by Chris D’Souza, which concluded:

Overall, results suggest that the relationship between the positions of commercial clients and market-makers, and the role played by dealers in interday liquidity provision, has been understated. There is considerable evidence that not all customer trades are equal. In particular, market-makers are quick to provide liquidity to [Foreign Domiciled] customers, possibly in an attempt to capture any fundamental information contained in these trades. Over time, dealers will off-load their positions to commercial clients as the information becomes stale, or as the risks associated with holding these undesired balances becomes too costly.

Other papers were:

  • Merchants’ Costs of Accepting Means of Payment: Is Cash the Least Costly?
  • The Market Impact of Forward-Looking Policy Statements: Transparency vs. Predictability
  • Conference Summary: International Experience with the Conduct of Monetary Policy under Inflation Targeting

Dudley Speaks on TIPS

Sunday, February 15th, 2009

William C Dudley, President and Chief Executive Officer of the Federal Reserve Bank of New York, gave a speech at the Federal Reserve Bank of New York Inflation-Indexed Securities and Inflation Risk Management Conference, New York on 10 February 2009.

He addressed the problems of ex-ante vs. ex-post calculations of the costs of TIPS issuance:

Over the long run – and I mean the very long run – there should be roughly as many downward surprises in inflation performance as upward surprises. But within any relatively short period, such as the last decade, this certainly does not need to be the case. In other words, over such a short period, the outcome of an ex-post analysis can be heavily influenced by which of the two sides – the Treasury or investors – was the lucky recipient of the net inflation surprise that occurred over the period in question. For example, in countries such as the United Kingdom, where inflation declined following the inception of an inflation-linked debt program, ex-post studies generally suggest that these programs have reduced financing costs for these countries.
The fact that the Treasury saved or lost money ex-post is thus not a very reliable guide as to whether the strategic decision to implement a TIPS program has been a good idea. The relevant question is whether the Treasury obtained the financing it needed at a lower ex-ante cost. If the experiment were to be run thousands of times drawing from the underlying distribution of possible inflation outcomes, would Treasury’s costs have been lower, on average, with TIPS or with nominal Treasuries? To conclude on the basis of one coin flip or roll of the dice as ex-post analysis essentially does surely is not the best way to evaluate the respective costs of TIPS issuance versus nominal Treasuries.
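Dudley's "thousands of coin flips" can be run as a toy simulation. All the numbers here are invented except the ~47bp inflation risk premium cited elsewhere in the speech; the point is only that one realized inflation draw decides the ex-post winner, while averaging over many draws is what the ex-ante question asks:

```python
import random

# Toy model: nominal yields embed an inflation risk premium; TIPS
# yields embed an illiquidity premium; realized inflation decides
# each individual draw. All figures hypothetical.
random.seed(1)
real_yield = 0.02
expected_inflation, inflation_vol = 0.025, 0.01
inflation_risk_premium = 0.0047   # echoing the ~47bp estimate in the speech
illiquidity_premium = 0.0030      # assumed

nominal_cost = real_yield + expected_inflation + inflation_risk_premium

draws = [random.gauss(expected_inflation, inflation_vol) for _ in range(100_000)]
tips_costs = [real_yield + illiquidity_premium + pi for pi in draws]
avg_tips_cost = sum(tips_costs) / len(tips_costs)

single_draw_cost = tips_costs[0]   # the one "coin flip" ex-post analysis sees
print(f"nominal: {nominal_cost:.4%}  TIPS ex-ante: {avg_tips_cost:.4%}  "
      f"TIPS one draw: {single_draw_cost:.4%}")
```

With these assumed premia, TIPS win on average (47bp of risk premium saved exceeds 30bp of illiquidity premium paid), but any single draw can favour either side.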

… and examined the factors affecting TIPS pricing:

There are two primary factors underlying the relative cost differences:
1) the compensation investors require to hold a security that is less liquid than its nominal counterpart, termed the illiquidity premium, and
2) the insurance value they attach to obtaining protection against inflation risk, known as the inflation risk premium.

[footnote]In addition to these primary factors, TIPS yields also reflect the taxation difference between TIPS and nominal issues, the convexity difference between real and nominal yields and the price of the embedded deflation floor.

… and refers to some Fed analysis:

To determine the impact of the illiquidity premium and inflation risk premium on these results, we decomposed our ex-ante analysis, comparing the breakeven rate of inflation excluding the illiquidity premium in TIPS yields to the SPF forecast. This comparison yields an estimate of the premium investors were willing to pay for inflation protection at previous TIPS auctions. We found an average risk premium estimate of 47 basis points over our sample period. This suggests that the TIPS program does satisfy a real demand that is not met by nominal Treasuries.

It also suggests that if the Treasury were to take steps to shrink the illiquidity premium by, for example, improving secondary market trading in TIPS, this would shift the cost-benefit analysis more firmly in TIPS direction.

[Footnote] We used the illiquidity premium in TIPS yields estimated in D’Amico, Kim and Wei (2008). D’Amico, Kim and Wei calculated the liquidity component for five- and ten-year TIPS yields, which we used to adjust the auction prices for 5- and 10-year TIPS issues. For twenty- and thirty-year TIPS issues, we assumed that the liquidity component is equal to the component for a ten-year security, which in the event that these securities are less liquid than the ten-year note, understates this effect and thus underestimates the risk premium at this horizon. For further information, see Dudley, Roush and Steinberg Ezer (2008).
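The decomposition the Fed describes can be put in numbers. Only the ~47bp target comes from the speech; the yields and the 30bp illiquidity premium below are invented for illustration. The illiquidity premium depresses breakevens, so it is added back before subtracting the survey forecast:

```python
# Hypothetical yields illustrating the Fed's decomposition.
nominal_yield = 0.0450
tips_yield = 0.0183
illiquidity_premium = 0.0030        # assumed, in the spirit of D'Amico-Kim-Wei
spf_expected_inflation = 0.0250     # survey (SPF) forecast

breakeven = nominal_yield - tips_yield   # 2.67%, depressed by TIPS illiquidity
inflation_risk_premium = breakeven + illiquidity_premium - spf_expected_inflation
print(f"implied inflation risk premium: {inflation_risk_premium * 1e4:.0f}bp")  # 47bp
```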

… and refers to some external studies of extremely hard to quantify benefits:

A few studies have found that an increase in supply in a particular segment of the Treasury yield curve has contributed to a rise in yields. As a result, by issuing securities in a segmented TIPS market, the Treasury may keep realized yields on bill and nominal coupon securities lower than they otherwise would have been.

and, importantly, hints at a process involving larger issues of longer dated TIPS:

I would be willing to make two modest suggestions here. First, it may make sense to emphasize longer-dated TIPS issuance rather than shorter-dated issuance. Analytically, the logic goes as follows. Inflation uncertainty is likely to increase at longer time horizons. Thus, investors are likely to pay a greater premium for inflation protection at longer-time horizons. This implies that the cost savings associated with TIPS are likely to be greater for longer maturities rather than shorter maturities.

This prediction is supported by empirical studies that have examined the premium that investors pay for inflation protection both in the United States and elsewhere. For example, a study by Brian Sack of Macroeconomic Advisors finds that forward breakeven inflation rates increase as maturity lengthens. In contrast, the level of survey-based measures of inflation expectations is quite constant beyond a time horizon of a few years. This means that the difference between forward breakeven inflation and inflation expectations climbs as the time horizon extends. This strongly suggests that the premium investors pay for inflation protection increases as maturities lengthen.

Second, it may make sense to structure the TIPS program in a way that would help reduce the illiquidity premium associated with TIPS relative to on-the-run nominal Treasuries. Some of the current illiquidity premium is likely to shrink as financial markets stabilize. However, further improvements may require a change in either the structure of the TIPS program or the secondary market trading environment.
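The logic of the first suggestion is easily sketched. All numbers below are invented; the shape (forward breakevens climbing with horizon while survey expectations stay flat, per the Sack study) is what matters:

```python
# Stylised term structure: implied inflation risk premium by horizon.
forward_breakeven = {5: 0.0255, 10: 0.0270, 20: 0.0290, 30: 0.0305}
survey_expectation = 0.0250          # roughly constant beyond a few years

premium_bp = {h: (be - survey_expectation) * 1e4
              for h, be in forward_breakeven.items()}
for h in sorted(premium_bp):
    print(f"{h:>2}y horizon: implied risk premium {premium_bp[h]:.0f}bp")
```

The premium rises with maturity, which is why the cost savings from TIPS issuance should be greatest at the long end.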

The notion that issuance of 5-Year TIPS might be halted has been discussed on PrefBlog, as has a BoE Working Paper on the term-structure of inflation indexed bonds.

Calomiris on Regulatory Reform

Thursday, February 12th, 2009

Charles W. Calomiris of Columbia University has been mentioned on PrefBlog before, most recently on September 23. He has just posted a piece on VoxEU, Financial Innovation, Regulation and Reform that is thoughtful enough to deserve a thorough review.

He suggests that the current crisis is due to

  • the Fed’s easy monetary policy in 2002-05
  • official encouragement for sub-prime lending
  • restrictions on bank ownership and
  • ineffective prudential regulation

To fix this, he suggests a six-point plan.

The first point addresses the measurement of risk. He states:

If subprime risk had been correctly identified in 2005, the run-up in subprime lending in 2006 and 2007 could have been avoided.

The essence of the solution to this problem is to bring objective information from the market into the regulatory process, and to bring outside (market) sources of discipline in debt markets to bear in penalising bank risk-taking. These approaches have been tried with success outside the US, and they have often worked.

With respect to bringing market information to bear in measuring risk, one approach to measuring the risk of a loan is to use the interest rate paid on a loan as an index of its risk. Higher-risk loans tend to pay higher interest. Argentine bank capital standards introduced this approach successfully in the 1990s by setting capital requirements on loans using loan interest rates (Calomiris and Powell 2001). If that had been done with high-interest subprime loans, the capital requirements on those loans would have been much higher. Another complementary measure would be to use observed yields on uninsured debts of banks, or their credit default swaps, to inform supervisors about the overall risk of an institution.

Laudable objectives, but these are unworkable in practice.

Firstly, it is a lot easier to look back at sub-prime and say how nutty it was than it is to identify a bubble when you’re in the middle of it. See Making Sense of Subprime.

Secondly, it’s extremely procyclical. Say we have a bank that accepts deposits, but puts its money to work by buying corporate bonds. In good times, spreads will be narrow, they will be making money and capital requirements will be small. In bad times, spreads will widen, losing them money and increasing capital requirements at the same time. This is not a good recipe.
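The procyclicality worry can be made concrete with a stylised sketch (all numbers invented): if the capital charge scales with current spreads, the requirement is lowest in booms and jumps exactly when spread-widening is inflicting mark-to-market losses.

```python
def required_capital(holdings_mm: float, spread_bp: float,
                     charge_per_bp: float = 0.0002) -> float:
    """Capital ($mm) when the charge is proportional to the current spread."""
    return holdings_mm * spread_bp * charge_per_bp

bonds_mm, duration = 10_000.0, 7.0                   # $10bn portfolio
boom = required_capital(bonds_mm, spread_bp=100)     # narrow spreads
bust = required_capital(bonds_mm, spread_bp=400)     # spreads gap wider

# Mark-to-market loss from the widening, with ~7 years of spread duration
mtm_loss_mm = bonds_mm * duration * (400 - 100) * 1e-4
print(f"capital: ${boom:.0f}mm -> ${bust:.0f}mm while losing ${mtm_loss_mm:.0f}mm")
```

The bank's requirement quadruples at the very moment it has lost $2.1-billion.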

The Argentine approach may address the procyclicality angle, but this is not apparent from the essay. Dr. Calomiris needs to address this point head-on.

The second point is macro-prudential regulation triggers. Dr. Calomiris suggests that some form of countercyclical regulatory environment is desirable:

Borio and Drehmann (2008) develop a practical approach to identifying ex ante signals of bubbles that could be used by policy makers to vary prudential regulations in a timely way in reaction to the beginning of a bubble. They find that moments of high credit growth that coincide with unusually rapid stock market appreciation or unusually rapid house price appreciation are followed by unusually severe recessions. They show that a signalling model that identifies bubbles in this way (i.e., as moments in which both credit growth is rapid and one or both key asset price indicators is rising rapidly) would have allowed policy makers to prevent some of the worst boom-and-bust cycles in the recent experience of developed countries. They find that the signal-to-noise ratio of their model is high – adjustment of prudential rules in response to a signal indicating the presence of a bubble would miss few bubbles and would only rarely signal a bubble in the absence of one.

I think it’s entirely reasonable to adjust risk-weightings based on the age of the facility. Never mind macro-prudential considerations; I suspect that new relationships are inherently riskier than old ones, even in the absence of a growing balance sheet.

The third point is a desire for disaster planning by each institution to include pre-approved bail-out plans for “too big to fail” (TBTF) banks. I have problems with this. Lehman, for instance, may now be clearly seen as having been too big to fail in September 2008; but if it had blown up, for whatever reason, in September 2006 it would have been no big deal. This proposal brings with it a false sense of security.

I suggest that the TBTF problem be addressed by a progressive capital charge on size. The problem with bureaucracies is that you get ahead by telling your boss whatever he wants to hear. Many of the problems we’re having stem from the fact that the big boss (and the directors) were so many layers removed from the action that it’s no wonder they suffered a little hubris. If your first $20-billion in assets required $1-billion capital and your second $20-billion required $1.5-billion, I suggest that risk/reward analysis would be simpler. For the extremely limited amount of business that genuinely requires size, the banks could simply form one-off consortia.
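The progressive charge can be sketched as a bracket schedule. The first two brackets use the numbers above ($1-billion on the first $20-billion of assets, $1.5-billion on the next $20-billion); the 10% top marginal rate is my own invented extrapolation:

```python
# (bracket width in $bn, marginal capital ratio)
BRACKETS = [(20.0, 0.05), (20.0, 0.075), (float("inf"), 0.10)]

def required_capital(assets_bn: float) -> float:
    """Capital ($bn) under a progressive, size-based charge."""
    capital, remaining = 0.0, assets_bn
    for width, rate in BRACKETS:
        slab = min(remaining, width)
        capital += slab * rate
        remaining -= slab
        if remaining <= 0:
            break
    return capital

print(required_capital(20.0))    # 1.0
print(required_capital(40.0))    # 2.5
print(required_capital(100.0))   # 8.5
```

At $100-billion of assets the average capital ratio is 8.5%, well above the 5% a $20-billion bank pays, which is the point: size itself becomes expensive.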

The fourth point is a plea for housing finance reform. Dr. Calomiris suggests that the agencies be wound up and replaced with, for instance, conditional grants to first time buyers. My own suggestions, frequently droned, are:

Americans should also be taking a hard look at the ultimate consumer friendliness of their financial expectations. They take as a matter of course mortgages that are:

  • 30 years in term
  • refinancable at little or no charge (usually; this may apply only to GSE mortgages; I don’t know all the rules)
  • non-recourse to borrower (there may be exceptions in some states)
  • guaranteed by institutions that simply could not operate as a private enterprise without considerably more financing
  • Added 2008-3-8: How could I forget? Tax Deductible

The fifth point is relaxing restrictions on bank ownership to make accountability a little more of a practical concept, and I couldn’t agree more. I will also suggest that a progressive charge on size will help in this regard.

The sixth point seeks transparency in derivatives transactions, due to perceived opacity of counterparty risk.

The problem with requiring that all OTC transaction clear through a clearing house is that this may not be practical for the most customised OTC contracts. A better approach would be to attach a regulatory cost to OTC contracts that do not clear through the clearing house to encourage, but not require, clearing-house clearing.

I sort-of agree with this. I will suggest that the major problem with counterparty risk in this episode was that the big name players (AIG and the Monolines) were able to leave their commitments uncollateralized. I suggest that uncollateralized derivative exposures should attract a significant capital charge … EL = PD * EAD * LGD, right? (That’s Basel-geek-speak for Expected Loss = Probability of Default * Exposure at Default * Loss Given Default.) Incorporate that in the capital charges and there will be little further problem with counterparty exposure.
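Applied to an uncollateralized derivative counterparty, the Basel-style expected-loss arithmetic looks like this (the exposure, default probability and loss severity below are invented for illustration):

```python
def expected_loss(pd: float, ead: float, lgd: float) -> float:
    """EL = PD * EAD * LGD: probability of default times exposure at
    default times loss given default."""
    return pd * ead * lgd

# A hypothetical $500mm uncollateralized CDS exposure to a counterparty
# with a 2% one-year default probability and 60% loss given default:
el = expected_loss(pd=0.02, ead=500.0, lgd=0.60)
print(f"expected loss: ${el:.1f}mm")  # $6.0mm, which drives the capital charge
```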

UK FSA Publishes 2009 Financial Risk Outlook

Wednesday, February 11th, 2009

The Financial Services Authority has released its Financial Risk Outlook 2009 report, a very good review of the current situation, its causes and possible effects.

I will note in passing that it’s hard to learn about these releases! The FSA restricts its eMail notifications to journalists, citing “high demand”. I don’t understand! I spend seven hours per day deleting eMail offering me many interesting pills, and the FSA can’t send me an eMail that I’ve specifically requested? This makes no sense.

One highly interesting and topical subject is the decomposition of corporate bond yields, which have been discussed many times on PrefBlog – for instance, in the post announcing BoE Releases October 2008 Financial Stability Report. According to the Financial Times:

Fears are mounting over possible dividend cuts by life assurers, after a demand from the Financial Services Authority that they hold enough capital to survive fresh market shocks.

Life assurers are being told to test whether they would have a capital buffer after what is in effect a 60 per cent reduction in equity markets from current levels, as well as a significant increase in bond defaults, according to people who are aware of the project.

In the latest tests – regarded as more extreme than the December exercise – the FSA has told companies to test whether they would have a capital buffer in the event of a 1980s-style sharp and deep recession, according to people familiar with the plans.

This includes a 20 per cent reduction in equity markets from current levels, followed by another 39 per cent decline, which would take the FTSE 100 index to around the 2,000 level.

It is also asking companies to test whether they would have a capital buffer after a further significant increase in the returns that investors demand for holding bonds that are more risky than gilts and a significant decline in property prices.

The tests represent yet another change in approach by the FSA, which has alternated between pragmatism and strict solvency demands.

“It’s like dealing with the police if they kept changing the crime laws,” added the executive.

Tergiversations by regulators are nothing new … but then, neither are flip-flops by the Financial Times, as noted in a very informative article published by The Actuary:

In October 2008, the Financial Times opined that “life assurers should not be using rising yields on corporate bonds to reduce estimates of their future liabilities. The higher yields… represent a higher risk of default and added potential costs”. However, a later article suggested that “the markets are utterly divorced from fundamental value or risks of defaults”.

The Actuary article provides a review of the Liquidity Premium from an investment standpoint:

The spread on corporate bonds over the liquid risk-free rate (for example, government bonds) represents compensation for several different factors:

A Expected default losses
B Unexpected default risk, such as default and recovery rate risk
C Mark-to-market risk, such as the risk of a fall in the market price of the bond
D Liquidity risk, such as the risk of not finding a ready buyer at the theoretical market price.

Investors concerned with the realisable value of their investment in the short-term require compensation for all these risks.

However, investors who can hold bonds to maturity need compensation only for A and B. Such investors can enjoy the premiums for C and D, and we refer to these collectively as a ‘liquidity premium’.

The traditional method for credit deductions only allowed directly for expected default losses, albeit measured on a prudent basis. The Financial Services Authority (FSA), in its September 2008 Insurance Sector Briefing, observed that insurers should allow for both expected losses and the risk of unexpected losses, although they have since deferred any recommendations until 2009.
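The Actuary's A-to-D decomposition is easy to put in numbers (all spread components below are invented; only the A/B versus C/D split comes from the article):

```python
# Hypothetical decomposition of a corporate bond spread, in basis points.
spread_components_bp = {
    "A_expected_default": 40,
    "B_unexpected_default": 50,
    "C_mark_to_market": 60,
    "D_liquidity": 70,
}
total_spread_bp = sum(spread_components_bp.values())

# A hold-to-maturity investor (an annuity writer, say) needs
# compensation only for A and B; C and D are the 'liquidity premium'
# it can capture:
liquidity_premium_bp = (spread_components_bp["C_mark_to_market"]
                        + spread_components_bp["D_liquidity"])
print(f"total spread {total_spread_bp}bp, "
      f"of which liquidity premium {liquidity_premium_bp}bp")
```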

… while pointing out the dangers of blind adherence to classical structural models:

These models are not without their issues. For example, in the Bank of England model, the residual premium on Sterling investment grade bonds fell from 155bps at 30 September 2008 to a negative premium of -9bps on 10 October, before rising to 118bps by the end of October. This is due to the use of equity market volatility to quantify credit default risk — on 10 October equity markets fell 10% with corresponding spikes in volatility, while credit markets were largely unaffected.

Nevertheless, structural models provide a valuable new tool for actuaries to quantify liquidity premiums, and also strong evidence of their existence.

The FSA does not go out of its way to talk tough on this issue, but does note:

As most life insurers hold corporate bonds to back various classes of business, analysis of market developments should be a key consideration for them in determining the discount rate used to value long-term liabilities. In setting this discount rate, most insurers make an assessment of the extent to which bond spreads can be explained by liquidity premiums, rather than the probability of default. Bond spreads have widened significantly, particularly in the third quarter of 2008, and whether insurers attribute this to an increase in the liquidity premium or an increase in credit default risk affects the value of their liabilities. Moreover, asset values will themselves be affected by the increased risk of corporate bond defaults. It is therefore critical that insurers holding corporate bond portfolios properly review underlying credit developments, in order to understand the state of their balance sheets and their capital positions.

Some insurers may have experienced difficulty with the valuation of their assets and, in particular, corporate bonds, because of the considerable reduction in market activity in many asset classes.

and provides a picture:

The FSA returns to this issue when discussing annuities:

Insurers operating predominantly or exclusively in annuity business are exposed to a concentration of longevity and credit risk. Market conditions have added to the impact of these risks, increasing the risk profile of this part of the sector. The risks arising from widening corporate bond spreads (as outlined above) are a particular issue for annuity business, in which long-term assets such as corporate bonds are used to match long-term liabilities (such as annuities in payment).

and included an exhortation to be prudent when decomposing the spreads in their “key messages to insurers”.

Also of interest was the FSA’s distinction between the much reviled “originate and distribute” model and the more precise “acquire and arbitrage” practice:

The need to support the growing levels of property and mortgage lending led to the increasing scale and size of securitised markets, and their mounting complexity were accompanied by a significant escalation in the leverage of banks, investment banks and off balance-sheet vehicles, and the growing role of hedge funds. (Chart A5 and A6) Large positions in securitised credit and related derivatives were increasingly held by banks, near banks, and shadow banks, rather than passed through to traditional hold-to-maturity investors.

Hence, the new model of securitised credit intermediation was not solely or indeed primarily one of originate and distribute. Rather, credit intermediation passed through multiple trading books in banks, leading to a proliferation of relationships within the financial sector. This ‘acquire and arbitrage’ model resulted in the majority of incurred losses falling on banks and investment banks involved in risky maturity transformation activities, rather than investors outside the banking system. This explosion of claims within the financial system resulted in financial sector balance sheets becoming of greater consequence to the economy.

IIROC Report on Short Selling Ban

Wednesday, February 4th, 2009

IIROC has released its Study on the Impact of the Prohibition on the Short Sale of Interlisted Financial Sector Issuers. This report, according to the press release:

was undertaken at the request of the Canadian Securities Administrators to examine the impact of the Ontario Securities Commission Order prohibiting short sales of certain financial sector issuers (“Restricted Financials”) on trading activity. The Order, issued on September 19, 2008, prohibited short sales of Restricted Financials listed on the Toronto Stock Exchange that were also inter-listed with exchanges in the United States. The IIROC study shows that, prior to the introduction of the prohibition on short sales, short selling activity in Restricted Financials was generally consistent with historic levels of short selling for inter-listed securities.

A big argument in favour of the short selling ban, as some may remember, was:

prevent regulatory arbitrage with respect to short selling in Ontario of the securities of Restricted Financials as a result of initiatives undertaken in the United States by the Securities and Exchange Commission (“SEC”)

Geez, you know, I’ve always thought that one of the differences between Canada and the US is that in Canada we are not subject to US law; we get to make up our own laws instead. Now, I am more than willing to agree that a little communication, if not cooperation, is always in order; and I will also agree that from time to time we will defer to our gigantic neighbor simply because they control the game; and I will also agree that in the case of interlisted securities a difference in short-selling rules could cause unknown-and-possibly-bad things to happen to market tone … but none of these rationales were addressed. In today’s brave new world it is, apparently, entirely sufficient to mumble something about “regulatory arbitrage” without the need to complicate matters by determining what is good for Canadian markets and good for Canada.

I would have met each US political blather about bonus control and criminalization of CDS trading with a political announcement that such activities in Canada were not just going to be encouraged, but actually subsidized! Let’s steal all their business, that’s what I say! But I digress.

One source of amusement is IIROC’s use of significant figures:

For example, as at August 1, 2008, the market capitalization for the Restricted Financials as a group was $291,409,251,788 ranging from a high of $62,525,252,799 for Royal Bank of Canada to a low of $138,929,587 for Thomas Weisel Partners Group Inc. with three other securities having a market capitalization of less than $400,000,000.

It is rare that I see a report containing twelve significant figures (was that at the ask, the bid, the close, the Volume-Weighted-Average-Price, or what, I wonder) and I thank IIROC for publishing these data.

The summary of findings is consistent with everything else I’ve seen:

In summary, during the Study Period:
• the issuance of the Orders did not appear to have had any appreciable effect on the price of securities of either Restricted Financials or Non-Restricted Financials (both of which performed better than the benchmark index of market performance);
• there were “unusual” levels of activity in “financial sector” securities (both the Restricted and Non-Restricted Financials) in the Pre-Order Week;
• the proportion of short sales of Restricted Financials in the Pre-Order Period was in line with, or less than, historic patterns and the levels of short selling for inter-listed securities generally;
• there was no evidence of undue short selling pressure in the Non-Restricted Financials in the Pre-Order Period (including the Pre-Order Week);
• the issuance of the Orders appeared to have had a significant impact on market quality for the trading of the Restricted Financials by:
o reducing liquidity available in the Restricted Financials, and
o increasing the “spread” for Restricted Financials as measured by the difference between the closing bid and ask prices;

However, the most delicious thing about the report is the selection of the control group – the “Non-Restricted Financials”:

The Original Temporary Order applied to the securities of the Restricted Financials, being thirteen issuers from the financial sector that are listed on the TSX and also listed on an exchange in the United States and subject to the initiatives taken by the SEC. IIROC identified seventy-seven other issuers in the financial sector, of which thirty-three securities (“Non-Restricted Financials”) had more than minimal trading activity during the Pre-Order Period.6 Appendix “A” lists the securities which are considered either a Restricted Financial or a Non-Restricted Financial.

Where possible, the analysis compares the performance of the Restricted Financials and the Non-Restricted Financials with the performance of the S&P/TSX Composite Index. Certain of the measures used in the analysis therefore weight the results for the Restricted Financials and Non-Restricted Financials by their market capitalization. Market capitalization was calculated by multiplying the issued capital of each Restricted Financial as at August 31, 2008 by the relevant price for a particular trading day.
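The weighting methodology described above is simple enough to sketch. All tickers and figures below are invented for illustration; they are not the report’s data:

```python
# Invented figures: issued shares, a day's closing price, and that day's return.
shares = {"AAA": 1_300_000_000, "BBB": 800_000_000, "CCC": 90_000_000}
price  = {"AAA": 46.50, "BBB": 57.25, "CCC": 3.10}
ret    = {"AAA": -0.012, "BBB": 0.004, "CCC": -0.030}

# Market capitalization = issued capital x price, as in the report's method.
mktcap = {t: shares[t] * price[t] for t in shares}
total  = sum(mktcap.values())

# Cap-weighted group return for the day.
group_ret = sum(mktcap[t] / total * ret[t] for t in mktcap)
print(f"{group_ret:.4%}")
```

Note that the tiny CCC position barely moves the group figure; that is the point of cap-weighting, and also why a couple of small preferred-share issues in the control group probably mattered little.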

Turning to Appendix A, we find two somewhat startling inclusions in the Non-Restricted Financials: CL.PR.B and HSB.PR.C.

The data is not presented in fine enough detail for the impact of including preferred shares in the control group to be estimated; I suspect that the effect was small. But to the extent that there was any effect at all, the preferred-share data should have been regarded as irrelevant to the question at hand.

I sympathize with IIROC with respect to the problems they faced in compiling a control group for the major financial issuers … but they might appreciate some advice from an old quant: If you don’t have the data, don’t do the analysis.

Update, 2013-11-13: Link to report updated.

NYT Piece on Value at Risk

Wednesday, January 28th, 2009

I read this a while ago … was looking for it in my “notes” (as I refer to the Interesting External Papers category of PrefBlog) … and couldn’t find it!

Anyway, Joe Nocera of the New York Times wrote an excellent feature article on Value at Risk: Risk Mismanagement.

The major point to be understood is that management of Goldman Sachs used VaR in an intelligent manner:

in December 2006, Goldman’s various indicators, including VaR and other risk models, began suggesting that something was wrong. Not hugely wrong, mind you, but wrong enough to warrant a closer look.

“We look at the P.& L. of our businesses every day,” said Goldman Sachs’ chief financial officer, David Viniar, when I went to see him recently to hear the story for myself. (P.& L. stands for profit and loss.) “We have lots of models here that are important, but none are more important than the P.& L., and we check every day to make sure our P.& L. is consistent with where our risk models say it should be. In December our mortgage business lost money for 10 days in a row. It wasn’t a lot of money, but by the 10th day we thought that we should sit down and talk about it.”

So Goldman called a meeting of about 15 people, including several risk managers and the senior people on the various trading desks. They examined a thick report that included every trading position the firm held. For the next three hours, they pored over everything. They examined their VaR numbers and their other risk models. They talked about how the mortgage-backed securities market “felt.” “Our guys said that it felt like it was going to get worse before it got better,” Viniar recalled. “So we made a decision: let’s get closer to home.”
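The discipline Viniar describes – watching daily P&L against what the models predict, and treating a run of small losses as a signal in itself – can be sketched in a few lines. This is an illustration with invented numbers, not Goldman’s actual process:

```python
def flag_loss_run(daily_pnl, run_length=10):
    """Return True if daily P&L shows `run_length` consecutive losing days."""
    run = 0
    for pnl in daily_pnl:
        run = run + 1 if pnl < 0 else 0
        if run >= run_length:
            return True
    return False

# Small, persistent losses: each day unremarkable, the run is the signal.
pnl = [2.1, -0.3, -0.5, -0.2, -0.4, -0.1, -0.6, -0.2, -0.3, -0.5, -0.4, -0.2]
print(flag_loss_run(pnl))  # True: ten losing days in a row
```

The magnitude test and the persistence test are different filters; the December 2006 episode tripped the second one even though the dollar amounts were, as Viniar says, not large.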

Various other elements of VaR and its critiques have been referenced in An Early Debate on Value at Risk.

The main problem as I see it is that VaR does not – and cannot – account for trends. If, for instance, you measure your daily VaR based on data from, say, an environment of steadily increasing real-estate prices, that tells you nothing – NOTHING! – about what happens when they decline. Especially if they decline suddenly and interact with factors not in your model, such as “jingle mail”.
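A toy historical-simulation VaR makes the point concrete. The data is simulated and the model deliberately minimal – this is not any bank’s actual practice – but it shows how a VaR calibrated entirely on a placid window is silent about any regime outside that window:

```python
import random
random.seed(1)

def historical_var(returns, level=0.99):
    """One-day historical-simulation VaR: the sample loss exceeded
    on only (1 - level) of the observed days."""
    losses = sorted(-r for r in returns)
    idx = min(int(level * len(losses)), len(losses) - 1)
    return losses[idx]

# 500 days of gently rising prices: small mean gain, small volatility.
calm_window = [random.gauss(0.0005, 0.005) for _ in range(500)]
var_99 = historical_var(calm_window)
print(f"99% VaR from the calm window: {var_99:.2%}")  # on the order of 1%

# A crash day from a different regime dwarfs anything in the sample.
crash = -0.08
print(crash < -var_99)  # the model gave no warning of a move this size
```

The model is not wrong about the window it saw; it simply has no opinion about the window it didn’t, which is exactly when you need one.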

And the other problem is – as Taleb appears to have made a career out of saying – fat tails and black swans.

IMF Releases Global Financial Stability Report Update

Wednesday, January 28th, 2009

The International Monetary Fund has released its January 28 Update to the October Global Financial Stability Report:

Financial markets worldwide reflect ongoing deleveraging pressures amidst a deepening economic downturn. In spite of extensive policies, the global financial system remains under intense stress. Moreover, worsening economic conditions are producing new, large writedowns for financial institutions. In response, balance sheets are being cut back through asset sales and the retiring of maturing credits. These actions have increased downward pressure on asset prices and reduced credit availability. Restoring financial sector functionality and confidence are necessary elements of economic recovery. However, more aggressive actions by both policymakers and market participants are needed to ensure that the necessary deleveraging process is less disorderly. A broad three-pronged approach—including liquidity provision, capital injections, and disposal of problem assets—should be implemented fully and quickly so as to encourage balance sheet cleansing. At the same time, international cooperation will be required to ensure the policy coherence and consistency needed to re-establish financial stability.

They suggest:

International cooperation on a common framework for financial policies should receive high priority. The application of substantially different conditions when supporting financial institutions should be avoided in order to prevent unintended consequences that may arise from competitive distortions and regulatory arbitrage. International coordination is also needed to avoid excessive “national bias,” whereby domestic institutions are favored or local credit provision is encouraged, to the detriment of other countries. A more consistent insolvency framework for financial institutions would also help.

An Early Debate on Value at Risk

Tuesday, January 27th, 2009

I found The Jorion-Taleb Debate from 1997 to be most interesting – particularly Taleb’s comment:

Banks have the ingrained habit of plunging headlong into mistakes together where blame-minimizing managers appear to feel comfortable making blunders so long as their competitors are making the same ones. The state of the Japanese and French banking systems, the stories of lending to Latin America, the chronic real estate booms and busts, and the S&L debacle provide us with an interesting cycle of communal irrationality. I believe that the VAR is the alibi bankers will give shareholders (and the bailing-out taxpayer) to show documented due diligence, and will express that their blow-up came from truly unforeseeable circumstances and events with low probability – not from taking large risks they did not understand. But my sense of social responsibility will force me to point my finger menacingly. I maintain that the due-diligence VAR tool encourages untrained people to take misdirected risk with shareholders’, and ultimately the taxpayers’, money.

There’s also a debate between David Einhorn & Aaron Brown from the summer of 2008 that is of great interest.

And a straight line from CAPM through VaR to Basel II is drawn by Kaplanski, Guy and Levy, Haim, “Value-at-Risk Capital Requirement Regulation and Asset Allocation: A Mean-Variance Analysis” (August 2007). Available at SSRN: http://ssrn.com/abstract=1081288