Regulation

Dickson speaks against IFRS Exposure Draft

Julie Dickson spoke against the IFRS Exposure Draft on Insurance Contracts in a speech at the 2010 Life Insurance Invitational Forum:

On the “positive side”, the new approach might better capture financial risks of companies, particularly equity and interest rate risks, and thus provide more early warnings of risks. On the “negative” side, the discount rate change could potentially lead to extreme earnings volatility especially given the large blocks of long-duration guaranteed product liabilities on the books of Canadian insurance companies. As such, we think the proposals may go too far in terms of capturing short-term interest rate movements on long-term exposures. Consequently, we are working on options to help deal with this issue.

In fact we are encouraged by recent developments in this regard. One such development is that the IASB’s Insurance Working Group is meeting later this week to discuss possible ways to minimize the effects of any inappropriate volatility. This group’s objective is to analyze accounting issues relating to insurance contracts. The group brings together a wide range of interests and includes senior financial executives who are involved in financial reporting. Other developments closer to home are discussions by the Canadian Accounting Standards Board’s Insurance Accounting Task Force and the Canadian Institute of Actuaries group to develop their comment letters to the IASB. Both these groups are discussing the volatility issue.

OSFI is committed to continuing to work with industry and other international stakeholders as we complete our response to the IASB, which is due November 30th. We encourage the industry to contribute to this work; the more that we work together, the better the result will be.
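To see why a market-based discount rate bites so hard on long-duration guarantees, consider a minimal sketch (my own, with purely hypothetical numbers): a single guaranteed payment thirty years out gains roughly a third of its present value when the discount rate drops by one point, and under the proposed approach that swing goes straight through earnings.

```python
# Hypothetical illustration: rate sensitivity of a single long-duration
# guaranteed payment. The numbers are made up for illustration only.

def present_value(cash_flow, rate, years):
    """Present value of one guaranteed payment due `years` years from now."""
    return cash_flow / (1.0 + rate) ** years

guarantee = 1_000.0   # guaranteed payment, $ millions (hypothetical)
maturity = 30         # a long-duration block

pv_at_5pct = present_value(guarantee, 0.05, maturity)
pv_at_4pct = present_value(guarantee, 0.04, maturity)

print(f"PV at 5%: {pv_at_5pct:,.1f}")   # ~231.4
print(f"PV at 4%: {pv_at_4pct:,.1f}")   # ~308.3
print(f"Liability jump from a 1% rate drop: {pv_at_4pct / pv_at_5pct - 1:.1%}")  # ~33%
```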

See also the Canadian Life and Health Insurance Association comment letter, discussed briefly in the post SLF Coy on Capital Rule Changes.

Regulation

The Flash Crash: The Impact of High Frequency Trading on an Electronic Market

Themis Trading refers me to a comment letter from R T Leuchtkafer, which in turn refers me to an excellent paper by Andrei A. Kirilenko, Albert S. Kyle, Mehrdad Samadi and Tugkan Tuzun titled Flash Crash: The Impact of High Frequency Trading on an Electronic Market.

We define Intermediaries as those traders who follow a strategy of buying and selling a large number of contracts to stay around a relatively low target level of inventory. Specifically, we designate a trading account as an Intermediary if its trading activity satisfies the following two criteria. First, the account’s net holdings fluctuate within 1.5% of its end of day level. Second, the account’s end of day net position is no more than 5% of its daily trading volume. Together, these two criteria select accounts whose trading strategy is to participate in a large number of transactions, but to rarely accumulate a significant net position.

We define High Frequency Traders as a subset of Intermediaries, who individually participate in a very large number of transactions. Specifically, we order Intermediaries by the number of transactions they participated in during a day (daily trading frequency), and then designate accounts that rank in the top 3% as High Frequency Traders. Once we designate a trading account as a HFT, we remove this account from the Intermediary category to prevent double counting.

This seems like an entirely sensible division, although one might quibble about the 3% cut-off. Why not 2% or 4%? It might also be illuminating to make the division based on the technology used.
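Just to make the mechanics concrete, here is a rough sketch of the two-step classification in Python. This is my own reconstruction, not the authors' code; in particular, I have read both criteria as being scaled by the account's daily trading volume, which is an assumption on my part.

```python
# A loose sketch of the paper's two-step classification, not the authors' code.
# Assumption: both thresholds are scaled by the account's daily trading volume.

def classify_accounts(accounts, hft_share=0.03):
    """
    accounts: dict mapping account id -> list of signed trade quantities
              (positive = buy, negative = sell), in time order, for one day.
    Returns a dict mapping account id -> 'HFT', 'Intermediary' or 'Other'.
    """
    labels = {}
    intermediaries = []
    for acct, trades in accounts.items():
        volume = sum(abs(q) for q in trades)
        if volume == 0:
            labels[acct] = "Other"
            continue
        # Running net position over the day.
        position, path = 0, []
        for q in trades:
            position += q
            path.append(position)
        end_of_day = path[-1]
        # Criterion 1: net holdings fluctuate within a narrow (1.5%) band.
        fluctuates_narrowly = all(abs(p - end_of_day) <= 0.015 * volume for p in path)
        # Criterion 2: end-of-day net position is no more than 5% of daily volume.
        small_eod_position = abs(end_of_day) <= 0.05 * volume
        if fluctuates_narrowly and small_eod_position:
            labels[acct] = "Intermediary"
            intermediaries.append((len(trades), acct))
        else:
            labels[acct] = "Other"
    # HFTs: the top 3% of Intermediaries by daily transaction count, removed
    # from the Intermediary category to prevent double counting.
    intermediaries.sort(reverse=True)
    n_hft = max(1, round(len(intermediaries) * hft_share)) if intermediaries else 0
    for _, acct in intermediaries[:n_hft]:
        labels[acct] = "HFT"
    return labels
```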

Some Fundamental Traders accumulate directional positions by executing many small-size orders, while others execute a few larger-size orders. Fundamental Traders which accumulate net positions by executing just a few orders look like Noise Traders, while Fundamental Traders who trade a lot resemble Opportunistic Traders. In fact, it is quite possible that in order not to be taken advantage of by the market, some Fundamental Traders deliberately pursue execution strategies that make them appear as though they are Noise or Opportunistic Traders. In contrast, HFTs appear to play a very distinct role in the market and do not disguise their market activity.

Naturally, the better you are at disguising your activity, the better you are going to do for your clients. This point is lost upon the regulators, who generally take the view that an order cancellation is an indication of fraudulent activity and spend their time crafting rules to penalize smart traders and their clients.

It will also be noted that the ultimate disguise consists of not showing your order publicly at all – which means using a dark pool.

In order to further characterize whether categories of traders were primarily takers of liquidity, we compute the ratio of transactions in which they removed liquidity from the market as a share of their transactions.[Footnote] According to Table 2, HFTs and Intermediaries have aggressiveness ratios of 45.68% and 41.62%, respectively. In contrast, Fundamental Buyers and Sellers have aggressiveness ratios of 64.09% and 61.13%, respectively. This is consistent with a view that HFTs and Intermediaries generally provide liquidity while Fundamental Traders generally take liquidity. The aggressiveness ratio of High Frequency Traders, however, is higher than what a conventional definition of passive liquidity provision would predict. [Footnote]

In order to better characterize the liquidity provision/removal across trader categories, we compute the proportion of each order that was executed aggressively.[Footnote] Table 3 presents the cumulative distribution of ratios of order aggressiveness.

Footnote: When any two orders in this market are matched, the CME Globex platform automatically classifies an order as ‘Aggressive’ when it is executed against a ‘Passive’ order that was resting in the limit order book. From a liquidity standpoint, a passive order (either to buy or to sell) has provided visible liquidity to the market and an aggressive order has taken liquidity from the market. Aggressiveness ratio is the ratio of aggressive trade executions to total trade executions. In order to adjust for the trading activity of different categories of traders, the aggressiveness ratio is weighted either by the number of transactions or trading volume.

Footnote: One possible explanation for the order aggressiveness ratios of HFTs is that some of them may actively engage in “sniping” orders resting in the limit order book. Cvitanic and Kirilenko (2010) model this trading behavior and conclude that under some conditions this trading strategy may have impact on prices. Similarly, Hasbrouck and Saar (2009) provide empirical support for a possibility that some traders may have altered their strategies by actively searching for liquidity rather than passively posting it. Yet another explanation is that after passively buying at the bid or selling at the offer, HFTs quickly reduce their inventories by trading aggressively if necessary.

Footnote: The following example illustrates how we compute the proportion of each order that was executed aggressively. Suppose that a trader submits an executable limit order to buy 10 contracts and this order is immediately executed against a resting sell order of 8 contracts, while the remainder of the buy order rests in the order book until it is executed against a new sell order of 2 contracts. This sequence of executions yields an aggressiveness ratio of 80% for the buy order, 0% for the sell order of 8 contracts, and 100% for the sell order of 2 contracts.
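The per-order calculation in that last footnote is simple enough to sketch (my own illustration, not the authors' code), reproducing their 10-contract example:

```python
def order_aggressiveness(fills):
    """
    fills: list of (quantity, was_aggressive) for a single order, where
           was_aggressive is True if that fill executed against an order
           already resting in the book.
    Returns the volume-weighted share of the order executed aggressively.
    """
    total = sum(q for q, _ in fills)
    aggressive = sum(q for q, was_aggr in fills if was_aggr)
    return aggressive / total if total else 0.0

# The footnote's example: an executable limit order to buy 10 contracts.
buy_order = [(8, True),    # 8 contracts hit a resting sell: aggressive
             (2, False)]   # 2 contracts rest and are later hit: passive
resting_sell_of_8 = [(8, False)]   # entirely passive
incoming_sell_of_2 = [(2, True)]   # entirely aggressive

print(order_aggressiveness(buy_order))           # 0.8
print(order_aggressiveness(resting_sell_of_8))   # 0.0
print(order_aggressiveness(incoming_sell_of_2))  # 1.0
```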

This is a much better indicator of order intent than the puerile “Order Toxicity” metric, but remains flawed, as shown by the last footnote. If somebody needs to sell a large block, for instance, and places an offer well below the prevailing market price, the vast majority of it will execute as buyers take advantage of this low offer (this happens on a routine basis in the preferred share market). However, since this order is “resting”, these executions will indicate that the seller is providing liquidity and the buyers are taking it – when the actual situation is the other way around.

In fact, the first quoted section above explicitly demonstrates this fact of trading, with Fundamental Traders going to great lengths to look like Noise and Opportunistic traders.

According to Figure 4, HFTs do not accumulate a significant net position and their position tends to quickly revert to a mean of about zero. The net position of the HFTs fluctuates between approximately +/- 3000 contracts. Figure 5 presents the net position of the Intermediaries during May 3-6, 2010.

According to Figure 5, Intermediaries exhibit trading behavior similar to that of HFTs. They also do not accumulate a significant net position. Compared to the HFTs, the net position of the Intermediaries fluctuates within a more narrow band of +/- 2000 contracts, and reverts to a lower target level of net holdings at a slower rate.

We also find a notable decrease in the number of active Intermediaries on May 6. As the Figure 6 shows, the number of active Intermediaries dropped from 66 to 33, as the large price decline ensues.

In contrast, as presented in Figure 7, the number of active HFTs decreases from 13 to 10.

This demonstrates the position limits highlighted by the SEC report.

We interpret these results as follows. HFTs appear to trade in the same direction as the contemporaneous price and prices of the past five seconds. In other words, they buy, if the immediate prices are rising. However, after about ten seconds, they appear to reverse the direction of their trading – they sell, if the prices 10-20 seconds before were rising. These regression results suggest that, possibly due to their speed advantage or superior ability to predict price changes, HFTs are able to buy right as the prices are about to increase. HFTs then turn around and begin selling 10 to 20 seconds after a price increase.

The Intermediaries sell when the immediate prices are rising, and buy if the prices 3-9 seconds before were rising. These regression results suggest that, possibly due to their slower speed or inability to anticipate possible changes in prices, Intermediaries buy when the prices are already falling and sell when the prices are already rising.

So in other words, part of the thing that differentiates HFT and Intermediaries is not simply the volume of trade, but also that the HFT guys can do it better. In many cases, HFT strategies attempt to predict the (short-term) future direction of the market by looking at the order book … if there’s a huge volume of offers compared to the bids, get out of the way! One method of counter-attack against this is, as mentioned above, the use of dark pools for trading.
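For what it's worth, the kind of order-book signal I have in mind looks something like this (entirely my own sketch; the paper does not specify any particular formula):

```python
def book_imbalance(bids, asks):
    """
    bids, asks: lists of (price, size) for resting orders near the touch.
    Returns a signal in [-1, 1]: strongly negative when offers swamp bids.
    """
    bid_depth = sum(size for _, size in bids)
    ask_depth = sum(size for _, size in asks)
    total = bid_depth + ask_depth
    return (bid_depth - ask_depth) / total if total else 0.0

# A huge volume of offers compared to the bids: time to get out of the way.
print(book_imbalance(bids=[(99.9, 50)],
                     asks=[(100.0, 400), (100.1, 350)]))   # -0.875
```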

We consider Intermediaries and HFTs to be very short term investors. They do not hold positions over long periods of time and revert to their target inventory level quickly. Observed trading activity of HFTs can be separated into three parts. First, HFTs seem to anticipate price changes (in either direction) and trade aggressively to profit from it. Second, HFTs seem to provide liquidity by putting resting orders in the direction of the anticipated price move. Third, HFTs trade to keep their inventories within a target level. The inventory management trading objective of HFTs may interact with their price-anticipation objective. In other words, at times, inventory-management considerations of HFTs may lead them to aggressively trade in the same direction as the prices are moving, thus, taking liquidity. At other times, in order to revert to their target inventory levels, HFTs may passively trade against price movements and, thus, provide liquidity.

This is consistent with my speculation on October 25 that HFT acts as a capacitor that will discharge if a certain inventory level is breached.
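A toy version of that capacitor analogy, consistent with the paper's description of inventory management (again, my own sketch, not anything from the paper):

```python
def inventory_action(position, target=0, band=3000):
    """
    Quote passively while inventory stays inside the band; once the band is
    breached, 'discharge' by trading aggressively back toward the target.
    The +/- 3000 band echoes the net-position range reported in Figure 4.
    """
    if abs(position - target) <= band:
        return "quote passively (provide liquidity)"
    side = "sell" if position > target else "buy"
    return f"{side} aggressively back toward target (take liquidity)"

print(inventory_action(1200))   # inside the band: keep quoting
print(inventory_action(4100))   # band breached: sell aggressively
```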

We find that compared to the three days prior to May 6, there was an unusually high level of HFT “hot potato” trading volume — due to repeated buying and selling of contracts accompanied by a relatively small change in net position. The hot potato effect was especially pronounced between 13:45:13 and 13:45:27 CT, when HFTs traded over 27,000 contracts, which accounted for approximately 49% of the total trading volume, while their net position changed by only about 200 contracts.

We interpret this finding as follows: the lack of Opportunistic and Fundamental Traders, as well as Intermediaries, with whom HFTs typically trade, resulted in higher trading volume among HFTs, creating a hot potato effect. It is possible that during the period of high volatility, Opportunistic and Fundamental Traders were either unable or unwilling to efficiently submit orders. In the absence of their usual trading counterparties, HFTs were left to trade with other HFTs.

So in other words, it wasn’t the HFTs that left the market, it was the Opportunistic and Fundamental Traders.

Aggressiveness Imbalance is constructed as the difference between aggressive buy transactions and aggressive sell transactions. Figure 8 shows the relationship between price and cumulative Aggressiveness Imbalance (aggressive buys – aggressive sells).

In addition, we calculate Aggressiveness Imbalance for each category of traders over one minute intervals. For illustrative purposes, the Aggressiveness Imbalance indicator for HFTs and Intermediaries are presented in Figures 9 and 10, respectively.

According to Figures 9 and 10, visually, HFTs behave very differently during the Flash Crash compared to the Intermediaries. HFTs aggressively sold on the way down and aggressively bought on the way up. In contrast, Intermediaries are about equally passive and aggressive both down and up.

As suggested above, this could simply be a result of HFT looking at the order book and taking a view, in addition to the considerations implied by their inventories.
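For clarity, the Aggressiveness Imbalance bookkeeping is straightforward; here is my own reconstruction of the definition quoted above, counting transactions rather than volume:

```python
from collections import defaultdict
from itertools import accumulate

def aggressiveness_imbalance(trades, bucket_seconds=60):
    """
    trades: iterable of (timestamp_seconds, side, was_aggressive), side being
            'buy' or 'sell'. Returns sorted (bucket_start, imbalance) pairs,
            where imbalance = aggressive buys minus aggressive sells.
    """
    buckets = defaultdict(int)
    for ts, side, aggressive in trades:
        if aggressive:
            bucket = int(ts // bucket_seconds) * bucket_seconds
            buckets[bucket] += 1 if side == "buy" else -1
    return sorted(buckets.items())

def cumulative_imbalance(per_bucket):
    """Running total of the per-bucket imbalances, as plotted in Figure 8."""
    return list(accumulate(v for _, v in per_bucket))
```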

I have added emphasis below to what I suggest is the central conclusion to be drawn from the Flash Crash.

We believe that the events on May 6 unfolded as follows. Financial markets, already tense over concerns about the European sovereign debt crisis, opened to news concerning the Greek government’s ability to service its sovereign debt. As a result, premiums rose for buying protection against default on sovereign debt securities of Greece and a number of other European countries. In addition, the S&P 500 volatility index (“VIX”) increased, and yields of ten-year Treasuries fell as investors engaged in a “flight to quality.” By midafternoon, the Dow Jones Industrial Average was down about 2.5%.

Sometime after 2:30 p.m., Fundamental Sellers began executing a large sell program. Typically, such a large sell program would not be executed at once, but rather spread out over time, perhaps over hours. The magnitude of the Fundamental Sellers’ trading program began to significantly outweigh the ability of Fundamental Buyers to absorb the selling pressure.

HFTs and Intermediaries were the likely buyers of the initial batch of sell orders from Fundamental Sellers, thus accumulating temporary long positions. Thus, during the early moments of this sell program’s execution, HFTs and Intermediaries provided liquidity to this sell order. However, just like market intermediaries in the days of floor trading, HFTs and Intermediaries had no desire to hold their positions over a long time horizon. A few minutes after they bought the first batch of contracts sold by Fundamental Sellers, HFTs aggressively sold contracts to reduce their inventories. As they sold contracts, HFTs were no longer providers of liquidity to the selling program. In fact, HFTs competed for liquidity with the selling program, further amplifying the price impact of this program.

Furthermore, total trading volume and trading volume of HFTs increased significantly minutes before and during the Flash Crash. Finally, as the price of the E-mini rapidly fell and many traders were unwilling or unable to submit orders, HFTs repeatedly bought and sold from one another, generating a “hot-potato” effect. Yet, Opportunistic Buyers, who may have realized significant profits from this large decrease in price, did not seem to be willing or able to provide ample buy-side liquidity. As a result, between 2:45:13 and 2:45:27, prices of the E-mini fell about 1.7%.

At 2:45:28, a 5 second trading pause was automatically activated in the E-mini. Opportunistic and Fundamental Buyers aggressively executed trades which led to a rapid recovery in prices. HFTs continued their strategy of rapidly buying and selling contracts, while about half of the Intermediaries closed their positions and got out of the market. In light of these events, a few fundamental questions arise. Why did it take so long for opportunistic buyers to enter the market and why did the price concessions have to be so large? It seems possible that some opportunistic buyers could not distinguish between macroeconomic fundamentals and market-specific liquidity events. It also seems possible that the opportunistic buyers had already accumulated a significant positive inventory earlier in the day as prices were steadily declining. Furthermore, it is possible that they could not quickly find opportunities to hedge additional positive inventory in other markets which also experienced significant volatility and higher latencies. An examination of these hypotheses requires data from all venues, products, and traders on the day of the Flash Crash.

I suggest that the reason this happened is because Opportunistic traders are simply not very smart people. They’re prep-school smiley-boys who got their jobs through Daddy’s connections and can make a fat living without the necessity of labour or thought. This will not change until performance genuinely becomes a desirable metric in the marketplace (as opposed to consumer-goods style branding) and regulators dispose of their fixation on turnover, which is simply a hangover from legitimate concern regarding commission-driven churning.

But a lot of it is simply ultimate investors’ desire for a good story. In general, investors want to hear “I bought it because Bernanke this and Buffet that and in-depth macro-economic analysis the other thing”, not “I bought it because somebody really, really wanted to sell it and it was outside its fair-value range compared to what I sold. I think. Maybe. This type of trade works about 60% of the time.”

That being said, however, I will also suggest that it is possible that the Opportunistic Buyers were dissuaded from entering the market through the quote-stuffing identified by Nanex, which has yet to be explained in a satisfactory manner.

And, I think, one piece of information we need is a look at the order book at the time – such as it was! It is possible that the selling by HFT was not due merely to a desire to square their books, but there was also the motivation supplied by a huge volume of resting sells relative to resting buys. Appendix IV.2 of the SEC Flash Crash Report gave order-book data for seven securities, but not the eMini contract. I republished two of the depth-charts (for Accenture) in my post regarding the report.

Based on our analysis, we believe that High Frequency Traders exhibit trading patterns consistent with market making. In doing so, they provide very short term liquidity to traders who demand it. This activity comprises a large percentage of total trading volume, but does not result in a significant accumulation of inventory. As a result, whether under normal market conditions or during periods of high volatility, High Frequency Traders are not willing to accumulate large positions or absorb large losses. Moreover, their contribution to higher trading volumes may be mistaken for liquidity by Fundamental Traders. Finally, when rebalancing their positions, High Frequency Traders may compete for liquidity and amplify price volatility. Consequently, we believe, that irrespective of technology, markets can become fragile when imbalances arise as a result of large traders seeking to buy or sell quantities larger than intermediaries are willing to temporarily hold, and simultaneously long-term suppliers of liquidity are not forthcoming even if significant price concessions are offered. We believe that technological innovation is critical for market development. However, as markets change, appropriate safeguards must be implemented to keep pace with trading practices enabled by advances in technology.

Update: This is probably as good a place as any to pass on some information about Stop-Loss orders, from Mary Schapiro’s September 7 speech to the Economic Club of New York, titled Strengthening Our Equity Market Structure:

To understand where individual investors are coming from, we must truly recognize the impact of severe price volatility on their interests: one example is the use and impact of stop loss orders on May 6. Stop loss orders are designed to help limit losses by selling a stock when it drops below a specified price, and are a safety tool used by many individual investors to limit losses.

The fundamental premise of these orders is to rely on the integrity of market prices to signal when the investor should sell a holding. On May 6, this reliance proved misplaced and the use of this tool backfired.

A staggering total of more than $2 billion in individual investor stop loss orders is estimated to have been triggered during the half hour between 2:30 and 3 p.m. on May 6. As a hypothetical illustration, if each of those orders were executed at a very conservative estimate of 10 percent less than the closing price, then those individual investors suffered losses of more than $200 million compared to the closing price on that day.

I disagree with her view of the fundamental premise of a stop-loss order. The purpose of a stop-loss order is to demonstrate that you’re an ignorant little turd who deserves to go bankrupt. If we consider an earlier section of Ms. Schapiro’s speech …:

Those who purchase stock in an initial public offering, for example, can have confidence that they will be able to sell that stock at a fair and efficient price in the secondary market when they need or want to. And of course, the values assigned to stocks in the secondary market play an important role in the ability of companies to raise additional funding.

Markets are powerful and they are the most efficient and effective tools for turning savings into capital and growth.

But, if the equity market structure breaks down — if it fails to provide the necessary and expected fairness, stability, and efficiency — investors and companies pull back, raising costs and reducing growth.

… we see that the fundamental premise of a market is to indicate a fair value of a listed company. I have no arguments with that. A stop-loss order says “I don’t want to sell this stock at $50. But if it goes down to $40, that’s the time I want to sell it.” – a sentiment completely divorced from the objective of fairly valuing a listed company.
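For readers who have never used one, the mechanics are trivial: a stop-loss is a resting instruction that becomes a market sell once the stop is touched, and on a day like May 6 nothing guarantees the fill lands anywhere near the stop. A minimal sketch with hypothetical prices:

```python
def stop_loss_fill(trade_prices, stop=40.0):
    """
    Walk a sequence of trade prices. Once a print at or below the stop is
    seen, the order becomes a market sell and fills at the next price,
    whatever that price happens to be.
    """
    triggered = False
    for price in trade_prices:
        if triggered:
            return price          # the fill, possibly far below the stop
        if price <= stop:
            triggered = True      # stop touched; order is now a market sell
    return None                   # never triggered (or no print after trigger)

# Orderly decline: the fill lands close to the $40 stop.
print(stop_loss_fill([45.0, 41.0, 39.9, 39.5]))        # 39.5
# May 6-style air pocket: the fill lands nowhere near the stop.
print(stop_loss_fill([45.0, 39.9, 31.0, 30.0, 38.0]))  # 31.0
```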

Update: Despite all this – despite the complete lack of evidence that either HFT or algorithmic trading was in any way the root cause of the debacle – there are some who don’t want to be confused with facts:

“While I do not believe that the Flash Crash was the direct result of reckless misconduct in the futures market, I question what the CFTC could have done if the opposite were true. When does high frequency or algorithmic trading cross the line into being disruptive to our markets? And, along those same lines, who is responsible when technology goes awry? Do we treat rogue algorithms like rogue traders? These are the issues I hope to explore at our October 12th meeting,” stated Commissioner O’Malia.

Nothing wrong with the world that a few extra rules wouldn’t cure, eh Commissioner?

Issue Comments

MFC Vague on Capital Requirements; Downgraded by Moody's; S&P Watch-Negative

Manulife Financial Corporation has released its 3Q10 Financials. The Press Release states:

The net loss attributed to shareholders of $947 million included the following notable items:

  • Net gains of $1,041 million related to higher equity markets and lower interest rates.
  • Charges of $2,031 million related to basis changes resulting from the annual review of all actuarial methods and assumptions.
  • A $1,039 million (US$1,000 million) goodwill impairment charge on our U.S. Insurance business related to the economic outlook and the repositioning of that business.
  • Other notable items netted to a $303 million gain and are described in more detail below.

After adjusting for these notable items, adjusted earnings from operations was $779 million.
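As a quick arithmetic check (my own, using only the figures quoted above), the notable items do reconcile to the reported loss:

```python
# Reconciling the notable items to the reported net loss ($ millions).
adjusted_earnings = 779
notable_items = {
    "higher equity markets and lower interest rates": +1_041,
    "actuarial basis changes":                        -2_031,
    "U.S. Insurance goodwill impairment":             -1_039,
    "other notable items, net":                         +303,
}
net_income = adjusted_earnings + sum(notable_items.values())
print(net_income)  # -947: the reported net loss attributed to shareholders
```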

They made a lot of Long-Term Care sales in the States … but was it profitable and can it be sustained?

John Hancock Long-Term Care (“JH LTC”) sales increased 20 per cent in the third quarter compared to the prior year, driven by sales of retail products which increased in advance of June new business price increases taking effect. As a result of the recently completed claims experience study and the continuing low interest rate environment, JH LTC has temporarily suspended new group sales and is planning other retail product changes. JH LTC sales are expected to decline in the fourth quarter of 2010. In addition, JH LTC will be raising premiums on in-force business and is actively working with regulators to implement increases that are on average 40 per cent and affect the majority of the in-force business.

There are not enough details given to form a firm opinion … but a 40% increase in rates? on average? That has all the hallmarks of a major fuck-up. Do these guys know what they’re doing?

They do provide a clearer warning of the effect of OSFI’s new risk requirements:

The Office of the Superintendent of Financial Institutions (“OSFI”) has been conducting a review of segregated fund/variable annuity capital requirements. On October 29, 2010, OSFI issued a draft advisory containing new minimum calibration criteria for determining capital requirements for segregated fund business written after January 1, 2011. It is expected that the new calibration criteria will increase capital requirements on these products and our 2011 product offerings will be developed and priced taking into account these new rules. As drafted the new capital requirements will also apply to subsequent deposits to existing contracts and to contracts that reset their guarantee levels after January 1, 2011.

… and the capital requirements for seg funds will be getting even stricter:

OSFI is also expected to continue its consultative review of its capital rules for more general application, likely in 2013. OSFI notes that it is premature to draw conclusions about the cumulative impact this process will have, but the general direction has been one of increased capital requirements. OSFI has stated that increases in capital may be offset by other changes, such as hedge recognition. The Company will continue to monitor developments. However, at this time, it appears that it is more likely than not that the capital requirements for in-force business will increase and this increase could be material.

They are worried by the IFRS Exposure Draft on Insurance Contracts (see also commentary on SLF 3Q10) and are busily sleazing around the regulators and politicians to get an exemption:

This mismatch between the underlying economics of our business and reported results and potentially our capital requirements could have significant unintended negative consequences on our business model which would potentially affect our customers, shareholders and the capital markets. We believe the accounting rules under discussion could put Canadian insurers at a significant disadvantage relative to their U.S. and global peers, and also to the banking sector in Canada. We are currently reviewing the proposals contained in the Exposure Draft, and, along with other companies in the Canadian insurance industry, expect to provide comments and input to the IASB. The insurance industry in Canada is also currently working with OSFI and the federal government with respect to the potential impact of these proposals on Canadian insurance companies, and the industry is urging policymakers to ensure that any future accounting and capital proposals appropriately consider the underlying business model of a life insurance company and in particular, the implications for long duration guaranteed products which are much more prevalent in North America than elsewhere.

Sadly, Prentice has already been bought – er, hired – by CIBC, but there are probably many other politicians for sale – er, eager – to devote their expertise to the private sector.

DBRS comments:

DBRS has reviewed the Q3 2010 results of Manulife Financial Corporation (Manulife or the Company) released today and believes that, notwithstanding the negative net earnings figure, the Company is on the right track to restoring sustainable profitability. The ratings of Manulife and its affiliates remain unchanged, including the Issuer Rating of its major operating subsidiary, The Manufacturers Life Insurance Company (MLI), at AA (low). The ratings were recently downgraded on August 9, 2010.

The Company has been actively repositioning its product offering by selectively increasing prices and emphasizing products that are less capital intensive. Integrated risk management and control is leading to a systematic reduction in equity and interest rate risk through portfolio shifts. As part of its risk management framework, the Company has now hedged 54% of its variable annuity exposures to equity markets and has plans to use actions based on time schedules and market triggers to reach its risk reduction goals. The Company expects to reduce its equity sensitivity by approximately 60% by 2012 and approximately 75% by 2014. It also expects to take actions to further reduce its interest rate exposures, as measured by the impact on shareholders’ net income, by approximately 25% by the end of 2012 and approximately 50% by the end of 2014. While this could ultimately be expensive for the Company, DBRS believes that ridding itself of this equity market and interest rate risk hangover is fundamental to restoring market confidence in the Company’s longer-term outlook.

DBRS did not comment on the size of the write-down due to changes in actuarial assumptions: they had previously estimated a charge of $700- to $800-million.

Moody’s downgraded the operating subsidiaries:

Moody’s Investors Service downgraded the insurance financial strength (IFS) ratings of Manulife Financial Corporation’s (Manulife; TSX: MFC, unrated) subsidiaries to A1 from Aa3. These subsidiaries include Manufacturers Life Insurance Company (MLI) and John Hancock Life Insurance Company (USA) (JHUSA). Short-term ratings were affirmed. The rating outlook for Manulife’s subsidiaries is stable. These rating actions conclude the reviews for downgrade initiated on August 5, 2010.

The rating agency said the downgrades follow MFC’s announcement of a nearly $1 billion net loss in 3q10 and incorporated the following business developments. First, Manulife’s acknowledgement of higher morbidity experience within its US long-term-care block and the resulting need for an average rate increase exceeding 40% in coming months. Also, the company faces the challenge of redesigning products to restore earnings power, combined with the possibility of continued earnings volatility until the firm’s enhanced market-risk hedging program is substantially complete.

Moody’s said the downgrades also reflect Manulife’s diminished financial flexibility because of reduced earnings coverage and increased financial leverage. At the end of the third quarter, Moody’s estimates that Manulife’s adjusted financial leverage is now over 30%, which exceeds Moody’s limit on this rating sub-factor. Furthermore, the company faces further goodwill charges as it adopts IFRS accounting in 2011. Although these goodwill write-downs are non-cash, they will lead to further deterioration on Moody’s leverage metrics. MLI reported a 234% minimum continuing capital and surplus requirements (MCCSR, the Canadian regulatory capital ratio for life insurers) ratio at the end of the third quarter, which Moody’s views as strong; however, this was due in part to the downstreaming of proceeds from debt raised at the parent as capital to MLI. As the firm’s total leverage increases, management’s ability to deploy double leverage to capitalize the operating company decreases.

According to Moody’s, upward pressure on the ratings would result from a substantial completion of the company’s equity and interest rate hedging programs, the maintenance of a MCCSR ratio above 220% and an NAIC RBC ratio at JHUSA of at least 325% on a sustained basis, with improved financial flexibility including adjusted leverage below 30% and earnings coverage above 8x on a sustained basis. Downward pressure would result from a failure to complete the hedging programs, a MCCSR ratio that dropped below 200%, and/or an NAIC RBC ratio at JHUSA of less than 275%.

S&P had the grace to admit the actuarial change was larger than expected:

2010–Standard & Poor’s Ratings Services today said it placed its ‘A’ counterparty credit rating on Manulife Financial Corp. (TSX/NYSE:MFC) on CreditWatch with negative implications. At the same time, Standard & Poor’s placed its ‘AA’ counterparty credit and financial strength ratings on MFC’s core and guaranteed insurance operating subsidiaries on CreditWatch with negative implications.

“We placed the ratings on all of the companies in the Manulife group on CreditWatch negative because of Manulife’s continuing earnings volatility and material noncash goodwill impairments that could reach C$3.2 billion,” said Standard & Poor’s credit analyst Robert Hafner. “These goodwill impairments include $2.2 billion under IFRS accounting rules that could follow in the first quarter of 2011.”

“The earnings volatility is evident in the group’s consolidated third-quarter loss of C$947 million that includes a basis change charge of about C$2 billion arising from its annual review of all actuarial assumptions and methods,” said Mr. Hafner. “These charges somewhat exceed the amount we assumed when we lowered the ratings on the group on Aug. 5 and assigned a negative outlook.”

Issue Comments

IAG Silent on Regulatory Change

Industrial Alliance has released its third quarter financials, and managed to do so without mentioning any prospects for regulatory change. This is the same policy as was followed with the 2Q10 Shareholders’ Report.

However, it was another good quarter:

Top-line growth in the third quarter continued to show strong momentum. Premiums and deposits increased 15% to $1.4 billion and the value of new business rose 44% to $41.4 million. For the nine-month period, premiums and deposits were up 31% over 2009 and 7% over 2007 – the Company’s record year. This growth is fuelled primarily by the Individual Wealth Management sector that continues to benefit from stock market gains and high net sales.

Top-line growth in the third quarter continued to be strong for the fourth quarter in a row. Almost all sectors contributed to this growth, with Individual Wealth Management in the lead as a result of the upswing in equity markets. For the period ended September 30th, this sector had gross sales of $686.7 million, up 29% over the previous year, and net sales of $243.3 million, up 52% over 2009. For the first nine months of 2010, Industrial Alliance ranked second in Canada for net sales of segregated funds, with a 34.1% market share, and fifth in terms of net mutual fund sales.

There are a few changes planned for their asset mix and hedging practice:

Management has taken a number of initiatives to reduce its sensitivity to interest rate risk. These initiatives are in the process of being implemented and will include a 5% increase in the proportion of stocks backing long-term liabilities. Had these initiatives been in place at September 30, 2010, the Company expects that it would be able to absorb a 15% decline in equity markets and that provisions for future policy benefits would not have to be strengthened as long as the S&P/TSX remains above 10,500 points.

Additionally, as part of its risk management process, the Company has implemented a dynamic hedging program to manage the equity risk related to its guaranteed annuity (GMWB) product, effective October 20th, 2010. The GMWB portfolio represents approximately $1.5 billion of assets under management, including $900 million in equities. The Company also entered into a reinsurance agreement during the third quarter to share 60% of the longevity risk related to its $2.5 billion insured annuity block of business.

As far as current sensitivities are concerned:

The Company’s sensitivity analysis varies from one quarter to another according to numerous factors, including changes in the economic and financial environment and the normal evolution of the Company’s business. The results of these analyses show that the leeway the Company has to absorb potential market downturns remains very high overall.

At September 30, 2010, the analysis was as follows:

  • Stocks matched to the long-term liabilities – The Company believes that it will not have to strengthen its provisions for future policy benefits for stocks matched to long-term liabilities as long as the S&P/TSX index remains above 9,400 points.
  • Solvency ratio – The Company believes that the solvency ratio will stay above 175% as long as the S&P/TSX index remains above 7,650 points, and will stay above 150% as long as the S&P/TSX index remains above 6,450 points.
  • Ultimate reinvestment rate (“URR”) – The Company estimates that a 10 basis point decrease (or increase) in the ultimate reinvestment rate would require the provisions for future policy benefits to be strengthened (or would allow them to be released) by some $44 million after taxes.
  • Initial reinvestment rate (“IRR”) – The Company estimates that a 10 basis point decrease (or increase) in the initial reinvestment rate would require the provisions for future policy benefits to be strengthened (or would allow them to be released) by some $25 million after taxes.

They estimate that a sudden 10% decline in equity markets would take $18-million off their net income; unfortunately, they neither provide pro-forma figures reflecting the asset mix changes, nor provide estimates of the effect of larger equity market declines – which will, of course, not be proportional to the adverse effect of such a normal correction.

Issue Comments

SLF Coy on Capital Rule Changes

Sun Life Financial has released its 3Q10 Financials. They had a decent – not great – quarter, but I’m more interested in their commentary on the capital rules:

In Canada, the Office of the Superintendent of Financial Institutions Canada (OSFI) is considering a number of changes to the insurance company capital rules, including new guidelines that would establish stand-alone capital adequacy requirements for operating life insurance companies, such as Sun Life Assurance, and that would update OSFI’s regulatory guidance for non-operating insurance companies acting as holding companies, such as Sun Life Financial Inc.

These proposals from the US Treasury (hopping mad about AIG) are now over a year old and can’t be implemented too soon according to me. Julie Dickson alluded to the possibility in a speech.

In addition, OSFI may change the definition of available regulatory capital for determining regulatory capital to align insurance definitions with any changed definitions that emerge for banks under the proposed new Basel Capital Accord.

Presumably this (mainly) refers to efforts to make the loss-absorption potential of regulatory capital more explicit (although the proposals are framed in such a way that it simply represents a regulatory-political end-run around the bankruptcy courts).

OSFI is considering more sophisticated risk-based modeling approaches to Minimum Continuing Capital and Surplus Requirements (MCCSR), which could apply to segregated funds and other life insurance products. In particular, OSFI is considering how advanced modeling techniques can produce more robust and risk-sensitive capital requirements for Canadian life insurers. This process includes internal models for segregated fund guarantee exposures. On October 29, 2010 OSFI released a draft advisory, for consultation with the industry and other stakeholders, setting out revised criteria for determining segregated fund capital requirements using an approved model. It is proposed that the new criteria, when finalized, will apply to qualifying segregated fund guarantee models for business written on or after January 1, 2011. The Company is in the process of reviewing the advisory to determine the potential impact of the proposed changes, and will continue to actively participate in the accompanying consultation process.

It is very disappointing that they are not more specific, given that implementation is two months away.

In particular, the draft advisory on changes to existing capital requirements in respect of new segregated fund business may result in an increase in the capital requirements for variable annuity and segregated fund policies currently sold by the Company in the United States and Canada on and after the date the new rules come into effect. The Company competes with providers of variable annuity and segregated fund products that operate under different accounting and regulatory reporting bases in different countries, which may create differences in capital requirements, profitability and reported earnings on these products that may cause the Company to be at a disadvantage compared to some of its competitors in certain of its businesses. In addition, the final changes implemented as a result of OSFI’s review of internal models for in-force segregated fund guarantee exposures may materially change the capital required to support the Company’s in-force variable annuity and segregated fund guarantee business.

Scary words, but no meat in the sandwich.

Similar was their commentary on the proposed rules regarding hedging:

On July 30, 2010 the International Accounting Standards Board (IASB) issued an exposure draft for comment, which sets out recognition, measurement and disclosure principles for insurance contracts. The insurance contracts standard under IFRS, as currently drafted, proposes that liabilities be discounted at a rate that is independent of the assets used to support those liabilities. This is in contrast to current rules under Canadian GAAP, where changes in the measurement of assets supporting actuarial liabilities is largely offset by a corresponding change in the measurement of the liabilities.

The Company is in the process of reviewing the exposure draft, and is working with a number of industry groups and associations, including the Canadian Life and Health Insurance Association, which submitted a comment letter to the IASB on October 15, 2010. It is expected that measurement changes on insurance contracts, if implemented as drafted, will result in fundamental differences from current provisions in Canadian GAAP, which will in turn have a significant impact on the Company’s business activities. In addition, the IASB has a project on accounting for financial instruments, with changes to classification, measurement, impairment and hedging. It is expected the mandatory implementation of both these standards will be no earlier than 2013.

The IASB continues to make changes to other IFRSs and has a number of ongoing projects. The Company continues to monitor all of the IASB projects that are in progress with regards to the 2011 IFRS changeover plan to ensure timely implementation and accounting.

The proposed new standard has been discussed on PrefBlog. The CLHIA letter does not appear to have been made public by the CLHIA but has been published by IFRS. I can’t say I find the CLHIA arguments – or those of the sell-side analysts quoted in the appendix – particularly convincing. It boils down to another round of the market-value vs. historical cost debate, but they spend more time discussing why fair value will be so inconvenient than on why it is inferior.

As far as earnings are concerned:

Sun Life Financial reported net income attributable to common shareholders of $453 million for the quarter ended September 30, 2010, compared to a loss of $140 million in the third quarter of 2009. Net income in the third quarter of 2010 was favourably impacted by $156 million from improved equity market conditions, and $49 million from assumption changes and management actions. The Company increased its mortgage sectoral allowance by $57 million, which reduced net income by $40 million, in anticipation of continued pressure in the U.S. commercial mortgage market, however overall credit experience continued to show improvement over the prior year. The net impact from interest rates on third quarter results was not material as the unfavourable impact of lower interest rates was largely offset by favourable movement in interest rate swaps used for asset-liability management.

In its interim MD&A for the third quarter of 2009, the Company provided a range for its “estimated 2010 adjusted earnings from operations”(2) of $1.4 billion to $1.7 billion. Based on the assumptions and methodology used to determine the Company’s estimated adjusted earnings from operations, the Company’s adjusted earnings from operations for the third quarter of 2010 were $353 million and $1,087 million for the nine months ended September 30, 2010. Additional information can be found in this news release under the heading Estimated 2010 Adjusted Earnings from Operations.

So it looks like, at best, they’re going to just squeeze in to the bottom of that range.

Q3 2010 adjusted earnings from operations ($ millions)

Adjusted earnings from operations(1) (after-tax): 353

Adjusting items:
  • Net equity market impact: 156
  • Management actions and updates to actuarial estimates and assumptions: 49
  • Tax: 16
  • Sectoral allowance in anticipation of continued pressure in the U.S. commercial mortgage market: (40)
  • Net interest rate impact: (15)
  • Currency impact: (6)
  • Other experience gains (losses) (includes $32 million unfavourable mortality/morbidity experience and $4 million unfavourable credit impact): (60)

Common shareholders’ net income: 453

and

Market risk sensitivities (September 30, 2010)

Changes in interest rates(1)    Net income(3) ($ millions)    MCCSR(4)
  1% increase                   225 – 325                     Up to 8 percentage points increase
  1% decrease                   (375) – (475)                 Up to 15 percentage points decrease

Changes in equity markets(2)
  10% increase                  75 – 125                      Up to 5 percentage points increase
  10% decrease                  (175) – (225)                 Up to 5 percentage points decrease
  25% increase                  125 – 225                     Up to 5 percentage points increase
  25% decrease                  (575) – (675)                 Up to 15 percentage points decrease

Given that the Globe & Mail reports …:

[UBS analyst Peter] Rozenberg calculated that the weighted average equity markets in the United States, Canada, Japan and Hong Kong increased 9.7 per cent quarter over quarter.

… it is a bit disappointing not to see a better match-up between the published sensitivity to a 10% equity market decline and the adjusting entry in the derivation of operating earnings.
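The mismatch is easy to quantify (my own back-of-the-envelope arithmetic, scaling the disclosed 10% sensitivity linearly, which is itself a rough assumption):

```python
# Published sensitivity: a 10% equity market increase adds $75-125 million to
# net income; UBS puts the weighted-average market move at +9.7% for the
# quarter; the adjusting entry attributed to equity markets was $156 million.
sens_low, sens_high = 75, 125          # $ millions per 10% equity increase
market_move = 9.7                      # per cent
implied_low = sens_low * market_move / 10
implied_high = sens_high * market_move / 10
print(f"Implied equity-market benefit: {implied_low:.0f} to {implied_high:.0f}")  # ~73 to ~121
print("Reported adjusting entry: 156")  # well outside the implied range
```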

It is also disappointing to see that their commentary on potential regulatory changes is so similar to their commentary in the 2Q10 report.

Regulation

OSFI Releases New Seg Fund Risk Guidelines

The Office of the Superintendent of Financial Institutions has released Revised Guidance for Companies that Determine Segregated Fund Guarantee Capital Requirements Using an Approved Model. This is the change we were told to expect in August.

The guts of the change appears to be distribution and correlation requirements for equity indices:

New minimum quantitative calibration criteria are mandated for the scenarios used to model the returns of the following total return equity indexes (henceforth referred to as “listed indexes”):

  • TSX
  • Canadian small cap equity, mid cap equity and specialty equity
  • S&P 500
  • US small cap equity, mid cap equity and specialty equity
  • MSCI World Equity and MSCI EAFE

The actual investment return scenarios for each of the listed indexes used in the determination of total requirements must meet the criteria specified in the following table.

Furthermore, the arithmetic average of the actual investment return scenarios for each listed index over any one-year period (including the one-year period starting on the valuation date) cannot be greater than 10%. All of these criteria must be met for the scenarios of a listed index to be in accordance with the new minimum calibration criteria.

Modeled scenarios of TSX total return indexes must continue to satisfy the CIA calibration criteria at all percentiles over the five- and ten-year time horizons as published in the CIA’s March 2002 report, in addition to the criteria above. Modeled scenarios of S&P 500 total return indexes must satisfy the American Academy of Actuaries’ calibration criteria for equities [footnote] at all percentiles over the five-, ten- and twenty-year time horizons, in addition to the criteria above.

The scenarios used to model returns of an equity index that is not one of the listed indexes need not meet the same calibration criteria, but must still be consistent with the calibrated scenarios used to model the returns of the listed indexes.

Correlation: The scenarios used to model returns for different equity indexes should be positively correlated with one another. Unless it can be justified otherwise, the correlation between the returns generated for any two equity indexes (whether or not they are listed) should be at least 70%. If scenarios are generated using a model that distinguishes between positive and negative trend market phases (e.g. the regime-switching lognormal model with two regimes) then, unless it can be justified otherwise, the scenarios should be such that there is a very high probability that different equity indexes will be in the same market phase at the same time, and a very low probability that different equity indexes will be in different phases at the same time.

Footnote: For example, as published in the June 2005 document entitled “Recommended Approach for Setting Regulatory Risk-Based Capital Requirements for Variable Annuities and Similar Products”.

The table excised from the quotation above is:

                                                6 Months    1 Year
Left Tail Criteria
  2.5th percentile of return not greater than     -25%       -35%
  5th percentile of return not greater than       -18%       -26%
  10th percentile of return not greater than      -10%       -15%
Right Tail Criteria
  90th percentile of return not less than          20%        30%
  95th percentile of return not less than          25%        38%
  97.5th percentile of return not less than        30%        45%

These criteria equate, very approximately, to a mean expected return of 8% and a standard deviation of 20.5%. Interested readers can fiddle with the variables and log-normal distributions in the comments.
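For those who want to start fiddling, here is a minimal sketch (my own) that compares the left-tail criteria against an i.i.d. lognormal return model, using the rough 8% mean and 20.5% standard deviation mentioned above as the starting parameters:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def lognormal_return_percentile(mean_ret, sd_ret, horizon_years, pct):
    """
    Percentile of total return over the horizon, assuming i.i.d. lognormal
    annual returns with the given arithmetic mean and standard deviation.
    """
    var_log = log(1 + (sd_ret / (1 + mean_ret)) ** 2)   # log-space variance
    mu_log = log(1 + mean_ret) - var_log / 2            # log-space mean
    z = NormalDist().inv_cdf(pct)
    m, s = mu_log * horizon_years, sqrt(var_log * horizon_years)
    return exp(m + s * z) - 1

# OSFI's left-tail criteria: (horizon in years, percentile) -> maximum return
criteria = {
    (0.5, 0.025): -0.25, (0.5, 0.05): -0.18, (0.5, 0.10): -0.10,
    (1.0, 0.025): -0.35, (1.0, 0.05): -0.26, (1.0, 0.10): -0.15,
}
for (h, p), bound in criteria.items():
    model = lognormal_return_percentile(0.08, 0.205, h, p)
    print(f"{h:4} yr at {p:5.1%}: model {model:+.1%}  vs  criterion <= {bound:+.0%}")
```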

The American Academy of Actuaries’ Recommended Approach for Setting Regulatory Risk-Based Capital Requirements for Variable Annuities and Similar Products notes:

Short period distributions of historic equity returns typically show negative skewness, positive kurtosis (fat tails) with time varying volatility and increased volatility in bear markets. The measure of kurtosis declines when looking at returns over longer time horizons and successive application of a short-term model with finite higher moments will result in longer horizon returns that converge towards normality. Ideally the distribution of returns for a given model should reflect these characteristics. Of course, due to random sampling, not every scenario would show such attributes.

Unfortunately, at longer time horizons the small sample sizes of the historic data make it much more difficult to make credible inferences about the characteristics of the return distribution, especially in the tails. As such, the calibration criteria are derived from a model (fitted to historic S&P500 monthly returns) and not based solely on empirical observations. However, the calibration points are not strictly taken from one specific model for market returns; instead, they have been adjusted slightly to permit several well known and reasonable models (appropriately parameterized) to pass the criteria. Statistics for the observed data are offered as support for the recommendations.

… and they also provide a table:

Table 1: Calibration Standard for Total Return Wealth Ratios

Percentile    1 Year    5 Years    10 Years    20 Years
2.5%           0.78       0.72       0.79        n/a
5.0%           0.84       0.81       0.94        1.51
10.0%          0.90       0.94       1.16        2.10
90.0%          1.28       2.17       3.63        9.02
95.0%          1.35       2.45       4.36       11.70
97.5%          1.42       2.72       5.12        n/a

where:

The ‘wealth factors’ are defined as gross accumulated values (i.e., before the deduction of fees and charges) with complete reinvestment of income and maturities, starting with a unit investment. These can be less than 1, with “1” meaning a zero return over the holding period.

To interpret the above values, consider the 5-year point of 0.72 at the α = 2.5th percentile. This value implies that there is a 2.5 percent probability of the accumulated value of a unit investment being less than 0.72 in 5-years time, ignoring fees and expenses and without knowing the initial state of the process (i.e., this is an unconditional probability). For left-tail calibration points (i.e., those quantiles less than 50%), lower factors after model calibration are required. For right-tail calibration points (quantiles above 50%), the model must produce higher factors.

To my astonishment, I was able to find a copy of CIA Document 202012 (sounds like an analysis of the Mayan calendar) via the World Bank. I will refer to it as Final Report of the CIA Task Force on Segregated Fund Investment Guarantees. The calibration is:

Table 1

Accumulation Period    2.5th percentile    5th percentile    10th percentile
One Year                     0.76                0.82              0.90
Five Years                   0.75                0.85              1.05
Ten Years                    0.85                1.05              1.35

The new standard has a significantly nastier left-tail than the prior standards:

Comparison of Left Tails
One Year Horizon
Standard 2.5th %-ile 5th %-ile 10th %-ile
OSFI New -35% -26% -15%
American -22% -16% -10%
Canadian -24% -18% -10%
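The American and Canadian rows above are simply the one-year wealth factors restated as returns (return = wealth factor – 1); a quick sketch of the arithmetic:

```python
# One-year left-tail wealth factors restated as returns (wealth factor - 1).
calibration = {
    "American (AAA)": {0.025: 0.78, 0.05: 0.84, 0.10: 0.90},
    "Canadian (CIA)": {0.025: 0.76, 0.05: 0.82, 0.10: 0.90},
}
for standard, factors in calibration.items():
    row = ", ".join(f"{p:.1%}: {wf - 1:+.0%}" for p, wf in factors.items())
    print(f"{standard}: {row}")
# American (AAA): 2.5%: -22%, 5.0%: -16%, 10.0%: -10%
# Canadian (CIA): 2.5%: -24%, 5.0%: -18%, 10.0%: -10%
```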

However, in the absence of information regarding the insurers’ models together with detailed data, it is impossible to determine how capital requirements will be affected by the change. This could be a welcome first step towards rationalizing seg fund capital requirements; it could also be window-dressing that OSFI knows will have no effect but makes them look tough. As suggested by Desjardins, we will simply have to wait for commentary in the coming batch of quarterly reports. It will be noted that GWO, SLF and MFC have all warned about the potential for adverse change.

For myself, I am disappointed that while OSFI is addressing intricacies of model calibration, it is not mandating additional disclosures or reviewing their highly politicized cover-up from the Fall of 2008. It became quite apparent during the Panic of 2007 that the currently mandated disclosure of the effect of a 10% decline in equity prices is nowhere near good enough to allow investors to take an informed view on the adequacy of capitalization. Would it really be so difficult and so invasive to mandate a table showing the effects on capital and comprehensive income of the effects of 10%, 20% and 30% declines?

Contingent Capital

Nagel, Tory's Opine on Preferred Shares, Contingent Capital

Financial Webring Forum brings to my attention a Globe & Mail article titled What happens to rate reset prefs in Basel III?:

John Nagel at Desjardins Securities has been watching the issue closely, trying to get clarity from the Office of the Superintendent of Financial Institutions. He has a vested interest in the outcome because the Desjardins team invented the structure.

At the moment nothing has been decided, but Mr. Nagel said the last he heard, a contingent capital clause was being considered for all new rate reset issues. As a reminder, contingent capital simply means a security type that will convert to common equity when things get rocky.

As far as he knows, outstanding rate reset issues will be grandfathered under Basel III and will count as Tier 1 capital and equity. Going forward, though, Mr. Nagel thinks prospectuses for these issues could have a section, possibly called the Automatic Exchange Event, that describes how preferred shares are exchanged into common equity.

However, this type of “trigger event” would only happen if OSFI declares the financial institution “non-viable” and Mr. Nagel suspects it’s unlikely that will happen in Canada.

“If a bank or an institution was in trouble, long before it would be declared non-viable they would halt trading and OSFI would say ‘Fine, you’re merging with BMO or RBC’,” he said. If a merger occurred, the distressed institution’s preferred shares would then become obligations of the acquirer.

No moral hazard here, no way, not in Canada!

The critical “point of non-viability” at which Mr. Nagel believes conversion will be triggered is in accord with Dickson’s speech in May, most recently referenced in PrefBlog in the post A Structural Model of Contingent Bank Capital. The recent BIS proposals insist on some conversion point, setting the point of non-viability as the floor limit, as discussed in BIS Proposes CoCos: Regulatory Trigger, Infinite Dilution.

As I have discussed, many a time and oft, I think that’s a crazy place to have the conversion trigger. It may help somewhat in paying for a crisis, but it will do nothing to prevent a crisis. S&P agrees.

In order to prevent a crisis, the conversion trigger has to be set much further from the point of bankruptcy; the McDonald CoCos are an academic treatment of a model I have advocated for some time: there is automatic conversion if the common price falls below a pre-set trigger price; the conversion is from par value of the preferreds into common at that pre-set price. I suggest that a sensible place to start thinking about setting the trigger price is one-half the common equity price at the time of issue of the preferreds.
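
To make the mechanics concrete, here is a minimal sketch of that fixed-price structure, with the trigger set at half the issue-time common price as suggested; the function name and parameters are purely illustrative, not a description of any actual issue:

def coco_conversion(par_value, common_price_at_issue, current_price):
    """Shares of common received per preferred if triggered, or None if no conversion."""
    trigger_price = 0.5 * common_price_at_issue   # suggested starting point for the trigger
    if current_price >= trigger_price:
        return None                               # no conversion while the common trades at or above the trigger
    return par_value / trigger_price              # conversion at the PRE-SET price, not the market price

# e.g. a $25-par preferred issued when the common traded at $40: the trigger is $20,
# and conversion (whenever it happens) delivers 25 / 20 = 1.25 common shares.
print(coco_conversion(25.0, 40.0, 18.0))  # 1.25

The point of converting at the pre-set price rather than the market price is that the number of shares created is known in advance, which is what caps the dilution.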

Tory’s published a piece by Blair W Keefe in May, titled Canada Pushes Embedded Contingent Capital:

A number of concerns arise with the use of embedded contingent capital.

First, it is likely that the conversion itself could cause a “run” on the troubled bank: effectively, the conversion means that the bank is on the eve of insolvency and the conversion does not create any additional capital; it merely improves the quality of the capital. As a practical matter, it will likely be essential for the government to immediately provide funding to the bank; however, with the former holders of subordinated debt and preferred shares being converted into holders of common shares, the government could replenish the subordinated debt rather than being required to replenish the Tier 1 capital, which occurred in the financial crisis. Therefore, it should be less likely that the government would suffer a financial loss.

This echoes my point about prevention vs. cure.

Third, the cost of capital could increase significantly for banks, particularly if the new capital instruments are viewed as equity – given their conversions in times of financial difficulty to common share equity – rather than debt instruments. OSFI is sensitive to this concern and is the reason why OSFI is advocating a trigger that occurs on the eve of insolvency (rather than earlier in the process) when the holders of subordinated debt and preferred shares would anticipate incurring losses in any event.

In other words, OSFI thinks you can get something for nothing. Ain’t gonna happen. Either we’ll raise the cost of capital for the banks, or we’ll do this pretend-regulation thing for free and then find out it doesn’t work. One or the other.

Seventh, if the embedded contingent capital proposals are adopted, how will those requirements need to be reflected in the Basel III capital proposals? Similarly, what treatment will rating agencies give to contingent capital? If the triggering event is considered remote, rating agencies may not give “equity” credit treatment for the instruments.

Finally, with any change of this nature, market participants worry about the unexpected consequences: Will hedge funds or other market participants be able to “game” the system? Will the conversion features create more instability for a bank experiencing some financial difficulty? Could the conversion create a death spiral of dilution? and so on.

Ms. Dickson’s beloved “Market Price Conversion” formula will almost definitely create a death spiral. While fixed-price conversion may create multiple equilibria (which the Fed worries about), I see that as being the lesser of a host of evils. Gaming can be reduced if the conversion trigger is based on a long enough period of time: my original and current suggestion is VWAP measured over 20 consecutive trading days. It would be very expensive to game that to any significant extent, and not very profitable. On the other hand, if the conversion trigger is a single trade below the conversion price … yes, that presents more of a problem.
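
A stylized comparison per $100 of preferred par shows why the market-price formula is so dangerous; the numbers here are invented solely to illustrate the dynamic:

def shares_on_conversion(par, conversion_price):
    return par / conversion_price

fixed_conversion_price = 20.0  # hypothetical pre-set conversion price
for market_price in (20.0, 10.0, 5.0, 1.0):
    market_rule = shares_on_conversion(100.0, market_price)           # grows without bound as the price falls
    fixed_rule = shares_on_conversion(100.0, fixed_conversion_price)  # always 5 shares per $100 of par
    print(f"common at ${market_price:.2f}: market-price rule {market_rule:.0f} shares, fixed-price rule {fixed_rule:.0f} shares")

Under the market-price rule, every further decline in the common manufactures more dilution, which is exactly the feedback loop a short-seller wants to exploit.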

Regulation

Gensler: Regulate Everything Stupidly!

The thing about the United States and its institutions, I’ve found, is that the research is excellent. When Congress or a government agency wants to know what’s going on or how something works, it hires some really good people, gives them a decent budget, a reasonable time-frame and good authority to get answers, and the final product is generally good.

Unfortunately, once the regulators and politicians get ahold of this report, they ignore it and pursue their own idiotic agendas; or the agendas of those who appointed them.

And so it is with the Flash Crash. The Flash Crash report was really good, but now Gary Gensler, a political hack who knows which side his bread’s buttered on, has given a ridiculous speech about possible new rules:

Gensler, speaking today at a conference in Washington, said brokers using computer algorithms might need to face limits on price or the size of orders they can execute for clients. He also questioned whether market participants might benefit from “fuller visibility” of exchanges’ order books.

The CFTC and the Securities and Exchange Commission said in a report last week that a large trader’s attempt to hedge against losses helped set off a chain of events that sent the Dow Jones Industrial Average down 998.50 points on May 6. The trader, who tried to sell 75,000 futures contracts worth $4.1 billion, used an algorithm that gave no regard to price or time.

“The large customer did not execute the trade itself, but used an executing broker,” Gensler said at the Wholesale Markets Brokers’ Association meeting. The event raises questions about whether brokers should “have to adopt certain trading practices when executing a large order,” he said.

Participants in the futures market can only “see up to the tenth offer or bid in an order book,” Gensler said. Liquidity might not have been so “overwhelmed” by a single, large sell order on May 6 if traders had more transparency, he said.

This is insane. The last paragraph is contrary to everything we know about markets. There have been countless studies on the effect of TRACE and of opening access to order books showing that increased transparency leads to a thinner, more brittle market. Making the order book more accessible will increase, not decrease, the chance that a single dumb order can overwhelm the market.

Additionally, setting the brokers up to police whether portfolio managers’ orders are good enough is just a dumb idea. In the first place, it assumes that brokers are smarter than portfolio managers – a highly dubious assumption; in the second place, it adds another layer of red tape to the investment industry, leading to decreased efficiency.

The Flash Crash was caused by a bozo trader taking a huge market impact cost. A few tweaks to the rules seem indicated, but market impact happens every single time an order is executed. Given that there is far more “real money” in the markets than “hot money”, there will, from time to time, be market paroxysms that don’t make much sense. The Flash Crash was unusual only in its size – but Gensler wants to make life safe for the incompetent.

Contingent Capital

Contingent Capital Update

A Reuters columnist suggested Big banks winners from new contingent capital move:

Plans to make hybrid bond investors share the pain when banks run into trouble could polarise the financial sector into big firms that can afford to pay up for capital and smaller players that cannot.

But these plans from the Basel Committee on Banking Supervision could reinforce a pattern emerging in the aftermath of the crisis — a two-tier banking market with international banks that investors favour over smaller banks seen as riskier.

“It could polarise the market further in terms of issuer access and could shut out some smaller institutions and give larger firms a competitive advantage,” said one debt capital markets banker at a major international banking group.

I don’t think that this is necessarily the case. Small banks in the US have never been able to issue their own non-equity regulatory capital – it has all been repackaged into CDOs. This was one of the sideswipes of the Panic of 2007 – the CDO market froze up and these smaller banks were unable to issue.

Investors have mixed views on contingent capital. They would have problems with more issues along the lines of bonds sold by British bank Lloyds, which are designed to convert to equity in the early stages of a bank running into difficulties.

“We don’t think there is a large market for them, certainly among institutional bond investors,” said Roger Doig, credit analyst at Schroders. Analysts say that such issues are difficult for credit rating agencies to evaluate and many institutional credit investors are not mandated to hold equity.

Well, we will see. It’s not fair focussing on the poorly structured Lloyds ECN issue, as that gave no first-loss protection to holders.

The McDonald CoCos are not only much better structured and better investments, but they will also work better in averting a crisis, rather than helping to clean up.

Stan Maes and William Schoutens provide Contingent Capital: An In-Depth Discussion:

Somewhat paradoxically, funded contingent capital or CoCos may actually increase the systemic risks they are intended to reduce. For example, whereas some banking regulators recorded CoCos as capital, some insurance regulators treated them as debt. Hence, significant amounts of CoCos were held by insurers, creating a risk of contagion from the banking sector to the insurance sector. Also a problem of moral hazard arises. Taking excessive risks (by for example buying additional risky assets) could lead to a triggering of the note and hence the wiping out of a lot of outstanding debt. Banks with contingent debt could therefore be tempted to seek additional risk near the trigger point (taking risk on the back of the CoCo holders and maybe taxpayers as well).

Finally, Hart and Zingales (2010) argue that contingent capital introduces inefficiency as conversion eliminates default, which forces inefficient businesses to restructure and incompetent managers to be replaced.

Allowing CoCos to be held as assets by other financial institutions and risk-weighted as debt is just stupid. I won’t waste time discussing stupidity.

Given the above, it may make a lot of sense to define triggers in terms of market based terms. Note however that a simple market based trigger may not be desirable as short sellers may be tempted to push down the stock price in order to profit from the resulting dilution of the bank’s stock following the conversion triggered by the stock price drop. Such a self-generated decline in shares prices is referred to as a “death spiral”. The above problem can be mitigated by making the trigger dependent on a rolling average stock price (say the average closing price of the stock over the preceding 20 business days, as Duffie (2010) and Goodhart (2010) propose). In fact, Flannery (2009) demonstrates that the incentive for speculative attack is lessened or even eliminated altogether by setting a sufficiently high contractual conversion price, such that the conversion becomes anti-dilutive (raising the price of the share rather than lowering it).

A market based trigger has the additional advantage that it limits the ability of management to engage in balance sheet manipulation. Also, it prevents forbearance on behalf of the regulators, as it eliminates regulatory discretion in deciding when the trigger should be invoked. Some analysts refer to the double trigger as the double disaster (regulatory discretion as well as politics).

My own preference is for the Volume Weighted Average Price over a relatively lengthy period (20 trading days?) to be the trigger.
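
A minimal sketch of what such a trigger test might look like, taking daily closes and volumes as inputs; the 20-day window and the trigger price are illustrative parameters, not a proposal for any particular issuer:

def vwap_trigger_breached(prices, volumes, trigger_price, window=20):
    """True if the volume weighted average price over the last `window` days is below the trigger."""
    if len(prices) < window or len(prices) != len(volumes):
        return False
    recent_prices, recent_volumes = prices[-window:], volumes[-window:]
    vwap = sum(p * v for p, v in zip(recent_prices, recent_volumes)) / sum(recent_volumes)
    return vwap < trigger_price

A single ugly print barely moves a 20-day VWAP, which is why gaming this kind of trigger is expensive and unrewarding relative to a single-trade trigger.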

If the conversion ratio is based on the stock price at the time of the triggering point, the amount of capital to be brought in can be very substantial and will make the counterparty a major, if not the largest, shareholder. Original shareholders will be diluted. On the one hand, there is a clear potential dilution effect which could affect the bank’s equity price itself. On the other hand, CoCos may as well introduce a floor on the equity price in these situations.

When the conversion ratio is determined at the time of conversion and not at the time of issuance, the conversion is likely to be relatively generous to the holder of the contingent capital instrument. When the debt holders can expect to get out at close to par value, it would reduce the cost of the contingent capital instrument, making it a significantly cheaper form of capital than equity (of course its low coupon would reduce investors’ appetite).

The authors close with:

We close by raising concerns about the pricing of the instruments by highlighting the similarities between CoCos and equity barrier options and credit default swaps. These barrier-like features and the fact that CoCos are fat-tail event claims, in combination with calibration and model risks, imply that these contingent instruments are very hard to value under a particular model. Since CoCos are expected not to be highly liquid instruments (and until real market prices are widely available), the extreme complexity of mark to modeling CoCos will be a big disadvantage that may hamper their success.

Contingent Capital

Carney: Central Planning = Good

PrefBlog’s Department of Thesis Title Suggestions has another offering for aspiring MAs and MBAs: is the period of market ascendancy over? I suggest that it is arguable that the fall of the Soviet Union in 1991 brought with it a period of free-market ascendancy: behind every political and regulatory decision was the knowledge that central planning doesn’t work.

However, the Panic of 2007 has brought with it the knowledge that free markets don’t work either, and 1991 is ancient history, of no relevance to today’s perceptive and hard-nosed bureaucrats. So the pendulum is swinging, and the pendulum never swings halfway.

In his role as a leading proponent of central planning, Bank of Canada Governor Mark Carney gave a speech today titled The Economic Consequences of the Reforms:

Consider the jaded attitudes of the bank CEO who recounted: “My daughter called me from school one day, and said, ‘Dad, what’s a financial crisis?’ And, without trying to be funny, I said, ‘This type of thing happens every five to seven years.’”

Footnote: J. Dimon, Chairman and CEO, JP Morgan Chase & Company, in testimony to the U.S. Financial Crisis Inquiry Commission, 13 January 2010

Possibly the most intelligent remark in the whole speech, but it was set up as a straw man.

Should we be content with a dreary cycle of upheaval?

Such resignation would be costly. Even after heroic efforts to limit its impact on the real economy, the global financial crisis left a legacy of foregone output, lost jobs, and enormous fiscal deficits. As is typically the case, much of the cost has been borne by countries, businesses, and individuals who did not directly contribute to the fiasco.

This is true to a certain extent. Society is composed of networks of relationships, some productive, others being a waste of time (do you believe that institutional bond salesmen are prized by employers because of their keen insight into the market and their uncanny ability to discern budding trends? Ha-ha! They have a book of clients who will call them when the client wants to trade, that’s all). Humans form these networks with little more intelligence than an ant-hill; we only survive because recessions come along every now and then to sweep away at least a portion of the unproductive networks, leaving their participants to get new jobs, move, change their lifestyle and basically try again to form links to other networks that may, one hopes, be productive.

A financial crisis is larger than a normal recession, as Carmen M. Reinhart & Kenneth S. Rogoff have written. This has two effects: first, the number of inefficient networks that are swept away simultaneously is larger; second, a number of efficient networks get caught up in the frenzy and are swept away as well (they’re dependent upon the availability of credit; trade finance took a beating during the crisis, for instance).

So yeah, financial crises are bad. But the most expensive North American bail-out has been GM (and is continuing to be GM, since they are being restored to health with the aid of electric car subsidies in addition to their usual welfare cheques) and GM was most certainly not an efficient network. The financial crisis was the trigger, not the cause.

Thus, we cannot blame all the pain on faceless bankers; much of it would have occurred anyway.

Carney claims:

By using securitization to diversify the funding sources and reduce credit risks, banks created new exposures. The severing of the relationship between originator and risk holder lowered underwriting and monitoring standards.

There is some doubt about this. The FRBB notes:

The evolving landscape of mortgage lending is also relevant to an ongoing debate in the literature about the direction of causality between reduced underwriting standards and higher house prices. Did lax lending standards shift out the demand curve for new homes and raise house prices, or did higher house prices reduce the chance of future loan losses, thereby encouraging lenders to relax their standards? Economists will debate this issue for some time.

It appears that this inconvenient debate will occur behind closed doors, as far as Carney is concerned.

Carney goes on to state:

In addition, the transfer of risk itself was frequently incomplete, with banks retaining large quantities of supposedly risk-free leveraged super senior tranches of structured products.

This is a clear failure of regulation, but we won’t hear any discussion of this point, either.

These exposures were compounded by the rapid expansion of banks into over-the-counter derivative products. In essence, banks wrote a series of large out-of-the-money options in markets such as those for credit default swaps. As credit standards deteriorated, the tail risks embedded in these strategies became fatter. With pricing and risk management lagging reality, there was a widespread misallocation of capital.

Footnote: See A. Haldane, “The Contribution of the Financial Sector – Miracle or Mirage?” Speech delivered at the Future of Finance Conference, London, 14 July 2010.

An interesting viewpoint, since writing a CDS is the same thing as buying a bond, but without the funding risk. I’ll have to check out that reference sometime.

The shortcomings of regulation were similarly exposed. The shadow banking system was not supported, regulated, or monitored in the same fashion as the conventional banking system, despite the fact they were of equal size on the eve of the crisis.

There were also major flaws in the regulation and supervision of banks themselves. Basel II fed procyclicalities, underestimated risks, and permitted excess leverage. Gallingly, on the day before each went under, every bank that failed (or was saved by the state) reported capital that exceeded the Basel II standard by a wide margin.

So part of the problem was that not enough of the system was badly regulated?

In particular, keeping markets continuously open requires policies and infrastructure that reinforce the private generation of liquidity in normal times and facilitate central bank support in times of crisis. The cornerstone is clearing and settlement processes with risk-reducing elements, particularly central clearing counterparties or “CCPs” for repos and OTC derivatives. Properly risk-proofed CCPs act as firewalls against the propagation of default shocks across major market participants. Through centralised clearing, authorities can also require the use of through-the-cycle margins, which would reduce liquidity spirals and their contribution to boom-bust cycles.(footnote)

The second G-20 imperative is to create a system that can withstand the failure of any single financial institution. From Bear Stearns to Hypo Real Estate to Lehman Brothers, markets failed that test.

Footnote: Market resiliency can also be improved through better and more-readily available information. This reduces information asymmetry, facilitates the valuation process and, hence, supports market efficiency and stability. In this regard, priorities are an expansion of the use of trade repositories for OTC derivatives markets and substantial enhancements to continuous disclosure standards for securitization.

This part is breathtaking. In the first paragraph, Carney extolls the virtues of setting up centralized single points of failure; in the second, he decries the system of having single points of failure. I have not seen this contradiction addressed in a scholarly and robust manner; the attitude seems to be that single points of failure are not important as long as they don’t fail; and they won’t fail because they’re new and will be supervised.

It is, however, the footnote that is egregious in either its ignorance or its intellectual dishonesty – one of the two. It has been shown time and time again that increased public information reduces dealer capital allocation, making the market shallower and more brittle (e.g., see PrefBlog posts regarding TRACE. Additionally, see the work on what happened when the TSX started making level 2 quotes available back in 1993 or whenever it was. I feel quite certain that, somewhere, there is some investigation of what Bloomberg terminals did to the Eurobond market in the late eighties, but I’ve never seen any.)

Today, after a series of extraordinary, but necessary, measures to keep the system functioning, we are awash in moral hazard. If left unchecked, this will distort private behaviour and inflate public costs.

So, as part of the campaign to eliminate moral hazard, we’re going to have central clearinghouses? So it won’t matter if Bank of America does a $50-billion deal with the Bank of Downtown Beanville, as long as it’s centrally cleared? And this will reduce moral hazard?

There’s another internal contradiction here, but I don’t think it will be discussed any time soon.

Another promising avenue is to embed contingent capital features into debt and preferred shares issued by financial institutions. Contingent capital is a security that converts to capital when a financial institution is in serious trouble, thereby replenishing capital without the use of taxpayer funds. Contingent conversions could be embedded in all future new issues of senior unsecured debt and subordinated securities to create a broader bail-in approach. Its presence would also discipline management, since common shareholders would be incented to act prudently to avoid having their stakes diluted by conversion. Overall, the Bank of Canada believes that contingent capital can reduce moral hazard and increase the efficiency of bank capital structures. We correspondingly welcome the Basel Committee’s recent public consultation paper on this topic.

Carney’s proposed inclusion of senior debt as a form of contingent capital has been discussed in the post Carney: Ban the bond!. As has been often discussed on PrefBlog, this is simply a mechanism whereby bureaucrats can be given the power of bankruptcy courts, with none of those inconvenient creditors’ rights and committees to worry about. Just like the GM bail-out!

He then reprises the BoC paper on the effects of increased bank capitalization on mortgage rates, which has been discussed in the post BIS Assesses Effects of Increasing Bank Capitalization among others.

First, banks are assumed to fully pass on the costs of higher capital and liquidity requirements to borrowers rather than reducing their current returns on shareholders’ equity or operating expenses, such as compensation, to adjust to the new rules.

Consider the alternative. If banks were to reduce personnel expenses by only 10 per cent (equal to a 5 per cent reduction in operating expenses), they could lower spreads by an amount that would completely offset the impact of a 2-percentage-point increase in capital requirements.

Second, higher capital and liquidity requirements are assumed to have a permanent effect on lending spreads, and hence on the level of economic output. No allowance is made for the possibility that households and firms may find cheaper alternative sources of financing.

The second point is critical. It seems quite definite that this will happen – if bank mortgage rates go up 25-50bp in the absence of other changes, then mortgage brokers will do a booming business. But he wants to regulate shadow-banks, too. And it will mean that shadow banks (or unregulated shadow-shadow-banks) will skim the cream off the market, leaving the banks with lower credit quality.

There has been nowhere near enough work done on the knock-on effect of these changes.

However, there are a variety of other potential benefits from higher capital and liquidity standards and the broader range of G-20 reforms.
First, the variability of economic cycles should be reduced by a host of macroprudential measures. Analysis by the Bank of Canada and the Basel group suggests a modest dampening in output volatility can be achieved from the Basel III proposals, as higher capital and liquidity allow banks to smooth the supply of credit over the cycle. For instance, a 2-percentage-point rise in capital ratios lowers the standard deviation of output by about 3 per cent.

So it would seem that we’re going to have another Great Moderation, except that this time irrational exuberance will not occur and we’ll live in the Land of Milk and Honey forever. Well, it’s a nice dream.

Greater competition commonly leads to more innovative and diverse strategies, which would further promote resiliency of the system. Greater competition and safer banks may also contribute to lower expected return on equity (ROE) for financial institutions. This, in turn, could help offset the costs and increase the net benefits discussed earlier.

These gains from competition could be considerable. The financial services sector earns a 50 per cent higher return on equity than the economy-wide average. If greater competition leads to a one-percentage-point decline in the ROE (through a decline in spreads), the estimated cost from a one-percentage point increase in capital would be completely offset.

Do all you bank equity investors hear this properly? What will the desired 1% decline in ROE do to your portfolio?

This was, quite frankly, a very scary speech.