Do We Really Need the Speed?
Algorithms overwhelmed on May 6th.
June 10, 2010
Algorithmic trading, though a standard part of the tool kit for brokerages and their institutional clients, remains controversial. It was blamed in part for May 6th's flash crash. In testimony before the U.S. Congress, Commodity Futures Trading Commission chair Gary Gensler explained:
“We understand that this particular market participant sought to hedge its stock portfolio in the futures markets by selling a pre-determined amount of futures through an executing broker’s automated execution system. In this circumstance, we further understand that the trade was executed through an executing broker’s algorithm that was meant to limit market impact by limiting volume at an average of nine percent of the volume traded during that period.”
In this case, however, the algos backfired.
“In normal markets, market participants would anticipate lower market impact by restricting their volume to some single digit percent because they generally view that higher volume is a reasonable proxy for better liquidity.
“In the market on May 6, however, as the staff preliminary review indicates, higher volume did not necessarily mean better liquidity.”
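The execution style Gensler describes is commonly called a volume-participation (or "percentage of volume") algorithm. A minimal sketch of the idea follows; the function name, interval structure, and numbers are illustrative assumptions, not details from the testimony:

```python
# Hypothetical sketch of a volume-participation ("POV") execution
# algorithm: in each time interval, the child order is capped at a
# fixed fraction (here 9%) of the market volume just observed.

def pov_child_sizes(total_qty, market_volumes, participation=0.09):
    """Slice total_qty into child orders, each at most
    `participation` of that interval's observed market volume."""
    remaining = total_qty
    children = []
    for vol in market_volumes:
        if remaining <= 0:
            break
        # Cap this slice at the participation rate, but never
        # sell more than what is left of the parent order.
        child = min(remaining, int(vol * participation))
        children.append(child)
        remaining -= child
    return children

# Example: work a 75,000-contract order against per-minute volume.
volumes = [200_000, 150_000, 400_000, 300_000]
print(pov_child_sizes(75_000, volumes))
# → [18000, 13500, 36000, 7500]
```

Note the mechanism Gensler's testimony points to: the cap is defined relative to traded volume, so when volume spiked on May 6th, the same 9% limit permitted the algorithm to sell faster, not slower, into a market where high volume no longer signaled deep liquidity.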
An academic paper on algorithmic trading and liquidity – the source of the quotes that follow – describes how the practice took hold:
“Before algorithmic trading took hold, a pension fund manager who wanted to buy 30,000 shares of IBM might hire a broker-dealer to search for a counterparty to execute the entire quantity at once in a block trade. Alternatively, that institutional investor might have hired a New York Stock Exchange (NYSE) floor broker to go stand at the IBM post and quietly “work” the order, using his judgment and discretion to buy a little bit here and there over the course of the trading day to keep from driving the IBM share price up too far. As trading became more electronic, it became easier and cheaper to replicate that floor trader with a computer program doing algorithmic trading.”
Of course, it’s not always clear what benefit comes from displacing humans with computers, the authors note – almost in anticipation of May 6th:
“For example, the intense activity generated by algorithms threatens to overwhelm exchanges and market data providers, forcing significant upgrades to their infrastructures.”
Except that the basis of their paper is the NYSE’s 2003 introduction of autoquote, which replaced manual quote updates by the specialists on the trading floor. While algos can have less-than-liquidity-providing effects, the technological arms race can also drain liquidity – at least from the executing brokers. On balance, the authors conclude:
“We find that algorithmic trading does in fact improve liquidity for large-cap stocks. Quoted and effective spreads narrow under autoquote. The narrower spreads are a result of a sharp decline in adverse selection, or equivalently a decrease in the amount of price discovery associated with trades. Algorithmic trading increases the amount of price discovery that occurs without trading, implying that quotes become more informative.”
But there’s an anomaly associated with the technological arms race:
“Surprisingly, we find that algorithmic trading increases realized spreads and other measures of liquidity supplier revenues. This is surprising because we initially expected that if AT improved liquidity, the mechanism would be competition between liquidity providers. However, the evidence clearly indicates that liquidity suppliers are capturing some of the surplus for themselves. The most natural explanation is that, at least during the introduction of autoquote, algorithms had market power. Over a longer time period liquidity supplier revenues decline, suggesting that any market power was temporary, perhaps because new algorithms require considerable investment and time-to-build.”
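The three measures the authors use fit together in a standard decomposition from market microstructure, which is not spelled out in the quotes above. A minimal sketch (not the paper’s own code; the example prices are invented):

```python
# Standard spread decomposition for a single trade. For a trade at
# price p with sign q (+1 buyer-initiated, -1 seller-initiated),
# quote midpoint m at the trade, and midpoint m_future a few minutes
# later:
#   effective spread = 2*q*(p - m)         cost paid by the trader
#   realized spread  = 2*q*(p - m_future)  revenue kept by the liquidity supplier
#   price impact     = 2*q*(m_future - m)  adverse-selection component
# By construction, effective = realized + price impact.

def spread_decomposition(price, sign, mid, mid_future):
    effective = 2 * sign * (price - mid)
    realized = 2 * sign * (price - mid_future)
    price_impact = 2 * sign * (mid_future - mid)
    return effective, realized, price_impact

# Example: a buy at 100.05 against a 100.00 midpoint that drifts up
# to 100.03 after the trade.
eff, real, impact = spread_decomposition(100.05, +1, 100.00, 100.03)
# eff ≈ 0.10, real ≈ 0.04, impact ≈ 0.06
```

Read against this identity, the paper’s findings say: autoquote narrowed effective spreads mainly by shrinking the price-impact (adverse-selection) term, while the realized-spread term – the liquidity suppliers’ cut – initially grew, which is the market-power anomaly the authors flag.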