Zero Hedge 
July 13, 2010
Up until recently, any debate between proponents and opponents of High Frequency Trading would typically devolve into heated exchanges of high conviction on either side, with discussions rapidly deteriorating into ad hominem attacks and the producer screaming ‘cut to commercial’ to prevent fistfights. Luckily, all this is about to change. In a research paper by Reginald Smith of the Bouchet Franklin Institute in Rochester titled “Is high-frequency trading inducing changes in market microstructure and dynamics?”, the author finds that he “can clearly demonstrate that HFT is having an increasingly large impact on the microstructure of equity trading dynamics. Traded value, and by extension trading volume, fluctuations are starting to show self-similarity at increasingly shorter timescales. Values which were once only present on the orders of several hours or days are now commonplace in the timescale of seconds or minutes. It is important that the trading algorithms of HFT traders, as well as those who seek to understand, improve, or regulate HFT realize that the overall structure of trading is influenced in a measurable manner by HFT and that Gaussian noise models of short term trading volume fluctuations likely are increasingly inapplicable.” In other words, the author finds ample evidence that during the past decade (on the NASDAQ) and especially since the 2005 revision of Reg NMS (on the NYSE), stock trading increasingly demonstrates “self-similar” fractal patterns, resulting in volatility surges, recursive feedback loops, and a market structure that is increasingly a product of the trading mechanism itself. In the process, as demonstrated by a Hurst Exponent gravitating increasingly further away from 0.5 (i.e., Gaussian white noise territory), the Markov-process nature of stock trading is called into question, and thus the whole premise of an efficient market has to be reevaluated. Simply said: HFT has been shown to affect the fairness of trading.
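For those who prefer to see the Hurst Exponent rather than read about it, here is a back-of-envelope sketch using the classical rescaled-range (R/S) estimator (the paper's own estimation methodology may differ; the AR(1) series below is our own crude, hypothetical stand-in for correlated order flow, not the paper's data):

```python
import numpy as np

def hurst_rs(x, min_chunk=16):
    """Estimate the Hurst exponent via classical rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_means = [], []
    size = min_chunk
    while size <= n // 4:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviation from the mean
            r = dev.max() - dev.min()              # range of the deviation
            s = chunk.std()
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_means.append(np.mean(rs))
        size *= 2
    # slope of log(R/S) against log(window size) estimates H
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(8192)   # memoryless, "efficient market" noise
persistent = np.empty(8192)         # AR(1) series: a crude stand-in for correlated flow
persistent[0] = 0.0
for t in range(1, 8192):
    persistent[t] = 0.9 * persistent[t - 1] + rng.standard_normal()

h_white, h_persistent = hurst_rs(white), hurst_rs(persistent)
print(round(h_white, 2), round(h_persistent, 2))  # roughly 0.5 vs. well above 0.5
```

The point of the toy comparison: uncorrelated noise sits near the H = 0.5 benchmark, while any persistence in the series pushes the estimate above it, which is exactly the drift the paper documents in small-lot traded value.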
The paper is, needless to say, a must read for everyone with even a passing interest in stock trading and market regulation (alas, yes, that would mean the SEC, and Congress). And while one of the key qualities of the paper is its presentation of the history and implications of High Frequency Trading, and its rise to market dominance primarily as a result of the revision of Reg NMS, which allowed stock trading to become a free-for-all for every algo and ECN/ATS imaginable, the key findings are what make it unique. In analyzing stochastic processes and fractal phenomena, and concluding that the Hurst Exponent of transactions that involve less than 1,500 shares per trade (and especially less than 250 – a distinct subdomain relegated to HFT strategies) is no longer 0.5, the author validates the skepticism of all those who for over a year (such as Zero Hedge) have claimed that the direct and increasing involvement of HFT is a de-evolutionary process that is leading to increasing market fragmentation, self-sameness, destabilization, and volatility, offset merely by allegedly improved liquidity, which incidentally disappears on a moment’s notice when the negative side-effects of HFT overwhelm the positive, such as was the case on May 6. Furthermore, the author finds that the type of fractal recursive feedback loops inspired by increasing HFT participation leads to spikes in correlation: “Correlations previously only seen across hours or days in trading time series are increasingly showing up in the timescales of seconds or minutes.” And due to the implied fractal nature of trading (think standing waves, fern leaves, sandy beaches, and all other goodies unleashed upon the world courtesy of Benoit Mandelbrot), it appears investors now have to consider such quixotic issues as Lorenz attractors when it comes to simple trading.
What is most troubling is that micro similarities, as postulated by non-linear theory, tend to rapidly evolve into massively scaled topological disturbances, and thus a few simple resonant trades can rapidly avalanche into a major market-destabilizing event… such as that seen on May 6.
While the math of the article is a little daunting, and the author appears to derive a peculiar satisfaction from throwing the Riemann Zeta function into the general mathematical stew (incidentally, with the prevalent IQ of Zero Hedge readers being sufficiently high to allow at least a valiant effort at proving the Riemann Hypothesis, we hope some of our more industrious readers take it upon themselves to venture and pocket the generous proceeds from the Millennium Prize, for which we will be content to receive a mere pittance as a donation for proffering this forum), the observations and conclusions are watertight:
Given the complex nature of HFT trades and the frequent opacity of firm trading strategies, it is difficult to pinpoint exactly what about HFT causes a higher correlation structure. One answer could be that HFT is the only type of trading that can exhibit trades that are reactive and exhibit feedback effects on short timescales that traditional trading generates over longer timescales.
Another cause may be the nature of HFT strategies themselves. Most HFT strategies can fall into two buckets Lehoczky and Schervish (2009):
(i) Optimal order execution: trades whose purpose is to break large share size trades into smaller ones for easier execution in the market without affecting market prices and eroding profit. There are two possibilities here. One that the breaking down of large orders to smaller ones approximates a multiplicative cascade which can generate self-similar behavior over time Mandelbrot (1974). Second, the queuing of chunks of larger orders under an M/G/1 queue could also generate correlations in the trade flow. However, it is questionable whether the “service time”, or time to sell shares in a limit order, is a distribution with infinite variance as this queuing model requires.
(ii) Statistical arbitrage: trades who use the properties of stock fluctuations and volatility to gain quick profits. Anecdotally, these are most profitable in times of high market volatility. Perhaps since these algorithms work through measuring market fluctuations and reacting on them, a complex system of feedback based trades could generate self-similarity from a variety of yet unknown processes.
Since firm trade strategies are carefully guarded secrets, it is difficult to tell which of these strategies predominate and induces most of the temporal correlations.
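The multiplicative cascade invoked in strategy (i) is worth making concrete. In a minimal sketch (our own illustration, not the paper's construction; the split range is an arbitrary assumption), a parent quantity is recursively divided with random weights, conserving total mass while producing exactly the kind of intermittent, bursty pattern Mandelbrot (1974) described:

```python
import numpy as np

def multiplicative_cascade(total, depth, rng):
    """Recursively split a parent quantity with random weights.

    Each level halves every interval and divides its mass in a random
    proportion, conserving the total while producing the intermittent,
    self-similar bursts characteristic of a Mandelbrot-style cascade."""
    cells = np.array([float(total)])
    for _ in range(depth):
        w = rng.uniform(0.2, 0.8, size=len(cells))  # random split weight per cell
        # interleave left and right halves so intervals keep their time order
        cells = np.column_stack([cells * w, cells * (1 - w)]).ravel()
    return cells

rng = np.random.default_rng(1)
volume = multiplicative_cascade(1_000_000, depth=12, rng=rng)  # 4096 intervals
print(len(volume), round(volume.sum()))  # 4096 1000000
```

Note that nothing is added or removed at any level – total traded value is conserved – yet a handful of intervals end up carrying volume tens of times the average, which is the self-similar burstiness a Gaussian noise model cannot reproduce.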
When it comes to the interplay of optimal order execution and statistical arbitrage, it can easily be seen why the splitting of large blocks into child orders could give rise to a self-similar trading pattern that reverberates across the market, in an increasingly micro-correlated and fractal marketplace. In the course of events on May 6, it is perfectly feasible that as many mutual funds commenced dumping large blocks of stock, assorted algorithms had to work overtime to split these orders into millions of small trade blocks. And with statistical arbitrage models programmed to game and front-run large order blocks by divining the intentions behind repeated micro orders, it becomes all too clear how a rapid selling event can quickly culminate in a bid-less environment where both the stat arb and order execution HFT algorithms are all on the same side of the boat. Consider the market action from the past several days as indicative of micro volume accumulation by HFTs, offset only by mega volume dumping – once all the HFTs are forced to unwind and go to cash, the actual principal liquidity providers, who in their desperation become liquidity takers, suddenly find themselves with no recourse but to hit any bid. Which is why the NYSE explanation of Liquidity Replenishment Points is nothing but complete BS – the market meltdown had nothing to do with selective order routing to non-NYSE venues, and everything to do with a fractal implosion, in which, as Nassim Taleb would explain, the Hurst Exponent briefly went from 0.5 to infinity minus 1, and the entire market became correlated with itself.
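The block-splitting half of that story is trivial to sketch. Assuming, purely for illustration, a hypothetical 1,000,000-share parent order and randomized child sizes capped at the sub-250-share bucket the paper flags as HFT territory:

```python
import random

def slice_parent_order(parent_shares, min_child=100, max_child=250, seed=42):
    """Split a large parent order into randomized child orders.

    A toy stand-in for an 'optimal order execution' scheduler; real
    algorithms also randomize timing, venue, and price limits."""
    rng = random.Random(seed)
    children, remaining = [], parent_shares
    while remaining > 0:
        child = min(remaining, rng.randint(min_child, max_child))  # randomized size
        children.append(child)
        remaining -= child
    return children

children = slice_parent_order(1_000_000)
print(len(children), sum(children), max(children))
```

One mutual fund block thus becomes thousands of sub-250-share prints – exactly the repeated micro-order signature that stat arb models are built to detect and trade against.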
Of course, if the paper is correct, and the empirical evidence presented in it is sufficient to eliminate doubt, it means that in the coming years we will have an exponentially growing number of days in which May 6-type events recur over and over.
The paper’s conclusion:
Given the above research results, we can clearly demonstrate that HFT is having an increasingly large impact on the microstructure of equity trading dynamics. We can determine this through several main pieces of evidence. First, the Hurst exponent H of traded value in short time scales (15 minutes or less) is increasing over time from its previous Gaussian white noise values of 0.5. Second, this increase becomes most marked, especially in the NYSE stocks, following the implementation of Reg NMS by the SEC which led to the boom in HFT. Finally, H > 0.5 traded value activity is clearly linked with small share trades which are the trades dominated by HFT traffic.
In addition, this small share trade activity has grown rapidly as a proportion of all trades. The clear transition to HFT influenced trading noise is more easily seen in the NYSE stocks than with the NASDAQ stocks except NWSA. The main exceptions seem to be GENZ and GILD in the NASDAQ which are less widely traded stocks. There are values of H consistently above 0.5 but not to the magnitude of the other stocks. The electronic nature of the NASDAQ market and its earlier adoption of HFT likely has made the higher H values not as recent a development as in the NYSE, but a development nevertheless.
Given the relative burstiness of signals with H > 0.5 we can also determine that volatility in trading patterns is no longer due to just adverse events but is becoming an increasingly intrinsic part of trading activity. Like internet traffic Leland et. al. (1994), if HFT trades are self-similar with H > 0.5, more participants in the market generate more volatility, not more predictable behavior.
There are a few caveats to be recognized. First, given the limited timescale investigated, it is impossible to determine from these results alone what, if any, long-term effects are incorporating the short-term fluctuations. Second, it is an open question whether the benefits of liquidity offset the increased volatility. Third, this increased volatility due to self-similarity is not necessarily [TD: but very well could be as described above] the cause of several high profile crashes in stock prices such as that of Procter & Gamble (PG) on May 6, 2010 or a subsequent jump (which initiated circuit breakers) of the Washington Post (WPO) on June 16, 2010. Dramatic events due to traceable causes such as error or a rogue algorithm are not accounted for in the increased volatility though it does not rule out larger events caused by typical trading activities. Finally, this paper does not investigate any induced correlations, or lack thereof, in pricing and returns on short timescales which is another crucial issue.
Traded value, and by extension trading volume, fluctuations are starting to show self-similarity at increasingly shorter timescales. Values which were once only present on the orders of several hours or days are now commonplace in the timescale of seconds or minutes. It is important that the trading algorithms of HFT traders, as well as those who seek to understand, improve, or regulate HFT realize that the overall structure of trading is influenced in a measurable manner by HFT and that Gaussian noise models of short term trading volume fluctuations likely are increasingly inapplicable.
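The conclusion's claim that more participants generate more volatility rather than smoother behavior can be checked with the aggregated-variance method, a standard self-similarity diagnostic (our illustration, not taken from the paper): for uncorrelated noise the variance of block means decays as 1/m, giving slope -1 and H = 1 + slope/2 = 0.5, while self-similar traffic with H > 0.5 decays more slowly and refuses to average out.

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes=(1, 2, 4, 8, 16, 32, 64)):
    """Aggregated-variance Hurst estimate: Var(block mean) ~ m**(2H - 2)."""
    x = np.asarray(x, dtype=float)
    variances = []
    for m in block_sizes:
        n_blocks = len(x) // m
        block_means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        variances.append(block_means.var())
    # slope of log-variance vs. log-block-size equals 2H - 2
    slope, _ = np.polyfit(np.log(block_sizes), np.log(variances), 1)
    return 1 + slope / 2

rng = np.random.default_rng(2)
h = hurst_aggregated_variance(rng.standard_normal(65536))
print(round(h, 2))  # ~0.5: iid noise smooths out under aggregation
```

For the HFT-era traded value series the paper studies, the same diagnostic would put H above 0.5 – meaning aggregating more of it does not wash the volatility out, which is precisely the Leland et al. (1994) internet-traffic analogy.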
As for evidence, we refer readers to the paper itself, but in a nutshell, the author finds that over the years, on both the NYSE (after the Reg NMS revision in 2005) and on the Nasdaq (from before then, as the Nasdaq was the original spawn of HFT strategies), the prevalent share bucket moved from greater than 1,000 shares per trade to 250 or less – direct evidence of increasing HFT dominance, especially coupled with previous Tabb Group evidence that over 70% of all trading is conducted by HFT – while the Hurst Exponent of all trading moved increasingly away from 0.5, hitting as high as 0.7 in some cases: a stunning result which puts the entire stochastic nature of stock markets in question! (see charts below).
Charting the average size per trade since 2002:
And charting the Hurst Exponent as calculated by the authors in a variety of Nasdaq and NYSE stocks:
We are confident that this paper will serve as the guiding light to much more comparable research, due to the unique approach the author takes in analyzing stock behavior. In moving away from a traditional and simplistic Gaussian frame, Smith isolates the very nature of the problem, which, like that of any other non-linear system prone to Black Swanness, has to be sought in the plane of fractal geometry. Luckily, the author provides the one elusive observation which many market participants (at least those whose livelihoods are not tied to the perpetuation of the destructive HFT processes) had long sensed was on the tips of their tongues, yet whose only comprehensible elucidation until now was the trite and overworn “the market is broken.” At least now we know that this is a fact.
Unfortunately, as the paper requires slightly more than first grade comprehension and math skills, it will never be read by anyone at the SEC, or those in Congress, who are pretending to be conducting Financial Regulation Reform, when the items described in this paper are precisely the things that any reform should be addressing.
And while we again urge everyone to read the full paper, below we present the section of the paper that does a terrific job in explaining the arrival of HFT, its development over the ages, and its parasitic role in market structure.
A brief history of the events leading up to high frequency trading
In 1792, as a new nation worked out its new constitution and laws, another less heralded revolution began when several men met under a Buttonwood tree, and later coffee houses, on Wall St. in New York City to buy and sell equity positions in fledgling companies. An exclusive members club from the beginning, the New York Stock Exchange (NYSE) rapidly grew to become one of the most powerful exchanges in the world. Ironically, even the non-member curbside traders outside the coffee houses gradually evolved into over-the-counter (OTC) traders and later, the NASDAQ. A very detailed and colorful history of this evolution is given in Markham and Harty (2008); Harris (2003).
Broadly, the role of the exchange is to act as a market maker for company stocks where buyers with excess capital would like to purchase shares and sellers with excess stock would like to liquidate their position. Several roles developed in the NYSE to encourage smooth operation and liquidity. There came to be several types of market makers for buyers and sellers known as brokers, dealers, and specialists. The usual transaction involves the execution of a limit order versus other offers. A limit order, as contrasted to a market order which buys or sells a stock at the prevailing market rate, instructs the purchase of a stock up to a limit ceiling or the sale of a stock down to a limit floor. Brokers act on behalf of a third-party, typically an institutional investor, to buy or sell stock according to the pricing of the limit order. Dealers, also known as market-makers, buy and sell stock using their own capital, purchasing at the bid price and selling at the ask price, pocketing the bid-ask spread as profit. This helps ensure market liquidity. A specialist acts as either a broker or dealer but only for a specific list of stocks that he or she is responsible for. As a broker, the specialist executes trades from a display book of outstanding orders and as a dealer a specialist can trade on his or her own account to stabilize stock prices.
The great rupture in the business-as-usual came with the Great Depression and the unfolding revelations of corrupt stock dealings, fraud, and other such malfeasance. The Securities and Exchange Commission (SEC) was created by Congress in 1934 by the Securities Exchange Act. Since then, it has acted as the regulator of the stock exchanges and the companies that list on them. Over time, the SEC and Wall Street have evolved together, influencing each other in the process.
By the 1960s, the volume of traded shares was overwhelming the traditional paper systems that brokers, dealers, and specialists on the floor used. A “paperwork crisis” developed that seriously hampered operations of the NYSE and led to the first electronic order routing system, DOT, by 1976. In addition, inefficiencies in the handling of OTC orders, also known as “pink sheets”, led to a 1963 SEC recommendation of changes to the industry which led the National Association of Securities Dealers (NASD) to form the NASDAQ in 1968. Orders were displayed electronically while the deals were made through the telephone through “market makers” instead of dealers or specialists. In 1975, under the prompting of Congress, the SEC passed the Regulation of the National Market System, more commonly known as Reg NMS, which was used to mandate the interconnectedness of various markets for stocks to prevent a tiered marketplace where small, medium, and large investors would have a specific market and smaller investors would be disadvantaged. One of the outcomes of Reg NMS was the accelerated use of technology to connect markets and display quotes. This would enable stocks to be traded on different, albeit connected, exchanges from the NYSE such as the soon to emerge electronic communication networks (ECNs), known to the SEC as alternative trading systems (ATS).
In the 1980s, the NYSE upgraded their order system again to SuperDot. The increasing speed and availability of computers helped enable trading of entire portfolios of stocks simultaneously in what became known as program trading. One of the first instances of algorithmic trading, program trading was not high-frequency per se but used to trade large orders of multiple stocks at once. Program trading was profitable but is now often cited as one of the largest factors behind the 1987 Black Monday crash. Even the human systems broke down, however, as many NASDAQ market makers did not answer the phones during the crash.
The true acceleration of progress and the advent of what is now known as high frequency trading occurred during the 1990s. The telecommunications technology boom as well as the dotcom frenzy led to many extensive changes. A new group of exchanges became prominent. These were the ECN/ATS exchanges. Using new computer technology, they provided an alternate market platform where buyers and sellers could have their orders matched automatically to the best price without middlemen such as dealers or brokers. They also allowed complete anonymity for buyers and sellers. One issue though was even though they were connected to the exchanges via Reg NMS requirements, there was little mandated transparency. In other words, deals settled on the ECN/ATS were not revealed to the exchange. On the flip side, the exchange brokers were not obligated to transact with an order displayed from an ECN, even if it was better for their customer.
This began to change, partially because of revelations of multiple violations of fiduciary duty by specialists in the NYSE. One example, similar to the soon to be invented ‘flash trading’, was where they would “interposition” themselves between their clients and the best offer in order to either buy low from the client and sell higher to the NBBO (National Best Bid and Offer; the best price) price or vice versa.
In 1997, the SEC passed the Limit Order Display rule to improve transparency that required market makers to include offers at better prices than those the market maker is offering to investors. This allows investors to know the NBBO and circumvent corruption.
However, this rule also had the effect of requiring the exchanges to display electronic orders from the ECN/ATS systems that were better priced. The SEC followed up in 1998 with Regulation ATS. Reg ATS allowed ECN/ATS systems to register as either brokers or exchanges. It also protected investors by mandating reporting requirements and transparency of aggregate trading activity for ECN/ATS systems once they reach a certain size.
These changes opened up huge new opportunities for ECN/ATS systems by allowing them to display and execute quotes directly with the big exchanges. Though they were required to report these transactions to the exchange, they gained much more. In particular, with their advanced technology and low-latency communication systems, they became a portal through which next generation algorithmic trading and high frequency trading algorithms could have access to wider markets. Changes still continued to accelerate.
In 2000, there were two other groundbreaking changes. First was the decimalization of the price quotes on US stocks. This reduced the bid-ask spreads and made it much easier for computer algorithms to trade stocks and conduct arbitrage. The NYSE also repealed Rule 390 which had prohibited the trading of stocks listed prior to April 26, 1979 outside of the exchange. High frequency trading began to grow rapidly but did not truly take off until 2005.
In June 2005, the SEC revised Reg NMS with several key mandates. Some were relatively minor such as the Sub-Penny rule which prevented stock quotations at a resolution less than one cent. However, the biggest change was Rule 611, also known as the Order Protection Rule. Whereas with the Limit Order Display rule, exchanges were merely required to display better quotes, Reg NMS Rule 611 mandated, with some exceptions, that trades are always automatically executed at the best quote possible. Price is the only issue and not counterparty reliability, transaction speed, etc. The opening for high frequency trading here is clear. The automatic trade execution created the perfect environment for high speed transactions that would be automatically executed and not sit in a queue waiting for approval by a market maker or some vague exchange rule. The limit to trading speed and profit was now mostly the latency on electronic trading systems.
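The Order Protection Rule's core logic – price, and only price, decides where a marketable order goes – fits in a few lines. A minimal sketch, with hypothetical venues and quotes (real Rule 611 routing involves protected quotations and exceptions such as intermarket sweep orders, none of which are modeled here):

```python
def route_order(side, quotes):
    """Toy Order Protection Rule: send the order to the best displayed quote.

    quotes maps venue name -> (bid, ask). A marketable buy must hit the
    lowest ask and a sell the highest bid, regardless of the venue's
    speed or reliability - price is the only criterion."""
    if side == "buy":
        venue = min(quotes, key=lambda v: quotes[v][1])  # lowest ask wins
        return venue, quotes[venue][1]
    venue = max(quotes, key=lambda v: quotes[v][0])      # highest bid wins
    return venue, quotes[venue][0]

# hypothetical displayed quotes across three venues
quotes = {"NYSE": (10.01, 10.03), "BATS": (10.02, 10.04), "ARCA": (10.00, 10.02)}
print(route_order("buy", quotes))   # ('ARCA', 10.02)
print(route_order("sell", quotes))  # ('BATS', 10.02)
```

Because execution at the best quote is automatic, whoever posts a marginally better price – however fleetingly – captures the flow, which is exactly the game low-latency systems are built to win.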
The boom in ECN/ATS business created huge competition for exchanges causing traditional exchanges (NYSE & Euronext) to merge and some exchanges to merge with large ECNs (NYSE & Archipelago). In addition, the competition created increasingly risky business strategies to lure customers. CBSX and DirectEdge pioneered ‘flash trading’ on the Chicago Board of Exchange and the NYSE/NASDAQ respectively where large limit orders would be flashed for 50 milliseconds on the network to paying customers who could then rapidly trade in order to profit from them before public advertisement. Many of these were discontinued in late 2009 after public outcry but HFT was already the dominant vehicle for US equity trading as shown in figure 1. HFT thrives on rapid fire trading of small sized orders, and the overall shares/trade count has dropped rapidly over the last few years, as shown in figure 2. In addition, the HFT strategy of taking advantage of pricing signals from large orders has forced many orders off exchanges into proprietary trading networks called ‘dark pools’ which get their name from the fact they are private networks which only report the prices of transactions after the transaction has occurred and typically anonymously match large orders without price advertisements. These dark pools allow a safer environment for large trades which (usually) keep out opportunistic high frequency traders. The basic structure of today’s market and a timeline of developments are given in figure 3 and figure 4. For more detailed information, see Stoll (2006); McAndrews and Stefandis (2000); Francis et. al. (2009); Mittal (2008); Degryse et. al. (2008); Palmer (2009).
Full paper link here.