Credit Suisse Crossfinder Dark Pool Negotiations

Zerohedge reports that Credit Suisse is in discussions with the New York Attorney General’s Office and the Securities and Exchange Commission about alleged market manipulation in its Crossfinder dark pool. Some thoughts:

 

1. The NY Attorney General’s Office and the SEC will have a wealth of market-based information about Crossfinder that could inform industry research on high-frequency trading (HFT) and dark pool arbitrage.

 

2. The academic literature on market microstructure identifies some of the mechanisms that electronic execution services use for HFT / dark pool arbitrage. But academics do not really understand how to use these mechanisms for alpha generation in the way that electronic execution services do. That knowledge gap is reflected in the types of algorithmic trading that Credit Suisse uses.

 

3. Electronic execution services are now more sophisticated about flow trading. There is little public literature on this apart from hints in Stephanie Hammer’s Architects of Electronic Trading (New York: John Wiley & Sons, 2013) and the IEX discussions in Michael Lewis’s Flash Boys (New York: W.W. Norton & Company, 2014).

 

4. I have seen order-fill and midpoint gaming in trades that I have placed. This highlights the value of carefully examining a day’s trade and transaction records and comparing them with what is known about market microstructure (in both academic research and ongoing media coverage).
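To make point 4 concrete, below is a minimal sketch of the kind of check I mean: comparing each fill in a day’s transaction records against the quoted midpoint at the time of execution. The file name and columns (timestamp, side, fill_price, bid, ask) are assumptions for illustration only, not a description of any broker’s actual export format.

```python
# Minimal sketch: flag fills executed worse than the prevailing quoted midpoint.
# Assumes a hypothetical fills.csv with columns: timestamp, side, fill_price, bid, ask.
import csv

def check_fills(path="fills.csv", tolerance=0.0001):
    """Return fills whose price was worse than the midpoint by more than `tolerance`."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            bid, ask = float(row["bid"]), float(row["ask"])
            fill = float(row["fill_price"])
            mid = (bid + ask) / 2.0
            # A buy filled above the midpoint (or a sell below it) gives up price improvement.
            slippage = (fill - mid) if row["side"] == "buy" else (mid - fill)
            if slippage > tolerance:
                flagged.append((row["timestamp"], row["side"], fill, mid, round(slippage, 4)))
    return flagged

if __name__ == "__main__":
    for record in check_fills():
        print(record)
```

Fills that consistently land just outside the midpoint, rather than at or inside it, are the pattern worth comparing against the market microstructure literature.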

Norway’s Sovereign Wealth Fund on HFT

Oyvind G. Schanke (New York Times)

Norway’s sovereign wealth fund Norges Bank Investment Management has released a new report about high-frequency trading (HFT). NBIM’s report finds that HFT firms front-run the large orders of asset management firms; that there is “transient liquidity” due to cancelled quotes; and that exchanges benefit “low latency ultra HFT strategies.” NBIM trader Oyvind G. Schanke told The New York Times: “It has become much more a market trading for trading’s sake.”

High-Frequency Trading: A Reading List

I first heard of high-frequency trading (HFT) via Charles Duhigg’s New York Times article in July 2009.

 

Over the past few years I have investigated facets of HFT. Below is an introductory reading list to HFT and the related area of algorithmic trading, which has recently ‘crossed the chasm’ from institutional to retail investors. It covers an historical overview; some relevant theory; and the use of computer algorithms and machine learning. Large-scale HFT firms spend millions on their computing and technological infrastructure.

 

This introductory reading list hopefully shows how you can use research skills to understand a media debate or knowledge domain in greater detail.

 

This work connects with the Chaos Rules school of thought that I helped write for the Smart Internet 2010 Report (2005). One academic told my boss that Chaos Rules thinking “did not exist.” Two decades of research into HFT, and into related areas like Bayesian econometrics and market microstructure, demonstrate otherwise.

HFT Introductions

Dark Pools: The Rise of AI Trading Machines and the Looming Threat to Wall Street by Scott Patterson (Cornerstone Digital, 2012). (TS-3). A history of algorithmic and high-frequency trading on Wall Street, and the emergence of dark pools.

Inside The Black Box: A Simple Guide to Quantitative and High Frequency Trading (2nd ed.) by Rishi K. Narang (John Wiley & Sons, 2012). (TS-3). An introduction to quantitative trading models and coverage of the media debate about high-frequency trading. For a counter-view see Haim Bodek’s The Problem of HFT: Collected Writings on High Frequency Trading and Stock Market Structure Reform (CreateSpace, 2013) (TS-3); Bodek was a source for Patterson’s Dark Pools.

HFT Theory: Bayesian Econometrics, High-Frequency Data, and Machine Learning

Empirical Market Microstructure: The Institutions, Economics, and Econometrics of Securities Trading by Joel Hasbrouck (New York: Oxford University Press, 2007). (TS-4). Hasbrouck explains the empirical approaches to market microstructure that underpin high-frequency trading.

Market Liquidity: Theory, Evidence and Policy by Thierry Foucault, Marco Pagano, and Ailsa Roell (New York: Oxford University Press, 2013). (TS-4). The current debates on how high-frequency trading has affected liquidity and price discovery in markets, and the growth of market microstructure frameworks.

Bayesian Reasoning and Machine Learning by David Barber (New York: Cambridge University Press, 2012). (TS-4). An introduction to Bayesian probability and data analysis using filters and machine learning. For an introduction to machine learning see Peter Flach’s Machine Learning: The Art and Science of Algorithms That Make Sense of Data (New York: Cambridge University Press, 2012) (TS-4).

Econometrics of High-Frequency Data by Nikolaus Hautsch (New York: Springer, 2011). (TS-4). An advanced overview of high-frequency data and relevant econometric models for liquidity, volatility, and market microstructure analysis.

Handbook of Modeling High-Frequency Data in Finance by Frederi G. Viens, Maria C. Mariani, and Ionut Florescu (New York: John Wiley & Sons, 2011). (TS-4). An advanced reference on how to model high-frequency data.

HFT Algorithmic Trading

The Science of Algorithmic Trading and Portfolio Management by Robert Kissell (Academic Press, 2013). (TS-4). An advanced introduction to how algorithmic trading influences market microstructure, and is used for the transaction and execution systems of high-frequency trading. For an earlier introduction see Barry Johnson’s Algorithmic Trading & DMA: An Introduction to Direct Access Trading Strategies (4Myeloma Press, 2010) (TS-4).

Professional Automated Trading: Theory and Practice by Eugene A. Durenard (New York: John Wiley & Sons, 2013). (TS-4). Insights from mathematics and computer science on how to develop, test, and automate algorithmic trading strategies using agent-based learning.

Statistically Sound Machine Learning for Algorithmic Trading of Financial Instruments: Developing Predictive-Model-Based Trading Systems Using TSSB by David Aronson and Timothy Masters (CreateSpace, 2013) (TS-4). The authors developed the TSSB software program that uses machine learning to implement algorithmic trading strategies.

15th June 2013: HFT, Disruptive Innovation & Theta Arbitrage

23rd July 2009 was perhaps the day that retail investors became aware of high-frequency trading (HFT).

 

That was the day that New York Times journalist Charles Duhigg published an article on HFT and market microstructure changes. Duhigg’s article sparked a public controversy about HFT and changes to United States financial markets.

 

Then on 6th May 2010 came the Flash Crash. HFT was again the villain.

 

For the past few years HFT has inspired both pro and con books from publishers. HFT has changed how some retail investors and portfolio managers at mutual and pension funds view financial markets. Now, Matthew Philips of Bloomberg Businessweek reports that 2009-10 may have been HFT’s high point as a profitable strategy.

 

Philips’ findings illustrate several often overlooked aspects of Clayton Christensen’s Disruptive Innovation Theory. Scott Patterson notes in his book Dark Pools (New York: Crown Business, 2012) that HFT arose from a combination of entrepreneurial innovation; technological advances in computer processing power; and changes to US Securities and Exchange Commission regulations. Combined, these advances enabled HFT firms to trade differently from other dotcom-era and post-dotcom firms that still used human traders or mechanical trading systems. This trading arbitrage fits Christensen’s Disruptive Innovation Theory as a deductive, explanatory framework.

 

The usually overlooked aspect of Disruptive Innovation Theory is that this entrepreneurial investment and experimentation gave HFT firms a time advantage: theta arbitrage. HFT firms were able to engage for about a decade in predatory trading against mutual and pension funds. HFT also disrupted momentum traders, trend-followers, scalping day traders, statistical arbitrage, and some volatility trading strategies. This disruption of trading strategies led Brian R. Brown to focus on algorithmic and quantitative black boxes in his book Chasing The Same Signals (Hoboken, NJ: John Wiley & Sons, 2010).

 

Paradoxically, by the time Duhigg wrote his New York Times article, HFT had begun to lose its profitability as a trading strategy. Sociologist of finance Donald MacKenzie noted that HFT required significant capex and opex investment for low latency, and that this entry barrier, combined with increased competition, fueled ‘winner-takes-all’ and ‘race to the bottom’ dynamics. HFT’s ‘early adopters’ got the theta arbitrage that the late-comers did not have, in a more visible and now hypercompetitive market. Duhigg’s New York Times article and the May 2010 Flash Crash also sparked an SEC regulatory debate:

 

  • On the pro side were The Wall Street Journal’s Scott Patterson; author Rishi K. Narang (Inside The Black Box); and industry exponent Edgar Perez (The Speed Traders).
  • On the con side were Haim Bodek of Decimus Capital Markets (The Problem of HFT), and Sal L. Arnuk and Joseph C. Saluzzi of Themis Trading (Broken Markets), which specialises in equities investment for mutual and pension fund clients.
  • The winner from the 2009-12 debate about HFT regulation appears to be Tradeworx’s Manoj Narang, who was pro-HFT yet also licensed his firm’s systems to the SEC for market surveillance, as a regulatory arbitrage move. The SEC now uses Tradeworx’s systems as part of the Market Information Data Analytics System (MIDAS), Philips reports.

 

Philips reports that HFT firms now have new targets: CTAs, momentum traders, swing traders, and news sentiment analytics. That might explain some recent changes I have seen whilst trading the Australian equities market. Christensen’s Disruptive Innovation Theory and theta arbitrage both suggest that a trading strategy will be profitable only for a time, until changes in market microstructure, technology platforms, and transaction and execution costs erode its edge.
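A toy calculation, using assumed numbers rather than anything from Philips’ reporting, illustrates that profitability window: a strategy’s gross edge decays as competitors crowd in, while infrastructure and transaction costs grow, and at some point the net edge turns negative.

```python
# Toy illustration (assumed numbers, not from Philips or Christensen): a strategy's
# net edge per trade shrinks as competition compresses the gross edge and costs rise.
def net_edge(gross_edge_bps, competition_decay, cost_bps, cost_growth, year):
    """Net edge in basis points after `year` years, under assumed decay and growth rates."""
    gross = gross_edge_bps * (1 - competition_decay) ** year
    costs = cost_bps * (1 + cost_growth) ** year
    return gross - costs

# Hypothetical inputs: a 10 bps gross edge decaying 25% a year, against 2 bps of
# costs growing 10% a year. On these assumptions the strategy stops paying in year 5.
for year in range(8):
    print(year, round(net_edge(10.0, 0.25, 2.0, 0.10, year), 2))
```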

6th August 2012: HFT Amok

Knight Capital‘s (KCG) new high-frequency trading (HFT) algorithms wreaked havoc on Wall Street last Wednesday.

 

The market-maker has had to rely on emergency funding to stay operational. A ‘runaway algo‘ had run amok for thirty minutes — and almost ended KCG’s existence. So much for automated trading strategies that enabled customer collaboration.

 

Wired has ‘rushed to publish’ a Jerry Adler profile of HFT firms and trends. Adler interviews the usual suspects including Aaron C. Brown, Tradeworx, Themis Trading, and RavenPack.

16th July 2012: High-Frequency Futures

Global financial markets increasingly rely on computers and algorithms.

 

Between 2:42pm and 3:07pm on 6th May 2010, the Dow Jones Industrial Average plunged 998.5 points before recovering. The ‘Flash Crash’ raised United States regulators’ awareness of how computers and algorithms can affect trading. The US Securities and Exchange Commission blamed high-frequency trading (HFT): millisecond, computer-driven arbitrage used primarily by quantitative hedge funds. The trial of ex-Goldman Sachs programmer Sergey Aleynikov also drew media attention to HFT. Today, supercomputers dominate financial market exchanges.

 

Computers and algorithms have a Wall Street prehistory. Michael Goodkin co-founded the Arbitrage Management Company in 1968 to pioneer computer and statistical arbitrage strategies. Goodkin recruited the economics and corporate finance experts Harry Markowitz, Myron Scholes and Paul Samuelson as his academic brains trust. Bill Fouse used a Prime mini-computer to develop quantitative tactical asset allocation and index funds. In 1981, Michael Bloomberg founded the company that would sell his now-ubiquitous Bloomberg terminals to Wall Street. Bloomberg LP now has Thomson Reuters and Australia’s IRESS as market data competitors. Artificial intelligence, genetic algorithms, machine learning and neural networks each had speculative bubbles as Wall Street experimented with them and marketed black box systems as client solutions.

 

This experimentation spawned a new generation of academic entrepreneurs.

 

Finance academics like Fischer Black and Emanuel Derman moved to Goldman Sachs and enjoyed the market-driven environment. In the early 1980s, Wall Street hired physicists and created new sub-fields of knowledge: econophysics, computational finance, and financial engineering. In 1991, Doyne Farmer, Norman Packard and Jim McGill founded The Prediction Company (acquired in 2005 by UBS) to use complex adaptive systems theory to model financial markets. In 1996, Michael Goodkin co-founded Numerix to use Monte Carlo simulations to test trading strategies.

 

Collectively, their work identified new market anomalies and complex dynamics to trade. Their research anticipated a new generation of software. Today, US university programs in financial engineering use software platforms like Alphacet, Deltix and Streambase to develop algorithms for HFT and complex event processing (CEP) systems. Yet these innovations remain unavailable in many Australian university programs, with the exception of the Capital Markets CRC.

 

Two former academics offer one compelling vision of how computers and algorithms will reshape Wall Street in the next century. Stony Brook University mathematician Jim Simons formed the quantitative fund Renaissance Technologies, which now employs ex-IBM speech recognition scientists. Stanford-trained supercomputer designer David Shaw founded D.E. Shaw & Company, which employed Jeff Bezos before he founded Amazon.com. Shaw rejects technical analysis (the pattern recognition of price and volume) in favour of Karl Popper’s philosophy of falsifiability and event-based studies.

 

Shaw and Simons’ funds use terabytes of data daily: a forerunner of the current interest in Big Data research. As academic entrepreneurs, they gave up journal publications and government competitive grants. Instead, they used market arbitrage, economies of scope, highly incentivised staff, private scientific knowledge, and walled gardens to protect their funds’ intellectual property.

 

Shaw and Simons have already lived a decade in a different future than most investors and traders.

 

HFT and CEP systems are already changing how Australia’s financial markets operate. The Australian Securities Exchange (ASX) and the new Chi-X Exchange both have HFT capabilities, including direct market access: the exchanges now host the ‘co-located’ low-latency computing systems of market-makers, proprietary trading firms and hedge funds. Algorithmic and high-frequency trading now accounts for higher trading volumes and greater volatility in company share prices. HFT is also blamed for greater inter-market correlation, such as between the ASX and the Shanghai Composite Index. These trends echo the volatility of commodities and futures markets in the 1970s. More subtly, HFT and CEP systems create knowledge decay, in which new knowledge and faster cycle times make existing knowledge and investment strategies obsolete. Such innovations are unlikely to diffuse any time soon to retail investors.
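As a rough illustration of the inter-market correlation claim above, the sketch below computes a rolling correlation of daily index returns. The column names (ASX200, SSE) and the CSV file are hypothetical placeholders, not a reference to any particular data vendor or dataset.

```python
# Sketch (hypothetical data): inter-market correlation measured as a rolling
# correlation of daily returns between two index series.
import pandas as pd

def rolling_index_correlation(prices: pd.DataFrame, window: int = 60) -> pd.Series:
    """prices: DataFrame of daily closing levels with 'ASX200' and 'SSE' columns."""
    returns = prices[["ASX200", "SSE"]].pct_change().dropna()
    return returns["ASX200"].rolling(window).corr(returns["SSE"])

# Example usage with a hypothetical CSV of daily closes, indexed by date:
# prices = pd.read_csv("index_closes.csv", index_col=0, parse_dates=True)
# rolling_index_correlation(prices).plot(title="60-day rolling correlation")
```

A sustained rise in that rolling correlation is the kind of pattern HFT critics point to, although correlation alone does not establish that HFT is the cause.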

 

The 2007-09 global financial crisis has prompted a backlash against Wall Street computers and algorithms. This backlash is similar to the fall of Master of the Universe traders after the 1980s merger wave and the demise of technology firms after the 1995-2000 dotcom bubble. University of Edinburgh’s Donald MacKenzie exemplifies the new academic research programs that are emerging: how sociology, and science and technology studies, might contribute to our understanding of financial markets. Barnard College’s president Debora L. Spar and Columbia Law School’s Timothy Wu caution that regulatory actions can dramatically affect future industry trajectories. A financial world without computers would be a return to mid-1960s trading: back-office processing, brokerage, clearing and settlement delays, and lower trading volumes.

 

In the face of HFT technology, Wall Street traders emphasise craft. Arbitrage opportunities, psychology, and risk/money management are still vital for trading success, they contend. HFT has changed Wall Street in a way closer to Margin Call than to Boiler Room or Wall Street. Interactive Brokers has a more direct future in mind. After the Occupy Wall Street protests in New York’s Zuccotti Park, it ran a new television advertising campaign: “Join the 1%.”

Don Tapscott’s Transformation Agenda for Risk Management in Financial Institutions

Paul Roberts pointed me to this Don Tapscott video about how wiki-type collaborative knowledge might transform risk management in financial institutions. Tapscott draws on his coauthored book Wikinomics (2008) to pose the following points:

(1). Financial institutions need to share their intellectual property (IP) about risk management in a commons-based model similar to the Human Genome Project or Linux.

(2). The key IP is the algorithms and rating systems for risk.

(3). In response to an objection that the key IP should remain proprietary, Tapscott points to the failure of algorithms and rating systems to prevent the systemic risk of the global financial crisis.

(4). Tapscott appeals to financial institutions to act as peers — “a rising tide lifts all boats” — and that through sharing this information, they can compete more ethically in new markets, reinvent their industry, transform the practices in risk management, and act with a “new modus operandi.”

Tapscott is a persuasive business strategist who manages, above, to integrate his advocacy of “wikinomics” with the current debate on financial institutions and with his earlier, mid-1990s work on how technology would transform business. He echoes Umair Haque’s call for a Finance 2.0 based on transparency and social innovation in financial markets.

Here are thoughts, some ‘contrarian’, on each of Tapscott’s points.

(1). Read Burton Malkiel or the late Peter L. Bernstein and you will see that finance is driven to innovate new instruments, methodologies and institutions to hedge or arbitrage risk. Some of these are commons-based, such as the actuarial development of insurance. Some innovations are now blamed for the problem, such as RiskMetrics’ Value at Risk methodology. The Basel II Accord, which attempts to provide an international regulatory framework, raises an interesting question: Under what conditions can a commons-based approach be successfully implemented in an institutional form and practices? Off-balance sheet items and special investment vehicles are two potential barriers to this goal. As for Tapscott’s examples, their success is due to a combination of public and private approaches, such as the parallel research by the National Institutes of Health’s Human Genome Project and Craig Venter’s Celera Corporation. This public-private dynamic tends to be left out of advocacy for a commons-based solution.

(2). Tapscott and Haque are correct to identify these as points of leverage. Some of the algorithms and rating systems are public information, such as Google Finance and Morningstar metrics, and trader algorithms on public sites. There are however several potential barriers to Tapscott and Haque’s commons-based view. Investors will have different risk appetites and decision/judgment frames despite access to the same public information. Philip Augar discloses in The Greed Merchants (Portfolio, New York, 2005) that proprietary algorithms rarely remain private knowledge within institutions unless the knowledge is kept tacit or, in the case of ex-Goldman Sachs programmer Sergey Aleynikov, protected through lawsuits. Aleynikov’s expertise in high-frequency trading, which uses complex algorithms and co-located computer systems, highlights other barriers: access to technology, information arbitrage, learning curves, and market expertise. As Victor Niederhoffer once observed, this advantage renders large parts of the financial advice or investor seminar industry obsolete, or as noise and propaganda at best. Finally, although public information may help investors, it may never completely replace risk arbitrage based on private information or market insight.

(3). Tapscott’s observation about the global financial crisis echoes Satyajit Das, Nassim Nicholas Taleb, Nouriel Roubini and others on the inability of institutions to deal with the systemic crises which the complex instruments and methodologies created. Some hedge fund managers however have been very successful, despite the crisis. Others, Gillian Tett notes in her book Fool’s Gold (The Free Press, New York, 2009), helped create the financial instruments which led to the crisis, yet largely avoided its effects. So, a more interesting question might be: How did such managers avoid or limit the effects of the systemic crisis, and what decisions did they make?

(4). This is Tapscott as inspirational advocate for change. He echoes Haque on momentum and long-based strategies for investors. He also channels Adam Brandenburger and Barry Nalebuff’s game theoretic model of cooperating to create new markets and then competing for value. This is unlikely to happen in competitive financial institutions. A project to develop a commons-based approach to financial risk management may however interest a professional organisation such as the CFA Institute (US), Global Association of Risk Professionals (US) or the Financial Services Institute of Australasia. Will Tapscott lead an initiative to develop this?