Speculative Bubbles in DIY Trading Algorithms

WSJ’s Austen Hufford profiles a group of retail investors who use DIY trading algorithms to try to extract alpha. Some comments:

 

1. Hufford’s interviewees sound very similar to dotcom-era day traders at the peak of the 1995-2000 speculative bubble, particularly in their positive expectancy of financial profits.

 

2. The choice of asset class (forex) and markets (S&P 500 and Nasdaq Composite) will likely mean that Hufford’s retail traders are picked off by high-frequency traders.

 

3. Hufford’s article has some typical anecdotes about how traders lose money early in the trade development process, and how coding errors can lead to unprofitable trades. On the upside, the group of traders now has a daily, actionable routine for dealing with financial markets.

 

4. Aspects emphasised by institutional traders — pre-trade analysis, market microstructure, transaction cost economics, and tax implications — are not considered by Hufford’s retail traders.

 

5. Hufford mentions Interactive Brokers, whose own history of algorithmic trading is featured in Scott Patterson’s book Dark Pools. I took part in Tucker Balch’s course and compared it with other published research, such as Andrew W. Lo’s studies. I’ve also looked at Quantopian and Rizm.

 

6. What a $200,000 account size means for equities / forex trading also depends on other factors, such as leverage, position size, and regularity of trading.

 

7. Computer-driven hedge funds use far more sophisticated programming than Hufford’s retail traders.

 

8. Research on Twitter data in academic journals uses sentiment analysis as a potential trading signal.

 

9. Hufford’s trading strategy example uses a moving average indicator from technical analysis. This is a basic strategy for retail traders that is now gamed by high-frequency trading algorithms (a minimal code sketch of this kind of strategy appears after this list).

 

10. Agile software development practices offer insights into how to refactor code.
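
The moving average strategy in point 9 is simple enough to sketch in code. Below is a minimal, hypothetical Python illustration of a dual moving-average crossover signal; the window lengths, the placeholder price series, and the function name are my own assumptions for illustration, not the strategy from Hufford’s article.

```python
# Minimal sketch: dual simple-moving-average (SMA) crossover signal.
# The 20/50-bar windows and the placeholder price series are illustrative assumptions.
import pandas as pd

def sma_crossover_signals(prices: pd.Series, fast: int = 20, slow: int = 50) -> pd.Series:
    """Return 1 (long) when the fast SMA is above the slow SMA, else 0 (flat)."""
    fast_sma = prices.rolling(window=fast).mean()
    slow_sma = prices.rolling(window=slow).mean()
    signal = (fast_sma > slow_sma).astype(int)
    # Act on the signal at the next bar to avoid look-ahead bias.
    return signal.shift(1).fillna(0)

# Example usage with synthetic placeholder prices:
prices = pd.Series(range(1, 201), dtype=float)
print(sma_crossover_signals(prices).tail())
```

Rules this simple and widely known are exactly the kind of signal that faster, better-resourced players can anticipate, which is the point made in item 9.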

 

For a comparison of methodology see Alvaro Cartea, Sebastian Jaimungal and Jose Penalva’s Algorithmic and High-Frequency Trading (New York: Cambridge University Press, 2015).

Thematic Analysis of a Reading List on Investment Alpha

I recently did a thematic analysis of a reading list on investment alpha, a concept that involves:

 

1. Excess return.

2. Active management.

3. Adjusted risk.

 

The following themes emerged from the reading list, and from checking the rankings of several hundred books on Amazon.com:

 

1. Excess return: fund type (hedge fund, private equity, venture capital); return drivers (including asset class); and quantitative models.

 

2. Active management: discretionary (human trading, portfolio composition and rebalancing, options, technical analysis) and algorithmic (algorithmic trading; complex event / stream processing; computational intelligence; genetic algorithms; machine learning; neural nets; and software agents).

 

3. Adjusted risk: Bayesian probabilities; investor psychology; market microstructure; and risk management models (such as Monte Carlo simulation, Value at Risk, and systematic risk).
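
As a concrete illustration of the risk management models grouped under adjusted risk, here is a minimal, hypothetical Monte Carlo Value at Risk sketch in Python; the normal-returns model, the portfolio value, and all parameters are my own illustrative assumptions, not figures from the reading list.

```python
# Minimal sketch: Monte Carlo Value at Risk (VaR) under an assumed normal-returns model.
# All parameters (mu, sigma, portfolio value, horizon) are illustrative assumptions.
import numpy as np

def monte_carlo_var(portfolio_value: float, mu: float, sigma: float,
                    horizon_days: int = 1, confidence: float = 0.99,
                    n_sims: int = 100_000, seed: int = 42) -> float:
    """Estimate the loss that is not exceeded with the given confidence over the horizon."""
    rng = np.random.default_rng(seed)
    # Simulate horizon returns from a normal distribution (a simplifying assumption).
    returns = rng.normal(mu * horizon_days, sigma * np.sqrt(horizon_days), n_sims)
    pnl = portfolio_value * returns
    # VaR is the (1 - confidence) quantile of simulated P&L, reported as a positive loss.
    return -np.percentile(pnl, (1 - confidence) * 100)

print(monte_carlo_var(portfolio_value=1_000_000, mu=0.0002, sigma=0.01))
```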

 

This core work suggests the following query line:

 

SELECT return drivers (Bayesian belief network) (multi-asset) (portfolio) (fund)

 

 

WHERE risk (Bayesian probability) (exposures) (exposures – investor decisions) (exposures – market microstructure) AND trade (algorithms)

 

ORDER BY Bayesian (belief network, probability); return drivers (multi-asset); risk (exposures); and trading (algorithms).

 

This thematic analysis will help focus my post-PhD research in the sociology of finance on the following initial research questions:

 

1. What is the spectrum of possible return drivers in a multi-asset world?

 

A good model for this is David Swensen’s Yale endowment portfolio detailed in Pioneering Portfolio Management: An Unconventional Approach to Institutional Investment (New York: The Free Press, 2009). Antti Ilmanen’s magisterial Expected Returns: An Investor’s Guide to Harvesting Market Rewards (Hoboken, NJ: John Wiley & Sons, 2011) has information on the return drivers of specific asset classes. Matthew Hudson’s recent Funds: Private Equity, Hedge Funds, and All Core Structures (Hoboken, NJ: John Wiley & Sons, 2014) deals with global fund structures.

 

2. What specific risk exposures might these multi-assets face, and under what conditions?

 

Richard C. Grinold and Ronald Kahn’s Active Portfolio Management: A Quantitative Approach for Producing Superior Returns and Controlling Risk (New York: McGraw-Hill, 1999) is the classic book on institutional portfolio models. Morton Glantz and Robert Kissell’s Multi-Asset Risk Modeling: Techniques for a Global Economy in an Electronic and Algorithmic Trading Era (San Diego, CA: Academic Press, 2014) is a recent book I will look at. Charles-Albert Lehalle and Sophie Laruelle’s Market Microstructure in Practice (Singapore: World Scientific Publishing Company, 2014), and Thierry Foucault, Marco Pagano, and Ailsa Roell’s Market Liquidity: Theory, Evidence, and Policy (New York: Oxford University Press, 2013) deal respectively with the practice and theory of contemporary financial markets. There are many books on behavioural finance and investor psychology: two recent ones are H. Kent Baker and Victor Ricciardi’s collection Investor Behavior: The Psychology of Financial Planning and Investing (Hoboken, NJ: John Wiley & Sons, 2014), and Tim Richards’ Investing Psychology: The Effects of Behavioral Finance on Investment Choice and Bias (Hoboken, NJ: John Wiley & Sons, 2014).

 

3. How can algorithmic trading and computational techniques model the risk-return dynamics of alpha generation?

 

Despite its flaws Rishi K. Narang’s Inside the Black Box: A Simple Guide to Quantitative and High Frequency Trading (New York: John Wiley & Sons, 2013) opened my eyes to the structures needed for alpha generation. The Bayesian approach is detailed in David Barber’s Bayesian Reasoning and Machine Learning (New York: Cambridge University Press, 2012). Barry Johnson’s Algorithmic Trading and DMA: An Introduction to Direct Access Trading Strategies (London: 4Myeloma Press, 2010) and Robert Kissell’s The Science of Algorithmic Trading and Portfolio Management (San Diego, CA: Academic Press, 2013) deal with order types in algorithmic trading. Christian Dunis, Spiros Likothanassis, Andreas Karathanasopoulos, Georgios Sermpinis, and Konstantinos Theofilatos have edited a recent collection on Computational Intelligence Techniques for Trading and Investment (New York: Routledge, 2014). Eugene A. Durenard’s Professional Automated Trading: Theory and Practice (New York: John Wiley & Sons, 2013) covers software agents. For retail trader-oriented applications of data mining, machine learning, and Monte Carlo simulations there is Kevin Davey’s Building Algorithmic Trading Systems: A Trader’s Journey from Data Mining to Monte Carlo Simulation to Live Trading (New York: John Wiley & Sons, 2014), and David Aronson and Timothy Masters’ Statistically Sound Machine Learning for Algorithmic Trading of Financial Instruments: Developing Predictive-Model-Based Trading Systems Using TSSB (CreateSpace, 2013).

 

What this means is that, for an investment of about $US1,000, a new researcher can acquire some of the core books on institutional, quantitative portfolio and risk management; behavioural finance and market microstructure as potential sources of edges; and some recent practitioner-oriented literature on algorithmic / automated trading that uses computational intelligence.

 

In deference to Mao and McKenzie Wark’s vectoralist class:

 

Let a thousand algorithmic / quantitative micro-funds bloom.

High-Frequency Trading: A Reading List

I first heard of high-frequency trading (HFT) via Charles Duhigg’s New York Times article in July 2009.

 

Over the past few years I have investigated facets of HFT. Below is an introductory reading list on HFT and the related area of algorithmic trading, which has recently ‘crossed the chasm’ from institutional to retail investors. It covers an historical overview; some relevant theory; and the use of computer algorithms and machine learning. Large-scale HFT firms spend millions on their computing and technological infrastructure.

 

The introductory reading list hopefully shows how you can use research skills to understand a media debate or knowledge domain in greater detail.

 

This work connects with the Chaos Rules school of thought that I helped write for the Smart Internet 2010 Report (2005). One academic told my boss that Chaos Rules thinking “did not exist.” Two decades of research into HFT, and into related areas like Bayesian econometrics and market microstructure, show otherwise.

HFT Introductions

Dark Pools: The Rise of AI Trading Machines and the Looming Threat to Wall Street by Scott Patterson (Cornerstone Digital, 2012). (TS-3). A history of algorithmic and high-frequency trading on Wall Street, and the emergence of dark pools.

Inside The Black Box: A Simple Guide to Quantitative and High Frequency Trading (2nd ed.) by Rishi K. Narang (John Wiley & Sons, 2012). (TS-3). An introduction to quantitative trading models and coverage of the media debate about high-frequency trading. For a counter-view see Haim Bodek’s The Problem of HFT: Collected Writings on High Frequency Trading and Stock Market Structure Reform (CreateSpace, 2013) (TS-3), who was a source for Patterson’s Dark Pools.

HFT Theory: Bayesian Econometrics, High-Frequency Data, and Machine Learning

Empirical Market Microstructure: The Institutions, Economics, and Econometrics of Securities Trading by Joel Hasbrouck (New York: Oxford University Press, 2007). (TS-4). Hasbrouck explains the empirical approaches to market microstructure that underpin high-frequency trading.

Market Liquidity: Theory, Evidence and Policy by Thierry Foucault, Marco Pagano, and Ailsa Roell (New York: Oxford University Press, 2013). (TS-4). The current debates on how high-frequency trading has affected liquidity and price discovery in markets, and the growth of market microstructure frameworks.

Bayesian Reasoning and Machine Learning by David Barber (New York: Cambridge University Press, 2012). (TS-4). An introduction to Bayesian probability and data analysis using filters and machine learning. For an introduction to machine learning see Peter Flach’s Machine Learning: The Art and Science of Algorithms That Make Sense of Data (New York: Cambridge University Press, 2012) (TS-4).

Econometrics of High-Frequency Data by Nikolaus Hautsch (New York: Springer, 2011). (TS-4). An advanced overview of high-frequency data and relevant econometric models for liquidity, volatility, and market microstructure analysis.

Handbook of Modeling High-Frequency Data in Finance by Frederi G. Viens, Maria C. Mariani, and Ionut Florescu (New York: John Wiley & Sons, 2011). (TS-4). An advanced reference on how to model high-frequency data.

HFT Algorithmic Trading

The Science of Algorithmic Trading and Portfolio Management by Robert Kissell (Academic Press, 2013). (TS-4). An advanced introduction to how algorithmic trading influences market microstructure, and is used for the transaction and execution systems of high-frequency trading. For an earlier introduction see Barry Johnson’s Algorithmic Trading & DMA: An Introduction to Direct Access Trading Strategies (4Myeloma Press, 2010) (TS-4).

Professional Automated Trading: Theory and Practice by Eugene A. Durenard (New York: John Wiley & Sons, 2013). (TS-4). Insights from mathematics and computer science about how to develop, test, and automate algorithmic trading strategies using agent-based learning.

Statistically Sound Machine Learning for Algorithmic Trading of Financial Instruments: Developing Predictive-Model-Based Trading Systems Using TSSB by David Aronson and Timothy Masters (CreateSpace, 2013) (TS-4). The authors developed the TSSB software program that uses machine learning to implement algorithmic trading strategies.

18th June 2013: Algorithmic Trading Goes Retail

Fortune Magazine reports that EquaMetrics is now selling a cloud-based app that creates Technical Analysis-based algorithmic trading strategies for retail trading subscribers:

 

EquaMetrics’ app is simply designed and since its software firepower comes from the cloud, it doesn’t require anything more than the typical PC. You can drag and drop colored tiles to assemble your own algorithm. Day traders can choose between 30 variables to build their formulas. The options are built on so-called technical indicators, metrics that reflect trading patterns as opposed to stock fundamentals such as the price-earnings ratio. After you’re done, you run the program to buy and sell stocks and currencies.

 

The web application is relatively inexpensive: it costs $99 a month or $250 a month, depending on how many algorithms you want to run. That’s a steal compared to the alternative of hiring a quantitative programmer for $200,000 a year. EquaMetrics gives you the stuff a programmer could produce. Then it’s up to you to assemble your own strategy.
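
To make concrete what assembling indicator ‘tiles’ into an algorithm amounts to, here is a minimal, hypothetical Python sketch of a declarative rule evaluated over pre-computed technical indicator columns; the indicator names, operators, thresholds, and values are my own illustrative assumptions, not EquaMetrics’ actual product logic.

```python
# Minimal sketch: a "tile"-style declarative rule evaluated over indicator columns.
# Indicator names, operators, thresholds, and data are illustrative assumptions only,
# not EquaMetrics' actual product logic.
import operator
import pandas as pd

OPS = {">": operator.gt, "<": operator.lt}

def evaluate_rule(indicators: pd.DataFrame, rule) -> pd.Series:
    """AND together a list of {indicator, op, value} conditions into a 0/1 signal."""
    signal = pd.Series(True, index=indicators.index)
    for cond in rule:
        signal &= OPS[cond["op"]](indicators[cond["indicator"]], cond["value"])
    return signal.astype(int)

# Hypothetical pre-computed indicator columns (placeholder values).
indicators = pd.DataFrame({
    "rsi_14": [35, 55, 72, 60],
    "sma_20_minus_sma_50": [-1.2, 0.4, 1.1, 0.8],
})

# Two "tiles": momentum turning positive, but RSI not yet overbought.
rule = [
    {"indicator": "sma_20_minus_sma_50", "op": ">", "value": 0},
    {"indicator": "rsi_14", "op": "<", "value": 70},
]
print(evaluate_rule(indicators, rule))  # -> 0, 1, 0, 1
```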

 

I have been expecting apps like this for several months, and have been monitoring other initiatives like the Quantopian community. The popular literature on algorithmic trading strategies evolved from Technical Analysis mechanical systems (Tushar S. Chande’s Beyond Technical Analysis) to back-testing (Robert Pardo’s The Evaluation and Optimization of Trading Strategies) and then to algo trading using Matlab software (Ernie Chan’s Quantitative Trading and his new Algorithmic Trading; and Barry Johnson’s Algorithmic Trading & DMA). This period spans the post-dotcom collapse; the 2003-08 speculative bubble in real estate and asset-backed securitisation; and institutional experimentation with high-frequency trading platforms, and transaction and execution costs.

 

EquaMetrics’ strategy reflects this decade-long evolution:

  • Its initial offering is Technical Analysis strategies, at a time when (a) high-frequency trading has ‘broken’ many trend-following and momentum indicators; and (b) hedge funds and proprietary trading desks use predatory trading to clean out TA-oriented retail traders.
  • The model is subscription-based software as a service — which could eventually disrupt or change the economics of agile software programming if this offering scales up in a significant way. Will the $US99-250 per month price point remain? Or will another platform develop a lower-priced offering and trigger a ‘race to the bottom’ competitive dynamic?
  • It opens the way for the licensing of specific TA indicators and proprietary methods as ancillary revenue streams, and as a way to build a market around the core product offering (which NinjaTrader, MetaStation, and ESignal have all done with their respective platforms).
  • The quality and scope of the back-tested data is important: quantitative hedge funds like Jim Simons’ Renaissance and David Shaw’s D.E. Shaw & Co each clean their own data.
  • EquaMetrics’ move into fundamental indicators reflects some recently published work on the quantitative analysis of these strategies (notably, Richard Tortoriello’s Quantitative Strategies for Achieving Alpha, and Wesley Gray and Tobias Carlisle’s Quantitative Value).
  • EquaMetrics’ choice of FXCM and Interactive Brokers as prime brokers to process client trades is significant: brokerage transaction and execution costs can mean a potential new trading strategy is actually unprofitable to execute, or that its profit-taking ability declines over time, especially in correlated and ‘crowded trade’ markets (a minimal illustration of this cost drag appears in the sketch after this list).
  • The focus on TA and fundamental indicators does not address some of the quantitative, statistical or machine learning strategies that quantitative hedge funds use to develop algorithms; how correlation testing of model variables might occur; and what might happen to retail investors once several different competing firms have back-tested and issued dueling algorithms (a factor in high-frequency markets where scalping and order front-running occurs).
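
As flagged in the bullet point on brokerage costs above, here is a minimal, hypothetical Python sketch of how per-trade transaction costs erode a strategy’s gross return; the signal rule, the simulated return series, and the cost level are my own illustrative assumptions.

```python
# Minimal sketch: how per-trade transaction costs erode a strategy's gross return.
# The signal rule, simulated returns, and cost level are illustrative assumptions.
import numpy as np
import pandas as pd

def strategy_returns(signals: pd.Series, asset_returns: pd.Series,
                     cost_per_trade: float) -> pd.Series:
    """Position times asset return, minus a cost each time the position changes."""
    positions = signals.shift(1).fillna(0)      # act on signals at the next bar
    trades = positions.diff().abs().fillna(0)   # 1 whenever the position flips
    return positions * asset_returns - trades * cost_per_trade

rng = np.random.default_rng(0)
asset_returns = pd.Series(rng.normal(0.0003, 0.01, 250))     # placeholder daily returns
signals = (asset_returns.rolling(5).mean() > 0).astype(int)  # placeholder signal rule

gross = (1 + strategy_returns(signals, asset_returns, 0.0)).prod() - 1
net = (1 + strategy_returns(signals, asset_returns, 0.002)).prod() - 1
print(f"gross: {gross:.2%}   net of 20bp per trade: {net:.2%}")
```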

 

Still, the EquaMetrics offering has me interested: I’ve been waiting for algorithmic trading to ‘value migrate’ (Adrian Slywotzky) to retail traders for a while. It’s a first step towards post-human trading (Charles Stross’s novel Accelerando).

15th June 2013: HFT, Disruptive Innovation & Theta Arbitrage

23rd July 2009 was perhaps the day that retail investors became aware of high-frequency trading (HFT).

 

That was the day that New York Times journalist Charles Duhigg published an article on HFT and market microstructure changes. Duhigg’s article sparked a public controversy about HFT and changes to United States financial markets.

 

Then on 6th May 2010 came the Flash Crash. HFT was again the villain.

 

For the past few years HFT has inspired both pro and con books from publishers. HFT has changed how some retail investors and portfolio managers at mutual and pension funds view financial markets. Now, Matthew Philips of Bloomberg Businessweek reports that 2009-10 may have been HFT’s high-point in terms of being a profitable strategy.

 

Philips’ findings illustrate several often overlooked aspects of Clayton Christensen’s Disruptive Innovation Theory. Scott Patterson notes in his book Dark Pools (New York: Crown Business, 2012) that HFT arose due to a combination of entrepreneurial innovation; technological advances in computer processing power; and changes to US Securities and Exchange Commission regulations. Combined, these advances enabled HFT firms to trade differently to other dotcom era and post-dotcom firms that still used human traders or mechanical trading systems. This trading arbitrage fits Christensen’s Disruptive Innovation Theory as a deductive, explanatory framework.

 

The usually overlooked aspect of Disruptive Innovation Theory is that this entrepreneurial investment and experimentation gave HFT firms a time advantage: theta arbitrage. HFT firms were able to engage for about a decade in predatory trading against mutual and pension funds. HFT also disrupted momentum traders, trend-followers, scalping day traders, statistical arbitrage, and some volatility trading strategies. This disruption of trading strategies led Brian R. Brown to focus on algorithmic and quantitative black boxes in his book Chasing The Same Signals (Hoboken, NJ: John Wiley & Sons, 2010).

 

Paradoxically, by the time Duhigg wrote his New York Times article, HFT had begun to lose its profitability as a trading strategy. Sociologist of finance Donald MacKenzie noted that HFT required significant capex and opex investment in low-latency infrastructure, and that this entry barrier, combined with increased competition, fueled ‘winner-takes-all’ and ‘race to the bottom’ competitive dynamics. HFT’s ‘early adopters’ got the theta arbitrage that the late-comers did not have, in a more visible and now hypercompetitive market. Duhigg’s New York Times article and the May 2010 Flash Crash also sparked an SEC regulatory debate:

 

  • On the pro side were The Wall Street Journal’s Scott Patterson; author Rishi K. Narang (Inside The Black Box); and industry exponent Edgar Perez (The Speed Traders).
  • On the con side were Haim Bodek of Decimus Capital Markets (The Problem of HFT), and Sal L. Arnuk and Joseph C. Saluzzi of Themis Trading (Broken Markets), which specialises in equities investment for mutual and pension fund clients.
  • The winner from the 2009-12 debate about HFT regulation appears to be Tradeworx’s Manoj Narang, who was pro-HFT yet also licensed his firm’s systems to the SEC for market surveillance, as a regulatory arbitrage move. The SEC now uses Tradeworx’s systems as part of its Market Information Data Analytics System (MIDAS), Philips reports.

 

Philips reports that HFT firms now have new targets: CTAs, momentum traders, swing traders, and news sentiment analytics. That might explain some recent changes I have seen whilst trading the Australian equities market. Christensen’s Disruptive Innovation Theory and theta arbitrage both imply that a trading strategy will be profitable for a time, until changes in market microstructure, technology platforms, and transaction and execution costs render it unprofitable.