Life Alpha Sources

Alpha in investment usually means: (1) excess return; (2) adjusted for risk; and (3) earned through active management.
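The risk-adjusted sense of alpha is usually formalised as Jensen’s alpha: the return earned above what the CAPM would predict for the risk taken. A minimal sketch in Python (the return, rate, and beta figures are hypothetical):

```python
def jensens_alpha(portfolio_return, risk_free_rate, beta, market_return):
    """Jensen's alpha: the portfolio's return minus what the CAPM predicts
    for its level of market risk (beta)."""
    expected = risk_free_rate + beta * (market_return - risk_free_rate)
    return portfolio_return - expected

# Hypothetical figures: 12% portfolio return, 3% risk-free rate,
# beta of 0.9, 10% market return.
alpha = jensens_alpha(0.12, 0.03, 0.9, 0.10)   # positive alpha of about 2.7%
```

A positive result is the ‘excess return, adjusted for risk’ in the definition above; active management is the claim about how it was earned.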


More generally, alpha signifies the people and resources that contribute to life significance.


This morning I made a list of alpha sources in my life. These ranged from my PhD studies and cumulative research experience to membership of international scholarly organisations. Several themes emerged:


  • The alpha sources fell into three major categories: (1) capital; (2) knowledge / networks (as resources); and (3) decisions that led to specific, mindful actions that established cause-effect chains (causality).
  • Positive changes to capital and knowledge / networks also expanded my decision scope.
  • Utilisation means taking better actions with capital and knowledge / networks (pragmatics).
  • Academic success has a winner-takes-all dynamic: a J-curve with asymmetric payoffs for those who can survive the ‘up or out’ career path.


I also noted the following after completing most of my annual tax expense estimates:


  • At least 10% of my yearly income goes to resources for personal research projects.
  • What if the personal research projects were income-producing?
  • I have spent the past decade in combinatorial search through various disciplines; now I am developing a personal synthesis that draws on all of these experiences.
  • I have access to academic networks and institutional libraries that expand the scope and reach of my personal research projects. This can be a problem: shifting goal-posts.
  • The employers I have worked for are increasingly top-down, fragilista (Taleb), and run on a private equity model.
  • Informational resources are expanding; bureaucracies are creating new ideological myths in response.
  • I signed some bad contract deals early in life that still financially affect me (namely, student debt).
  • The ‘skin in the game’ heuristic (Taleb) changes how you deal with sucker bet dynamics: you become aware that you are placed in a sucker position (a single point of failure) even if you can’t change it (external costs are shifted to you).


Some final thoughts:


  • The core resources I need for personal research — a computer, a personal research library, an inter-library loans card, and personal blog facilities — can be run on a tighter budget than I have allowed over the past decade.
  • The core challenges I have are: (1) having enough capital (including to hedge downside risks); (2) cultivating high energy / focus to do optimal research work; and (3) making daily progress amidst life changes and work commitments / routines.

Literature Review on the Black Box

Rishi K. Narang’s book Inside the Black Box: A Simple Guide to Quantitative and High Frequency Trading (Hoboken, NJ: John Wiley & Sons, 2013) proposes a generic model of a black box trading system.


Narang’s generic model features: (1) an alpha module for alpha generation; (2) a risk module for risk management; and (3) a transaction cost module for costs. These feed into (4) a portfolio management module, which then feeds into (5) an execution module.
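As a sketch only (the weighting and capping logic here is my own illustrative assumption, not Narang’s), the five modules can be expressed as a simple dataflow:

```python
def run_black_box(signals, positions):
    """Illustrative dataflow for Narang's generic model.

    signals:   dict of instrument -> alpha score (expected excess return)
    positions: dict of instrument -> current portfolio weight
    """
    alpha = dict(signals)                        # (1) alpha module: return forecasts
    risk_limits = {s: 0.10 for s in alpha}       # (2) risk module: cap any name at 10%
    est_costs = {s: 0.001 for s in alpha}        # (3) transaction cost module: 10 bps
    # (4) portfolio construction: weight by alpha net of costs, capped by risk limits
    target = {s: min(max(alpha[s] - est_costs[s], 0.0), risk_limits[s]) for s in alpha}
    # (5) execution: orders close the gap between target and current positions
    return {s: target[s] - positions.get(s, 0.0) for s in target}

# Hypothetical two-instrument example.
orders = run_black_box({"AAA": 0.05, "BBB": 0.20}, {"AAA": 0.02})
```

The point of the structure is that alpha, risk, and cost estimates are produced independently and only reconciled at the portfolio construction stage.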


High frequency trading’s innovation was to alter the alpha / risk / portfolio equation through changes to transaction cost and execution strategies.


As an exercise I used the categories from Narang’s generic model to organise some Amazon Kindle trading books:


  • Alpha module: 83 books.
  • Risk module: 65 books.
  • Transaction Cost module: 22 books.
  • Portfolio Construction module: 38 books.
  • Execution module: 85 books.


Some books fall under more than one category. I also used some additional categories:


  • Algorithmic trading: 80 books.
  • Trading strategies: 229 books.
  • Trading psychology (including therapeutic manuals): 174 books.
  • Funds: 76 books.


A couple of observations from this initial cumulative literature review of black box trading systems:


  • Most of the public trading literature deals in an unstructured way with alpha strategies or with trading strategies – the overwhelming emphasis is on momentum and trend-following strategies that high-frequency trading has now disrupted. Some of these books are still variants on trading strategies from the pre-dotcom 1990s. Some publishers recycle themes using new cover designs, new authors, and small, cumulative additions of information. In contrast, some of the most interesting information comes from outlier authors. I have screened out most of the cheap Kindle books that now add noise for new retail traders.
  • The real sources of institutional or proprietary alpha are only hinted at in the publicly available trading literature – and are more often glimpsed in investigative journalism accounts. Many trading books are written by pseudo-retail traders who have developed white box trading systems using basic technical analysis, risk, and money management rules. Jack Schwager’s Market Wizards series remains influential and significant in part because it offers a glimpse of how professional traders and successful money managers actually think.
  • The literature on trading psychology developed in part as a way to deal with the methodological limitations of the Edwards and Magee school of technical analysis that focused on signals and indicators.
  • The portfolio construction literature emerged from Harry Markowitz’s work in corporate finance, and later, David Swensen’s development of the Yale endowment model of foundation investment.
  • The risk literature covers either traditional corporate finance models, value at risk models, post-Taleb extreme value models on tail risk, or recent applications of Bayesian probabilities to portfolio models.
  • The funds literature covers white box versions of hedge fund, mutual fund, and sovereign wealth fund strategies.
  • The algorithmic trading literature covers general overviews, order types, white box strategies, and some computer science / programming manuals on algorithms. There is very little publication of actual trading algorithms or code. A computer science / programming background is helpful for quantitative finance.
  • The rise of high frequency trading has led to a greater focus on transaction costs and execution as sources of competitive edge. This emphasis differs from the Edwards and Magee focus on signals and indicators that provide set-ups for possible trades. It’s also what is missing from much of the publicly available literature on trading systems (which itself is very fractured). Thus, most trading books suffer from transaction / execution cost decays.


This initial literature review suggests the following strategies for future systems development:


  • Continue to find potential sources of alpha whilst noting the patterns of alpha decay (i.e. how alpha ends).
  • Decompose the alpha-risk-portfolio literature into checklists, an expert system, portfolio screens, or rules with an awareness of Bayesian probabilities (= edge / positive expectancy as ‘go / no go’ criteria: if there is no real edge then don’t trade – and this also involves understanding other traders and known trading algorithms). Eventually, this ‘explicit’ codification may be integrated with an off-the-shelf machine learning system such as David Aronson and Timothy Masters’ TSSB software.
  • Screen out the trading strategies that are now unsuccessful in the current market environment (strategy decay): focusing on transaction / execution costs will be helpful.
  • Continue to do developmental / therapeutic work for cultivating expertise and improving decision heuristics / judgment.
  • Search for new opportunities that involve more competitive transaction / execution costs (although this is difficult as an Australian-based retail trader due to broker / exchange / platform limitations).
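The ‘edge / positive expectancy as go / no go criteria’ idea above can be sketched as a simple check, assuming hypothetical win rate, payoff, and cost figures:

```python
def expectancy(win_rate, avg_win, avg_loss):
    """Expected value per trade: the 'edge' in positive-expectancy terms."""
    return win_rate * avg_win - (1 - win_rate) * avg_loss

def go_no_go(win_rate, avg_win, avg_loss, costs_per_trade):
    """'Go' only if the edge survives transaction / execution costs."""
    return expectancy(win_rate, avg_win, avg_loss) - costs_per_trade > 0

# Hypothetical strategy: 45% win rate, $300 average win, $150 average loss,
# $20 round-trip costs per trade.
decision = go_no_go(0.45, 300.0, 150.0, 20.0)   # True: $52.50 expectancy clears costs
```

Note how the ‘no go’ decision can flip on costs alone, which is the transaction / execution cost decay argument in a single line.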

Thematic Analysis of a Reading List on Investment Alpha

I recently did a thematic analysis of a reading list on investment alpha, which involves:


1. Excess return.

2. Active management.

3. Adjusted risk.


The following themes emerged from the reading list, and from also checking the rankings of several hundred books:


1. Excess return: fund type (hedge fund, private equity, venture capital); return drivers (including asset class); and quantitative models.


2. Active management: discretionary (human trading, portfolio composition and rebalancing, options, technical analysis) and algorithmic (algorithmic trading; complex event / stream processing; computational intelligence; genetic algorithms; machine learning; neural nets; and software agents).


3. Adjusted risk: Bayesian probabilities; investor psychology; market microstructure; and risk management models (such as Monte Carlo simulation, Value at Risk, and systematic risk).
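Of the risk management models under (3), Value at Risk via Monte Carlo simulation is straightforward to sketch. A minimal version, assuming normally distributed returns and hypothetical distribution parameters:

```python
import random

def monte_carlo_var(mean, stdev, confidence=0.95, n_sims=100_000, seed=42):
    """One-period Value at Risk by Monte Carlo: simulate returns from an
    assumed normal distribution and read off the loss at the chosen
    confidence level."""
    rng = random.Random(seed)
    sims = sorted(rng.gauss(mean, stdev) for _ in range(n_sims))
    cutoff = sims[int((1 - confidence) * n_sims)]   # e.g. the 5th percentile return
    return -cutoff                                  # report the loss as a positive number

# Hypothetical daily returns: 0.05% mean, 1.5% standard deviation.
var_95 = monte_carlo_var(0.0005, 0.015)   # roughly 1.645 * 1.5% for a normal
```

The normality assumption is exactly what the post-Taleb tail risk literature attacks; swapping in a fat-tailed distribution is the one-line change that literature argues for.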


This core work suggests the following query line:


SELECT return drivers (Bayesian belief network) (multi-asset) (portfolio) (fund)


WHERE risk (Bayesian probability) (exposures) (exposures – investor decisions) (exposures – market microstructure) AND trade (algorithms)


ORDER BY Bayesian (belief network, probability); return drivers (multi-asset); risk (exposures); and trading (algorithms).


This thematic analysis will help to focus my post-PhD research on the sociology of finance into the following initial research questions:


1. What is the spectrum of possible return drivers in a multi-asset world?


A good model for this is David Swensen’s Yale endowment portfolio detailed in Pioneering Portfolio Management: An Unconventional Approach to Institutional Investment (New York: The Free Press, 2009). Antti Ilmanen’s magisterial Expected Returns: An Investor’s Guide to Harvesting Market Rewards (Hoboken, NJ: John Wiley & Sons, 2011) has information on the return drivers of specific asset classes. Matthew Hudson’s recent Funds: Private Equity, Hedge Funds, and All Core Structures (Hoboken, NJ: John Wiley & Sons, 2014) deals with global fund structures.


2. What specific risk exposures might these multi-assets face, and under what conditions?


Richard C. Grinold and Ronald N. Kahn’s Active Portfolio Management: A Quantitative Approach for Producing Superior Returns and Controlling Risk (New York: McGraw-Hill, 1999) is the classic book on institutional portfolio models. Morton Glantz and Robert Kissell’s Multi-Asset Risk Modeling: Techniques for a Global Economy in an Electronic and Algorithmic Trading Era (San Diego, CA: Academic Press, 2014) is a recent book I will look at. Charles-Albert Lehalle and Sophie Laruelle’s Market Microstructure in Practice (Singapore: World Scientific Publishing Company, 2014), and Thierry Foucault, Marco Pagano, and Ailsa Roell’s Market Liquidity: Theory, Evidence, and Policy (New York: Oxford University Press, 2013) deal respectively with the practice and theory of contemporary financial markets. There are many books on behavioural finance and investor psychology: two recent ones are H. Kent Baker and Victor Ricciardi’s collection Investor Behavior: The Psychology of Financial Planning and Investing (Hoboken, NJ: John Wiley & Sons, 2014), and Tim Richards’ Investing Psychology: The Effects of Behavioral Finance on Investment Choice and Bias (Hoboken, NJ: John Wiley & Sons, 2014).


3. How can algorithmic trading and computational techniques model the risk-return dynamics of alpha generation?


Despite its flaws Rishi K. Narang’s Inside the Black Box: A Simple Guide to Quantitative and High Frequency Trading (Hoboken, NJ: John Wiley & Sons, 2013) opened my eyes to the structures needed for alpha generation. The Bayesian approach is detailed in David Barber’s Bayesian Reasoning and Machine Learning (New York: Cambridge University Press, 2012). Barry Johnson’s Algorithmic Trading and DMA: An Introduction to Direct Access Trading Strategies (London: 4Myeloma Press, 2010) and Robert Kissell’s The Science of Algorithmic Trading and Portfolio Management (San Diego, CA: Academic Press, 2013) deal with order types in algorithmic trading. Christian Dunis, Spiros Likothanassis, Andreas Karathanasopoulos, Georgios Sermpinis, and Konstantinos Theofilatos have edited a recent collection on Computational Intelligence Techniques for Trading and Investment (New York: Routledge, 2014). Eugene A. Durenard’s Professional Automated Trading: Theory and Practice (Hoboken, NJ: John Wiley & Sons, 2013) covers software agents. For retail trader-oriented applications of data mining, machine learning, and Monte Carlo simulations there is Kevin Davey’s Building Algorithmic Trading Systems: A Trader’s Journey from Data Mining to Monte Carlo Simulation to Live Trading (Hoboken, NJ: John Wiley & Sons, 2014), and David Aronson and Timothy Masters’ Statistically Sound Machine Learning for Algorithmic Trading of Financial Instruments: Developing Predictive-Model-Based Trading Systems Using TSSB (CreateSpace, 2013).


What this means is that for an investment of about $US1,000 a new researcher can gain some of the core books on institutional, quantitative portfolio and risk management; behavioural finance and market microstructure as potential sources for edges; and some recent practitioner-oriented literature on algorithmic / automated trading that uses computational intelligence.


In deference to Mao and McKenzie Wark’s vectoralist class:


Let a thousand algorithmic / quantitative micro-funds bloom.

15th June 2013: HFT, Disruptive Innovation & Theta Arbitrage

23rd July 2009 was perhaps the day that retail investors became aware of high-frequency trading (HFT).


That was the day that New York Times journalist Charles Duhigg published an article on HFT and market microstructure changes. Duhigg’s article sparked a public controversy about HFT and changes to United States financial markets.


Then on 6th May 2010 came the Flash Crash. HFT was again the villain.


For the past few years HFT has inspired both pro and con books from publishers. HFT has changed how some retail investors and portfolio managers at mutual and pension funds view financial markets. Now, Matthew Philips of Bloomberg Businessweek reports that 2009-10 may have been HFT’s high-point in terms of being a profitable strategy.


Philips’ findings illustrate several often overlooked aspects of Clayton Christensen’s Disruptive Innovation Theory. Scott Patterson notes in his book Dark Pools (New York: Crown Business, 2012) that HFT arose from a combination of entrepreneurial innovation; technological advances in computer processing power; and changes to US Securities and Exchange Commission regulations. Combined, these advances enabled HFT firms to trade differently to other dotcom-era and post-dotcom firms that still used human traders or mechanical trading systems. This trading arbitrage fits Christensen’s Disruptive Innovation Theory as a deductive, explanatory framework.


The usually overlooked aspect of Disruptive Innovation Theory is that this entrepreneurial investment and experimentation gave HFT firms a time advantage: theta arbitrage. HFT firms were able to engage for about a decade in predatory trading against mutual and pension funds. HFT also disrupted momentum traders, trend-followers, scalping day traders, statistical arbitrage, and some volatility trading strategies. This disruption of trading strategies led Brian R. Brown to focus on algorithmic and quantitative black boxes in his book Chasing The Same Signals (Hoboken, NJ: John Wiley & Sons, 2010).


Paradoxically, by the time Duhigg wrote his New York Times article, HFT had begun to lose its profitability as a trading strategy. Sociologist of finance Donald MacKenzie noted that HFT required significant capex and opex investment to achieve low latency, and that this arms race fueled ‘winner-takes-all’ and ‘race to the bottom’ competitive dynamics. HFT’s ‘early adopters’ got the theta arbitrage that the late-comers did not have, in a more visible and now hypercompetitive market. Duhigg’s New York Times article and the May 2010 Flash Crash also sparked an SEC regulatory debate:


  • On the pro side were The Wall Street Journal’s Scott Patterson; author Rishi K. Narang (Inside The Black Box); and industry exponent Edgar Perez (The Speed Traders).
  • On the con side were Haim Bodek of Decimus Capital Markets (The Problem With HFT), and Sal L. Arnuk and Joseph C. Saluzzi of Themis Trading (Broken Markets) which specialises in equities investment for mutual and pension fund clients.
  • The winner of the 2009-12 debate about HFT regulation appears to be Tradeworx’s Manoj Narang, who was pro-HFT yet also licensed his firm’s systems to the SEC for market surveillance – a regulatory arbitrage move. The SEC now uses Tradeworx’s systems as part of the Market Information Data Analytics System (MIDAS), Philips reports.


Philips reports that HFT firms now have new targets: CTAs, momentum traders, swing traders, and news sentiment analytics. That might explain some recent changes I have seen whilst trading the Australian equities market. Christensen’s Disruptive Innovation Theory and theta arbitrage both mean that a trading strategy will be profitable only for a time, before changes in market microstructure, technology platforms, and transaction and execution costs render it unprofitable.

2nd July 2012: Ray Dalio

I’ve spent the past few weeks reading about Bridgewater hedge fund founder Ray Dalio who is notorious for his management principles. The Economist and Barron’s have profiled Dalio recently and he has updated his model of how the economy works. Dalio also did extensive interviews for Maneet Ahuja’s The Alpha Masters and Jack D. Schwager’s Hedge Fund Market Wizards. Dalio’s secret is to find 15 different and uncorrelated alpha streams; to separate alpha from beta exposure; to have a 6-18 month holding timeframe; and to control transaction and execution costs.
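Dalio’s emphasis on uncorrelated alpha streams follows from basic portfolio arithmetic: an equal-weight portfolio of N uncorrelated, equally volatile streams has volatility sigma / sqrt(N). A sketch (the 10% volatility figure is hypothetical):

```python
import math

def portfolio_vol(n_streams, stream_vol):
    """Volatility of an equal-weight portfolio of uncorrelated, equally
    volatile return streams: sigma_p = sigma / sqrt(N)."""
    return stream_vol / math.sqrt(n_streams)

# Hypothetical: each alpha stream runs at 10% volatility.
one_stream = portfolio_vol(1, 0.10)        # 10% on its own
fifteen_streams = portfolio_vol(15, 0.10)  # about 2.6% combined
```

Fifteen genuinely uncorrelated streams cut risk to roughly a quarter of a single stream’s, which is why finding them, rather than leveraging one, is presented as the secret.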

Foreclosure Of A Hedge Fund Dream

Media personalities who took a career detour into managing hedge funds are the latest casualty of the subprime fallout, reports New York Times journalist Andrew Ross Sorkin.

Sorkin profiles Ron Insana the former CNBC news anchor who founded Insana Capital Partners at the height of easy credit in 2006 and closed ICP in August 2008.  Insana raised $US116 million from major investor Deutsche Bank and media contacts.  Rather than invest directly in complex financial instruments Insana chose an intermediary position: a fund of funds investor in a diversified portfolio of hedge funds.

Insana made several errors that led to ICP’s blow-up.  Sorkin notes the $US116 million was a smaller capital raising than its blue chip competitors achieved.  The fund of funds positioning meant a rational herds strategy across the hedge funds that ICP invested in.  Subprime-caused market volatility set off a cascade: the hedge funds didn’t make alpha returns above the market and ICP didn’t have a diversified enough portfolio to weather the volatility.  Consequently, ICP still had to restore investors’ original capital (the ‘high water mark’ rule) before it could earn the performance component of its ‘1.5 and 20’ fee (a 1.5% management fee on assets and 20% of fund profits).
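Sorkin’s fee arithmetic can be made concrete. A sketch of a ‘1.5 and 20’ structure with a high water mark (the fund figures are hypothetical and the fee mechanics are simplified):

```python
def fund_fees(assets, profit, high_water_mark, mgmt_rate=0.015, perf_rate=0.20):
    """Simplified '1.5 and 20' mechanics: the management fee accrues on assets,
    while the performance fee accrues only on value above the high water mark."""
    mgmt_fee = assets * mgmt_rate
    above_hwm = max(0.0, (assets + profit) - high_water_mark)
    perf_fee = perf_rate * above_hwm
    return mgmt_fee, perf_fee

# Hypothetical: a $116 million fund loses $5 million against a
# $116 million high water mark.
mgmt, perf = fund_fees(116e6, -5e6, 116e6)   # management fee only; no performance fee
```

An underwater fund earns only its management fee, which then has to cover the infrastructure costs Sorkin itemises below.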

Sorkin is insightful about the cost structures of hedge funds:

That would have been enough if it was just Mr. Insana, a secretary and
a dog. But Mr. Insana was hoping to attract more than $1 billion from
investors. And most big institutions won’t even consider investing in a
fund that doesn’t have a proper infrastructure: a compliance officer,
an accountant, analysts and so on. Mr. Insana had seven employees, and
was paying for office space in the former CNBC studios in Fort Lee,
N.J., and Bloomberg terminals — at more than $1,500 a pop a month —
while traveling the globe in search of investors. Under the
circumstances, $870,000 just wasn’t going to last very long.

This ‘contrarian’ observation highlights the leverage of institutional investors, and, in contrast to the usual media portrayal, the regulatory burdens of institutional compliance on funds.

Sorkin’s profile raises some interesting questions beyond his comparison of Insana and the media-savvy millionaires who blew up after the April 2000 dotcom crash.  Did ICP adopt the trend following strategy from CNBC’s media coverage and Insana’s popular books?  If so, could Insana distinguish between market noise and critical events?  How did Insana grapple with the career change from CNBC news anchor to hedge fund head?  What risk mitigation steps did ICP’s investors demand, and did Insana exercise prudential caution?  When he had to close ICP was Insana able to be self-critical about his past decisions and errors?  Are there firm-specific, operational and positioning risks for funds of funds?  That would be a really interesting post-implementation review for aspiring hedge fund mavens.

Don’t expect to see it in CNBC European Business or Bloomberg Markets anytime soon.

Errors In Quantitative Models & Forecasting

Could the roots of the 2007 subprime crisis in collateralised debt obligations (CDOs) and residential mortgage-backed securities (RMBS) lie in financial analysts who all used similar assumptions and forecasts in their quantitative models?

Barron’s Bill Alpert argues so, pointing to a shift of investment styles after the 2000 dotcom crash from sector-specific, momentum and growth stocks to value investing.  Investment managers who prefer the value approach then constructed their portfolios with ‘stocks that were cheap relative to their book value.’  In other words, the value investors exploited several factors — gaps in asset valuation, asymmetries in public and private information sources, price discovery mechanisms and market participants — that contributed to stocks being mispriced relative to their true value.

However, the value investing strategy had a blindspot: many of the stocks selected for investment portfolios also had a high exposure to credit and default risk.  The 2007 subprime crisis exposed this blindspot, which adversely affected value investors whose portfolios had stocks with a high degree of positive covariance.

Alpert quotes hedge fund manager Rick Bookstaber who believes that financial engineers have accelerated crises and systemic risks via the complex dynamics of new futures contracts, exotic options and swaps.  These new financial instruments create interlocking markets (capital, commodities, debt, equity, treasuries) which have the second-order effects of larger yield curve spreads and trading volatility.  Alpert and Bookstaber’s views echo Susan Strange‘s warnings a decade ago of ‘casino capitalism’  and ‘mad money’ as unconstrained forces in the international political economy.

Quantitative models also failed to foresee the 2007 subprime crisis due to excessive leverage, the difficulty of achieving ‘alpha’ or above-market returns amid market volatility, and the separation of risk management from the modelling and testing process.  Other commentators have raised the first two errors, which have led to changes in portfolio construction and market monitoring.  Nassim Nicholas Taleb has built a second career on the third error, with his Black Swan conjecture of high-impact events, randomness and uncertainty (see Taleb’s Long Now Foundation lecture The Future Has Always Been Crazier Than We Thought).

Alpert hints that these three errors may lead to several outcomes: (1) a new ‘arms race’ between investment managers to find the new ‘factors’ in order to construct resilient investment portfolios; (2) the integration of Taleb’s second-order creative thinking and risk management in the construction of financial models, in new companies and markets such as George Friedman’s risk boutique Stratfor; and (3) a new ‘best of breed’ manager who can make investment decisions in a global and macroeconomic environment of correlated and integrated financial markets.