Picks & Pans

Losing Reality: On Cults, Cultism and the Mindset of Political and Religious Zealotry by Robert Jay Lifton (New York: The Free Press, 2019). Lifton is an American psychiatrist whose work on thought reform and ‘ideological totalism’ shaped the 1970s and 1980s debates about cults and brainwashing. This small book collects Lifton’s insights on topics ranging from the Korean War and Nazi doctors to Aum Shinrikyo and President Donald Trump’s political psychology. Lifton observes that we have a ‘protean’ self that can transform under existential and psychosocial pressures. A doorway to understanding contemporary metapolitical issues in liberal democratic, authoritarian, and totalitarian societies.

The Twittering Machine by Richard Seymour (London: The Indigo Press, 2019). Seymour is a British, Marxist-influenced social critic who previously profiled UK Labour leader Jeremy Corbyn prior to Labour’s disastrous 2019 election campaign. In this polemical book, Seymour examines the addictive psychology that underpins the ‘social [media] industry’ and its emergent phenomena, such as internet celebrities and trolling. One side-effect of this industry, Seymour observes, is a new immersive dynamic of writing. This book is a reflective primer for interacting with the internet and social media in a more mindful and strategic way.

Status: Why Is It Everywhere? Why Does It Matter? by Cecilia L. Ridgeway (New York: Russell Sage Foundation, 2019). Ridgeway is a Stanford University sociologist, and her publisher, the Russell Sage Foundation, is a major philanthropic funder of social inequality research. In this book Ridgeway advances a cultural schema of status as a form of social inequality, and shows how it shapes status beliefs and the microdynamics of status. Ridgeway’s cultural schema framework builds on the earlier insights of sociologist Charles Tilly and others to explain how social stratification works in the United States.

The Man Who Solved The Market: How Jim Simons Launched The Quant Revolution by Gregory Zuckerman (New York: Penguin Books, 2019). Jim Simons is a former National Security Agency-affiliated cryptographer and Stony Brook University mathematician who in 1982 founded Renaissance Technologies, the world’s most profitable quantitative hedge fund. Zuckerman’s investigative reportage provides a glimpse inside Renaissance’s black box and shows how Simons used pattern recognition to generate record profits from the financial markets. Robert Mercer, then Renaissance’s co-Chief Executive Officer, was a major donor to and strategist for President Donald Trump’s 2016 election campaign.

The Gig Academy: Mapping Labor in the Neoliberal University by Adrianna Kezar, Tom DePaola and Daniel T. Scott (Baltimore, MD: Johns Hopkins University Press, 2019). Over the past two decades the elite status of academic tenure has steadily been eroded in the United States and in many other countries. This book surveys what has replaced it: a ‘neoliberal university’ of short-term and fixed-term contracts, a focus on obtaining external, competitive research funding, and the resulting social stratification. The authors trace recent developments in the academic labour market to the ‘gig economy’: labour practices adopted from Uber and similar platform capitalists.

Poisoner In Chief: Sidney Gottlieb and the CIA Search for Mind Control by Stephen Kinzer (New York: Henry Holt and Company, 2019). Sidney Gottlieb (1918-1999) was a United States chemist who spearheaded the Central Intelligence Agency’s now infamous MK-Ultra research program. Kinzer fills in some gaps about Gottlieb’s life; the medical and ‘special interrogation’ projects he oversaw in MK-Ultra; and how he dealt with United States Senate and public investigations into MK-Ultra’s abuses and legacy. There is plenty of conspiratorial myth-making about what Gottlieb did and what he did (or did not) achieve: Kinzer’s investigative reportage gets closer than most accounts to what probably happened.

Capitalism Alone: The Future of the System That Rules the World by Branko Milanovic (Cambridge, MA: Harvard University Press, 2019). Milanovic is an influential economist and senior scholar at the City University of New York’s Stone Center on Socio-Economic Inequality. In this book he examines the political economy of Liberal Meritocratic Capitalism; its challenger, Political Capitalism; and the implications for globalisation and the future of the capitalist economic system. An insightful, empirically informed analysis of the ‘hypercommercial’ world that is highly likely to emerge in the 21st century.

Fortress Russia: Conspiracy Theories in the Post-Soviet World by Ilya Yablokov (Cambridge: Polity Press, 2018). United States political discourse since the 2016 election has been dominated by allegations of Russian political meddling. Less well understood is the metapolitical function of conspiracy theories within Russia itself and in post-Soviet nation-states. This book, based on Yablokov’s doctoral dissertation, advances some new explanations as to why this is so, and profiles some of the leading and most influential practitioners. For contrasting views, see Eliot Borenstein’s recent book Plots Against Russia: Conspiracy and Fantasy After Socialism (Ithaca, NY: Cornell University Press, 2019) and Peter Pomerantsev’s This Is Not Propaganda: Adventures in the War Against Reality (London: Faber & Faber, 2019).

On Jim Simons, String Theory, and Quantitative Hedge Funds

Renaissance Technologies founder and former mathematics professor Jim Simons is an enigma in the world of quantitative hedge funds.

Simons rarely gives interviews. One of the best is an Institutional Investor interview he gave in 2000 (PDF). One insight is that Renaissance trades over specific time periods, using pattern recognition to model volatility.
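As a purely illustrative gloss on what ‘pattern recognition to model volatility’ can mean at its simplest, here is a minimal Python sketch that estimates rolling realised volatility over fixed windows and flags high-volatility regimes. The window length, threshold, and simulated prices are assumptions for the example; Renaissance’s actual models are proprietary and undisclosed.

```python
# Toy illustration only: rolling volatility over fixed windows plus a crude
# high/low regime flag. Not Renaissance's methodology, which is undisclosed.
import numpy as np

def rolling_volatility(prices: np.ndarray, window: int = 20) -> np.ndarray:
    """Annualised rolling standard deviation of log returns."""
    log_returns = np.diff(np.log(prices))
    vol = np.full(log_returns.shape, np.nan)
    for t in range(window, len(log_returns) + 1):
        vol[t - 1] = log_returns[t - window:t].std(ddof=1) * np.sqrt(252)
    return vol

def volatility_regimes(vol: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """Label each period 1.0 (high volatility) or 0.0 (low); NaN where undefined."""
    regimes = np.full(vol.shape, np.nan)
    valid = ~np.isnan(vol)
    regimes[valid] = (vol[valid] > threshold).astype(float)
    return regimes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated price path: calm first half, turbulent second half.
    returns = np.concatenate([rng.normal(0, 0.005, 250), rng.normal(0, 0.02, 250)])
    prices = 100 * np.exp(np.cumsum(returns))
    regimes = volatility_regimes(rolling_volatility(prices))
    print(f"Share of periods flagged as high volatility: {np.nanmean(regimes):.2f}")
```

Even this toy version captures the basic idea that the statistical character of a price series differs across time periods, which is presumably where the real modelling work begins.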

Simons has done important work in differential geometry that has informed the theoretical physics subdiscipline of string theory. I recently looked at some academic journal articles by Lars Brink (Sweden’s Chalmers University of Technology) and Leonard Susskind (Stanford University) to try to understand how Simons views financial markets.

String theory proposes one-dimensional objects called strings as particle-like objects with quantum states. String theory and cosmology have progressed over the past 35 years in describing these phenomena but still lack some key insights.

How might Simons use string theory to understand financial markets? Two possibilities:

(1) The mathematical language of couplings, phase transitions, perturbations, rotational states, and supersymmetries provides a scientific way to describe financial market data and price time-series. It does so in a different way to fundamental analysis, technical analysis, and behavioural finance: Simons uses string theory to understand the structure of information in financial markets. (Ed Thorp pursued a similar insight with Claude Shannon using probability theory.) String theory-oriented trading may be falsifiable in the sense of Karl Popper’s philosophy of science.

(2) String theory provides a topological model that can be applied to money flows between mutual funds, hedge funds, and bank trading desks over short periods of time. This might enable Simons’ traders to forecast the likely catalysts for short-term changes in stock prices and to trade accordingly: for instance, forecasting how a price trajectory could change if portfolio managers at other funds alter their portfolio weights in a stock (a toy sketch of this flow-to-price logic follows below). If so, Simons trades in a similar way to SAC’s Steve Cohen (who uses game theory) and D.E. Shaw’s David Shaw, but with different methods of pattern recognition.
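The string theory framing in (2) is speculative, but the underlying flow-to-price idea can be illustrated with a standard tool from the market microstructure literature: the square-root impact approximation, in which a net flow of Q shares in a stock that trades V shares a day with daily volatility sigma moves the price by roughly Y * sigma * sqrt(Q / V). The sketch below is a generic illustration with hypothetical numbers, not Renaissance’s (undisclosed) methodology; the coefficient Y is an assumption.

```python
# Toy flow-to-price sketch using the square-root impact approximation from the
# market microstructure literature. Illustrative only; not Renaissance's method.
import numpy as np

def expected_impact(net_flow_shares: float,
                    avg_daily_volume: float,
                    daily_volatility: float,
                    y_coefficient: float = 0.5) -> float:
    """Approximate fractional price move from a net flow:
    impact ~ Y * sigma * sqrt(|Q| / V), signed by the flow direction."""
    magnitude = y_coefficient * daily_volatility * np.sqrt(abs(net_flow_shares) / avg_daily_volume)
    return float(np.sign(net_flow_shares)) * magnitude

if __name__ == "__main__":
    # Hypothetical scenario: another fund rebalances into a stock, buying 2m
    # shares of a name that trades 10m shares a day with 2% daily volatility.
    move = expected_impact(net_flow_shares=2_000_000,
                           avg_daily_volume=10_000_000,
                           daily_volatility=0.02)
    print(f"Expected short-horizon price move: {move:+.2%}")
```

A trader who could estimate other funds’ flows before they complete could, in principle, position ahead of the implied move; the hard part is the flow estimation, not the arithmetic.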

I have made a list of popular science books and Springer academic monographs to keep an eye on developments in string theory. Simons’ success also illustrates how insights from one knowledge domain (string theory, astrophysics, computational linguistics, and voice recognition) can be transferred to another (financial market trading).

17th April 2012: Is Wall Street Inefficient?

[Photo: Wall Street]

New York University’s Thomas Philippon has a new paper that reaches counterintuitive conclusions about Wall Street:

Historically, the unit cost of intermediation has been somewhere between 1.3% and 2.3% of assets. However, this unit cost has been trending upward since 1970 and is now significantly higher than in the past. In other words, the finance industry of 1900 was just as able as the finance industry of 2010 to produce loans, bonds and stocks, and it was certainly doing it more cheaply. This is counter-intuitive, to say the least. How is it possible for today’s finance industry not to be significantly more efficient than the finance industry of John Pierpont Morgan? [emphasis added]

Philippon’s study of financial intermediation costs emphasises debt growth, hidden systemic risks and the growth in trading as a secondary market activity. The New Republic‘s Timothy Noah also emphasised trading in his write-up of Philippon’s results.

The paper’s first problem is Philippon’s reliance on the “neo-classical growth model”, which argues that information technology, trading, and risk management should lead to lower costs and superior information about future prices. A second problem is that Philippon treats growth in ‘informativeness’ as a key criterion variable but does not adequately define it. A third problem is that the paper examines two key periods (1900-1910 and 1980-2011) without considering how the innovation pathways in financial intermediation have also changed, both in-period and across-period. For instance, Value at Risk looked like a great innovation in 1992 but was re-evaluated in 2007-08 during the global financial crisis. A fourth problem is that prices in trading do not always reflect future prices, or even the fair market value of firms: they can reflect the market-moving tactics of hedge funds and other firms. A fifth problem is that the inefficiencies Philippon identifies lie partly in the fees and incentives that the mutual fund industry charges investors as revenue generation, and in Wall Street’s incentive structure of end-of-year bonuses. Any evaluation of financial intermediation efficiencies should therefore take current market practices into account.
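To make the quoted unit-cost figures concrete: as I understand Philippon’s measure, the unit cost of intermediation is roughly the finance industry’s income divided by the quantity of assets it intermediates. A back-of-the-envelope sketch with hypothetical round numbers (not Philippon’s data) shows how a figure in the quoted 1.3%-2.3% range can arise:

```python
# Back-of-the-envelope illustration of a unit-cost-of-intermediation measure:
# unit cost ~ finance industry income / quantity of intermediated assets.
# The figures below are hypothetical round numbers, not Philippon's data.
finance_income_share_of_gdp = 0.08   # assume finance value added is ~8% of GDP
intermediated_assets_to_gdp = 4.0    # assume intermediated assets are ~4x GDP

unit_cost = finance_income_share_of_gdp / intermediated_assets_to_gdp
print(f"Implied unit cost of intermediation: {unit_cost:.1%}")  # prints 2.0%
```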

If Philippon had used a market microstructure framework then he might have reached different conclusions about the paper’s aggregate data. Specific firms are able to leverage information technology, trading, and risk management to gain an edge on other market participants. They extract alpha (returns from active management above and beyond the market index return, or beta). This creates a ‘winner takes all’ dynamic in which a small group of efficient firms do exceedingly well. However, the Schumpeterian dynamics of inter-firm competition mean that factors like information technology do not simply lead over time to greater efficiencies and lower costs, as they did for Wal-Mart. Quantitative finance firms like Jim Simons’ Renaissance Technologies, Clifford Asness’s AQR Capital and David Shaw’s D.E. Shaw & Company spend millions on infrastructure and proprietary research to outpace their competitors. This creates ‘informativeness’ in the form of private knowledge that Philippon’s models probably could not measure. Is this really a misallocation of capital?
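To make the alpha and beta distinction concrete, here is a minimal sketch of the standard CAPM-style regression: a fund’s daily returns are regressed on the market’s, the slope is beta and the annualised intercept is alpha. The simulated return series and parameters below are illustrative assumptions, not any real fund’s figures.

```python
# Minimal sketch of estimating alpha and beta via an ordinary least squares
# regression of fund returns on market returns. Simulated data for illustration.
import numpy as np

def alpha_beta(fund_returns: np.ndarray, market_returns: np.ndarray) -> tuple:
    """Return (annualised alpha, beta) from a daily-return regression."""
    X = np.column_stack([np.ones_like(market_returns), market_returns])
    intercept, slope = np.linalg.lstsq(X, fund_returns, rcond=None)[0]
    return intercept * 252, slope  # annualise the daily intercept

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    market = rng.normal(0.0003, 0.01, 1000)                    # toy market returns
    fund = 0.0002 + 0.8 * market + rng.normal(0, 0.005, 1000)  # true alpha ~5% p.a.
    alpha, beta = alpha_beta(fund, market)
    print(f"Estimated alpha: {alpha:.1%} per year, beta: {beta:.2f}")
```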

Photo: apertu/Flickr.