17th April 2012: Is Wall Street Inefficient?


 

New York University’s Thomas Philippon has a new paper that reaches counterintuitive conclusions about Wall Street:

 

Historically, the unit cost of intermediation has been somewhere between 1.3% and 2.3% of assets. However, this unit cost has been trending upward since 1970 and is now significantly higher than in the past. In other words, the finance industry of 1900 was just as able as the finance industry of 2010 to produce loans, bonds and stocks, and it was certainly doing it more cheaply. This is counter-intuitive, to say the least. How is it possible for today’s finance industry not to be significantly more efficient than the finance industry of John Pierpont Morgan? [emphasis added]
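
To make the quoted measure concrete, here is a minimal sketch in Python of how a unit cost of intermediation could be computed, assuming the simplification that it is the finance industry’s income divided by the stock of intermediated assets. The unit_cost helper and all figures are hypothetical illustrations, chosen to fall inside the 1.3%-2.3% range quoted above; they are not data from the paper.

def unit_cost(industry_income: float, intermediated_assets: float) -> float:
    """Income earned by financial intermediaries per dollar of assets intermediated."""
    return industry_income / intermediated_assets

# Hypothetical observations: (income, intermediated assets) in $ trillions.
# These values are invented for illustration, not Philippon's data.
observations = {
    1900: (0.0004, 0.03),
    2010: (0.9, 45.0),
}

for year, (income, assets) in sorted(observations.items()):
    print(f"{year}: unit cost = {unit_cost(income, assets):.2%} of assets")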

 

Philippon’s study of financial intermediation costs emphasises debt growth, hidden systemic risks and the growth of trading as a secondary market activity. The New Republic‘s Timothy Noah also emphasised trading in his write-up of Philippon’s results. The paper’s first problem is Philippon’s reliance on the “neo-classical growth model”, which argues that information technology, trading, and risk management should lead to lower costs and superior information about future prices. A second problem is that Philippon treats growth in ‘informativeness’ as a key criterion variable but does not adequately define it. A third problem is that the paper compares two key periods (1900-1910 and 1980-2011) without considering how the innovation pathways in financial intermediation have also changed, both in-period and across-period. For instance, Value at Risk looked like a great innovation in 1992 but was re-evaluated during the 2007-08 global financial crisis, as sketched below. A fourth problem is that prices in trading do not always reflect future prices, or even the fair market value of firms: they can reflect the market-moving tactics of hedge funds and other firms. A fifth problem is that the inefficiencies Philippon identifies lie partly in the fees and incentives that the mutual fund industry charges investors as revenue generation (and in Wall Street’s incentivisation through end-of-year bonuses). Any evaluation of financial intermediation efficiencies should therefore take current market practices into account.
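
Since Value at Risk carries much of the weight in the third point, a minimal sketch of the historical-simulation variant may help, assuming an invented series of daily returns. The 2007-08 re-evaluation stemmed partly from the weakness visible here: the estimate only knows about losses already in the backward-looking sample, so a window that misses a crisis understates tail risk.

import random

random.seed(42)

# Hypothetical daily portfolio returns (roughly two years of trading days).
returns = [random.gauss(0.0005, 0.01) for _ in range(500)]

def historical_var(returns, confidence=0.99):
    """One-day Value at Risk by historical simulation: the loss threshold
    that past returns breached (1 - confidence) of the time."""
    ordered = sorted(returns)  # worst returns first
    index = int((1 - confidence) * len(ordered))
    return -ordered[index]

print(f"99% one-day VaR: {historical_var(returns):.2%} of portfolio value")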

 

If Philippon had used a market microstructure framework then he might have reached different conclusions about the paper’s aggregate data. Specific firms are able to leverage information technology, trading, and risk management to gain an edge over other market participants. They extract alpha (returns from active management above and beyond the market index return, or beta). This creates a ‘winner takes all’ dynamic in which a small group of efficient firms do exceedingly well. However, the Schumpeterian dynamics of inter-firm competition mean that factors like information technology do not simply lead over time to greater efficiencies and lower costs, as they did for Wal-Mart. Quantitative finance firms like Jim Simons’s Renaissance Technologies, Clifford Asness’s AQR Capital and David Shaw’s D.E. Shaw & Company spend millions on infrastructure and proprietary research to outpace their competitors. This creates ‘informativeness’ in the form of private knowledge that Philippon’s models probably could not measure. Is this really a misallocation of capital?
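
To pin down the alpha and beta mentioned above, here is a minimal sketch of estimating both by ordinary least squares from a fund’s returns and an index’s returns. The return series are invented for illustration, and the regression is the textbook single-factor model, not any particular firm’s proprietary method.

# Illustrative sketch: fund_i = alpha + beta * market_i + error.
# Both return series are made up for this example.

market = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, 0.0, 0.012]
fund   = [0.014, -0.018, 0.02, 0.009, -0.006, 0.027, 0.003, 0.016]

n = len(market)
mean_m = sum(market) / n
mean_f = sum(fund) / n

# OLS slope and intercept for a single regressor.
beta = (
    sum((m - mean_m) * (f - mean_f) for m, f in zip(market, fund))
    / sum((m - mean_m) ** 2 for m in market)
)
alpha = mean_f - beta * mean_m

print(f"beta  = {beta:.3f}  (co-movement with the index)")
print(f"alpha = {alpha:.4f} (per-period return beyond the index exposure)")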

 

Photo: apertu/Flickr.

Worth Reading

Strategist Edward Luttwak on Attila the Hun and the status of military historians in academe.

Barry Saunders on journalism in an age of data abundance.

The Kevin Rudd essay (PDF) that the geopolitics journal Foreign Affairs rejected.

How the collapse of Lehman Bros. created a global shockwave and the debate on its anniversary.

Why the StatArb hedge fund Renaissance Technologies red-flagged Bernie Madoff in 2003 and the SEC report (PDF).

Rewriting the political punditry of military historian Max Boot and neocon Paul Wolfowitz, who calls for a rethink on realist foreign policy, despite critics.

View some free university lectures on Academic Earth and YouTube Edu.