Worth Reading

The Wall Street Journal on the boom in software platforms for open source intelligence in finance, regulatory compliance and intelligence analysis, such as Palantir Technologies.

Search the Global Terrorism Database of the National Consortium for the Study of Terrorism and Responses to Terrorism (START) at the University of Maryland.

Oliver Stone returns to Wall Street with the sequel Money Never Sleeps.

How 9/11 conspiracy theories may have ended Obama’s appointment of ‘green’ expert Van Jones.

Security maven Bruce Schneier on Australian counterinsurgency expert David Kilcullen (with thanks to Barry Saunders).

Chronicle of Higher Education on Facebooking your way out of (academic) tenure.

We Are All Traders Now?

Mark Pesce pointed me to Bernard Lunn’s article which contends netizens now live in a real-time Web. Lunn suggests that journalists and traders are two models for information filtering in this environment, and that potential applications include real-time markets for digital goods, supply chain management and location-based service delivery.

Lunn’s analogy to journalists and traders has interested me for over a decade. In the mid-1990s I read the Australian theorist McKenzie Wark muse about CNN and how coverage of real-time events can reflexively affect the journalists who cover them. As the one-time editor of an Internet news site, I wrote an undergraduate essay reflecting on its editorial decision-making process. I then looked at case studies on analytic misperception during crisis diplomacy, intelligence, and policymaker decisions under uncertainty. For the past year, I’ve read and re-read work in behavioural finance, information markets and the sociology of traders: how financial media outlets create noise which serious traders do not pay attention to (here and here), what traders actually do (here, here, and perhaps here on the novice-to-journeyman transition), and the information strategies of hedge fund mavens such as George Soros, Victor Niederhoffer, David Einhorn, Paul Tudor Jones II and Barton Biggs. This body of research is not so much about financial trading systems as about the individual routines and strategies which journalists and traders have developed to cope with a real-time world. (Of course, technology can trump judgment, as in Wall Street’s current debate about high-frequency trading systems, which render many traders’ expertise and strategies redundant.)

Lunn raises an interesting analogy: how are journalists and financial traders potential models for living in a real-time world? He identifies some useful knowledge gaps: “. . . we also need to master the ability to deal with a lot of real-time information in a mode of relaxed concentration. In other words, we need to study how great traders work.” The sources cited above indicate how some ‘great traders work’, at least in terms of what they explicitly espouse as their routines. To this body of work, we can add research on human factors and decision environments such as critical infrastructure, disaster and emergency management, and high-stress jobs such as air traffic control.

Making the wrong decisions in a crisis or real-time environment can cost lives.

It would be helpful if Lunn and others who use this analogy were informed about what good journalists and financial traders actually do. As it stands, Lunn mixes his analogy with inferences and marketing copy that do not really convey the expertise he is trying to model. For instance, the traders above do not generally rely on Bloomberg or Reuters, which as information sources are more relevant to event-based arbitrage traders or technical analysts. (They might subscribe to Barron’s or the Wall Street Journal: although the information in these outlets is public knowledge, there is still an attention-decision premium compared to other outlets.) Some traders don’t ‘turn off’ when they leave the trading room (now actually an electronic communication network), which leaves their spouses and families to question why anyone would want to live in a 24-7 real-time world. Investigative journalists do not generally write their scoops on Twitter. ‘Traditional’ journalists invest significant human capital in sources and confidential relationships which also do not show up on Facebook or Twitter. This is ‘tacit’ knowledge, and these are routines for which a Web 2.0 platform or other technology solution will not be a silver bullet anytime soon.

You might feel that I’m missing Lunn’s point, and that’s fine. In a way, I’m using his article to raise some more general concerns about sell-side analysts who have a ‘long’ position on Web 2.0. But if you want to truly understand and model expertise such as that of journalists and financial traders, then a few strategies may prove helpful. Step out of the headspace of advocacy and predetermined solutions, particularly if your analogy relies on a knowledge domain or field of expertise which is not your own. Be more like an anthropologist than a Web 2.0 evangelist or consultant: understand (verstehen) and have empathy for the people and their expertise on its own terms, not what you may want to portray it as. Otherwise, you may miss the routines and practices which you are trying to model. And, rather than commentary informed by experiential insight, you may end up promoting some myths and hype cycles of your own.

Duelling Web 2.0 Scenarios: Boom/Bust

Has Tim O’Reilly’s Web 2.0 meme become a high-tech bubble about to burst?

Origins of the Web 2.0 Boom

O’Reilly’s vision of a new Web platform originally fused two developments.

The first development: C++, Smalltalk and other object-oriented programmers devised design patterns in the early 1990s to reuse software code and workaround solutions across projects.  A 1995 catalog catapulted its four authors to software engineering fame.  To capture the rapidly growing number of design patterns, programmer Ward Cunningham created the first wiki: the Portland Patterns Repository.
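To give a concrete sense of what such a catalogued pattern looks like, here is a minimal sketch of the Observer pattern in Python: a reusable structure for decoupling an event source from the parties interested in its events. The class and method names are my own illustration, not taken from the catalog’s C++ or Smalltalk listings.

```python
class Subject:
    """An event source that maintains a list of observers and notifies them."""

    def __init__(self):
        self._observers = []

    def attach(self, observer):
        """Register an observer to receive future events."""
        self._observers.append(observer)

    def notify(self, event):
        """Push an event to every registered observer."""
        for observer in self._observers:
            observer.update(event)


class LogObserver:
    """A concrete observer that simply records the events it receives."""

    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)


wiki_page = Subject()
log = LogObserver()
wiki_page.attach(log)
wiki_page.notify("page edited")
print(log.events)  # ['page edited']
```

The point of the pattern is that `Subject` never needs to know what its observers do with an event, so the same structure can be reused across unrelated projects.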

The second development: a re-evaluation of dotcom-era business models to encompass new technologies that enhanced the end-user experience, including the site interface and information architecture.  Industry buzz around News Corporation’s acquisition of MySpace (18th July 2005), Yahoo!’s purchases of Flickr (21st March 2005) and del.icio.us (9th December 2005), and Google’s stock-for-stock deal for YouTube (9th October 2006) made O’Reilly’s vision the ‘default’ vision for Web pundits and investors.

The media’s buzz cycle soon went into warp speed as Facebook frenzy replaced MySpace mania.  In a move that exemplified the pivotal role of complementors, O’Reilly & Associates morphed into the juggernaut O’Reilly Media.  Ajax and Ruby on Rails soon replaced Java and C# as the languages for new programmers to learn.  For activists in community-based media, angel investors backing scalable programming prototypes, and international conglomerates seeking to control their industry white spaces, Web 2.0 provided an all-encompassing pitch to venture capitalists on how they would change the world.

Two Scenarios: Web 2.0 Boom & Bust

For industry pundits, Google’s decision in October 2008 not to acquire Digg may signal that the Web 2.0 boom has become a bubble.  If true, Google’s decision could be the mirror of News Corporation’s and Yahoo!’s acquisitions in 2005.  Slate‘s Chris Anderson points to several factors: no tech IPOs in the second quarter of 2008, the cyclical nature of the digital consumer market, the exit of Yahoo! as a potential buyer due to internal problems, market noise due to low barriers to entry for startups, and a smaller “window of opportunity in which startups can think of a new neat trick, generate buzz, and cash out.”  YouTube co-founder Jawed Karim adamantly believes that Silicon Valley is in a bubble.

Twitter is the latest startup in the duelling scenarios of Web 2.0 boom versus bust. New York Times journalist Adam Lashinsky experiences a euphoria similar to that around Facebook and YouTube when he visits Twitter co-founder Jack Dorsey.  Sceptics counter that Facebook and YouTube have not ‘monetised’ their business models into profitable revenues.  Portfolio‘s Sam Gustin raises the ‘monetisation’ problem with Twitter co-founder Biz Stone, who believes that service reliability is a priority over the “distraction” of revenue pressures.  In support of Stone’s position, Anderson observes that cloud computing and open source software are lowering the operational costs and slowing the burn rates of startups.

Yet monetisation remains a primary concern for Sand Hill Road and other venture capitalists.  They differ in their decision-making criteria from Web 2.0 pundits and high-tech futurists: for angel investment and first-round VC funding, investors demand a solid management team, the execution ability to control an industry white space, and viable sources of future revenue growth.  This is the realm of financial ratios and mark-to-market valuation rather than the normative beliefs and ideals which probably influenced the acquiring firms’ decisions and valuation models in 2005-06.

Furthermore, if a Web 2.0 bust scenario is in play, the ‘contrarian’ sceptics will look to Charles Mackay, Charles P. Kindleberger, Joseph Stiglitz and other chroniclers of past bubbles, contagion and manias for guidance.  With different frames and time horizons, the Web 2.0 pundits, high-tech futurists and venture capitalists will continue to talk past each other, generating still more Twitter microblogging, blog posts and media coverage.

Several preliminary conclusions can be drawn from the Web 2.0 boom/bust debate.  In a powerful case of futures thinking, O’Reilly’s original Web 2.0 definition envisioned the conceptual frontier which enabled the social network or user-generated site of your choice to come into being.  The successful Web 2.0 startups in Silicon Valley have a distinctive strategy comparable to that of their dotcom-era counterparts in Los Angeles and New York’s Silicon Alley.  Web 2.0 advocates who justify their stance with MySpace, YouTube and del.icio.us are still vulnerable to hindsight and survivorship biases.  There’s a middle ground here: integrating the deep conceptual insights of high-tech futurists with the quantitative precision of valuation models.

It’s possible that the high-visibility Web 2.0 acquisitions in 2005-06 were due to a consolidation wave and strategic moves and counter-moves by their acquirers in a larger competitive game.  There are two precedents for this view.  Industry deregulation sparked a mergers and acquisitions boom in Europe’s telecommunications sector in the late 1990s comparable to the mid-1980s leveraged buyout wave in the United States.  Several factors, including pension fund managers, day trading culture and the 1999 repeal of the US Glass-Steagall Act, combined to accelerate the 1995-2000 dotcom bubble.  Thus, analysts who want to understand the boom/bust dynamics need to combine elements and factors from Web 2.0 pundits, high-tech futurists and venture capitalists.

If the Web 2.0 boom has become a bubble then all is not lost.  Future entrepreneurs can take their cue from Newsweek journalist Daniel Gross and his book Pop! Why Bubbles Are Great for the Economy (Collins, New York, 2007): the wreckage from near-future busts may become the foundation of future bubbles.  Web 3.0 debates are already in play and will soon be eclipsed by Ray Kurzweil‘s Transhumanist agenda for Web 23.0.