Duelling Web 2.0 Scenarios: Boom/Bust

Has Tim O’Reilly’s Web 2.0 meme become a high-tech bubble about to burst?

Origins of the Web 2.0 Boom

O’Reilly’s vision of a new Web platform originally fused two developments.

The first development: in the early 1990s, C++, Smalltalk and other object-oriented programmers devised design patterns to reuse proven code structures and workaround solutions across projects.  A 1995 catalogue, Design Patterns: Elements of Reusable Object-Oriented Software, catapulted its four authors to software engineering fame.  To capture the rapidly growing number of design patterns, programmer Ward Cunningham created the first wiki: the Portland Pattern Repository.

The second development: a re-evaluation of dotcom-era business models to encompass new technologies that enhanced the end-user experience, including the site interface and information architecture.  Industry buzz around News Corporation’s acquisition of MySpace (18th July 2005), Yahoo!’s purchases of Flickr (21st March 2005) and del.icio.us (9th December 2005), and Google’s stock-for-stock deal for YouTube (9th October 2006) made O’Reilly’s vision the ‘default’ one for Web pundits and investors.

The media’s buzz cycle soon went into warp speed as Facebook frenzy replaced MySpace mania.  In a move that exemplified the pivotal role of complementors, O’Reilly & Associates morphed into the juggernaut O’Reilly Media.  Ajax and Ruby on Rails soon replaced Java and C# as the technologies for new programmers to learn.  For activists in community-based media, angel investors backing scalable programming prototypes, and international conglomerates seeking to control their industry white spaces, Web 2.0 provided an all-encompassing answer to venture capitalists on how they would change the world.

Two Scenarios: Web 2.0 Boom & Bust

For industry pundits, Google’s decision in October 2008 not to acquire Digg may signal that the Web 2.0 boom has become a bubble.  If true, Google’s decision could be the mirror of News Corporation’s and Yahoo!’s acquisitions in 2005.  Slate’s Chris Anderson points to several factors: no tech IPOs in the second quarter of 2008, the cyclical nature of the digital consumer market, the exit of Yahoo! as a potential buyer due to internal problems, market noise due to low barriers to entry for startups, and a smaller “window of opportunity in which startups can think of a new neat trick, generate buzz, and cash out.”  YouTube co-founder Jawed Karim adamantly believes that Silicon Valley is in a bubble.

Twitter is the latest startup in the duelling scenarios of Web 2.0 boom versus bust.  New York Times journalist Adam Lashinsky experiences a euphoria similar to the early days of Facebook and YouTube when he visits Twitter co-founder Jack Dorsey.  Sceptics counter that Facebook and YouTube have not ‘monetised’ their business models into profitable revenues.  Portfolio’s Sam Gustin raises the ‘monetisation’ problem with Twitter co-founder Biz Stone, who believes that service reliability is a priority over the “distraction” of revenue pressures.  In support of Stone’s position, Anderson observes that cloud computing and open source software are lowering the operational costs and slowing the burn rates of startups.

Yet monetisation remains a primary concern for Sand Hill Road venture capitalists and other investors.  Their decision-making criteria differ from those of Web 2.0 pundits and high-tech futurists: for angel investment and first-round VC funding they demand from entrepreneurs a solid management team, the execution ability to control an industry white space, and viable sources of future revenue growth.  This is the realm of financial ratios and mark-to-market valuation rather than the normative beliefs and ideals that probably influenced the acquiring firms’ decisions and valuation models in 2005-06.

Furthermore, if a Web 2.0 bust scenario is in play, the ‘contrarian’ sceptics will look to Charles Mackay, Charles P. Kindleberger, Joseph Stiglitz and other chroniclers of past bubbles, contagion and manias for guidance.  With different frames and time horizons, the Web 2.0 pundits, high-tech futurists and venture capitalists will continue to talk past each other, creating still more Twitter microblogging, blog posts and media coverage.

Several preliminary conclusions can be drawn from the Web 2.0 boom/bust debate.  In a powerful case of futures thinking, O’Reilly’s original Web 2.0 definition envisioned the conceptual frontier that enabled the social network or user-generated site of your choice to come into being.  The successful Web 2.0 startups in Silicon Valley have a distinctive strategy comparable to their dotcom-era counterparts in Los Angeles and New York’s Silicon Alley.  Web 2.0 advocates who justify their stance with MySpace, YouTube and del.icio.us are still vulnerable to hindsight and survivorship biases.  There’s a middle ground here to integrate the deep conceptual insights of high-tech futurists with the quantitative precision of valuation models.

It’s possible that the high-visibility Web 2.0 acquisitions in 2005-06 were due to a consolidation wave and strategic moves and counter-moves by their acquirers in a larger competitive game.  There are two precedents for this view.  Industry deregulation sparked a mergers and acquisitions boom in Europe’s telecommunications sector in the late 1990s, comparable to the mid-1980s leveraged buyout wave in the United States.  Several factors, including pension fund managers, day-trading culture and the 1999 repeal of the US Glass-Steagall Act, combined to accelerate the 1995-2000 dotcom bubble.  Thus, analysts who want to understand the boom/bust dynamics need to combine elements and factors from Web 2.0 pundits, high-tech futurists and venture capitalists.

If the Web 2.0 boom has become a bubble then all is not lost.  Future entrepreneurs can take their cue from Newsweek journalist Daniel Gross and his book Pop! Why Bubbles Are Great for the Economy (Collins, New York, 2007): the wreckage from near-future busts may become the foundation of future bubbles.  Web 3.0 debates are already in play and will soon be eclipsed by Ray Kurzweil’s Transhumanist agenda for Web 23.0.

Ebook Textbooks & The Market for Lemons

The software consultant Ed Yourdon once warned US programmers in his book Decline and Fall of the American Programmer (1992) that they faced global hypercompetition.  This was a fashionable message in the turbulent early 1990s of industry deregulation, export tariffs, mega-mergers, downsizing and reengineering.  Spenglerian pessimism made Decline and Fall an IT bestseller as Eastern European and Russian computer programmers emerged as low-cost competition for their US counterparts.  Now, in Thomas Friedman’s vision of a flatter world, the Eastern European and Russian computer programmers have help from an unlikely source: electronic copies of IT textbooks.

Several barriers mean that US textbook publishers are cautious about embracing ebook versions.  Publishers fear the Napsterisation of ebooks on peer-to-peer networks.  There’s no standard ebook device, although Amazon’s Kindle is the latest candidate.  There’s no standard ebook format: most publishers use Adobe PDF; however, when Acrobat 8 was released, Adobe shifted its ebook functionality to a new reader application (Adobe Digital Editions) that did not necessarily read a user’s existing ebook collection.  And potential customers have no compelling reason in their utility function to favour ebooks over printed copies: publishers charge high prices for ebook versions that may deliver a higher contribution margin to profits but offer the customer little price differential compared with print counterparts.
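
The contribution margin point can be made with a back-of-the-envelope comparison (the figures below are purely hypothetical, chosen for illustration rather than taken from any publisher’s price list):

\[
\begin{aligned}
\text{Print copy: } & \$150 \text{ price} - \$40 \text{ printing and distribution} = \$110 \text{ contribution} \\
\text{Ebook copy: } & \$135 \text{ price} - \$5 \text{ hosting and DRM} = \$130 \text{ contribution}
\end{aligned}
\]

On these assumed numbers the ebook contributes more per copy sold, yet offers the buyer only a 10 per cent discount: exactly the weak customer incentive described above.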

The implementation of digital rights management (DRM) also leaves much to be desired: McGraw-Hill’s Primis uses a digital fingerprint tied to a hard drive, which voids an ebook even when it is reinstalled on a drive that had to be reformatted after a virus, whilst Cengage Learning (formerly Thomson Learning) uses a time-limited model that gives the user one semester’s access to an ebook at the full price of its exact print version.  Publishers are also slow to adjust cross-currency rates: Australian textbooks still cost $A120-$200 despite near parity between the Australian and US dollars.

Thus, it’s no surprise that ebook divisions remain small in multinational publishing conglomerates.  One exception is Harvard Business School Press, which appears to have ditched Sealed Media’s DRM plugin for Adobe Acrobat after Oracle acquired Sealed Media in August 2006 and then had integration problems with information rights management.

These barriers suggest a failure in market design, with analogies to George Akerlof’s study of the used car market in his influential paper The Market for Lemons (1970).  Publishers counter that, although the lack of ebook standards creates quality uncertainty akin to Akerlof’s used car market, the economics of publishing provide a disincentive to lower prices.  They claim high fixed costs in printing, photography rights and licensing fees for the case studies taken from Businessweek, Fortune and The Wall Street Journal.  Author fees and promotional budgets for professional associations add variable costs.  However, Australian academics have a disincentive to publish textbooks compared with their US colleagues, as Australia’s Department of Education, Employment & Workplace Relations does not provide recognition points for them.
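
Akerlof’s adverse-selection logic can be sketched in a single expression (the symbols below are introduced here for illustration and are not the paper’s own notation).  Suppose buyers cannot tell a well-produced, portable ebook worth \(v_H\) from a locked-down or crippled one worth \(v_L\), and believe a fraction \(q\) of titles on offer are good.  A risk-neutral buyer will then pay at most the expected value:

\[
p_{\max} = q\,v_H + (1 - q)\,v_L .
\]

If \(p_{\max}\) falls below what sellers of the good product need to cover their costs, they withdraw, \(q\) falls further, and only the ‘lemons’ remain on the market.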

To survive, US textbook publishers have turned to global market models with regional editions of popular texts (such as Asia-Pacific editions with local coauthors), and adopted the music industry’s business model of electronic and online content (similar to how record labels have released DualDisc, DVD and collectors’ editions of albums).  However, as Yourdon warned US programmers, this may not be a business model with long-term sustainability.  MIT’s OpenCourseWare, Apple’s iTunes U and Scribd all provide free content that mirrors the generic content in most textbooks, although some textbooks differentiate via a problem-based approach.

Yourdon’s ‘challenger’ computer programmers now also have illegal BitTorrent sites such as The Pirate Bay, file-hosting networks such as RapidShare, and ebook sites including Avaxsphere.com and PDFCHM to choose from.  The last two provide solutions to Akerlof’s challenge in market design: compared with Cengage Learning or McGraw-Hill, they offer an easier user interface, a broader (illegal) catalogue of ebook titles, and DRM-free files.  Even business strategists are getting in on the act, as Clayton Christensen, Curtis Johnson & Michael Horn explore in Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns (McGraw-Hill, New York, 2008).

One textbook coauthor came up with a unique solution to Akerlof’s dilemma in market design.  He and his Macroeconomics coauthors, Andrew Abel and Dean Croushore, opted for the mod cons from publisher Addison-Wesley: an online site and a one-semester ebook version as a bundle deal.  The textbook coauthor?

Federal Reserve Chairman Ben Bernanke.