I’ve started to use Twitter’s List feature on my main Twitter account (@alexburns). The lists include alumni from the classic era when I edited and wrote for the Disinformation website (1998-2008); people I met whilst in Swinburne University’s Strategic Foresight program (2002-04); and past collaborators and current projects. I’ll continue to add to current and new lists. Enjoy!
Should the Fed open a brokerage account? (Businessweek).
The Drezner-Bergsten interview on the international political economy (Foreign Policy).
The Australian dollar and the commodities boom: “it looks to be in the throes of burning itself out.” (Market Anthropology).
Australians evade 2007-09 global financial crisis (but not 2011-12 Eurozone debt crisis). (Bloomberg).
Twitter mischief plagues Mexico’s election (Technology Review).
Should the United States lift a ban on domestic propaganda? (Reason).
Romney’s Bain pioneered outsourcing (Washington Post).
Test, learn, adapt: policymaking and randomised scientific trials (UK Cabinet Office).
The growing, for-profit detention industry (Mother Jones).
A history of Wall Street market research (Minyanville).
The scam Wall Street brokers learned from the Mafia (Rolling Stone).
Over a year ago Ben Eltham and I did a conference paper on Twitter’s role in Iran’s 2009 election crisis. The paper proved too controversial for the conference’s refereed stream yet it has gone on to become our most widely read and cited paper.
Today, Paul Raymond posed some questions about Twitter and Wikileaks for a forthcoming article in Saudi Arabia’s magazine The Diplomat. Below are my email answers:
You express doubts that Twitter and other social network tools will “enable ordinary people to seize power from repressive regimes.” But what other political potentials do these networks have, in terms of broadening the public sphere for debate, mobilizing political networks, and helping to globalise civil society? What will be the results of these potentials for governments?
Twitter, Facebook, and other social networking tools certainly have the potential to broaden debate, mobilise political networks, and globalise civil society. Perhaps they are today’s equivalent of the Cold War’s Radio Free Europe or Voice of America broadcasts. They can mobilise autonomous, self-regulating networks of people around a salient issue, and allow government agencies like the US State Department to reach a wider audience. However, these same qualities also mean that social networks, particularly Twitter, can be used to spread rumour and propaganda. US neoconservatives recognised these qualities in 2000 during their discussions of what a ‘next generation’ capability might resemble. Twitter’s interest in Iran gradually faded after the weeks of political uncertainty, as it became clear that Ahmadinejad’s regime would remain in power. Our conclusions echo the late sociologist and political scientist Charles Tilly’s work on political violence and repressive regimes.
The US State Department implicitly recognised Twitter’s importance when it asked Twitter to delay server upgrades – or at least, officials wanted to know what would happen next in the political cyber-laboratory of Iran. What would be a proper response by western governments to the results, including the “unintended uses” different actors gained from the network?
A ‘proper response’ may depend on the specific government agency. Whilst the US State Department was interested in public diplomacy, other agencies may have different agendas or uses for the same data. The US Department of Defense may be interested in the danger that social networking sites could be used to spread adversary propaganda and disinformation to international public audiences. A US intelligence agency may be interested in the ‘contextual intelligence’ that may arise from diaspora networks, or alternatively, in how a high volume of ‘tweets’ or messages can become ‘noise’. We tried to explore how the same data could be used in different ways depending on the aims and objectives of the specific end-user.
The State Department reacted very differently to the recent phenomenon of Wikileaks. What would be a proper governmental response to that kind of use of the internet?
The likely response of the US Government will probably be to charge Wikileaks publisher Julian Assange under the relevant espionage and national security legislation for releasing diplomatic information. In the short-term this will also mean increased security and restricted access in the US Government on a ‘need to know’ basis to diplomatic cables. In the long-term, the US Government could work with specific media and scholarly groups — the American Political Science Association, the Society for Historians of American Foreign Relations, The New York Times, The Washington Post, or George Washington University’s National Security Archive — to release declassified versions of the diplomatic cables in a more controlled and possibly ‘redacted’ manner. However, this might also require changes to US freedom of information laws and declassification schedules. Marc Trachtenberg at the University of California, Los Angeles, is an expert in these declassification issues for historians and political scientists.
What general rules would you suggest governments apply to make best use of the public diplomacy potential of social networking?
Use social networking tools to openly inform the public, such as Saudi Arabia’s initiatives on combatting terrorist financing and its successful rehabilitation programs for ex-jihadists. Understand the limitations of social networking tools, such as their varied use by different groups, and how they can become disconnected from ‘on-the-ground’ events in crisis situations. Have mechanisms in place to identify, monitor and counter disinformation and propaganda that may propagate on such social networks. Integrate social networking tools into a ‘hearts and minds’ strategy that uses a variety of media.
Can government’s efforts to crack down on freedom of information (such as the Chinese attacks on Google and the worldwide campaign against Wikileaks) work in the long run, or has the playing field been permanently leveled, giving civil society and opposition groups the ability to challenge governments’ influence over media agendas and foreign publics’ perceptions?
Constructivist scholars like Alexander Wendt, Peter Katzenstein and Martha Finnemore note the growing power of civil society groups to shape public perceptions and influence media agendas. Sophisticated governments may even work closely with aid organisations like the International Federation of Red Cross and Red Crescent Societies during a crisis. It depends on the context and the nature of the information being publicly released. Google’s problems with China had been foreseeable since at least early 2006 because of how Google’s management handled earlier crises about the identities of Chinese human rights activists. The campaign against Wikileaks and its publisher Julian Assange appears to have arisen in part because the information was released in an ‘unredacted’ form and not through an establishment source like Thomas Friedman or Bob Woodward. The realist scholar Stephen M. Walt and others have pointed out the hypocrisy of this: Assange and Wikileaks are being attacked whilst mainstream media institutions like The New York Times are not. Perhaps the challenge also is that the information Wikileaks has published concerns recent and current events, not the usual 20-30 year gap of normal declassification procedures. The public’s demand for ‘real-time’ information and more transparency is an opportunity for governments and public diplomats, should they decide to seize it.
Burns, Alex & Eltham, Ben (2009). ‘Twitter Free Iran: An Evaluation of Twitter’s Role in Public Diplomacy and Information Operations in Iran’s 2009 Election Crisis’. In Papandrea, Franco & Armstrong, Mark (Eds.), Record of the Communications Policy & Research Forum 2009. Sydney: Network Insight Institute, pp. 298-310 [PDF pp. 322-334]. Presentation slides here.
Social media platforms such as Twitter pose new challenges for decision-makers in an international crisis. We examine Twitter’s role during Iran’s 2009 election crisis using a comparative analysis of Twitter investors, US State Department diplomats, citizen activists and Iranian protesters and paramilitary forces. We code for key events during the election’s aftermath from 12 June to 5 August 2009, and evaluate Twitter. Foreign policy, international political economy and historical sociology frameworks provide a deeper context of how Twitter was used by different users for defensive information operations and public diplomacy. Those who believe Twitter and other social network technologies will enable ordinary people to seize power from repressive regimes should consider the fate of Iran’s protesters, some of whom paid for their enthusiastic adoption of Twitter with their lives.
Burns, Alex & Saunders, Barry (2009). ‘Journalists as Investigators and ‘Quality Media’ Reputation’. In Papandrea, Franco & Armstrong, Mark (Eds.), Record of the Communications Policy & Research Forum 2009. Sydney: Network Insight Institute, pp. 281-297 [PDF pp. 305-321]. Presentation slides here.
The current ‘future of journalism’ debates focus on the crossover (or lack thereof) of mainstream journalism practices and citizen journalism, the ‘democratisation’ of journalism, and the ‘crisis in innovation’ around the ‘death of newspapers’. This paper analyses a cohort of 20 investigative journalists to understand their skill sets, training and practices, notably where higher order research skills are adapted from intelligence, forensic accounting, computer programming, and law enforcement. We identify areas where different levels of infrastructure and support are necessary within media institutions, and suggest how investigative journalism enhances the reputation of ‘quality media’ outlets.
Floyd, Josh, Burns, Alex and Ramos, Jose (2008). A Challenging Conversation on Integral Futures: Embodied Foresight & Trialogues. Journal of Futures Studies, 13(2), 69-86.
Practitioner reflection is vital for knowledge frameworks such as Ken Wilber’s Integral perspective. Richard Slaughter, Joseph Voros and others have combined Wilber’s perspective and Futures Studies to create Integral Futures as a new stance. This paper develops Embodied Foresight as a new approach to the development of new Integral Futures methodologies (or meta-methodologies) and practitioners, with a heightened sensitivity to ethics and specific, local contexts. Three practitioners conduct a ‘trialogue’ – a three-way deep dialogue – to discuss issues of theory generation, practitioner development, meta-methodologies, institutional limits, knowledge systems, and archetypal pathologies. Personal experiences within the Futures Studies and Integral communities, and in other initiatory and wisdom traditions
Personal Research Program
McKinsey to cut Conde Nast magazine budgets by 25%?
Editors take a red pen to Dan Brown and Sarah Palin.
Popmatters remembers ‘gonzo’ journalist Hunter S. Thompson.
Venezuela’s Dangerous Liaisons.
Mark Pesce pointed me to Bernard Lunn’s article which contends netizens now live in a real-time Web. Lunn suggests that journalists and traders are two models for information filtering in this environment, and that potential applications include real-time markets for digital goods, supply chain management and location-based service delivery.
Lunn’s analogy of journalists and traders has interested me for over a decade. In the mid-1990s I read the Australian theorist McKenzie Wark muse about CNN and how coverage of real-time events can reflexively affect the journalists who cover them. As the one-time editor of an Internet news site I wrote an undergraduate essay reflecting on its editorial decision process. I then looked at case studies on analytic misperception during crisis diplomacy, intelligence, and policymaker decisions under uncertainty. For the past year, I’ve read and re-read work in behavioural finance, information markets and the sociology of traders: how financial media outlets create noise which serious traders do not pay attention to (here and here), what traders actually do (here, here, and perhaps here on the novice-to-journeyman transition), and the information strategies of hedge fund mavens such as George Soros, Victor Niederhoffer, David Einhorn, Paul Tudor Jones II and Barton Biggs. This body of research is not so much about financial trading systems as it is about the individual routines and strategies which journalists and traders have developed to cope with a real-time world. (Of course, technology can trump judgment, such as Wall Street’s current debate about high-frequency trading systems which render many traders’ expertise and strategies redundant.)
Lunn raises an interesting analogy: how are journalists and financial traders potential models for living in a real-time world? He also identifies some useful knowledge gaps: “. . . we also need to master the ability to deal with a lot of real-time information in a mode of relaxed concentration. In other words, we need to study how great traders work.” The sources cited above indicate how some ‘great traders’ work, at least in terms of what they explicitly espouse as their routines. To this body of work, we can add research on human factors and decision environments such as critical infrastructure, disaster and emergency management, and high-stress jobs such as air traffic control.
Making the wrong decisions in a crisis or real-time environment can cost lives.
It would be helpful if Lunn and others who use this analogy were informed about what good journalists and financial traders actually do. As it stands, Lunn mixes his analogy with inferences and marketing copy that do not really convey the expertise he is trying to model. For instance, the traders above do not generally rely on Bloomberg or Reuters, which as information sources are more relevant to event-based arbitrage or technical analysts. (They might subscribe to Barron’s or the Wall Street Journal: although the information in these outlets is public knowledge, there is still an attention-decision premium compared to other outlets.) Some traders don’t ‘turn off’ when they leave the trading room (now actually an electronic communication network), which leaves their spouses and families to question why anyone would want to live in a 24-7 real-time world. Investigative journalists do not generally write their scoops on Twitter. ‘Traditional’ journalists invest significant human capital in sources and confidential relationships which also do not show up on Facebook or Twitter. These are ‘tacit’ knowledge and routines for which a Web 2.0 platform or another technology solution will not be a silver bullet anytime soon.
You might feel that I’m missing Lunn’s point, and that’s fine. In a way, I’m using his article to raise some more general concerns about sell-side analysts who have a ‘long’ position on Web 2.0. But if you want to truly understand and model expertise such as that of journalists and financial traders, then a few strategies may prove helpful. Step out of the headspace of advocacy and predetermined solutions — particularly if your analogy relies on a knowledge domain or field of expertise which is not your own. Be more like an anthropologist than a Web 2.0 evangelist or consultant: Understand (verstehen) and have empathy for the people and their expertise on its own terms, not what you may want to portray it as. Otherwise, you may miss the routines and practices which you are trying to model. And, rather than commentary informed by experiential insight, you may end up promoting some myths and hype cycles of your own.
Watching the hostility between ‘old media’ journalists and some Web 2.0 bloggers is often like watching Muzafer Sherif‘s Robbers Cave experiment. For bloggers, traditional journalists are constrained by objectivity, news values and institutional power, and traffic in biased op-ed columns and lightly rewritten corporate press releases. For journalists, bloggers don’t understand the norms and practices of the craft, don’t navigate the institutional shadow network, and vary greatly in the quality of their analytical insights. The two clashing stereotypes fuel a circular debate, which like Sherif’s experiment, may only change when a frame-changing exogenous threat is introduced.
For me, Quiggin makes three key points: (1) journalists have a socially recognised role to “pick up the phone” and talk to strangers; (2) journalists may select material from their interviews into a story and do not have to report everything; and (3) journalists have “a formal code of ethics and a set of informal conventions” to do this whilst bloggers do not. In doing so, I believe Quiggin adopts a middle-ground position similar to Terry Flew, Barry Saunders, Jason Wilson (from their YouDecide2007 project) and my own thoughts on citizen journalism, with some new insights.
My personal experience of Quiggin’s first point is that this role can empower journalists with the freedom to talk with anyone, and to view a situation through different, iterative stances. As I discovered during a 1994 student journalism stint and 1995 coverage of Noam Chomsky‘s Australian lecture tour, this is a great shock: in the right situation, people can tell you anything, and you can also become a participant-observer who is now inside the unfolding events. It’s a little like Jim Carrey‘s character in the romantic comedy film Yes Man (2008): you ‘forget’ your self-limitations and act beyond normal social conventions. As Quiggin observes, few people can ask questions of strangers and expect to get revelatory answers.
This approach reaches its zenith in New Journalism as a methodology and repertoire of practices in three ways. First, the journalist may create the “story” through a catalytic, influential effect on the external environment. Second, the journalist can become part of the “story” through capturing their subjective consciousness, and trying to capture a similar stance from the other participants through internal dialogue, scene reconstructions and other techniques. Third, the journalist has more freedom in convention, methodology and practice, such as using fiction techniques in a non-fiction profile. At its core, New Journalism fuses autoethnography, anthropology and acting, which are facets that a blog publishing system might not capture.
Quiggin’s second observation is a major flashpoint in the debate: how journalists hone a story and select the facts to report. Bloggers turn to critical media studies for many of their arguments: objective news values, op-ed columnists and other biased sources, institutional forces, and a conservative implementation of web publishing capabilities. In turn, journalists point to the chatter/noise factor in blogs: they may be alternatives to op-ed columnists and newswire press releases yet do not yet replace areas that are resource-heavy and have mature practices, notably investigative journalism. Bloggers counterargue they have more freedom to use nonlinear narrative styles and to publish the raw sources. Perhaps one of the lessons from YouDecide2007 and AssignmentZero was that the editorial decision process to hone and select material is more nuanced in practice than Twitter‘s role as a first responder in disasters and emergencies.
How do journalists navigate such decision processes? Journalists have discipline-based norms, practices and ethics as barometers. These act as checks and balances within newsroom culture, and their role only becomes clear in a go/no-go decision where the editor has to weigh up the competing interests of different stakeholders and the potential outcomes of publication. In contrast, many bloggers appear to be driven by normative-based anchors (Web 2.0 compared with institutional journalism) and commons-based advocacy (education, sustainability, future generations). But belief alone in noosphere politics and networks may not be enough to surmount the different manifestations of power. If bloggers want to influence the objective universe they can learn much from journalist ethics and strategic nonviolence.
During a stint as Disinformation‘s site editor I learnt to monitor how analysts and experts respond to significant events. Analysts and experts can situate the significant event in relation to a discipline or knowledge area. So, it’s a strategy in which the event and the expertise are wayfinders to help learn about the discipline, in a contextual, real-time way.
For the past five days I’ve looked at Change.gov: how President-Elect Obama uses open government principles and strategic communication to implement his transition prior to the Inauguration on 20th January 2009. It’s not all gone smoothly: ProPublica‘s Mike Webb and BoingBoing‘s Xeni Jardin note that some early information on Obama Administration policies was removed (Slate confirmed this occurred). The Obama campaign’s Twitter page may be dead as the President-Elect now opts for more traditional media outlets. Despite this, Change.gov is a very intriguing project that generates lots of commentary in media and policy circles.
As a real-time case study Change.gov may turn out to be a richer learning experience than an entire bookshelf of dotcom era books on change management projects, e-government transformation and e-policy ecosystems. Who will write the case study for Harvard Business School MBAs and Harvard Kennedy School policymakers? Will the Obama Administration license David Bowie‘s “Changes” as the site’s theme music?
A side-benefit of Change.gov is some really insightful media commentary about the games that new political appointees must play to thrive in the Beltway. Exhibit One: The New Republic‘s Noam Scheiber explains how Tim Geithner cultivates a keen political awareness for institutional buy-in and is a frontrunner for US Treasury Secretary. Geithner’s insights are useful for change agents or anyone who wants to navigate organisational politics.
Has Tim O’Reilly’s Web 2.0 meme become a high-tech bubble about to burst?
Origins of the Web 2.0 Boom
O’Reilly’s vision of a new Web platform originally fused two developments.
The first development: C++, Smalltalk and other object-oriented programmers devised design patterns in the early 1990s to reuse software code and workaround solutions across projects. A 1995 catalog catapulted its four authors to software engineering fame. To capture the rapidly growing number of design patterns, programmer Ward Cunningham created the first wiki: the Portland Patterns Repository.
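To make the design-patterns idea concrete, here is a minimal sketch of one of the 23 patterns the 1995 catalog documented: the Observer pattern, which decouples an event source from the objects reacting to it. The catalog’s examples were in C++ and Smalltalk; this Python rendering and its class names are illustrative only, not drawn from the original text.

```python
# Illustrative sketch of the Observer pattern (one of the patterns
# catalogued by the 'Gang of Four'). Class and method names here are
# generic examples, not taken from the original catalog.

class Subject:
    """Maintains a list of observers and broadcasts events to them."""
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        # The subject needs no knowledge of what each observer does.
        for observer in self._observers:
            observer.update(event)


class LoggingObserver:
    """A concrete observer that records every event it receives."""
    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)


# Usage: new observer types can be attached without changing Subject,
# which is exactly the kind of reusable workaround the catalog captured.
subject = Subject()
log = LoggingObserver()
subject.attach(log)
subject.notify("pattern added")
print(log.events)  # ['pattern added']
```

The value the catalog offered was not the code itself but the shared vocabulary: once a team can say “Observer”, the reusable structure travels between projects.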
The second development: a re-evaluation of dotcom era business models to encompass new technologies that enhanced the end-user experience, including the site interface and information architecture. Industry buzz around News Corporation’s acquisition of MySpace (18th July 2005), Yahoo!’s purchases of Flickr (21st March 2005) and del.icio.us (9th December 2005), and Google’s stock-for-stock deal for YouTube (9th October 2006) made O’Reilly’s vision the ‘default’ vision for Web pundits and investors.
The media’s buzz cycle soon went into warp speed as Facebook frenzy replaced MySpace mania. In a move that exemplified the pivotal role of complementors, O’Reilly & Associates morphed into the juggernaut O’Reilly Media. Ajax and Ruby on Rails soon replaced Java and C# as the languages for new programmers to learn. For activists in community-based media, angel investors backing scalable programming prototypes, and international conglomerates seeking to control their industry white-spaces, Web 2.0 provided an all-encompassing answer to venture capitalists on how they would change the world.
Two Scenarios: Web 2.0 Boom & Bust
For industry pundits, Google’s decision in October 2008 not to acquire Digg may signal that the Web 2.0 boom has become a bubble. If true, Google’s decision could be the mirror of News Corporation’s and Yahoo!’s acquisitions in 2005. Slate‘s Chris Anderson points to several factors: no tech IPOs in the second quarter of 2008, the cyclical nature of the digital consumer market, the exit of Yahoo! as a potential buyer due to internal problems, market noise due to low barriers of entry for startups, and a smaller “window of opportunity in which startups can think of a new neat trick, generate buzz, and cash out.” YouTube’s co-founder Jawed Karim adamantly believes that Silicon Valley is in a bubble.
Twitter is the latest startup in the duelling scenarios of Web 2.0 boom versus bust. New York Times journalist Adam Lashinsky experiences a euphoria similar to that which surrounded Facebook and YouTube when he visits Twitter’s co-founder Jack Dorsey. Sceptics counter that Facebook and YouTube have not ‘monetised’ their business models into profitable revenues. Portfolio‘s Sam Gustin raises the ‘monetisation’ problem with Twitter co-founder Biz Stone, who believes that service reliability is a priority over the “distraction” of revenue pressures. In support of Stone’s position, Anderson observes that cloud computing and open source software are lowering the operational costs and slowing the burn rates of startups.
Yet monetisation remains a primary concern for Sand Hill Road and other venture capitalists. Their decision-making criteria differ from those of Web 2.0 pundits and high-tech futurists: for angel investment and first-round VC funding, investors will demand a solid management team, the execution ability to control an industry whitespace, and viable sources of future revenue growth. This is the realm of financial ratios and mark-to-market valuation rather than the normative beliefs and ideals which probably influenced the acquiring firms’ decisions and valuation models in 2005-06.
Furthermore, if a Web 2.0 bust scenario is in play, the ‘contrarian’ sceptics will look to Charles Mackay, Charles P. Kindleberger, Joseph Stiglitz and other chroniclers of past bubbles, contagion and manias for guidance. With different frames and time horizons the Web 2.0 pundits, high-tech futurists and venture capitalists will continue to talk past each other, creating still more Twitter microblogging, blog posts and media coverage.
Several preliminary conclusions can be drawn from the Web 2.0 boom/bust debate. In a powerful case of futures thinking, O’Reilly’s original Web 2.0 definition envisioned the conceptual frontier which enabled the social network or user-generated site of your choice to come into being. The successful Web 2.0 startups in Silicon Valley have a distinctive strategy comparable to their dotcom era counterparts in Los Angeles and New York’s Silicon Alley. Web 2.0 advocates who justify their stance with MySpace, YouTube and del.icio.us are still vulnerable to hindsight and survivorship biases. There’s a middle ground here to integrate the deep conceptual insights of high-tech futurists with the quantitative precision of valuation models.
It’s possible that the high-visibility Web 2.0 acquisitions in 2005-06 were due to a consolidation wave and strategic moves/counter-moves by their acquirers in a larger competitive game. There are two precedents for this view. Industry deregulation sparked a mergers and acquisitions boom in Europe’s telecommunications sector in the late 1990s comparable to the mid-1980s leveraged buyout wave in the United States. Several factors including pension fund managers, day trading culture and the 1999 repeal of the US Glass-Steagall Act combined to accelerate the 1995-2000 dotcom bubble. Thus, analysts who want to understand the boom/bust dynamics need to combine elements and factors from Web 2.0 pundits, high tech futurists and venture capitalists.
If the Web 2.0 boom has become a bubble then all is not lost. Future entrepreneurs can take their cue from Newsweek journalist Daniel Gross and his book Pop! Why Bubbles Are Great for the Economy (Collins, New York, 2007): the wreckage from near-future busts may become the foundation of future bubbles. Web 3.0 debates are already in play and will soon be eclipsed by Ray Kurzweil‘s Transhumanist agenda for Web 23.0.