A Rejoinder to Bernard Keane’s ASIO Claims

During a 2006 Monash postgraduate class on intelligence analysis, our adviser made several observations on how the Australian Security Intelligence Organisation (ASIO) is misrepresented and misunderstood. The ‘S’ stood for domestic security, not secrecy. ASIO had an accountability and audit regime at multiple levels: legislative limits, the Treasury budget process, appeals processes, external audits and supply contract review, and reporting to the public and to bipartisan government committees. Australia’s intelligence resources were mostly deployed in military agencies for signals intelligence. Finally, media coverage of ASIO rarely evolves to the sophistication seen in the United States and the United Kingdom.

Bernard Keane’s Crikey article ‘The Answer is ASIO’ (24th February 2010) risks continuing this trend in media coverage of intelligence issues. Below, I illustrate how Keane’s own arguments, in his accusations of Labor’s “security propaganda,” can be read as resting on their own “deeply-flawed logic.”

Continue reading “A Rejoinder to Bernard Keane’s ASIO Claims”

Academic Publications 2009

Burns, Alex & Eltham, Ben (2009). ‘Twitter Free Iran: An Evaluation of Twitter’s Role in Public Diplomacy and Information Operations in Iran’s 2009 Election Crisis’. In Papandrea, Franco & Armstrong, Mark (Eds.), Record of the Communications Policy & Research Forum 2009. Sydney: Network Insight Institute, pp. 298-310 [PDF pp. 322-334]. Presentation slides here.

Social media platforms such as Twitter pose new challenges for
decision-makers in an international crisis. We examine Twitter’s role
during Iran’s 2009 election crisis using a comparative analysis of
Twitter investors, US State Department diplomats, citizen activists and
Iranian protesters and paramilitary forces. We code for key events
during the election’s aftermath from 12 June to 5 August 2009, and
evaluate Twitter. Foreign policy, international political economy and
historical sociology frameworks provide a deeper context of how Twitter
was used by different users for defensive information operations and
public diplomacy. Those who believe Twitter and other social network
technologies will enable ordinary people to seize power from repressive
regimes should consider the fate of Iran’s protesters, some of whom
paid for their enthusiastic adoption of Twitter with their lives.

Burns, Alex & Saunders, Barry (2009). ‘Journalists as Investigators and ‘Quality Media’ Reputation’. In Papandrea, Franco & Armstrong, Mark (Eds.), Record of the Communications Policy & Research Forum 2009. Sydney: Network Insight Institute, pp. 281-297 [PDF pp. 305-321]. Presentation slides here.

The current ‘future of journalism’ debates focus on the crossover (or
lack thereof) of mainstream journalism practices and citizen
journalism, the ‘democratisation’ of journalism, and the ‘crisis in
innovation’ around the ‘death of newspapers’. This paper analyses a
cohort of 20 investigative journalists to understand their skills sets,
training and practices, notably where higher order research skills are
adapted from intelligence, forensic accounting, computer programming,
and law enforcement. We identify areas where different levels of
infrastructure and support are necessary within media institutions, and
suggest how investigative journalism enhances the reputation of
‘quality media’ outlets.


A 2008 academic publication that made the Top 25 downloaded papers of the past year on Victoria University’s institutional repository:

Floyd, Josh, Burns, Alex & Ramos, Jose (2008). ‘A Challenging Conversation on Integral Futures: Embodied Foresight & Trialogues’. Journal of Futures Studies, 13(2), 69-86.

Practitioner reflection is vital for knowledge frameworks such as Ken
Wilber’s Integral perspective. Richard Slaughter, Joseph Voros and
others have combined Wilber’s perspective and Futures Studies to create
Integral Futures as a new stance. This paper develops Embodied
Foresight as a new approach about the development of new Integral
Futures methodologies (or meta-methodologies) and practitioners, with a
heightened sensitivity to ethics and specific, local contexts. Three
practitioners conduct a ‘trialogue’ – a three-way deep dialogue – to
discuss issues of theory generation, practitioner development,
meta-methodologies, institutional limits, knowledge systems, and
archetypal pathologies. Personal experiences within the Futures Studies
and Integral communities, and in other initiatory and wisdom traditions
are explored.

Fast & Fearful

Australia’s current affairs program 4 Corners ran a story this week on Internet hackers that has backfired.

4 Corners reporter Andrew Fowler contends in the report that cybercrime is on the rise, and may explode when Australia’s long-delayed National Broadband Network (NBN) launches sometime before December 21, 2012 or Skynet takes over the world’s computer networks. Fowler’s report is a mix of commentaries from victims of denial-of-service attacks and identity fraud; ethical hackers who are employed by companies to test their information systems security; vendors who provide virus protection software; and a joint investigation by the Australian Federal Police (AFP) and Victorian Police into a warez site for hackers.

Detective Superintendent Brian Hay of Queensland Police’s Fraud and Corporate Crime Squad sums up the report’s mood: “I expect to see at some stage in the future there will be a real debate on the future of the internet, should we turn it off?”

Over four years ago I looked at this area as part of a research team on internet futures; see the report’s section ‘Chaos Rules’. The experts the team interviewed had sometimes expressed similar thoughts to Hay. Despite the mention of the NBN, this was the same kind of report that could have been filmed in 2004 or 1999, as PBS Frontline did in 2003 in its Cyber War! report. The journalistic genre extends to the choice of edits, music and images to portray the vulnerability of the technologies.

Several other things struck me about Fowler’s 4 Corners report. Many of the sources had an interest in raising the threat levels of identity theft and denial-of-service attacks. The program’s case studies raised other potential sources — banks, customer service teams in financial intermediaries, and telecommunications infrastructure providers — which Fowler did not pursue. High-profile experts who might have a more informed and critical viewpoint, such as hacker Kevin Mitnick and security maven Bruce Schneier, were missing. Perhaps Fowler’s researchers did not have the leads or production budget. For me, the result was that whilst Fowler raised important issues about internet security, he also went for the low-hanging fruit with a clichéd editorial format.

Hackers retaliated and broke into AFP computers only 24 hours after Fowler’s report screened. The incident raises some further questions. Under what conditions is the short-term ‘publicity dividend’ of police cooperation in a journalist’s story worth the risk of a retaliatory tit-for-tat attack? To prevent unauthorised and external access, will police intelligence on the investigation (continue to) be kept on a secure computer with no online or network connections? Should a police team maintain a low-key, covert presence to monitor underground hacking sites, or instead alert site members as a deterrent? And, given this latest development, will Fowler’s team file a follow-up report?

We Are All Traders Now?

Mark Pesce pointed me to Bernard Lunn’s article which contends netizens now live in a real-time Web. Lunn suggests that journalists and traders are two models for information filtering in this environment, and that potential applications include real-time markets for digital goods, supply chain management and location-based service delivery.

Lunn’s analogy to journalists and traders has interested me for over a decade. In the mid-1990s I read the Australian theorist McKenzie Wark muse about CNN and how coverage of real-time events can reflexively affect the journalists who cover them. As the one-time editor of an Internet news site, I wrote an undergraduate essay reflecting on its editorial decision-making process. I then looked at case studies on analytic misperception during crisis diplomacy, intelligence, and policymaker decisions under uncertainty. For the past year, I’ve read and re-read work in behavioural finance, information markets and the sociology of traders: how financial media outlets create noise that serious traders ignore (here and here), what traders actually do (here, here, and perhaps here on the novice-to-journeyman transition), and the information strategies of hedge fund mavens such as George Soros, Victor Niederhoffer, David Einhorn, Paul Tudor Jones II and Barton Biggs. This body of research is not so much about financial trading systems as it is about the individual routines and strategies which journalists and traders have developed to cope with a real-time world. (Of course, technology can trump judgment, such as Wall Street’s current debate about high-frequency trade systems, which renders many traders’ expertise and strategies redundant.)

Lunn raises an interesting analogy: how are journalists and financial traders potential models for living in a real-time world? He identifies some useful knowledge gaps: “. . . we also need to master the ability to deal with a lot of real-time information in a mode of relaxed concentration. In other words, we need to study how great traders work.” The sources cited above indicate how some ‘great traders work’, at least in terms of what they explicitly espouse as their routines. To this body of work, we can add research on human factors and decision environments such as critical infrastructure, disaster and emergency management, and high-stress jobs such as air traffic control.

Making the wrong decisions in a crisis or real-time environment can cost lives.

It would be helpful if Lunn and others who use this analogy were informed about what good journalists and financial traders actually do. As it stands, Lunn mixes his analogy with inferences and marketing copy that do not really convey the expertise he is trying to model. For instance, the traders above do not generally rely on Bloomberg or Reuters, which as information sources are more relevant to event-based arbitrage or technical analysts. (They might subscribe to Barron’s or the Wall Street Journal: although the information in these outlets is public knowledge, there is still an attention-decision premium compared to other outlets.) Some traders don’t ‘turn off’ when they leave the trading room (now actually an electronic communication network), which leaves their spouses and families to question why anyone would want to live in a 24-7 real-time world. Investigative journalists do not generally write their scoops on Twitter. ‘Traditional’ journalists invest significant human capital in sources and confidential relationships which also do not show up on Facebook or Twitter. These are ‘tacit’ knowledge and routines for which a Web 2.0 platform or another technology solution will not be a silver bullet anytime soon.

You might feel that I’m missing Lunn’s point, and that’s fine. In a way, I’m using his article to raise some more general concerns about sell-side analysts who have a ‘long’ position on Web 2.0. But if you want to truly understand and model expertise such as that of journalists and financial traders, then a few strategies may prove helpful. Step out of the headspace of advocacy and predetermined solutions — particularly if your analogy relies on a knowledge domain or field of expertise which is not your own. Be more like an anthropologist than a Web 2.0 evangelist or consultant: understand (verstehen) and have empathy for the people and their expertise on their own terms, not as what you may want to portray them. Otherwise, you may miss the routines and practices which you are trying to model. And, rather than commentary informed by experiential insight, you may end up promoting some myths and hype cycles of your own.

Don Tapscott’s Transformation Agenda for Risk Management in Financial Institutions

Paul Roberts pointed me to this Don Tapscott video about how wiki-type collaborative knowledge might transform risk management in financial institutions. Tapscott draws on his coauthored book Wikinomics (2008) to pose the following points:

(1). Financial institutions need to share their intellectual property (IP) about risk management in a commons-based model similar to the Human Genome Project or Linux.

(2). The key IP is the algorithms and rating systems for risk.

(3). In response to an objection that the key IP should remain proprietary, Tapscott points to the failure of algorithms and rating systems to prevent the systemic risk of the global financial crisis.

(4). Tapscott appeals to financial institutions to act as peers — “a rising tide lifts all boats” — and that through sharing this information, they can compete more ethically in new markets, reinvent their industry, transform the practices in risk management, and act with a “new modus operandi.”

Tapscott is a persuasive business strategist who manages above to integrate his advocacy of “wikinomics” with the current debate on financial institutions, and his earlier, mid-1990s work on how technology would transform business. He echoes Umair Haque’s call for a Finance 2.0 based on transparency and social innovation in financial markets.

Here are thoughts, some ‘contrarian’, on each of Tapscott’s points.

(1). Read Burton Malkiel or the late Peter L. Bernstein and you will see that finance is driven to innovate new instruments, methodologies and institutions to hedge or arbitrage risk. Some of these are commons-based, such as the actuarial development of insurance. Some innovations are now blamed for the problem, such as RiskMetrics’ Value at Risk methodology. The Basel II Accord, which attempts to provide an international regulatory framework, raises an interesting question: under what conditions can a commons-based approach be successfully implemented in an institutional form and practices? Off-balance sheet items and special investment vehicles are two potential barriers to this goal. As for Tapscott’s examples, their success is due to a combination of public and private approaches, such as the parallel research by the National Institutes of Health’s Human Genome Project and Craig Venter’s Celera Corporation. This combination dynamic can be left out of an advocacy stance for a commons-based solution.

(2). Tapscott and Haque are correct to identify these as points of leverage. Some of the algorithms and rating systems are public information, such as Google Finance and Morningstar metrics, and trader algorithms on public sites. There are, however, several potential barriers to Tapscott and Haque’s commons-based view. Investors will have different risk appetites and decision/judgment frames despite access to the same public information. Philip Augar discloses in The Greed Merchants (Portfolio, New York, 2005) that proprietary algorithms rarely remain private knowledge within institutions unless the knowledge is kept tacit or, as in the case of ex-Goldman Sachs programmer Sergey Aleynikov, defended through lawsuits. Aleynikov’s expertise in high-frequency trading, which uses complex algorithms and co-located computer systems, highlights other barriers: access to technology, information arbitrage, learning curves, and market expertise. As Victor Niederhoffer once observed, this advantage renders large parts of the financial advice or investor seminar industry obsolete, or noise and propaganda at best. Finally, although public information may help investors, it may never completely replace risk arbitrage based on private information or market insight.

(3). Tapscott’s observation about the global financial crisis echoes Satyajit Das, Nassim Nicholas Taleb, Nouriel Roubini and others on the inability of institutions to deal with the systemic crises which the complex instruments and methodologies created. Some hedge fund managers, however, have been very successful despite the crisis. Other managers, notes Gillian Tett in her book Fool’s Gold (The Free Press, New York, 2009), helped create the financial instruments which led to the crisis, yet largely avoided its effects. So, a more interesting question might be: how did such managers avoid or limit the effects of the systemic crisis, and what decisions did they make?

(4). This is Tapscott as inspirational advocate for change. He echoes Haque on momentum and long-based strategies for investors. He also channels Adam Brandenburger and Barry Nalebuff’s game-theoretic model of cooperating to create new markets and then competing for value. This is unlikely to happen in competitive financial institutions. A project to develop a commons-based approach to financial risk management may, however, interest a professional organisation such as the CFA Institute (US), the Global Association of Risk Professionals (US) or the Financial Services Institute of Australasia. Will Tapscott lead an initiative to develop this?
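As an aside on point (1): parametric Value at Risk of the RiskMetrics variety is a simple calculation, which is partly why its assumptions matter so much. The sketch below is a minimal, hypothetical illustration (toy returns, invented for this post, not RiskMetrics’ actual implementation) of one-day VaR at roughly 95% confidence. It assumes normally distributed returns, the very assumption critics say broke down during the crisis.

```python
import statistics

def parametric_var(returns, portfolio_value, z=1.65):
    """One-day parametric (variance-covariance) VaR at ~95% confidence.

    Assumes returns are normally distributed; z=1.65 is approximately
    the one-tailed 95% quantile of the standard normal distribution.
    """
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    # Loss threshold: z standard deviations below the mean return
    return portfolio_value * (z * sigma - mu)

# Hypothetical daily returns for a $1m portfolio
daily_returns = [0.001, -0.004, 0.002, -0.001, 0.003, -0.002, 0.0005, -0.0015]
var_95 = parametric_var(daily_returns, 1_000_000)
```

A calm, low-volatility sample like this one understates tail risk; fat-tailed crisis returns are exactly what the normality assumption misses.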

Brian Eno’s ‘Scenius’ Keynote for Sydney’s Luminous Festival

My notes from Brian Eno’s ‘Scenius’ keynote talk at the Sydney Opera House on 29th May 2009, for the inaugural Luminous Festival as part of Vivid Sydney.

Eno goes out of his way to downplay his work and his public image; he also tape records every talk he does.

In response to a group of protesters outside who were angry about the Australian Government funding Eno’s trip, Eno explores various governance issues around government arts funding. He felt uncomfortable about receiving government funds. Noting the public influence of scientists such as Richard Dawkins, Stephen Jay Gould and Daniel C. Dennett, Eno states that one of the problems artists face is that they often do not make clear on their grant funding applications how the broader society would benefit from their work. A second problem is the deliberate mystification by artists of their craft and methodologies. Eno feels that artists need to detail their methodological approach with greater clarity.

Eno praises Charles Darwin‘s On The Origin of Species (1859) as a model of clarity which revolutionised our scientific worldview and challenged the prevailing theological interpretations of natural history.

He describes Western cultural history as the evolution and interplay of functional artifacts and aesthetic forms. Eno illustrates this by showing and talking about four different screwdrivers from the Sydney Opera House’s maintenance department — contrasting the functional ends with the different handles. He also mentions fashion and joke punchlines as examples of ornamentation and self-presentation.

‘Scenius’ is Eno’s term for a proxemic subculture which diffuses from an aesthetic response and evolves into a unique design space to solve complex social problems. Eno describes late 1960s San Francisco as a space that was less politicised than is now portrayed, and in his view was more about a group of people deciding to simply ‘live’ a different philosophy. He also describes the Manhattan Project in these terms, given that the scientists essentially solved the problem of nuclear fission through brute force. He then suggests that there were other scientific frontiers, potentially cold fusion, that were at a similar conceptual and epistemological stage as nuclear fission was in 1935. ‘Scenius’ also describes Eno’s role as curator/mentor in New York’s ‘No Wave’ scene in the late 1970s.

In contrast to these successful large-scale collaborations, Eno suggests the Santa Fe Institute has been a failure, as nothing really has emerged, and its researchers continue to work on their individual projects rather than collaborative research programs. Eno omits that Citicorp’s Walter Wriston tapped Santa Fe expertise so that its capital markets and trading division could develop complexity models of international and cross-border financial flows.

Eno thinks in terms of axes, continuums, and spectrums which are then layered in a possibility space (he uses an overhead projector to explore various axes and issues throughout his talk), and draws on ideas from product development and quasi-experimental methods of iterative, rapid prototyping.

He talks briefly about working with Danny Hillis to co-develop the algorithms and music for January 07003: The Bell Studies for The Clock of the Long Now (Opal, 2003), and the rationale and research design of the project for the Long Now Foundation.

Eno feels that climate change and the ‘limits to growth’ scenario means that artistic methods for problem-solving need to be diffused more widely, and that everyone needs to perceive themselves as having the abilities to contribute to solutions.

J.G. Ballard: The Personal Mythologist

Author’s note: Vale J.G. Ballard. This interview was originally published in REVelation magazine (Summer, 1994): 96-97. Archived links from Disinformation version (2000).


J.G. Ballard has a unique place in twentieth-century literature. An imaginative fiction writer and cult figure, he has lived a life often as nightmarish as the stories he writes. Born on November 15th, 1930 in Shanghai, Ballard went from a childhood in a house with nine servants to internment by the Japanese following the bombing of Pearl Harbor.

His experiences of surviving acute food shortages and dysentery formed the basis for the 1984 novel that brought Ballard widespread recognition – Empire of the Sun, later filmed by Steven Spielberg.

“As far as I was concerned, Empire of the Sun was a breakthrough book, but there have been people who have been generous to my material from the beginning,” Ballard says, explaining the difference in his earlier styles.

“The real problem is that imaginative fiction unsettles a lot of people who prefer naturalistic novels that reflect everyday life. Imaginative fiction has never been too popular, but that’s changing.”

“When magic realism came from South America, people realised that it creates a wonderful, imaginative world, particularly as TV does the everyday stuff better than novels do. After Empire of the Sun, I was dealing with a whole new audience.”

Continue reading “J.G. Ballard: The Personal Mythologist”

Spearheading Social Media Innovation

Congrats to QUT’s Axel Bruns, who now spearheads the Smart Services CRC’s Social Media program and is likely to become a Chief Investigator in the ARC’s Centre of Excellence for Creative Industries and Innovation. The significance of these appointments is that Bruns has the academic track record, as an internationally recognised expert, to make a strong research business case to government policymakers, grant-making agencies and institutions for large-scale social media-oriented research.

Bruns’ career illustrates how to navigate the academic research game, which has changed from conference papers and solo projects to team-based projects in competitive institutional contexts. Bruns co-founded the online academic journal M/C – A Journal of Media and Culture in 1998, which became an important open publishing journal in digital media studies and criticism. His PhD thesis cemented his academic credentials, and led to Bruns’ produsage theory of user-created content. This work has underpinned a publications record, collaborations such as Gatewatching with emerging scholars, and streams at the Association of Internet Researchers and Australian and New Zealand Communication Association conferences. Thus, in a relatively short time, Bruns has positioned himself as an internationally recognised scholar on digital and social media innovation.

The next generation of digital and social media researchers can learn from Bruns’ example and career-accelerating strategies.

Jacob Weisberg’s Possible Fallacies

Slate’s Jacob Weisberg recently surveyed a range of sociopolitical issues, from nuclear proliferation to the China Century, where the expert consensus might be wrong.

Weisberg’s survey sample includes macroeconomic aggregates (home ownership, asset investment classes, international competition), geostrategic stability (China, nuclear proliferation), and long-run environmental issues (climate change, fossil fuels). In each, Weisberg contrasts a prevailing view, hypothesis or expert with a challenger.

Below are some thoughts on Weisberg’s analyses, and observations on research methods in journalism.

· Selection and Framing of Experts: Weisberg mentions the late realist Samuel P. Huntington’s thesis on political order in changing societies, to raise concerns about China’s near-future macroeconomic growth. He also refers to neorealist Kenneth Waltz’s views that nuclear proliferation is inevitable. Huntington and Waltz both represent dominant traditions within international relations theory, and neither is as new or radical as Weisberg seems to portray. The selection and framing of experts is crucial: it would be even more interesting to compare their views with other schools of thought, such as liberal democratic, critical or constructivist theories, which have different deductive premises and levels of analysis. After all, neoconservative fears about Iraq were not just that nuclear weapons acquisition was a defensive action, but also the security orientation of ‘Axis of Evil’ regimes and their potential connections to non-state actors. Perhaps Weisberg could have checked with a nuclear proliferation specialist such as Graham Allison or Jessica Stern. Equally, a China specialist might convincingly show that Hu Jintao’s Chinese government is aware of Huntington’s thesis and has plotted a different future trajectory.

· Heretics & Mavericks: Weisberg cites Freeman Dyson as a heretic of climate change models, although Dyson’s scientific expertise is primarily as a physicist and cosmologist. Other mavericks such as the late Federal Bureau of Investigation counterterrorism expert John O’Neill and United Nations weapons inspector Scott Ritter were ostracised in organisational politics, whilst economist Nouriel Roubini strengthened his reputation by foreseeing the global financial crisis. Perhaps it’s also a matter of luck, timing, and having an effective image makeover.

· Incomplete Deductive Arguments: Weisberg observes that market analysts are re-evaluating house ownership, stock investments and the global competitiveness of car manufacturers. Yet the common assumptions that Weisberg mentions are really incomplete deductive arguments with hidden premises. First, home ownership does not necessarily lead to greater community involvement, and the negative factors mentioned (financial risk, labour market mobility, commute time) are weighted during the purchase decision or emerge later in decision regret. Second, evaluating and comparing the risk premia of bonds versus stocks requires further details on the time horizon, sampling frame, weightings and volatility. Shocks such as the 1973 OPEC oil crisis, the 1995-2000 dotcom bubble, the 1997 Asian currency crisis and the current global financial crisis may affect the comparison of bond and stock returns. Third, although the Detroit Three have cut costs and launched several international joint ventures, their debts and liabilities are partly the result of earlier decisions. Weisberg’s argument that these balance sheet issues are the Detroit Three’s main barrier is not really new: asset management and private equity firms have targeted them for over two decades in their acquisition, reengineering and turnaround attempts. Perhaps that’s why the Obama administration has hired media banker Stephen Rattner.

· Inferences from Small Samples: In his sections on long-run stocks, climate change and fossil fuels, Weisberg quotes from a single academic study. Whilst this establishes a challenger hypothesis, it also probably means that the sample is too small for inferences that would establish a definitive Kuhnian paradigm shift in a knowledge field. Weisberg would have a more robust argument if he referenced meta-analyses which evaluated a group of studies for their sample size and other effects.
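On the bond-versus-stock point above, the sensitivity of the comparison to the sampling frame is easy to demonstrate. The sketch below uses purely hypothetical annual returns (the numbers are invented for illustration, not real market data) to show how including or excluding a single shock year can flip the sign of the measured equity risk premium.

```python
# Hypothetical annual returns; the equity series includes one crisis year.
equity = [0.12, 0.10, 0.08, -0.38, 0.15]
bonds = [0.05, 0.04, 0.05, 0.06, 0.04]

def mean(xs):
    return sum(xs) / len(xs)

# Premium measured over the full window, shock year included
premium_full = mean(equity) - mean(bonds)

# Premium measured over a window that happens to exclude the shock
premium_pre_shock = mean(equity[:3]) - mean(bonds[:3])
```

With the shock year included, the measured equity premium is negative; with it excluded, the premium is comfortably positive. This is why Weisberg’s comparison needs the time horizon and sampling frame spelled out before any conclusion is drawn.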