11th April 2012: How Academia Kills Writing

I recently had some productive exchanges with Roy Christopher and Axel Bruns on academic writing strategies. Christopher wrote up his insights:

I am sympathetic to all of these conditions, but I have found it important to cultivate the ability to write at any time, in any circumstance — even if it’s just collecting thoughts about something. I keep a pen and paper in my pocket at all times, pen and pad by my bed, notebook(s) in my backpack and all over the house. I do find that I need large chunks of uninterrupted time to surmount larger writing tasks, but the ubiquity of computers, portable or otherwise, makes writing anywhere a much more viable option. [emphasis added]


Christopher’s insight led to an email exchange on the barriers that academia poses for writers. I think about this a lot in my current university gig as a developmental editor. I also work with a talented copy-editor. Here are six ways that academia kills writing:


1. Perverse incentive structures. Christopher and I are both intrinsically motivated writers who approach writing as a craft. We blog, write journal articles and in-progress PhD dissertations, and Christopher has several book projects. In contrast, some academics I know write only for performance-based incentives. They play games such as writing fake conference papers, sending book manuscripts to vanity publishers, and publishing in obscure international journals. This leads university research administrators to change the incentive structures. It also introduces scoping problems into competitive grants: the journal article(s) only get written if the money is awarded. It’s very rare that I find an intrinsically motivated writer: maybe an Early Career Researcher who has just finished their PhD, or a senior academic intent on making a contribution to their field or discipline. I wish academics had a more hip-hop or punk sensibility and just did the work, regardless of the institutional incentives.

2. Misuse of university research metrics. The Australian Research Council’s Excellence in Research for Australia (ERA) initiative shifted the research conversation to performance and quality-based outputs. This also led to games such as poaching academics who had ERA publishing track records. However, it sometimes led to a narrow focus on A* and A-level journals without changes to workload models or investment in training for academic skills and robust research designs. Not everyone is Group of 8, Harvard or Stanford material, or at least not at their career stage. Metrics use must be counter-balanced with an understanding of intellectual capital and development strategies. To date, the use of ERA and Field of Research metrics is relatively unsophisticated, and it can often devalue academic work and publishing track records.

3. A failure to understand and create the conditions for the creative process. The current academic debate about knowledge creation swings between two extremes: on one hand, budget-driven cost-cutting similar to GE’s Work-Out under Jack Welch or private equity turnarounds; on the other, a desire to return to a mythical Golden Age where academics are left alone with little accountability. Both views are value-destructive. The middle ground is to learn from Hollywood studios, music producers, and academic superstars about the creative process, and to create the conditions for it. This means allowing time for insights to emerge or for academics to become familiar with new areas. It means not relying on conferences and being proactive in forming collaborative networks. It means treating each academic publication as an event and leveraging it for maximum public impact and visibility. Counterintuitively, it can also mean setting limits, stage gates, and ‘no go’ or ‘abandon’ criteria (real options theory can be a useful tool). This is one reason why Christopher and I sometimes exchange stories about the strategies that artists use: to learn from them. This is a different mentality from that of some university administrators, who expect research publications to emerge out of nowhere (a view often related to the two barriers above).

4. Mystifying the blind peer review process. What differentiates academic research from other writing? Apart from the research design, many academics hold up the blind peer review process as a central difference. Usually, a competitive grant or a journal article goes to between two and five reviewers, who are often subject matter experts. The identities of the author(s) and the reviewers are kept secret from each other. Supposedly, this enhances the quality of the review process and the candour of the feedback provided. Having studied the feedback on 80 journal articles and 50 competitive grants, I disagree. The feedback quality is highly reviewer-dependent. The lack of transparency in blind peer review allows reviewers to engage in uber-critical reviews (without constructive or developmental feedback), disciplinary in-fighting, or screeds on what the reviewer wished had been written. Many academic journals have no rejoinder process for authors to respond. These are problems of secrecy, and they can be avoided through more open systems (a lesson from post-mortems on intelligence ‘failures’).

5. Being set up to fail through the competitive grants process. A greater emphasis on research output metrics has prioritised success in competitive grants. Promotions committees now look for a track record in external grants for Associate Professor and Professor roles. Australian universities do not often have endowed chairs or institutional investment portfolios, so they are more reliant on grant income. Collectively, these trends translate into more pressure on academics to apply for competitive grants. However, success is often a matter of paying close attention to the funding rules, carefully scoping the specific research project and budget, developing a collaborative team that can execute on the project, and having the necessary track record in place. These criteria are very similar to those which venture capitalists use to evaluate start-ups. Opportunity evaluation, timing, and preparatory work are essential. Not meeting these criteria means the application will probably fail and the grant-writing time may be wasted: most competitive grants have a 10-20% success rate (see the sketch after this list for the expected-value arithmetic). Some universities have internal grant schemes that enable new academics to experience these dynamics before applying to an external agency. In all cases, the competitive grant operates as a career screening mechanism. For institutions, these grants are ‘rain-making’ activities: the money flows to the institution rather than to the individual academic.

6. A narrow focus on A* and A-level journals at the expense of all other forms of academic writing. The ARC’s ERA and similar schemes prioritise peer-reviewed journals over other forms of writing. (This devalued large parts of my 18-year publishing history.) The 2009 and 2010 versions of ERA had a journal ranking list which led many university administrators I know to focus on A* and A-level journals. I liked the journal ranking list, but I also saw it had some perverse effects over its 18 months of use. It led to on-the-fly decisions based on the cumulative metrics of a publishing track record. It destroyed some of the ‘tacit’ knowledge that academics had about how and why to publish in particular journals. It devalued B-ranked journals that are often sub-discipline leaders. It helped to create two groups of academics: those with the skills and training to publish in A* and A-level journals, and those without them. It led to unrealistic expectations of what was needed to get into an A* journal like MIT’s International Security: a failure to understand creative and publishing processes. The narrow emphasis on journals ignored academic book publishers, CRC reports, academic blogs, media coverage, and other research outputs. Good writers, editors and publishers know differently: a high-impact publication can emerge from the unlikeliest of places. As of April 2012, my most internationally cited research output is a 2009 conference paper, rejected from the peer review stream due to controversy, that I co-wrote with Ben Eltham on Twitter and Iran’s 2009 election crisis. It would be excluded from the above criteria, although Eltham and I have since written several articles for the A-level journal Media International Australia.
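
To make the arithmetic in point 5 concrete, here is a minimal Python sketch of the expected value of a single grant application. The function and all of the figures (grant size, writing time, weekly cost) are hypothetical illustrations rather than data from any real scheme; the point is simply that at a 10-20% success rate, the expected return on grant-writing time can easily turn negative.

```python
# Toy expected-value calculation for point 5 (hypothetical figures only).
# At a 10-20% success rate, the writing time sunk into failed applications
# can outweigh the expected grant income.

def expected_net_value(success_rate, grant_value, writing_weeks, weekly_cost):
    """Expected net value of one grant application, in dollars."""
    writing_cost = writing_weeks * weekly_cost  # spent whether or not the bid succeeds
    return success_rate * grant_value - writing_cost

# Hypothetical example: a $300,000 grant, eight weeks of writing at $5,000/week.
for rate in (0.10, 0.15, 0.20):
    value = expected_net_value(rate, 300_000, 8, 5_000)
    print(f"success rate {rate:.0%}: expected net value {value:,.0f} dollars")
```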


Awareness of these six barriers is essential to academic success and to not becoming co-dependent on your institution.

16th February 2012: Academic Blogging

[Photo: ‘fred and academic blogging’]

The Lowy Institute’s Sam Roggeveen contends that Australian academics would benefit from blogging their research (in response to The Australian‘s Stephen Matchett on public policy academics).


I see this debate from several perspectives. In a former life I edited the US-based alternative news site Disinformation (see the 1998-2002 archives). I also work at Victoria University as a research administrator. I’ve blogged in various forums since 2003 (such as an old LiveJournal blog). In contrast, my PhD committee in Monash’s School of Political and Social Inquiry are more likely to talk about book projects, journal articles, and media interviews.


As Roggeveen notes, a major uptake barrier is the structure of institutional research incentives. The Australian Research Council’s Excellence in Research for Australia (ERA) initiative emphasises blind peer-reviewed journal articles over other forms. Online blogging is not included as an assessable category of research outputs, although it might fit under ‘original creative works’. Nor is blogging included in a university’s annual Higher Education Research Data Collection (HERDC) outputs. University incentives for research closely follow ERA and HERDC guidelines. The ARC’s approach is conservative (in my view) and focuses on bibliometrics.

I know very few academics who blog. Many academics are not ‘intrinsic’ writers and are unused to dealing with developmental editors and journals. University websites often do not have blog publishing systems, and I’ve seen several failed attempts to introduce them. Younger academics who might blog or who do use social media are often on casual or short-term contracts. The ones who do blog, like Ben Eltham, have a journalism background, are policy-focused, and are self-branded academic entrepreneurs.

Roggeveen is correct that blogging can potentially benefit academics, if approached in a mindful way. I met people like Richard Metzger and Howard Bloom during my publishing stint. I am regularly confused with QUT social media maven Axel Bruns, and we can now easily clarify potential queries. Blogging has helped me to keep abreast of sub-field developments, to build networks, and to draft ideas for potential journal articles and my PhD on strategic culture; it has also influenced the academic citations of my work and downloads from institutional repositories.

The problem is that HERDC and ERA have no scope for soft measures or ‘tacit’ knowledge creation, so blogging won’t count for many universities.

That Roggeveen needs to make this point at all highlights how much the internet has shifted from its original purpose to become an online marketing environment. Tim Berners-Lee’s proposal HyperText and CERN (1989) envisioned a hypertext system for collaborative academic research. The internet I first encountered in 1993-94 had Gopher and alt.* newsgroups, and later, web-pages by individual academics. A regularly visited example for my PhD research was the University of Notre Dame political scientist Michael C. Desch and his collection of easily accessible publications. It’s a long way from that free environment to today’s “unlocking academic expertise” with The Conversation.

Photo: davidsilver/Flickr.

23rd November 2011: Google Scholar Personal Profiles

Google Scholar has announced open citations and personal profiles.

The service is popular with academics for citation analysis and publication track records. Google Scholar’s data collection is messy: it trawls the internet and gathers citations from a range of websites and sources. It does not yet have the rigour of Elsevier’s Scopus database, for example. However, it is likely to overtake such proprietary services in reach, due to Google’s accessibility and popularity.

My Google Scholar profile is here. For now, it is a highly selective collection — academic journal articles and conference papers, some postgraduate and undergraduate essays, and old Disinformation dossiers (see archives). I was surprised that some long-forgotten articles had been internationally cited. I have a more complete publications profile which gets updated as new academic research is published (PDF).

Several past collaborators — Axel Bruns, Ben Eltham & Jose Ramos — have their own profiles, and you should check out their personal research programs.

Spearheading Social Media Innovation

Congrats to QUT’s Axel Bruns, who now spearheads the Smart Services CRC’s Social Media program and is likely to become a Chief Investigator in the ARC’s Centre of Excellence for Creative Industries and Innovation. The significance of these appointments is that Bruns has the academic track record, as an internationally recognised expert, to make a strong research business case for large-scale social media research to government policymakers, grant-making agencies, and institutions.

Bruns’ career illustrates how to navigate the academic research game: it has changed from conference papers and solo projects to team-based projects in competitive institutional contexts. Bruns co-founded the online academic journal M/C Media & Culture in 1998, which became an important open publishing journal in digital media studies and criticism. His PhD thesis cemented his academic credentials, and led to Bruns’ produsage theory of user-created content. This work has underpinned a publications record, collaborations such as Gatewatching with emerging scholars, and streams at the Association of Internet Researchers and Australian and New Zealand Communication Association conferences. Thus, in a relatively short time, Bruns has positioned himself as an internationally recognised scholar on digital and social media innovation.

The next generation of digital and social media researchers can learn from Bruns’ example and career-accelerating strategies.