By MARK PEARSON
The Australian Government conducted the first round of its Excellence in Research for Australia (ERA) initiative in 2010, and the next round is scheduled for 2012.
The process has caused considerable angst across many disciplines, including both journalism and law. (Legal academics are especially concerned that important research done in the form of textbooks does not get recognition under the system.)
The latest edition of the Australian Journalism Review (vol. 33, no. 1, July 2011) features commentary from 11 academics on the implications of the ERA for journalism researchers. My contribution is reproduced below for the benefit of my blog followers.
Other contributors to the section include Graeme Turner, Michael Meadows, Anne Dunn, Alan Knight, Terry Flew, David Nolan & Libby Lester, Martin Conboy and Grant Hannis. They present a range of views, with a common theme that journalism researchers will need to be much more strategic in their approach to research if they hope to gain recognition from government and colleagues in other fields.
—————–
A poor fit for journalism research
By MARK PEARSON
Australian Journalism Review (vol. 33, no. 1, July 2011, pp. 25-27)
Excellence is a worthy goal. It is testimony to human progress that every generation has been advanced by a few individuals who have settled for nothing less. Often, however, those very same individuals have been derided by their peers or have failed the institutional tests of their day only to have their work acknowledged by future scholars. It took the research establishment four decades to recognise the pioneering genetic discoveries of 1983 Nobel laureate Barbara McClintock. Her work in the 1940s was so poorly regarded by her contemporaries that she was forced to publish most of her findings in the annual reports of her laboratory (Cherfas and Connor, 1983, p. 78).
The danger of the Excellence in Research for Australia (ERA) initiative is that key elements of its design are already disadvantaging disciplines, institutions and individual researchers whose work does not fit neatly within politically and bureaucratically defined research ‘quality’. There may be no Barbara McClintocks among us, but journalism educators are already suffering the consequences of a flawed system that fits poorly with research outputs often aimed at relevance to professional practice in this country.
One of the key problems is that size does matter in this ERA process. Larger research units are rewarded because they can generate a critical mass of outputs, carrying them so far over the minimum submission threshold that they are able to nominate their very best work for review. This left journalism as a discipline without a submission under the 1903 code at most institutions and, at some, a submission where the bulk of the outputs were the work of just one or two researchers and therefore lacked depth and variety. While the Australian Research Council claims it is not assessing individual departments or programs, that is a fantasy when journalism has a distinct code and is readily identifiable within a particular institution. When that program has only one or two key researchers, it is those individuals who get to bask in the glory or wear the shame of the field’s ranking at that university. The recent move to allow multidisciplinary outputs to be assigned to a preferred discipline might help some of us reach the newly raised thresholds, but it will not resolve this bias against smaller institutions.
Size also matters in other ways. As a relatively small discipline, journalism is likely to earn fewer seats at the table in the ERA’s secret evaluation process. According to CEO Margaret Sheil’s foreword to the 2010 report (ARC, 2011), there were only 149 members of the research evaluation committees assessing 157 disciplines, meaning journalism could at best have had only one or two representatives. She stated that about 500 peer reviewers from throughout the world were used across all disciplines, with no indication of how many were used for journalism. This added even further to the essentially qualitative nature of the process, leaving key decisions about submissions in the hands of a very few peers charged with judging them against ‘world standard’. Others have written on the murky terrain of ‘world standard’ and have questioned what cultural baggage reviewers might have brought to that task (Trounson, 2011). Were the reviewers really comparing our journalism research with that produced in New York, Cardiff, Zambia, and Ecuador? Without any agreed definition of world standard, or evidence of its standardised application in the ERA methodology, we have no choice but to conclude that the very process that has assessed our research would itself be ranked ‘well below world standard’ by any competent academic researcher.
This leads to aspects of the ERA of particular concern to journalists: a lack of transparency, a process of self-declared conflicts of interest, and an abundance of spin. Secrecy and confidentiality abound in the selection of committee members and reviewers. They were gagged by confidentiality agreements, and the conflict of interest provisions published by the ARC relied on the reviewers themselves to identify any conflicts. Secrecy provisions prevented external scrutiny (ARC, 2010, pp. 23-24). Spin has already emanated from the ARC and university marketing offices in the wake of the first ERA round. A simple Google search for ‘ERA world standard’ shows several universities boasting about their results.
Institutions, and the researchers in the disciplines within them, deserve feedback on their submissions so that they might aim to improve their performance next time. In journalism submissions, there are too many variables at play for institutions to know what has earned them the ranking they have been given. At the very least, they should be advised on the respective contributions of their creative works and their traditional outputs to their ranking in the discipline. Sadly, far more is expected of these institutions in the feedback they must give their students than the ARC has provided to researchers.
The grandfathering of the collection process added to the injustice. Researchers’ work was judged on a six-year retrospective basis, but we knew the rules of the process and the respective journal rankings only for the year prior. Those fortunate enough to have been producing work meeting the new ‘quality’ criteria, and publishing in journals ranked A and A*, did well, while the others had only a year or so to start making adjustments to their research careers. (The shift from journal rankings to a ‘journal quality profile’, announced on May 30, stands to relieve this pressure only if what constitutes ‘quality’ in such a profile is clearly articulated, but researchers will still have to live with their past ‘mistakes’ if their outputs have not met the newly defined standard.) Again, the relative smallness of our discipline works against us here. Many journalism educators were happy to have their research about Australian journalism education published in the three or four academic journals catering to this market in the region. Whole CVs changed when the main outlet, Australian Journalism Review, was downgraded to a B in the journal rankings after the 2009 trial. Senior researchers who had published there over the six-year collection period saw their research careers eroded in a single, wholly retrospective regrading. The same stands to happen now that rankings have been replaced by this quality profile, and again whenever any further refinement is introduced. Until the grandfathering is changed, it will always take us five or six years to adjust to the latest tinkering with the criteria.
All that said, most who have entered journalism education from industry did not have academic research as their primary goal. While many have grown to enjoy it and excel at it, their original motivation was to share their experience with a new generation of journalists in the tertiary classroom. Many have found ways to embrace research while still retaining an applied focus, often by writing textbooks or by researching topics of applied value to Australian journalists and newsrooms. Much of the research we have done has been about Australian journalism practice and has often had implications for the classroom or the newsroom. Often it has related to the rapid technological and economic changes affecting journalism. Its intended audience has been Australian journalism students, Australian journalism education colleagues and the Australian journalism industry, so we have looked to an Australian journal as the most logical outlet, and preferably one where the work could gain traction as soon as possible. Under the rules of the last round, if we sought high academic regard we had to aim for esteemed international journals with long review and publication lead-times, rendering much of our work dated before it was published, if it was not considered too parochial on the international stage to be published at all. We have yet to learn whether the new ‘quality profile’ will carry a different imperative.
The ranking process was already feeding promotions and appointments at universities, with committees scanning vitae for A* journal publications. No doubt it has already changed the research emphasis of journalism academics, and the jury is out on whether the new quality profile will reverse this. While the ERA process allows for the submission of creative works as a form of research, it is likely that career advancement will be limited to those pursuing research of a more timeless and esoteric nature in highly regarded international journals. It is not a model that seems to accommodate what many of us see as our mission: producing quality research with an applied local focus. There will be ways to adapt, of course, and journalism educators have proven themselves excellent chameleons. However, if we were asked to invent a system for assessing the quality of our work, I am sure it would be much more reflective of the mission of applied journalism education and more inclusive of the needs of our students and professional colleagues.
Author
Mark Pearson is professor of journalism at Bond University, Gold Coast, Queensland, Australia.
References
Australian Research Council (2011). Excellence in Research for Australia 2010 National Report. Australian Government, Canberra. Retrieved from http://www.arc.gov.au/pdf/ERA_report.pdf
Australian Research Council (2010). Excellence in Research for Australia. ERA 2010 Evaluation Guidelines. Australian Government, Canberra. Retrieved from http://www.arc.gov.au/pdf/ERA2010_eval_guide.pdf
Cherfas, Jeremy, & Connor, Steve (1983, October 13). How restless DNA was tamed. New Scientist, 100(1379), 78-79.
Trounson, Andrew (2011, February 9). More than mates’ ratings: ARC chief Margaret Sheil. The Australian. Retrieved from http://www.theaustralian.com.au/higher-education/more-than-mates-ratings-arc-chief-margaret-sheil/story-e6frgcjx-1226002411524