ERA journal rankings are dead – hurrah, hurrah!

By Legal Eagle

I haven’t heard of an academic yet who is sad about the end of the ERA journal rankings, announced yesterday by Minister Kim Carr. Carr said in his press release:

There is clear and consistent evidence that the [ERA journal] rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes, and based on a poor understanding of the actual role of the rankings. One common example was the setting of targets for publication in A and A* journals by institutional research managers.

In light of these two factors – that ERA could work perfectly well without the rankings, and that their existence was focussing ill-informed, undesirable behaviour in the management of research – I have made the decision to remove the rankings, based on the ARC’s expert advice.

The journals lists will still be of great utility and importance, but the removal of the ranks and the provision of the publication profile will ensure they will be used descriptively rather than prescriptively.

These reforms will strengthen the role of the ERA Research Evaluation Committee (REC) members in using their own, discipline-specific expertise to make judgments about the journal publication patterns for each unit of evaluation.

Very fortunately, my law school had not insisted on certain levels of submissions to A* and A journals. Those kinds of short-sighted policies disadvantage those who work in specialist areas where there are few journals, or those who primarily publish book chapters.

Anecdotal evidence from a publisher I spoke to at the end of last year indicated that quality submissions to a number of their more practically oriented specialist law journals had fallen massively after those journals were not ranked in the A or A* category. These are actually good journals, and I suspect practitioners are far more likely to read articles in them than to read some of the more esoteric academic journals which gained an A*. The net effect was a disincentive for people to publish in journals which might provide a point of communication between academia and practitioners. What is more, a low ranking became a self-fulfilling prophecy, precisely because it deterred future quality submissions.

At base, I am a deeply practical academic. Although I like the theoretical side of private law too, if my work doesn’t suggest ways of giving litigants a better chance of practical justice, and if it isn’t something which someone in practice might find at least a little useful, I don’t feel that it’s very worthwhile.

Still, I concur with the observations from Restless Capital on this (which also references Joshua Gans’ blog):

With all that said, I’m actually ambivalent about the end of the rankings. I want to know more detail about what’s replacing it, for a start.

Further, I find today’s post by Joshua Gans pretty compelling. You should go and read the whole thing, but this resonated with me:

But we should be more angry about this. Many academics’ comments on hearing about the demise of the ERA is good riddance. Why? Because they bore the costs of fighting about the measure and then the gaming. But those costs have been borne. I personally bore a ton of them and so did so many others. A complete waste of time.

And for what? Nothing. Just to prove to the Government what we all could have predicted four years ago!

Another squibbed reform from Labor? Gans seems to think so.

One thing’s for sure. There will be a lot of scholars with papers stuck in three-year queues at what were once A* journals who might have cause for ambivalence, too.

I’m really glad (and somewhat surprised) that the government listened to feedback and pulled the pin on this when it became clear that it was producing perverse outcomes. But I hope that they don’t replace the scheme with a different perverse scheme. Just leave well enough alone, guys!

(Via Mark Bahnisch and Catallaxy – when there’s unanimous rejoicing across the blogosphere, you know it has to be a good idea!)

11 Comments

  1. Posted May 31, 2011 at 12:30 pm | Permalink

    Hmmm…. I find it a little extraordinary that Kim Carr is surprised by research institutions tailoring their research to do the best they can on the metric which determines their funding. D’oh.

    We saw some of the shenanigans that went on over journal rankings. I didn’t follow it in detail, but my partner found that in his discipline, there were some very odd outcomes, most of which related to the journals published in by the people reviewing the ranking exercise outcomes.

  2. derrida derider
    Posted May 31, 2011 at 1:24 pm | Permalink

    The ERA list seriously disadvantaged researchers in all highly specialised fields. I know all the labour economists were complaining mightily about the way it actively prevented applied research on Australian labour markets, in favour of studies of US ones that could get into the JHR or JOLE.

    I suspect it was the consequent complaints from elsewhere in the bureaucracy about how the policy evidence base was being eroded, rather than complaints from the academics, that did the trick.

  3. Jeremy Gans
    Posted May 31, 2011 at 1:35 pm | Permalink

    Eh. Joshua’s right of course. (I would say that.) But at least the rankings were transparent. Now we face some sort of ranking based on a journal’s ‘frequency of publication’. What does that mean? I can’t think of anything remotely rational that it even could mean.

    Yes, the A*/A system worked badly in a lot of disciplines or sub-disciplines. But I’m not convinced that the alternative – guesswork about which journal is the ‘best’ one – is anything to celebrate.

    The best thing we can hope for is that the entire concept of academic journals tips over and dies because of the redundancy of paper-based publications and old-fashioned peer review. I’d rather publish directly on my own webpage/blog and make my own case that what I’ve said is making a splash. But it looks like it’s back to the old games of choosing between being respected and being read.

  4. Mr Bee
    Posted May 31, 2011 at 5:57 pm | Permalink

    “I dunno what the frequency of publication measure is going to achieve.”

    It helps the evaluators to determine whether you keep the right company, and so are worthy of joining their club.

    Journal lists are everywhere, from Belgium to Melbourne. They have a life of their own, for sound reasons.

    But the ERA list in particular failed because it was compiled behind closed doors after a sham consultation process. So now they have thrown it out in favour of a new measure … that was compiled behind closed doors after a sham consultation process.

    It will all end in tears. I am reminded of Santayana … I think this round was tragedy. The next will be farce.

  5. Anthony
    Posted May 31, 2011 at 6:13 pm | Permalink

    “I got told by someone that I needn’t have bothered publishing in the Law Institute Journal. But I wanted to get the message out that restitution law is actually quite useful for practitioners in certain specific areas (despite certain elements in the High Court being very averse to it) and I thought the Law Institute Journal was the best method of getting my point across. Whereas how many local lawyers read the Law Quarterly Review?”

    Surely there’s a category confusion here. It’s great to service the profession, but this doesn’t always count as research.

    As academics we’re paid to do lots of things, and research is just one aspect of what we do. The ERA – and the RQF before it – was just designed to measure that one aspect. That’s why it’s called Excellence in *Research* Australia, rather than Excellence in *Everything We Do* Australia. If you engage with the profession through LIJ columns or put together cases and materials books that are lauded by the industry or adopted widely in undergraduate courses, then this will be recognised as either a contribution to your teaching portfolio or as part of your professional engagement. But in many cases it can’t be counted in good conscience as “research”.

  6. Bernice
    Posted May 31, 2011 at 9:28 pm | Permalink

    I quote Mr Bee:
    “But the ERA list in particular failed because it was compiled behind closed doors after a sham consultation process. So now they have thrown it out in favour of a new measure … that was compiled behind closed doors after a sham consultation process.”

    Twaddle sir twaddle.

    Firstly, ERA did not compile the ranked journal list behind any doors – it came together with input from academics, institutions, academies and leading researchers, and incorporated existing journal rankings from overseas.

    Secondly, in working up the 2010 ERA evaluation process, over 100 000 public submissions were received by the ERA. And they were not fed into the shredder.

    Thirdly, the ERA has been warning unis that journal rankings were not a measure of performance. They have also attempted to point out how nuanced the ERA process actually is, and what a small part journal articles played overall in assessing research outputs (and, considering that it is only applied to ARC funding, it is largely irrelevant for most academics at any given moment).

    It would be wonderful if Mr Bee and friends read ERA’s materials before making sweeping statements. You may like to start here:
    http://www.arc.gov.au/pdf/ERA2010_eval_guide.pdf

  7. Mr Bee
    Posted June 1, 2011 at 12:56 pm | Permalink

    Bernice,

    There was plenty of consultation on the draft submissions, but much of it was ignored in a very strange process that resulted in the final list. It is hard to work out what happened, but the ARC seems to have relied on its panel of experts rather than the learned societies in making the final rankings.

    I note with amusement that they are now saying the rankings were based on the advice of bibliometric experts. That was not the basis of the consultation! Bibliometrics have major and well-known problems of which the ARC and ERA don’t seem to have been aware – because they preferred not to give a genuine opportunity for consultation on bibliometrics.

    While ERA doesn’t claim to be about allocating funding, it obviously, quite obviously, is. I think it is precious of the government to set up journal rankings that would influence funding, and then complain that people managed to them. What else were Vice Chancellors supposed to do?

  8. Henry2
    Posted June 1, 2011 at 9:13 pm | Permalink

    Of course, journal rankings in Gentle Macchia are decided by the contributors:

    From here

    Obviously this isn’t great as none of us got to review it. Odd that she didn’t send it to one of us here as she knew we were writing the article she asked us to!

    I will be emailing the journal to tell them I’m having nothing more to do with it until they rid themselves of this troublesome editor.

    -Phil Jones.

    …I believed our only choice was to ignore this paper. They’ve already achieved what they wanted—the claim of a peer-reviewed paper. There is nothing we can do about that now, but the last thing we want to do is bring attention to this paper…

    This was the danger of always criticising the skeptics for not publishing in the “peer-reviewed literature”. Obviously, they found a solution to that—take over a journal!

    I think we have to stop considering Climate Research as a legitimate peer-reviewed journal. Perhaps we should encourage our colleagues in the climate research community to no longer submit to, or cite papers in, this journal.

    I’m not sure that Geophysical Research Letters can be seen as an honest broker in these debates any more, and it is probably best to do an “end run” around Geophysical Research Letters now where possible. They have published far too many deeply flawed contrarian papers in the past year or so.

    -Mike Mann

    Regards,
    Frank

One Trackback

  1. […] “I haven’t heard of an academic yet who is sad about the end of the ERA journal rankings, announced yesterday by Minster Kim Carr. Carr said in his press release: ‘There is clear and consistent evidence that the [ERA journal] rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes …’” (more) […]
