In Partial Praise of the ABS Journal Rankings
I recently attended an online conference of UK management academics where there was a passionate discussion of the Academic Journal Guide, the ranking of management journals produced by the UK’s Chartered Association of Business Schools. This list has always been controversial, particularly whenever a journal is downgraded or upgraded. Similar passions are ignited by the equivalent journal rankings used in other countries, such as the Australian Business Deans Council list or the CNRS list in France. The stakes in the battles over whether to use such lists, and if so how, are high because hiring and promotion in many UK business schools is largely a function of one’s ability to publish in whichever journals are highly ranked. The use of journal rankings as a heuristic, or quick proxy, for research quality is almost a necessary evil in the UK context because the hiring process here is short and low-cost compared with the expensive multi-day campus visits by which job candidates are screened in North American universities.
In recent years, some UK business schools have moved away from using journal ranking lists such as the AJG when making decisions about hiring and promotion. The business school at a leading Welsh university, for instance, has gone so far as to prohibit all references to this or any other journal ranking, or even to journal impact factors, in making decisions about hiring and promotion. The British Academy of Management recently declared that it wants an end to the use of journal rankings. Some other business schools are moving in that direction, although I have been informed that there are now bitter, internecine struggles between different departments within at least one English business school over this issue. The move away from the use of journal rankings is driven, in part, by a growing belief on the part of some academics and some external knowledge stakeholders that the research published in some highly-ranked journals is low in quality. A few years ago, a retired management academic published a scathing article on this subject called “The Triumph of Nonsense in Management Studies”. This Emperor-Has-No-Clothes article was read and discussed by some policymakers and has doubtless undermined the credibility of a few of the many journals included in the list.
Another factor pushing UK business schools away from using journal lists is government policy, both in England and in the nations with devolved administrations. The UK’s current Science Minister has heard about the use of journal ranking lists in various academic disciplines and has said that this practice does not promote the interests of taxpayers, British companies, students, and so forth. Speaking to University-Business UK in October 2020, Amanda Solloway spoke of the problems that come from the use of journal ranking lists: “Researchers tell me they feel pressure to publish in particular venues in order to gain respect from their peers, which wrongly suggests that where you publish something is more important than what you say.” Solloway has ordered a “root-and-branch” review of how academic research in the UK is funded and measured so that it produces more benefits for UK firms and society as a whole. The international task force she has appointed is currently reviewing how the UK’s research funding and measurement system works. It is expected to recommend that the system become much more like Excellence in Research for Australia, the Australian government’s system for measuring and funding research, which has attempted to shift the focus of academics away from simply publishing papers for other academics and towards greater engagement with private industry. The replacement of the Research Excellence Framework with a sort of Research for Britain system would doubtless encourage more management schools to stop using journal lists, especially since some of the highly ranked journals publish research that would be, we would have to say, of zero potential relevance to any manager ever.
While we await word of what the UK’s new research funding system will look like, the UK government has introduced a policy that is clearly incompatible with the use of journal ranking lists: in 2019, UKRI, which is under the control of the Science Minister, ordered all UK universities both to become signatories of the San Francisco Declaration on Research Assessment (DORA) and to take steps to ensure that all units within universities (departments, faculties, etc.) were compliant with the basic principle of DORA, which is that in evaluating the quality of a published research output (say, a paper), judgements must be informed solely by a reading of the output rather than by any knowledge of where it was published. My understanding is that there are now countless fights going on at various levels within universities across the country about how seriously departments and hiring committees should take the DORA principles to which all universities now pay lip service. I suspect that the model of academic behaviour introduced in Chapter 2 of Cracks in the Ivory Tower by Jason Brennan and Phillip Magness does a good job of explaining the patterns we can see in which academics support DORA implementation and which academics are against it.
Personally, I think that journal guides are likely to remain an important factor influencing hiring and promotion decisions in UK universities, and UK business schools in particular. I generally use the theory of regulatory capture (Stigler, 1971) to understand how the UK’s REF system, which was originally developed with the laudable goal of increasing the taxpayer’s return on funding of academic research, works in practice. Perhaps I watched too many episodes of Yes Minister when I was growing up, but I suspect that the journal ranking lists will outlive the tenure in office of the UK’s current Science Minister. Moreover, I’m not 100% certain that getting rid of the use of journal rankings would improve hiring and promotion decisions in a world in which the people making the hiring decisions are not forced to internalise the costs to the organisation of hiring people who aren’t very good at research. Right now, the busy people on hiring committees use journal rankings as a proxy for research quality. I think that if their use were banned, they would use some other proxy for quickly judging research quality. The proxies they would likely use would be the prestige of applicants’ undergraduate universities, the prestige of their PhD institutions, and, since we are in the UK, accent as a marker of social class. Pretty soon, business schools would come to be staffed by Oxbridge graduates with Received Pronunciation accents. Right now, they are staffed more by people who are proficient at churning out papers that will be read by a handful of other academics. I’m not certain that change would be a net improvement for society.
If we assume that business schools will continue to use journal ranking lists in hiring and promotion decisions, then which list to use becomes an important question. I think that the ABS list should be used because it is the least problematic of the various lists that are available. Unlike the FT50 list of journals, which is produced by a few newspaper staffers through a very non-transparent process, the procedure by which this list is created is reasonably transparent. The names of the individuals who prepare each iteration of the ABS list are published, which is a very important accountability mechanism totally missing from the FT list. The subject experts whose (admittedly subjective) judgement calls determine the rankings of journals are diverse in nationality and country of residence. Four work at universities in Canada and two are at universities in the United States. My sense is that the 2021 version of the list displays much less home country bias than does the equivalent list used in Australia. (I’ve blogged in the past about the home country bias displayed in the Australian list.)
Moreover, at a time when the need for political viewpoint diversity in the humanities and the social sciences is increasingly recognized by academics of all ideological stripes, the Chartered Association of Business Schools should be commended for ensuring that the team that produces the AJG rankings is ideologically diverse and is not composed solely of the typical Guardian-reading UK academic. I see that the committee included Professor Eric Chang of Shanghai Jiao Tong University in mainland China and Professor Donald Siegel of Arizona State University. The latter is a very conservative Republican who has been extremely critical of the lockdown measures most countries have used to control the spread of the virus, while Chang sits on Chinese government bodies (!!!). I’m certainly no fan of either President Trump or the Chinese government, but my confidence in this list is boosted by the fact that the team that produced it is so ideologically diverse and isn’t just composed of centre-left liberals who work in British and Canadian universities. We know that ideological diversity reduces the dangers of groupthink.