Journal rankings are extremely important in the management-school world, so we business historians ought to give some thought to the processes by which the different ranking systems are produced. It’s also crucial to see whether the journal rankings produced by different scholarly organizations (the UK’s Chartered Association of Business Schools, the Australian Business Deans Council, the CNRS, etc.) are consistent with each other and with those created by for-profit outfits such as the Financial Times. As I show below, there are some major discrepancies between the positions of journals on these lists. There are also differences in the degree of methodological rigour employed in the development of these lists.
Sadly, no business history journal was included in the all-important FT50 journal list. As I pointed out when this list was first published, the Financial Times has never revealed the methodology it used to rank these journals. The FT did inadvertently reveal the name of the individual who produced the list, one Laurent Ortmans. We know from LinkedIn that he is a graduate of Kingston University and the University of Rennes. We also know from other sources that some of the major management journals lobbied Mr. Ortmans intensively while he was compiling the FT50 list, so perhaps the business history community will need to do the same the next time he revises it. Given the list’s sheer importance to the working lives of academics in the British Isles and elsewhere, it is unfortunate that the FT50-Ortmans list is produced with so little methodological rigour. As far as I can tell, it is whipped up by a single individual who lacks a PhD.

Pretty much the only thing I can say in defence of the FT50-Ortmans list is that there are no Japan-based journals on it (yet). Readers of the FT will have noted that ever since Pearson sold the paper to a Japanese company, there has been a surge in the number of articles about Japanese business in its pages. When I first heard of Nikkei’s purchase of the Financial Times, I was concerned that one or two really mediocre English-language journals published by elite Japanese universities might be added to the FT journal list. So far, that has not happened, which I guess is to the credit of the FT50-Ortmans list. To be clear, there is good research being produced in Japanese universities, but it typically appears either in English-language international journals or in Japanese-language journals, not in English-language management journals edited at Japanese universities.
For business historians who work in UK management schools, perhaps the most important journal ranking is the one produced by the Chartered Association of Business Schools. Unlike the FT50-Ortmans list, part of the methodology behind the ABS guide is published. We know that the 2015 ABS list was produced under the supervision of a Scientific Committee that included leading scholars from various business-school disciplines. In 2015, the committee’s expert on business and economic history was Geoffrey G. Jones of Harvard Business School. We do not yet know who the historical expert helping to produce the forthcoming 2017 ABS guide is. The ABS has wisely chosen not to release the name of the 2017 expert so as to preclude lobbying of the sort that took place before the release of the all-important FT50 list. Keeping the names of the subject experts secret was a smart move on the part of the ABS and one that increases the credibility of its ranking system.
That being said, I’m not entirely convinced that the ABS journal ranking process is sufficiently transparent and robust for us to use it as actionable information. The version of the ABS guide released in 2015 placed journals into five categories: 4*, 4, 3, 2, and 1. A variety of journals in the fields of business history, economic history, and management history appeared on page 17 of the guide. (Note that in the 2010 version of the guide there were separate lists for business history and economic history, but in 2015 these lists were merged, perhaps in response to the growing importance of economic history in top economics departments.) In 2015, 26 historical journals were ranked. None were ranked 4* (the best possible ranking in the 2015 ABS system), but two were ranked 4, five were ranked 3, twelve were ranked 2, and seven were ranked 1.
Here are the relative positions of some of these journals in the 2015 ABS guide.
- Business History Review: 4
- Economic History Review: 4
- Business History: 3
- Enterprise and Society: 3
- European Review of Economic History: 3
- Explorations in Economic History: 3
- Journal of Economic History: 3
- Entreprises et Histoire: 2
- European Journal of the History of Economic Thought: 2
- Financial History Review: 2
- Journal of the History of Economic Thought: 2
- Management and Organizational History: 2
- Journal of Management History: 1
It is not clear what methodology will be used to create the 2017 guide. According to the 2015 ABS guide, the classification process was “stringent and methodical in all cases” and “five sources of evidence” were used:
- The assessments of leading researchers in each of the main fields and sub-fields covered; [AS: unfortunately, this criterion leaves room for subjectivity, especially since the written assessments were not published]
- The mean citation impact scores for the most recent five-year period (where available); [AS: ok, this part of the methodology is based on some actual hard numbers, which helpfully reduces the scope for subjectivity]
- Evaluation by the Editors and Scientific Committee members of the quality standards, track records, contents and processes of each journal included in the Guide; [AS: unless a detailed description of the working methods used by the Editors and a transcript of the deliberations of the Scientific Committee are published, this criterion would leave room for even more subjectivity]
- The number of times the journal was cited as a top journal in five lists taken to be representative of the ‘world’ rating of business and management journals; [AS: Some hard numbers will be used here, which is positive because it gets us away from subjectivity]
- The length of time a journal has been established. [AS: Again, a nice clear criterion that can be measured and independently confirmed]
Unfortunately, the ABS hasn’t published the formula it uses to weight these five factors. One hopes that the relative weighting of the factors will be specified when the ABS releases the 2017 version of its list. The ABS are at least to be commended for showing part of their work, unlike the Financial Times. Why anybody respects that FT50-Ortmans list is beyond me.
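To make concrete what publishing such a formula might look like, here is a minimal sketch in Python of one possible weighted-scoring scheme. To be clear, the weights, the grade cut-offs, and every name in this snippet are my own invented assumptions for illustration only; the ABS has not disclosed how (or even whether) it combines the five criteria numerically.

```python
# Hypothetical illustration only: the ABS has not published its weighting scheme,
# so these weights, cut-offs, and names are invented for this sketch.
# Assumes each of the five ABS criteria has already been normalised to a 0-1 score.

CRITERIA_WEIGHTS = {
    "expert_assessment": 0.30,        # judgements of leading researchers in each field
    "citation_impact": 0.25,          # mean citation impact over the most recent five years
    "editor_evaluation": 0.20,        # Editors' / Scientific Committee evaluation
    "presence_on_world_lists": 0.15,  # appearances as a top journal on the five reference lists
    "longevity": 0.10,                # length of time the journal has been established
}

def composite_score(scores: dict[str, float]) -> float:
    """Weighted sum of normalised criterion scores (hypothetical)."""
    return sum(CRITERIA_WEIGHTS[name] * scores[name] for name in CRITERIA_WEIGHTS)

def to_abs_grade(score: float) -> str:
    """Map a composite score onto the 4*, 4, 3, 2, 1 scale using illustrative cut-offs."""
    if score >= 0.90:
        return "4*"
    if score >= 0.75:
        return "4"
    if score >= 0.55:
        return "3"
    if score >= 0.35:
        return "2"
    return "1"

# Example: a long-established journal with strong expert assessments but modest citations.
example = {
    "expert_assessment": 0.8,
    "citation_impact": 0.5,
    "editor_evaluation": 0.7,
    "presence_on_world_lists": 0.4,
    "longevity": 0.9,
}
print(to_abs_grade(composite_score(example)))  # -> "3" with these made-up inputs
```

Even a disclosure at this level of detail (the weights and the grade cut-offs) would let outsiders check how sensitive a journal’s grade is to any single criterion, which is exactly the kind of transparency I would like to see in the 2017 guide.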
Let’s turn from the UK to the rankings of management journals used in other countries. In many French business schools, the CNRS list is used. Journals are ranked from 1 to 4, with 1 being the best: if you publish in a journal ranked 1, you are rewarded more than if you publish in a 3 or 4 journal. Version 1 of this ranking was published in 2004 and version 4 was released last year. Overall, the French rankings are not massively dissimilar to the British ABS rankings. The Accounting History Review is ranked 3 in the French system and 2 in the UK system, so it is a third-tier journal in both countries. Accounting History is also ranked 3 in France. So far, so good: the ratings look commensurate, which suggests the absence of home-nation bias and conflicts of interest. However, there are some discrepancies in the category “Business History / Histoire des Affaires” that are worthy of note. The relative positions of the journals Business History and Business History Review reverse when one crosses the English Channel: in the CNRS system, Business History (2) is a higher-ranked journal than the Business History Review (3). Moreover, Management and Organizational History, which is ranked 2 in the UK system, occupies a lower relative position in the French system. Similarly, Entreprises et Histoire, a French journal, is ranked lower in France than it is in the UK.
I really like how the creators of the CNRS list of management journals frankly concede, in the preface to the document, that all rankings of journals are necessarily somewhat subjective. I appreciate this degree of intellectual humility.
It obviously cannot claim to be perfect, quite simply because the very abbreviated assessments that such a list can offer of scholarly journals (whether or not a journal appears on the list, and its ranking position) are inevitably highly reductive and are not free from subjectivity.
In Australian business schools, they use the ABDC Journal Quality List. Looking at the positions of the historical journals on that list, one notices some interesting discrepancies between their standings here and on the lists used in the UK and France. In Australia, journals are ranked A*, A, B, or C. Business History and Business History Review are both ranked A (second tier). So in France, Business History is considered to be better than BHR; in the UK, BHR is better than BH; and in Australia they are viewed as equals. Enterprise and Society is also considered an A journal in Australia. The really curious thing about the Australian list is that the Journal of Management History, which is ranked a lowly 1 in the UK, is highly ranked in Australia (it is an A journal on the ABDC list). The editor of this journal, Bradley Bowden, teaches at Griffith Business School at Griffith University, Queensland. Bradley is a good scholar who is working hard to develop this journal, but the sheer discrepancy between the journal’s ranking in Australia and its ranking in other countries is worthy of discussion.
The role of ratings agencies in the 2008 financial crisis generated a great deal of interest in the wider social phenomenon of conflicts of interest in ratings. This issue, which was dramatized in the film The Big Short, has been investigated by academics who work in business schools around the world. Going forward, it will be interesting to see whether the FT, ABDC, ABS, and CNRS ranking systems improve their credibility through greater methodological rigour and a higher degree of process transparency. I would strongly suggest that they work with the Center for Open Science as they move in this direction. I am confident that improving the overall rigour of the process used to rank management-school journals would be a net benefit to the field of business history.
A journal’s Impact Factor is a relatively crude way of determining its quality, but at least it has the advantage of being measurable and independently verifiable. For the record, here are the impact factors of some journals in our field.
- Business History: 2016 Impact Factor 0.830; ranked 94/120 (Business) and 5/35 (History of Social Sciences) in the Thomson Reuters Journal Citation Reports. [Note: the 2015 IF appeared in an earlier version of this post.]
- Enterprise & Society: 2016 Impact Factor 0.593; ranked 110/121 (Business) and 14/35 (History of Social Sciences).
- Business History Review: 2016 Impact Factor 0.425; ranked 112/121 (Business) and 19/35 (History of Social Sciences).
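Since I am praising the Impact Factor for being measurable and verifiable, it is worth spelling out what the number actually is. The standard two-year Impact Factor reported in the Journal Citation Reports is a simple ratio; the sketch below uses invented counts purely for illustration, not the real figures for any of these journals.

```python
# Standard two-year Impact Factor, e.g. for 2016: citations received in 2016 to items
# the journal published in 2014 and 2015, divided by the number of citable items
# (articles and reviews) it published in 2014 and 2015.
# The counts below are invented for illustration.

def impact_factor(citations_to_prior_two_years: int, citable_items_prior_two_years: int) -> float:
    """Two-year, Journal Citation Reports-style impact factor."""
    return citations_to_prior_two_years / citable_items_prior_two_years

print(round(impact_factor(83, 100), 3))  # 83 citations to 100 citable items -> 0.83
```

The crudeness is obvious from the formula itself: a single heavily cited article, or a change in what counts as a “citable item” in the denominator, can move a small journal’s IF substantially from one year to the next.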
Andrew Popp, the editor of Enterprise and Society, should be congratulated for the rapidly rising profile of this young(ish) journal. Even though he has not set out to chase the Impact Factor, the high quality and innovative nature of the scholarship published in the journal has increased its IF as a side effect.
Full disclosure: Andrew Popp is a colleague here at the University of Liverpool Management School.