Ranking the Rankings: Which Ranking of Management Journals is the Most Credible?

3 07 2017

Journal rankings are extremely important in the management school world, so we business historians ought to give some thought to the processes by which the different ranking systems are produced. It is also crucial to see whether the journal rankings produced by different scholarly organizations (the UK's Association of Business Schools, the Australian Deans' List, CNRS, etc.) are consistent with each other and with those created by for-profit outfits such as the Financial Times. As I show below, there are some major discrepancies between the positions of journals on these lists, as well as differences in the degree of methodological rigour employed in their development.

Sadly, no business history journal was included in the all-important FT50 journal list. As I pointed out when this list was first published, the Financial Times has never revealed the methodology it used to rank these journals. The FT did inadvertently reveal the name of the individual who produced the list, one Laurent Ortmans. We know from LinkedIn that this individual is a graduate of Kingston University and the University of Rennes. We also know from other sources that some of the major management journals lobbied Mr. Ortmans intensively while he was compiling the FT50 list, so perhaps the business history community will need to do the same the next time he revises it. Given its sheer importance to the working lives of academics in the British Isles and elsewhere, it is unfortunate that the FT50-Ortmans list is produced with so little methodological rigour. As far as I can tell, it is whipped up by a single individual who lacks a PhD. Pretty much the only thing I can say in defence of the FT50-Ortmans list is that there are no Japan-based journals on it (yet). Readers of the FT will have noted that ever since Pearson sold the FT to a Japanese company, there has been a surge in the number of articles about Japanese business in its pages. When I first heard of Nikkei's purchase of the Financial Times, I was concerned that one or two really mediocre English-language journals published by elite Japanese universities might be added to the FT journal list. So far, that has not happened, which I suppose is to the credit of the FT50-Ortmans list. To be clear, there is good research being produced in Japanese universities, but it is typically published either in English-language international journals or in Japanese-language journals, not in English-language management journals edited at Japanese universities.

For business historians who work in UK management schools, perhaps the most important journal ranking is the one produced by the Chartered Association of Business Schools. Unlike the FT50-Ortmans list, part of the methodology behind the ABS guide is published. We know that the 2015 ABS list was produced under the supervision of a Scientific Committee that included leading scholars from various business-school disciplines. In 2015, the committee's expert on business and economic history was Geoffrey G. Jones of Harvard Business School. It is not yet known who the historical expert for the forthcoming 2017 ABS guide is: the ABS has wisely chosen not to release the name of the 2017 expert so as to preclude lobbying of the sort that took place prior to the release of the all-important FT50 list. Keeping the names of the subject experts secret was a smart move on the part of the ABS and one that increases the credibility of their ranking system.

That being said, I'm not entirely convinced that the ABS journal ranking process is sufficiently transparent and robust for us to use it as actionable information. The version of the ABS guide released in 2015 placed journals into five categories: 4*, 4, 3, 2, and 1. A variety of journals in the fields of business history, economic history, and management history appeared on page 17 of the guide. (Note that in the 2010 version of the guide, there were separate lists for business history and economic history, but in 2015 these lists were merged, perhaps in response to the growing importance of economic history in top econ departments.) In 2015, 26 historical journals were ranked. None were ranked 4* (the best possible ranking in the ABS 2015 system), but two were ranked 4, five were ranked 3, twelve were ranked 2, and seven were ranked 1.

Here are the relative positions of the journals in the 2015 ABS guide.

Business History Review 4

Economic History Review  4

Business History 3

Enterprise and Society 3

European Review of Economic History 3

Explorations in Economic History 3

Journal of Economic History 3

Entreprises et Histoire 2

European Journal of the History of Economic Thought 2

Financial History Review 2

Journal of the History of Economic Thought 2

Management and Organizational History 2

Journal of Management History 1


It is not clear what methodology will be used to create the 2017 guide. According to the 2015 ABS guide, the classification process was "stringent and methodical in all cases" and "five sources of evidence" were used:

  1. The assessments of leading researchers in each of the main fields and sub-fields covered; [AS: unfortunately, this criterion leaves room for subjectivity, especially since the written assessments were not published]
  2. The mean citation impact scores for the most recent five-year period (where available); [AS: ok, this part of the methodology is based on some actual hard numbers, which reduces the scope for subjectivity, which is good]
  3. Evaluation by the Editors and Scientific Committee members of the quality standards, track records, contents and processes of each journal included in the Guide; [AS: unless a detailed description of the working methods used by the Editors and a transcript of the deliberations of the Scientific Committee are published, this criterion would leave room for even more subjectivity]
  4. The number of times the journal was cited as a top journal in five lists taken to be representative of the ‘world’ rating of business and management journals [AS: Some hard numbers will be used here, which is positive because it gets us away from subjectivity]
  5. The length of time a journal has been established. [AS: Again, a nice clear criterion that can be measured and independently confirmed.]


Unfortunately, the ABS hasn't published the formula it uses to weight these five factors. One hopes that the relative weighting of factors will be specified when the ABS releases the 2017 version of its list. Still, the ABS are to be commended for showing at least part of their work, unlike the Financial Times. Why anybody respects the FT50-Ortmans list is beyond me.
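Since the formula is unpublished, we can only guess at its shape. Here is a purely hypothetical sketch of what a transparent weighted-sum scoring formula over the five sources of evidence might look like; the weights and input scores below are invented for illustration and have no connection to the actual ABS process.

```python
# Entirely hypothetical weights: the ABS has not published how its five
# sources of evidence are combined. These numbers exist only to show
# what a transparent, reproducible scoring formula could look like.
WEIGHTS = {
    "expert_assessment": 0.30,  # criterion 1
    "citation_impact":   0.25,  # criterion 2
    "editorial_review":  0.20,  # criterion 3
    "top_list_mentions": 0.15,  # criterion 4
    "longevity":         0.10,  # criterion 5
}

def composite_score(factors):
    """Weighted sum of the five evidence scores (each normalised to 0-1)."""
    return sum(WEIGHTS[k] * factors[k] for k in WEIGHTS)

# Illustrative inputs for a made-up journal:
scores = {"expert_assessment": 0.8, "citation_impact": 0.6,
          "editorial_review": 0.7, "top_list_mentions": 0.5,
          "longevity": 0.9}
print(round(composite_score(scores), 3))  # 0.695
```

Publishing even this much (weights plus normalisation rules) would let outsiders reproduce and audit a ranking, which is precisely what the current guide does not allow.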


Let's turn from the UK to the rankings of management journals used in other countries. In many French business schools, the CNRS list is used. Journals are ranked from 1 to 4, with 1 being the best; if you publish in a journal ranked 1, you are rewarded more than if you publish in a 3 or 4 journal. Version 1 of this ranking was published in 2004 and version 4 was released last year. Overall, the French rankings are not massively dissimilar to the British ABS rankings. The Accounting History Review is ranked 3 in the French system and 2 in the UK system, placing it in a similar mid-to-lower tier in both countries. Accounting History is also ranked 3 in France. So far, so good: the ratings look commensurate, which suggests the absence of home-nation bias and conflicts of interest. However, there are some discrepancies in the category "Business History Histoire des Affaires" that are worthy of note. The relative positions of the journals Business History and Business History Review reverse when one crosses the English Channel: in the CNRS system, Business History (2) is a higher-ranked journal than the Business History Review (3). Moreover, Management and Organizational History is ranked 2 in the UK system but occupies a lower tier in the French system. Similarly, Entreprises et Histoire, a French journal, is ranked lower in France than it is in the UK.
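One way to make such cross-Channel comparisons less impressionistic is to compute a rank correlation between the two systems. The sketch below does this with Kendall's tau-a for the three journals mentioned above whose tiers appear in both lists. Two simplifying assumptions are baked in: tiers are treated as ordinal scores, and CNRS ranks are negated so that, as with the ABS numbers, a higher score means a better journal.

```python
from itertools import combinations

# Tiers taken from the post. CNRS ranks (1 = best) are negated so both
# dictionaries read "higher score = better journal".
abs_uk = {"Accounting History Review": 2,
          "Business History": 3,
          "Business History Review": 4}
cnrs_fr = {"Accounting History Review": -3,
           "Business History": -2,
           "Business History Review": -3}

def kendall_tau_a(x, y):
    """Kendall's tau-a: (concordant pairs - discordant pairs) / all pairs."""
    pairs = list(combinations(list(x), 2))
    c = d = 0
    for a, b in pairs:
        s = (x[a] - x[b]) * (y[a] - y[b])
        if s > 0:
            c += 1  # both systems order this pair the same way
        elif s < 0:
            d += 1  # the two systems disagree on this pair
    return (c - d) / len(pairs)

print(kendall_tau_a(abs_uk, cnrs_fr))  # 0.0
```

A tau of 1.0 would mean perfect agreement and -1.0 perfect reversal; on this tiny (and admittedly cherry-picked) sample the two systems come out uncorrelated, which illustrates the discrepancies discussed above rather than proving anything about the full lists.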

I really like how the creators of the CNRS list of management journals frankly concede in the preface of the document that all rankings of journals are necessarily somewhat subjective. I appreciate this degree of intellectual humility.

Elle ne peut évidement pas prétendre à la perfection, tout simplement parce que les appréciations trop raccourcies que peut fournir une telle liste sur des revues scientifiques (présence ou non sur la liste et rang de classement) sont évidemment des appréciations très réductrices et qui ne sont pas exemptes de subjectivité.

(Translation: It obviously cannot claim to be perfect, quite simply because the highly abbreviated judgments that such a list can offer about scholarly journals (inclusion or not on the list, and ranking position) are obviously very reductive assessments, and ones that are not free of subjectivity.)


In Australian business schools, they use the ABDC Journal Quality List. Looking at the relative positions of the historical journals on that list, one notices some interesting discrepancies between the relative positions of journals here and on the lists used in the UK and France. In Australia, journals are ranked A*, A, B, or C. Business History and Business History Review are both ranked A (second tier). So in France, Business History is considered to be better than BHR; in the UK, BHR is better than BH; and in Australia they are viewed as equals. Enterprise and Society is also considered an A journal in Australia. The really curious thing about the Australian list is that the Journal of Management History, which is ranked a lowly 1 in the UK, is highly ranked in Australia (it's an A journal on the ABDC list). The editor of this journal, Bradley Bowden, teaches at Griffith Business School at Griffith University, Queensland. Bradley is a good scholar who is working hard to develop this journal, but the sheer discrepancy between the journal's ranking in Australia and in other countries is worthy of discussion.

The role of ratings agencies in the 2008 financial crisis generated a great deal of interest in the wider social phenomenon of conflicts of interest in ratings. This issue, which was dramatized in the film The Big Short, has been investigated by academics who work in business schools around the world. Going forward, it will be interesting to see how the FT, ABDC, ABS, and CNRS ranking systems improve their credibility through improved methodological rigour and a higher degree of process transparency. I would strongly suggest that they work with the Center for Open Science as they move in this direction. I am confident that improving the overall rigour of the process used to rank management-school journals would be a net benefit to the field of business history.

A journal’s Impact Factor is a relatively crude way of determining its quality, but at least it has the advantage of being measurable and independently verifiable. For the record, here are the impact factors of some journals in our field.
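For readers unfamiliar with how the figure is computed: a journal's two-year Impact Factor for year Y is the number of citations received in year Y to items published in years Y-1 and Y-2, divided by the number of citable items published in those two years. A minimal sketch, with made-up counts (the real inputs come from Journal Citation Reports):

```python
def impact_factor(citations_in_year, items_prev_year, items_two_years_ago):
    """Two-year Impact Factor: citations this year to the previous two
    years' articles, divided by the number of articles in those years."""
    return citations_in_year / (items_prev_year + items_two_years_ago)

# Illustrative only: 83 citations in 2016 to 100 articles published
# across 2014 and 2015 would yield an IF of 0.83.
print(round(impact_factor(83, 52, 48), 3))  # 0.83
```

Because every input is a simple count, anyone with access to the underlying citation database can recompute and verify the figure, which is exactly the property the FT50 and ABS processes lack.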

Business History 2016 Impact Factor: 0.830.   Ranking: 94/120 (Business); 5/35 (History of Social Sciences) in Thomson Reuters Journal Citation Reports. [Note: the 2015 IF appeared in an earlier version of this post].

Enterprise & Society 2016 Impact Factor: 0.593. Journal is ranked 14 out of 35 History of Social Sciences and 110 out of 121 Business in Thomson Reuters Journal Citation Reports.

Business History Review 2016 Impact Factor: 0.425. Ranking: 19/35 History of Social Sciences; 112/121 (Business)

Andrew Popp, the editor of Enterprise and Society, should be congratulated for the rapidly rising profile of this young(ish) journal. Even though he has not set out to chase Impact Factor, the high quality and innovative nature of the scholarship published in this journal has increased its IF as a side effect.

Full disclosure: Andrew Popp is a colleague here at the University of Liverpool Management School.



14 responses

3 07 2017
Steph

Hi Andrew
I am a little surprised that you cite the 2015 impact factor for BH, and 2016 for the rest. Our impact factor has again increased this year to 0.83 – in fact we are the only business history journal increasing its impact factor consistently over the last few years:

        BH      BHR     E&S
2012    0.233   0.548   0.474
2013    0.564   0.725   0.133
2014    0.712   0.625   0.479
2015    0.709   0.634   0.680
2016    0.830   0.425   0.593

(Disclaimer: I am a co-editor of Business History. The above figures are drawn from citation reports.)

3 07 2017
andrewdsmith

Oops, it looks like I did indeed use the 2015 IF for Business History. That was inadvertent! Thanks for correcting me.

3 07 2017
Oxford academic

Andrew,
You set out to answer the question "Which Ranking of Management Journals is the Most Credible?". You also promise a critique of the methodology of different journal lists. Unfortunately, you deliver neither.

You simply describe a range of journal lists and make some extremely general remarks about their alleged methodological rigour, without presenting any evidence that lets the reader assess where those lists fail, or what "methodological rigour" would actually look like.

Your main point of criticism of the FT list seems to be that “as far as I can tell, it is whipped up by a single individual who lacks a PhD”. Hard to believe that you call yourself an academic if your main criterion for assessing methodological rigour is whether or not someone has a PhD.

There are many valid critiques of journal lists, but this clearly boils down to a case of “I don’t like them but can’t really give a good reason why”.

Poor methodological rigour, indeed. Have you actually contacted any of the organisations to cross-check your findings? Clearly you have not. Poor rigour indeed.

3 07 2017
andrewdsmith

The ABS does publish a general description of their methodology… I quote from it here. The FT does not. If these organizations wish to get in touch with me, they are free to do so. I’ve blogged on this issue for a while.

3 07 2017
Oxford academic

You have blogged on this issue for a while, and the most poignant critique that you came up with is that someone at the FT doesn’t have a PhD? Oh well then, I rest my case.

3 07 2017
andrewdsmith

That wasn't my major criticism of the FT list at all. It's that they don't disclose their methodology, let alone the raw data that goes into it. Can you imagine a paper that publishes findings without explaining the methods or the data used?

3 07 2017
Oxford academic

In your article it is. The FT have disclosed their methodology to schools participating. Precisely in the communication that you talk about but apparently have never seen. Not difficult to find that out. Rigour, right?

3 07 2017
andrewdsmith

The FT methodology wasn't published for the world to see and debate, so for all practical purposes it's unpublished. A summary of the ABS methodology was published, so that list has more credibility. If you have a detailed description of the FT50 methodology, send it along as a PDF and I'll happily share it here!

3 07 2017
Oxford academic

Really? The methodology takes about 30 seconds to find.
https://www.ft.com/content/3405a512-5cbb-11e1-8f1f-00144feabdc0

Are you sure you are good at research?

3 07 2017
andrewdsmith

I saw that piece and linked to it at the time. That's NOT a sufficiently clear methodology: it's a few sentences and doesn't explain how the 200 schools were selected, which individuals at the schools were contacted, etc. That very brief description of methodology would never get through peer review at a top journal. What's your name, by the way? I can only see an IP address.

4 07 2017
Oxford academic

One would assume the schools contacted are the ones participating in the ranking. How would it matter which individuals are contacted? Schools got votes as institutions.

Yes, I quite agree this methodology would never "get through peer review at a top journal". As if the FT had ever claimed to do that. Surely an unfair standard to impose on a newspaper. Again, you are taking a standard that most academics would not live up to and trying to impose it on a newspaper. That's pompous at best.

Criticising is easy, what is your academically sound and approved method of ranking journals that would “get through peer review in a top journal”? By all means do tell.

14 07 2017
Laurent Ortmans

I contacted the 200-odd business schools that take part in any of our three MBA rankings.

14 07 2017
andrewdsmith

Ok. But what's the geographical distribution, the response rate, the nature of the "contact", the wording of the emails, etc.?

The wording of questions in polling is very important.
