live by the sword


Glenn Hampson

Apr 4, 2024, 12:07:44 PM
to osi20...@googlegroups.com

Hi Folks,


From Danny Kingsley’s book of scholarly communication comedy, chapter 1: University Rankings, here’s a new post from Caroline Wagner (https://bit.ly/43IDWj1) about how those who live by the sword may end up dying by the sword.


For at least a generation now, the university rankings produced by US News and Times Higher Education (THE) have used formulae that heavily weight research publishing. About 35% of THE's ranking score appears to be based on scholarly publishing output (see https://bit.ly/3J3CNZA), while US News appears to attribute only about 4% to these same factors (see https://bit.ly/43NxuXN). There are other ranking systems as well, as Caroline notes, like the Shanghai Ranking (https://bit.ly/3VGxh6X), which attributes about 40% to scholarly publishing output.
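
To make the weighting point concrete, here is a toy sketch in Python: made-up numbers and a made-up two-component formula (not THE's or US News's actual methodology), just to show how differently the same publishing-heavy university scores when the publishing weight is roughly 35% versus roughly 4%.

# Toy composite score -- NOT any ranking system's real formula.
# The 0.35 and 0.04 publishing weights echo the approximate shares
# cited above for THE and US News; everything else is one bucket.

def composite(publishing: float, other: float, publishing_weight: float) -> float:
    """Combine two 0-100 indicator scores using the given publishing weight."""
    return publishing_weight * publishing + (1 - publishing_weight) * other

# A hypothetical publishing powerhouse: strong output, middling elsewhere.
pub_score, other_score = 95, 60

print(round(composite(pub_score, other_score, 0.35), 2))  # THE-like weight: 72.25
print(round(composite(pub_score, other_score, 0.04), 2))  # US News-like weight: 61.4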


To calculate publishing output, these major ranking systems use indexes like SCI (Shanghai) and Scopus/SciVal (THE and US News), which, as we all know, are heavily weighted toward Western, STM, English-language, subscription titles, and so represent only a portion of the full scholarly publishing universe.

Enter the Leiden Ranking, created by Ludo Waltman about 16 years ago (see the methodology discussion at https://bit.ly/3J1C8bn). The Leiden Ranking isn't new, and it doesn't compare 1:1 with the THE and US News rankings because it focuses exclusively on objective and verifiable science-related data (carefully combining publication output, citation impact, and indicators of scientific collaboration; whether you agree with using these measures as proxies for scientific impact is another matter altogether) rather than also including university survey data, social metrics, and other types of self-reported data.

What is new is that the latest version of the Leiden Ranking has (for the first time?) been generated using OpenAlex rather than proprietary index data (see the press announcement at https://bit.ly/3U027Gh). The result is an upheaval of the rankings we're used to seeing, with most of the top spots grabbed by universities in China that are apparently home to some prolific researchers (it would be interesting to dig down on this data…).
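
And for anyone tempted to dig down on that data, here is a minimal sketch of what a first query might look like against the public OpenAlex API that the new Leiden Ranking draws on. The institutions endpoint and the works_count / cited_by_count fields reflect my understanding of the current API; please verify the exact names against https://docs.openalex.org.

# Sketch: look up an institution in OpenAlex and pull its headline
# publication and citation totals. Requires the "requests" package.

import requests

def institution_output(name: str) -> dict:
    """Return the top OpenAlex match for a name, with its output totals."""
    resp = requests.get(
        "https://api.openalex.org/institutions",
        params={"search": name, "per-page": 1},
        timeout=30,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return {}
    inst = results[0]
    return {
        "name": inst["display_name"],
        "works_count": inst["works_count"],        # publications indexed in OpenAlex
        "cited_by_count": inst["cited_by_count"],  # citations to those publications
    }

if __name__ == "__main__":
    # Hypothetical comparison; swap in whichever universities you're curious about.
    for uni in ["Zhejiang University", "Harvard University"]:
        print(institution_output(uni))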


What does this mean for ranking powerhouses like THE? What will happen to these important (sorry Danny) rankings if they also shift to the Leiden methodology? I think that's the gist of Caroline's article: a sort of live-by-the-sword, die-by-the-sword moment. If publishing is going to remain such a big factor in rankings, and if we're also going to be more open and transparent about how this publishing data is calculated, then the world's "top universities" list is overdue for a shuffling. Danny's beef has been that because these ranking systems are biased toward "prestige publishing," they have in effect acted as a brake on the uptake of open access publishing. But if we take the brake off and count all publishing equally, will that be a step in the right direction? Or would it just incentivize more publishing, period? Not necessarily the best outcome either.


Best regards,


Glenn


Glenn Hampson
Executive Director
Science Communication Institute (SCI)
Program Director
Open Scholarship Initiative (OSI)
