The annual CWTS Leiden Ranking offers a deep dive into the research performance of all the significant research universities in the world, nearly 1000 of them. And if you want to make a sensible comparison of university research, it is a better place to look than the major rankings such as the Academic Ranking of World Universities (ARWU), Times Higher Education or QS.
The reason is that Leiden provides a wealth of data that users can cut in various ways.
As well as the total volume of research papers a university produces, you can see how many of its papers fall into the most highly cited categories, and what proportion of its output they represent.
Using Leiden you can distinguish, for example, between high performance that is simply a consequence of a university’s size, and high performance that is due to its underlying quality.
Oddly enough, the ARWU, which has entrenched itself as the world's most watched research ranking, does not make this distinction: it constructs its ranking from a composite of size-dependent and size-independent indicators.
This makes no sense, the Leiden people say.
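To make the distinction concrete, here is a minimal sketch using invented figures (not real Leiden data, and not the Leiden methodology itself) of how a size-dependent indicator, the raw count of a university's papers in the most highly cited band, and a size-independent one, the share of its output in that band, can rank the same two institutions in opposite order.

```python
# Minimal sketch with invented numbers (not real Leiden Ranking data)
# showing how size-dependent and size-independent indicators can diverge.

universities = {
    # name: (total papers, papers in the top 10% most cited)
    "Big University": (12000, 1100),
    "Small University": (2500, 400),
}

# Size-dependent indicator: raw count of highly cited papers.
by_count = sorted(universities, key=lambda u: universities[u][1], reverse=True)

# Size-independent indicator: share of output that is highly cited.
by_share = sorted(universities,
                  key=lambda u: universities[u][1] / universities[u][0],
                  reverse=True)

print("Ranked by count of top-cited papers:", by_count)
print("Ranked by share of top-cited papers:", by_share)
# Big University leads on the count; Small University leads on the share.
```

With these hypothetical numbers the big institution wins on sheer volume, while the small one has the higher share of top-cited work, which is why folding both kinds of indicator into a single composite score muddies what is actually being measured.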
Leiden also offers information on a range of other measures, including the degree of collaboration among a university's researchers (encompassing the all-important industry collaboration), the extent to which its research papers are openly accessible rather than locked behind journal publishers' subscription walls, and the proportion of female researchers.
Undoubtedly, part of the reason the Leiden Ranking is not more widely recognised is that it doesn't offer a simple overall ranking with winners and losers.
And it doesn’t lend itself to headlines or marketing campaigns.
Instead it more truly reflects the real complexity of measuring the quality of university research.