NewsBite

Why academics hate international ranking season

They create markets – too much like capitalism for those in academia who believe the role of government is to write the cheque and then leave universities alone, not ask hard questions about the quality and value for money of the courses they teach.

University rankings aren’t popular among rank-and-file academics, whose culture emphasises collegiality and despises competitive markets.

It’s university ranking season, when comparing local unis against the world’s best is briefly big news. National pride aside, rankings are also imbued with magical marketing powers – just ask South Australian Premier Peter Malinauskas, who is betting the state’s higher education future on merging the University of Adelaide and University of SA to create a super-university that will rocket up the research rankings, creating a name to attract fee-paying international students.

Rankings are a win-win for university managements. A higher score is good for corporate reputation and gives marketers something to sell to young people in China looking for somewhere to study. Plus vice-chancellors can claim the achievement was despite utterly inadequate government funding. And drops are due to, yes, utterly inadequate government funding.

But rankings aren’t popular among rank-and-file academics, whose culture emphasises collegiality and despises competitive markets. The curious thing about this is that there is an Australian-based ranking that makes classroom teachers stars but does not get as much promotion from their managers.

It was overreaction as usual last week when the Times Higher Education rankings reported falls among Australian universities in the new edition of its top-to-bottom list of 2000 institutions around the world.

At the top end the changes did not matter much. Melbourne University fell two places to 39th in the world. The University of NSW rose one place to 83rd, and it and four others stayed in the world’s top 100 – no mean feat given they are competing against Europe, the Anglosphere and, increasingly, China.

And there is always something for everybody to promote in rankings: “Fastest-improving university with Q in its name” overstates the opportunities, but not by much. This matters to managements, who use rankings to promote whatever they can find, especially to fee-paying internationals.


Hype aside, rankings are a good enough guide to relative performance in a national market such as Australia. The big commercial products generally produce much the same results each year. And they are an OK indication of how our major universities compare internationally, insofar as relative position matters all that much. This year’s results show the long-predicted slide in Australian scores is picking up, which university lobbies blame on the government rather than on the far more complex interaction of funding and policy decisions in competing countries.

But, while the big rankings that compare universities as brands get the media coverage, there are many more that do very specific jobs. The immensely well-regarded Leiden ranking, also released last week, compares university research on bibliometric data. Few are as good, but many others use similar approaches to measure individual disciplines and people annually – by numbers of research publications, the status of the journals in which they are published, and the number of times they are cited in other articles. It is career-making and career-breaking stuff.

Which is why popular academic opinion does not so much dislike as loathe the whole KPI-on-steroids thing. For a start, academics complain about ranking methodologies, and they have a point: some of the big commercial products include entirely subjective measures in their calculations. Plus the formulas – there is much maths in ranking calculations – are hypersensitive to small changes. A mediocre university that pays a bomb to hire a super-productive researcher can rocket up a discipline ranking and crash back to Earth when that researcher leaves. And there are always institutions that complain they were robbed by a ranking that got it wrong – which can happen.

Rankings also make academics feel powerless: angry that their university chases an improved score rather than listening to staff, and worried their careers depend on publishing at all costs. They have a point on both counts. But the big reason they hate rankings in general is that rankings create markets in which universities compete for research funding and full-fee-paying international students. This is too much like capitalism for those who believe the role of government is to write the cheque and then leave universities alone – especially individual academics, who should be left to teach and research according to their own expertise.

But, while big-name managements make the most of brand rankings, they generally go quieter on a performance measure that gives their teaching staff a stage to shine on. It’s a collection of annual rankings based on really, really rigorous opinion surveys completed by the people whose opinions should matter most in Australian higher education – students, recent graduates, and the people who hire them.

The federally funded Quality Indicators for Learning and Teaching (QILT) come out annually and rate universities and private providers in great detail on what their customers, including international students, think about the quality of their products and services.

And QILT scores them from top to bottom, with often embarrassing results for big-noting universities that do well on international brand rankings, such as the University of Melbourne, always number one in Australia and in the world’s top 50. The 2023 QILT scores are imminent, but for 2022 Melbourne was scored 71 by students for their overall experience, and Sydney University 68, both well under the national university average of 75. There was less a gap than a chasm between them and the top performer, tiny Avondale University, which scored 88 – small, teaching-focused institutions are often more popular than very big, research-focused universities.

QILT also asks students about the quality of their courses, which is where teaching academics can shine or flame out, with comparative rankings for each university – which may be why managements and staff don’t always want QILT results publicised.

But QILT is exactly what prospective undergraduates, certainly Australians, need to help them decide where to study – information from present and immediate past students. For somebody choosing where to study nursing or accounting, which universities near them are good matters far more than how many league-table places they sit behind Oxford, or Harvard, or Peking.


Original URL: https://www.theaustralian.com.au/nation/why-academics-hate-international-ranking-season/news-story/3a490b2f92c8bdf04b13c0dc4e44e520