Our top-ranked universities are rated poorly by students
Here’s a conundrum. We’re celebrating the big news that three top Australian research universities – Melbourne, Sydney and UNSW – have reached the top 20 of the QS World University Rankings, a stratospheric result never previously achieved.
But there’s also another key university performance measure released this week – the federal government-backed 2022 Student Experience Survey – which shows these same three universities among Australia’s worst, as judged by undergraduate students, in delivering an educational experience.
Of the 42 universities in the survey – which measures what students think of their learning environment – Sydney, UNSW and Melbourne were respectively third, fourth and fifth from the bottom. And the university that came last, Southern Cross, can make the reasonable point that students at its main campus in Lismore were disrupted by major floods last year.
What’s going on? In fact, the two very different assessments of these three top universities are entirely compatible, because they measure different things.
The factors that make up the QS ranking are academic and employer reputation, the ratio of academics to students, citations per academic on research papers they write, and the ratio of international students and international academics to all students and academics (with a higher ratio considered better).
This year QS added three more criteria that appear to have assisted Australian universities towards their historically high results: a measure of sustainability (as both practised and taught by the university), graduate employability, and international research collaboration.
Nowhere in the QS rankings is there any direct measure of what students think. Student opinion appears only by proxy: a higher ratio of international students suggests that students (at least those from overseas) think the university is good.
On the other hand, the student experience survey is a questionnaire completed by more than 200,000 students nationwide that asks them how well their university performs in skill development, learner engagement, teaching quality, student support and learning resources.
They are direct questions, not proxy measures, and the fact that Australia’s three top-ranked universities come out near the bottom, as rated by students, is concerning.
Melbourne, UNSW and Sydney are all large universities, with more than 47,000 full-time equivalent students, and they are highly research intensive. It’s very reasonable to ask how well they do in supporting individual students and whether their research is prioritised over their teaching.
The other thing to remember is that the primary users of global university rankings are international students. Australian universities, particularly those such as Melbourne, UNSW and Sydney that rely on international students to pay for their research, are very attentive to rankings because they drive international student enrolments. Foreign students and their parents pay close attention to them, particularly in China.
However, global rankings are not nearly as relevant to Australian students. As is very clear, they don’t properly measure student satisfaction. Australian students are far better served by looking at reliable local measures of university performance on two federal government websites, qilt.edu.au and compared.edu.au. Both include the student experience data as well as data on graduate employment.
Our top universities should pay more attention to them too.