Why university world rankings are meaningless
Data used for placings are misleading, as they are based almost entirely on research performance
The most recent in the apparently never-ending stream of university rankings, out today, shows no sign of reversing the 10-year decline in the performance of Irish universities – as a whole they are nowhere near where they were a decade ago, and at best it shows that the decline may have paused.
Should this be a cause for alarm? Only up to a (very small) point, if at all – no more than it should be a cause for rejoicing when Irish universities rise in the rankings.
Among the reasons for dismissing rankings and the message they appear to convey are that they focus on only one aspect of what universities do, ignoring their many other functions; that the data on which they are based are very poor and cannot be trusted; and, most seriously, that they distort the behaviour of universities and even governments.
To compare universities in different countries, as rankings purport to do, they need data that are comparable across national boundaries. Other than in respect of research, hardly any such data exist. Even the definition of a student may vary from country to country.
Data relating to research publications are obviously research-related. But staff-to-student ratios – claimed as measures of teaching commitment – actually also relate to research: the more research a university does, the more staff it employs and the better its staff-to-student ratio.
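The arithmetic behind this can be sketched in a few lines. The institution and all the figures below are invented for illustration only; this is not QS’s actual calculation:

```python
# Illustrative sketch (invented numbers, not QS's actual method):
# how a staff-to-student ratio shifts once research-only staff are
# counted alongside teaching staff.

def staff_student_ratio(staff: int, students: int) -> float:
    """Students per staff member; a lower number looks 'better' in rankings."""
    return students / staff

students = 15000
teaching_staff = 800
research_staff = 400  # hired for research, not for teaching

# Ratio counting teaching staff only
print(staff_student_ratio(teaching_staff, students))                   # 18.75

# Counting research staff as well flatters the 'teaching' metric
print(staff_student_ratio(teaching_staff + research_staff, students))  # 12.5
```

On these invented figures, hiring 400 researchers improves the supposed teaching indicator by a third without a single extra hour of teaching.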
Related to research
The “reputation surveys” are also clearly related to research – an academic in one country will generally only become aware of a university in another through the research activity and publications of its staff. Altogether, over 90 per cent of rankings are based on measures of research performance.
So yes, if all you are interested in is how good universities are at research the rankings may tell you something. However, while research is important for some universities it is by no means important for all, and research performance is certainly insufficient to identify the “best” universities.
But does the slide of Irish universities mean that the quality of their research has declined? Certainly not. They’ve actually greatly improved their research performance. All that the rankings tell you is that others may have improved by even more.
As far as the quality of the data is concerned, it’s difficult to exaggerate how poor this is. Rankings need data and in the absence of data, they create their own. Fully 50 per cent of the QS ranking is based on opinion surveys – what others think of any particular university. Cross-border reputation is based almost wholly on research profile, and anyway a reputation survey is hardly an objective way to judge quality.
But in the absence of other data this is what they use, and to boost response numbers they even count the votes of respondents who may have been dead for five years. The other data are problematic too. Most are self-reported by the universities themselves, and subject to minimal quality control by the rankers.
So it was that Trinity found a couple of years ago that it had reported some of its data with the decimal point in the wrong place, misreporting it by a factor of 10. And where universities don’t report data, QS will find something to take their place.
A few years ago it went to a university’s website and lifted data about the number of staff to calculate the staff-to-student ratio, ignorant of the fact that the staff numbers included gardeners, cleaners and administrative staff.
Finally, and most important, is the impact that rankings have. Rankings are a zero-sum game, with as many losers as winners: no matter how hard universities play and how much they improve, there will be others that may have improved even more, as Irish universities have found.
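The zero-sum point is easy to demonstrate with invented scores. Below, every university improves its score year on year, yet because a rank depends only on relative order, one of them still falls:

```python
# A minimal sketch of why rankings are zero-sum. The names and
# scores are invented; every institution improves, yet B drops a place.

before = {"A": 60.0, "B": 55.0, "C": 50.0}
after  = {"A": 62.0, "B": 56.0, "C": 58.0}  # all three scores went up

def rank(scores: dict) -> dict:
    """Map each name to its rank (1 = highest score)."""
    order = sorted(scores, key=scores.get, reverse=True)
    return {name: i + 1 for i, name in enumerate(order)}

print(rank(before))  # {'A': 1, 'B': 2, 'C': 3}
print(rank(after))   # {'A': 1, 'C': 2, 'B': 3} – B improved, but fell
```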
I’ve heard presidents of universities from Ireland to Indonesia say that they will do whatever they can to improve in the rankings. And since the only way to do so is to improve research performance and visibility, that means devoting more of their scarce resources to research, neglecting other, arguably more important, functions.
It’s a matter for debate whether that is what Ireland really needs – but if so, that is a decision that should be taken on its own merits, not simply to rise in the rankings.
And even governments haven’t been immune, several adopting policies explicitly to improve the position of their universities in international rankings. This has meant directing public money to research instead of teaching, and in general that means providing it to those universities that are already among the better funded.
What should our response be? It’s too much to hope that rankings will be ignored – politicians, journalists, the public and universities themselves have become mesmerised by them. But it’s important to see them for what they are and understand that they are bogus in their claims to identify the “best” universities or even to measure university performance more generally.
They certainly don’t, and so in itself the slide of Irish universities is not a cause for alarm. There is plenty to be concerned about in respect of third-level education in Ireland, but performance in the rankings is not among those concerns.

Bahram Bekhradnia is president of the Higher Education Policy Institute