Why university rankings may be harming higher education
League tables have gained notoriety, but in fact they largely measure only research output
Photograph: Dara Mac Dónaill/The Irish Times
Another week, another set of university rankings as the Times Higher Education releases its league table. It follows others recently published by Quacquarelli Symonds (QS) and the Academic Ranking of World Universities (Shanghai).
The organisations may differ, but the pattern is broadly the same: Ireland’s best higher education institutions are in free-fall.
The decline in rankings has been alarming university presidents for the past six years. But while rankings carry significant reputational weight worldwide, they capture only a tiny proportion of the full picture of how universities perform.
A poor understanding of what they actually measure has given the rankings an unwarranted notoriety and an influence on policy, with the potential to harm higher education institutions in Ireland and worldwide.
There are about 20 global rankings of higher education. All have varying methodologies and in some cases give vastly different weightings to the factors they have in common.
Overall, though, the global rankings are largely a measure of research output. The major flaw with global rankings is that they do not – and cannot – measure quality of education.
“Rankings are not surprising,” says Ellen Hazelkorn, director of the Higher Education Policy Research Unit at DIT. “In an environment in which higher education is hugely internationalised, they are an important feature and factor for national competitiveness.”
“The big issue,” Prof Hazelkorn adds, “is that they don’t measure what people think they’re measuring, and they certainly don’t measure anything about education. They are predominantly a measure of research.”
More than 60 per cent of ranking methodology is based on research output and citations, which, though important aspects of a university, are hardly the most important factors for potential students.
With no standardised measure of education quality, and little easily comparable information on third-level institutions globally, the rankings fall back on questionable proxies for education, such as staff-to-student ratios and reputational surveys.
“It would be foolish for students to choose their higher education institute purely on the basis of rankings,” says a representative from the Higher Education Authority. “You’ve also got to remember that universities are quite diverse institutions.
“So, for instance, one could be particularly good in some subject fields and not as good in others depending on their expertise. Rankings only measure certain things.”
Ireland’s relationship with the global rankings industry has been fraught, but university presidents clearly recognise the importance of performance in the rankings, however flawed. Both Trinity College and University College Cork have been accused of attempting to manipulate the rankings to their advantage.
This competitive attitude to the rankings is troubling and could have dangerous implications for third level.
“Improving performance is always important,” says Prof Hazelkorn. “The difficulty comes in when we have universities changing priorities, adjusting priorities, and doing a range of other things, some of them unethical, in order to improve in rankings.”
“There are many university presidents who have said they will do whatever it takes to improve their performance,” she adds. “It is the elevation of research over teaching.”
UCC president Michael Murphy argues that Irish universities have changed their view on rankings.
“I think we’ve grown up,” he says. “We recognise that what we need to do in Ireland must be determined by Irish societal priorities and not doing things because we want to compete with somebody in eastern China. I would be surprised if any of my colleagues or any of the Irish institutions are making strategic decisions based on rankings.”
Moving up or down in the rankings may seem of vital significance, but statistically, small variations can largely be assigned to changes in methodology rather than genuine improvements or declines in quality.
The top 100 constitutes 0.4 per cent of universities worldwide, so statistically there’s very little difference between a university at No 30 and one at No 40.
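As a rough sanity check, the 0.4 per cent figure quoted above implies a worldwide population of about 25,000 institutions (a back-of-envelope sketch based only on the article's own numbers):

```python
# Implied number of universities worldwide, if the top 100
# represents 0.4 per cent of all institutions (figure from the article).
top_n = 100
share = 0.004  # 0.4 per cent expressed as a fraction

total_universities = top_n / share
print(f"Implied universities worldwide: {total_universities:.0f}")  # → 25000
```

On that scale, the gap between No 30 and No 40 is a movement within a fraction of a percentile.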
As the ranking methodologies currently stand, it is unlikely Ireland will ever see a university within the top 20 in any ranking. This is partially because many of the factors that influence them are based on the amount of money a university has to spend.
“If you were to take any central message out of the rankings, it is that they are a proxy for investment in higher education, because most of the things – and particularly the quantitative elements in the rankings – are fundamentally influenced by resources,” said Ned Costello, chief executive of the Irish Universities Association.
In the case of Ireland, he says, “the big one is the student-staff ratio piece. If you’re a country like Ireland, where you’ve had rapidly rising student numbers combined with rapidly falling public investment in higher education, it’s a virtual impossibility to improve your standings.”
Comparing the funding of Irish universities with that of the highest-ranked in the world shows it would be impossible for Ireland to compete, even if funding returned to pre-economic-crisis levels.
This year’s No 1 university, according to QS, is the Massachusetts Institute of Technology, with €2.8 billion in annual funding. “Yes, you can have an elite Irish university, but you must close the other six and throw in another billion,” says UCC’s Prof Murphy.
The very idea that there are 100 “best” universities worldwide may also oversimplify a very complex question about the role of the institution in today’s society.
The lack of comparable information from universities across the world makes interrogating standards at a level deeper than research output extremely difficult. Is there a way to measure quality on a global scale, as the rankings purport to do?
“The simple answer is no,” says Prof Hazelkorn. “It is easier to do on a national level, because you’re all within the same socioeconomic national context. Your ability to collect a wider set of data is easier, and you’ve got more meaningful comparatives.”