Compare colleges

Which has the highest dropout rate, the best job prospects, the liveliest social mix? Find out with our interactive graphics


Higher education has been crying out for performance indicators, but authorities in Ireland have been wary of the distorting effects of global rankings. The three big international indices – Shanghai, Times Higher Education and QS – all measure different things. One focuses on reputation among businesspeople, for example, while another concentrates on the scale of published research in the sciences.

And the criteria used for each index change regularly, along with the pool of institutions examined. It makes for a misleading picture, says Muiris O'Connor, head of policy at the Higher Education Authority. "We looked at how the [world] rankings of TCD and UCD were changing. They both trundled down the tables on the back of a declining reputation. But it was 'Ireland in bailout mode', Ireland's reputation in a global sense, that carried them down.

"Research output and citations in that period increased, but it did not reflect in the table or counterbalance the damage to national reputation. That proved to us how fickle international tables are and how we need to own our own ambitions in higher education."

The HEA's first step towards this was published recently in the form of a detailed profile of each university, college and institute of technology. Its title, Towards a Performance Evaluation Framework, demonstrates the authority's nervousness about getting into rankings. But the profiling template does allow people, for the first time, to compare Irish institutions across a range of headings.

The HEA summarises these statistics in a detailed graphic for each institution. The Irish Times has here used the same data under various selected headings, for ease of comparison. (The HEA's full report is on hea.ie.)


It shouldn’t be confused with a league table, but it does demonstrate the potential for people to make judgments about quality and performance across the sector. It is worth noting that the HEA has collated this data for the academic year 2010-11 in most cases.

The value of such data will increase as annual changes are recorded and trends detected. The HEA argues that you get a truer picture of what’s going on by making comparisons under the headings that you value most rather than by looking at a single ranking that weighs a multitude of factors under a somewhat arbitrary formula.

Ellen Hazelkorn, director of research at Dublin Institute of Technology and rankings consultant for the OECD, says that with all such indices "there is no objective measure of quality. All indicators come with biases and the prejudices of the people who put them together."

Rankings “have awakened us to the fact that we lived in a global world” and have “focused everyone’s attention on issues around quality”, she says.

But there are concerns about “the overattention on research” in rankings. “It has undermined other roles of higher education.”

Hazelkorn says the Danish authorities were talking to her about designing their own ranking system and suggested using salaries of graduates as an indicator. “Give me a break! Salaries to some extent measure employability, but salaries are a function of the market.”