Last week, Times Higher Education released its World University Rankings for 2018. Malaysian universities didn’t fare too well there.
The best performance came from Universiti Malaya (UM), and that was in the 351-400 band. The country’s other four research universities – Universiti Sains Malaysia (USM), Universiti Kebangsaan Malaysia (UKM), Universiti Putra Malaysia (UPM) and Universiti Teknologi Malaysia (UTM) – only managed to make the 601-800 band.
It’s in stark contrast with the 2018 QS World University Rankings, where all of those universities placed in the top 300 – UM even rose to 114th, a few hundred places above its THE 2018 band.
Add in the US News and World Report’s Best Global University Rankings and the Shanghai Jiao Tong Academic Ranking of World Universities (ARWU) and we start to see even more variety between the rankings, as shown in the chart below:
University | QS 2018 | THE 2018 | US News 2017 | Shanghai ARWU
---|---|---|---|---
Universiti Malaya | 114 | 351-400 | 356 | 401-500
Universiti Sains Malaysia | 264 | 601-800 | 576 | 401-500
Universiti Kebangsaan Malaysia | 230 | 601-800 | 783 | –
Universiti Putra Malaysia | 229 | 601-800 | 670 | –
Universiti Teknologi Malaysia | 253 | 601-800 | 639 | –
Source: Penang Institute
It’s a crowded and confusing landscape. As ranking outlets try to outdo one another with ever more rankings and spin-offs, students are left deciphering yet more lists on top of the four above, without knowing what those numbers really represent.
Yet, they still rely on the rankings. In the 2017 International Student Survey (ISS), education consultant Hobsons found that a considerable portion of prospective international students say rankings are the most important factor when choosing a country or university to study in.
In a ranking of rankings, Hobsons even found QS to be the most popular, followed by THE, ARWU and US News.
But Ong Kian Ming, head of Penang Institute in Kuala Lumpur, cautions students against using any single ranking as the sole determinant when choosing a university.
The methodologies behind these rankings matter, and each has its own set of strengths and flaws, as detailed in the Penang Institute’s report titled “An unhealthy obsession with Global University Rankings?”
For example, Malaysia’s strong showing in QS 2018 comes down to how little weight QS places on research- and citation-based measures.
“The QS ranking only allocates 20 percent of its overall score to research and citation measures. In comparison, 60 percent of THE, 70 percent of the Shanghai ARWU and 75 percent of the US News rankings are allocated to research and citation measures,” Ong wrote in a statement.
On the other hand, QS bases half of its overall score on academic reputation and employer reputation, both subjective measures that universities can manipulate.
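To see why these weightings matter, here is a toy sketch with made-up component scores and a simplified two-factor split (it only loosely mirrors the real methodologies): the same pair of universities can swap places once the balance shifts from reputation towards research and citations.

```python
# Toy example with invented scores: how a reputation-heavy versus a
# research-heavy weighting can reorder the same two universities.

def composite(scores, weights):
    """Weighted sum of component scores (each out of 100)."""
    return sum(scores[k] * w for k, w in weights.items())

# Hypothetical component scores out of 100
uni_a = {"research": 85, "reputation": 55}   # strong research, modest reputation
uni_b = {"research": 60, "reputation": 80}   # weaker research, strong reputation

# Simplified weighting splits, loosely in the spirit of QS (reputation-heavy)
# and THE (research-heavy); the real methodologies have more components.
weightings = {
    "reputation-heavy": {"research": 0.2, "reputation": 0.8},
    "research-heavy":   {"research": 0.6, "reputation": 0.4},
}

for name, weights in weightings.items():
    a, b = composite(uni_a, weights), composite(uni_b, weights)
    leader = "A" if a > b else "B"
    print(f"{name}: A={a:.0f}, B={b:.0f} -> University {leader} comes out on top")
```

Under the reputation-heavy split, University B comes out ahead; under the research-heavy split, the order flips. Neither result is “wrong”, which is exactly why a single rank number says little on its own.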
Look closer at how the Shanghai ARWU calculates its rankings and you will find that far too much weight is placed on the sciences and on universities with past winners of Nobel Prizes and Fields Medals – hardly the best way to measure a university’s research output.
Ranking systems can be hijacked too. Elizabeth Perry, a professor at Harvard and an expert on China, said the Chinese are actively “gaming” the system, as seen in the country’s strong showing in this year’s THE rankings.
“They are hiring an army of postdocs whose responsibility is to produce articles,” Perry told the Wall Street Journal. “They are changing the nature of a university from an educational institution to basically a factory that is producing what these rankings reward.”
Still, these shortcomings have stopped neither universities from advertising their ranks as proof of quality nor the Malaysian government from claiming credit that has yet to be shown to be due.
Higher Education Minister Idris Jusoh said of QS this June: “Since the establishment of research university 10 years ago, the quality of the universities have improved and show potential to rival other top universities in the world.”
The minister also used the listing of Malaysian universities in THE as proof that local universities are “soaring upwards” and expects to see the country making it into THE‘s Top 20 by 2050.
The Malaysian Education Blueprint 2015-2025, however, takes a more neutral stance on rankings, treating them as just “one of many measures the Ministry monitors as it works with Higher Learning Institutes to raise student and institutional outcomes”.
So did the Minister in a 2015 opinion piece titled “What it means to be world class”, where he wrote that university rankings are not the “all and be all”, as they sometimes fail to “capture the more subtle values of higher education, such as prioritising access over outcomes, teaching over research and publications, building infrastructure or the capacity of young lecturers and so forth”.
It’s a position closer to Ong’s, who argues that the Malaysian Ministry of Higher Education should not be overly obsessed with rankings, nor fool itself into thinking they signal improvement at local universities when other indicators suggest otherwise.
And neither should we.