Rankings are important for international students.
For all their faults and flaws, these league tables provide a crucial snapshot of a specific institution or country – vital information when seeking your dream study abroad university. Unlike domestic students, who have the luxury of visiting campuses or growing up with knowledge of a particular region, those coming from abroad usually do not.
As such, indicators like teaching quality and student satisfaction – popular measures of education quality – can be super helpful.
But things become less helpful when the internet is full of rankings built on various combinations of factors. There are at least ten official bodies that grade universities worldwide.
It can be hard to decide which rankings to rely on, or how to filter through the barrage of information they supply. You know things have got pretty nuts when there’s even a “Ranking of Rankings of World Universities”…
With that in mind, here we list the top four global rankings every international student should know, offering our explanation as to how and why they matter:
1. QS World University Rankings
Published by Quacquarelli Symonds (QS), this global ranking compares universities in four major areas – research, teaching, employability and international outlook – using six weighted indicators (a short sketch of how the weights combine follows the list):
- Academic reputation based on a global survey of academics – 40 percent;
- Employer reputation based on a global survey of graduate employers – 10 percent;
- Faculty/student ratio – 20 percent;
- Citations per faculty – 20 percent;
- International student ratio – 5 percent;
- International staff ratio – 5 percent
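To make the arithmetic concrete, here is a minimal sketch of how these six weights roll up into one overall score. The indicator scores below are invented for illustration and are assumed to already sit on a common 0–100 scale; only the weights come from the list above.

```python
# Hypothetical indicator scores for one university, each assumed to be
# pre-normalised to a 0-100 scale (an assumption for illustration only).
scores = {
    "academic_reputation":    88.0,
    "employer_reputation":    75.0,
    "faculty_student_ratio":  92.0,
    "citations_per_faculty":  80.0,
    "international_students": 60.0,
    "international_staff":    70.0,
}

# Weights taken from the QS indicator list above (they sum to 1.0).
weights = {
    "academic_reputation":    0.40,
    "employer_reputation":    0.10,
    "faculty_student_ratio":  0.20,
    "citations_per_faculty":  0.20,
    "international_students": 0.05,
    "international_staff":    0.05,
}

# The composite is simply the weight-adjusted sum of the indicator scores.
composite = sum(scores[k] * weights[k] for k in weights)
print(f"QS-style composite score: {composite:.1f}")  # 83.6 for these numbers
```

Note how the 40 percent weight on academic reputation means that single indicator contributes more to the total than employer reputation and both international ratios combined.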
It’s the only international ranking to have received International Ranking Expert Group (IREG) approval. The “IREG Approved” label means the QS rankings tackle the complex problem of comparing institutions across countries while staying as transparent and understandable as possible, including through a separate website that explains the methodology.
QS also claims that its rankings are designed with international students in mind, stating that its competitor, Times Higher Education, is largely a publication for academics.
But its methodology, which gives more weight (40 percent) to “academic reputation” than to any other factor, has come under fire. To measure this reputation, QS conducts what it calls “the world’s largest survey of academic opinion,” with responses from more than 70,000 contributors.
Some of these contributors are allegedly not academics at all, while others are reportedly university staff who were offered financial enticements to participate. In response, QS said it has weeded out non-academic responses and that no financial incentives were given to any survey participant, academic or employer.
2. Times Higher Education (THE) World University Rankings
The THE World University Rankings – described as “arguably the most influential” – is an annual publication covering more than 1,000 universities the world over. Last year’s edition ranked the highest-scoring institutions individually up to 200th place, then split them into bands of 50; beyond 400 they are grouped in hundreds, and beyond 600 in bands of 200.
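Based on that banding, a rough sketch of how a raw rank turns into the published label might look like this; the exact cut-offs are our reading of the description above, not THE’s official specification.

```python
def the_style_band(rank: int) -> str:
    """Convert an overall rank into a published label: individual ranks up to
    200, bands of 50 up to 400, bands of 100 up to 600, then bands of 200.
    Thresholds follow the description above and are illustrative only."""
    if rank <= 200:
        return str(rank)
    if rank <= 400:
        lower = 201 + ((rank - 201) // 50) * 50
        return f"{lower}-{lower + 49}"
    if rank <= 600:
        lower = 401 + ((rank - 401) // 100) * 100
        return f"{lower}-{lower + 99}"
    lower = 601 + ((rank - 601) // 200) * 200
    return f"{lower}-{lower + 199}"

for r in (37, 215, 480, 790):
    print(r, "->", the_style_band(r))  # 37, 201-250, 401-500, 601-800
```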
Its methodology has evolved over the years, but last year the British publication used five broad indicators (a note on what these weights mean in practice follows the list):
- Teaching (the learning environment) – 30 percent;
- Research (volume, income and reputation) – 30 percent;
- Citations (research influence) – 30 percent;
- International outlook (staff, students and research) – 7.5 percent;
- Industry income (knowledge transfer) – 2.5 percent
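Because the first three indicators each carry 30 percent, they dominate the overall score. A minimal sketch with invented indicator scores shows how much a ten-point improvement is worth under each weight:

```python
# THE-style weights from the list above; the indicator scores are invented.
weights = {"teaching": 0.30, "research": 0.30, "citations": 0.30,
           "international_outlook": 0.075, "industry_income": 0.025}
scores = {"teaching": 82.0, "research": 78.0, "citations": 85.0,
          "international_outlook": 70.0, "industry_income": 60.0}

overall = sum(scores[k] * weights[k] for k in weights)
print(f"overall score: {overall:.2f}")

# A 10-point gain on citations (weight 0.30) adds 3.0 to the overall score,
# while the same gain on industry income (weight 0.025) adds only 0.25.
for indicator in ("citations", "industry_income"):
    bumped = dict(scores, **{indicator: scores[indicator] + 10})
    delta = sum(bumped[k] * weights[k] for k in weights) - overall
    print(f"+10 on {indicator}: +{delta:.2f} overall")
```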
According to THE, these are the only global performance tables that judge research-intensive universities across all their core missions: teaching, research, knowledge transfer and international outlook. It also claims to “measure only what is measurable”.
However, the rankings have been criticised for not estimating or reporting any margin of error. As Stephen Curry wrote in The Guardian (a toy illustration of his point follows the quote):
“Errors are not mentioned once in the Times Higher’s detailed description of its methodology. Is Caltech’s table-topping score of 95.2 statistically significantly different from the score of 94.2 achieved by Oxford in the number 2 spot? No information is given, though it is a critical point to hold on to if we want to understand what the performance data mean.”
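Curry’s point is easy to demonstrate. The simulation below assumes, purely for illustration, that each published score carries a measurement error of about one point; under that made-up assumption, the institutions scoring 95.2 and 94.2 would swap places in roughly a quarter of re-measurements.

```python
import random

random.seed(42)
CALTECH, OXFORD = 95.2, 94.2  # the published scores quoted above
ERROR_SD = 1.0                # assumed measurement error - invented, not THE's figure

trials = 100_000
flips = sum(
    random.gauss(OXFORD, ERROR_SD) > random.gauss(CALTECH, ERROR_SD)
    for _ in range(trials)
)
print(f"Order flips in {flips / trials:.0%} of simulated re-measurements")
```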
Other previously reported drawbacks include the methodology’s focus on citations, which creates a bias towards universities where English is the main language, as well as those that teach and produce high-quality research in the ‘hard sciences’.
That said, we note that last year’s rankings were based on academic journals indexed by Elsevier’s Scopus database – the largest abstract and citation database of peer-reviewed literature, which includes the social sciences – covering all indexed publications between 2012 and 2016.
3. US News & World Report’s Best Global Universities
US News ranks 1,250 institutions from the US and more than 70 other countries based on 13 indicators, measuring their “academic research performance and their global and regional reputations” instead of separate undergraduate or graduate programs.
The pool of 1,295 universities used to rank the top 1,250 schools is created by first taking the top 250 universities from Clarivate Analytics’ global reputation survey, then adding the 1,385 institutions that met the minimum threshold of 1,500 papers published between 2011 and 2015.
This pool is then assessed against the following 13 indicators (a sketch of how such weighted indicators combine into one score follows the table):
| Ranking indicator | Weight |
| --- | --- |
| Global research reputation | 12.5% |
| Regional research reputation | 12.5% |
| Publications | 10% |
| Books | 2.5% |
| Conferences | 2.5% |
| Normalized citation impact | 10% |
| Total citations | 7.5% |
| Number of publications that are among the 10 percent most cited | 12.5% |
| Percentage of total publications that are among the 10 percent most cited | 10% |
| International collaboration | 5% |
| Percentage of total publications with international collaboration | 5% |
| Number of highly cited papers that are among the top 1 percent most cited in their respective field | 5% |
| Percentage of total publications that are among the top 1 percent most highly cited papers | 5% |
Source: US News & World Report
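Encoded as data, the table above makes it easy to sanity-check that the 13 weights really do total 100 percent and to see how a composite score would be assembled. The indicator scores here are placeholders, assumed (as our simplification, not a description of US News’ actual normalisation) to sit on a common 0–100 scale.

```python
# Weights (in percent) copied from the table above.
weights = {
    "global_research_reputation": 12.5,
    "regional_research_reputation": 12.5,
    "publications": 10.0,
    "books": 2.5,
    "conferences": 2.5,
    "normalized_citation_impact": 10.0,
    "total_citations": 7.5,
    "pubs_in_top_10pct_cited": 12.5,
    "pct_pubs_in_top_10pct_cited": 10.0,
    "international_collaboration": 5.0,
    "pct_pubs_with_intl_collaboration": 5.0,
    "papers_in_top_1pct_cited": 5.0,
    "pct_papers_in_top_1pct_cited": 5.0,
}
assert abs(sum(weights.values()) - 100.0) < 1e-9  # the 13 weights sum to 100%

# Placeholder indicator scores; only the citation-impact score is varied here.
scores = {name: 70.0 for name in weights}
scores["normalized_citation_impact"] = 90.0

composite = sum(scores[k] * weights[k] for k in weights) / 100.0
print(f"US News-style composite: {composite:.1f}")  # 72.0 with these placeholders
```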
Several criticisms have been leveled against US News and its methodology. Among the most damning is an op-ed in The Atlantic, which argues that the quality of education at each institution is not accounted for, nor are graduate outcomes, i.e. what students actually learn at College X and how employable they are after graduation.
Then there’s the issue of the “reputational” measure, which is based on the most recent five years of the Academic Reputation Survey; this is the only ranking to use such a measure. Critics have said it turns this component of the rankings into an empty exercise or a popularity contest, since it rests on asking college officials to rate the merits of other schools – institutions they may know nothing about.
4. Academic Ranking of World Universities (Shanghai Ranking)
The only ranking on this list to originate from Asia, this annual publication by Shanghai Ranking Consultancy is often praised for the objectivity, stability and transparency of its methodology.
These are the four criteria used to rank the 800 institutions featured in its 2017 edition (a sketch of the weighting, including the footnoted N&S rule, follows the table):
| Criteria | Indicator | Weight |
| --- | --- | --- |
| Quality of Education | Alumni of an institution winning Nobel Prizes and Fields Medals | 10% |
| Quality of Faculty | Staff of an institution winning Nobel Prizes and Fields Medals | 20% |
| | Highly cited researchers in 21 broad subject categories | 20% |
| Research Output | Papers published in Nature and Science* | 20% |
| | Papers indexed in Science Citation Index-Expanded and Social Science Citation Index | 20% |
| Per Capita Performance | Per capita academic performance of an institution | 10% |
| Total | | 100% |
* For institutions specialising in the humanities and social sciences, such as the London School of Economics, N&S is not considered, and its weight is redistributed to the other indicators.
Source: ShanghaiRanking.com
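The footnote about specialised institutions is essentially a re-weighting rule. Below is a minimal sketch using the weights from the table and invented indicator scores, with the redistribution of the N&S weight handled proportionally (our interpretation of “relocated to other indicators”).

```python
# ARWU-style weights from the table above, expressed as fractions of 1.0.
WEIGHTS = {
    "alumni_awards": 0.10,
    "staff_awards": 0.20,
    "highly_cited_researchers": 0.20,
    "nature_science_papers": 0.20,   # dropped for specialised institutions
    "indexed_papers": 0.20,
    "per_capita_performance": 0.10,
}

def arwu_style_score(scores, ns_applies=True):
    """Weighted sum of indicator scores; when N&S does not apply, its 20%
    weight is spread across the remaining indicators in proportion to their
    existing weights (our assumption about how the redistribution works)."""
    weights = dict(WEIGHTS)
    if not ns_applies:
        removed = weights.pop("nature_science_papers")
        remaining = sum(weights.values())
        weights = {k: v + removed * v / remaining for k, v in weights.items()}
    return sum(scores[k] * weights.get(k, 0.0) for k in scores)

# Invented indicator scores for a single hypothetical institution.
scores = {"alumni_awards": 20.0, "staff_awards": 15.0,
          "highly_cited_researchers": 40.0, "nature_science_papers": 30.0,
          "indexed_papers": 70.0, "per_capita_performance": 35.0}
print(f"standard weighting:    {arwu_style_score(scores):.1f}")
print(f"specialised weighting: {arwu_style_score(scores, ns_applies=False):.1f}")
```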
Described by The Telegraph in 2015 as “a remarkably stable list” based on “long-term factors” (e.g. the number of Nobel Prize winners and the number of articles published in the journals Nature and Science), it has also been praised for its lack of bias towards Asian institutions despite originating in China.
In France, however, it is a source of annual controversy, criticised for the disproportionate weight it gives to research mostly done decades ago, the way it is used to push universities to scale up, and its failure to account for the particularities of the French academic system.
The ranking has received further flak because its metrics fail to take university size into account: a bigger university will always end up with more publications and award winners, regardless of its research (or teaching) quality.
Ultimately, the best ranking for you will depend on your own wants and needs. Don’t take the results of any ranking at face value. As shown above, each has its drawbacks, so it’s wise to dissect each methodology carefully to understand how a university ended up in a particular position.
For a deeper understanding of common ranking indicators, check out our articles on university reputation and student satisfaction.
Liked this? Then you’ll love these…
Rankings Explained: Should you focus on university reputation or course rankings?
Rankings Explained: Why you should look at student satisfaction rates