Bulletin 001 - THE and QS BRICS 2019
metricas.edu
This first issue of our bulletin brings two in-depth analyses, occasioned by the release of the QS BRICS and THE rankings. The most significant finding is a slight drop in the positions of the São Paulo institutions in the THE ranking, mostly due to methodological changes and the incorporation of new institutions into its base. The QS BRICS also brings a set of challenges, translated here into advice on how to improve performance by increasing attention to a set of secondary indicators that can help improve the general performance of the São Paulo public institutions.
Times Higher Education 2019 Key Findings for São Paulo State Universities
Results analysis
The headline that will be reported is the “fall” of UNESP into the 801-1000 group. This looks precipitous, as it could be cynically misrepresented as a fall of hundreds of places in absolute position. In reality, the margins of uncertainty this low in the ranking mean that the change in score is not statistically significant (an overall score band of 21.5-30.6 falling to 19.0-25.9). When this is broken down into individual indicators, we can see that UNESP improved in its teaching assessment by 3.4 points, dropped by 1.7 in research, and improved by 2 points in citations, 2.3 in industry income and 2.9 in international outlook. This means that, relative to the sample mean institution, UNESP actually improved its performance despite dropping down a group.
Similarly, USP improved in all indicators compared to the 2018 ranking, with the exception of the “research” indicator, as did Unicamp. The fact that the research score fell for all three universities, by a similar value, deserves attention. The indicator is made up of research reputation, the number of papers indexed in Scopus per FTE member of staff (“productivity”) and research income. Given that all teaching scores improved, the reputation survey is unlikely to be the culprit for this fall, as “research reputation” and “teaching reputation” are very heavily cross-pollinated (it is doubtful whether these should be counted as two separate indicators). Given that internal university records suggest there has been no fall in staff productivity (and that the count is taken as a rolling five-year mean), this leaves “research income” as the indicator most likely to have caused the fall. Research income is a highly contentious indicator; measurements of inputs should be used for categorising and profiling universities, not for evaluating them. It is also worth pointing out that the majority of this fall will be due to falling federal resources, something outside the control of the public universities. In effect, the universities are being evaluated on the basis of the national economic context rather than on any form of objective performance.
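To make the aggregation explicit: the overall THE score is a weighted mix of five pillar scores, and each pillar is itself a weighted mix of sub-indicators. The minimal sketch below uses the weights from the published THE 2019 methodology, but the indicator scores are invented for illustration; it shows how a fall in research income alone can pull down the whole research pillar even when reputation and productivity are flat.

```python
# Sketch: THE-style two-level aggregation. Weights follow the published THE
# 2019 methodology; all indicator scores below are illustrative, not real data.

THE_PILLAR_WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

# Within the research pillar (as shares of the pillar itself): reputation 60%,
# research income 20%, productivity (papers per staff) 20%.
RESEARCH_SUB_WEIGHTS = {"reputation": 0.60, "income": 0.20, "productivity": 0.20}

def research_pillar(sub_scores: dict) -> float:
    """Weighted mix of the research sub-indicators (0-100 scale)."""
    return sum(RESEARCH_SUB_WEIGHTS[k] * v for k, v in sub_scores.items())

def overall(pillars: dict) -> float:
    """Weighted mix of the five pillar scores (0-100 scale)."""
    return sum(THE_PILLAR_WEIGHTS[k] * v for k, v in pillars.items())

# A fall in research income alone drags the whole pillar down.
before = research_pillar({"reputation": 50.0, "income": 40.0, "productivity": 45.0})
after = research_pillar({"reputation": 50.0, "income": 31.5, "productivity": 45.0})
print(f"research pillar: {before:.1f} -> {after:.1f}")  # 47.0 -> 45.3
```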
Areas for improvement
While progress is being made in the “citations” indicator, performance remains far behind the global mean. This ranking’s dependence on the Scopus index, however, actually counts against the universities in this assessment. Scopus’ coverage of foreign-language publications is 30-40% larger than Web of Science’s, which would be advantageous if the number of papers were counted in any significant way. Instead, the universities’ contribution to local knowledge is counted together with their international contributions, meaning that the universities are at a disadvantage when compared to universities from English-speaking countries. This should be contrasted with the approach used by the CWTS Leiden Ranking, which uses Web of Science and only English-language publications, diminishing a large amount of this effect.
Specificity of the Times Higher Education for Brazilian Universities
As previously established, the Times Higher Education ranking is particularly poorly oriented towards the evaluation of Latin American public universities, attending to few of the universities’ specific institutional priorities. The British ranking is heavily size-normalised, reflecting a desire to represent the university environment, and not its contribution to society or scientific discourse per se. It is consistently the ranking where Brazilian public universities perform worst, as the set of indicators is geared towards representing smaller, well-financed, research-intensive universities with high numbers of postgraduates. Given the range, diversity and reach of the São Paulo state universities, and the constant societal demand for them to expand undergraduate and continuing education in a scenario where their ability to hire new academic staff is limited by financial constraints, many of the 13 indicators are not of high priority to the universities.
Given that the Times Higher does not simply aggregate indicators to reach its final score, like the ARWU or QS, but also aggregates at the level of the published indicators (each published pillar score is itself a combination of sub-indicators, as in the sketch above), extracting meaningful information from this ranking is difficult, and largely speculative.
QS BRICS 2019 Analysis
USP
USP’s position in the BRICS ranking is one place lower than in 2018, having been overtaken by Sun Yat-sen University in China, which, while not a C9 university, is one of China’s elite institutions, and now benefits from the Double First Class initiative. It also has China’s largest public university system attached to it, and sits in relatively close proximity to Hong Kong’s world-class universities (not counted in this ranking). USP scores full points in both reputation surveys, showing that the university continues to be extremely highly regarded in comparison to other universities in the developing world, both by academic peers and by employers.
While the faculty-student ratio score is lower than the average of the sample presented here, it should be remembered that this ranking takes into account a very wide range of institutional types, including many small institutions such as the Indian Institutes of Technology, or SUSTech (ranked first for this indicator), which is the Chinese government’s experimental platform for higher education reform rather than a full university in its own right. The large difference in this indicator between USP and the elite Chinese and Russian institutions is nevertheless notable: Tsinghua University has 8.8 students per member of academic staff (99.1 points), Lomonosov has 9.4 (100 points) and Zhejiang University has 13.6 (78.3 points), compared to USP’s 16.1.
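Since QS’s published points are z-scores of the raw values rescaled against the full sample, the raw ratios above do not map linearly onto points. A minimal sketch of that style of normalisation follows; the four-university sample and the rescaling constants are assumptions for display only, so the resulting points are illustrative and will not reproduce QS’s published scores.

```python
# Sketch: z-score style normalisation of staff-student ratios. The sample here
# is only the four universities cited above, so points are illustrative only.
from statistics import mean, pstdev

students_per_staff = {"Tsinghua": 8.8, "Lomonosov": 9.4, "Zhejiang": 13.6, "USP": 16.1}

mu = mean(students_per_staff.values())
sigma = pstdev(students_per_staff.values())

for uni, ratio in students_per_staff.items():
    z = (mu - ratio) / sigma  # lower ratio (smaller classes) scores higher
    points = max(0.0, min(100.0, 50 + 25 * z))  # arbitrary rescaling for display
    print(f"{uni:10s} ratio={ratio:5.1f} z={z:+.2f} points~{points:5.1f}")
```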
The dramatic fall in the citation score last year was due to a combination of the expanding reach of the Scopus index in the Portuguese language and the better insertion of small, research-intensive universities with lower publication counts, and therefore typically higher average citations. To some extent, this position has been regained this year, most likely due to changes in the country normalisation. The score is still 20 points down on two years ago, and therefore USP can be said to be falling behind its Chinese counterparts, even if the drop suggested last year was probably exaggerated. The indicator carries only a 5% weight, however, and therefore does not have a strong impact on the overall results.
The rise in the international faculty score is promising, climbing 7.7 points. This is in part due to other universities being under-internationalised. However, this improved performance should be considered in conjunction with the reduction in the score for international students (by 2 points). That decline is due to China’s and Russia’s active recruitment of full-time international students: Russia particularly from the old Soviet bloc, and China worldwide. Together, these observations suggest that even though Brazil cannot necessarily compete with the incentive packages offered by Chinese and Russian institutions, where international recruitment has become a specific institutional goal incentivised by excellence initiatives, the universities are still managing to attract international staff on the basis of the strong research opportunities on offer.
Institution | Year | Position | Score | Academic Rep 30% | Employer Rep 20% | Fac. Student 20% | Intl.Students 2.5% | Cit. per paper 5% | Intl. Faculty 2.5% | Papers per fac. 10% | Fac. with PhD 10% |
---|---|---|---|---|---|---|---|---|---|---|---|
USP | 2019 | 14 | 82.5 | 100 | 100 | 38.3 | 26.2 | 57.9 | 51.6 | 89.4 | 100 |
USP | 2018 | 13 | 81 | 100 | 100 | 33.3 | 28.2 | 39.9 | 43.9 | 85.6 | 100 |
USP | 2017 | 10 | 87 | 100 | 100 | 77.4 | 43 | 94.6 | 100 | ||
Unicamp | 2019 | 16 | 82 | 99.9 | 97.8 | 36.5 | 32.2 | 63.7 | 59.1 | 86.5 | 100 |
Unicamp | 2018 | 12 | 81.1 | 99.7 | 99.5 | 31.9 | 31.6 | 43.8 | 43.9 | 87.2 | 100 |
Unicamp | 2017 | 12 | 85.2 | 99.9 | 99.6 | 83.9 | 45.4 | 95.3 | 100 | ||
UNESP | 2019 | 29 | 72.5 | 87.6 | 74.3 | 52.4 | 32 | 42.4 | 36.1 | 61.2 | 100 |
UNESP | 2018 | 34 | 67.8 | 82.3 | 82.3 | 39 | 16.8 | 30.5 | 27.3 | 58 | 100 |
UNESP | 2017 | 36 | 72 | 85.7 | 72.6 | 57.1 | 62.6 | 100 |
Table compiled by the author
Unicamp
Unicamp has fallen four places in the ranking this year, remaining in the top 20 but now behind the Universidade de São Paulo, Sun Yat-sen University, Wuhan University and St Petersburg State University. Unlike USP’s fall last year, all four of the universities that now rank higher than Unicamp are comparable institutions: large, flagship, research-intensive public universities. Unicamp was particularly affected by the drop in citations in the 2018 edition; although it regained 20 points this year, the score is still 20 points lower than it was in 2017. This is due in small part to failing to keep pace with Chinese advances, and in part to the widening coverage of Portuguese-language production. Unicamp has maintained its scores in reputation, reflecting that, despite being a relatively small university, it enjoys a high level of international prestige compared to other universities in BRICS countries. The increase in the score for international faculty is promising, showing that Unicamp is not only managing to retain its international staff, but attracting new international staff at a relatively fast rate when compared to Chinese and Russian universities, which have much stronger financial incentives and specific programmes to attract international researchers.
In this case, it is difficult to say that Unicamp’s performance has declined in the past year; rather, other universities have grown at a faster rate, as evidenced by the fact that Unicamp’s overall score actually increased. This suggests that the universities at the top of this ranking are growing faster than the rest, and therefore, while Unicamp is improving faster than the latter, it is in danger of being left behind by the former.
Institution | Year | Pos | Score | Ac Rep | Em Rep | Fac. Student | Int Students | Cit | Int Fac | Paper | Fac PhD |
---|---|---|---|---|---|---|---|---|---|---|---|
Sun Yat-sen | 2019 | 13 | 83.5 | 94 | 96 | 58.2 | 55.7 | 97.8 | 89 | 53.6 | 95.3 |
Sun Yat-sen | 2018 | 16 | 78.7 | 90.7 | 92.1 | 41.8 | 55.4 | 77.3 | 92.1 | 57.2 | 94.6 |
St Petersburg | 2019 | 11 | 84.2 | 94.6 | 91.3 | 99.7 | 96.7 | 24.7 | 31.3 | 33.7 | 87.6 |
St Petersburg | 2018 | 13 | 81 | 92.3 | 84.6 | 100 | 94.8 | 17.2 | 24.9 | 79.3 | |
Wuhan | 2019 | 15 | 82.1 | 95.6 | 97.7 | 34.1 | 68.5 | 90.4 | 100 | 80 | 93 |
Wuhan | 2018 | 15 | 79 | 93 | 94.2 | 31.8 | 69.6 | 62.9 | 99.5 | 77.9 | 87.1 |
Sun Yat-sen advanced 3.3 points in academic reputation and 3.9 points in employer reputation, as well as a huge 20.5 points in citations; it is also managing to attract international staff better than Unicamp, with smaller class sizes. Similar growth was seen at Wuhan University in both reputation indicators, along with a gain of 27.5 points in citations. It seems that the rapid growth of the second line of Chinese universities (after Tsinghua, Peking and Zhejiang) means they are now significantly outperforming the São Paulo state universities in research performance. The other university, St Petersburg State, relies mainly on the fact that it has a large reputation, small class sizes and large numbers of foreign students from the Russian-speaking world. In terms of research performance, the Russian universities are even further behind the Chinese universities than the Brazilian ones. St Petersburg’s higher position than both USP and Unicamp is largely due to circumstantial environmental factors, not performance indicators.
UNESP
UNESP is the only one of the three universities to have consistently climbed in position year on year, breaking into the top 30 in the 2019 edition. It gained 5.3 points in academic reputation (1.9 since 2017). In employer reputation, while there was a gain of 9.7 points in 2018, the score fell by 8 points this year, meaning that over the period UNESP’s employer reputation is still 1.7 points better than it was two years ago. UNESP continues, in a global sense, to be well respected by its academic peers, but suffers from a lack of profile among employers compared to USP and Unicamp.
UNESP’s score for international students increased by 15.2 points, to the same level as Unicamp and significantly above USP. The faculty-student ratio score fell by 18.1 points in 2018, almost certainly due to the inclusion of smaller universities and research institutes with small student bodies; this year, the university recovered 13.4 points on the scale. UNESP still presents in this ranking as having much smaller classes than either USP or Unicamp. The international faculty score has grown markedly over the two years (18.1 points in total), showing that the university’s internationalisation efforts are gaining broader recognition.
UNESP’s score in citations per paper is closing the gap on USP and Unicamp, gaining 11.9 points on 2018’s score. As UNESP’s score was not recorded for 2017, it is impossible to assess whether the fall experienced by USP and Unicamp would also have occurred for UNESP.
Key Indicators
Based on the key indicators identified in Annex II of the book Repensar a Universidade, the following indicators are essential for performance in this ranking (a sketch of how the calculation-heavy ones can be computed follows the list).
- Field normalised Citation Impact (FNCI) over five years – “Citations per Paper”
- Number of full time equivalent international staff (visiting professors excluded) – “International Faculty”
- Number of full time equivalent foreign-born students (excluding exchange) – “International Students”
- Number of Papers indexed on Scopus (five years) – “Papers per Faculty”
- Number of full time equivalent active members of academic staff – “Faculty Student Ratio”, “International Staff Ratio”, “Staff with PhD”, “Papers per Faculty”
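As a rough illustration of how the first and fourth indicators above are computed, here is a minimal sketch under the usual definitions: field-normalised citation impact as the mean ratio of a paper’s citations to the world average for its field and year, and papers per faculty as the five-year indexed paper count over FTE academic staff. The paper records, baseline table and staff numbers are hypothetical.

```python
# Sketch of FNCI ("Citations per Paper") and "Papers per Faculty".
# All input data below is hypothetical.
from dataclasses import dataclass

@dataclass
class Paper:
    field: str
    year: int
    citations: int

# Hypothetical world-average citations per (field, year) baseline.
WORLD_BASELINE = {("medicine", 2016): 9.2, ("physics", 2017): 6.4}

def fnci(papers: list[Paper]) -> float:
    """Mean of citations / world baseline across papers."""
    ratios = [p.citations / WORLD_BASELINE[(p.field, p.year)] for p in papers]
    return sum(ratios) / len(ratios)

def papers_per_faculty(paper_count_5y: int, fte_staff: float) -> float:
    """Five-year indexed paper count divided by FTE academic staff."""
    return paper_count_5y / fte_staff

papers = [Paper("medicine", 2016, 14), Paper("physics", 2017, 3)]
print(f"FNCI = {fnci(papers):.2f}")                               # (14/9.2 + 3/6.4)/2, ~1.0
print(f"papers/faculty = {papers_per_faculty(28000, 5600):.1f}")  # 5.0
```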
Secondary indicators
These indicators are not directly measured by the ranking, but can be suggestive of improved performance within it. The universities could consider monitoring the following (a monitoring sketch for the first two items follows the list):
- Number of papers in the top 10% by field (“citations per paper”)
- Number of papers in the top 1% by field (“citations per paper”)
- Number of backlinks to university websites (“Academic Reputation”, “Employer Reputation”)
- Number of articles coauthored with industry (“Employer Reputation”)
- Altmetric Impact (“Academic Reputation”, “Employer Reputation”)
- Students placed in internships with international companies – “Employer Reputation”
- Graduates working in international companies in their field – “Employer Reputation”
- Activity in global research and university networks (“Academic Reputation”)
- Number of visiting professors and university visits (“Academic Reputation”)
- Inward and outbound exchange students (“Academic Reputation”)
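For the first two secondary indicators, a minimal monitoring sketch follows. The citation thresholds per field and year would in practice come from a bibliometric supplier’s export (e.g. InCites or SciVal); all values below are hypothetical.

```python
# Sketch: counting papers in the field top 10% / top 1% most cited.
# Hypothetical thresholds: (field, year) -> (top-10% cutoff, top-1% cutoff).
THRESHOLDS = {("medicine", 2016): (25, 110), ("physics", 2017): (18, 85)}

papers = [
    {"field": "medicine", "year": 2016, "citations": 140},
    {"field": "medicine", "year": 2016, "citations": 30},
    {"field": "physics", "year": 2017, "citations": 4},
]

top10 = sum(1 for p in papers
            if p["citations"] >= THRESHOLDS[(p["field"], p["year"])][0])
top1 = sum(1 for p in papers
           if p["citations"] >= THRESHOLDS[(p["field"], p["year"])][1])
print(f"top 10%: {top10}/{len(papers)}, top 1%: {top1}/{len(papers)}")  # 2/3, 1/3
```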
Areas for improvement
Given the high volatility of some indicators in this ranking, it would appear that the data drawn from Scopus is not the same each year. In this sense, continuing efforts to implement universal use of ORCID identifiers for researchers, and GRID identifiers for the institution, would ensure that the university receives full credit for what it publishes. The Chinese universities appear to be benefitting from the Double First Class initiative, focused on the promotion of strategic areas of excellence within universities, meaning that second-tier Chinese universities are now becoming competitive with Brazilian institutions. This approach may raise the overall quality of the science produced.
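One way to monitor whether the university is receiving full credit in Scopus is to audit yearly indexed paper counts by affiliation ID against internal records. The sketch below uses the Scopus Search API; the API key and affiliation ID are placeholders, and the query syntax and response fields should be verified against Elsevier’s current documentation.

```python
# Sketch: auditing yearly Scopus-indexed output by affiliation ID.
# API key and affiliation ID are placeholders, not real credentials.
import requests

API_KEY = "YOUR-ELSEVIER-API-KEY"  # placeholder
AF_ID = "60000000"                 # hypothetical Scopus affiliation ID

def indexed_papers(year: int) -> int:
    """Total Scopus records for the affiliation in a given publication year."""
    resp = requests.get(
        "https://api.elsevier.com/content/search/scopus",
        params={"query": f"AF-ID({AF_ID}) AND PUBYEAR IS {year}", "count": 1},
        headers={"X-ELS-APIKey": API_KEY},
    )
    resp.raise_for_status()
    return int(resp.json()["search-results"]["opensearch:totalResults"])

for year in range(2014, 2019):
    print(year, indexed_papers(year))
```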
To improve the reputation and visibility of its research, the universities should consider running bilingual announcements of research achievements, rather than Portuguese-only ones, and tracing where they are shared and consumed across the internet (both in traditional academic sources and in social media such as LinkedIn). This is especially applicable to the category of Employer Reputation, where knowledge of the sector is less specialised and decision makers are often influenced by short articles rather than academic texts. This type of activity enhances the global perception of the research produced within the university, and should therefore be measured and monitored.
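A minimal sketch of such monitoring follows, using Altmetric’s public per-DOI endpoint as one possible source of online attention data; the response field names are assumptions that should be checked against the current Altmetric API documentation.

```python
# Sketch: per-DOI online attention via Altmetric's public endpoint
# (free tier, rate limited; returns 404 if no attention is recorded).
from typing import Optional
import requests

def attention(doi: str) -> Optional[dict]:
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if resp.status_code == 404:  # no recorded attention for this DOI
        return None
    resp.raise_for_status()
    data = resp.json()
    return {
        "score": data.get("score"),
        "posts": data.get("cited_by_posts_count"),
        "news": data.get("cited_by_msm_count"),
    }

print(attention("10.1038/nature12373"))  # example DOI from Altmetric's docs
```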