Technical Note 2020 – Unesp and UFABC
The Times Higher Education (THE) Young University Rankings use the same methodology and indicators as the main THE World University Rankings, with the same pillar weightings (although the reputation surveys carry somewhat less weight for young institutions). Thirty percent of the ranking considers citations per paper, normalised by field and publication year (FWCI). Another 30% goes to teaching, a composite indicator covering reputation, staff-to-student ratio, proportion of postgraduates and institutional income. The same weight is given to a composite research indicator of reputation survey, research productivity (papers per member of academic staff) and competitive research income. Internationalisation makes up 7.5% of the score: 2.5% each for the proportion of publications with international co-authors, the proportion of international students and the proportion of international staff. The remaining 2.5% is for income derived from industry and other non-academic partnerships.
The ranking uses the Scopus database, and the current reporting period covers the years 2014–2018 for publications and citations. All financial indicators are adjusted for purchasing power parity (PPP), and all institutional sizes are reported as full-time equivalents (FTE). Indicator scores are standardised as z-scores against the sample, so a score of 50 represents the sample mean.
THE defines a young university as one established less than 50 years previously. This includes mergers, so the list includes institutions like Paris Sorbonne, technically established in 2015 but formed from parts of the Université de Paris, itself founded in 1170.
Over the four years presented in this report, the sample has grown notably: this edition counts 414 universities, compared with 351 last year and 250 the year before that. New entrants usually appear towards the bottom of a ranking rather than at the top, which pulls down the sample mean against which indicator scores are standardised. We would therefore expect a university’s indicator scores to rise even if its underlying performance remained the same. This significantly complicates the interpretation of the ranking for institutional purposes.
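As a minimal illustration of this dilution effect (all sample values below are invented for the example, not taken from THE data), the same raw indicator value translates into a higher standardised score once lower-performing entrants join the sample:

```python
import statistics
from math import erf, sqrt

def scaled_score(value, sample):
    """Map a raw indicator value to a 0-100 scale via a z-score against the
    sample mean and standard deviation, then the cumulative normal so that the
    sample mean lands at 50 (one plausible reading of the methodology; the
    exact THE transformation may differ)."""
    z = (value - statistics.mean(sample)) / statistics.stdev(sample)
    return 100 * 0.5 * (1 + erf(z / sqrt(2)))

old_sample = [40, 45, 50, 55, 60, 65, 70]        # hypothetical earlier sample
new_sample = old_sample + [20, 22, 25, 28, 30]   # lower-scoring new entrants join

print(round(scaled_score(60, old_sample)))  # ~68
print(round(scaled_score(60, new_sample)))  # ~84: same raw value, higher score
```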
Unesp performance
Year | Position | Overall | Citations | Industry income | International outlook | Research | Teaching |
2020 | 201–250 | 28.6–32.7 | 16.8 | 36.9 | 25.1 | 31.9 | 43.4 |
2019 | 201–250 | 24.9–30.3 | 14.7 | 35.4 | 25.1 | 27.5 | 47.0 |
2018 | 151–200 | 25.7–32.6 | 12.7 | 33.1 | 22.2 | 29.7 | 42.3 |
2017 | 151–200 | 21.9–28.4 | 9.2 | 34.5 | 18.8 | 24.8 | 36.5 |
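As a rough consistency check, applying the 30/30/30/7.5/2.5 weights described in the methodology section to Unesp’s 2020 indicator scores gives a value inside the published overall band. This assumes the pillar scores aggregate as a simple weighted sum, which is an approximation rather than THE’s exact procedure:

```python
# Unesp's 2020 indicator scores, taken from the table above
citations, teaching, research = 16.8, 43.4, 31.9
international_outlook, industry_income = 25.1, 36.9

overall = (0.30 * (citations + teaching + research)
           + 0.075 * international_outlook
           + 0.025 * industry_income)

print(round(overall, 1))  # 30.4, inside the published 28.6-32.7 band
```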
Over the four years represented here, Unesp has dropped one band in the ranking but has managed to maintain or increase all of its indicator scores. The citations indicator has risen slightly, but the university’s underlying citation performance has in reality remained fairly stable over the past four years.
Unesp’s largest gains have been in research and teaching. This is impressive given that both pillars include financial indicators (core budget for teaching, competitive research income for research), two areas in which Brazilian universities have faced serious hardship over the past three years, as well as the faculty-to-student ratio. Continual expansionary pressure and a limited ability to hire staff place Brazilian public institutions at a disadvantage on that dimension. By a process of elimination, we can attribute most of these gains to improvements in reputation and visibility.
University | Position | Overall | Citations | Industry income | International outlook | Research | Teaching |
Unesp | 201–250 | 28.6–32.7 | 16.8 | 36.9 | 25.1 | 31.9 | 43.4 |
Deakin | 55 | 50.1 | 73.5 | 40.6 | 85.2 | 40.7 | 28.0 |
KwaZulu-Natal | 71 | 48.3 | 70.1 | 36.8 | 55.0 | 41.4 | 32.7 |
Shenzhen | 101–150 | 38.0–43.7 | 70.4 | 51.4 | 34.9 | 30.3 | 26.6 |
The benchmarked universities are all of roughly the same size, age and subject mix. They also have strong reputations as regional leaders heavily engaged in their local communities, and they are committed to social inclusion and widening access. They can therefore be considered peers in both profile and mission. Unesp has the strongest teaching profile of any of them and a competitive research profile, yet sits roughly 100 places lower. The majority of this difference lies in citation impact, which accounts for 30% of the ranking total. Unesp publishes far more articles than any of the other three and has expanded much more quickly over the past decade. While its FWCI also grew over the period, it did not grow as quickly.
Instructive here is the change undergone by Shenzhen University in 2012, when, with the Double First Class initiative, the university increased its output exponentially but also dramatically increased its FWCI. The institutional changes and the financing behind them produced a dramatic change in performance over this period. The other two universities have seen steady, progressive increases in citation impact, but nothing as dramatic as Shenzhen’s.
To position itself in the top 100, Unesp would have to increase its FWCI from its current 0.88 to around 1.2. A target of around 14% of output in the top 10% of papers by citation impact would be a good guide that the university was on track.
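One back-of-the-envelope way to see why a top-10% share of around 14% is a reasonable proxy for an FWCI of 1.2 is to treat the portfolio as a two-group mixture of top-10% papers and everything else. The within-group averages used below are assumptions chosen purely for illustration, not measured values for Unesp:

```python
# Assumed within-group averages (illustrative, not measured values for Unesp)
FWCI_TOP = 5.0    # hypothetical mean FWCI of papers in the top 10% by citations
FWCI_REST = 0.55  # hypothetical mean FWCI of the remaining papers

def required_top_share(target_fwci):
    """Share of output that would need to sit in the top 10% for the portfolio
    mean FWCI to reach the target, under the two-group assumption above."""
    return (target_fwci - FWCI_REST) / (FWCI_TOP - FWCI_REST)

print(round(required_top_share(1.20), 2))  # ~0.15, broadly in line with the ~14% guide
print(round(required_top_share(0.88), 2))  # ~0.07 implied at the current FWCI
```

The result depends entirely on the assumed group averages, so the 14% figure should be read as a guide rather than a precise threshold.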
UFABC performance
Year | Position | Overall | Citations | Industry income | International outlook | Research | Teaching |
2020 | 301–350 | 20.6–24.4 | 24.1 | 35.8 | 33.4 | 17.4 | 20.8 |
2019 | 251–300 | 19.7–24.8 | 26.8 | 39.3 | 33.6 | 18.2 | 19.2 |
2018 | 151–200 | 25.7–32.6 | 29.3 | 34.5 | 32.8 | 17.8 | 38.3 |
2017 | 151–200 | 21.9–28.4 | 33.4 | 36.4 | 31.9 | 19.2 | 19.5 |
At first glance, these results are not at all positive for UFABC, but a closer analysis reveals a slightly different picture. This ranking is heavily weighted towards size independence; that is, the indicators are normalised for institutional size. For citations, therefore, the university appears to be generating less impact than it did four years ago, or receiving fewer citations. This is not the case. The average number of citations per paper has decreased, but the reason is that the university has vastly expanded its research base, moving away from an initial narrow focus on physical sciences carried out predominantly in large international consortia towards a more diverse base that takes in the social and life sciences. This expansion and diversification inevitably lowers the mean score, but the number of papers in the top 10% by citation impact has remained steady: the university is publishing as much highly cited research as in 2017, but on a more diversified research base rather than depending heavily on a few research groups in a few topics. The initial impact of its involvement in Higgs boson research also gave the university a citation boost in the early period represented in this table.
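A small numerical sketch of this effect (all figures invented, not UFABC’s actual counts): if the number of top-10% papers stays fixed while total output grows and diversifies, the portfolio’s mean FWCI falls even though the absolute amount of highly cited research is unchanged.

```python
def mean_fwci(n_top, n_other, fwci_top=4.0, fwci_other=0.6):
    """Portfolio mean FWCI for a mix of top-10% papers and other papers
    (the per-group FWCI values are illustrative assumptions)."""
    return (n_top * fwci_top + n_other * fwci_other) / (n_top + n_other)

# Narrow portfolio: 60 top-10% papers out of 400 in total
print(round(mean_fwci(60, 340), 2))  # ~1.11

# Diversified portfolio: the same 60 top-10% papers, but 800 papers in total
print(round(mean_fwci(60, 740), 2))  # ~0.85: lower mean, same highly cited output
```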
All other indicators have remained relatively stable, suggesting that UFABC has not so much fallen in the ranking as been crowded out by the large number of universities entering the list with similar performance profiles.
University | Position | Overall | Citations | Industry income | International outlook | Research | Teaching |
UFABC | 301–350 | 20.6–24.4 | 24.1 | 35.8 | 33.4 | 17.4 | 20.8 |
Tampere | 34 | 54.1 | 81.8 | 52.6 | 49.3 | 48.6 | 33.3 |
Qatar | 73 | 47.9 | 64.7 | 49.3 | 99.6 | 40.4 | 25.6 |
Haifa | 101–150 | 38.0–43.7 | 52.8 | 36.3 | 35.9 | 44.6 | 33.2 |
Tsukuba | 101–150 | 38.0–43.7 | 34.4 | 44.6 | 44.3 | 46.3 | 50.2 |
The universities in this benchmark are all of relatively similar size, with a strong focus on the physical sciences and engineering. Tampere and Tsukuba are older and so can be treated as benchmarks for the future, while Qatar is roughly the same age as UFABC. For these reasons they were considered suitable comparators.
Among the universities in this benchmark, UFABC performs relatively well for industry income, a good sign that it is competitive in terms of its innovative potential. Where it falls behind is in citations. Looking at the underlying bibliometric figures, however, UFABC appears competitive with, or better than, Tsukuba, Qatar and Haifa. The ranking, though, excludes papers with very large numbers of authors, and UFABC publishes around half of its highly cited research (1,400 articles in total) in papers with more than 100 authors. This is common practice in the physical sciences, but it limits UFABC’s performance in this ranking. UFABC’s main challenge in performing better in this ranking is to learn from its excellence in Big Science physics projects and apply that excellence to areas that involve fewer authors.
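To give a rough sense of scale (only the figure of around half of 1,400 highly cited articles comes from the analysis above; the total output figure is hypothetical), excluding papers with very large author lists can roughly halve the share of output that counts as highly cited for the indicator:

```python
# From the analysis above: roughly 1,400 highly cited articles, about half of
# them in papers with more than 100 authors. Total output is a hypothetical
# figure used only for illustration.
highly_cited = 1400
highly_cited_in_large_teams = 700
total_output = 14000  # assumed

share_before = highly_cited / total_output
share_after = ((highly_cited - highly_cited_in_large_teams)
               / (total_output - highly_cited_in_large_teams))

print(f"{share_before:.1%} -> {share_after:.1%}")  # 10.0% -> 5.3% highly cited
```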
What could the universities consider improving?
On a citation measure of this kind, a large university with a strong local leadership role like Unesp will always find it difficult to compete. To that extent, a comparison with a university like Diego Portales (99th position, with a citations score of 95) is unhelpful: Diego Portales has published 3,000 articles since 2010, while Unesp has published 50,000 in the same period. What Unesp should consider is ensuring correct institutional attribution of its output through ORCID and GRID identifiers.
The best institutional action for increasing performance in this ranking is to implement strategic planning for obtaining competitive research funding. Having analysts actively scan international funding databases to identify where the university’s current research portfolio and collaborative relationships match open international calls, and then assist in building the necessary partnerships, would help the university to launch larger and more ambitious projects, attract more competitive research income and reduce the administrative load on faculty. This type of action, already common practice in the US, Europe and Asia through research support offices, changes the role of the university from reactive to proactive.
Pursuing this would improve the university’s scores for competitive research income and research reputation as well as its citation scores, positively affecting up to half of the indicators in this ranking while also dramatically increasing the university’s research capabilities.
As established, UFABC has not really fallen in this ranking, but it has not really improved in it either. For a very young university, reputation building is a challenge: research shows that survey respondents tend not to have genuine deep knowledge of what they are assessing and are very heavily influenced by halo effects. The survey asks academics which universities are the most reputable, and because most academics have real knowledge of only a few institutions, they vote for those at the top of the rankings. This creates a virtuous cycle for those at the top, who benefit from previous good performances, and systematically excludes those at the bottom.
UFABC, however, has the benefit of being not just a new university but an innovator. It should therefore focus on building a reputation for its interdisciplinary teaching and research through coordinated communications strategies.
Over time, the apparent dip in citation impact will level out and the score will begin to grow again. However, like Unesp, the university should consider playing a more strategic supporting role in attracting international research funding, an action that would improve its performance across multiple dimensions of the ranking and help it to overcome the instability in federal funding schemes seen over the past four years.
Specific actions
Short term
- Ensure correct institutional attribution of research.
- Focus on improving international communications and strengthening the university’s role in international partnerships.
- Implement an international communications strategy that maintains the university’s profile as an influential regional leader.
Medium term
- Become more proactive in identifying international competitive funding opportunities that match existing portfolio strengths and international connections.
- Develop support competences that encourage faculty to undertake ambitious projects and assist them through the application and execution of those projects.