Ranking Monitor Reference

Universitas 21

Unlike the other rankings in consideration, the U21 ranking does not measure the quality of individual universities but the strength of national higher education systems, aiming to show how universities function within their local environments and how supportive those environments are for creating strong university systems. It therefore analyses inputs: government and total financial resources invested in tertiary education as a proportion of GDP, PPP-adjusted spend per student, and R&D expenditure both as a percentage of GDP and per capita.

It then uses a variety of proxies to describe the university environment. These include gender diversity and a rating for data quality, which the authors highlight as extremely important for maintaining a quality higher education system: although data quality is not assessed by any other ranking, it is fundamental to good performance within them. It also assesses systemic diversity, focusing in particular on ISCED level 5 courses (post-secondary technical courses below bachelor's level) and the public/private mix by enrolment.

Compiling and financing

The U21 ranking is compiled and published by the Universitas 21 consortium of universities, and is created and overseen by Ross Williams at the Melbourne Institute of Applied Economic and Social Research at the University of Melbourne.

The team of compilers is as follows:

  • Professor Ross Williams
  • Ms Anne Leahy
  • Professor Paul Jensen
  • Professor Gaétan de Rassenfosse

Because the initiative is financed by a consortium of universities, it is much more concerned with acting as a tool to guide governance at an institutional and governmental level than with producing sensational results or serving as a guide to prospective students.

Data Collection Processes

Normalised vs un-normalised

Because the ranking seeks to be a tool to guide governance, it does not merely grade every system against the richest and most developed. U21 also publishes a version of the ranking adjusted, via regression on GDP per capita, to reflect the investment and outputs that would be expected of a system at each country's income level.
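The income adjustment can be illustrated with a simple regression sketch. All country names, scores, and GDP figures below are made up, and U21's actual regression specification may differ; the point is only the mechanic of comparing each country's score against the score predicted at its income level.

```python
import numpy as np

# Hypothetical data: overall U21-style scores and GDP per capita (USD PPP).
# These numbers are invented for illustration only.
countries = ["A", "B", "C", "D", "E"]
scores = np.array([85.0, 70.0, 55.0, 48.0, 40.0])
gdp_pc = np.array([60000.0, 45000.0, 20000.0, 15000.0, 9000.0])

# Fit a simple linear regression of score on log GDP per capita.
x = np.log(gdp_pc)
slope, intercept = np.polyfit(x, scores, 1)
expected = intercept + slope * x  # score predicted at each income level

# A country's income-adjusted standing is how far it sits above or
# below the score expected at its income level.
residuals = scores - expected
for name, r in sorted(zip(countries, residuals), key=lambda t: -t[1]):
    print(f"{name}: {r:+.1f} vs expected")
```

Countries with positive residuals outperform what their income level predicts, which is the sense in which Brazil's adjusted score is described below as "a little below that expected at its income level".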

Table 1: Metrics for U21 2017

Resources (weighting: 20%)

  1. (5%) Government expenditure on tertiary education institutions as a percentage of GDP, 2013.
  2. (5%) Total expenditure on tertiary education institutions as a percentage of GDP, 2013.
  3. (5%) Annual expenditure per student (full-time equivalent) by tertiary education institutions in USD purchasing power parity, 2013.
  4. (2.5%) Expenditure in tertiary education institutions for research and development as a percentage of GDP, 2014.
  5. (2.5%) Expenditure in tertiary education institutions for research and development per head of population at USD purchasing power parity, 2014.

Environment (weighting: 20%)

  E1: (1%) Proportion of female students in tertiary education, 2014.
  E2: (2%) Proportion of academic staff who are female in tertiary institutions, 2014.
  E3: (2%) A rating for data quality. For each quantitative series, the value is 2 if the data are available for the exact definition of the variable; 1 if some data are available which relate to the variable but some informed adjustment is required; and 0 otherwise.
  E4: (10%) Qualitative measure of the policy environment, comprising:
    E4.1 (2%) Diversity of the system, comprising two components of equal weight: the percentage of tertiary students enrolled in private institutions (capped at 50 per cent) and the percentage of students enrolled in ISCED level 5 courses.
    E4.2 (4%) Survey results for the policy and regulatory environment (see Appendix 2).
    E4.3 (4%) Survey results for the financial autonomy of public universities (see Appendix 2).
  E5: (5%) Responses to WEF survey question (7-point scale): "How well does the educational system in your country meet the needs of a competitive economy?"

Connectivity (weighting: 20%)

  C1: (4%) Proportion of international students in tertiary education, 2014.
  C2: (4%) Proportion of articles co-authored with international collaborators, 2014 (coverage is all institutions that publish at least 100 papers).
  C3: (2%) Webometrics Web TRANSPARENCY measure: sum of values from 4,200 universities divided by country's population, July 2016 edition.
  C4: (2%) Webometrics VISIBILITY index (external links that university web domains receive from third parties): sum of data for 10,000 tertiary institutions divided by country's population, July 2016 edition.
  C5: (4%) Responses to the question "Knowledge transfer is highly developed between companies and universities", asked of business executives in the annual survey by IMD World Development Centre, Switzerland, 2016.
  C6: (4%) Percentage of university research publications that are co-authored with industry researchers, 2012–14.

Output (weighting: 40%)

  O1: (10%) Total articles produced by higher education institutions, 2014.
  O2: (3%) Total articles produced by higher education institutions per head of population, 2014.
  O3: (5%) Average impact of articles as measured by citations in 2014 to articles published in previous years, using the Karolinska Institute normalized impact factor.
  O4: (3%) The depth of world-class universities in a country, calculated as a weighted average of the number of institutions listed in the top 500 according to the 2016 Shanghai Jiao Tong scores, divided by country population.
  O5: (7%) The excellence of a nation's best universities, calculated by totalling the 2016 Shanghai Jiao Tong scores for the nation's three best universities.
  O6: (3%) Enrolments in tertiary education as a percentage of the eligible population, defined as the five-year age group following on from secondary education, 2014.
  O7: (3%) Percentage of the population aged 25–64 with a tertiary qualification, 2015.
  O8: (3%) Number of researchers (full-time equivalent) in the nation per million of population, 2014.
  O9: (3%) Unemployment rates among tertiary educated aged 25–64 years compared with unemployment rates for those with only upper secondary or post-secondary non-tertiary education, 2015.
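The per-metric weights in Table 1 combine into an overall score as a straight weighted sum. The sketch below uses only the weights from the table; the metric labels R1–R5 for the Resources items and all metric scores are invented for illustration.

```python
# Weights taken from Table 1 (as fractions of the overall score).
# Resources metrics are labelled R1-R5 here for convenience; the
# table numbers them 1-5.
weights = {
    "Resources":    {"R1": 0.05, "R2": 0.05, "R3": 0.05, "R4": 0.025, "R5": 0.025},
    "Environment":  {"E1": 0.01, "E2": 0.02, "E3": 0.02, "E4": 0.10, "E5": 0.05},
    "Connectivity": {"C1": 0.04, "C2": 0.04, "C3": 0.02, "C4": 0.02,
                     "C5": 0.04, "C6": 0.04},
    "Output":       {"O1": 0.10, "O2": 0.03, "O3": 0.05, "O4": 0.03, "O5": 0.07,
                     "O6": 0.03, "O7": 0.03, "O8": 0.03, "O9": 0.03},
}

# Hypothetical metric scores for one country, each on a 0-100 scale.
metric_scores = {m: 60.0 for area in weights.values() for m in area}

# Overall score is the weighted sum across all metrics; the weights
# sum to 1.0 (Resources 20%, Environment 20%, Connectivity 20%,
# Output 40%).
overall = sum(w * metric_scores[m]
              for area in weights.values() for m, w in area.items())
print(f"overall: {overall:.1f}")  # with every metric at 60, overall is 60.0
```

Note that the Output module alone carries 40% of the total weight, so a country's publication volume and the standing of its top universities dominate the overall score.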

Performance

São Paulo does not, at present, appear in U21 at all, because the ranking focuses on national systems. However, there is a strong case for including more regions, and this could be considered going forward, especially given the size of Brazil's higher education system and its high levels of asymmetrical development.

Brazil's current performance

In 2017 Brazil ranks 42nd overall, which combines ranks of 33 for Resources, 42 for Environment, 48 for Connectivity and 37 for Output. The absence of official data on private expenditure and R&D expenditure means that the ranking for Resources is only an approximation. Government expenditure on higher education as a share of GDP is ranked 34th.

In the Output module Brazil is 11th on total publications but only 40th on publications per head and 47th for the average impact of papers. The country ranks 26th for the quality of its best three universities but is in the bottom decile for participation rate and the qualifications of its workforce. Collaboration with international researchers and with local business both rank in the bottom quintile. When the country standings are adjusted for levels of GDP per capita, Brazil rises to 28th in the rankings, but its score is a little below that expected at its income level.