Ranking Universities vs. Benchmarking Systems

The Convergence of University Rankings and System Benchmarking
An Apparent Paradox of “Rankology”
Questions
Two approaches:
University Rankings
System Benchmarking
Are they:
Complementary?
Competing?
Consistent?
Outline
(1) Background: from ranking to benchmarking
(2) Method of investigation
(3) Results
(4) Interpretation and conclusion
(1) University Rankings
U Rankings: a Polarizing Exercise
U Rankings: hated/loved, criticized/commended, threatening/stimulating, but proliferating (“here to stay”)
Ph. Altbach’s advice [“Don’t take too much notice of rankings” (UWN, March 23, 2013)]: unlikely to be widely followed
More pitfalls discovered, uncovered, elucidated => more attempts to improve methods
U Rankings: the Disease
Methodological caveats:
Biases: research, English, STEM
Composite indicators: weighting
Subjective (reputation) / non-transparent
=> Elitism
Dangerous use (“misuses”, “abuses”):
Universities: (1) focus on competition with others instead of own improvement / affects strategic planning; (2) focus on biased criteria (research)
Policy makers: focus on a few WCUs instead of the whole system; impact on financing
Students: impact on university selection
Overall: commercialization; (crowded) market
From Ranking to Benchmarking
“If Ranking is the Disease,
Is Benchmarking the Cure?”
(Jamil Salmi, Sunita Kosaraju. Evaluation in Higher Education, Vol. 5 no.1, June 2011)
“Rankings: Neither a Disease nor a Cure”
(Ph. Altbach, UWN, 2013)
(2) System Benchmarking
[Diagram: the tertiary education (TE) system (governance, resources, access, equity, quality control, private providers) within its economic, social and technological environment.]
Benchmarking: Objective & Criteria
Objective: assess the strength, health and performance of countries' tertiary education systems
Criteria: resources, inputs, governance, outputs and outcomes of the system (access, equity, quality, relevance)
Benchmarking: Main Initiatives
• SABER: System Approach for Better Education Results (World Bank): still under construction
• U21 (Universitas 21 / University of Melbourne): most recent comprehensive available case => see below
• Benchmarking University Governance (World Bank, MENA): hybrid
• AHELO: Assessment of Higher Education Learning Outcomes (OECD): still under experimentation
Hypothesis
Benchmarking developed in reaction to rankings
Objectives, level of observation and criteria of benchmarking and ranking are quite different
=> Shouldn't they yield different results?
Method (1)
1/ Select 4 of the most popular university rankings: ARWU, THE, QS, Webometrics
2/ Pick the most recent system benchmarking: U21
3/ Compare their results
Method (2)
Issue: how to compare universities and systems?
Solution: translate university rankings into country rankings
Method: from the number of top universities in a country to the number of tertiary-aged youths in that country potentially served by its top universities (i.e. the supply of top universities)
NB: no correlation between the 2 measures
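A minimal sketch of this translation (all figures below are hypothetical placeholders, not the presentation's data):

```python
# Sketch: translate a university ranking into a country-level "supply" measure.
# All figures are hypothetical placeholders, not the presentation's data.

top400_count = {"Finland": 5, "India": 9}               # top-400 universities per country
tertiary_aged_pop = {"Finland": 0.3e6, "India": 120e6}  # tertiary-aged population

# Supply (density): top universities per million tertiary-aged youths.
supply = {c: top400_count[c] / (tertiary_aged_pop[c] / 1e6) for c in top400_count}

# Ranking by raw count vs. by supply can give very different orders.
by_count = sorted(top400_count, key=top400_count.get, reverse=True)
by_supply = sorted(supply, key=supply.get, reverse=True)
print(by_count, by_supply)  # e.g. India leads on count, Finland on supply
```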
[Chart: for each of the 41 countries with a THE top-400 university (United States through Thailand), the number of top-400 universities and the supply (density) of top-400 universities, shown as ranks.]
Method (3)
Quick look at the 4 leagues selected. The “sample”: Top 400 universities.

                                                     THE   ARWU    QS   WEBO
Nbr of countries with at least one top-400
university in each league                             41     38    45     41
Nbr of countries with at least one top-400
university found in all 4 leagues (overlap)           34     34    34     34
Nbr of top-400 universities in the countries
of the overlap                                       389    394   378    387
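A sketch of how the 34-country overlap and the per-league counts could be derived, assuming each league is a mapping from country to its number of top-400 universities (hypothetical data shown):

```python
# Sketch: countries common to all four leagues, plus per-league totals.
# league -> {country: number of top-400 universities}; hypothetical data.
leagues = {
    "THE":  {"United States": 109, "United Kingdom": 48, "Iceland": 1},
    "ARWU": {"United States": 150, "United Kingdom": 38},
    "QS":   {"United States": 85,  "United Kingdom": 45},
    "WEBO": {"United States": 130, "United Kingdom": 40},
}

# Countries with at least one top-400 university in every league.
overlap = set.intersection(*(set(l) for l in leagues.values()))
print(len(overlap), sorted(overlap))

# Number of top-400 universities each league places in the overlap countries.
in_overlap = {name: sum(l[c] for c in overlap) for name, l in leagues.items()}
print(in_overlap)
```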
The 34
Australia, Austria, Belgium, Brazil, Canada, China, Czech Republic, Denmark, Finland, France, Germany, Greece, Hong Kong, India, Ireland, Israel, Italy, Japan, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Russian Federation, Singapore, South Africa, South Korea, Spain, Sweden, Switzerland, Taiwan, United Kingdom, United States
Comparing the results of the 4 Rankings (1)
Correlation between results of the 4 leagues (number of top universities in each country)

[Bar chart: number of top-400 universities per country in each of the four leagues (THE, QS, ARWU, Webometrics), for the overlap countries from Russia up to the United Kingdom.]
Comparing the results of the 4 Rankings (2)
Correlation between results of the 4 leagues: (1) number of top universities in each country

Nbr of Top 400 Universities: R²
        THE    QS    ARWU
QS      0.98
ARWU    0.98   0.95
WEBO    0.96   0.93   0.98
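These R² values are squared Pearson correlations between two leagues' per-country counts; a minimal sketch of the computation, with hypothetical truncated vectors in place of the full 34-country data:

```python
# Sketch: pairwise R^2 between leagues' per-country top-400 counts.
# Hypothetical vectors aligned over the same countries, truncated to 5.
from itertools import combinations
from statistics import correlation  # Python 3.10+

counts = {
    "THE":  [109, 48, 31, 20, 19],
    "QS":   [85, 45, 33, 21, 18],
    "ARWU": [150, 38, 37, 19, 22],
    "WEBO": [130, 40, 30, 24, 17],
}

for a, b in combinations(counts, 2):
    r = correlation(counts[a], counts[b])  # Pearson r
    print(f"{a}-{b}: R^2 = {r**2:.2f}")
```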
Comparing the results of the 4 Rankings (3)
Correlation between results of the 4 leagues: (2) supply of top universities

Density: R²
        THE    QS    ARWU
QS      0.96
ARWU    0.87   0.83
WEBO    0.78   0.72   0.86
Supply: Nbr of top universities / tertiary-aged population

The first five countries:
Rank  Country        QS    ARWU   THE   WEBO
1     Finland        16.1   6.9   11.5   9.2
2     New Zealand    14.5   4.8   14.5   2.4
3     Switzerland    13.4  11.8   13.4  11.8
4     Ireland        13.3   8.0   13.3   5.3
5     Denmark        11.5   9.2   11.5   9.2

The last five countries:
Rank  Country        QS    ARWU   THE   WEBO
30    Poland         0.3    0.5    0.5   0.8
31    Mexico         0.1    0.1    0.1   0.1
32    Brazil         0.1    0.2    0.1   0.4
33    China          0.1    0.1    0.1   0.2
34    India          0.04   0.01   0.02  0.01
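Such first-five/last-five tables fall out of a simple sort over a league's supply values; a sketch using the QS column above, truncated to a handful of countries:

```python
# Sketch: top/bottom countries by supply within one league.
# Values taken from the QS column above, truncated to a few countries.
supply_qs = {"Finland": 16.1, "New Zealand": 14.5, "Switzerland": 13.4,
             "Ireland": 13.3, "Denmark": 11.5, "China": 0.1, "India": 0.04}

ranked = sorted(supply_qs.items(), key=lambda kv: kv[1], reverse=True)
print("first five:", ranked[:5])
print("last five:", ranked[-5:])
```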
Benchmarking: “U21” Method (1)
1/ A priori selection of 48 countries (+2)
2/ Assessment of countries' performance based on one overall indicator and 4 “measures”:
(1) Resources
(2) Environment
(3) Connectivity
(4) Output
Benchmarking: Method (2)
(1) Resources (25%): 5 indicators on expenditures
(2) Environment (25%): 2 indicators on gender balance, 1 indicator on data quality, 3 indicators on the policy and regulatory environment, 1 homegrown index on internal governance
Benchmarking: Method (3)
(3) Connectivity (10%): 2 indicators on the degree of internationalization (students & research)
(4) Output (40%): 5 indicators on research, 1 indicator on the probability that a person attends a top-500 university (based on ARWU…), 1 indicator on enrollment, 1 indicator on the tertiary-educated population, 1 indicator on unemployment among the tertiary-educated population
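A minimal sketch of how a U21-style overall score could be assembled from the four weighted measures (the country scores are hypothetical placeholders; U21's actual normalization may differ):

```python
# Sketch: U21-style weighted composite. Measure scores are hypothetical
# placeholders on a 0-100 scale; U21's actual normalization may differ.
WEIGHTS = {"resources": 0.25, "environment": 0.25,
           "connectivity": 0.10, "output": 0.40}

def overall_score(measures: dict[str, float]) -> float:
    """Weighted sum of the four measure scores."""
    return sum(WEIGHTS[m] * measures[m] for m in WEIGHTS)

print(overall_score({"resources": 80, "environment": 90,
                     "connectivity": 70, "output": 60}))  # -> 73.5
```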
Benchmarking: Links between the 5 measures
[Diagram: pairwise correlations among the five U21 measures (Overall; Resources, 25%; Environment, 25%; Connectivity, 10%; Outputs, 40%); the ten values range from 0.38 to 0.93.]
Comparing Results of Rankings and Benchmarking (1a)
Countries overlap between UR and SB:
U21 & THE: 37 common countries
U21 & QS: 40 common countries
U21 & ARWU: 37 common countries
U21 & WEBO: 41 common countries
=> Essentially the same pool of countries
Comparing Results of Rankings and Benchmarking (1b)
Not in U21: Colombia, Estonia, Iceland, Lebanon, Oman, Philippines, Saudi Arabia, UAE
Not in one (or more) ranking: Argentina, Bulgaria, Chile, Croatia, Hungary, Indonesia, Iran, Malaysia, Romania, Slovakia, Slovenia, Thailand, Turkey, Ukraine
Comparing Results of Rankings and Benchmarking (2)
Correlation between U21 indicators and rankings (supply): R²

U21            THE    QS    ARWU   WEBO
Overall        0.74   0.73   0.76   0.76
Resources      0.69   0.69   0.78   0.77
Outputs        0.55   0.57   0.63   0.62
Environment    0.50   0.49   0.45   0.38
Connectivity   0.73   0.65   0.63   0.58
Comparing Results of Rankings and Benchmarking (3)
U21 (Overall) and THE Rankings (R² = 0.74)

[Scatter plot: THE supply (with a logarithmic trend line) against the U21 overall score for the common countries, from the United States down to India.]
Comparing Results of Rankings and Benchmarking (4)
U21 (Resources) & ARWU (Supply): R² = 0.78

[Table: U21 Resources score and ARWU supply for each of the 37 common countries, from Canada (U21 Resources = 100) down to India.]
Conclusions / Interpretation
1/ Hypothesis not confirmed:
a/ same set of countries
b/ similar results
2/ Two types of explanations:
a/ methodological
b/ structural
Epilogue
• System Benchmarking ends up ranking countries
• Boundaries between UR and SB are blurred
• SB shares common symptoms with UR
• Convergence of the two streams of “Rankology” is not surprising
• Benchmarking needs to expand its pool of countries to become more relevant
Take Away
[Diagram: SB and UR converging]

Thank You