Transcript Document

THE WORLD UNIVERSITY RANKINGS

History of the Rankings and the Global Higher Education Context

Phil Baty

Editor

Times Higher Education World University Rankings

About Times Higher Education

The weekly magazine for all higher education professionals


Why Rank? Rapid globalisation of higher education

• There are 3.7 million students enrolled in higher education outside their country of origin – 7 million by 2020
• Universities now have at least 200 satellite campuses outside their home countries (37 more on their way)
• Around 20 per cent of all academics working in the UK are appointed from overseas
• Almost 40 per cent of papers from the top 200 universities are international
• Sir Drummond Bone said: “World class research is inherently international”

Why Rank? Rapid globalisation of higher education

“We are living through one of those tipping points where in five years, (commentators will say) that this was the period when the landscape changed for ever, when the speed of reputational growth and decline suddenly accelerated.

“We all accept that higher education is borderless - ideas know no boundaries, do not accord any significance to geography and maps - and that is equally true of reputations and university rankings.”

Peter Upton

Director, British Council, Hong Kong


Why Rank? Rankings have a useful function

• “Rankings often serve in place of formal accreditation systems in countries where such accountability measures do not exist.”
• “Prompt change in areas that directly improve student learning experiences.”
• “Encourage institutions to move beyond their internal conversations to participate in broader national and international discussions.”
• “Foster collaboration, such as research partnerships, student and faculty exchange programmes.”

US Institute for Higher Education Policy, May 2009

Rankings: increasing influence

“Rankings are an unmistakable reflection of global academic competition… they seem destined to be a fixture on the global education scene for years to come… As they are refined and improved they can and should play an important role in helping universities get better.”

Ben Wildavsky, The Great Brain Race (Princeton University Press, May 2010)

The old ranking system: 2004-2009, with QS

• Citations
• Reputation
• International students
• International staff
• Staff–student ratio
• Employer poll

Old QS ranking system (2004-2009) not fit for purpose

We have torn them up and will start again. QS no longer has ANY involvement at all with the Times Higher Education World University Rankings. We abandoned the old THE-QS methodology and developed a new system in consultation with academics and university managers worldwide.

Old QS ranking system (2004-2009) not fit for purpose

“Results have been highly volatile. There have been many sharp rises and falls… Fudan in China has oscillated between 72 and 195…”
Simon Marginson, University of Melbourne

“Most people think that the main problem with the rankings is the opaque way it constructs its sample for its reputational rankings.”
Alex Usher, vice president of Educational Policy Institute, US

“The logic behind the selection of the indicators appears obscure.”
Christopher Hood, Oxford University

Old QS ranking system (2004-2009) not fit for purpose

“The organizations who promote such ideas should be unhappy themselves, and so should any supine universities who endorse results they view as untruthful.”
Andrew Oswald, professor of economics, University of Warwick, 2007

Times Higher Education’s responsibility

“The responsibility weighs heavily on our shoulders. We are very much aware that national policies and multimillion-pound decisions are influenced by the rankings… We feel we have a duty to improve how we compile the rankings.

“We believe universities deserve a rigorous, robust and transparent set of rankings – a serious tool for the sector, not just an annual curiosity.”

Ann Mroz, Editor, Times Higher Education, November 2009

What was wrong with the old QS system?

Our editorial board highlighted two key concerns:
• Reputation survey
• Citations

What was wrong with the old QS system? Reputation

Peer review – simply a reputation survey. Inherently controversial and subjective. Reputation surveys reflect past, not current, performance, and may be based on stereotype or even ignorance.

A good or bad reputation may be mindlessly replicated.

But: there was support for a reputation measure in Thomson Reuters’ opinion poll. 79 per cent said reputation surveys were a “must have” or “nice to have”.

Reputation is crucial. Survey can bring in some measure of the things quantitative data cannot.

What was wrong with the old QS system? Reputation

QS achieved a tiny response rate to its survey: in 2009, only around 3,500 people responded. There were tiny numbers of responses from individual countries: in 2008, there were just 182 from Germany and 236 from India.

Lack of clarity over the questions asked. What are we judging?

This is not good enough when you’re basing 40 per cent of the score on academic peer review.

What was wrong with the old QS system? Reputation

“The scores are based on a rather small number of responses: 9,386 in 2009 and 6,534 in 2008; in actual fact, the 3,000 or so answers from 2009 were simply added to those of 2008. The number of answers is pitifully small compared to the 180,000 e-mail addresses used.”

Global University Rankings and Their Impact, European Universities Association, June 2011

What was wrong with the old QS system? Reputation

“What are the criteria for leaving out a great number of universities or whole countries? The lists of pre-selected universities… usually contained universities from only 25 or 26 European countries out of the 48 countries in the European Higher Education Area.”

Global University Rankings and Their Impact, European Universities Association, June 2011

What was wrong with the old QS system? Reputation

“QS’s extensive corporate database, a network of partners with whom QS cooperates in its events, and participating institutions who submit a list of professionals with whom they work, thus creating a new bias”.

Global University Rankings and Their Impact, European Universities Association, June 2011

What was wrong with the old QS system? Reputation

“The fact that the total number of worldwide responses in 2009 was only 2,336 has implications… Such a small sample of world employers might simply not be aware of excellent universities in smaller countries, and especially in those countries where neither English nor another world language is spoken.”

Global University Rankings and Their Impact, European Universities Association, June 2011

What was wrong with the old QS system? Citations

• QS failed to take into account dramatically different citation volumes between disciplines
• Major bias towards hard sciences, because arts and humanities papers have much lower citation volumes
• No normalisation for subject

What was wrong with the old QS system? Citations

Field                           Papers     Citations    Citation Impact
Chemistry                       618,568    3,335,763    5.39
Engineering                     438,538    958,640      2.19
Mathematics                     140,219    211,268      1.51
Molecular Biology & Genetics    145,939    1,597,660    10.95
Physics                         494,451    2,154,290    4.36
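The figures above show why raw citation counts mislead: citation impact is simply total citations divided by total papers, and it varies roughly sevenfold between mathematics and molecular biology. A minimal sketch of field normalisation – the general bibliometric technique, not necessarily the exact method THE adopted – using the table’s numbers:

```python
# Citation impact per field = total citations / total papers,
# using the figures from the table above.
fields = {
    # field: (papers, citations)
    "Chemistry": (618_568, 3_335_763),
    "Engineering": (438_538, 958_640),
    "Mathematics": (140_219, 211_268),
    "Molecular Biology & Genetics": (145_939, 1_597_660),
    "Physics": (494_451, 2_154_290),
}

impact = {f: cites / papers for f, (papers, cites) in fields.items()}

def normalised_impact(citations: int, field: str) -> float:
    """Divide a paper's citations by its field's average impact:
    a paper at the field average scores 1.0 in any discipline."""
    return citations / impact[field]

# A maths paper with 3 citations beats its field average (1.51),
# while a molecular biology paper with 3 citations falls well
# below its field average (10.95).
print(round(normalised_impact(3, "Mathematics"), 2))                   # 1.99
print(round(normalised_impact(3, "Molecular Biology & Genetics"), 2))  # 0.27
```

Without this normalisation, the same three citations count identically in both fields, which is the bias towards the hard sciences described above.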

What was wrong with the old QS system? Citations

Influence despite clear flaws

• “The term world class universities has begun to appear in higher education discussions, in institutional mission statements, and government education policy worldwide.”
• “Many staffing and organisational decisions at institutions worldwide have been affected by ranking related goals and outcomes.”
• “Rankings play an important role in persuading the Government and universities to rethink core national values.”

US Institute for Higher Education Policy

The development of a new world ranking system

In November 2009 we signed a deal with Thomson Reuters to work with us to develop and fuel a new and improved global ranking for the future.

A perfect partner

“In addition to unmatched data quality, Thomson Reuters provides a proven history of bibliometric expertise and analysis. We are proud that our data continues to be chosen by leading organisations around the world and we’re happy to provide insight and consultation on such a widely respected indicator.”
Jonathan Adams, director of research evaluation, Thomson Reuters

Thomson Reuters’ stakeholder survey. Key findings:

“The overriding feeling was that a need existed to use more information, not only on research, but also on broader institutional characteristics. The data indicators and methodology currently utilized were perceived unfavorably by many and there was widespread concern about data quality in North America and Europe.”
Global Opinion Survey: New Outlooks on Institutional Profiles

Thomson Reuters’ stakeholder survey. Key findings:

“Some would even manipulate their data to move up in the rankings. This is of great concern and warns against any reliance on indicators that could be manipulated without creating a real underlying improvement…”
Global Opinion Survey: New Outlooks on Institutional Profiles, Feb 2010

Thomson Reuters’ stakeholder survey. Key findings:

• 92 per cent said that faculty output (publications) was a must have/nice to have
• 91 per cent said that faculty impact (citations) was a must have/nice to have
• 86 per cent said they wanted faculty/student ratios
• 84 per cent wanted income from research grants
• 79 per cent wanted a peer “reputation” measure

Thank you

• Visit the Global Institutional Profiles Project website: http://science.thomsonreuters.com/globalprofilesproject
• See the results in full, with our interactive tables: http://bit.ly/thewur
• Join our rankings Facebook group: www.facebook.com/THEWorldUniRank
• Keep up to date with all the rankings news on Twitter: @THEWorldUniRank
• Follow Phil Baty on Twitter: @Phil_Baty

Thank you. Stay in touch.

Phil Baty

Times Higher Education

T: 020 3194 3298

E: [email protected]