THE WORLD UNIVERSITY RANKINGS: History of the Rankings and the Global Higher Education Context


  • THE WORLD UNIVERSITY RANKINGS: History of the Rankings and the Global Higher Education Context

    Phil Baty, Editor, Times Higher Education World University Rankings

  • About Times Higher Education

    The weekly magazine for all higher education professionals

  • Why Rank? Rapid globalisation of higher education

    There are 3.7 million students enrolled in higher education outside their country of origin, expected to rise to 7 million by 2020

    Universities now have at least 200 satellite campuses outside their home countries (37 more on their way)

    Around 20 per cent of all academics working in the UK are appointed from overseas

    Almost 40 per cent of papers from the top 200 universities are international

    Sir Drummond Bone said: "World-class research is inherently international"

  • Why Rank? Rapid globalisation of higher education

    "We are living through one of those tipping points where in five years, (commentators will say) that this was the period when the landscape changed for ever, when the speed of reputational growth and decline suddenly accelerated. We all accept that higher education is borderless - ideas know no boundaries, do not accord any significance to geography and maps - and that is equally true of reputations and university rankings."

    Peter Upton, Director, British Council, Hong Kong

  • Why Rank? Rankings have a useful function

    Rankings often serve in place of formal accreditation systems in countries where such accountability measures do not exist.

    Prompt change in areas that directly improve student learning experiences

    Encourage institutions to move beyond their internal conversations to participate in broader national and international discussions.

    Foster collaboration, such as research partnerships, student and faculty exchange programmes.

    US Institute for Higher Education Policy, May 2009

  • Rankings: increasing influence

    "Rankings are an unmistakable reflection of global academic competition ... they seem destined to be a fixture on the global education scene for years to come ... As they are refined and improved they can and should play an important role in helping universities get better."

    Ben Wildavsky, The Great Brain Race (Princeton University Press, May 2010)

  • The old ranking system: 2004-2009, with QS

    Citations; Reputation; Employer poll; Staff-student ratio; International staff; International students

  • Old QS ranking system (2004-2009) not fit for purpose

    "We have torn them up and will start again."

    QS no longer has ANY involvement at all with the Times Higher Education World University Rankings

    We abandoned the old THE-QS methodology and developed a new system in consultation with academics and university managers worldwide.

  • Old QS ranking system (2004-2009) not fit for purpose

    "Results have been highly volatile. There have been many sharp rises and falls ... Fudan in China has oscillated between 72 and 195." Simon Marginson, University of Melbourne.

    "Most people think that the main problem with the rankings is the opaque way it constructs its sample for its reputational rankings." Alex Usher, vice-president of the Educational Policy Institute, US.

    "The logic behind the selection of the indicators appears obscure." Christopher Hood, Oxford University.

  • Old QS ranking system (2004-2009) not fit for purpose

    "The organizations who promote such ideas should be unhappy themselves, and so should any supine universities who endorse results they view as untruthful."

    Andrew Oswald, professor of Economics, University of Warwick, 2007.

  • Times Higher Education's responsibility

    "The responsibility weighs heavily on our shoulders. We are very much aware that national policies and multimillion-pound decisions are influenced by the rankings. We feel we have a duty to improve how we compile the rankings."

    "We believe universities deserve a rigorous, robust and transparent set of rankings: a serious tool for the sector, not just an annual curiosity."

    Ann Mroz, Editor, Times Higher Education, November 2009

  • What was wrong with the old QS system?

    Our editorial board highlighted two key concerns:

    * Reputation survey

    * Citations

  • What was wrong with the old QS system? Reputation

    "Peer review" was simply a reputation survey: inherently controversial.

    Subjective. They reflect past, not current, performance. Based on stereotype or even ignorance.

    A good or bad reputation may be mindlessly replicated.

    But: there was support for a reputation measure in the Thomson Reuters opinion poll. 79 per cent said it was a "must have" or "nice to have".

    Reputation is crucial. A survey can bring in some measure of the things quantitative data cannot.

  • What was wrong with the old QS system? Reputation

    QS achieved a tiny response rate to its survey: in 2009, only around 3,500 people responded.

    Tiny numbers of responses from individual countries. In 2008, there were just 182 from Germany and 236 from India.

    Lack of clarity over the questions asked. What are we judging?

    This is not good enough when you're basing 40 per cent of the score on academic peer review.

  • What was wrong with the old QS system? Reputation

    "The scores are based on a rather small number of responses: 9,386 in 2009 and 6,534 in 2008: in actual fact, the 3,000 or so answers from 2009 were simply added to those of 2008 ... The number of answers is pitifully small compared to the 180,000 e-mail addresses used."

    Global University Rankings and Their Impact, European Universities Association, June 2011
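
    (Those figures imply a survey response rate of roughly 5 per cent: 9,386 replies from 180,000 e-mail addresses.)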

  • What was wrong with the old QS system? Reputation

    "What are the criteria for leaving out a great number of universities or whole countries? The lists of universities pre-selected ... usually contained universities from only 25/26 European countries out of the 48 countries in the European Higher Education Area."

    Global University Rankings and Their Impact, European Universities Association, June 2011

  • What was wrong with the old QS system? Reputation

    The employer poll drew on "QS's extensive corporate database, a network of partners with whom QS cooperates in its events, and participating institutions who submit a list of professionals with whom they work, thus creating a new bias".

    Global University Rankings and Their Impact, European Universities Association, June 2011

  • What was wrong with the old QS system? Reputation

    "The fact that the total number of worldwide responses in 2009 was only 2,336 has implications ... Such a small sample of world employers might simply not be aware of excellent universities in smaller countries, and especially in those countries where neither English nor another world language are spoken."

    Global University Rankings and Their Impact, European Universities Association, June 2011

  • What was wrong with the old QS system? Citations

    QS failed to take into account dramatically different citation volumes between disciplines

    Major bias towards hard sciences, because arts and humanities papers have much lower citation volumes

    No normalisation for subject (a minimal sketch of what normalisation means follows below)
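
    To make the normalisation point concrete, here is a minimal sketch in Python of field-normalised citation impact, in which each paper's citations are divided by the world average for its discipline. This is not THE's or Thomson Reuters' actual method; the field averages and paper counts are hypothetical, for illustration only.

    # Minimal sketch of field-normalised citation impact.
    # Hypothetical numbers throughout; not the actual THE / Thomson Reuters method.

    # Assumed world-average citations per paper, by discipline.
    WORLD_AVG = {"medicine": 20.0, "physics": 12.0, "history": 2.0}

    # Hypothetical papers for one university: (discipline, citation count).
    papers = [
        ("medicine", 30), ("medicine", 10),
        ("physics", 12),
        ("history", 4), ("history", 2),
    ]

    def raw_citations_per_paper(papers):
        """Unnormalised impact: roughly what the old system rewarded."""
        return sum(c for _, c in papers) / len(papers)

    def field_normalised_impact(papers, world_avg):
        """Mean of (citations / world average for the paper's field).

        A score of 1.0 means world-average performance for the
        university's own disciplinary mix; a history paper with 4
        citations counts the same as a medicine paper with 40.
        """
        return sum(c / world_avg[field] for field, c in papers) / len(papers)

    print(f"raw citations per paper: {raw_citations_per_paper(papers):.2f}")            # 11.60
    print(f"field-normalised impact: {field_normalised_impact(papers, WORLD_AVG):.2f}")  # 1.20

    Under the raw measure, the medicine papers dominate; under the normalised measure, the history paper cited 4 times (twice its field's world average) outweighs a medicine paper cited 10 times (half its field's average).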

  • What was wrong with the old QS system? Citations

    [Two chart slides: citation volumes by discipline; the graph was produced by Van Raan of Leiden University]

  • Influence despite clear flaws

    "The term 'world class universities' has begun to appear in higher education discussions, in institutional mission statements, and government education policy worldwide."

    Many staffing and organisational decisions at institutions worldwide have been affected by ranking-related goals and outcomes.

    Rankings play an important role in persuading the Government and universities to rethink core national values

    US Institute for Higher Education Policy

  • The development of a new world ranking system

    In November 2009 we signed a deal with Thomson Reuters, who would work with us to develop and fuel a new and improved global ranking for the future.

  • A perfect partner

    "In addition to unmatched data quality, Thomson Reuters provides a proven history of bibliometric expertise and analysis. We are proud that our data continues to be chosen by leading organisations around the world and we're happy to provide insight and consultation on such a widely respected indicator."

    Jonathan Adams, director of research evaluation, Thomson Reuters

  • Thomson Reuters stakeholder survey. Key findings:

    "The overriding feeling was that a need existed to use more information, not only on research, but also on broader institutional characteristics. The data indicators and methodology currently utilized were perceived unfavorably by many and there was widespread concern about data quality in North America and Europe."

    Global Opinion Survey: New Outlooks on Institutional Profiles

  • Thomson Reuters stakeholder survey. Key findings:

    "Some would even manipulate their data to move up in the rankings. This is of great concern and warns against any reliance on indicators that could be manipulated without creating a real underlying improvement..."

    Global Opinion Survey: New Outlooks on Institutional Profiles, Feb 2010

  • Thomson Reuters stakeholder survey. Key findings:

    92 per cent said that faculty output (publications) was a must have/nice to have

    91 per cent said that faculty impact (citations) was a must have/nice to have

    86 per cent said they wanted faculty/student ratios

    84 per cent wanted income from research grants

    79 per cent wanted a peer reputation measure

  • Thank you Visit the Global Institutional Profiles Project website: http://science.thomsonreuters.com/globalprofilesproject

    See the results in full, with our interactive tables: http://bit.ly/thewur

    Join our rankings Facebook group. www.facebook.com/THEWorldUniRank

    Keep up to date with all the rankings news on Twitter: @THEWorldUniRank

    * Follow Phil Baty on Twitter: @Phil_Baty

  • Thank you. Stay in touch.

    Phil Baty, Times Higher Education

    T. 020 3194 3298
    E. [email protected]

    This is Times Higher Education magazine. It was founded in 1971.

    The day-to-day mission at Times Higher Education, which we've been involved in for almost 40 years, is to be an authoritative and respected source of information for the global higher education community.

    We are accountable to that community, so it is crucial that our rankings stand up to the close scrutiny of academics and university staff.

    As experts on higher education working for the people in higher education, we are acutely aware that universities are extraordinarily complex organisations, which do many wonderful, life-changing and paradigm-shifting things which simply cannot be measured.

    We know that universities cannot really be reduced to a crude set of numbers. So why do we rank at all?

    Well, first of all, higher education is rapidly globalising. The world of higher education is changing.

    We believe strongly that rankings, despite their limitations, help us understand this dramatic process. And rankings do serve a valid purpose. They may be controversial, but they fill an increasingly important role.

    This slide shows a series of quotes from the US Institute for Higher Education Policy on the influence of rankings. QUOTE.

    We have examples of our rankings being built into immigration laws, for example, or used to assess a candidate's eligibility for a government scholarship. So they serve a number of important functions. They fill a serious global information gap. Love them or hate them, they are here to stay.

    Here's Ben Wildavsky. I think that the key point from this is "refined and improved". So, between 2004 and 2009, THE published a global university ranking with a company called QS.

    It was very simple. Very crude.

    Explain indicators. But as the rankings had grown in reach and in influence, we were increasingly receiving complaints about the validity of the methodology.

    In 2009 we ended our partnership with QS. QS continues to produce these rankings, now called the QS World University Rankings, but it must be stressed that THE has had no further involvement at all, not since 2009. Why did we end the relationship with QS?

    There had been strong criticism of the old QS methodology, which Times Higher Education accepted and listened to. For example:

    But by far the most stinging attack came from Andrew Oswald, professor of economics at the University of Warwick.

    In 2007, he mocked the pecking order of that year's rankings.

    Oxford and Cambridge were joint second. Stanford University was 19th, despite having garnered three times as many Nobel prizes over the past two decades as the universities of Oxford and Cambridge combined.

    He said we should be unhappy with ourselves. And believe me, we were.

    A new editor arrived in 2008; she put me in charge of the rankings in 2009. We had a review and did not like what we saw. We concluded that the QS system we had been publishing was simply not good enough.

    This is the position we were in, at the end of 2009. This was the mission as laid out by my esteemed editor, Ann Mroz.

    So we had to start all over again. First we had to establish what was wrong with the old system. Why was it attracting so much criticism?

    We convened a meeting of our editorial board, including:

    Drummond Bone, HE consultant and advisor to the UK government

    Bahram Bekhradnia, director of the Higher Education Policy Institute

    Malcolm Grant, Provost of UCL

    Simon Marginson, Professor of HE, University of Melbourne

    Philip Altbach, director, Center for International Higher Education, Boston College

    Research by Michael Bastedo from the University of Michigan found that the biggest and most important factor in achieving a good reputation score was the previous year's ranking position.

    It is open to manipulation: the US website Inside Higher Ed reported last year on a senior administrator who revealed that her colleagues were instructed to routinely give low ratings to all programmes other than their own institution's.

    But reputation is crucial in a globalised and competitive HE world. It also helps to capture some of the important but intangible things.

    And it is a measure that people want. But we had to devise a better and fairer way of doing it.

    Big problem with response rates to the QS survey.

    This issue was looked at by the European Universities Association last year.

    What QS does is aggregate three years' worth of data to bolster the numbers.

    QS also allows anyone to volunteer to take part; even I was invited once.

    QS allows universities to submit lists of names; surely that cannot be fair.

    QS also uses an employer survey, asking employers to say which universities they like to recruit from.

    But where do they get their lists of employers to survey?

    READ. That is never going to produce a fair result. There were also big problems with response rates to the employer survey. The other big problem we had was with the research excellence measure.

    By simply measuring the volume of citations against staff numbers, the old rankings took no account of the dramatically different citation habits between disciplines. Medical school advantage.

    This shows how crucial it is to normalise your data.
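
    As a purely illustrative, hypothetical example: if medicine papers worldwide average 20 citations each and history papers average 2, a history paper cited 4 times performs at twice its field's world average, while a medicine paper cited 10 times is at only half of its average, even though its raw count is higher. Raw counts reward disciplinary mix; normalised scores reward performance within a field.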

    Van Raan's "mortal sin": some rankings fail to normalise citations data at all. A major problem.

    This graph was produced by Van Raan at Leiden University.

    But despite all these clear weaknesses with the QS rankings, they became hugely influential.

    Back to that IHEP report I mentioned earlier

    So we really felt we had a duty to change the rankings. If so much weight was being placed upon rankings, they had to be able to bear that weight.

    To deliver on this mission, in late 2009 we ended a six-year relationship with QS, and brought in a new data partner: Thomson Reuters, the world's leading research data specialist.

    We worked with Thomson Reuters to build a new database of the world's leading institutions, and to develop a new mission and a new methodology for a more rigorous and robust university ranking system. A new brand was born.

    For us, it is a perfect partnership. Thomson Reuters brings exceptional expertise with data and research evaluation. We bring 40 years of getting under the skin of the higher education world. And by partnering with Thomson Reuters, we also got the expertise of Jonathan Adams. He is director of research evaluation for Thomson Reuters, and truly a world-leading expert: a UK government advisor on research policy, with work spanning the European Framework 7 programme, Australia and more. We couldn't ask for a more informed partner.

    So after establishing the flaws in the QS system, how did we go about developing our plans for a new ranking system?

    Well, first we asked the community. With Thomson Reuters, in 2009 we carried out a global survey of academics, university administrators and students, asking them what they thought of rankings, how they used them, which performance indicators they value, which ones don't work, and which ones they would like to see.

    READ. So we asked them what they'd really want to see in rankings.

    Read

    We took this survey information, and then we engaged in 10 months of open consultation with the community, through our news pages and website, to develop detailed proposals. These were then put to an expert advisory group of more than 50 leading experts from 15 countries.

    And in the next session, I will explain the detailed methodology that we came up with. But to make sure we're as accountable as it is possible to be, we need constant criticism and input.

    Only with the engagement of the higher education sector will we achieve a tool that is as rigorous and as transparent and as useful as the sector needs and deserves.

    So please use the sites and tools above to make sure you have your say and tell us what you think.