
KATHRYN RILEY and HARRY TORRANCE

BIG CHANGE QUESTION

AS NATIONAL POLICY-MAKERS SEEK TO FIND SOLUTIONS TO NATIONAL EDUCATION ISSUES, DO INTERNATIONAL COMPARISONS SUCH AS TIMSS AND PISA CREATE A WIDER UNDERSTANDING, OR DO THEY SERVE TO PROMOTE THE ORTHODOXIES OF INTERNATIONAL AGENCIES?

KATHRYN RILEY

Evidence about America’s performance on international league tables even makes an impact in the Oval Office. The President was horrified by the results from an international indicators study (TIMSS – the Third International Mathematics and Science Study) which showed that, for mathematics and science, American pupils were ranked 19th out of 21 countries – behind South Africa and Cyprus. President Bartlett’s indignation, expressed in a recent episode of the West Wing, may or may not be shared by President Bush, but there is little doubt these days that in our globalised, socially and economically competitive world, Governments take those figures seriously.

On the recent publication of the OECD’s Education at a Glance 2003, the London Times led with the following:

‘British pupils plummet in international league’
The United Kingdom is sliding down the educational league table because it is failing to keep pace with other countries, according to an international study published yesterday by the Organisation for Economic Co-operation and Development (OECD). Britain has fallen in a generation from 13th to 22nd among industrialised nations in terms of the proportion of 16-year-olds gaining the equivalent of five ‘A to C’ grades. (The Times, 17th September 2003)

Despite the fact that the UK has maintained its position as a world leader in Higher Education, The Guardian of the next day, 18th September, quotes Schools’ Minister, David Miliband, in defensive mode: “The survey doesn’t show that British schools are in decline, it shows that our rate of progress at GCSE was slower in the early 90s than in almost every other developed country.”

Journal of Educational Change 4: 419–425, 2003. © 2003 Kluwer Academic Publishers. Printed in the Netherlands.


The OECD first published a set of international education indicators in 1992 – Education at a Glance. The purpose, according to the OECD, was to provide Member countries with comparative information on the organisation and operation of their education systems. In response to further demands for comparative information on student performance, in 2000 the OECD launched the Programme for International Student Assessment (PISA), the focus of President Bartlett’s ire. PISA was designed to promote policy dialogue amongst OECD and non-OECD countries about learning outcomes. A number of non-OECD countries, such as Brazil, also chose to participate.

In a major review of the OECD’s education programme, George Papadopoulos, deputy director for education for a number of years, points out that while the OECD has no explicit remit on education, it has an inferred role in promoting education across its member countries, in terms of the contribution which education can make to economic growth and general well-being. According to Papadopoulos, the OECD has had a significant impact on educational thinking and policy developments in Member countries through a range of activities, including the development of comparative data.

The purpose of this Big Change Question is to explore whether that influence, particularly through the provision of comparative data, is malignant or benevolent. We had anticipated a piece from one of the OECD’s leading lights on indicators, but unfortunately did not receive this in time for our publication deadline. However, we decided to go ahead with this piece, as the question is such an important one.

The OECD’s website describes the focus of the 2003 edition of Education at a Glance as being the quality of learning outcomes and the policy levers that shape these outcomes. Education at a Glance 2003 includes a comparative picture of student performance in reading, mathematical and scientific literacy, as well as of students’ attitudes, engagement and learning strategies. There is also information on spending patterns in OECD countries; data on access, participation and progression in education across OECD countries; and an examination of students’ learning conditions, including information about instruction time and average class size.

In his opinion piece, Harry Torrance raises a number of concerns about how data is used and interpreted and about the robustness of the studies: how valid and reliable are they?

Is there a case to be made for TIMSS, Education at a Glance, and PISA? League tables and performance indicators – as schools well know – are blunt instruments, but increasingly, they are part of the new education currency. The issue is how they are used, and what they measure. Many aspects of the indicators are positive when they generate debate and dialogue. There has been a steady stream of visitors to Finland, for example, to explore why the Finnish system appears to be so successful. If the indicators “road show” can promote debate about when and how to teach reading to young children, or teachers’ qualification levels, this is all to the good. Comparative studies from the OECD, such as Schools under Scrutiny, can also provide helpful insights into challenging issues. The problems arise, however, when politicians seek simplistic, off-the-shelf solutions to the education challenges which their own countries face – solutions which are in fact highly context specific.

REFERENCES

OECD (1992). Education at a Glance. Paris: Organisation for Economic Co-operation and Development.

OECD (1995). Schools Under Scrutiny. Paris: Organisation for Economic Co-operation and Development.

OECD (2000). Knowledge and Skills for Life: First Results from PISA 2000. Paris: Organisation for Economic Co-operation and Development.

OECD (2003). Education at a Glance. Paris: Organisation for Economic Co-operation and Development.

Papadopoulos, G. (1995). Education 1960–1990: The OECD Perspective. Paris: Organisation for Economic Co-operation and Development.

KATHRYN RILEY

Institute of Education
London Leadership Centre
Woburn Square, WC1H 0NS
United Kingdom
E-mail: [email protected]

HARRY TORRANCE

The current TIMSS website claims to “provide participating countries with valuable information about the achievement of their students”, while PISA claims it “assesses how far students near the end of compulsory education have acquired some of the knowledge and skills that are essential for full participation in society.” But there is little evidence in the media or in policy development circles that policy-makers pay much attention to anything beyond the crude league tables which these sorts of studies produce. For all the detail that is included in the most recent TIMSS and PISA reports, TIMSS reproduces its league table of Maths and Science Achievement on the first page of its “Results” report, while the first links from the PISA executive summary take us straight to charts and tables indicating rank orders (see website references below). International comparisons such as TIMSS and PISA certainly have the potential to create understanding and identify where the strengths and weaknesses of education systems lie. Buried in the detail of their reports are potentially important findings, such as that the gender gap in achievement in Maths in England between boys and girls is wide and getting wider (TIMSS 1999, Maths report p. 50). But the detail is as nothing when compared to the headlines generated by the league tables.

Criticisms of international comparison studies are many and varied, but can be summarised in terms of the validity and reliability of the findings on the one hand, and their political impact on the other. In other words – do the “findings” actually say anything very meaningful about the state of education in different countries, and if so, does the league table presentation of results do more harm than good?

With respect to validity and reliability, questions have to be asked about sampling, curriculum fidelity and the different importance attached to doing well by different countries, along with the extensive research evidence we have about how children often misinterpret test questions anyway. Thus, for example, we know that some students can be considerably older than others in ostensibly the same grade sample because of early/late birthdays or policies of retention, while low attainers may be excluded from some samples altogether because of such policies and/or selection policies which keep students on vocational tracks out of some country samples (Brown, 1998). Also, not all children are in school in some of the developing countries taking part. In the event, sample size has actually varied considerably from country to country, with, for example, 8362 Canadian students being assessed in TIMSS as against 1776 English students – the latter representing a much smaller proportion of the school population (Beaton et al., 1997).

Ensuring curriculum fidelity in any test situation is difficult, the more so when trying to compare like-with-like across continents. Maths is often taken to be the most culturally neutral subject with respect to this issue, but even here there are significant variations in what is emphasised and practised within the Maths curriculum, both across countries and within countries if highly differentiated streaming and tracking policies are pursued, and this will certainly make a difference to results. Thus, for example, England scored well in Geometry in the 1996 TIMSS results, but came out poorer in Maths overall, especially with respect to computation, reflecting the English curriculum emphasis (no prizes for guessing which result the press focussed on).

With respect to student effort and concentration, we have evidence that some are encouraged to regard participation in such studies as “high stakes” and important for national pride (Brown, 1998), while others (especially in countries obsessed with testing, such as the United States and England) will see taking yet another test – but fortunately a “low stakes” one, related to research rather than life chances – as little more than a boring waste of time to be endured and forgotten about as quickly as possible. We also know from a number of studies that interpreting test questions is by no means as straightforward as it may appear, that children can make mistakes for all sorts of reasons, and that they can often answer the question correctly if it is posed in a different way (Cicourel et al., 1974; Cooper & Dunne, 2000).

As regards the political impact of TIMSS and PISA, the authors might argue that they cannot control this and should not be held responsible for it. Yet the rankings that are produced are clearly designed to attract attention, while the caveats which are included in the reports are routinely ignored. Thus very small differences in mean scores, perhaps attributable to the validity problems outlined above, can lead to major differences in published rankings. But while reports include riders to this effect, often indicating in the more detailed tables that the differences in the results of countries ranked closely together are not statistically significant, this is lost in the rush to report that England is, for example, 7th in the PISA ranking of literacy, when it could as easily have been 3rd or 9th (the ranking range of countries whose differences in scores are not statistically significant).
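The point about insignificant differences producing different rankings can be illustrated with a short simulation. The Python sketch below uses entirely hypothetical country labels, mean scores, sample size and spread – none of them drawn from PISA or TIMSS data – to show how repeated surveys of countries whose underlying means differ by only a point apiece can publish quite different league tables:

```python
import random
import statistics

# Hypothetical illustration only (not real PISA or TIMSS figures): five
# countries whose "true" mean scores differ by a single point, each surveyed
# with a finite sample of students.
true_means = {"A": 523, "B": 522, "C": 521, "D": 520, "E": 519}
sample_size = 1800   # assumed size of a national sample
student_sd = 90      # assumed spread of individual student scores

def observed_ranking(seed):
    """Simulate one survey: draw a sample per country and rank the sample means."""
    rng = random.Random(seed)
    sample_means = {
        country: statistics.fmean(rng.gauss(mu, student_sd) for _ in range(sample_size))
        for country, mu in true_means.items()
    }
    return sorted(sample_means, key=sample_means.get, reverse=True)

for seed in range(5):
    print(f"survey {seed}: ranking = {observed_ranking(seed)}")

# The standard error of each country mean is roughly 90 / sqrt(1800), about
# 2.1 points, so one-point gaps between neighbours are nowhere near
# statistically significant; yet each simulated survey can publish a
# different league table.
```

Run with different seeds, the same five countries come out in different orders, which is precisely the caveat that the detailed tables carry and the headlines ignore.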

As important to political discourse and policy-making, however, is the very limited range of subjects which TIMSS and PISA include in their assessments – and this is something for which they are responsible. Thus Maths, Science and Literacy, but especially Maths and Literacy, are taken as proxies for the performance of education systems overall, when they quite clearly are no such thing. No measures are produced of performance in other academic areas, far less of attitudes and values across the curriculum as a whole, or of whether or not schools are producing decent, tolerant and curious citizens of the future. This leads to the absurd situation which we currently have in England whereby, for once, the results seem to have brought relatively good news in terms of a high ranking in Science (TIMSS 1999 Science results), while there is widespread policy panic about the poor attitudes which students have towards science, the lack of creativity in the science curriculum, and the diminishing numbers of students going on to do Maths and Science in further and higher education (see BBC Education News website in references below).

So, scores go up while interest in science declines – are our schools doing a good job or not? The answer is that they’re doing what they’re told to do – by government. In response to previous results reporting low rankings for England, the government has introduced a National Curriculum and testing system which has driven domestic test scores up at the expense of student interest and enjoyment. More particularly, the government has introduced very focussed curricular programmes in Numeracy and Literacy, specifying in great detail what students should learn and, perhaps as importantly, how they should be taught this content and for how long each day. Whole programmes of lessons are now scripted for teachers of these subjects. As a consequence, more time is spent on narrower versions of Maths and English than hitherto, with the knock-on effect of diminishing the time available for the arts and humanities.

Could TIMSS and PISA “create wider understanding”? Yes – especially if the caveats to the results were more widely known and discussed, and if the reports concentrated on analysing common errors and misunderstandings and providing advice to teachers on what to do about them (though of course this sort of analysis is probably better conducted by national surveys or tests focussing more specifically on national curricula than by cross-country common instruments (Williams & Ryan, 2000)). Do TIMSS and PISA “serve to promote the orthodoxies of international agencies”? Of course they do – they help to create and reinforce a climate that views education as narrow skill preparation for future employment, rather than as a challenging engagement with the knowledge and understanding that constitutes our culture and the democratic processes which future citizens must control. They narrow definitions of education and undermine the impulse to innovation and curiosity which must be at the core of any healthy education system worth the name. For all the detail in their reports that might inform educators of problems to address and processes to develop, the headline results deny educators the opportunity to do this by their impact on policy. What physicists realised some time ago, but educational testing people seem averse to acknowledging, is that when you measure something you change it. Now that is the “big change question” that organisations such as TIMSS and PISA really ought to be addressing.


REFERENCES

Beaton, A. et al. (1997). Mathematics Achievement in the Middle School Years: IEA’s Third Maths and Science Study. Massachusetts: Boston College.

Brown, M. (1998). The tyranny of the international horse race. In R. Slee, G. Weiner & S. Tomlinson (eds), School Effectiveness for Whom? London: Falmer Press.

Cicourel, A. et al. (1974). Language Use and School Performance. New York: Academic Press.

Cooper, B. and Dunne, M. (2000). Assessing Children’s Mathematical Knowledge. Buckingham: Open University Press.

Williams, J. and Ryan, J. (2000). National testing and the improvement of classroom teaching: Can they co-exist? British Educational Research Journal, 49–74.

http://nces.ed.gov/timss/results.asp
http://news.bbc.co.uk/1/hi/education/1820838.stm (Maths students in decline)
http://news.bbc.co.uk/1/hi/education/2120424.stm (Science lessons “tedious and dull”)
http://news.bbc.co.uk/1/hi/education/3120708.stm (Testing “harming school science”)
http://timss.bc.edu/timss1999i/science_achievement_report.html
http://timss.bc.edu/timss1999i/math_achievement_report.html
http://www.pisa.oecd.org/
http://www.pisa.oecd.org/knowledge/summary/intro.htm

HARRY TORRANCE

Institute of Education
Manchester Metropolitan University
United Kingdom
E-mail: [email protected]
