EDUCATIONAL RESEARCH CENTRE
Programme of Work
and Publications 2004
Educational Research Centre
St Patrick’s College, Dublin
Phone: Dublin (01) 8373789
Fax: Dublin (01) 8378997
Email: [email protected]
Web: www.erc.ie
(09/04)
EDUCATIONAL RESEARCH CENTRE
STAFF 2004
Thomas Kellaghan, B.A., Ph.D., LL.D., F.B.Ps.S., F.I.A.E., M.A.E., Director
Gerry Shiel, B.Ed., M.S.Ed., Ph.D., Research Fellow
Peter Archer, B.A., Ph.D., Research Fellow
Nick Sofroniou, B.Sc., Ph.D., F.R.S.S., Research Fellow
Mary Lewis, M.Soc.Sc., Ph.D., Research Associate
Susan Weir, B.A.(Mod.), Ph.D., Research Associate
David Millar, B.Sc., D.Phil., Research Associate
Eemer Eivers, B.A.(Mod.), Ph.D., Research Associate
Judith Cosgrove, M.A., Research Associate
Patrick Forde, M.A., Research Associate
Déirdre Stuart, B.A., M.Psych.Sc., Research Associate
Fionnuala Shortt, M.A., Research Assistant
Sarah Zastrutzki, Dip.Psych., Research Assistant
Louise Pembroke, B.A.(Mod.), Research Assistant
Rachel Perkins, B.A., M.Psych.Sc., Research Assistant
Paul Surgenor, B.Sc., Dip.Ind.Studies, Research Assistant
Mary Rohan, B.A., M.Ed., Administrative Officer
John Coyle, B.Sc., Dip.Stats, Systems Administrator
Eileen Corbett, B.A., Dip.Lib.Info.Systems, Assistant Librarian
Helen O'Gara, Executive Officer
Hilary Walshe, Executive Officer
Blána Kelly, Clerical Officer
Stephen McMahon, M.A., Administrative Assistant
CONTENTS
PROGRAMME OF WORK 2004 1
1. National Assessment of English Reading (NAER) ..................................... 2
2. National Assessment of Irish ....................................................................... 3
3. Programme for International Student Assessment (PISA) .......................... 4
4. Third International Mathematics and Science Study (TIMSS) ................... 7
5. An Evaluation of the ‘Breaking the Cycle’ Intervention Programme in Disadvantaged Areas .................................................................................. 8
6. Primary Assessment of Writing Skills: Pilot Study .................................... 10
7. OECD International Survey of Upper Secondary Schools (ISUSS) .......... 11
8. An Evaluation of ‘Early Start’ ..................................................................... 12
9. Standardized Test Development .................................................................. 15
10. Development of Curriculum Profiles for Primary School ........................... 17
11. Calculators in Mathematics Study ............................................................... 18
12. A Review of Procedures for the Selection for the Receipt of Targeted Support to Deal with Educational Disadvantage ........................................ 20
13. Review of Existing Programmes Aimed at Addressing Disadvantage in Ireland and of Recent International Literature on Effective Responses to the Problems Associated with Disadvantage .............................................. 22
14. A Study of Reading in Schools Designated as Disadvantaged.................... 24
15. The Impact of Examination Components on Total Score in the Leaving Certificate Examination .............................................................................. 26
16. Operation of the Bonus Mark System for Answering Through Irish in the Leaving Certificate Examination ................................................................ 27
PUBLICATIONS 2000-2004 .............................................................................. 29
EDUCATIONAL RESEARCH CENTRE
PROGRAMME OF WORK, 2004
1 (ERC PROGRAMME OF WORK AND PUBLICATIONS 2004) 1
1. National Assessment of English Reading (NAER)
A national assessment of English reading (NAER) in first and fifth classes
took place in May 2004 in a probability sample of 152 schools. The survey was
implemented concurrently with a National Assessment of Mathematics Achievement
(NAMA) at fourth class level. Development work for NAER which began early in
2003 resulted in a new test of reading achievement for pupils in first class, and the
development of two new test booklets at fifth class level to replace two of the five
existing booklets (these comprise a test called TARA, Tasks for the Assessment of
Reading Achievement), which were used in NAER 1993 and NAER 1998.
The new test materials take account of changes in the primary school English
curriculum and of developments in recent international assessments of reading.
Existing questionnaires (school, pupil, pupil rating form, parent) were
reviewed and refined, taking salient policy issues and comparability with the 1998
survey into account. New questionnaires for class teachers, learning-support
teachers, and inspectors were also developed. The developmental work and
resulting tests and questionnaires are described in a framework document that was
released in February 2004, and may be accessed in full or summary form at
http://www.erc.ie/na2004.html
Work on NAER is continuing over the course of 2004, with the
implementation of the survey in schools, checking of returns, data entry, cleaning,
analysis, and test scaling using item response theory. A particular challenge is
to scale the test data at fifth class level in a way that allows trends in
reading achievement since 1998 to be reported. A report on NAER 2004 is expected at the end of
2005.
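The common-item linking that such trend reporting requires can be illustrated in miniature. The sketch below is not the Centre's operational procedure: it uses a crude logit-of-proportion-correct difficulty estimate in place of full maximum-likelihood Rasch estimation, and the item names and responses are invented. It shows only the core idea of mean-mean linking: estimate the difficulties of the items common to both administrations, then shift the new scale by the average difference.

```python
import math

def item_difficulty(responses):
    """Crude difficulty estimate: the logit of the proportion answering
    incorrectly. (Operational IRT scaling would use maximum-likelihood
    Rasch estimation, not this shortcut.)"""
    p = sum(responses) / len(responses)  # proportion correct
    return math.log((1 - p) / p)

# Invented responses (1 = correct) to two link items common to both surveys.
resp_1998 = {"link1": [1, 1, 0, 1, 0, 1], "link2": [1, 0, 0, 1, 0, 0]}
resp_2004 = {"link1": [1, 1, 1, 1, 0, 1], "link2": [1, 1, 0, 1, 0, 1]}

b_1998 = {item: item_difficulty(r) for item, r in resp_1998.items()}
b_2004 = {item: item_difficulty(r) for item, r in resp_2004.items()}

# Mean-mean linking: place the 2004 difficulties on the 1998 scale by
# shifting them by the average difference on the common items.
shift = sum(b_1998[item] - b_2004[item] for item in b_1998) / len(b_1998)

def to_1998_scale(difficulty_2004):
    return difficulty_2004 + shift
```

With linked scales in place, an achievement estimate from the 2004 administration can be reported directly against the 1998 baseline.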
The work on NAER 2004 is guided by a national committee comprising
representatives from the partners in education, and chaired by a member of the
primary-school inspectorate.
2. National Assessment of Irish
The Centre has been collaborating with Institiúid Teangeolaíochta Éireann
(ITÉ) in conducting a national assessment of the speaking, listening, and reading
skills in Irish of sixth class pupils in primary schools.
Field work for the assessment took place in May and June 2002. Data were
collected in all-Irish schools and schools in the Gaeltacht as well as in ordinary
schools.
Modified versions of tests used in previous studies by ITÉ are being used to
assess listening and speaking skills. A new test of Irish reading was developed at the
Centre. It has two versions, which have some subtests in common, one for use in
all-Irish schools and schools in the Gaeltacht, and another for use in ordinary
schools.
Questionnaires for teachers, parents, and pupils that were used in previous
ITÉ studies have been adapted for the survey. A school questionnaire similar to
ones used in the recent national assessments of English reading and mathematics
was also used. A separate survey was carried out in which primary-school
inspectors were asked about their experience of, and attitude to, the teaching and
learning of Irish.
A report which is in preparation will deal with pupil achievement and the
relationship between achievement and a variety of home and school factors. In the
case of speaking and listening achievement, comparisons will be made between the
most recent data and data from previous ITÉ studies.
3. Programme for International Student Assessment (PISA)
The Programme for International Student Assessment (PISA) is a multi-year
assessment programme, developed jointly by member countries of the Organisation
for Economic Co-operation and Development (OECD). It is aimed at assessing the
broad educational achievements in reading, mathematics, and science of 15-year-olds
(the modal age in OECD countries for the end of compulsory schooling) and
their preparedness for adult life. The programme is steered by member governments
through the OECD, on the basis of shared, policy-driven interests, and is managed
by a consortium of institutions led by the Australian Council for Educational
Research (ACER). Each cycle of PISA focuses on a major achievement domain and
on two minor domains. In the first cycle in 2000, reading was the major focus. In
the second cycle, for which testing took place in 2003, mathematics was the major
focus; in the final cycle (2006), it will be science. While 32 countries participated in
PISA 2000, 41, including all 30 OECD-member countries, are involved in the
second cycle (‘PISA 2003’).
The first cycle of PISA (‘PISA 2000’) ended in December 2001 with the
simultaneous publication of the OECD report and a national report for Ireland. A
thematic report, focusing on students’ reading literacy, was published by the OECD
in 2002, and further thematic reports on student approaches to learning, and student
engagement in schooling were published in 2003.
Current work on the PISA project at the Educational Research Centre entails
secondary analyses of the PISA 2000 data, preparation for dissemination of the main
outcomes of PISA 2003 in December 2004, and preparation for the PISA 2006 field
trial in March 2005.
The major focus of the PISA 2003 assessment was on mathematical literacy,
in which students were assessed in relation to four ‘over-arching ideas’: quantity,
shape and space, change and relationships, and uncertainty. Minor domains in
which students’ competencies were assessed were reading literacy, scientific
literacy, and cross-curricular problem solving. An OECD publication, the PISA 2003
Assessment Framework – Mathematics, Reading, Science and Problem Solving
Knowledge and Skills, which was published in July 2003, outlines how all four
domains are assessed. This publication can be downloaded at www.pisa.oecd.org.
PISA 2003 was implemented in Ireland in March 2003 in a nationally
representative sample of 145 schools. Almost 4,000 15-year-olds were involved. The
assessment included questionnaires for schools and pupils (the school questionnaire
and student questionnaire). In Ireland, a nationally-developed teacher questionnaire
was also administered to pupils’ mathematics teachers. The teacher questionnaire
sought to obtain information about the teaching of mathematics in schools,
particularly at Junior Cycle level.
It is planned to publish a short national report on the performance of Irish
students in PISA 2003, to coincide with publication of the initial international report
on December 7, 2004. It is also intended to publish a full national report in spring
2005 that will further contextualise the outcomes of PISA 2003 and will include
multi-level models of performance. In preparation for these national reports, the
PISA 2003 database was matched with the 2002 and 2003 Junior Certificate
Examination databases of the Department of Education and Science in January. A
95% match was obtained. Each mathematics item in the PISA 2003 assessment was
rated by Irish mathematics curriculum experts with respect to its expected familiarity
to Junior Cycle students taking the Junior Certificate mathematics examination at
each syllabus level (higher, ordinary and foundation). These ratings will be used to
analyse outcomes on PISA 2003 mathematical literacy in Ireland at both the item
level and the student level.
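A database match of this kind amounts to a left join on a shared pupil identifier, with the match rate computed as the proportion of PISA records for which an examination record was found. The identifiers, field names, and records in the sketch below are all invented for illustration; they are not the fields of the actual Department databases.

```python
def match_records(pisa, exam):
    """Left-join PISA pupil records to examination records on a shared ID;
    return the merged records and the proportion of PISA pupils matched."""
    exam_by_id = {rec["id"]: rec for rec in exam}
    merged = []
    for rec in pisa:
        hit = exam_by_id.get(rec["id"])
        merged.append({**rec, "jc_maths": hit["jc_maths"] if hit else None})
    rate = sum(r["jc_maths"] is not None for r in merged) / len(merged)
    return merged, rate

# Hypothetical records (IDs and grades are invented).
pisa = [{"id": "p01"}, {"id": "p02"}, {"id": "p03"}, {"id": "p04"}]
exam = [{"id": "p01", "jc_maths": "B"}, {"id": "p02", "jc_maths": "C"},
        {"id": "p04", "jc_maths": "A"}]

merged, rate = match_records(pisa, exam)  # 3 of 4 PISA pupils matched
```

Unmatched pupils are retained with a missing examination result, so the merged file still supports analyses of the full PISA sample.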
At OECD level, consideration is being given to the development of thematic
reports based on the outcomes of PISA 2003 in such areas as: teaching and learning
strategies, school characteristics, organisation and structure, educational pathways,
use of and access to technology, cross-curricular problem solving skills, and links
between the mathematics component of the 2003 Trends in International Mathematics
and Science Study (TIMSS 2003) and PISA mathematical literacy. Thematic
analyses of the PISA 2003 database in Ireland will commence in 2005 under the
guidance of the Irish PISA national committee.
The PISA Consortium is currently preparing a framework for the assessment
of scientific literacy as a major domain in PISA 2006. Stimulus texts and test items
are also being prepared. Countries are invited to submit texts and items for
consideration in the international item pool. It is expected that the scientific literacy
framework will be finalised by July 2004. An optional computer-based assessment
of scientific literacy is also being prepared for piloting in spring 2005; it is not clear
at this time which countries will participate in this option. Other options being
considered by countries for PISA 2006 include internationally-developed Teacher
and Parent Questionnaires, and sets of questionnaire items for students on self-
regulated learning, computer familiarity, and educational career paths. Countries are
currently reviewing science test items and questionnaires for PISA 2006, which are
to be finalised in Autumn 2004. The field trial for PISA 2006 will take place in
Ireland in March 2005.
Further Information
www.erc.ie/pisa (national website); www.pisa.oecd.org (OECD website).
4. Third International Mathematics and Science Study (TIMSS)
The Third International Mathematics and Science Study (TIMSS) was
carried out under the auspices of the International Association for the Evaluation of
Educational Achievement (IEA) in over 40 countries in 1995. The study was
designed to answer questions about student achievement in mathematics and
science, pedagogical practices, and educational polices around the world. Data for
Ireland were collected by the Educational Research Centre.
Students' achievement in mathematics and science was assessed in grades 3
and 4 in primary schools and in first and second year in post-primary schools.
Information was also collected about curricula, students' study habits and attitudes,
and teaching methods.
Major reports with details of students' performance in participating countries
(including Ireland) that have been published include: Mathematics Achievement in
the Primary School Years, Mathematics Achievement in the Middle School Years,
Science Achievement in the Primary School Years, and Science Achievement in the
Middle School Years. Data from the study have been used in the OECD publication,
Education at a Glance.
Among further analyses at the Centre, relationships between the performance
of first and second year students on the TIMSS achievement scales and their later
Junior Certificate Examination results have been examined. The predictive utility of
three TIMSS scales for developing explanatory models of performance on the Junior
Certificate Examination has been examined.
A comparison of the performance of pupils in multigrade and single-grade
classes in primary schools in TIMSS mathematics and the 1999 National
Assessment of Mathematics Achievement is in progress.
5. An Evaluation of the ‘Breaking the Cycle’ Intervention Programme in
Disadvantaged Areas
Client: Department of Education and Science
Following studies at the Educational Research Centre, the ‘Breaking the
Cycle’ programme was introduced to primary schools in 1996 on a pilot basis. The
scheme was designed to provide intensive support to 33 large urban and 123 small
rural primary schools which had been identified as experiencing high levels of
educational disadvantage. In urban schools in the programme, junior classes were
reduced to about 15 pupils and special grant assistance for the purchase of books,
teaching materials and equipment, enhanced capitation grants, and targeted in-career
development for school staffs were provided. In the rural component of the scheme,
25 clusters of approximately five geographically proximal schools received special
supports. Each cluster is served by a local co-ordinator whose role involves working
with staff of the selected schools as well as with families served by the schools.
Rural schools also received enhanced capitation grants, access to special grant
assistance for the purchase of teaching materials, and support for teachers in the
form of targeted in-career development.
The Educational Research Centre has carried out an evaluation of the
programme, monitoring its implementation and effects over a period of five years.
Data on the effects of participation were collected at school, teacher, and pupil levels
through the use of questionnaires, tests, and interviews. The aim of the evaluation
was to assess the overall effectiveness of the scheme, to identify models of good
practice, and to examine how participation in the scheme affected schools, teachers,
and pupils. Questionnaire data on the effects of participation at school and teacher
level were collected each year. At pupil level, data were gathered on pupils’
achievements, attainments, and attitudes to a range of school-related issues.
The achievements in reading and mathematics of 3rd and 6th class pupils
were assessed using standardised tests in the first and fourth years of the scheme.
The results indicated that the achievements of pupils on both occasions were
significantly lower than those of pupils nationally, and that there was a statistically
significant decrease in the average reading and mathematics achievements of pupils
in 6th class between 1997 and 2000. Follow-up testing was conducted on a cohort of
6th class pupils in 2003, the majority of whom should have received their junior
education in small classes. The reading and mathematics levels of pupils in the 2003
cohort did not differ from those in the 2000 cohort, and were significantly lower
than those of pupils in 6th class when the scheme began. As some of the educational
effects of participation in the scheme may not emerge until the longer term, it is
planned to compare the Junior Cycle completion rates in 2007 of pupils who
participated in the scheme with those of a cohort of pupils prior to the introduction
of the scheme.
6. Primary Assessment of Writing Skills: Pilot Study
Most studies of children’s writing in Ireland have tended to focus on the
extent to which the objectives of writing instruction are achieved and strategies for
the development of writing skills are implemented. There is little information on
standards or the quality of children’s writing. Little is known either about the link
between the reading and writing skills of pupils. This contrasts with our knowledge
of the reading skills of pupils, derived from regular surveys which have been carried
out since 1972. Lack of information about standards in writing is particularly
problematic at a time when schools and teachers are beginning to implement the
revised primary school curriculum for English.
To address this problem, a short assessment of writing skills (the Primary
Assessment of Writing Skills; PAWS) was developed early in 2002 and piloted in
May 2002 in 41 fifth classes in 24 schools. Its purpose was to describe the quality of
children’s writing skills in a manner that would be relevant both to policymakers and
teachers. In the longer term, it is intended to carry out a writing assessment with a
nationally representative sample.
The development of the assessment was guided by recent large-scale
assessments of students’ writing skills in the U.S. and a recent Irish study that
examined the writing abilities of third class pupils. A set of nine appropriate writing
prompts was developed. Three are designed to elicit a piece of narrative writing,
three to elicit a persuasive or argumentative piece, and three to elicit expository or
descriptive writing.
A rubric for scoring pupils’ writing samples was developed. It involves
assessment and rating on three 5-point scales of children’s use of conventions; the
coherence or structure of the piece; and its overall quality. Procedures are in place to
ensure that the rating scales are reliably used by trained markers (primary teachers).
A proportion of samples will be double-marked to obtain a quantitative indicator of
marker reliability.
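For the double-marked samples, one common quantitative indicator is chance-corrected exact agreement between the two markers. The sketch below computes Cohen's kappa for two hypothetical markers rating the same scripts on a 1-5 scale; the ratings are invented, and this is offered as an illustration of the kind of indicator involved, not as the study's actual reliability statistic.

```python
from collections import Counter

def cohens_kappa(marker_a, marker_b):
    """Chance-corrected exact agreement between two markers' ratings.
    (Assumes the markers do not agree purely by chance on every script,
    i.e. expected agreement is below 1.)"""
    n = len(marker_a)
    observed = sum(a == b for a, b in zip(marker_a, marker_b)) / n
    freq_a, freq_b = Counter(marker_a), Counter(marker_b)
    # Agreement expected if the two markers rated independently.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of ten scripts on a 1-5 scale.
marker_a = [3, 4, 2, 5, 3, 1, 4, 2, 3, 5]
marker_b = [3, 4, 2, 4, 3, 1, 4, 3, 3, 5]

kappa = cohens_kappa(marker_a, marker_b)
```

Values near 1 indicate that trained markers apply the rubric consistently; low values would signal a need for further marker training or rubric revision.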
In addition to the assessment of writing, pupils were administered the
Drumcondra Primary Reading Test to establish links between reading and writing
skills. A summary of the results of the assessment is being prepared.
7. OECD International Survey of Upper Secondary Schools (ISUSS)
Over the past several years the OECD, and Network C in particular, have
shown increasing interest in understanding school processes. This is reflected in a
cycle of school surveys that began in 1995 at the primary level and now extends to
the International Survey of Upper Secondary Schools (ISUSS), which is designed to
produce institutional process indicators that, it is hoped, will reveal some of the key
characteristics of both the learning environment and the organisation of schools that
contribute to student achievement.
The main activity of the ISUSS was a survey of school principals in 17
OECD countries to develop data sources on upper secondary schools in four main
areas, chosen because they demonstrate school organizational processes that have a
direct impact on the quality of learning:
1. school characteristics aimed at facilitating transition to the labour market and/or
further education;
2. conditions of schooling that are instrumental in enhancing educational quality;
3. human resources; and
4. availability and use of information and communications technology (ICT).
The survey was piloted between April and May 2001, and data were collected in the
main survey between November 2001 and January 2002. The main findings of the
survey are presented in an OECD (2004) report entitled Completing the foundation
for lifelong learning: An OECD survey of upper secondary schools, while some of the
data have also been presented as indicators in OECD’s Education at a Glance.
8. An Evaluation of ‘Early Start’
Client: Department of Education and Science
Early Start was established in eight preschool units in seven designated urban
areas of severe disadvantage in October 1994 and was extended to a further 32 units
in 1995. The curriculum in the preschools was adapted from the Rutland Street
experience, and accords priority to the development of language and cognitive skills.
In addition to encouraging parents to participate in all aspects of provision and
management, the preschools have as a major objective the enlisting of the assistance
of the Home-School-Community Liaison service and of appropriate local
community agencies.
An evaluation of Early Start was carried out by the Educational Research
Centre between 1994 and 1998. Early Start provision was found to have been
successfully incorporated into the school system. Parents expressed positive
attitudes towards the school and became involved in many of its activities.
However, the performance on standardized tests of the first wave of children in the
programme when they were in junior infants classes did not indicate that there had
been any effect on their cognitive or scholastic development, though teachers judged
Early Start participants to be superior in cognitive and language abilities to
comparable children who had not experienced Early Start. They also judged Early
Start children to be superior to non-Early Start children in their adaptation to
classroom procedures and in their general ‘readiness’ for school.
In a follow-up study, the scholastic achievements in reading and
mathematics of Early Start pupils when they were in second grade in November
1998 were found not to differ significantly from the achievements of non-Early Start
pupils when they were in second grade in 1994.
Further evaluation activity was initiated in 2000. Its starting point was an
investigation of the extent to which change had occurred in relation to a number of
aspects of Early Start that had been identified in the earlier evaluation as requiring
attention. Changes in implementation were monitored in relation to attendance,
emphasis on cognitive development, consistency of implementation across
preschools, quality of support for staff, parent involvement, individual attention,
adult-child ratios, demand for places and selection of pupils, selection of schools,
and the integration of Early Start with primary schools and other local initiatives
designed to tackle disadvantage.
Change in relation to some of these areas has been actively promoted within
Early Start. For example, some aspects of the intervention were singled out for
attention during in-service days and in the curricular guidelines that were distributed
to participating schools in 1998. Questionnaire data obtained from school principals
and experienced teachers who were appointed to Early Start before the introduction
of the guidelines suggest improvements in implementation, reflected in an increase
in parent involvement, better working relationships between teachers and child-care
workers, and a shift towards small-group (six or fewer) learning contexts. The
introduction of written guidelines and continuing in-career development are the two
main supports that are believed by school personnel to have facilitated the
improvements.
Teachers who began working in Early Start after the guidelines were
introduced provided additional questionnaire data that largely confirm these
conclusions. They too reported good working relationships between themselves and
child-care workers and expressed satisfaction with parent involvement, while noting
an emerging trend of low levels of classroom contact with ‘working’ parents. They
also acknowledged some difficulty in sustaining the levels of involvement achieved
and in reaching a core of parents described as ‘reluctant’.
All of this information suggests a good deal of consistency in the views of
school personnel on the implementation of key aspects of Early Start. However,
because these evaluation data are based on secondary sources (the reports of
principals and teachers), the need for further validation by direct observation of
practice in schools was indicated. Accordingly, some 19 centres selected at random
were visited by two observers at the end of the 2001/2002 school year. The visits
confirmed again that improvements had occurred with regard to staff relations,
parent involvement, and time allocated to small-group activity while underlining the
persisting problem of poor attendance. A further important observation to emerge
from the visits relates to the poorer than expected progress achieved by children
when judged against the objectives for Early Start laid down in the beginning- and
end-of-year assessment profiles.
An observation instrument, adapted from a revised version of the Early
Childhood Environmental Rating Scale known as ECERS-R (Harms, Clifford &
Cryer, 1998) was administered in one class session during each visit. The adapted
scale consisted of a total of 28 items distributed across seven subscales: space and
furnishings; personal care; language reasoning; activities; interaction; programme
structure; and parents and staff. Analysis of the data showed overall performance to
be better on some parts of the scale (interaction, language-reasoning, and space and
furnishings) than on others (activities, personal care routines, parents and staff, and
programme structure) and highlighted scope for improvement in those aspects of the
curriculum involving cognitive development, music, art and socio-dramatic play.
While some differences between centres were revealed, the extent of variation was
limited overall.
Current evaluation objectives include the development of a profile, based on
the Early Start profiles in language and cognitive development, for use by junior-
infant teachers in schools with Early Start centres. It is hoped that, by obtaining
information on learning outcomes, an opportunity will be gained not only to provide
further insights on children’s progress but also to assist in facilitating stronger links
between Early Start and primary schools.
Reference
Harms, T., Clifford, R.M., & Cryer, D. (1998). Early childhood environmental
rating scale. New York: Teachers College Press, Columbia University.
9. Standardized Test Development
Since the mid-1990s, the Centre has updated its range of group-administered
standardized tests for primary schools with the development of the Drumcondra
Primary Reading Test (DPRT) and the Drumcondra Primary Mathematics Test
(DPMT). In addition, the Drumcondra Reasoning Test (DRT), a test of verbal
reasoning and numerical ability designed for use at the point of transition from
primary to post-primary schooling, has been developed. The tests are available to
schools through the Centre’s Test Department (www.erc.ie/testdept.htm).
In early 2004, the Centre published the Drumcondra Spelling Test (DST), a
new group-administered test of spelling for pupils in first to sixth classes in primary
schools. The test, which was standardized in a representative national sample of
schools in May 2002, provides summative information on a pupil’s development in
spelling. Suggestions are provided in the test manual for working with pupils
experiencing difficulty in learning to spell. Another new test, the Drumcondra
Sentence Reading Test, which was also standardized in Spring 2002, is currently
being used for research purposes only.
The implementation of the revised primary school mathematics curriculum in
Autumn 2002 necessitated the revision of the DPMT. In 2003, a pilot study, using
some old items and some new ones, was conducted in first and second classes in
primary schools. Current work is focused on developing new item sets for third to
sixth classes, with a view to piloting in 2004. It is planned to standardize the revised
DPMT in a nationally-representative sample of schools in Spring 2005. An
important feature of the revised test is the use of calculators in the tests designed for
pupils in fourth to sixth classes. This is being handled in two ways. At fourth class
level, where pupils are just beginning to use a calculator, they are to be asked to
complete two sections of the test without a calculator, and one section, consisting of
more numerically complex items, with a calculator. In fifth and sixth classes, pupils
are asked to complete two sections with a calculator, and a third ‘non-calculator’
section, without one. In line with the 1999 Primary School Mathematics Curriculum,
the revised test places less emphasis on number and more on shape and space and
data than does the current DPMT. The revised test features a combination of
multiple-choice and short constructed response items. An Irish language version of
the revised DPMT is planned for 2005.
It is planned to work on the development of a revised version of the
Drumcondra Primary Reading Test (DPRT) in 2004. While it is planned to retain
existing sections on Word Identification (first and second classes), and Vocabulary
(first to sixth classes), with some modifications to reflect changes in word usage, it is
intended to make more substantive modifications to sections dealing with Reading
Comprehension. The revisions are expected to result in the presentation of shorter
narrative and expository texts (reflecting feedback from teachers who have used the
DPRT), and the inclusion of some document texts, such as tables and charts, in line
with the emphasis in the 1999 primary school English curriculum on introducing
pupils to a broad range of text genres. In addition to developing separate
performance scales for each class level, it is planned to develop a single scale
spanning all six levels of the test. The test will be standardized at the same time as
the revised DPMT in May 2005.
Planning for the development of a standardized language test in Irish for
ordinary schools will also begin in 2004. It is likely that the test will consist of
listening comprehension and vocabulary identification activities for pupils in first
and second classes, and activities involving listening comprehension, word
identification, reading vocabulary, and reading comprehension for pupils in third to
sixth classes.
10. Development of Curriculum Profiles for Primary Schools
With the long-term objective of assisting schools to develop a policy on
assessment within the framework of school plans, and to implement guidelines on
assessment for the revised curricula for primary schools, work on the development
of curriculum profiles has been underway at the Centre for a number of years. In
autumn 2000, the Department of Education and Science sent a copy of the
Drumcondra English Profiles to every primary school teacher; the Profiles are
currently available on-line.
Profiles for Gaeltacht schools and Gaelscoileanna, Próifílí Measúnachta don
Ghaeilge sna Scoileanna Gaeltachta agus Scoileanna Lán-Ghaeilge, are to be
published later in 2004. These are designed to be used by class teachers to assess
pupils’ oral language, reading and writing in Irish, using an array of informal
approaches to assessment.
Future work on the development of classroom-based assessment tools for
primary schools is likely to draw on the work of the National Council for
Curriculum and Assessment, which is currently looking at assessment in primary
schools.
11. Calculators in Mathematics Study
Client: Department of Education and Science
The Educational Research Centre is involved in a joint study with the
Education Department, St Patrick’s College, and the School of Education, Trinity
College, to examine the effects of introducing calculators into the revised Junior
Cycle Mathematics syllabus, which came into effect in the 2000/01 school year.
Students sitting the Junior Certificate Examination in mathematics in 2003 formed
the first cohort that was permitted to use calculators in the examination and that
had been taught how to use them as part of their course. The aims of the current study are: (i)
to identify aspects of the mathematics achievement of third-year (post-primary)
students that benefit/do not benefit from calculator usage during teaching and
testing; (ii) to examine the attitudes of teachers and students towards calculator
usage in Mathematics classes and in Mathematics tests; (iii) to examine relationships
between levels of calculator usage in Mathematics classes, availability of calculators
in the Junior Certificate Mathematics Examination, and the performance of students
on the examination.
The project is divided into two phases. The first, which was implemented in
November 2001, entailed gathering baseline data on a cohort of third-year students
who studied under the old Junior Certificate Mathematics syllabus (before
calculators were formally introduced). A report on the first phase was published in
October 2003 and is available to download at
www.erc.ie/erc_reports_and_publications/calculator_report_summary.pdf. The
second phase, which will be conducted in November 2004, involves examining the
mathematics achievement of a cohort of third-year students who had been instructed
in the use of calculators and who expect to be able to use a calculator in the
Junior Certificate Examination. The study
looks at performance on three tests: a test of mental operations during which pupils
do not have access to calculators; a test involving items for which a calculator may
or may not be helpful; and a test consisting of items for which access to a calculator
is deemed not only beneficial, but necessary.
Recent work has focused on conducting new analyses of the 2001 data, and
preparing for the second phase of the study. The new analyses are based on links
between the scores of students in the first phase of the study and their Junior
Certificate Mathematics grades. Students taking the Higher level paper outperformed
their counterparts taking the Ordinary level paper on the three calculator tests, while
those taking Ordinary level outperformed their counterparts taking Foundation level.
While students taking the Higher level Mathematics paper and completing the
calculator-optional test without the aid of a calculator achieved a significantly higher
mean score than Ordinary level students taking the same test with calculator access,
the difference was small, suggesting that calculator access may enable
some lower-achieving students to attempt more complex mathematical problems.
Preparation for fieldwork in November 2004 has involved reviewing the calculator
tests and questionnaires used in the November 2001 study, and making appropriate
revisions. In particular, there is an interest in 2004 in ascertaining the types of
instruction in the use of the calculator that teachers provide to Junior Certificate
students.
12. A Review of Procedures for the Selection of Schools for the Receipt of
Targeted Support to Deal with Educational Disadvantage
Client: Educational Disadvantage Committee
In October 2003, the Educational Disadvantage Committee asked the Centre
to undertake a review of current provision for educational disadvantage. The need
for a review arose out of a series of concerns that had been expressed about the
selection of schools to participate in the various schemes of the Department of
Education and Science. An oral report of the review was presented to the Committee
on January 20, 2004.
One of the Committee’s concerns related to discrepancies between the lists
of schools included in the various initiatives. For example, the Committee had
identified schools that, according to the survey conducted by the Centre in 2000 for
the Giving Children an Even Break (GCEB) initiative, had high concentrations of
disadvantage but were not involved in the Designated Areas Scheme (DAS). The
Committee asked the Centre to use the findings of the 2000 survey (for primary
level), together with further work on the post-primary pupil database that the Centre
had carried out in 2002, to examine the number and scale of such anomalies.
The Committee also asked the Centre to try to establish the extent to which
the introduction of GCEB had resulted in a more equitable division of resources for
disadvantage between schools in different types of location. Previous studies had
shown that schools in cities had much higher representation in schemes than schools
in other locations (rural areas and large, medium and small towns). Almost all
schools that provided data for the survey in 2000 received at least a financial
allocation under GCEB. As a result, it is likely that the representation of schools in
cities, towns, and rural areas is similar to the distribution of schools in the system as
a whole. However, the aspect of GCEB that appears to be most valued by schools is
the allocation of additional staff, and this was provided only in schools with the
heaviest concentrations of disadvantage. Therefore, it was decided that the
examination of whether the GCEB had resulted in any improvement in the
representation of schools in some locations should focus on schools that were
allocated posts under GCEB as well as schools participating in previous schemes. In
its presentation to the Committee, the Centre proposed further research on
differences in the nature, prevalence, and concentration of disadvantage in
cities, other urban areas, and rural locations.
The Committee had a particular concern that most previous attempts to target
resources had dichotomised schools (schools are designated or not; schools are
eligible for extra staffing under GCEB or not). Part of the review involves an
investigation of the appropriateness of dividing schools into a small number of bands
on the basis of their levels of concentration of disadvantage. This investigation is
based on analyses of data from the GCEB survey and from the post-primary pupil
database. It also involves the use of existing research sources such as PISA and
national assessments of English and Mathematics to examine the social context
effect. A social context effect is thought to exist if there is evidence that pupils’
achievement is related not just to their own socioeconomic background but to the
socioeconomic mix of the school that they attend. For the purpose of the review,
social context was operationalised as the percentage of pupils in a school whose
families possessed a medical card. Each analysis describes the relationship between
individual pupil achievement and the percentage of medical card holders in a class
or school, over and above the relationship between achievement and medical card
possession at the individual level.
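The contextual analysis described above can be illustrated as a regression of individual achievement on both an individual medical-card indicator and the class- or school-level percentage of card holders; a non-zero coefficient on the school-level percentage, over and above the individual term, is evidence of a social context effect. The following is a minimal sketch with synthetic, deterministic data (all values are illustrative, not drawn from the review, and a full multilevel model would additionally allow school-level random effects):

```python
import numpy as np

# Illustrative pupils in three schools with 20%, 60%, and 100%
# medical-card coverage respectively.
cards = [1, 0, 0, 0, 0,   # school A: 20% coverage
         1, 1, 1, 0, 0,   # school B: 60% coverage
         1, 1, 1, 1, 1]   # school C: 100% coverage
pct = [20] * 5 + [60] * 5 + [100] * 5

# Generate achievement with an individual effect (-5 per card held) and a
# contextual effect (-0.2 per percentage point of school-level coverage).
score = [50 - 5 * c - 0.2 * p for c, p in zip(cards, pct)]

# Fit score ~ 1 + card + pct by least squares.
X = np.column_stack([np.ones(len(cards)), cards, pct])
beta, *_ = np.linalg.lstsq(X, np.array(score), rcond=None)
intercept, b_card, b_context = beta

print(round(b_card, 6))     # individual effect: -5.0
print(round(b_context, 6))  # contextual (social context) effect: -0.2
```

Because the data are generated without noise, the regression recovers both coefficients exactly; with real assessment data, the interest lies in whether the contextual coefficient remains non-zero once the individual-level term is included.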
13. Review of Existing Programmes Aimed at Addressing Disadvantage in
Ireland and of Recent International Literature on Effective Responses
to the Problems Associated with Disadvantage
Client: Educational Disadvantage Committee
A review, requested by the Educational Disadvantage Committee, of selected
programmes in the formal school sector which are aimed at addressing the problems
of disadvantage at preschool, primary, and post-primary levels in Ireland is
underway. The review is focused primarily on the extent to which each of the
following programmes has been successful in meeting its original aims and
objectives: the Rutland Street Project, Early Start, and Preschools for Travellers
(preschool level); Breaking the Cycle, the Support Teacher Project, and Giving
Children an Even Break (primary level); the Disadvantaged Areas Scheme, the
Home-School-Community Liaison Scheme, and the Schools Completion Programme
(both primary and post-primary levels). With the exception of Giving Children an
Even Break, all of the programmes have been formally evaluated, and the
review is based almost exclusively on the results of those evaluations. As the ERC
had a significant involvement with Giving Children an Even Break (from designing
and administering the survey to advising the Department of Education and Science
on appeals from schools that were dissatisfied with their eventual allocation under
the scheme), a commentary on the implementation of that scheme will be provided.
In addition to a description of the survey methodology and administration, the
commentary will include a breakdown of the allocation of resources (personnel and
funding) to urban and rural schools, and will attempt to assess whether the features
of the programme as originally outlined were implemented.
The Centre is also preparing a review of the international literature on
effective strategies for addressing disadvantage, and in the process, is updating a
similar review contained in the publication Educational Disadvantage in Ireland
(Kellaghan, Weir, Ó hUallacháin, & Morgan, 1995). The exercise involves
considering the recent evidence regarding the importance of each of the following
factors: curriculum adaptation; reduced class size; preschool provision; parent
involvement; school development/planning; and schools’ links with the community.
This list will be augmented by additional factors which have emerged as important
from the more recent literature (for example, the necessity to raise expectations
among staff, and to provide for their professional development).
The overall conclusions arising from the review will inform the Committee
in advising the Minister on effective approaches to providing future supports aimed
at addressing educational disadvantage.
Reference
Kellaghan, T., Weir, S., Ó hUallacháin, S., & Morgan, M. (1995). Educational
disadvantage in Ireland. Dublin: Department of Education/Combat Poverty
Agency/Educational Research Centre.
14. A Study of Reading in Schools Designated as Disadvantaged
Client: Department of Education and Science
A number of studies conducted at the Centre have pointed to low levels of
reading achievement among pupils in schools designated as disadvantaged. For
example, in the standardization of the Drumcondra Primary Reading Test in first
class in 1995, it was observed that pupils in designated schools achieved a mean
score that was one-half of a standard deviation lower than that of pupils in non-
designated schools. In an evaluation of the Breaking the Cycle initiative in urban
schools in 2003, 38% of pupils in sixth class achieved scores on the same test that
were at or below the (national) 10th percentile. Several studies, including the Study
of Remedial Education in Primary Schools (1997), suggest that efforts to raise
achievement among lower achievers in designated schools have met with only
limited success.
Following the establishment of targets for literacy in schools designated as
disadvantaged, arising from the recent review of the National Anti-Poverty
Strategy (NAPS), the Department of Education and Science asked the Centre to conduct a
study of reading in designated schools in 2003. While the overall purpose of the
study is to establish baseline data, and to monitor progress towards the NAPS targets
over a period of at least three years, it is also planned to examine associations
between reading achievement and a range of school-, teacher- and student-level
variables.
An advisory committee, consisting of representatives of the Department of
Education and Science and the education partners, was appointed in November 2002
to assist the Centre in preparing for the study and interpreting the outcomes.
In 2003, baseline data were gathered on reading achievement in first, third,
and sixth classes in a representative sample of designated disadvantaged schools.
Over 2,000 pupils were tested at each grade level. School, class teacher, learning
support teacher, parent and pupil questionnaires were also administered to obtain
contextual data to be used in interpreting achievement outcomes. A pupil rating
form, addressing such matters as oral language proficiency, engagement in learning
support classes, and teacher expectations, was also completed in respect of each
pupil by his/her teacher.
Focus group interviews were conducted in two areas: a rural town and a
Dublin suburb. In each area, group interviews were conducted with principals from a
number of designated schools, teachers from a number of designated schools, and
parents of pupils in one designated school.
A report on the first phase of the study is expected to be completed in 2004.
In addition to reporting on reading levels and providing a multilevel model of
reading achievement, it is anticipated that the report will address ways in which
schools, parents, teachers, and pupils could work towards improving reading
standards.
An existing standardized test (the Drumcondra Sentence Reading Test) was
used to assess pupils’ reading achievement in 2003, and will be used in follow-up
testing. However, it is anticipated that the follow-up survey will include an
additional measure of reading achievement which will focus on the abilities of pupils
in sixth class to complete everyday reading tasks.
15. The Impact of Examination Components on Total Score in the Leaving
Certificate Examination
In this study, an analysis is being carried out of how the various components
in a number of Leaving Certificate Examination (LCE) subjects contribute to
students’ final grades. Data are available for individual students for each of the
separate components of the LCE 2001 in the following subjects (Higher and
Ordinary level): Agricultural Science, Art, Construction Studies, French, Irish, and
Music.
Examinations in these subjects involve a number of components. For
example, all students taking the Construction Studies examination sit a three-hour
written theory paper (in June), as well as completing two practical components (a
four-hour ‘skills test’ examination and a project) during the school year. There are
four components in the Art examination, one written, and three practical. All
students take the written paper (History & Appreciation of Art) and the Life
Sketching practical component. For the other two components, students choose
between the Still Life and Imaginative Composition options and between the Design
and Craftwork options.
Not only do subjects vary in the number and type of examination components;
the values given to components within a subject also differ depending on whether the
examination is taken at Higher or Ordinary level. For example, the theory paper in
Construction Studies is worth 50% of the available marks at Higher level and 40% at
Ordinary level.
The study is examining issues such as differences in component facility (a
particular concern where students can choose between different component options),
differences between the intended and achieved weights of components, and how
inter-component correlations influence the spread of total marks when the
component scores are summed.
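One common way to contrast intended with achieved weights (an illustrative convention, not necessarily the definition adopted in the study) is to note that, when component scores are summed, the achieved weight of a component can be taken as cov(component, total) / var(total), which depends on the components' standard deviations and inter-correlations rather than on their maximum marks. A sketch with made-up marks for a hypothetical two-component subject:

```python
import numpy as np

# Illustrative component scores for six candidates (values are invented);
# both components carry the same intended weight (marked out of 100).
theory = np.array([40.0, 55.0, 62.0, 48.0, 70.0, 35.0])
project = np.array([30.0, 32.0, 45.0, 28.0, 50.0, 25.0])

total = theory + project

def effective_weight(component, total):
    """Share of total-score variance attributable to a component:
    cov(component, total) / var(total)."""
    cov = np.cov(component, total, ddof=1)[0, 1]
    return cov / np.var(total, ddof=1)

w_theory = effective_weight(theory, total)
w_project = effective_weight(project, total)

# The effective weights of summed components always add to 1, so a more
# variable or more highly correlated component crowds out the others even
# when the intended (maximum-mark) weights are equal.
print(round(w_theory + w_project, 6))  # 1.0
```

Under this convention, a component with a small standard deviation contributes little to the spread of total marks regardless of its intended weight, which is why inter-component correlations and component facility matter for the analysis described above.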
16. Operation of the Bonus Mark System for Answering Through Irish in
the Leaving Certificate Examination
All Junior and Leaving Certificate Examination candidates have the option of
answering either in English or Irish. Where a student chooses to answer in Irish,
bonus marks, in addition to the marks gained in a subject, may be awarded. The
study is examining the operation of the bonus mark system for answering through
Irish in the 2002 Leaving Certificate Examination in Economics, French, History,
and Mathematics.
EDUCATIONAL RESEARCH CENTRE
PUBLICATIONS, 2000 – 2004
2000

Articles

Cosgrove, J. (2000). El sistema nacional de evaluación de resultados educativos:
Irlanda. Revista de Educación, 321, 23-34. Cosgrove, J., & Morgan, M. (2000). Students’ TV viewing, computer game playing,
and attitudes to reading. Irish Journal of Education, 31, 50-62. Kellaghan, T., & Madaus, G. F. (2000). Outcome evaluation. In D. F. Stufflebeam,
G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (2nd ed.). Boston: Kluwer Academic. Pp. 97-112.
Madaus, G. F., & Kellaghan, T. (2000). Models, metaphors and definitions of
evaluation. In D. F. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (2nd ed). Boston: Kluwer Academic. Pp. 19-31.
O'Leary, M., Kellaghan, T., Madaus, G. F., & Beaton, A. E. (2000). Consistency of
findings across international surveys of mathematics and science achievement: A comparison of IAEP2 and TIMSS. Education Policy Analysis Archives, 8, no 43, 1-14. http://epaa.asu.edu/epaa/v8n43.html
Shiel, G. (2000). Assessment of integrated communications. Paris: OECD/INES
Network A. Shiel, G. (2000). Assessing children's oral language. In G. Shiel, U. Ní Dhálaigh, &
E. Kennedy (Eds.), Language and literacy for the new millennium. Dublin: Reading Association of Ireland. Pp. 243-261.
Shiel, G., & Murphy, R. (2000). Development of coherence and structure in
children’s writing. In I. Austad & E.T. Lyssand (Eds.), Literacy: Challenges for the new millennium. Stavanger: Centre for Reading Research, Stavanger University College. Pp. 89-98.
Sofroniou N., Shiel G., & Cosgrove, J. (2000). Explaining performance on the
Junior Certificate Examination and the OECD/PISA study: A multilevel approach. Irish Journal of Education, 31, 25-49.
Books/Monographs/Reports

Cosgrove, J., Kellaghan, T., Forde, P., & Morgan, M. (2000). The 1998 National
Assessment of English Reading with comparative data from the 1993 National Assessment. Dublin: Educational Research Centre.
Eivers, E., Ryan, E., & Brinkley A. (2000). Characteristics of early school leavers:
Results of research strand of the 8- to 15-year old early school leavers initiative. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/ESLIcomplete00.pdf
Eivers, E., & Ryan, E. (2000). A case study analysis of service provision for “at
risk” children and young people. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/ESLIcasestudy00.pdf
Flanagan, R., Morgan, M., & Kellaghan, T. (2000). A study of non-completion in Institute
of Technology Courses. Report to Council of Directors of Institutes of Technology. www.erc.ie/erc_reports_and_publications/Non-completionIT00.pdf
Shiel, G., & Murphy, R. (2000). Drumcondra English Profiles. A framework for
assessing oral language, reading, and writing in primary schools. Dublin: Educational Research Centre.
Shiel, G., Kennedy, E., & Ní Dhálaigh, U. (Eds.). (2000). Language and literacy for
the new millennium. Dublin: Reading Association of Ireland. Stufflebeam, D. F., Madaus, G. F., & Kellaghan, T. (Eds.). (2000). Evaluation
models: Viewpoints on educational and human services evaluation (2nd ed.). Boston: Kluwer Academic.
Weir, S., & Ryan, C. (2000). Interim report on the evaluation of the Breaking the
Cycle scheme in rural schools. Report to the Department of Education and Science. Dublin: Educational Research Centre.
Weir, S., & Ryan, C. (2000). Interim report on the evaluation of the Breaking the
Cycle scheme in urban schools. Report to the Department of Education and Science. Dublin: Educational Research Centre.
2001

Articles

Archer, P. (2001). ‘A response to Brahm Norwich’. Open dialogue on inclusion:
Evidence or value-based policy and practice? Psychology of Education Review, 25(1), 12-14.
Archer, P. (2001). Public spending on education, inequality and poverty. In S.
Cantillon, C. Corrigan, P. Kirby, & J. O’Flynn (Eds.), Rich and poor: Perspectives on tackling inequality in Ireland. Dublin: Oak Tree Press & Combat Poverty Agency. Pp. 197-234.
Eivers, E. (2001). Integrated service provision for at risk children and their families.
Irish Journal of Education, 32, 44-62. Keane, M., Hackett, D., & Davenport, J. (2001). Similarity processing depends on
the similarities present. In J. D. Moore & K. Stenning (Eds.), Proceedings of Twenty-Third Annual Conference of the Cognitive Science Society. Mahwah N.J.: Erlbaum. Pp. 477-482.
Kellaghan, T. (2001). Family and schooling. In N. J. Smelser & P. B. Baltes (Eds.),
International encyclopedia of the social and behavioral sciences. Oxford: Pergamon. Pp. 5303-5307.
Kellaghan, T. (2001). O uso da avaliação na reforma educacional. Ensaio: Avaliação
e Políticas Públicas em Educação. 9, n.32, 259-278. Kellaghan, T. (2001). Reading literacy standards in Ireland. Oideas, 49, 7-20. Kellaghan, T. (2001). Reading literacy standards in Ireland. In G. Shiel & U. Ní
Dhálaigh (Eds.), Reading matters: A fresh start. Dublin: Reading Association of Ireland. Pp. 3-19.
Kellaghan, T. (2001). Towards a definition of educational disadvantage. Irish
Journal of Education, 32, 3-22. Kellaghan, T., & Greaney, V. (2001). The globalisation of assessment in the 20th
century. Assessment in Education, 8, 87-102. Shiel, G. (2001). Reforming reading instruction in Ireland and England. Reading
Teacher, 55, 372-374. Sofroniou, N. (2001). Review of Statistical strategies for small sample research.
(R.H. Hoyle, Ed.). Statistical Methods in Medical Research, 10, 239-249.
Weir, S. (2001). The reading achievements of primary school pupils from disadvantaged backgrounds. Irish Journal of Education, 32, 23-43.
Weir, S., & Milis, L. (2001). The relationship between the achievements of 6th class
pupils from disadvantaged backgrounds and their attitudes to school. Irish Journal of Education, 32, 63-83.
Books/Monographs/Reports

Kellaghan, T., & Greaney, V. (2001). Using assessment to improve the quality of
education. Paris: International Institute for Educational Planning. Morgan, M., Flanagan, R., & Kellaghan, T. (2001). A study of non-completion in
undergraduate university courses. Dublin: Higher Education Authority. Shiel, G., Cosgrove, J., Sofroniou, N., & Kelly, A. (2001). Ready for life? The
literacy achievements of Irish 15-year olds with comparative international data. Dublin: Educational Research Centre.
Shiel, G., Cosgrove, J., Sofroniou, N., & Kelly, A. (2001). Ready for life? The
literacy achievements of Irish 15-year olds with comparative international data. Summary report. Dublin: Educational Research Centre.
Shiel, G., & Kelly, D. (2001). The 1999 National Assessment of Mathematical
Achievement. Dublin: Educational Research Centre. Shiel, G., & Kelly, D. (2001). The 1999 National Assessment of Mathematical
Achievement. Summary report. Dublin: Educational Research Centre. www.erc.ie/erc_reports_and_publications/NAMAsumm99.pdf Shiel, G., & Ní Dhálaigh, U. (Eds.). (2001). Reading matters: A fresh start. Dublin:
Reading Association of Ireland.
2002

Articles

Archer, P., & McCormack, T. (2002). Lay trusteeship and a vision of the future of
Irish Catholic education: A response to David Tuohy, S.J. Studies, 91, 55-61. Cosgrove, J., Shiel, G., & Kennedy, D. (2002). The performance of Irish students in
scientific literacy in the Programme for International Student Assessment (PISA). Irish Journal of Education, 33, 53-70.
Greaney, V., & Kellaghan, T. (2002). International studies of achievement. In J. W.
Guthrie (Ed). Encyclopedia of education (2nd ed.). New York: Macmillan. Kellaghan, T. (2002). Approaches to problems of educational disadvantage. In
Primary Education: Ending disadvantage. Proceedings of the National Forum. Dublin: St Patrick’s College. Pp. 17-30.
Kellaghan, T., & Madaus, G. F. (2002). Teachers’ sources and uses of assessment
information. In D.F. Robitaille & A. E. Beaton (Eds), Secondary analysis of the TIMSS data. Dordrecht: Kluwer Academic. Pp. 343-356.
Shiel, G. (2002). Kindergarten children’s involvement in early literacy activities.
Perspectives from Europe. Reading Teacher, 56, 282-284. Shiel, G. (2002). Literacy standards and factors affecting literacy: What national and
international assessments tell us. In G. Reid & J. Wearmouth (Eds.), Dyslexia and literacy: Theory and practice. Chichester: Wiley. Pp. 131-145.
Shiel, G. (2002). Reflexões sobre o desempenho da Irlanda e Portugal na avaliação
da literacia em leitura no OECD/PISA 2000 (Reflections on the performance of Ireland and Portugal in the OECD/PISA 2000 assessment of reading literacy). Revisita Portuguesa de Educação, 15(2), 61-81.
Shiel, G. (2002). The performance of Irish students in reading literacy in the
Programme for International Student Assessment (PISA). Irish Journal of Education, 33, 7-30.
Shiel, G., & Cosgrove, J. (2002). International assessments of reading literacy.
Reading Teacher, 55, 690-692. Sofroniou, N., & Hutcheson, G. D. (2002). Confidence intervals for the prediction of
logistic regression in the presence and absence of a variance covariance matrix. Understanding Statistics: Statistical Issues in Psychology, Education and the Social Sciences, 1(1), 3-18.
Sofroniou, N., Cosgrove, J., & Shiel, G. (2002). Using PISA variables to explain performance on Junior Certificate Examinations in mathematics and science. Irish Journal of Education, 33, 99-124.
Sofroniou, N., Shiel, G., & Cosgrove, J. (2002). PISA reading literacy in Ireland: An
expanded model exploring attributes of self-regulated learning. Irish Journal of Education, 33, 71-98.
Books/Monographs/Reports

Eivers, E., Flanagan, R., & Morgan, M. (2002). Non-completion in Institutes of
Technology: An investigation of preparation, attitudes and behaviours among first year students. Report to the Council of Directors of Institutes of Technology. www.erc.ie/erc_reports_and_publications/Non-completionIT02.pdf
Kellaghan, T., Morgan, M., Fitzpatrick, M., & Millar, D. (2002). An evaluation of
the sole use of short-answer tests in apprenticeship examinations. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/ Apprenticeshipexams02.pdf
Lewis, M., & Archer, P. (2002). Further evaluation of Early Start. Progress report.
Report to the Department of Education and Science. www.erc.ie/erc_reports _and_publications/ESprogress02.pdf
Millar, D., & Murphy, R. (2002). A preliminary report to the Task Force on the
Physical Sciences on the issue of the comparability of grades awarded in different subjects in the Leaving Certificate Examination.
Millar, D., & Kellaghan, T. (2002). Second report to the Task Force on the Physical
Sciences on the comparability of grades awarded in the Leaving Certificate Examination.
Weir, S., Milis, L., & Ryan, C. (2002). The Breaking the Cycle Scheme in urban
schools. Final evaluation report. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/BTCUfinal02.pdf
Weir, S., Milis, L., & Ryan, C. (2002). The Breaking the Cycle Scheme in rural
schools. Final evaluation report. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/BTCRfinal02.pdf
2003

Articles

Archer, P. (2003). The ebb and flow of consensus seeking: 1988 to 2003. In
Reflections in a time of change. A tribute to Sr Teresa McCormack. Dublin: CORI Education Commission. Pp.21-43.
Cosgrove, J., & Forde, P. (2003). National assessments of English reading in
Ireland. In G. Shiel and U. Ní Dhálaigh (Eds.), Other ways of seeing: Diversity in language and literacy. Volume 1. Proceedings of the 12th European Conference on Reading. Dublin: Reading Association of Ireland. Pp. 46-56.
Kellaghan, T. (2003). Local, national, and international levels of system evaluation.
In T. Kellaghan & D. L. Stufflebeam (Eds.), International handbook of educational evaluation. Dordrecht: Kluwer Academic. Pp. 873-882.
Kellaghan, T., & Madaus, G. F. (2003). External (public) examinations. In T.
Kellaghan & D. L. Stufflebeam (Eds.), International handbook of educational evaluation. Dordrecht: Kluwer Academic. Pp. 577-600.
Omolewa, M., & Kellaghan, T. (2003). Educational evaluation in Africa. In T.
Kellaghan & D. L. Stufflebeam (Eds.), International handbook of educational evaluation. Dordrecht: Kluwer Academic. Pp. 465-481.
Shiel, G. (2003). Raising standards in reading and writing: Insights from England’s
National Reading Strategy. Reading Teacher, 56(7), 692-695. Shiel, G. (2003). Relations entre l’apprentissage autorégulé et les compétences en
lecture dans PISA 2000 (Associations between attributes of self-regulated learning and reading literacy in PISA 2000). Caractères (numéro special), Decémbre, 55-61.
Sofroniou, N. (2003). Review of Statistical concepts: A second course for education
and the behavioural sciences (R.G. Lomax). Journal of the Royal Statistical Society: Series D (The Statistician), 52, 415.
Books/Monographs/Reports

Archer, P., & Shortt, F. (2003). Review of the Home-School-Community Liaison
Scheme. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/HSCLreview03.pdf Close, S., Oldham, E., Hackett, D., Dooley, T., Shiel, G., & O'Leary, M. (2003). A
study of the effects of calculator use in schools and in the Certificate Examinations - Summary Report on Phase I. Report to the Department of
Education and Science. Dublin: St Patrick's College. www.erc.ie/erc_reports_ and_ publications/Calculator_Report_Summary.pdf
Cosgrove, J., Sofroniou, N., Kelly, A., & Shiel, G. (2003). A teachers’ guide to
reading literacy in Ireland: Outcomes of the PISA 2000 student assessment. Dublin: Educational Research Centre. www.erc.ie/pisa/ReadLitRptNov03.pdf
Flanagan, R., & Morgan, M. (2003). Evaluation of initiatives targeting retention in
universities: A preliminary report of projects funded by the Higher Education Authority. Report to the Higher Education Authority. Dublin: Educational Research Centre.
Kellaghan, T., & Millar, D. (2003). Grading in the Leaving Certificate Examination:
A discussion paper. Dublin: Educational Research Centre. A PDF version of an additional 128 supplementary tables (S1 to S128), which contain data summarized in some of the 69 tables in the report, is available to download at: www.erc.ie/erc_reports_and_publications/LC_Grade_Tables.pdf
Kellaghan, T., & Stufflebeam, D. L., with the assistance of Wingate, L. A. (2003).
International handbook of educational evaluation. Dordrecht: Kluwer Academic.
Lewis, M., & Archer, P. (2003). Early Start evaluation. Report on observation
visits to schools. Report to the Department of Education and Science. www.erc.ie/erc_reports_and_publications/ESschoolvisits03.pdf
Shiel, G., & Ní Dálaigh, U. (Eds.). (2003). Other ways of seeing: Diversity in
language and literacy. Volume 1. Proceedings of the 12th European Conference on Reading. Dublin: Reading Association of Ireland.
Shiel, G., & Ní Dálaigh, U. (Eds.). (2003). Other ways of seeing: Diversity in
language and literacy. Volume 2. Proceedings of the 12th European Conference on Reading. Dublin: Reading Association of Ireland.
Syropoulos, A., Tsolomitis, A., & Sofroniou, N. (2003). Digital typography using
LaTeX. New York: Springer.
Weir, S. (2003). The evaluation of Breaking the Cycle: A follow-up of the
achievements of 6th class pupils in urban schools in 2003. Report to the Department of Education and Science.
www.erc.ie/erc_reports_and_publications/BTCfollowuprpt03.pdf
2004/in press/in preparation
Articles
James, D. J., Sofroniou, N., & Lawlor, M. (in press). Analysis of emotional response to being bullied. Irish Journal of Psychology.
James, D. J., Lawlor, M., & Sofroniou, N. (in press). Persistence of psychological
problems: A one year follow-up study. Irish Journal of Psychological Medicine.
Kellaghan, T. (in press). Donal F. Cregan. In A. Clarke, R. Fanning, E. M. Johnston-
Liik, J. Maguire, & M. Murphy (Eds.), Dictionary of Irish biography. Dublin: Royal Irish Academy.
Kellaghan, T. (in press). The use of assessment for quality assurance and system
improvement. In Proceedings of CIDREE Invitational Conference.
Kellaghan, T., & Greaney, V. (2004). Monitoring performance: Assessments and
examinations. In A. M. Verspoor (Ed.), The challenge of learning: Improving the quality of basic education in Sub-Saharan Africa. Paris: Association for the Development of Education in Africa.
Sofroniou, N. (in press). Review of Large-scale assessment programs for all
students: Validity, technical adequacy, and implementation (G. Tindall & T. H. Haladyna, Eds.). Journal of the Royal Statistical Society: Series A (Statistics in Society).
Sofroniou, N., & Kellaghan, T. (in press). The utility of TIMSS scales in predicting
students’ subsequent state examination performance. Journal of Educational Measurement.
Books/Monographs/Reports
Eivers, E., Shiel, G., & Shortt, F. (2004). Reading literacy in disadvantaged primary schools. Dublin: Educational Research Centre.
Kellaghan, T., & Greaney, V. (2004). Assessing student learning in Africa. Washington, DC: World Bank.
Kellaghan, T., McGee, P., Millar, D., & Perkins, R. (2004). Views of the Irish
public on education: 2004 survey. Dublin: Educational Research Centre. www.erc.ie/erc_reports_and_publications/views_of_Irish_public_on_education_2004_survey.pdf
Ó Siaghail, G., & Deiseach, C. (2004). Próifílí measúnachta don Ghaeilge sna scoileanna Gaeltachta agus scoileanna lán-Ghaeilge. Lámhleabhar (Assessment profiles for Irish in Gaeltacht schools and all-Irish schools. Handbook). Baile Átha Cliath: Foras Taighde ar Oideachas.
Weir, S. (2004). Analysis of school attendance data at primary and post-primary
levels for 2003/2004. Report to the National Educational Welfare Board. Dublin: Educational Research Centre. www.erc.ie/erc_reports_and_publications/NEWB_school_attendance04.pdf
Weir, S. (2004). A commentary on the implementation of Giving Children an Even
Break. Report to the Educational Disadvantage Committee. Dublin: Educational Research Centre.
Tests
Educational Research Centre. (2004). Drumcondra Spelling Test. (Levels 1-6, Forms
A and B). Dublin: Author. Educational Research Centre. (2004). Drumcondra Spelling Test Manual. Dublin:
Author.