Testing and Assessment in EAP: BALEAP Professional Issues Meeting, The University of Salford, 3rd February 2007



EAP Testing and Assessment BALEAP PIM 3 February 2007

The University of Salford 2

Contents

Introduction and Welcome ............................................................................. 3

Programme Outline: ....................................................................................... 4

Parallel Presentations Programme ................................................................ 5

Presentation Abstracts ................................................................................... 6

Information on speakers: ............................................................................. 14

Campus Map ..................................................................................

Notes Pages ................................................................................................ 17


Introduction and Welcome

I would like to welcome you all to the University of Salford and the February 2007 PIM on Testing and Assessment. I hope that you enjoy the day!

Coffee and tea will be available in the dining area of the Harold Riley suite during registration. During this time it would be very useful if you could sign up on the sheets for presentations, to give us some idea of numbers for each room. We have tried to group presentations together so that, in theory at least, those interested in similar topics can remain in one room for the whole hour's session. However, we have also allowed five minutes between presentations so you can move location if you wish. There are three rooms in use for today's presentations: the main lecture room of the Harold Riley suite, and two smaller rooms (prosaically designated Room 1 and Room 2) which are located downstairs to the left of the main entrance to University House.

We have a range of presentations and papers bringing new and interesting perspectives on EAP testing and assessment to the EAP community. I am particularly pleased to welcome Dr Barry O'Sullivan from Roehampton University to give our opening plenary talk. I am also delighted that the BALEAP 'Can Do' project team have agreed to provide an update on their work in the end-of-day plenary session. We also welcome Olly Twist of Garnet Education, who will be displaying a wealth of EAP titles for you to browse through in the breaks.

I would like to give my warmest thanks to the many people who have helped towards today's meeting. These include Professor Carole Roberts of the Higher Education Research Centre at Salford University, for lending her support to the meeting; Andy Seymour, the PIMs co-ordinator, whose guidance on many organisational matters has been invaluable; Benita Studman-Bedillo, who took care of all the BALEAP website posting; Esther O'Brien in the University conference office; the student helpers (Ice and Joyce); and Andy Gillett, for encouraging me to propose the PIM in the first place.

Please remember to complete the evaluation sheet before you leave!

Siân Etherington
School of Languages, University of Salford


Programme Outline

9.30 Registration and coffee

10.00 - 10.15

Welcome Siân Etherington, University of Salford Professor Carole Roberts, Director Higher Education Research Centre, University of Salford Harold Riley Suite

10.15 - 11.15

Plenary: Validation of in-house testing Barry O'Sullivan, Centre for Language Assessment Research, University of Roehampton Harold Riley Suite

11.15 - 11.45

Coffee Break

11.45 - 12.15

Parallel presentation session (1) See programme for options Harold Riley Suite / Room 1

12.20 - 12.50

Parallel presentation session (2) See programme for options Harold Riley Suite / Room 1 / Room 2

12.50 - 2.00 Lunch

2.00 - 2.30

Parallel presentation session (3) See programme for options Harold Riley Suite / Room 1 / Room 2

2.35 - 3.05

Parallel presentation session (4) See programme for options Harold Riley Suite / Room 1 / Room 2

3.10 - 3.30 Coffee Break

3.30 - 4.00 Improving Information for our Report Users - Can Do statements for Pre-sessional Courses BALEAP 'Can Do statements' project team Harold Riley Suite

4.00 - 4.15 Conclusions and Farewells Siân Etherington, Andy Seymour Harold Riley Suite


Parallel Presentations Programme

We have attempted to arrange papers in streams, so that presentations with a similar theme follow each other in the same room. This should reduce movement between rooms for each paper. However, we have also allowed a five-minute gap for room changes, so you are not committed to staying in the same place throughout each session.

11.45 - 12.15
Harold Riley Suite: The training of markers and standardisation of marking (John Slaght, University of Reading)
Room 1: Introducing the Durham Academic Language Test (DALT) (Philip Nathan, University of Durham)

12.20 - 12.50
Harold Riley Suite: Diagnostic testing of writing for academic purposes: seven common dilemmas and seven possible solutions (Gerard Sharpling, University of Birmingham)
Room 1: Pre-sessional study skills and English language use assessment (USELT) (Alison Chisholm and Rachel Cole, University of Sussex)
Room 2: Vocabulary production under timed and un-timed conditions (Paul Booth, Kingston University)

12.50 - 2.00 Lunch

2.00 - 2.30
Harold Riley Suite: Assessing students' use of a VLE group discussion: problems and solutions (Andy Gillett, University of Hertfordshire)
Room 1: Pre-masters programmes, EAP and a different approach to testing listening skills (Annabel Marsh, University of Glamorgan)
Room 2: Pre-sessional testing: the mini-research project (Judith Roads, University of Middlesex)

2.35 - 3.05
Harold Riley Suite: Online testing using a Virtual Learning Environment (Hilary Arnold and James Arnold, University of Surrey)
Room 1: Concurrent formative and summative EAP assessment (Bruce Howell, University of Reading)
Room 2: Pre-sessional testing: writing about cultural differences (Caroline Corney, University of Portsmouth)

3.10 - 3.30 Break

3.30 - 4.00
Harold Riley Suite: Improving Information for our Report Users - Can Do statements for Pre-sessional Courses (BALEAP 'Can Do statements' project team: Diane Schmitt, John Slaght, Sarah Brewer, Moira Calderwood and Carmel Roche)

4.00 - 4.15
Harold Riley Suite: Conclusions and Farewells


Presentation Abstracts

11.45 - 12.15 Parallel Presentations (1)

Title: The training of markers and standardisation of marking
Speaker(s): John Slaght
Speaking in: Harold Riley Suite
Affiliation: Centre for Applied Language Studies, University of Reading

The training of markers and the standardisation of marking can, and should, effectively serve a dual purpose. Crucially, the combined process should ensure absolute fairness for all candidates in terms of the accuracy and consistency of marking. The process should also help to inform any kind of assessment carried out during relevant preparation, in-sessional or pre-sessional courses. Thus all interested stakeholders should be confident that assessment results have been arrived at in a fully accurate and professional way, whilst the examiner/teachers should have added to their expertise through the experience of a particular standardisation and marking episode.

This presentation will describe the practical features of marker training and the standardisation process during the administration of a topic-based pre-sessional exit test of reading, listening and writing. The importance of double marking and moderation will be addressed with regard to writing. The standardisation of marking listening and reading will also be considered, with particular reference to the marking key, the need for markers to engage fully with the test specifications and content of these two sections of the test, and how this might be achieved.

Test marking should not be treated as a chore to be endured by tired teachers at the end of an exhausting course. Rather, rigorous marker training and standardisation, carried out appropriately, should ensure that both teachers and students benefit fully from the experience.

11.45 - 12.15 Parallel Presentations (1)

Title: Introducing the Durham Academic Language Test (DALT)
Speaker(s): Philip Nathan
Speaking in: Room 1
Affiliation: University of Durham

In 2005/2006, the Durham University Executive Committee decided that the English language of incoming international students should be tested. The Language Centre was charged with designing and delivering an appropriate test, and as a consequence the Durham Academic Language Test (DALT) was developed. The test evaluated student language on the basis of reading, writing and listening skills, but was novel in that it constituted a thematic test in which source information from listening and reading tasks and texts had to be integrated in a final writing task. The test was also novel in that it was substantially video-based. In this session, the practicalities of implementing the test will be discussed, and the success of the test will be evaluated in terms of key testing criteria, as well as the relevance of the test to the subsequent placement of test-taking students on in-sessional courses.


12.20 - 12.50 Parallel Presentations (2)

Title: Diagnostic testing of writing for academic purposes: seven common dilemmas and seven possible solutions
Speaker(s): Gerard Sharpling
Speaking in: Harold Riley Suite
Affiliation: University of Birmingham

The debate regarding whether EAP instruction should focus on the 'general' or the 'specific' may equally be applied to the area of language testing. This paper considers the nature and outcomes of the writing component of WELT (the Warwick English Language Test). This test may be seen as a general, rather than specific, test of writing skills. This paper argues, however, that the generality of such a test actually assists, rather than hinders, the formulation of a suitable prognosis of future student performance within academic departments. The paper will refer to discourse patterns across student WELT papers and will show the relevance of these patterns to future academic writing tasks across a range of departments.

12.20 - 12.50 Parallel Presentations (2)

Title: Pre-sessional Study Skills and English Language Use Assessment: USELT (University of Sussex English Language Test)
Speaker(s): Alison Chisholm and Rachel Cole
Speaking in: Room 1
Affiliation: University of Sussex

One of the most frequently used assessments of language ability for UK university entrance is IELTS. While IELTS-style tests serve as a useful general indicator of language level, they do not adequately assess study skills, from the very practical, such as accurate referencing, to the more complex, such as using evidence to substantiate argument. Moreover, it is impossible to ascertain from IELTS results to what extent a student has met particular marking criteria. For example, a student with an IELTS 7 in writing may have performed well in terms of task achievement, but at the same time their language manipulation and accuracy may be somewhat weaker. We devised a new-style assessment (USELT), to be taken by students exiting our September 2006 pre-sessional course. The marking scheme and criteria of USELT make it possible to identify whether students need further support in language and/or study skills. We would like to present details of the background to, and the format of, USELT, and examine some of the findings and issues arising from its implementation.


12.20 - 12.50 Parallel Presentations (2)

Title: Vocabulary production under timed and un-timed conditions
Speaker(s): Paul Booth
Speaking in: Room 2
Affiliation: Kingston University

Language testing can take many forms. One way of estimating language proficiency is to measure the frequency level of words which learners can use productively. Lexical frequency profiles are one way of analysing the percentage of words used at various frequency levels. However, test conditions can have a considerable effect on the type of words used. One variable is whether learners are tested under timed or un-timed conditions. Under timed conditions, learners may not have access to reference materials, so they may be forced to use lexis which is fluent and under automatic processing. Under un-timed conditions, learners will normally have access to written sources and will also be able to access lexis which is less fluent and under controlled processing.

I will present the results of a study of 20 English-as-a-second-language learners studying at Kingston University. The results are taken from the learners' written texts produced under timed and un-timed conditions. The texts were analysed using the Web Vocabprofile (Cobb 2002), which categorises learners' productive vocabulary into various frequency bands: the first thousand most frequent words, the second thousand, and so on. The implications of these results will be discussed in the light of the test conditions, task type, and how together they affect vocabulary production.


2.00 - 2.30 Parallel Presentations (3)

Title: Assessing students' use of a VLE group discussion: problems and solutions
Speaker(s): Andy Gillett, Claire Weetman
Speaking in: Harold Riley Suite
Affiliation: School of Combined Studies, University of Hertfordshire

For several years now, we have been investigating our students' use of the group discussion facility of StudyNet, our in-house VLE. The conclusion we have come to is that, to be successful, use of the VLE needs to be clearly and well integrated into the programme. We have reported this at previous PIMs (see, for example, Gillett & Weetman, 2006). Good integration into the programme, though, means that use of the VLE needs to be integrated into the assessment scheme of the programme. This is problematic. In our talk, we would like to report on how we have tried to do this and some of the results we have obtained. There will be opportunities for discussion and, we hope, participants will be able to suggest improvements.

Gillett, A. J. & Weetman, C. (2006). Investigation of the use of a VLE group discussion facility by East Asian postgraduate students. The East Asian Learner, 2(2). Available at: http://www.brookes.ac.uk/schools/education/eal/

2.00 - 2.30 Parallel Presentations (3)

Title: Pre-Masters Programmes, EAP and a Different Approach to Testing Listening Skills
Speaker(s): Annabel Marsh
Speaking in: Room 1
Affiliation: University of Glamorgan

The aims of pre-Masters programmes for international students are clearly stated in prospectuses and on university websites. In terms of EAP, the aim is to prepare international students by helping them to develop the English language and study skills appropriate to Masters-level study within the context of the UK university environment. But these aims can cause dilemmas for course developers when it comes to assessing learning outcomes. On the one hand, because of language entry requirements set by receiving schools and departments, we need to be confident that our assessments produce accurate indicators of achieved levels of English. These levels are typically expressed using ratings that equate to IELTS or TOEFL scores. On the other hand, we want our assessments to be reliable indicators of learning outcomes beyond the language aims of the curriculum, i.e. study skills such as note-taking and seminar participation. So the quandary is this: how do we produce assessment instruments which (a) are reliable indicators of achieved levels of English, and (b) reflect the broader learning outcomes of our modules?

To answer this question, it is perhaps important to preface it with another. At this level of study, should we be more concerned with process or product?


Surely our primary concern should be with attending to process. We want to prepare our students for study at Masters level by helping them to engage in the process of learning. We do this by enabling and encouraging them to participate actively in their studies, engage in their learning, become independent learners, and be masterly in their approach. We want their experience of the pre-masters programme to be meaningful and relevant to the type of tasks that will later be required of them. Assessment tasks should reflect the same learning experience. That is, they should be meaningful, relevant and realistic. So, when it comes to assessing listening skills, are they? Or do we resort to product-orientated assessments, such as IELTS/TOEFL-type tasks, in order to make the job of gauging language levels easier and possibly more accurate? Experience and anecdotal evidence seem to point to teachers relying on the more product-orientated approach. In an effort to make listening assessments more meaningful, realistic and process-orientated, I have taken a different and perhaps unconventional approach.

2.00 - 2.30 Parallel Presentations (3)

Title: Pre-sessional Testing: The mini-research project

Speaker(s): Judith Roads
Speaking in: Room 2
Affiliation: University of Middlesex

This paper focuses on one element of our Pre-sessional course assessment: the mini-research project. This is a task type not frequently used on a short (one-month) intensive EAP course. At Middlesex, for several years we have used the final outcomes from this task, a written group report and a group oral presentation, both based on students' research. These results are fed into their final assessment profile. This integrated-skills component is modelled on tasks required of most students on our university degree courses and forms part of the Pre-sessional assessment ethos of measuring more than just discrete language skills. However, implementing the task can be problematic for students as well as for some new tutors: for example, time is too short; pre-undergraduates are often inexperienced in the concept of carrying out research and lack independent learning skills; and there can be insufficient use and evaluation of serious sources, even among pre-postgraduates. The presentation task has recently been redesigned so that students offer a reflection on the process of the task, whereas their report concentrates on the findings. We have experimented with a variety of research topics and input design. The paper looks at the strengths of and the difficulties in using this assessment format, gives examples of some of the tasks, reports on data collected during the winter 2006/7 course, and shares conclusions and ideas for the future in the light of our review of the project assessment.


2.35 - 3.05 Parallel Presentations (4)

Title: Online testing using a virtual learning environment (VLE)
Speaker(s): Hilary Arnold, James Arnold
Speaking in: Harold Riley Suite

Affiliation: Centre for Language Studies, University of Surrey

At the beginning of every academic year at the University of Surrey, the Centre for Language Studies tests approximately 2,000 undergraduate and postgraduate students. As a result of the test, some students are recommended to attend classes on the English Language Support Programme. Traditionally this has been a pen-and-paper test in reading, grammar and writing, marked manually by a group of ten EAP tutors. The process is time-consuming and not necessarily the most cost-effective use of tutors' time. In line with the University's e-learning policy, it was decided to introduce an online test option and to evaluate its effectiveness, particularly in terms of administrability. Consequently, the reading and grammar components of the test were adapted for use online using the University's VLE (WebCT Vista). In September 2005 the test was trialled on a group of 44 postgraduate students, and feedback was collected from students, tutors and departments. In 2006 some amendments were made to the grammar component and the test was taken online by a further group of 83 postgraduate students from the departments of Economics and Electronics. This presentation provides a report on progress to date. It discusses the rationale for testing online, the process of test development, and administrative factors. It considers both the opportunities and constraints of using a VLE as an assessment tool, and the implications for the future.

2.35 - 3.05 Parallel Presentations (4)

Title: Concurrent formative and summative EAP assessment
Speaker(s): Bruce Howell
Speaking in: Room 1
Affiliation: University of Reading

EAP teaching tends to feature a great deal of formative assessment, particularly in recursive process-writing cycles and project work, and the value of this formative feedback at key points over an extended period is well recognised. However, at the end of these processes there are 'products', such as essays or presentations. Should the feedback for the final product be summative? Or should we rely wholly on an independent test at the end of a course for our summative assessment? This paper argues that using extended project work for both formative and summative assessment simultaneously is (a) feasible, (b) pedagogically sound, and (c) helpful in giving a clear picture of a student's proficiency in EAP.


2.35 - 3.05 Parallel Presentations (4)

Title: Pre-sessional testing: writing about cultural differences
Speaker(s): Caroline Corney
Speaking in: Room 2
Affiliation: University of Portsmouth

It has been acknowledged that the process of following a Higher Education course of study tests students in a variety of ways, not all of which are strictly related to a narrowly defined curriculum. Ronald Barnett, Professor of Higher Education at the Institute of Education, University of London, for example, in a keynote address to the University of Portsmouth's 2005 Christmas Learning and Teaching Conference, referred to the way in which HE learners are challenged: "epistemologically, practically, ontologically". For students coming to another country to study, the intellectual, personal and cultural challenges are particularly severe. It could be argued that much EAP provision, and hence assessment, with its emphasis on subject-based writing tasks, ignores the massive personal and cultural changes with which students are forced to grapple, and which effectively constitute part of their own learning curriculum.

In the summer of 2006 the author coordinated a 4-week pre-sessional EAP course for international students at the University of Portsmouth. In order to encourage the development of skills without overburdening students with the acquisition of subject matter (teachers are not experts in the students' fields of study), the assessment was based on a comparison of the students' hometown and Portsmouth, either in general or focussed on a particular aspect chosen by the student. Both the written tasks and the presentations generated original and relevant points of comparison, with a developing use of the requisite academic conventions. The assessment design also resulted in no cases of plagiarism.


3.30 – 4.00 Plenary Session

Title: Improving information for our report users: Can Do statements for Pre-sessional courses
Speaker(s): Diane Schmitt, John Slaght, Moira Calderwood, Carmel Roche
Speaking in: Harold Riley Suite
Affiliation: BALEAP research project team (Glasgow, Manchester, Nottingham Trent and Reading Universities)

BALEAP has embarked on a project to draw up a set of Can Do Statements. The project is being undertaken by members from Glasgow, Manchester, Nottingham Trent and Reading Universities. In this session, a panel will give a brief overview of the project, report on preliminary findings from our literature review, and give a brief overview of the methodology we intend to use. This will be followed by an opportunity for the wider BALEAP membership to discuss the goals and desired outcomes of the project.


Information on speakers

Hilary Arnold is the IELTS co-ordinator and tutor at the University of Surrey.

James Arnold is an EAP tutor at the University of Surrey.

Paul Booth is senior lecturer in English Language at Kingston University. He has taught at the European Commission, Brussels, and the University of Nancy, France. His research interests are in individual differences and lexical profiles.

Moira Calderwood is Director of Studies in the EFL Unit at Glasgow University, where she has been responsible for developing tutor guidelines for testing and assessing pre-sessional students. She is an IELTS examiner trainer and is currently teaching on a module on assessment for the M.Ed in ELT.

Alison Chisholm is English Language and Study Skills Officer and Director of Pre-sessional English Language courses at the University of Sussex. She is also a trainer on the Trinity College Certificate in TESOL, and is currently registered for a part-time D.Phil in Education.

Rachel Cole has taught at the University of Sussex since 2001. She previously taught in the USA, Indonesia and Sweden. At Sussex she has taught on the following courses: Intensive English, Graduate English, Pre-sessional courses, English for Academic Purposes, Study Skills, Trinity College London Certificate in TESOL, Cambridge FCE and CAE preparation, and BA in ELT.

Caroline Corney has been a teacher of EAP and EFL for more than 15 years. For the last 3 years she has lectured at the University of Portsmouth. Previously she worked in Hong Kong, where she taught at the British Council and was a local tutor for the University of Sheffield M.Ed programme.

Andy Gillett is principal lecturer in the School of Combined Studies at the University of Hertfordshire. He has spent most of the last 25 years teaching English both in the UK and abroad, and for the last 15 years most of his work has involved English for Academic Purposes (EAP) in British higher education. He is now mainly involved in organising, planning and teaching EAP courses for international students taking a wide range of courses at the University of Hertfordshire. He is the current treasurer of BALEAP.

Bruce Howell works as a teacher on EAP courses in the Centre for Applied Language Studies (CALS) at the University of Reading. He is also the administrator of the Test of English for Educational Purposes (TEEP). His current interests are in test validation, particularly EAP testing and the interpretation of scores. His experience includes teaching EFL in Japan, Slovakia and Poland.


Annabel Marsh moved to Wales in 2003 and joined the International English Unit, on a part-time basis, at the University of Glamorgan. She is now Subject Leader for the unit and coordinates the pre-masters award programme for international students. Before moving to Wales, she spent 3 years at Abu Dhabi Women's College (HCT) in the UAE, where she set up and coordinated the self-access centre (ELC). The year 2000 marked the end of 11 years in Hong Kong, where she began her career in EFL. She has 4 children, 2 grandchildren and a Canadian accent!

Philip Nathan is English for Academic Purposes Coordinator at Durham University. He previously worked at Birmingham University on pre-sessional business programmes and was pre-sessional Director at the University of Surrey Roehampton. His research interests include characterising student written genres, developing the teaching of academic writing and developing effective approaches for teaching against plagiarism.

Barry O'Sullivan has a PhD in language testing and is particularly interested in issues related to performance testing, test validation, and test-data management and analysis. He has lectured for many years on various aspects of language testing, and is currently Director of the Centre for Language Assessment Research (CLARe) at Roehampton University, London. Barry's publications have appeared in a number of international journals and he has presented his work at international conferences around the world. His book 'Issues in Business English Testing: the BEC revision project' was published in 2006 by Cambridge University Press in the Studies in Language Testing series, and his next book is due to appear early in 2007. Barry is very active in language testing around the world and currently works with government ministries, universities and test developers in Europe, Asia, the Middle East and Central America. In addition to his work in language testing, Barry taught in Ireland, England, Peru and Japan before taking up his current post.

Judith Roads holds an M.Ed in ELT from the University of Sheffield. She has worked for the English Language and Learning Support service at Middlesex University for the last eight years, managing both the Pre-sessional and the year-long International Foundation Programme courses. Previously she was the English language professor for international students at the Royal College of Music, London. She has developed a special interest in the teaching of intonation and in English for Academic Music.

Carmel Roche is the Director of English Language Programmes at the University Language Centre, University of Manchester. Carmel worked in Mozambique, Malaysia, Oman, Zimbabwe and many other places around the world before settling in Manchester. Her current interests include assessment, teacher training and aspects of educational management (especially cultural webs and the management of change). In addition to the Can Do research project, Carmel is currently working with colleagues on the creation of a new reporting system using a banding system produced in-house.

Diane Schmitt is a Senior Lecturer at Nottingham Trent University. She is currently working on a PhD on the topic of writing from sources. Her other EAP interests include English language testing and vocabulary development.


Gerard Sharpling worked at the universities of Nantes and Birmingham before joining the University of Warwick in 1996. He lectures in language testing and assessment within CELTE, and has been the co-ordinator of the Warwick English Language Test since 2002. He also has responsibility for co-ordinating EAP classes and materials writing. He is particularly involved in disability issues, and is currently researching disability, teaching and learning jointly with colleagues in another department at the university.

John Slaght has worked at the University of Reading in a variety of capacities since 1988. He now works in the EAP department, where his responsibilities in testing include test administration and writing. He has been an item writer for Cambridge ESOL for a number of years and is a regional team leader on a marking panel.

Claire Weetman is a teacher of English with a range of experience in Poland, Turkey, Thailand, Australia and the UK, and holds a Masters degree in TESOL from Deakin University in Australia. She is primarily interested in student motivation, innovative teaching methods, the use of authentic texts and the creation of materials tailored to individual classes. She is currently an Associate Lecturer in the School of Combined Studies at the University of Hertfordshire.

Notes Pages