15 March 2005 Introducing Diagnostic Assessment at UNISA Carol Bohlmann Department of Mathematical Sciences


Page 1

15 March 2005

Introducing Diagnostic Assessment at UNISA

Carol Bohlmann

Department of Mathematical Sciences

Page 2

Reading Intervention Results – Phase I

Overall reading scores < 60% => students unlikely to pass the mathematics exam.

High reading scores do not guarantee mathematical success.

A low reading score is a barrier to effective mathematical performance.

 

Page 3

Results – Phase II

Reading scores improved by about 10 percentage points (45% to 56%), but still < 60%.

Reading skills and mathematics exam results were considered in relation to matric English and matric first-language (African language) results, and to mean mathematics assignment marks.

Strongest correlation was between pretest reading scores and mathematics exam mark; highly significant.

Page 4

Results – Phase III (2002)

Reading speed and comprehension test data (n = 78) (voluntary submission)

Mean reading speed 128 wpm (lowest 14 wpm – 11% in final exam)

Mean comprehension score approx. 70%

Page 5

Results – Phase III (2003)

Reading speed and comprehension test data (n = 1 345) (1st assignment)

Mean reading speed 115 wpm (many < 20 wpm)

Mean comprehension score approx. 70%

Reading skill (total 54): anaphoric reference (17), logical relations, academic vocabulary (in relation to maths), visual literacy

Page 6

Correlations

High correlation between anaphor score and total reading score (0,830)

Moderately high correlation between comprehension and total reading score (t = 0,581; t² = 0,338)

Total reading score correlated more strongly with exam mark than did other aspects of reading (t = 0,455; t² = 0,207)

Attrition rate rose from 27% in 1997 to 64% in 2004 – effect of poor reading skills for drop-outs?

Page 7

Diagnostic process necessary

A longer intervention programme had little impact (measured in terms of exam results): students did not or could not use the video effectively on their own (feedback).

Students made aware of potential reading problems might be more motivated to take advantage of a facilitated reading intervention programme.

Page 8

Project assignment: 2001, 2002, 2003

Reading/language problems

Meta-cognitive problems: lack of critical reasoning skills undermines conceptual development

General knowledge problems: poor general knowledge impedes students’ ability to learn from examples used to illustrate concepts

Page 9

Assessment internationally and nationally accepted.

A New Academic Policy for Programmes and Qualifications in Higher Education (CHE, 2001) proposed an outcomes-based education model with a commitment to learner-centredness. Learner-centredness => a smooth interface between learners and learning activities, which is not possible without a clear sense of learner competence on admission.

Pre-registration entry-level assessment can facilitate smoother articulation.

Diagnostic assessment accepted (2003)

Page 10

The official view …

Institutions ‘will continue to have the right to determine entry requirements as appropriate beyond the statutory minimum. However, ..., selection criteria should be sensitive to the educational backgrounds of potential students ... .’ (Gov. Gazette 1997)

Page 11

SAUVCA’s role

National initiative to develop Benchmark Tests: academic literacy, numeracy, mathematics

Page 12

Piyushi Kotecha (SAUVCA CEO):

“This is a timeous initiative as the sector is ensuring that it will be prepared for the replacement of the Senior Certificate by the FETC in 2008. Specifically, national benchmark tests will gauge learner competencies so that institutions can better support and advise students. It will also enable higher education to ascertain early on the extent to which the FETC is a reliable predictor of academic success.

Page 13

Piyushi Kotecha (cont.)

“This exercise will therefore create a transparent standard that will ensure that learners, teachers, parents and higher education institutions know exactly what is expected when students enter the system. As such it will also allow for greater dialogue between the schooling and higher education sectors.”

Page 14

Purpose of Diagnostic Assessment

Better advice and support; scaffolding for ‘high-risk’ students.

Critical information can support increased access and throughput.

Admissions and placement testing can lead to a reduction in the number of high-risk students admitted to various degree programmes.

Benchmark Tests can provide standardised tests without the process dictating how different universities should use the results.

Page 15

Some difficulties …

Economies of scale favour large enrolments

ODL principles embrace all students

Moral obligation to be honest with potential students and responsible in the use of state funding

Cost–benefit considerations

Page 16

Establishing the testing process

Meetings with stakeholders

Assessment not approved as a prerequisite to study, but as a compulsory (experimental) co-registration requirement: students register for the mathematics module and simultaneously for diagnostic assessment, even if they register late and are assessed mid-year. BUT later assessment allows less time for remedial action; students are therefore advised to delay registration until after assessment.

Computerised testing is the ideal; it was not initially feasible.

Page 17

The process (cont.)

Two module codes created for the two assessment periods: (i) supplementary exam period in Jan and (ii) end of first semester

Procedures managed by the Exam Dept

Explanatory letter for students; Marketing; Calendar; Access Brochure

Page 18

Selecting content: some considerations

Internationally accepted standards

Assessment tools adapted to suit specific UNISA requirements

Reliability and validity important criteria

Need for practical, cost-effective measures

Various options investigated, e.g. AARP (UCT); ACCUPLACER and UPE adaptations; UP

Page 19

UCT & UPE

AARP possibly most appropriate (trial undertaken at UNISA in 1998), but time consuming and not computer-based.

UPE is demographically and academically similar to UNISA, with research into placement assessment since 1999. ACCUPLACER (USA/ETS) found appropriate.

Page 20

ACCUPLACER at UPE

Computerised adaptive tests (unique for each student) for algebra, arithmetic and reading (pen-and-paper options available – only one test).

A profile developed across all the scores in the battery (including school performance)

Regression formulae and classification functions used to classify potential students with respect to risk.

Formulae based on research. 

Page 21

ACCUPLACER Reading Comprehension

Established reliability and validity

Possible bias: seven questions possibly ambiguous or with cultural bias – this did not detract from the items’ potential to assess construction of meaning from text. Left in original format for the first round of testing; can delete or adapt later.

Own standards

Page 22

Basic Arithmetic Test

‘Home grown’

Testing for potential

Assesses understanding rather than recall

Items based on misconceptions that are significant barriers to understanding the content, and on recognised problem areas

Experiment on benefit of drill-and-practice

Reliability and validity to be established

Page 23

Test format and data capture

Three hours, 35 MCQs (four options) in each category (total 70)

Mark-reading sheets; papers returned

Assignment Section captured marks

Computer Services processed marks and determined categories

Marks made public

Page 24

Aspects assessed in ARC

Aspect tested: Questions

Causal relations: 2, 4
Contrastive relations: 3, 7, 8, 23
Recognition of sequence: 5
Interpretation of implied or inferred information: 10, 14, 15, 17, 20, 27, 32, 33, 35

Page 25

Aspects assessed in ARC (cont.)

Aspect tested: Questions

Comprehension/interpretation of factual information (detailed/general): 9, 11, 23, 28, 29
Academic vocabulary: all except 5, 26
Number sense: 23
Recognition of contradiction, inconsistency: 12
Substantiation: 1, 6

Page 26

Aspects assessed in BAT

Aspect tested: Questions

Simple arithmetic operations (whole numbers): 19
Simple arithmetic operations (fractions/decimals): 12
Pattern recognition: 3
Number sense: 5
Conversion of units: 3
Academic (maths)/technical vocabulary: 3 / 4

Page 27

Aspects assessed in BAT (cont.)

Aspect tested: Questions

Comparison: 10
Time – how long / Time – when: 3 / 2
‘Translation’ from words to mathematical statements: 17
Recognition of insufficient or redundant information: 4
Learning from explanation: 2
Spatial awareness: 3
Insight: 14

Page 28

Grading criteria

Three categories considered (explained in T/L 101)

Category 53: Students likely to be successful with no additional assistance.

Category 52: Students likely to be successful provided they had support.

Category 51: Students unlikely to be successful without assistance beyond that available at the university.

Criteria for classification based on ACCUPLACER guidelines and empirical results following Phases I, II and III of the reading intervention.

Page 29

Criteria – Reading Comprehension

The conversion table in the ACCUPLACER Coordinator’s Guide converts a raw score out of 35 to a score out of 120, so some form of weighting takes place. 0 to 4 out of 35 is equivalent to 20 points. The increment between successive numbers of correct answers increases gradually from 1 to 4 – a ‘reward’ for obtaining a greater number of correct answers.

Page 30

ACCUPLACER recommendations

Three categories:
Weakest: 51 out of 120 (~ 31%)
Moderate: 80 out of 120 (~ 60%)
Good: 103 or higher (~ 83%)

Scores reflect different reading skills, outlined in the Coordinator’s Guide. From Phases I and II of the reading project: 60% is roughly the threshold below which reading skills are too weak to support effective study.

Page 31

Our score

Giving students the ‘benefit of the doubt’, the upper boundary of the lowest category in the MDA (students at risk with respect to reading) was pitched lower, at 70 out of 120 (~ 50%).

Page 32

Comparison of boundaries

High-risk category upper boundary: raw score (0–35), converted score (0–120), %

UNISA:
22 → 67 (47%)
23 → 71 (51%)

ACCUPLACER:
25 → 79 (59%)
26 → 84 (64%)
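The % column appears to follow from the 20-point baseline described earlier (0–4 correct already earns 20 points): in every pair shown, the quoted percentage equals the converted score minus 20. A sketch of this inferred relationship (the formula itself is not stated on the slides):

```python
# Inferred, not stated on the slides: converted ARC scores start at a
# 20-point baseline, so the quoted percentage is (converted - 20) out
# of the remaining 100-point effective range.
def converted_to_percent(converted: int) -> int:
    return converted - 20

# Converted score -> % pairs from the comparison-of-boundaries table.
boundary_pairs = {67: 47, 71: 51, 79: 59, 84: 64}
for conv, pct in boundary_pairs.items():
    assert converted_to_percent(conv) == pct

# The ACCUPLACER category boundaries quoted earlier also match:
assert converted_to_percent(51) == 31   # weakest
assert converted_to_percent(80) == 60   # moderate
assert converted_to_percent(103) == 83  # good
print("all quoted percentages match converted - 20")
```

Every percentage quoted in the deck is consistent with this reading, which is why the 70-out-of-120 MDA boundary is reported as ~ 50%.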

Page 33

Criteria - Basic Arithmetic Test

Questions weighted: the intention is to enable students who demonstrate greater insight to score better than those who have memorised procedures without understanding.

Simplest, procedural questions (such as addition or multiplication): W = 1 or 1,5

Questions requiring some insight and interpretation of language: W = 2, 2,5 or 3

Raw score out of 35 → weighted final score, maximum 69

Page 34

Weight distribution of BAT questions

Category          Weight  % of total  No. of items  Aspect assessed
Easy              1       10          7             Simple arithmetic
Easy              1,5     13          6             + fractions/decimals
Moderately easy   2       30          10            + number sense, language, time
Moderately diff.  2,5     22          6             + money, insight
Difficult         3       25          6             + pattern

Page 35

Cumulative totals

Score up to 53% by correctly answering easy to moderately easy items (W = 1, 1,5 or 2)

Score up to 75% by correctly answering ‘easy’, ‘moderately easy’ and ‘moderately difficult’ items (W up to 2,5)

Score over 75% only if ‘difficult’ items (W = 3) also answered correctly.
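The cumulative totals follow from the weight-distribution table; a quick sketch (weights and item counts taken from the previous slide):

```python
# Weight W -> number of items, from the BAT weight-distribution slide.
items_per_weight = {1.0: 7, 1.5: 6, 2.0: 10, 2.5: 6, 3.0: 6}

# Maximum weighted score: sum of weight x item count.
max_score = sum(w * n for w, n in items_per_weight.items())
print(max_score)  # 69.0

# Cumulative score (and percentage) using only items up to each weight.
cumulative = 0.0
for w in sorted(items_per_weight):
    cumulative += w * items_per_weight[w]
    print(f"W <= {w}: {cumulative:.0f}/69 = {cumulative / max_score:.0%}")
# W <= 2 gives 36/69 (about 52%) and W <= 2,5 gives 51/69 (about 74%),
# close to the 53% and 75% quoted on the slide.
```

The small gap between 52%/74% here and the slide's 53%/75% presumably comes from rounding in the original calculation.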

Page 36

Setting the lower limit

10 items (17% of total) are purely computational – no reading skills required. Possible for all students to answer these questions correctly.

25 items (83% of total) depend on reading and other skills.

60% ‘threshold’ => 17% + (60% of 83%), i.e. 67% set as the lower boundary for the BAT (~ weighted score 46 out of 69).

Students with < 46 at risk with respect to numerical and quantitative reasoning.
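The boundary arithmetic above can be written out as a short check (shares rounded as on the slide):

```python
# 10 of the 35 items are purely computational; 25 depend on reading.
# The slide rounds these shares to 17% and 83%.
computational_share = 0.17
reading_dependent_share = 0.83

# Lower boundary: full credit on the computational items, plus the 60%
# reading threshold applied to the reading-dependent items.
boundary = computational_share + 0.60 * reading_dependent_share
print(round(boundary * 100))  # 67 (percent)

# Translated to the weighted BAT scale (maximum 69):
print(round(boundary * 69))   # 46
```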

Page 37

Setting the upper limit

ARC top category begins at a score of 103 (approximately 83%)

BAT equivalent: 57 out of 69

No other empirical evidence – 57 set as the cut-off point for high achievement in the BAT

Page 38

MDA categories

Category  Reading (weighted)   BAT (weighted)
51        S < 70        OR     S < 46
53        S > 103       AND    S > 59
52        all other scores
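The category table translates directly into a small classification function; a sketch (boundaries as in the table; how scores exactly on a boundary are treated is not specified, so the strict inequalities here are an assumption):

```python
def mda_category(reading: int, bat: int) -> int:
    """Assign an MDA risk category from weighted reading (ARC) and BAT scores.

    51: unlikely to succeed without assistance beyond the university's
    52: likely to succeed provided they have support
    53: likely to succeed with no additional assistance
    """
    if reading < 70 or bat < 46:      # at risk on either component
        return 51
    if reading > 103 and bat > 59:    # strong on both components
        return 53
    return 52                         # all other scores

print(mda_category(65, 50))   # weak reading -> 51
print(mda_category(110, 62))  # strong on both -> 53
print(mda_category(90, 50))   # neither condition -> 52
```

Note that a single weak component is enough for category 51, while category 53 requires strength on both – the OR/AND asymmetry in the table.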

Page 39

The assessment process

Procedural issues

Problems with the co-registration requirement

Several administrative problems

Page 40

ARC results

Good item–test correlations. Only five questions with correlations below 0,5.

Students scored reasonably well on most of the potentially problematic items. Only three of these had low item–test correlations.

Page 41

BAT results

Weaker item–test correlations than for the Reading Comprehension score.

Low (< 0,30), moderate and high item–test correlations in all question categories.

Reading may play a greater role in categorisation.

Page 42

Consolidation of results

January:

Category 53: 10 (3%)
Category 52: 93 (29%)
Category 51: 223 (68%)
Total: 326

Note: Oct 03/Jan 04 exams: 76% failed

Page 43

Consolidation (cont.)

June (after exposure to mathematics and English study material):

Category 53: 35 (4%)
Category 52: 176 (21%)
Category 51: 623 (75%)
Total: 834

Page 44

Further analysis of results

 Assignments (January group):

Category 51 mean = 39%

Category 52 mean = 48%

Category 53 mean = 65%

Page 45

Exam results (all students)

Registered: 1 518
Not admitted to the exam: 912
Obtained exam admission: 606
Wrote exam: 551
Passed: 162 (October: 145; January: 17)
Exam mean: 27%
MDA students with exam results: 463
MDA exam mean: 35%

Page 46

Exam results by risk category

Category  No. of students  Mean exam score
51        332 (72%)        30%
52        106 (23%)        45%
53        25 (5%)          57%

Page 47

Exams – no ARC

Category  No. of students  Mean exam score
51        136 (29%)        26%
52        189 (41%)        32%
53        138 (30%)        48%

Page 48

No exam admission

Cat.  MDA students in category (n = 1 160)  Without exam admission (n = 698)  % of category without admission
51    846                                   514 (74%)                         514/846 = 61%
52    269                                   164 (23%)                         164/269 = 61%
53    45                                    20 (3%)                           20/45 = 44%

Page 49

Comparison between students who wrote/did not write MDA

                Wrote MDA (n = 463)  Did not write MDA (n = 101)  Did not write / wrote
Pass (n = 162)  125 (77%)            37 (23%)                     37/125 = 30%
Fail (n = 401)  337 (84%)            64 (16%)                     64/337 = 19%

Page 50

Implications of assessment

Counselling essential, especially for Category 51 and 52 students. All potential support options are dependent on staff and resource allocation. Options:

The Institute for Continuing Education (ICE): advice regarding alternative directions of study, or measures to upgrade academic skills before studying. Initially agreed to investigate such options, but no progress to date.

Page 51

Implications of assessment

National Tutorial Support Coordinator seemed in favour of using information obtained in the assessment process to inform the tutorial programme. No information forthcoming on support options or the necessary data collection procedures.

The Bureau for Student Counselling and Career Development (BSCCD) staff willing to assist where possible, but staff not deployed at all centres.

  

Page 52

Implications of assessment

The Povey Centre (instruction in English language proficiency, reading and writing skills) may be able to provide some reading instruction via the Learning Centres at no additional charge, other than the basic Learning Centre registration fee required from all students who register for tutorial classes (START programme). No clarity yet regarding the extent to which this will be rolled out; the impact for mathematics is still to be investigated.

Page 53

Implications for qualitative research

Psychological implications of ‘going to university’ but needing first to be taught how to count and read?

Cost implications of referral?

Gatekeeping or gateway?

Page 54

Pre-registration assessment…

… is critically important.

Costs, benefits, advantages and disadvantages of instituting diagnostic assessment for mathematics at UNISA need to be thoroughly investigated.

Information regarding the implementation process must be well analysed and utilised.

True access depends on providing students with appropriate career guidance.

Ongoing research (quantitative & qualitative) into specific test aspects and components essential.

Page 55: 15 March 2005 Introducing Diagnostic Assessment at UNISA Carol Bohlmann Department Mathematical Sciences