
Page 1

Progress Testing with SIR
A Case Study Based on the McMaster Undergraduate MD Programme

SIR UK User Group Conference
Aberdeen, UK, 21 June 2002

David Keane, Research Associate ([email protected])
Programme for Educational Research and Development
Faculty of Health Sciences, McMaster University
Hamilton, Ontario, Canada
www.fhs.mcmaster.ca/perd

Page 2

Objectives

1. Introduction to progress testing
   – definition
   – purpose
   – method
   – goals
   – special features
   – basic patterns in performance data

2. Using SIR for progress testing

Page 3

Progress testing \ a definition

• longitudinal testing of knowledge acquisition

Page 4

Progress testing \ a definition

• longitudinal testing of knowledge acquisition

• an objective method for assessing the acquisition and retention of knowledge over time relative to curriculum-wide goals

Page 5

Progress testing \ definition detail

• objective
  – use multiple choice questions

• knowledge
  – test what learners know

• over time
  – test repeatedly at regular intervals

• curriculum-wide
  – address end-of-programme learning objectives

an objective method for assessing the acquisition and retention of knowledge over time relative to curriculum-wide goals

Page 6

Progress testing \ the purpose

• to determine whether the learner is progressing

– learning enough?

– retaining enough?

– doing so quickly enough?

Page 7

Progress testing \ the method

• progress is relative
  – compare learner to his/her peer group
    • current class or past n classes
    • standardized scores (z-scores)
  – review performance on multiple tests
    • current and past

• assessed with one measure
  – percentage correct, whole test
    • adjust for guessing?
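Purely as an illustration of the measures on this slide (not the Programme's actual SIR/PQL code), the sketch below computes a whole-test percentage-correct score and standardizes it against a peer group as a z-score; the data and function names are hypothetical.

```python
from statistics import mean, stdev

def percent_correct(responses, key):
    """Whole-test score: percentage of items answered correctly.
    responses: item_id -> chosen option (omitted items simply absent)
    key:       item_id -> correct option"""
    right = sum(1 for item, answer in responses.items() if key.get(item) == answer)
    return 100.0 * right / len(key)

def z_score(score, peer_scores):
    """Standardize one examinee's score against a peer group,
    e.g. the current class or the past n classes at the same week."""
    return (score - mean(peer_scores)) / stdev(peer_scores)

# hypothetical example: one examinee compared with a small peer group
peers = [42.0, 55.5, 48.3, 60.1, 51.7]
print(round(z_score(58.0, peers), 2))
```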

Page 8

Progress testing \ goals

• help the learner (formative evaluation)

– constructive feedback
  • about their knowledge base

• about their ability to self-assess

• has to be specific/detailed

– timely feedback
  • reassure those who are progressing
  • alert those who are not
    – do so early enough to facilitate effective remediation

Page 9

Progress testing \ goals

• help the Programme (summative evaluation)

– provide defensible evidence to support critical decisions pertaining to individuals

• mandated remediation

• pass / fail / conditional advancement

• the emphasis..
  – on formative aspects
    • minimize negative impact on learning behaviours
      – tutorial functioning
      – self/peer-assessment

Page 10

Progress testing \ special features

• the item bank
  – a sample of the knowledge that a good student will likely encounter by the time that student..
    • graduates?
    • is six months / a year beyond graduation?
  – content encompasses nearly the 'entire' domain of the field in question
    • cf. course/curriculum 'core' knowledge

Page 11

Progress testing \ special features

• instructions to examinees

– don't study for this test
  • 180 items, randomly selected from 2,600+

– don't guess
  • test your ability to self-assess

• penalty for guessing (optional)

– attempt only those items for which you have some knowledge and are reasonably confident you know the best/correct answer
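The slides treat the guessing penalty as optional and give no formula. One common correction, shown here only as an example of how such a penalty can work and not necessarily the one used at McMaster, subtracts a fraction of wrong answers; the names and numbers below are invented.

```python
def corrected_score(num_right, num_wrong, options_per_item=5):
    """Classic correction for guessing: R - W/(k - 1), where k is the number
    of options per item. Omitted items neither add nor subtract, which rewards
    examinees who attempt only items they are reasonably confident about."""
    return num_right - num_wrong / (options_per_item - 1)

# hypothetical example: of 180 items, 90 right, 20 wrong, 70 omitted
print(corrected_score(90, 20))  # 85.0
```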

Page 12

Basic patterns in performance data

• Class means on whole test for..
  – % attempted
  – % correct
    • not adjusted for assumed guesses

• look at patterns..
  – across time
    • week in programme (x of 138)
  – across classes at week x

Page 13

[Figure: Items Attempted (%)]

Page 14

[Figure: Items Correct (%)]

Page 15

Basic patterns in performance data

• % attempted, % correct

– patterns are relatively stable across tests and classes

– means at week x are relatively consistent across tests and classes

– examinee performance is relatively consistent across tests and classes
  • overall test reliability 0.6–0.7 (8 tests)
  • test-retest correlation 0.6–0.8 (2 tests)
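As one way to see where a test-retest figure like the one above comes from, here is a generic Pearson correlation between two administrations; the scores are invented and this is not the Programme's own analysis code.

```python
from statistics import correlation  # available in Python 3.10+

# hypothetical percent-correct scores for the same examinees on two consecutive tests
test_1 = [41.2, 55.0, 48.9, 62.3, 50.1, 58.4]
test_2 = [44.8, 53.2, 51.0, 60.7, 47.9, 61.5]

# Pearson test-retest correlation across the two administrations
print(round(correlation(test_1, test_2), 2))
```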

Page 16

End of Part I

Introduction to Progress Testing

Any questions?

Page 17

Objectives

1. Introduce progress testing

2. Using SIR for progress testing
   – what's in an item?
   – data management tasks
   – managing dm tasks
     • software
     • databases and pql

– SIR \ valued features

– future enhancements

Page 18

What's in an item? (1)

• the examinee sees..

Stem:

nn. An elderly woman has been showing signs of forgetfulness, poor concentration, and decreased energy and appetite. On examination her cognitive functioning seems quite good and her mini-mental (Folstein) score is 27/30. The most likely diagnosis is:

Options:

A  Anxiety disorder
B  Multi-infarct dementia
C  Alzheimer disease
D  Personality disorder
E  Depression

Page 19

What's in an item? (2)

• the data manager sees..
  – stem and options (text)

and

– unique item identifier

– correct response code
– content codes (6 fields, 1, 2 or 3 sets)

– item performance data
  • stats on usage, power to discriminate
  • by test, class; across tests, classes

– and more..
  • date last used, don't use flag
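A rough sketch of the record behind each item as the data manager sees it, written here as a Python dataclass purely for illustration; the real records live in the SIR ITEMS database, and every field name below is an assumption.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Item:
    item_id: str                          # unique item identifier
    stem: str                             # question text shown to the examinee
    options: dict[str, str]               # e.g. {"A": "Anxiety disorder", ..., "E": "Depression"}
    correct: str                          # correct response code, e.g. "E"
    content_codes: list[str] = field(default_factory=list)       # content classification (6 fields, 1-3 sets)
    performance: dict[str, float] = field(default_factory=dict)  # usage / discrimination stats by test and class
    last_used: Optional[date] = None      # date last used
    do_not_use: bool = False              # "don't use" flag
```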

Page 20

Data management tasks

1. store, retrieve, manipulate and print large volumes of textual information

• pre-test: test booklets
  – 180 items, 21–22 pages/booklet

• post-test: performance reports
  – for examinees: 2 reports x 1–2 pages/report
  – for administrator: re. items, tests, classes and examinees who are not progressing

• accommodate special needs re.
  – special characters – Greek letters, math symbols
  – page layout, fonts, typeface style

• merge data into report templates
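As a toy illustration of the pre-test task (draw 180 items from the bank and merge them into a booklet layout), and not the actual WordPerfect merge/macro application, the sketch below assumes an invented item-bank format: a list of dicts with 'stem', 'options' and a 'do_not_use' flag.

```python
import random

def assemble_booklet(item_bank, n_items=180, seed=None):
    """Randomly select n_items usable items and render a plain-text booklet,
    standing in for the WordPerfect merge form that produces the real booklets."""
    rng = random.Random(seed)
    usable = [item for item in item_bank if not item.get("do_not_use")]
    selected = rng.sample(usable, n_items)
    lines = []
    for number, item in enumerate(selected, start=1):
        lines.append(f"{number}. {item['stem']}")
        for code in sorted(item["options"]):
            lines.append(f"   {code}  {item['options'][code]}")
        lines.append("")            # blank line between items
    return "\n".join(lines)
```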

Page 21

Data management tasks \ post-test

2. read examinees' responses
   – 100-item optical mark response sheets
   – tab-delimited ascii records
     • Mac: 2 sheets x approx. 280|420 examinees

3. score examinees' responses
   – requires item, test, class, examinee info
   – compute and retain performance stats
     • key measures: % attempted, % correct
       – mean & sd re. whole test (and major subdomains?)
     • for: each examinee, each class|peer group
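A simplified sketch of steps 2–3: read tab-delimited response records and compute the two key measures per examinee, plus class means and SDs. The four-column record layout (examinee, class, item, answer) and all names are assumptions, not the actual optical-scan format.

```python
import csv
from statistics import mean, stdev

def score_responses(path, key):
    """key: item_id -> correct option; path: tab-delimited file with one record
    per response: examinee_id, class_id, item_id, answer ('' if omitted)."""
    per_examinee = {}
    with open(path, newline="") as handle:
        for examinee, class_id, item_id, answer in csv.reader(handle, delimiter="\t"):
            rec = per_examinee.setdefault(examinee,
                                          {"class": class_id, "attempted": 0, "correct": 0})
            if answer:                                   # blank answer = item omitted
                rec["attempted"] += 1
                if key.get(item_id) == answer:
                    rec["correct"] += 1

    n_items = len(key)
    scores = {e: (100 * r["attempted"] / n_items, 100 * r["correct"] / n_items)
              for e, r in per_examinee.items()}

    # class-level summary: mean and SD of % correct
    by_class = {}
    for r in per_examinee.values():
        by_class.setdefault(r["class"], []).append(100 * r["correct"] / n_items)
    summary = {c: (mean(v), stdev(v) if len(v) > 1 else 0.0) for c, v in by_class.items()}
    return scores, summary
```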

Page 22

Data management tasks \ post-test

4. compute and retain item performance stats
   – requires item, test, class, examinee info

5. compute/retrieve data needed in standard reports

– re. examinees, classes, tests, items

6. assemble and print reports
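Step 4 does not spell out which statistics are kept; a common pair is item difficulty (proportion correct) and a point-biserial discrimination index, sketched below with invented data and with no claim that this is the Programme's definition of "power to discriminate".

```python
from statistics import mean, stdev

def item_stats(item_correct, total_scores):
    """item_correct: 1/0 per examinee for one item; total_scores: each examinee's
    whole-test score, in the same order. Assumes the item has at least one correct
    and one incorrect response. Returns (difficulty, point-biserial discrimination)."""
    p = mean(item_correct)                                   # proportion correct = difficulty
    mean_right = mean(t for i, t in zip(item_correct, total_scores) if i == 1)
    mean_wrong = mean(t for i, t in zip(item_correct, total_scores) if i == 0)
    r_pb = (mean_right - mean_wrong) / stdev(total_scores) * (p * (1 - p)) ** 0.5
    return p, r_pb

# hypothetical example: 6 examinees
print(item_stats([1, 0, 1, 1, 0, 1], [62.0, 41.0, 55.5, 60.0, 48.0, 58.5]))
```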

Page 23

Data management tasks

7. enable support staff to do all of the above with relative ease

– minimal reliance on the application programmer after everything is up and running

Page 24

What's the best tool for the job?

• SIR is not a word processor

• SIR is a record management and stats-oriented reporting tool
  – allows user to build powerful custom applications
  – vendor provides exceptional support beyond the installed Help files
    • prompt, relevant and practical replies

Page 25

Solutions \ the best tool

• the MD Programme's solution

– for text-intensive tasks..

Corel WordPerfect

– for numeric data and stats-intensive tasks..

SIR

Page 26

Solutions \ Corel WordPerfect

• a set of merge data files (database)
  – case-based by item id
  – item stems, options and other fixed info

• export data via csv or fixed-format records

• import data via csv-format records
  – into merge data files

• multiple merge forms (report templates)

• extensive use of merge and macro commands to produce pre/post-test reports

• custom-build merge|macro applications
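To make the data exchange concrete, here is a small sketch of writing one csv record per examinee for the report-template merge; the column layout is invented, and in the real application the exchange is handled between SIR and Corel WordPerfect merge data files.

```python
import csv

def write_merge_data(path, results):
    """results: examinee_id -> (pct_attempted, pct_correct, class_mean, class_sd).
    Writes one csv record per examinee, ready to be imported into a merge data file."""
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["examinee_id", "pct_attempted", "pct_correct", "class_mean", "class_sd"])
        for examinee, (attempted, correct, cls_mean, cls_sd) in sorted(results.items()):
            writer.writerow([examinee, f"{attempted:.1f}", f"{correct:.1f}",
                             f"{cls_mean:.1f}", f"{cls_sd:.1f}"])
```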

Page 27

Solutions \ SIR ver. 3.2 - 2000

• 2 databases, case-based
  – ITEMS re. items
  – TEEX re. tests, examinees, classes

• major reliance on (vb) PQL
  – custom applications

• csv-format records
  – add/update records/fields (eg, from WP)
  – write records/values (eg, for WP)
  – PQL procedures
    • csv save, tabulate, save table, spss save

Page 28

SIR 2000 \ valued features (1)

• DBMS

– case-based option for db type
  • system-maintained
  • easy access to any case's records
  • case id is on all dumped records

– global variables
  • pass user settings to applications

– utilities
  • Data \ File Dump, File Input

– tabfiles and tables
  • create, index, populate, delete tables

Page 29

SIR 2000 \ valued features (2)

• PQL
  – nested access to cases
  – read csv-format records
  – vb dialog boxes

• PQL Procedures
  – write csv-format records
  – xtab tables, flexibility re. headers (columns)

– write SPSS system files

Page 30

Future enhancements

• upgrade to SIR 2002 (from SIR 2000)

– update custom applications (to vb pql)

– add secondary indexes
  • examinees by name, current class

• web access
  – for examinees: performance reports
  – method?
    • ColdFusion (CF – SQL – ODBC driver – SIR db)
    • CGI scripts
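Both web-access routes are listed only as future enhancements. Purely to illustrate the CGI option (not ColdFusion, and with no real connection to a SIR database), a minimal script might look like the following; fetch_report() is an invented stand-in for whatever would query or read exported SIR data.

```python
#!/usr/bin/env python3
# Minimal CGI sketch using Python's standard cgi module (deprecated since 3.11):
# return one examinee's performance report as plain text.
import cgi

def fetch_report(examinee_id):
    # placeholder lookup; a real version would read data exported from the SIR db
    reports = {"12345": "Test 8: 61% attempted, 43% correct (class mean 45%)"}
    return reports.get(examinee_id, "No report found.")

form = cgi.FieldStorage()
examinee_id = form.getfirst("examinee_id", "")
print("Content-Type: text/plain")
print()
print(fetch_report(examinee_id))
```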

Page 31

End of Part II

Using SIR for progress testing

Any questions?