
Page 1

Sterling Practices in Design & Scoring of Performance-Based Exams

#156
F. Jay Breyer
Jay.breyer@thomson.com

Presented at the 2005 CLEAR Annual Conference, September 15-17, Phoenix, Arizona

Page 2

The Players
• F. Jay Breyer, PhD (Thomson Prometric)
• Ron Bridwell, PE (National Council of Examiners for Engineering & Surveying)
• Beth Sabin, DVM, PhD (American Veterinary Medical Association)
• Ron Rodgers, PhD (CTS/Employment Research & Development)
• Elizabeth Witt, PhD (American Board of Emergency Medicine)

Page 3

Scoring Procedures for STRUCTURAL II

September 2005

Ron Bridwell, P.E.

Page 4

Introduction

The Exam

Before the Scoring Session

The Scoring Process

The Cut Score (Passing Point) Process


Page 8

The Exam

Scoring Protocol Development
• Need to standardize the STR II scoring guidelines using a benchmark holistic method.
• Scoring can drift due to fatigue or anger.

Scoring Criteria Development
• Developed by the exam committee as the problems are developed.
• Candidates may respond differently than anticipated.

Page 9

Before the Scoring Session

Tasks

Identifying Scoring Committee Members
• Most familiar with the problems
• Coordinators work with staff
• Empowered to modify criteria as needed

Page 10

Before the Scoring Session

Tasks

Identify Sample Papers
• 5 benchmarks for training
• Range finders for training
• 5 benchmarks for certification (one way to pull such samples is sketched below)
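A minimal sketch of one way such sample papers might be pulled from a pool of committee-scored responses. The 0-10 holistic scale, the data layout, and the chosen score levels are illustrative assumptions, not the committee's documented procedure:

```python
# Hypothetical selection of sample papers from committee-scored responses.
# The 0-10 holistic scale, data layout, and score levels are illustrative
# assumptions; the slides do not specify the committee's actual procedure.

import random

# Assumed pool: paper ID -> committee score on a 0-10 holistic scale.
scored_papers = {f"P{i:03d}": random.randint(0, 10) for i in range(200)}

def pick_at_levels(papers, levels, per_level=1):
    """Pick up to per_level papers at each requested score level."""
    picks = []
    for level in levels:
        matches = [pid for pid, score in papers.items() if score == level]
        picks.extend(matches[:per_level])
    return picks

training_benchmarks = pick_at_levels(scored_papers, levels=[0, 3, 5, 7, 10])      # 5 benchmarks for training
certification_benchmarks = pick_at_levels(scored_papers, levels=[1, 4, 6, 8, 9])  # 5 benchmarks for certification
range_finders = pick_at_levels(scored_papers, levels=range(11))                   # one paper per score point
print(len(training_benchmarks), len(certification_benchmarks), len(range_finders))
```

Picking papers at fixed score points keeps the samples spread across the whole scale, which is the purpose of range finders.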

Page 11

The Scoring Process

Tasks

Training the Scorers
• Scorers should be skilled at assigning scores to specific problems
• Scorers are trained with benchmark papers

Certifying Scorers
• 5 benchmark papers are given to scorers as a test, scored pass or fail (a check of this kind is sketched below)
• Scorers have two chances to be certified
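The certification step above is mechanical enough to express as a check. A minimal sketch, assuming integer holistic scores and a within-1-point tolerance; the tolerance value is an assumption, while the two-attempt limit comes from the slide:

```python
# Hypothetical certification check: a scorer passes if every score assigned
# to the 5 certification benchmarks falls within a tolerance of the
# committee's reference score. The tolerance value is an assumption.

TOLERANCE = 1      # assumed: within +/-1 of the benchmark score counts as a match
MAX_ATTEMPTS = 2   # the slides allow two chances to be certified

def certify(scorer_scores, benchmark_scores, tolerance=TOLERANCE):
    """Return True (pass) if every benchmark is matched within tolerance."""
    if len(scorer_scores) != len(benchmark_scores):
        raise ValueError("one score per benchmark paper is required")
    return all(abs(s - b) <= tolerance
               for s, b in zip(scorer_scores, benchmark_scores))

benchmarks = [4, 7, 2, 9, 5]   # committee scores for the 5 certification benchmarks
attempts = [[4, 6, 2, 8, 5]]   # one attempt, every score within +/-1
certified = any(certify(a, benchmarks) for a in attempts[:MAX_ATTEMPTS])
print("certified" if certified else "not certified")  # certified
```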

Page 12

The Scoring Process

Tasks

Scoring
• Care is taken to ensure the scorers do not know the names or jurisdictions of the examinees
• Papers are scored blind, as if by machine
• Each paper is scored by two scorers
• If the two scores agree or differ by no more than 1, the averaged score is assigned
• If they differ by more than 1, the coordinator adjudicates (sketched below)
• Any scorer could be replaced by any other and the same score would result
• The database provides feedback
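The two-scorer resolution rule is simple to state as code. A minimal sketch, assuming integer holistic scores; the adjudicate callback stands in for the coordinator's judgment and is a placeholder, not part of the documented STR II procedure:

```python
# A minimal sketch of the two-scorer resolution rule described above.
# The adjudicate() callback is a placeholder for the coordinator's
# judgment on discrepant papers.

def resolve_score(score_a, score_b, adjudicate):
    """Combine two independent holistic scores for one paper.

    If the scores agree or differ by no more than 1 point, the average
    is assigned; otherwise the coordinator adjudicates.
    """
    if abs(score_a - score_b) <= 1:
        return (score_a + score_b) / 2
    return adjudicate(score_a, score_b)

# Example: discrepant scores go to the coordinator, who supplies a final score.
final = resolve_score(4, 7, adjudicate=lambda a, b: 5)
print(final)   # 5
# Example: adjacent scores are simply averaged.
agreed = resolve_score(6, 7, adjudicate=None)
print(agreed)  # 6.5
```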

Page 13

Monitoring Solution for Fair Scoring: Report Components

Kinds of Information
• Discrepancy: the number of papers to be adjudicated and the total required adjudications by scorer; re-training may be necessary if a scorer requires too many.
• Agreement: shows the consistency of each scorer paired with all partners, in aggregate and separately; useful for scoring reliability (a sketch of these statistics follows below).
• Summary: shows how many papers were scored, with the number, mean, and SD read by each scorer and coordinator, by problem and across the entire test; serves as the bookkeeping record.

These reports drive the adjudication, resolution, and re-training cycle that is the road to fair, quality scores.
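A minimal sketch of how the Agreement and Summary components of such a report might be computed from a log of double-scored papers. The data layout and the within-1-point definition of agreement are assumptions for illustration:

```python
# Assumed log of double readings: (scorer, partner, scorer's score,
# partner's score). Agreement within 1 point is an assumed definition.

from statistics import mean, stdev

readings = [
    ("S1", "S2", 6, 7), ("S1", "S3", 4, 4),
    ("S2", "S3", 8, 5), ("S1", "S2", 3, 3),
]

# Agreement: fraction of readings within 1 point, for each scorer pair.
pairs = {}
for a, b, sa, sb in readings:
    key = tuple(sorted((a, b)))
    pairs.setdefault(key, []).append(abs(sa - sb) <= 1)
for key, hits in sorted(pairs.items()):
    print(key, "agreement:", sum(hits) / len(hits))

# Summary: number of papers read, mean, and SD per scorer.
scores = {}
for a, b, sa, sb in readings:
    scores.setdefault(a, []).append(sa)
    scores.setdefault(b, []).append(sb)
for scorer, vals in sorted(scores.items()):
    sd = stdev(vals) if len(vals) > 1 else 0.0
    print(scorer, "n:", len(vals), "mean:", round(mean(vals), 2), "SD:", round(sd, 2))
```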

Page 14

Overview of Standard Setting Process
1. Definition of Competence
2. 3 Uniform Solution Samples Selected
3. Training undertaken
4. Practice Session
5. Real Rating
6. Report Results
7. Assign candidates to PASS/FAIL status based on comparison of total performance to the Standard (sketched below)
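A minimal sketch of that final step, comparing each candidate's total performance to the Standard. The cut score and candidate totals are illustrative assumptions; only the comparison rule comes from the slide:

```python
# Hypothetical final step: compare total performance to the cut score.
# CUT_SCORE and the candidate totals are illustrative assumptions.

CUT_SCORE = 48  # assumed Standard from the panel; not stated in the slides

candidates = {"C001": 52, "C002": 47, "C003": 48}  # total performance scores

def classify(total, cut=CUT_SCORE):
    """Candidates at or above the Standard pass; all others fail."""
    return "PASS" if total >= cut else "FAIL"

for cid, total in sorted(candidates.items()):
    print(cid, classify(total))
# C001 PASS, C002 FAIL, C003 PASS
```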

Page 15

Questions?