Regents Scanning – June/August 2011 Debrief


Agenda

1. NYC Scanning Solution
   1. Process Overview
   2. Scanners
   3. Answer Documents
   4. Reports for Schools

2. June/August 2011 Debrief
   1. Summary
   2. Lessons Learned
   3. Process Improvements

3. Q & A


DOE Data Capture Solution

Leverage existing technology and processes currently in use in schools to capture item-level information & provide real-time results. This includes:

• Answer Documents: generated & printed at the school level
• Scoring: teachers score Constructed Response items only
• Scanning: documents are scanned on-site using schools’ existing attendance scanners
• Reporting Results: automated transfers between internal systems

Benefits:
• Increased automation of time-consuming manual processes
• Less overall time required for teachers to complete scoring
• No further data entry of aggregate scores required
• Reduction in clerical scoring errors by eliminating:
  • Manual scoring of MC items
  • Hand-adding totals from different sections
  • Manual application of conversion tables

• Districts/schools/teachers will be able to better utilize item-level results to help inform instruction, similar to Grade 3-8 exams


NYC Scanning Process Overview
• Schools schedule students to sit for Regents exams

• Schools locally print student-specific answer documents
  • Pre-slugged with a bar code (visible student name & ID)
  • Answer documents are 1 or 2 pages, depending on the exam

• The NYC process does not allow blank (non-pre-slugged) answer documents
  • Walk-ins: the school must individually enter the student’s ID into ATS (the local Student Information System) to generate & print that student’s Regents answer document

• Administer test
• Scoring committee scores open-ended items
• DOE inputs the SED-released answer key & conversion table into ATS and “opens” scanning (a sketch of this scoring step follows this overview)

• School scanning team scans all answer documents; data uploads to ATS

• Confirm accuracy of data capture (manually score 5% of exams)

• Re-scan if needed (audit trail of any changes)

• Transfer of final score data: report cards, transcripts, and load to SED


Process annotations: overnight data transfer; real-time scoring (raw & scale score).
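The scoring step in the overview above can be pictured as a small lookup pipeline: machine-scored MC responses are checked against the SED-released key, combined with the teacher-scored CR points, and the raw total is converted to a scale score. The sketch below is hypothetical; the answer key, conversion table, and function names are assumptions for illustration, not the actual ATS logic.

```python
# Hypothetical sketch of the scoring step described above -- not the actual ATS logic.
# The answer key, conversion table, and function names are assumptions for illustration.

ANSWER_KEY = {1: "C", 2: "A", 3: "D"}                  # SED-released key: MC item -> correct choice
CONVERSION_TABLE = {0: 0, 1: 12, 2: 25, 3: 39, 4: 50}  # raw score -> scale score (illustrative values)

def score_document(student_id, mc_responses, cr_points):
    """Combine machine-scored MC items with teacher-scored CR points,
    then convert the raw total to a scale score."""
    mc_raw = sum(1 for item, choice in mc_responses.items()
                 if ANSWER_KEY.get(item) == choice)
    raw = mc_raw + cr_points
    return {
        "student_id": student_id,            # read from the pre-slugged bar code
        "raw_score": raw,
        "scale_score": CONVERSION_TABLE[raw],
    }

# Example: items 1 and 3 answered correctly, 1 point earned on constructed response
print(score_document("123456789", {1: "C", 2: "B", 3: "D"}, 1))
# {'student_id': '123456789', 'raw_score': 3, 'scale_score': 39}
```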


Image Scanners Currently Utilized by Schools


Lexmark X656DE
• Capacity: 75 pages
• Standalone scanner (direct network connection)
• Most commonly used scanner

Fujitsu fi-6670
• Capacity: 200 pages
• Connected via desktop


NYCDOE Answer Docs

Subject               # of Pages
English                        1
Integrated Algebra             1
US History                     2
Global Studies                 2
Living Environment             2
Earth Science                  2


Note:
• NYCDOE will require use of pencil (not pen) for students and raters

• Double-sided answer documents posed image capture challenges


NYCDOE Reports for Schools
All schools receive immediate reports detailing performance (available on screen, with options to print or export to MS Excel):

• Summary / Status Reports
• Item Level Reports
• Item Distribution Reports
• Change Reports
• Omission/Multiple Reports (a sketch of the omission/multiple classification follows)
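The Omission/Multiple Reports flag items where the scanner read no mark or more than one mark for a single MC item. Below is a minimal, hypothetical sketch of that classification; the field names are assumptions, and the actual report logic lives inside ATS.

```python
# Hypothetical sketch of the omission/multiple classification behind those reports.
# Names are assumptions; the actual report logic lives inside ATS.

def classify_item(marks):
    """marks: the set of bubbles the scanner detected for one MC item."""
    if not marks:
        return "OMIT"         # no bubble detected (e.g., a light mark)
    if len(marks) > 1:
        return "MULT"         # more than one bubble detected (e.g., a stray line)
    return next(iter(marks))  # the single detected response

responses = {1: {"B"}, 2: set(), 3: {"A", "D"}}
print({item: classify_item(marks) for item, marks in responses.items()})
# {1: 'B', 2: 'OMIT', 3: 'MULT'}
```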


Utilization Metrics – June 2011

Exam                  Schools Administering   Documents Printed   Documents Scanned   % Scanned / Printed
Earth Science                           452              51,278              46,787                 91.2%
English                                 477              70,631              65,768                 93.1%
Global History                          486             101,678              99,648                 98.0%
Integrated Algebra                      781             117,608             113,989                 96.9%
Living Environment                      589              94,057              88,522                 94.1%
US History                              491              94,848              93,111                 98.2%
ALL EXAMS                                 –             530,100             507,825                 95.8%

Exams Administered:     1     2    3    4     5     6   Total
Number of Schools:    136   157   33   30   141   317     814

• Nearly 96% of answer documents generated (> 507K exams) were successfully scanned.

• 814 unique DBNs administered at least one scanned Regents exam.
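The "% Scanned / Printed" column is simply documents scanned divided by documents printed; the June figures in the table above can be recomputed directly:

```python
# Recomputing the "% Scanned / Printed" column from the June 2011 table above.
june = {
    "Earth Science":      (51_278, 46_787),
    "English":            (70_631, 65_768),
    "Global History":     (101_678, 99_648),
    "Integrated Algebra": (117_608, 113_989),
    "Living Environment": (94_057, 88_522),
    "US History":         (94_848, 93_111),
}
for exam, (printed, scanned) in june.items():
    print(f"{exam:<20} {scanned / printed:6.1%}")

total_printed = sum(p for p, _ in june.values())
total_scanned = sum(s for _, s in june.values())
print(f"{'ALL EXAMS':<20} {total_scanned / total_printed:6.1%}")  # ~95.8%
```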


Utilization Metrics – August 2011
• Approximately 145K exams (≈197K pages) were successfully scanned, resulting in a scale score, ABS, or INV record.

Exam                  Schools Administering   Documents Printed   Documents Scanned   % Scanned / Printed
Earth Science                           165              10,228               9,416                 92.1%
English                                 303              17,937              17,185                 95.8%
Global History                          314              33,404              32,437                 97.1%
Integrated Algebra                      320              34,857              33,313                 95.6%
Living Environment                      304              25,074              23,749                 94.7%
US History                              301              23,348              22,789                 97.6%
ALL EXAMS                               338             144,848             138,889                 95.9%

• 338 unique DBNs administered at least one scanned Regents exam.


Implementation Highlights
• More than 674K exams successfully processed end-to-end, with a high (96%) scanned/printed ratio across 800+ schools.

• New technology solution met SED requirements

• Data output: general excitement at both the school level & central office

• Item-level data available for instructional and planning purposes

• Image capture and data string available for audit purposes

• Time Savings: teachers were pleased not to have to score everything themselves.

• 2.5 minutes per exam × 674,000 exams scanned ≈ 28,083 hours of manual data-entry work saved (see the quick check below)

• Schools’ and networks’ familiarity with Regents Scanning from June cycle resulted in fewer process errors and support needs in August.
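The time-savings figure quoted above is straightforward arithmetic; a quick check:

```python
# Quick check of the time-savings estimate quoted above.
minutes_per_exam = 2.5
exams_scanned = 674_000
hours_saved = minutes_per_exam * exams_scanned / 60
print(f"{hours_saved:,.0f} hours saved")  # 28,083 hours
```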


General Challenges
• High volumes of scanning (2x the daily average) put strain on the IT mainframe and downstream systems. Systems bent, but did not break.
  • Process and IT improvements are planned for future testing cycles (e.g., additional mainframe space, re-prioritization of DBA activity during scanning week).

• Time lags in applying data from the scanner to the Student Information System were at times confusing to schools, and problematic for downstream systems.

• SED errata corrections posed new challenges for correcting previously scanned scores; different answer keys/conversion tables had to be created for each language edition (a rescoring sketch follows this list).

• Students using the wrong answer documents / mismatched exams

• NYC students testing outside of NYC needed to use different answer documents
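When SED issues an erratum, exams that were already scanned have to be rescored against the corrected key and conversion table, selected per language edition. The following is a minimal sketch of that re-application under assumed record and field names; the deck does not describe the actual ATS correction job.

```python
# Hypothetical sketch: re-apply a corrected answer key / conversion table
# (chosen per language edition) to exams that were already scanned.

def rescore(scanned_exams, keys_by_language, conversions_by_language):
    for exam in scanned_exams:
        key = keys_by_language[exam["language"]]
        conversion = conversions_by_language[exam["language"]]
        mc_raw = sum(1 for item, choice in exam["mc_responses"].items()
                     if key.get(item) == choice)
        raw = mc_raw + exam["cr_points"]
        exam["raw_score"], exam["scale_score"] = raw, conversion[raw]
    return scanned_exams
```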


General Challenges (continued)
• Rescanning: many schools had to re-scan answer documents and therefore requested a time extension on scanning. Key reasons include:
  • Light bubble marks
  • Multiple marks picked up / stray lines
  • Clerical errors on teacher-scored sections (all exams)
  • Light toner
  • Incorrect printer settings

• NYC created an online tool to help address these rescanning issues (an illustrative audit-record sketch follows).
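Both re-scans and the online correction tool leave an audit trail of any change to a captured response, as noted on the process-overview slide. A minimal sketch of what one audit record might hold is shown below; the field names are assumptions, not the tool's actual schema.

```python
# Hypothetical audit-trail record for a correction made through the online tool.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AnswerCorrection:
    student_id: str
    exam: str
    item: int
    scanned_value: str     # what the scanner captured (e.g., "OMIT" for a light mark)
    corrected_value: str   # the student's intended response
    corrected_by: str      # principal (or designee) making the change
    corrected_at: datetime
    reason: str            # e.g., "light bubble mark", "stray line"

fix = AnswerCorrection("123456789", "Integrated Algebra", 17, "OMIT", "B",
                       "Principal", datetime.now(), "light bubble mark")
print(fix)
```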

• Policy issues: the majority of schools requested clarification on various administrative and policy topics, including:

  • Pre-populating accommodations onto answer documents
  • What is allowed for transcription (e.g., paper rips, paper won’t scan, student gets sick)
  • Policy on scanning “absent” students or students not finishing science lab requirements
  • “Did not meet lab requirement” for science exams was a common question from schools
  • Earth Science: performance score but no written test – what is the score?
  • Scanning before the constructed-response section is scored (different for 1- vs. 2-page exams)


Process Improvements
For upcoming Regents administrations, we are implementing the following process improvements:
1. Creating a scannable, generic answer document
2. Incorporating “Did not meet lab requirement” on science exams
3. Changing the scoring algorithm so that any response will override the absent bubble (sketched below)
4. For scanning reports, the system will not display the raw/scale score until both pages are scanned
5. NYC created an online tool to allow principals to correct an answer if the scanner did not pick up the intended response (e.g., light mark, stray mark)
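Improvement 3 changes how the absent bubble interacts with captured answers: any detected response wins. A minimal sketch of that rule, under assumed names, follows.

```python
# Hypothetical sketch of improvement #3: any captured response overrides the absent bubble.

def resolve_attendance(absent_bubble_marked, responses):
    """Treat the student as having tested if the scanner picked up any answer,
    even when the absent bubble was (mistakenly) filled in."""
    if any(marks for marks in responses.values()):
        return "TESTED"
    return "ABS" if absent_bubble_marked else "NO_RESPONSE"

print(resolve_attendance(True, {1: {"A"}, 2: set()}))  # TESTED: response overrides absent
print(resolve_attendance(True, {1: set(), 2: set()}))  # ABS
```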