
Page 1: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

A State Perspective on Enhancing Assessment & Accountability Systems through Systematic Integration of Computer Technology

Joseph A. Martineau, Ph.D.
Vincent J. Dean, Ph.D.
Michigan Department of Education

Presentation at the tenth annual Maryland Assessment Conference
October 2010

Page 2: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The Michigan Stage

Michigan offers an interesting perspective
◦ Pilot in 2006
◦ Pilot in 2011 (English Language Proficiency)
◦ Pilot in 2012 (Alternate Assessments)
◦ Pilots leading up to operational adoption of SMARTER/Balanced Assessment Consortium products in 2014-15
◦ Constitutional amendment barring unfunded mandates

Page 3: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage

Survey of state testing directors (+D.C.)
◦ 50 responses, plus one investigation via a state department of education website
◦ 7 of 51 states have no CBT initiatives
◦ 44 of 51 states have current CBT initiatives, including:
   Operational online assessment
   Pilot online assessment
   Plans for moving online

Page 4: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

Survey of state testing directors (+D.C.)
◦ CBT initiatives include:
   Teacher entry of student responses online
   Student entry of responses online
   Paper-and-pencil (P&P) replication
   Computer adaptive testing (CAT)
   Artificial intelligence (AI) scoring
   Multiple choice (MC) via internet, constructed response (CR) via paper and pencil
   General populations (grade level and end of course)
   Special populations (eases infrastructure concerns): modified, alternate, English language proficiency
   Online repository and scoring of portfolio materials
   Item banks for flexible unit-specific interim assessment
◦ Initiatives are all over the board, piecemeal for the most part

Page 5: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

Survey of state testing directors (+D.C.)
◦ Of the 44 states with some initiative:
   26 states currently administer large-scale general populations assessments online
   15 states have plans to begin (or expand) online administration of large-scale general populations assessments
   12 states currently administer special populations assessments online
   3 states have plans to begin (or expand) online administration of special populations assessments

Page 6: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

Survey of state testing directors (+D.C.)
◦ Of the 44 states with some initiative:
   7 states currently use Artificial Intelligence (AI) scoring of constructed response items
   4 states currently use Computer Adaptive Testing (CAT) technology for general populations assessment, with one more moving in that direction soon
   0 states currently use CAT technology for special populations assessment
   10 states offer online interim/benchmark assessments
   10 states offer online item banks accessible to teachers for creating "formative"/interim/benchmark assessments tailored to unique curricular units

Page 7: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

Survey of state testing directors (+D.C.)
◦ Of the 44 states with some initiative:
   6 states offer computer based testing (CBT) options on general populations assessments as an accommodation for special populations
   4 states report piloting and administration of innovative item types (e.g., flash-based modules providing mathematical tools such as protractors, rulers, compasses)
   16 states offer End of Course (EOC) tests online, or are implementing online EOC testing in the near future
   6 states report substantial failure of a large-scale online testing program resulting in cessation of computer based testing
     Some have recovered and are moving back online
     Others have no plans to return to online testing

Page 8: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

Development of the Common Core State Standards (CCSS)
◦ Content standards (not a test)
   English Language Arts (K-12)
   Mathematics (K-12)
◦ Developed with backing from 48 states
◦ Adoption tally
   Adopted in full by 39 states
   Adoption declined in 5 states
   Adoption expected by remaining 6 states by end of 2011

Page 9: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

Assessment Consortia
◦ Race to the Top Assessment Competition
◦ Development of infrastructure and content for common assessments measuring the CCSS in English Language Arts and Mathematics
◦ Two consortia
   SMARTER/Balanced Assessment Consortium (SBAC)
   Partnership for Assessment of Readiness for College and Careers (PARCC)

Page 10: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, continued…

The consortia:
◦ SMARTER/Balanced (SBAC)
   31 states
   17 governing states
   CAT beginning in 2014-15
◦ PARCC
   26 states
   11 governing states
   CBT beginning in 2014-15

Page 11: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Consortia Membership

[Map of consortium membership by state, including D.C. Legend: SBAC Governing, PARCC Governing, SBAC Advisory, PARCC Advisory, Both Advisory, None.]


Page 12: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The National Stage, Summary

State efforts have been, with few exceptions, piecemeal by…
◦ Program
◦ Content area
◦ Grade level
◦ Type of assessment (summative, interim, formative)
◦ Population (general, modified, alternate)

Most states are…
◦ Involved in some kind of pilot or operational use
◦ Intending to be operational on a large scale by 2014-15
◦ Experiencing budget crises…
   that make transitions difficult
   that make efficiencies of technology integration critical

There is a strong need to take a systems look at how to integrate computer technology into assessment and accountability systems

Technology integration is a significant opportunity to provide a platform that connects all initiatives

Page 13: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The Organizing Framework for this Paper

From…
◦ Martineau, J. A., & Dean, V. J. (in press). Making assessment relevant to students, teachers, and schools. In V. Shute & B. J. Becker (Eds.), Innovative Assessment for the 21st Century: Supporting Educational Needs. New York: Springer-Verlag.
◦ Figure 1

Page 14: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

[Figure 1. A layered assessment and accountability system. Layers, from the foundations up: Professional Development; Content & Process Standards; Classroom Formative Assessment; Classroom Summative Assessment; Secure Adaptive Interim Assessment; Secure Adaptive Summary Assessment; Accountability. Components shown in the figure:]

◦ Limited number of high-school-exit standards
◦ Limited number of K-12 content/process standards
◦ Model curriculum/instruction units
◦ Classification of content & process standards for measurement purposes by:
   Response type: on-demand timed; on-demand untimed; feedback looped
   Task type: selected response; short constructed response; extended constructed response; performance event
   Setting: classroom only; classroom and secure
◦ Assessment literacy standards for educator certification
◦ Assessment literacy training requirements for: teachers; consultants; leaders
◦ Pre- and in-service balanced assessment training on: content standards; classroom assessment (formative, summative); large-scale assessment (benchmark, summative); assessment data use for decision making; subjective item scoring
◦ Ongoing support for implementation in the form of school teams and coaches (for observation and followup)
◦ Model classroom formative & summative assessment strategies & materials
◦ Online classroom assessment strategies & materials clearinghouse for educators
◦ Learning progressions
◦ Repeatable, on-demand, customizable, online unit assessments
◦ End of year, on-demand summary assessment (if needed)
◦ Portfolio description (feedback-looped tasks)
◦ Overall achievement & growth scores
◦ Scoring (maximize objective, distribute subjective)
◦ Summative classroom assessments
◦ Unit achievement scores
◦ Classroom achievement scores
◦ Portfolio development & submission
◦ Formative assessment implementation
◦ Teacher prep institution accountability (e.g., accreditation) for pre-service PD
◦ SEA & LEA accountability (e.g., accreditation) for in-service PD
◦ Educator accountability (e.g., evaluation, performance pay) for implementation of classroom assessment & data use practices
◦ Educator accountability (e.g., evaluations, performance pay) for individual student achievement & growth scores on secure summative assessments
◦ Student accountability (e.g., grades, course credit) for classroom (and possibly secure interim) summative scores
◦ Growth scores based on learning progressions

Page 15: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

[Figure: the same framework rendered as a building, with the layers described as follows.]

◦ Professional Development as Footings
◦ Content and Process Standards as Foundation
◦ Classroom Formative Assessment as the Ground Floor
◦ Classroom Summative Assessment Layered on Formative Assessment
◦ Secure Adaptive Interim Assessment as a Policy and Accountability Metric (including Within-Year Growth Modeling) that Makes Sense Only when the Foundational Layers are in Place
◦ Secure Adaptive Summary Assessment as a Policy and Accountability Metric (including Cross-Year Growth Modeling)
◦ Accountability as Protective Umbrella Over the Complete System; Makes Sense Only when All Layers Below are in Place

Page 16: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

[Figure 1 repeated; see Page 14 for the component list.]

Page 17: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Entry Points

◦ Limited number of high-school-exit standards
◦ Assessment literacy standards for educator certification

Page 18: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Outcomes

◦ Overall achievement & growth scores
◦ Unit achievement scores
◦ Classroom achievement scores
◦ Formative assessment implementation
◦ Growth scores based on learning progressions

Page 19: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

The Organizing Framework for this Paper, continued…

With a comprehensive system in place, it is possible to identify comprehensively where integration of technology will enable and enhance the system

Components identified with bold outlines on the next slide

Page 20: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

[Figure 1 repeated, with the components targeted for technology integration marked with bold outlines; see Page 14 for the component list.]

Page 21: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Starting from the Bottom Up

Professional Development
◦ Current lack of pre-service and in-service balanced assessment training
◦ Need for rapid scale up to millions of educators on a small budget

Pre- and in-service balanced assessment training on: content standards; classroom assessment (formative, summative); large-scale assessment (benchmark, summative); assessment data use for decision making; subjective item scoring

Ongoing support for implementation in the form of school teams and coaches (for observation and followup)

Page 22: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Technology Integration into Pre- and In-Service Professional Development

Scaling up is only feasible with integral use of technological tools
◦ High-quality online courses
◦ Social networking among educators
◦ Live tele-coaching
◦ Electronic (graphic, audio, video) capture for distance streaming of materials, plans, and instructional practice vignettes over high speed networks, to facilitate discussion regarding instructional practice between:
   candidates and instructor/coach
   candidates and mentor
   mentors and instructor/coach
◦ For example, repurposing Idaho's special portfolio submission system for educator training

Page 23: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Moving to Content & Process Standards

◦ Start with a limited set of high school exit standards based on college and career readiness
◦ From that, develop K-12 content/process standards in a logical progression to college and career readiness
◦ Based on the learning progressions and K-12 content/process standards, develop model instructional materials

Model curriculum/instruction units

Page 24: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Model Instructional Materials Clearinghouse

Develop an online clearinghouse of materials for model curriculum and instructional units
◦ Lesson plans
◦ Lesson materials
◦ Video vignettes of high quality instructional practices based on those units
◦ Flexible platform to accept user submissions in a variety of formats
◦ User-moderated ratings of submission quality

Page 25: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Moving to Assessment Practices

Before actually moving into assessment practices, it is important to classify content standards in three ways:
◦ Timing
   On-demand, time limited
   On-demand, not time limited
   Feedback-looped
◦ Task type
   Selected response
   Short constructed response
   Extended constructed response
   Performance events
◦ Setting
   Classroom only
   Classroom and secure

Based on these classifications, several types of assessment take place
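As an illustration of how this three-way classification could be represented in an assessment system's data layer, here is a minimal Python sketch; the example standard code and the routing rule at the end are assumptions added for illustration, not part of the framework itself.

from dataclasses import dataclass
from enum import Enum

class Timing(Enum):
    ON_DEMAND_TIMED = "on-demand, time limited"
    ON_DEMAND_UNTIMED = "on-demand, not time limited"
    FEEDBACK_LOOPED = "feedback-looped"

class TaskType(Enum):
    SELECTED_RESPONSE = "selected response"
    SHORT_CONSTRUCTED_RESPONSE = "short constructed response"
    EXTENDED_CONSTRUCTED_RESPONSE = "extended constructed response"
    PERFORMANCE_EVENT = "performance event"

class Setting(Enum):
    CLASSROOM_ONLY = "classroom only"
    CLASSROOM_AND_SECURE = "classroom and secure"

@dataclass
class ContentStandard:
    code: str          # identifier such as "MA.HS.A.1" (hypothetical)
    description: str
    timing: Timing
    task_type: TaskType
    setting: Setting

def eligible_for_secure_online_unit_test(std: ContentStandard) -> bool:
    """Standards that are not feedback-looped and are allowed in a secure setting
    can be routed to secure online assessment; feedback-looped standards are
    routed to classroom/portfolio assessment instead (illustrative rule)."""
    return (std.timing is not Timing.FEEDBACK_LOOPED
            and std.setting is Setting.CLASSROOM_AND_SECURE)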

Page 26: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Assessment Practices, continued…

Start with model classroom materials and tools

Initial development of model materials, vignettes, strategies, and tools sets the stage for…

Model classroom formative & summative assessment strategies & materials

Online classroom assessment strategies & materials clearinghouse for educators

Page 27: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Educator Submissions to…

Populate online clearinghouse of materials for model classroom assessment practice units
◦ Summative assessment materials
◦ Formative assessment vignettes, strategies, and tools
◦ Flexible platform to accept user submissions in a variety of formats
◦ User-moderated ratings of submission quality

Non-secure item bank generated by educators
◦ Platform supports various item types
◦ User-moderated ratings of submission quality
◦ Large enough that security is not a concern
◦ Empirically designed MC items
◦ Fully customizable
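To make the idea of an educator-generated, user-moderated item bank concrete, here is a minimal sketch of a bank record and a quality filter; the field names, the 1-5 rating scale, and the quality threshold are assumptions for illustration.

from dataclasses import dataclass, field
from statistics import mean
from typing import Optional

@dataclass
class ItemSubmission:
    """One educator-submitted item in the non-secure bank (illustrative fields only)."""
    item_id: str
    standard_code: str            # which content/process standard the item targets
    item_type: str                # e.g., "selected response", "short constructed response"
    author_id: str
    ratings: list = field(default_factory=list)   # user-moderated quality ratings, 1-5

    def add_rating(self, score: int) -> None:
        if not 1 <= score <= 5:
            raise ValueError("ratings are on a 1-5 scale in this sketch")
        self.ratings.append(score)

    @property
    def quality(self) -> Optional[float]:
        """Mean moderated rating; None until at least one rating is recorded."""
        return mean(self.ratings) if self.ratings else None

def searchable(items, standard_code, min_quality=3.5):
    """Return bank items for a standard that meet a minimum moderated-quality threshold."""
    return [i for i in items
            if i.standard_code == standard_code
            and i.quality is not None and i.quality >= min_quality]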

Page 28: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Which in Turn Leads to…

Implementation of formative assessment practices enhanced by technological aids, such as
◦ Response devices (e.g., clickers, tablet computers, phones)
◦ Rapid response to teacher queries over online systems
◦ Remote response to formative queries (e.g., rural areas and cyberschools)

Online classroom assessment strategies & materials clearinghouse for educators

Summative classroom assessments

Formative assessment implementation

Page 29: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Which in Turn Leads to…

Selection or development of summative classroom assessments
◦ On-demand micro-benchmark (small unit) assessments
◦ From non-secure item bank generated by educators
◦ Customizable to fit specific lesson plans/curricular documents
◦ Instant reporting for diagnostic/instructional intervention purposes
◦ Inform targeted professional development in real time
◦ RESULTS NOT used for large-scale accountability purposes (results belong to the schools and teachers)

Online classroom assessment strategies & materials clearinghouse for educators

Summative classroom assessments

Formative assessment implementation

Page 30: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

With High-Quality Classroom Assessment Practices in Place

Large-scale assessment now makes sense, with three types of large-scale assessment:

Repeatable, on-demand, customizable, online unit assessments

End of year, on-demand summary assessment (if needed)

Portfolio development & submission

Page 31: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Large-Scale Assessment, continued…

Start with the classroom-based component
◦ For content standards best measured using "feedback-looped" tasks
◦ Meaning content standards (likely higher order) that are best accomplished with a feedback cycle between teacher and student

Portfolio development & submission

Page 32: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Portfolio Development & Submission, continued…

◦ Creation of portfolio includes scannable materials, electronic documents, and/or audio/video of student performance
◦ Submitted via a secure online portfolio repository (e.g., Idaho's alternate assessment portfolio submission site)
◦ Unlikely to be scorable using AI; therefore, scored on a distributed online scoring system that prevents teachers from scoring their own students' portfolios (e.g., Idaho's alternate assessment portfolio scoring site)
◦ Can be scored both for final product and development over time

Portfolio development & submission
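A minimal sketch of how a distributed scoring system could enforce the conflict-of-interest rule (teachers never score their own students' portfolios); the field names, the two-reads-per-portfolio default, and the same-school exclusion are assumptions for illustration.

import random

def assign_portfolio_scorers(portfolios, scorers, raters_per_portfolio=2, seed=0):
    """Distribute portfolios to certified scorers while blocking conflicts of interest.

    portfolios: list of dicts like {"portfolio_id": ..., "teacher_id": ..., "school_id": ...}
    scorers:    list of dicts like {"scorer_id": ..., "teacher_id": ..., "school_id": ...}
    Each portfolio is routed to `raters_per_portfolio` scorers who are neither the
    student's own teacher nor (in this sketch) from the same school.
    """
    rng = random.Random(seed)
    assignments = {}
    for p in portfolios:
        eligible = [s for s in scorers
                    if s["teacher_id"] != p["teacher_id"]
                    and s["school_id"] != p["school_id"]]
        if len(eligible) < raters_per_portfolio:
            raise ValueError(f"not enough conflict-free scorers for {p['portfolio_id']}")
        assignments[p["portfolio_id"]] = [s["scorer_id"]
                                          for s in rng.sample(eligible, raters_per_portfolio)]
    return assignments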

Page 33: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Moving to Secure Online Testing

For content standards that do not require "feedback-looped" tasks

Dynamic online CAT assessments
◦ Based on dynamically selected clusters of content standards covered in instructional units
◦ Scaled to the same scale as the end-of-year assessment, with cut scores for mastery/proficiency
◦ Can move students on to higher grade level content once mastery/proficiency of all grade level content is demonstrated through unit assessments
◦ What the Race to the Top Assessment Competition calls "Through-Course Assessment"

Repeatable, on-demand, customizable, online unit assessments
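For readers unfamiliar with CAT mechanics, a minimal sketch of the adaptive loop (estimate ability, administer the most informative remaining item, re-estimate) under a Rasch (1PL) model follows; the item-bank fields and the simple gradient-ascent ability update are illustrative assumptions, not any consortium's operational algorithm.

import math

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at ability theta."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def next_item(item_bank, administered_ids, theta):
    """Pick the unused item with maximum information at the provisional ability."""
    candidates = [i for i in item_bank if i["item_id"] not in administered_ids]
    return max(candidates, key=lambda i: item_information(theta, i["difficulty"]))

def update_theta(theta, responses, items_by_id, lr=0.5, steps=25):
    """Crude maximum-likelihood ability update via gradient ascent.
    responses: list of (item_id, score) pairs with score in {0, 1}."""
    for _ in range(steps):
        grad = sum(score - rasch_prob(theta, items_by_id[item_id]["difficulty"])
                   for item_id, score in responses)
        theta += lr * grad / max(len(responses), 1)
    return theta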

Page 34: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Moving to Secure Online Testing

◦ What the Race to the Top Assessment Competition calls "Through-Course Assessment"
◦ Provides advance look at trajectory toward proficiency
◦ Provides multiple opportunities to demonstrate proficiency
◦ More equitable for high-stakes accountability purposes
◦ Useful for mid-year correction in instructional practice (e.g., Response to Intervention)
◦ Useful for placement purposes of newly arrived students
◦ Useful for differentiated instruction
◦ Anticipated increase in educator motivation (because of timely information)

Repeatable, on-demand, customizable, online unit assessments

Page 35: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Moving to Secure Online Testing

Beyond traditional CAT/CBT
◦ AI scoring of constructed response items
◦ Technology enhanced items
◦ Performance tasks/events (through simulations)
◦ Gaming type items

Repeatable, on-demand, customizable, online unit assessments

Page 36: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Moving to Secure Online Testing

For three groups of students…
1. Initial scaling and calibration group
2. Ongoing randomly selected validation groups (to validate that students proficient on all required unit tests retain proficiency at the end of the year)
3. Students who do not achieve proficiency on all required unit tests
   Final opportunity to demonstrate overall proficiency if proficiency was in question on any single unit assessment

Allows for the elimination of a single end-of-year test for most students

End of year, on-demand summary assessment (if needed)
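A minimal sketch of the routing logic these three groups imply, i.e., deciding which students still sit the end-of-year summary assessment; the record fields and the 5% validation-sample rate are assumptions for illustration.

import random

def needs_end_of_year_test(student, calibration_ids, validation_rate=0.05, rng=random):
    """Decide whether a student sits the end-of-year summary assessment.

    student: dict like {"student_id": ..., "unit_results": {"unit1": True, ...}}
             where each unit result records whether proficiency was demonstrated.
    calibration_ids: set of students drawn for initial scaling/calibration.
    validation_rate: share of already-proficient students re-tested to confirm
                     that unit-level proficiency holds up at year's end.
    """
    if student["student_id"] in calibration_ids:
        return True                                    # group 1: scaling/calibration
    if not all(student["unit_results"].values()):
        return True                                    # group 3: proficiency still in question
    return rng.random() < validation_rate              # group 2: random validation sample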

Page 37: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Scoring

Maximize objective scoring by
◦ Automated scoring of objective items
◦ AI scoring of extended written response items, technology enhanced items, and performance tasks wherever possible
◦ Distributed hand-scoring of tasks not scorable using AI

Scoring (maximize objective, distribute subjective)
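As a rough illustration of what "AI scoring" of written responses involves, the sketch below fits a simple text-features regression to a hand-scored training set and then predicts scores for new responses; operational engines use far richer features and calibration, and the 0-4 rubric range is an assumption.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

def train_ai_scorer(train_texts, train_scores):
    """Fit a TF-IDF + ridge-regression scorer on human-scored responses."""
    vectorizer = TfidfVectorizer(ngram_range=(1, 2), min_df=2)
    X = vectorizer.fit_transform(train_texts)
    model = Ridge(alpha=1.0)
    model.fit(X, train_scores)
    return vectorizer, model

def ai_score(vectorizer, model, texts, min_score=0, max_score=4):
    """Score new responses and clip predictions to the rubric's score range."""
    preds = model.predict(vectorizer.transform(texts))
    return [min(max(p, min_score), max_score) for p in preds]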

Page 38: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Distributed Scoring as Professional Development

Human scorers taken from the ranks of educators
◦ Online training on hand-scoring
◦ Online certification as a hand-scorer
◦ Online monitoring of rater performance
◦ Validation hand-scoring of samples of AI-scored tasks

Our experience with teacher-led scoring and range-finding indicates that it is some of the best professional development that we provide to educators
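A minimal sketch of online rater monitoring: compare each rater's scores against validation reads (second human reads or AI reads of the same responses) and flag raters for retraining; the agreement thresholds are illustrative assumptions.

def rater_agreement(paired_scores):
    """Summarize a rater's agreement with validation reads.

    paired_scores: list of (rater_score, validation_score) pairs.
    Returns exact- and adjacent-agreement rates, two common monitoring metrics.
    """
    if not paired_scores:
        raise ValueError("no scored responses to evaluate")
    n = len(paired_scores)
    exact = sum(1 for r, v in paired_scores if r == v)
    adjacent = sum(1 for r, v in paired_scores if abs(r - v) <= 1)
    return {"n": n, "exact": exact / n, "adjacent": adjacent / n}

def flag_for_retraining(paired_scores, min_exact=0.65, min_adjacent=0.95):
    """Flag a rater whose agreement falls below illustrative thresholds."""
    stats = rater_agreement(paired_scores)
    return stats["exact"] < min_exact or stats["adjacent"] < min_adjacent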

Page 39: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Reporting

For the most part, reports are difficult to read and poorly used

Need online reporting of all scores for all stakeholders, including:
◦ Policymakers (aggregate)
◦ Administrators (aggregate and individual)
◦ Teachers (aggregate and individual)
◦ Parents (aggregate and individual)
◦ Students (individual)

Page 40: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Reporting Portal

The reporting portal needs to be able to integrate reports from classroom metrics all the way to large-scale secure assessment metrics

Overall achievement & growth scores

Unit achievement scores

Classroom achievement scores

Growth scores based on learning progressions
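A minimal sketch of the kind of integration the portal requires: merging classroom, unit, overall, and growth scores into an individual report while exposing only group aggregates to aggregate-level stakeholders; the record fields are assumptions for illustration.

from statistics import mean
from collections import defaultdict

# Each record ties one score to a student, a metric type, and a grouping unit.
# metric is one of "classroom", "unit", "overall", "growth"; group_id might be
# a class, school, or district (illustrative field names).

def student_report(records, student_id):
    """Individual view (students, parents, teachers): one student's scores by subject and metric."""
    report = defaultdict(list)
    for r in records:
        if r["student_id"] == student_id:
            report[(r["subject"], r["metric"])].append(r["score"])
    return {key: mean(scores) for key, scores in report.items()}

def aggregate_report(records, group_id):
    """Aggregate view (policymakers, administrators): group means by subject and metric."""
    groups = defaultdict(list)
    for r in records:
        if r["group_id"] == group_id:
            groups[(r["subject"], r["metric"])].append(r["score"])
    return {key: mean(scores) for key, scores in groups.items()}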

Page 41: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Reporting Portal

Reporting cycles depend on the item types and the application of AI scoring
◦ Immediate where possible
◦ Expedited hand-scoring (shifting funding focus from printing, shipping, and scanning to on-demand hand-scoring)

Overall achievement & growth scores

Unit achievement scores

Classroom achievement scores

Growth scores based on learning progressions

Page 42: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Where the Rubber Hits the Road

This is a nice system design (if we do say so ourselves), but what are the impediments to implementation?

Infrastructure
◦ LEA hardware and bandwidth capacity
◦ Assessment vendor capacity
◦ Moving from piecemeal components to an integrated, coherent system
◦ Development of educator-moderated clearinghouses
◦ Development of an educator-moderated item bank

Page 43: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Where the Rubber Hits the Road

Security
◦ The more high-stakes the system, the more likely security breaches become
◦ Critical need for training on user roles
◦ Critical need for training on data use, since data will become much more readily available across the board
◦ Security controls versus open-source and maximal access

Page 44: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Where the Rubber Hits the Road

Funding
◦ Very high initial startup investment
◦ Dual systems during development and initial implementation
◦ Ramping up LEA technology systems to be capable of working within the system

Page 45: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Where the Rubber Hits the Road

Sustainability
◦ Requires perpetual investment in administration
◦ Development is only the start (e.g., sustainability concerns regarding RTTT-funded assessment consortia)
◦ Requires early success and public understanding of the benefits of the system weighed against ongoing costs
◦ Recurring hardware/software technology upgrade costs for LEAs
◦ Recurring hardware/software technology maintenance costs for central IT systems

Page 46: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Where the Rubber Hits the Road

Local Control
◦ This kind of system is only possible to create with significant funding and local buy-in
◦ No single state (let alone district) could afford the cost of development and implementation
◦ Consortia are imperative to creating such a system
   Consortia can tend toward self-perpetuation rather than serving their members
   Consortia cannot ignore local nuances
   Consortia cannot ignore reasonable needs for flexibility
   Consortia must monitor and maximize member investment

Page 47: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Where the Rubber Hits the Road

Building an appetite for online systems
◦ Implementation may occur piecemeal, but should be undertaken within a framework for a coherent and complete system
◦ Each piece, when implemented, needs to be implemented in such a way that local educators and policymakers see a positive impact on the educational system, e.g.:
   Immediate turnaround of results
   Connection between family and school
   Improved instructional practice
   Facilitation of differentiated instruction

Page 48: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Recommendations for Future Directions

The system has the potential to make us data-rich and analysis-poor
◦ Build local (SEA and LEA) capacity for appropriate analysis (possibly through re-defining positions that might be eliminated through consortia services)
◦ New practices (e.g., through-course assessment, innovative item types, AI scoring) will require a significant research and validation agenda, including:
   Equating
   Comparability
   Standard setting

Page 49: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Recommendations for Future Directions

The system has the potential to make educators and students data-rich
◦ Portfolios of assessment results and products as evidence of students' college and career readiness
◦ Portfolios of assessment results and products as evidence of teacher classroom practices and effectiveness

Page 50: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Recommendations for Future Directions

Financial incentives from ARRA/RTTT have provided the impetus for some of these initiatives to get started

Sustainability needs to be a focus both within and across states

To maximize cross-state focus, we recommend continued significant funding of initiatives through ESEA reauthorization, Enhanced Assessment Grants, and other competitive/formula funding opportunities

Page 51: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Recommendations for Future Directions

Scoring of competitive consortium applications should be weighted toward…
◦ The development of integrated systems across all aspects of assessment & accountability
◦ Significant and rigorous research, development, and evaluation of the validity and impact (intended and unintended consequences) of system development and implementation

Formula funding should stipulate collaboration in system development

Use of formula funding guarantees…
◦ Continued focus on students with the greatest needs
◦ Access to quality systems for states without strong resources for writing competitive grants

Page 52: Joseph A. Martineau, Ph.D. Vincent J. Dean, Ph.D. Michigan Department of Education

Contact Information

Joseph A. Martineau, Ph.D.
◦ Director of Assessment & Accountability
◦ [email protected]

Vincent J. Dean, Ph.D.
◦ State Assessment Manager
◦ [email protected]

Michigan Department of Education