
Welcome to the Race to the Top Assessment Program

Technical Assistance Public Meeting

Creating Valid, Reliable, and Fair Assessments for Students with Disabilities & English Learners

Washington, DC
August 10, 2011

Please silence all cell phones and pagers.

Thank you!

Race to the Top Assessment (RTTA) Program Overview and

Meeting Goals

Joe Conaty Patrick Rooney

U.S. Department of Education

RTTA Public Meetings

This is the third in a series of public meetings on RTTA.
o Two prior meetings: April 15 on State and Local Technology Infrastructure and June 10 on Automated Scoring of Assessments
o Details on additional meetings will be forthcoming

Purpose of the meetings:
o To provide technical assistance to, and support the collaborative efforts of, PARCC and SBAC as they develop new assessment systems
o To expand the knowledge and expertise of the Department and the public around key assessment issues
o To facilitate discussion of key components of the systems with experts and the public at large

Funded in part by The William and Flora Hewlett Foundation

RTTA Program Goals

Support states in delivering a system of more effective and instructionally useful assessments that:
o Provide accurate information about what students know and can do by:
  Eliciting complex student demonstrations or applications of knowledge and skills, as appropriate
  Accurately measuring student achievement across the full performance continuum
  Accurately measuring student growth over a full academic year or course
  Helping educators determine whether individual students are ready for college and careers by the time of high school graduation and, in earlier grade levels, whether they are on track for readiness
o Reflect good instructional practice and support a culture of continuous improvement
o Effectively assess all students, including students with disabilities and English learners

Looking Forward

Assessment systems must include one or more summative assessment components that are fully implemented by every state in each consortium by SY 2014-15 and are administered at least once during the academic year in, at a minimum:
o Reading/language arts and mathematics
o Grades 3-8 and high school

Results used to inform:
o Teaching, learning, and program improvement
o Determinations of school effectiveness
o Determinations of principal and teacher effectiveness for the purposes of evaluation and support
o Determinations of individual student college and career readiness

RTTA Grantees

8/10/2011

• Nearly $360 million awarded in September 2010 to two consortia, which together represent 45 states and DC:
o Partnership for Assessment of Readiness for College and Careers (PARCC) – Project Management Partner: Achieve
o SMARTER Balanced Assessment Consortium (SBAC) – Project Management Partner: WestEd

• PARCC and SBAC have demonstrated commitment from institutions of higher education (IHEs) in member states that they will use the results of these assessments to determine entry into credit-bearing courses
o The IHEs represent 90% (PARCC) and 74% (SBAC) of students who matriculate directly from K-12

Students with Disabilities & English Learners

The absolute priority required the consortia to create assessments for all students, including English learners and students with disabilities

The consortia are required to develop tests accessible for these populations and to create and standardize accommodations policies

Each consortium must develop a definition of "English learner" that is uniform across member states

Additional consortia:
o Alternate assessments for students with the most significant cognitive disabilities
o English language proficiency

Expectations for the Meeting

We have invited a range of experts to this meeting to share their knowledge and experience with the consortia members, looking at both the current state of research and promising approaches to improving accessibility for these students

Format:
o The morning will focus on key questions that need to be addressed regarding the needs of students with disabilities and English learners in the assessment system, and possible methods to address those questions
o The afternoon will focus on two standards – one in English language arts and one in mathematics – in a practical application of the issues to creating valid, reliable, and fair assessment items for these populations

Meeting Agenda

9:00-9:35    Welcome/setting the stage
9:35-10:15   Fishbowl discussion
10:15-10:30  Break
10:30-Noon   Fishbowl discussion continued
Noon-1:00    Lunch
1:00-1:30    Fishbowl discussion of public comments
1:30-3:00    Table exercise
3:00-3:15    Public comments
3:15-3:30    Wrap-up
3:30         Adjourn

Invited Experts

Jamal Abedi, University of California, Davis

Lizanne DeStefano, University of Illinois

Rebecca Kopriva, Wisconsin Center for Educational Research

Mike Russell, Measured Progress

Stephen Sireci, University of Massachusetts, Amherst

Guillermo Solano-Flores, University of Colorado, Boulder


Public Comments

ED wants to hear from the public on key considerations for creating valid, reliable, and fair assessments for students with disabilities and English learners

In the morning:
o Comment cards are available at the registration desk
o ED, the consortia, and the experts will discuss the comments/questions as time allows at the start of the afternoon session
o All input from comment cards will be posted on our website

In the afternoon:
o We have scheduled time for verbal public comment from 3:00-3:15 pm
o Sign up to speak during the lunch break at the registration desk
o Time limit: up to 3 minutes per person/organization

Due to limited time, those not able to provide comments in person may email them to: [email protected]

Reminders

Please place all cell phones and other devices on vibrate

Race to the Top Assessment resources – applications, FAQs, and today's materials and transcription – available at:
http://www2.ed.gov/programs/racetothetop-assessment

The purpose of this event is to promote a full discussion and hear a wide range of viewpoints on creating valid, reliable, and fair assessments for English learners and students with disabilities, as well as the challenges and opportunities afforded by the Race to the Top Assessment program. Through this meeting, the U.S. Department of Education is not seeking to promote and/or endorse any particular program, project, methodology or approach to this work.


INTRODUCTIONS

Patrick Rooney
U.S. Department of Education

Meeting Facilitator

Accessibility and Accommodations

Deborah Matthews – Kansas – Accessibility & Accommodations Workgroup

[email protected]


• To develop a set of comprehensive and innovative assessments for grades 3-8 and high school in English language arts and mathematics aligned to the Common Core State Standards

• Students leave high school prepared for postsecondary success in college or a career through increased student learning and improved teaching

• The assessments shall be operational across Consortium states in the 2014-15 school year


1. Transition to Common Core State Standards

2. Technology Approach

3. Assessment Design: Item Development

4. Assessment Design: Performance Tasks

5. Assessment Design: Test Design

6. Assessment Design: Test Administration

7. Reporting

8. Formative Processes and Tools/Professional Development

9. Accessibility and Accommodations

10. Research and Evaluation


Purpose

Ensure the SBAC Assessment System is maximally accessible to the broadest range of students by
• identifying, recommending, and evaluating strategies, tools, and technologies, and thereby
• providing information and guidance that will positively impact critical aspects of assessment design and development

• A new paradigm that focuses on the student first, not the test items, and that addresses accessibility as part of item development, not as an afterthought.

• Computer-based assessment allows technology to open many doors for students because accessibility is built into the assessments.

• The necessity of accommodations is reduced, and the accommodations that are allowed are more targeted.

In both policy and practice, SBAC will
• include the broadest range of students
• by facilitating each student's ability to demonstrate as fully as possible what they know and can do
• on the targeted constructs being measured
• in a manner that is equitable and reliable, and yields valid interpretations of results.

1. Create policies that reflect current research, best practices, and future possibilities related to accessibility and accommodations

2. Create assessments that are free from bias and sensitivity issues, leveraging new technologies (including interoperability) while preserving test constructs

3. Create accessible and accommodated assessments that will yield valid and reliable results


4. Ensure accessibility and accommodations practice and policy are implemented with fidelity

5. Develop useful reporting and presentation guidelines that include information on accessibility and accommodations actions in the aggregate and at the individual student level


Michael Hock – Vermont – Accessibility & Accommodations Workgroup Co-Chair

[email protected]

Carver – Utah – Accessibility & Accommodations
[email protected]

Shelbi Cole – Connecticut – Performance Tasks [email protected]

Gaye Fedorchak – New Hampshire – Accessibility & Accommodations [email protected]

Viji Somasundaram – Wisconsin – Item Development [email protected]

Accessibility for Students

August 10, 2011
www.PARCConline.org

The PARCC Vision

• Create high-quality assessments that measure the full range of the Common Core State Standards
• Build a pathway to college and career readiness for all students and make accurate and reliable determinations as to whether students are "on track" or "ready" for college and careers
• Provide information that supports various accountability uses (e.g., school, educator, student)
• Provide timely and actionable information that supports continuous improvement in curriculum and instruction and informs effective classroom instruction and assessment practices
• Leverage technology for a variety of uses: innovative items, accommodations, administration, and scoring and reporting
• Report results that allow for comparability across all PARCC states, across consortia, and to national and international assessments

PARCC Accessibility Goals

The Partnership will:
• Work to minimize or eliminate features that are irrelevant to what is being measured, and measure the range of complexity of the standards, so that students can demonstrate their knowledge;
• Design each component in a manner that allows ELL students and students with identified needs to demonstrate what they know and can do;
• Apply principles of universal design for accessible assessments throughout every stage of developing assessment components, items, and performance tasks;
• Leverage technology to make delivered assessment components as widely accessible as possible; and
• Establish a Committee on Accessibility and Accommodations comprised of knowledgeable testing officials from member states (OWG).

PARCC Governance Structure

Steering Committee

PARCC Technical Working Groups (TWG)

• A limited number of groups convened by the TAC to address high-priority topics that would benefit from collective problem-solving by leading experts
• Comprised of domain-specific technical advisors who interact with leadership and working groups and report to the TAC

Accessibility, Accommodations, and Fairness TWG:
• Committee members represent a range of expertise in accessibility and accommodations
• Their role is to help guide the efforts of working groups in designing accessible assessments that remain true to the intended constructs

[Governance chart: Technical Advisory Committee (TAC), Technical Working Groups (TWG), Operational Working Groups (OWG), Leadership Team (LT)]

Accessibility, Accommodations, and Fairness TWG Invited Members

Diane August – Center for Applied Linguistics (ELL)

David Edyburn – University of Wisconsin-Milwaukee (SWD)

Claudia Flowers – University of North Carolina at Charlotte (SWD)

Dianne Piche – Leadership Conference on Civil Rights

Charlene Rivera – George Washington University (ELL)

Diane Spence – Region 4 Education Service Center, Braille Services (Braille)

Martha Thurlow – National Center on Educational Outcomes (SWD)

Dan Wiener (Chair) – Massachusetts Department of Elementary and Secondary Education (accommodations for state assessments)

Gerunda Hughes will serve as the liaison to the PARCC Technical Advisory Committee (TAC).

PARCC Operational Working Groups (OWG)

• Who: Comprised of state representatives, Achieve staff members, and eventually vendor representatives

• What: Responsible for the day-to-day work on key components

• Why: To ensure efficient and effective collaboration among PARCC members to meet PARCC goals.


Accessibility, Accommodations, and Fairness OWG Members

Roberta Alley (Chair / Leadership Team) – Arizona Department of Education

Trinell Bowman – Maryland State Board of Education

Mira Monroe – Colorado Department of Education

Melissa Fincher (Leadership Team) – Georgia Department of Education

Charity Flores – Indiana Department of Education

Andrew Hinkle – Ohio Department of Education

Leila Williams – Arizona Department of Education

Bambi Lockman – Florida Department of Education

Phyllis Lynch – Rhode Island Department of Education

Michael Reid – Oklahoma State Department of Education

Lori Rodriguez – Florida Department of Education

Dan Wiener – Massachusetts Department of Elementary and Secondary Education

Jessica Tickle – Achieve/PARCC

Danielle Griswold – Achieve/PARCC

Accessibility and Accommodations Working Groups

The Working Groups will be responsible for drafting a set of Partnership-wide policies in a Partnership Accommodations Manual, to be adopted by each member state, for identifying eligible students, selecting allowable accommodations, and administering accommodations. That process will include:
• Analyzing extant state accommodation policies,
• Building a list of recommended standard accommodations,
• Identifying constructs and researching accommodations,
• Recommending a set of proposed accommodation policies for the assessment,
• Drafting a common Partnership Accommodations Manual,
• Ensuring comparability in assessment administrations,
• Monitoring ongoing refinements of accommodations, and
• Developing training modules for IEP teams.


Accessibility and Accommodations Working Groups

Adopting key policies and definitions that will include:
• a common definition of "English Learner";
• a common set of policies and procedures for providing assessment accommodations for English learners and students with identified needs; and
• a common set of policies and procedures for participation of English learners and students with identified needs in the assessment system.


Accessibility and Accommodations Working Groups

Accessibility and accommodations as part of the development process:
• Design review and feedback
• Test blueprint development
• Technology development and selection
• Passage and media review committee involvement
• Item review committee involvement
• Bias and sensitivity committee involvement
• Testing the efficacy of assessment items with accommodations with the intended groups of students in pilot and field testing
• Including a sufficient number of students with identified needs (across sub-categories) in pilot and field testing
• Data review committee involvement


Accessibility and Accommodations Working Groups

• Build accessibility throughout the test itself, with no trade-off between accessibility and validity
• Use a combination of accessible authoring and accessible technologies from the inception of items and tasks
• Establish and maintain a close working connection with the Technology, Design, and Research Working Groups


UNDERSTANDING THE POPULATION

Patrick Rooney
U.S. Department of Education

Who are students with disabilities?

Source: U.S. Department of Education, SY 2008-09 Annual Performance Reports. Figure courtesy of the National Center on Educational Outcomes

Understanding the Population

In 2008-09, some 6.5 million children ages 3-21 received special education services (13 percent of the population)
o 95 percent were enrolled in regular public schools
o 57 percent spent most of their time in general classes

The vast majority of students with disabilities take the general reading/language arts and mathematics assessments
o Students with significant cognitive disabilities: current law permits up to 1 percent of all students in the state (approximately 10 percent of students with disabilities) to take an alternate assessment based on alternate academic achievement standards
o All students are expected to have access to, and be assessed against, grade-level content standards

Trends in Growth of English Learners

Source: Census Bureau; NCES Condition of Education 2011

Percentage of children ages 5-17 who spoke a language other than English at home and percentage who spoke a language other than English at home and spoke English with difficulty: Selected years, 1980-2009

Where are English learners?


Percentage of children ages 5-17 who spoke a language other than English at home and spoke English with difficulty, by state or jurisdiction: 2009

Source: Census Bureau; NCES Condition of Education 2011

Understanding the Population

Over 300 different languages are spoken by students. The top 5 languages spoken by English learners are:
o Spanish (3.5 million)
o Vietnamese (93,000)
o Chinese (80,000)
o Arabic (71,000)
o Hmong (50,000)

In 2000, some 64 percent of English learners were born in the United States
o 42 percent were 2nd generation
o 22 percent were 3rd generation

The HHS Office of Refugee Resettlement identified 18,500 refugee children ages 6-18 in 2010.

UNDERSTANDING THE POPULATION

What are the challenges or key questions that need to be addressed by the consortia when developing their assessment systems?

BREAK

Measured Progress ©2011 | August 10, 2011

Accessible Assessment & APIP: A Brief Overview

Michael Russell
August 10, 2011

Accessibility & Assessment: A Two-Way Street

Test Item Access to Construct

ITEM: Present Content → Stimulate Construct → Interact w/ Content → Apply Construct → Produce Response → Visible Product of Construct

Digital Assessment Delivery: What we can already do

Match content form to student access need:
• Language: translation (directions, whole item, individual words/phrases); simplified English
• Audio: text-to-speech or pre-recorded voice for text-based content; non-visual descriptions for graphics and tables
• Braille
• Sign

Digital Assessment Delivery: What we can already do

Match interaction & response modes to need:
• Alternate keyboards
• Tab-Enter control devices
• Touch screens
• Eye gaze
• Speech-to-text

Interoperable Accessibility: Current Challenges

• Standard coding for student access needs/accommodations
• Standard tagging system for item content accessibility information
• Standard file exchange format for student access needs and accessible test items

What is APIP?

The Accessible Portable Item Profile standard: a file exchange format that packages test items (item content, accessibility information, metadata, and companion materials) and matches them to each student's access needs profile.
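Conceptually, APIP lets a delivery system compare a student's access needs profile against the accessibility supports packaged with an item, and activate only the supports that both sides share. A toy sketch of that matching idea (the tag names and data here are illustrative, not the APIP schema):

```python
# Toy illustration of profile-to-item matching; tag names are
# hypothetical, not drawn from the normative APIP/QTI schema.
item_supports = {"text_to_speech", "spanish_translation", "braille", "magnification"}

student_pnp = {"text_to_speech", "magnification"}  # student's access needs profile

# The delivery engine activates only the supports this student needs
# AND the item provides.
active = item_supports & student_pnp
print(sorted(active))  # -> ['magnification', 'text_to_speech']
```

In the real standard, both sides travel as structured XML; the set intersection above just shows why a shared, standardized vocabulary of access needs is what makes the matching interoperable across vendors.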

Access Needs Addressed by APIP: Language-Related

• Translated item
• Translated words/phrases
• Translated directions
• Simplified language
• Audio representation
• [Symbolic representation]
• Extended time/breaks

Access Needs Addressed by APIP: Visual

• Magnification
• Reverse contrast
• Alternate fore/background colors
• Color tinting/overlay
• [Increased white space]

Access Needs Addressed by APIP: Executive Function/Maintaining Attention

• Auditory calming
• Masking
• Line reader
• Breaks
• Extended time

Access Needs Addressed by APIP: Information Processing

• Flagging
• Keyword highlight
• Alternate representations
• [Scaffolding]
• [Chunking]
• [Reduced answer options]
• [Negatives removed]

Access Needs Addressed by APIP: Representational Form

• Audio: text-only, graphic-only, text & graphic, non-visual
• Tactile
• Braille
• Sign (ASL, Signed English)

Standards in the Education Sector

Each standard addresses a distinct need:
• Content standards – CCSS
• Performance standards – consortia assessments
• Data standards – SIF, CEDS, Ed-Fi
• Interoperability standards – QTI-APIP

Standards in the Education Sector: How They Work Together in an Assessment Context

[Diagram: the Common Core State Standards; SIF/Ed-Fi/CEDS student data (student information, gender, DOB, ID #, grade, enrollment, accommodations, test scores, etc.); library and curriculum information; IMS QTI; the APIP item content standard; and the APIP Personal Needs Profile all feed an assessment system handling student rostering, student login, test administration, response recording, scoring, and reporting.]

APIP: Myths and Misunderstandings

• Proprietary/industry-led standard
• Competes/clashes with SIF
• Does not support innovative items
• Requires high bandwidth

Open Discussion: Questions & Comments on APIP

DISCUSSION

• What challenges and benefits does computer-administered testing create for accessibility?

U.S. DEPARTMENT OF EDUCATION
RTTA PUBLIC MEETING ON CREATING VALID, RELIABLE, AND FAIR ASSESSMENTS FOR STUDENTS WITH DISABILITIES AND ENGLISH LEARNERS

JAMAL ABEDI, UNIVERSITY OF CALIFORNIA, DAVIS
AUGUST 10, 2011

What is the current state of research on accommodations, and what future research is necessary?

ELL Students

What we need to know about accommodations before using them for ELLs and SWDs…

1. Effectiveness: How effective are accommodations in making assessments more accessible to ELL students?

2. Validity: How valid is the outcome of the accommodated assessment when compared to a non-accommodated assessment?

3. Differential Impact: To what degree are these accommodations universally applicable to ELL students with different background characteristics?

4. Comparability: Can accommodated and non-accommodated assessment outcomes be aggregated?

5. Relevance: How appropriate are the accommodations used for these students?

6. Feasibility: How feasible is it to implement these accommodations in large-scale assessments?

What we know based on current research…

Some accommodations may not be effective in making assessments more accessible to ELLs (e.g. one-on-one and small group testing) [EFFECTIVENESS & RELEVANCE]

Some accommodations may alter the construct by providing an unfair advantage to the recipients (e.g. dictionary or glossary with content-related terms) [VALIDITY & COMPARABILITY]

Some accommodations do not alter the focal construct and ensure comparability (e.g. linguistic modification) [VALIDITY]

However… we have no hard evidence on the VALIDITY and EFFECTIVENESS for the majority of accommodations currently in use.

What research needs to focus on next…

Research needs to:
• Examine the validity of accommodations (i.e., whether accommodations impact the focal construct)
• Determine the effectiveness of accommodations in making assessments more accessible to ELLs and SWDs

What does this look like?
A randomized field experiment in which all major sources of threat to the internal and external validity of the experiment are controlled.

Existing data on accommodations may not be a good source for examining either the validity or effectiveness of accommodations.

How to test the validity and effectiveness of an accommodation in an experimentally controlled field study

ELL status / Accommodation   Accommodated   Non-Accommodated
ELL                          G1             G2
Non-ELL                      G3             G4

Effectiveness: comparing G1 with G2
Validity: comparing G3 with G4

The main reason for inconsistencies among existing research findings on accommodations is the lack of control of extraneous variables.
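The logic of the 2x2 design above can be sketched numerically. The group means below are hypothetical placeholders, not study data; they just show what each comparison is meant to detect:

```python
# Sketch of the 2x2 accommodation design. Group mean scores are
# hypothetical placeholders, not data from any study.
means = {
    "G1": 62.0,  # ELL, accommodated
    "G2": 54.0,  # ELL, non-accommodated
    "G3": 71.5,  # non-ELL, accommodated
    "G4": 71.0,  # non-ELL, non-accommodated
}

# Effectiveness: does the accommodation help the students it targets?
effectiveness = means["G1"] - means["G2"]

# Validity: an accommodation should not change scores for students who
# do not need it; a large G3-G4 gap suggests the construct was altered
# (an unfair advantage) rather than access improved.
validity_check = means["G3"] - means["G4"]

print(f"Effectiveness (G1 - G2): {effectiveness:+.1f}")
print(f"Validity check (G3 - G4): {validity_check:+.1f}")
```

With random assignment to the accommodated and non-accommodated conditions, extraneous variables are balanced across columns, which is exactly the control the slide says existing studies lack.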

U.S. DEPARTMENT OF EDUCATION
RTTA PUBLIC MEETING ON CREATING VALID, RELIABLE, AND FAIR ASSESSMENTS FOR STUDENTS WITH DISABILITIES AND ENGLISH LEARNERS

MARTHA THURLOW, UNIVERSITY OF MINNESOTA
AUGUST 10, 2011

What is the current state of research on accommodations, and what future research is necessary?

Students with Disabilities

Current State of Research on Accommodations for Students with Disabilities

• Evidence-based accommodations:
  Extended time
  Oral administration for math assessments

• Accommodations with conflicting evidence:
  Oral administration for reading assessments
  Segmented text
  Scribe
  Calculator

• Accommodations without research:
  Engagement/motivation accommodations
  Hundreds of other accommodations currently listed in state policies!

Needs for Research on Accommodations for Students with Disabilities

• Clarification of the content!

• Strategies for determining, based on strong rationales, which accommodations do not compromise the content assessed and therefore need not be subjected to research

• Improved selection of students for participation in studies (only those who truly need the accommodation studied)

• Both extant data studies and experimental [empirical?] studies – on targeted and controversial accommodations

• More attention to decision-making processes for who needs which accommodations

DISCUSSION

• How should the consortia ensure appropriate access for all students, minimizing construct-irrelevant variance during item design and development?

• What methods or strategies do you need to determine whether the items are valid and fair for all populations?

LUNCH

DISCUSSION OF SELECTED WRITTEN PUBLIC COMMENTS

Lizanne DeStefano, University of Illinois

Rebecca Kopriva, Wisconsin Center for Educational Research

TABLE EXERCISE ON PUTTING THEORY INTO PRACTICE

Questions to Consider

What skills and knowledge are you trying to measure?
How can students demonstrate whether they have the skills and knowledge?
What are possible approaches to making accessible items for students with disabilities or English learners?
What challenges arise with these approaches?
How can these efforts improve assessment for all learners?


Lessons Learned from the NAEP Accessible Block Study

Lizanne DeStefano

Jeremiah Johnson

Purpose
• To explore the use of modified NAEP blocks as a means of improving measurement of the abilities and skills of students who score at the lower end of the NAEP performance continuum (including SD and ELL)
• To develop a definition of what constitutes an accessible block of items
• To refine the process for developing accessible blocks that are aligned with the NAEP frameworks
• To develop two accessible blocks of math items per grade level
• To scale modified blocks/items with the existing item pool

Item Modification
• A panel of ten education professionals, math content specialists, and individuals with ELL/SD experience was assembled
• Blocks of NAEP items were modified according to the "Item Modification Guidelines and Procedures"
• All modified items maintained their original alignment with the math content areas defined by the NAEP framework
• The item modification panel edited, revised, and updated the "Item Modification Guidelines and Procedures" to reflect their thoughts on "best practice"


Creating Accessible Items
• Clearly identify the construct(s) of interest for each item, aligned with a standard
• Identify the range of knowledge and skills expressed within a single standard (level of proficiency)
• Compare the emphasis of knowledge and skills in the standard and on the assessment
• Consider the characteristics of the target population

Guidelines for Increasing Accessibility
• Reduce language load
• Carefully consider distracters
• Provide consistent, simple formatting
• Use supportive, complete graphics
• Provide contextual information that enhances understanding
• Eliminate extraneous information
• Provide cues to aid understanding

Reducing Cognitive Demand
• Reduce the number of objectives assessed in a single item
• Limit the number of steps required to correctly answer an item, or provide a template to structure the process
• Be transparent about how open-ended items will be scored

Overall

4th grade: accessible blocks 81.64% correct vs. source blocks 49.37%, a difference of 32.27 percentage points

8th grade: accessible blocks 73.87% correct vs. source blocks 47.46%, a difference of 26.41 percentage points
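The reported differences are simple percentage-point gaps between the two block types; checking the arithmetic against the figures on the slide:

```python
# Average percent correct by block type, as reported in the study slide.
grade4 = {"accessible": 81.64, "source": 49.37}
grade8 = {"accessible": 73.87, "source": 47.46}

for grade, pc in (("4th", grade4), ("8th", grade8)):
    diff = pc["accessible"] - pc["source"]
    print(f"{grade} grade difference: {diff:.2f} percentage points")
```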

Results By Block (4th Grade)

Percent Correct for SD(4th Grade)

Percent Correct for ELL (4th Grade)

Results By Block (8th Grade)

Percent Correct for SD(8th Grade)

Percent Correct for ELL(8th Grade)

Scaling Results

[Box plots: 2010 Math Accessible Booklet Study, Grades 4 and 8, IRT a (discrimination) parameters, accessible vs. source blocks]

[Box plots: 2010 Math Accessible Booklet Study, Grades 4 and 8, IRT c (guessing) parameters, accessible vs. source blocks]

[Charts: 2010 Math Accessible Booklet Study, Grades 4 and 8, IRT b (difficulty) parameters and ability distributions, accessible vs. source blocks]

[Chart: 2010 NAEP Math Accessible Booklet Study, Grade 4 ability distributions and test information by book type (accessible book 181, regular book 183, overall)]

[Chart: 2010 NAEP Math Accessible Booklet Study, Grade 8 ability distributions and test information by book type (accessible book 181, regular book 183, overall)]

[Chart: 2010 NAEP Math Accessible Booklet Study, Grade 4 ability distributions and CSEM by book type (accessible book 181, regular book 183, overall)]

[Chart: 2010 NAEP Math Accessible Booklet Study Grade 8 - Ability Distributions and CSEM by book type; CSEM and relative frequency (percentage) vs. scale score 0-500; series: Accessible book 181, Regular book 183, Overall]

Summary of Findings

Across groups and subgroups there were:
• Substantial and similar average gains in percent correct by block
• Consistent declines in the number of students omitting various items
• Significant declines in the percentage of students not reaching items

Summary of Findings

• All items were scalable
• Modified items had similar discrimination and guessing characteristics (a and c parameters)
• There were significant reductions in item difficulty (b parameters)

Summary of Findings

For the lowest performing students, the conditional standard error of measurement was significantly lower on the accessible blocks than the source blocks
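The a, b, and c parameters and the CSEM cited in these findings come from item response theory. As an illustrative sketch only (the item parameter values below are made up, not results from the study), the three-parameter logistic (3PL) model shows why lowering item difficulty (b) concentrates test information at the low end of the ability scale, which in turn lowers the conditional standard error of measurement there:

```python
import math

def p_3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta
    (a = discrimination, b = difficulty, c = guessing; 1.7 scaling)."""
    return c + (1.0 - c) / (1.0 + math.exp(-1.7 * a * (theta - b)))

def item_information(theta, a, b, c):
    """Fisher information contributed by one 3PL item at theta."""
    p = p_3pl(theta, a, b, c)
    q = 1.0 - p
    return (1.7 * a) ** 2 * (q / p) * ((p - c) / (1.0 - c)) ** 2

def csem(theta, items):
    """Conditional standard error of measurement = 1 / sqrt(test information)."""
    info = sum(item_information(theta, *item) for item in items)
    return 1.0 / math.sqrt(info)

# Hypothetical (a, b, c) triples: the "accessible" items keep the same a and c
# but have lower b, mirroring the pattern reported in the findings above.
source_items     = [(1.0, 0.5, 0.2), (1.2, 1.0, 0.15), (0.9, 0.0, 0.2)]
accessible_items = [(1.0, -0.5, 0.2), (1.2, 0.0, 0.15), (0.9, -1.0, 0.2)]

# For a low-performing student (theta = -2), the easier items yield a
# smaller CSEM:
print(csem(-2.0, source_items) > csem(-2.0, accessible_items))  # True
```

This is only a toy demonstration of the mechanism; the study's actual scaling used operational NAEP procedures.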

Some Effective Uses of Technology

Measuring Content Knowledge and Skills of English Learners and Students with Disabilities

RTTA Meeting, August 10, 2011

Rebecca Kopriva University of Wisconsin

[email protected]

Think About What Technology Can Do…

Technology can fundamentally improve the measurement of valued knowledge and skills for ELs and SwDs in several ways, including:

1. Making use of multi-semiotic representations as the primary means of conveying meaning

2. Establishing effective profiles so students can be provided the proper accommodations or adaptations

1. Why Bother with Multi-Semiotic Representations?

Students with literacy and language challenges ARE learning complex content.

How?

They and their teachers have learned to convey meaning using modes other than text as primary communication methods, supported by key language as needed.

This means successful adaptations need to include ways to:
• convey meaning to the student
• convey meaning from the student

These adaptations may be useful for other students as well.

What Does This Mean for Assessment?

Properly constructed, these methods can

Broaden how students are allowed to respond.

Broaden how we present the problems.

Broaden our understanding of how students conceptualize knowledge and use skills.

Most often it is best if multiple avenues of access are built into each of the tasks at each of these points.

Open Up Response Methods

Open Up Presentation Methods

SAMPLE ITEMS

Open Up Problem Solving Windows:

Broadening our understanding of how students conceptualize and use skills

2. Identify Effective Student Profiles and Use Them

• While the EL and SwD populations are heterogeneous, a reasonable number of student accommodation profiles can be assembled.

• The purpose of the profiles is to group students by similar characteristics that make a difference in how to best accommodate them on assessments.

• Students within the same profile share similar strengths and challenges and benefit from the same suite of accommodations

Effective Student Profiles

Effective profiles capture targeted student information that can successfully identify the most appropriate accommodations.

Ineffective profiles categorize students by irrelevant or incomplete information and lead to inappropriate or incomplete accommodation suites.

Relevant Characteristics of EL Students

• Cultural Proximity

• US Schooling

Effective Linking Procedures

• Effective profiles can be linked to appropriate accommodations.

• This is usually accomplished through a series of algorithms, each keyed to specific profiles and to specific accommodation choices and suites.

These algorithms can be appropriate or incomplete.
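The profile-keyed linking algorithms described above can be pictured as a small rule table. The sketch below is purely illustrative: the profile fields and accommodation names are invented for the example and do not come from any operational system discussed in the meeting.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """Hypothetical EL student profile (illustrative fields only)."""
    english_proficiency: str      # "beginning" | "intermediate" | "advanced"
    home_language_literacy: bool  # literate in the home language?
    us_schooling_years: int

def accommodation_suite(p: Profile) -> list:
    """Apply simple keyed rules to assemble an accommodation suite."""
    suite = ["extended_time"]  # baseline in this illustrative rule set
    if p.english_proficiency in ("beginning", "intermediate"):
        # Route by whether a bilingual glossary would actually help
        suite.append("bilingual_glossary" if p.home_language_literacy
                     else "oral_english_directions")
    if p.us_schooling_years < 3:
        suite.append("picture_supported_directions")
    return suite

print(accommodation_suite(Profile("beginning", True, 1)))
# ['extended_time', 'bilingual_glossary', 'picture_supported_directions']
```

The point of such a structure is exactly the caveat above: the rules are only as appropriate and complete as the profile information feeding them.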

Effective Adaptations

• Profiles can also be used to build effective access avenues into assessment items and tasks.

• Examples of these kinds of adaptations have been shown above.

Questions to Consider

What skills and knowledge are you trying to measure?
How can students demonstrate whether they have the skills and knowledge?
What are possible approaches to making accessible items for students with disabilities or English learners?
What challenges arise with these approaches?
How can these efforts improve assessment for all learners?

8/10/2011

Mathematics Common Core StandardGrade 7Equations and Expressions (7.EE.3)

Solve multi-step real-life and mathematical problems posed with positive and negative rational numbers in any form (whole numbers, fractions, and decimals), using tools strategically. Apply properties of operations to calculate with numbers in any form; convert between forms as appropriate; and assess the reasonableness of answers using mental computation and estimation strategies.


Mathematics Common Core Standard
1. Identify a particular student profile
2. Specify what knowledge or skills from this standard they intend to measure in a basic way, and in a more complex way
3. Give an example of a particular item/task topic (outline of an item) that will measure the more basic knowledge and skills, and an item/task topic that will measure the more complex knowledge/skills
4. Explain how they might make tasks that would cover each of these targets accessible for their chosen student profile. Use the questions to consider as a guide in explaining


Reading/Language Arts Common Core StandardGrade 8Reading Informational Text (RI.8.8)

Delineate and evaluate the argument and specific claims in a text, assessing whether the reasoning is sound and the evidence is relevant and sufficient; recognize when irrelevant evidence is introduced.


Reading/Language Arts Common Core Standard
1. Identify a particular student profile
2. Specify what knowledge or skills from this standard they intend to measure in a basic way, and in a more complex way
3. Give an example of a particular item/task topic (outline of an item) that will measure the more basic knowledge and skills, and an item/task topic that will measure the more complex knowledge/skills
4. Explain how they might make tasks that would cover each of these targets accessible for their chosen student profile. Use the questions to consider as a guide in explaining


CONCLUDING COMMENTS

PUBLIC COMMENTS

Race to the Top Assessment ProgramTechnical Assistance Public Meeting

Closing Comments

Joe ConatyU.S. Department of Education

Reminders

Transcript and presentations from today’s meeting will be available at:

www2.ed.gov/programs/racetothetop-assessment

Additional written input may be submitted to [email protected]


Future Public Meetings
Future meetings may focus on:

o Interoperability and technology standards
o Selection of a uniform growth model consistent with test purpose, structure, and intended uses
o Setting achievement standards and performance level descriptors

As details are finalized, information will be posted on ed.gov and shared with stakeholder groups and prior meeting participants
