
Response to Intervention

www.interventioncentral.org

Formative Assessment: Specific Tools to Measure Student Academic Skills

Jim Wright | www.interventioncentral.org

Response to Intervention

www.interventioncentral.org

Effective Formative Evaluation: The Underlying Logic…

1. What is the relevant academic or behavioral outcome measure to be tracked?

2. Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students?

3. What method(s) should be used to measure the target academic skill or behavior?

4. What goal(s) are set for improvement?

5. How does the school check up on progress toward the goal(s)?

Response to Intervention

www.interventioncentral.org 3

Use Time & Resources Efficiently By Collecting Information Only on ‘Things That Are Alterable’

“…Time should be spent thinking about things that the intervention team can influence through instruction, consultation, related services, or adjustments to the student’s program. These are things that are alterable.…Beware of statements about cognitive processes that shift the focus from the curriculum and may even encourage questionable educational practice. They can also promote writing off a student because of the rationale that the student’s insufficient performance is due to a limited and fixed potential. “ p.359

Source: Howell, K. W., Hosp, J. L., & Kurns, S. (2008). Best practices in curriculum-based evaluation. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 349-362). Bethesda, MD: National Association of School Psychologists.

Response to Intervention

www.interventioncentral.org 4

School Instructional Time: The Irreplaceable Resource

“In the average school system, there are 330 minutes in the instructional day, 1,650 minutes in the instructional week, and 56,700 minutes in the instructional year. Except in unusual circumstances, these are the only minutes we have to provide effective services for students. The number of years we have to apply these minutes is fixed. Therefore, each minute counts and schools cannot afford to support inefficient models of service delivery.” p. 177

Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193). Bethesda, MD: National Association of School Psychologists.

Response to Intervention

www.interventioncentral.org 5

Summative data is static information that provides a fixed ‘snapshot’ of the student’s academic performance or behaviors at a particular point in time. School records are one source of data that is often summative in nature—frequently referred to as archival data. Attendance data and office disciplinary referrals are two examples of archival records, data that is routinely collected on all students. In contrast to archival data, background information is collected specifically on the target student. Examples of background information are teacher interviews and student interest surveys, each of which can shed light on a student’s academic or behavioral strengths and weaknesses. Like archival data, background information is usually summative, providing a measurement of the student at a single point in time.

Response to Intervention

www.interventioncentral.org 6

Formative assessment measures are those that can be administered or collected frequently—for example, on a weekly or even daily basis. These measures provide a flow of regularly updated information (progress monitoring) about the student’s progress in the identified area(s) of academic or behavioral concern.

Formative data provide a ‘moving picture’ of the student; the data unfold through time to tell the story of that student’s response to various classroom instructional and behavior management strategies. Examples of measures that provide formative data are Curriculum-Based Measurement probes in oral reading fluency and Daily Behavior Report Cards.

Response to Intervention

www.interventioncentral.org 7

Formative Assessment Defined

“Formative assessment [in academics] refers to the gathering and use of information about students’ ongoing learning by both teachers and students to modify teaching and learning activities. …. Today…there are compelling research results indicating that the practice of formative assessment may be the most significant single factor in raising the academic achievement of all students—and especially that of lower-achieving students.” p. 7

Source: Harlen, W. (2003). Enhancing inquiry through formative assessment. San Francisco, CA: Exploratorium. Retrieved on September 17, 2008, from http://www.exploratorium.edu/ifi/resources/harlen_monograph.pdf

Response to Intervention

www.interventioncentral.org 8

Formative Assessment: Essential Questions…

1. What is the relevant academic or behavioral outcome measure to be tracked?

Problems identified for formative assessment should be:

1. Important to school stakeholders.
2. Measurable and observable.
3. Stated positively as ‘replacement behaviors’ or goal statements rather than as general negative concerns (Batsche et al., 2008).
4. Based on a minimum of inference (T. Christ, 2008).

Sources: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193). Bethesda, MD: National Association of School Psychologists.

Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

Response to Intervention

www.interventioncentral.org 9

Academic or Behavioral Targets Are Stated as ‘Replacement Behaviors’

“The implementation of successful interventions begins with accurate problem identification. Traditionally, the student problem was stated as a broad, general concern (e.g., impulsive, aggressive, reading below grade level) that a teacher identified. In a competency-based approach, however, the problem identification is stated in terms of the desired replacement behaviors that will increase the student’s probability of successful adaptation to the task demands of the academic setting.” p. 178

Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193).

Response to Intervention

www.interventioncentral.org 10

Inference: Moving Beyond the Margins of the ‘Known’

“An inference is a tentative conclusion without direct or conclusive support from available data. All hypotheses are, by definition, inferences. It is critical that problem analysts make distinctions between what is known and what is inferred or hypothesized….Low-level inferences should be exhausted prior to the use of high-level inferences.” p. 161

Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

Response to Intervention

www.interventioncentral.org 11

Examples of High vs. Low Inference Hypotheses

High-Inference Hypothesis: The student has an auditory processing issue that prevents success in reading. The student requires a multisensory approach to reading instruction to address reading deficits.

Low-Inference Hypothesis: The student needs to build reading fluency skills to become more proficient in decoding.

Known data: The results of grade-wide benchmarking in reading show that a target 2nd-grade student can read aloud at approximately half the rate of the median child in the grade. (The slide plots each hypothesis along a known-to-unknown continuum: the low-inference hypothesis stays close to these known data, while the high-inference hypothesis extends well beyond them.)

Response to Intervention

www.interventioncentral.org 12

Adopting a Low-Inference Model of Reading Skills

Source: Big ideas in beginning reading. University of Oregon. Retrieved September 23, 2007, from http://reading.uoregon.edu/index.php

5 Big Ideas in Beginning Reading

1. Phonemic Awareness

2. Alphabetic Principle

3. Fluency with Text

4. Vocabulary

5. Comprehension

Response to Intervention

www.interventioncentral.org 13

Formative Assessment: Essential Questions…

2. Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students?

Apply the ‘80-15-5’ Rule (T. Christ, 2008); a minimal code sketch of this decision logic follows the source note below:

– If fewer than 80% of students are successfully meeting academic or behavioral goals, the formative assessment focus is on the core curriculum and general student population.

– If no more than 15% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on small-group ‘treatments’ or interventions.

– If no more than 5% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on the individual student.

Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.
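The ‘80-15-5’ Rule is essentially a set of threshold checks. Below is a minimal Python sketch of that decision logic; the function name, signature, and the exact handling of the overlapping thresholds are illustrative assumptions rather than part of Christ (2008).

```python
def formative_assessment_focus(pct_meeting_goals: float) -> str:
    """Rough sketch of the '80-15-5' rule (after T. Christ, 2008).

    pct_meeting_goals: percentage of students (0-100) successfully meeting
    academic or behavioral goals. Thresholds follow the slide text; how the
    overlapping ranges are resolved here is an illustrative choice.
    """
    pct_not_meeting = 100.0 - pct_meeting_goals
    if pct_meeting_goals < 80.0:
        # Fewer than 80% successful: examine the core curriculum / whole system first.
        return "core curriculum and general student population"
    elif pct_not_meeting > 5.0:
        # Up to roughly 15% unsuccessful: focus on small-group interventions.
        return "small-group interventions"
    else:
        # No more than roughly 5% unsuccessful: focus on the individual student.
        return "individual student"

print(formative_assessment_focus(72))  # -> core curriculum and general student population
print(formative_assessment_focus(90))  # -> small-group interventions
print(formative_assessment_focus(97))  # -> individual student
```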

Response to Intervention

www.interventioncentral.org 14

RTI Literacy: Assessment & Progress-Monitoring

To measure student ‘response to instruction/intervention’ effectively, the RTI Literacy model measures students’ reading performance and progress on schedules matched to each student’s risk profile and intervention Tier membership.

• Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of literacy assessments.

• Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention.

• Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 reading intervention are assessed at least once per week.

Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Response to Intervention

www.interventioncentral.org

Using Local Norms in Coordination with Benchmark Data

Response to Intervention

www.interventioncentral.org 16

Baylor Elementary School: Grade Norms: Correctly Read Words Per Min: Sample Size: 23 Students

Group Norms: Correctly Read Words Per Min: Book 4-1: Raw Data: 31 34 34 39 41 43 52 55 59 61 68 71 74 75 85 89 102 108 112 115 118 118 131

LOCAL NORMS EXAMPLE: Twenty-three 4th-grade students were administered oral reading fluency Curriculum-Based Measurement passages at the 4th-grade level in their school.

In their current number form, these data are not easy to interpret.

So the school converts them into a visual display—a box-plot—to show the distribution of scores and to convert the scores to percentile form.

When Billy, a struggling reader, is screened in CBM reading fluency, he shows a SIGNIFICANT skill gap when compared to his grade peers.
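The same conversion can be checked directly from the raw local-norm scores. The short Python sketch below computes the quartiles (which, for these data, match the box-plot values on the next slide: Q1 = 43, median = 71, Q3 = 108) and Billy’s standing relative to the local norms; Billy’s score of 19 correctly read words per minute is taken from that slide, and the variable names are illustrative.

```python
import statistics

# Local norms: 23 fourth-graders' correctly read words per minute (Book 4-1).
scores = [31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71, 74, 75,
          85, 89, 102, 108, 112, 115, 118, 118, 131]

# Quartile cut points of the local distribution.
q1, q2, q3 = statistics.quantiles(scores, n=4)  # default 'exclusive' method
print(f"Q1={q1}, median={q2}, Q3={q3}, low={min(scores)}, high={max(scores)}")

# Billy's screening score (from the box-plot slide that follows).
billy = 19
n_at_or_below = sum(s <= billy for s in scores)
print(f"Billy: {billy} CRW; {n_at_or_below} of {len(scores)} local peers score at or "
      f"below him; gap to local median = {q2 - billy} CRW")
```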

Response to Intervention

www.interventioncentral.org 17

Baylor Elementary School: Grade Norms: Correctly Read Words Per Min: Sample Size: 23 Students: January Benchmarking

[Box-plot of the group norms (Correctly Read Words, Book 4-1, plotted on a 0-160 scale): low value = 31, 1st quartile = 43, median (2nd quartile) = 71, 3rd quartile = 108, high value = 131. Billy’s score of 19 CRW falls below the entire local distribution. National reading norms: 112 CRW per minute.]

Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement (Technical report #33). Eugene, OR: University of Oregon.

Response to Intervention

www.interventioncentral.org 18

Team Activity: Formative Assessment and Your Schools

At your tables, discuss:

• What kinds of formative measures your schools tend to collect most often.

• How ‘ready’ your schools are to collect, interpret, and act on formative assessment data.

Response to Intervention

www.interventioncentral.org 19

Formative Assessment: Essential Questions…

3. What method(s) should be used to measure the target academic skill or behavior?

Formative assessment methods should be as direct a measure as possible of the problem or issue being evaluated. These assessment methods can:

– Consist of General Outcome Measures or Specific Sub-Skill Mastery Measures

– Include existing (‘extant’) data from the school system.

Curriculum-Based Measurement (CBM) is widely used to track basic student academic skills. Daily Behavior Report Cards (DBRCs) are increasingly used as one source of formative behavioral data.

Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Response to Intervention

www.interventioncentral.org 20

Formal Tests: Only One Source of Student Assessment Information

“Tests are often overused and misunderstood in and out of the field of school psychology. When necessary, analog [i.e., test] observations can be used to test relevant hypotheses within controlled conditions. Testing is a highly standardized form of observation. ….The only reason to administer a test is to answer well-specified questions and examine well-specified hypotheses. It is best practice to identify and make explicit the most relevant questions before assessment begins. …The process of assessment should follow these questions. The questions should not follow assessment. “ p.170

Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

Response to Intervention

www.interventioncentral.org

Curriculum-Based Measurement: Assessing Basic Academic Skills

Response to Intervention

www.interventioncentral.org 22

Curriculum-Based Measurement: Advantages as a Set of Tools to Monitor RTI/Academic Cases

• Aligns with curriculum goals and materials

• Is reliable and valid (has ‘technical adequacy’)

• Is criterion-referenced: sets specific performance levels for specific tasks

• Uses standard procedures to prepare materials, administer, and score

• Samples student performance to give objective, observable ‘low-inference’ information about student performance

• Has decision rules to help educators to interpret student data and make appropriate instructional decisions

• Is efficient to implement in schools (e.g., training can be done quickly; the measures are brief and feasible for classrooms, etc.)

• Provides data that can be converted into visual displays for ease of communication

Source: Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.

Response to Intervention

www.interventioncentral.org 23

CBM Student Reading Samples: What Difference Does Fluency Make?

• 3rd Grade: 19 Words Per Minute

• 3rd Grade: 70 Words Per Minute

• 3rd Grade: 98 Words Per Minute

Response to Intervention

www.interventioncentral.org 24

CBM techniques have been developed to assess:

• Phonemic awareness skills
• Reading fluency
• Reading comprehension
• Early math skills
• Math computation
• Math applications & concepts
• Writing
• Spelling

Response to Intervention

www.interventioncentral.org 25

CBM Math Measures: Selected Sources

• AimsWeb (http://www.aimsweb.com)
• Easy CBM (http://www.easycbm.com)
• iSteep (http://www.isteep.com)
• EdCheckup (http://www.edcheckup.com)
• Intervention Central (http://www.interventioncentral.org)

Response to Intervention

www.interventioncentral.org 26

Measuring General vs. Specific Academic Outcomes

• General Outcome Measures: Track the student’s increasing proficiency on general curriculum goals such as reading fluency. Example: CBM-Oral Reading Fluency (Hintze et al., 2006).

• Specific Sub-Skill Mastery Measures: Track short-term student academic progress with clear criteria for mastery (Burns & Gibbons, 2008). Example: Letter Identification.

Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Hintze, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

Response to Intervention

www.interventioncentral.org 27

Formative Assessment: Essential Questions…

4. What goal(s) are set for improvement?

Goals are defined at the system, group, or individual student level. Goal statements:

– Are worded in measurable, observable terms.

– Include a timeline for achieving those goals.

– Are tied to the formative assessment methods used to monitor progress toward the goal(s).

Response to Intervention

www.interventioncentral.org 28

Response to Intervention

www.interventioncentral.org 29

IEP Goal Statements for CBA/CBM

Response to Intervention

www.interventioncentral.org 30

Writing CBM Goals in Student IEPs (Wright, 1992)

Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

Response to Intervention

www.interventioncentral.org 31

Writing CBM Goals in Student IEPs (Wright, 1992)

Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

Response to Intervention

www.interventioncentral.org 32

Writing CBM Goals in Student IEPs (Wright, 1992)

Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

Response to Intervention

www.interventioncentral.org 33

IEP Goals for CBA/CBM: READING

Reading: In [number of weeks until Annual Review], when given a randomly selected passage from [level and name of reading series] for 1 minute, the student will read aloud at [number] correctly read words with no more than [number] decoding errors.
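A purely illustrative filled-in version of this template (all numbers and materials hypothetical) might read: ‘In 30 weeks, when given a randomly selected passage from Level 4 of the district reading series for 1 minute, the student will read aloud at 90 correctly read words with no more than 3 decoding errors.’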

Response to Intervention

www.interventioncentral.org 34

IEP Goals for CBA/CBM: Written Expression

Written Expression: In [number of weeks until Annual Review], when given a story starter or topic sentence and 3 minutes in which to write, the student will write a total of [number] of words, or [number] of correctly spelled words, or [number] of correct word/writing sequences.

Response to Intervention

www.interventioncentral.org 35

IEP Goals for CBA/CBM: Spelling

Spelling: In [number of weeks until Annual Review], when dictated randomly selected words from [level and name of spelling series or description of spelling word list] for 2 minutes, the student will write [number of correct letter sequences].

Response to Intervention

www.interventioncentral.org

Interpreting Data: The Power of Visual Display

Response to Intervention

www.interventioncentral.org 37

Sample Peer Tutoring Chart

Response to Intervention

www.interventioncentral.org 38

Sample Peer Tutoring Chart

Response to Intervention

www.interventioncentral.org 39

Single-Subject (Applied) Research Designs

“Single-case designs evolved because of the need to understand patterns of individual behavior in response to independent variables, and more practically, to examine intervention effectiveness. Design use can be flexible, described as a process of response-guided experimentation…, providing a mechanism for documenting attempts to live up to legal mandates for students who are not responding to routine instructional methods.” p. 71

Source: Barnett, D. W., Daly, E. J., Jones, K. M., & Lentz, F.E. (2004). Response to intervention: Empirically based special service decisions from single-case designs of increasing and decreasing intensity. Journal of Special Education, 38, 66-79.

Response to Intervention

www.interventioncentral.org 41

Jared: Intervention Phase 1: Weeks 1-6

[Time-series chart of Jared’s CBM oral reading fluency (correctly read words per minute), mid-January through mid-April. Plotted data points: W 1/22 = 71 CRW; W 1/29 = 77 CRW; M 2/3 = 75 CRW; Th 2/13 = 75 CRW; Th 2/27 = 79 CRW; F 3/7 = 82 CRW.]

Response to Intervention

www.interventioncentral.org 42

Formative Assessment: Donald: Grade 3

Response to Intervention

www.interventioncentral.org 43

Formative Assessment: Donald: Grade 3

Response to Intervention

www.interventioncentral.org 44

Formative Assessment: Essential Questions…

5. How does the school check up on progress toward the goal(s)?

The school periodically checks the formative assessment data to determine whether the goal is being attained. Examples of this progress evaluation process include the following:

– System-Wide: A school-wide team meets on a monthly basis to review the frequency and type of office disciplinary referrals to judge whether those referrals have dropped below the acceptable threshold for student behavior.

– Group Level: Teachers at a grade level assemble every six weeks to review CBM data on students receiving small-group supplemental instruction to determine whether those students are ready to exit (Burns & Gibbons, 2008).

– Individual Level: A building problem-solving team gathers every eight weeks to review CBM data on a student’s response to an intensive reading fluency plan.

Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Shinn, M. R. (1989). Curriculum-based measurement: Assessing special children. New York: Guilford.

Response to Intervention

www.interventioncentral.org

Effective Formative Evaluation: The Underlying Logic…

1. What is the relevant academic or behavioral outcome measure to be tracked?

2. Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students?

3. What method(s) should be used to measure the target academic skill or behavior?

4. What goal(s) are set for improvement?

5. How does the school check up on progress toward the goal(s)?

Response to Intervention

www.interventioncentral.org 46

Team Activity: Data ‘Decision Points’

At your tables:

• Discuss what opportunities are available at the school, group, or individual student level to discuss data on student performance and make decisions about the effectiveness of your instructional or intervention programs.

Response to Intervention

www.interventioncentral.org

School-Wide Case Example: Using Data to Evaluate Appropriateness of Core Reading Program

Response to Intervention

www.interventioncentral.org 48

“Risk for reading failure always involves the interaction of a particular set of child characteristics with specific characteristics of the instructional environment. Risk status is not entirely inherent in the child, but always involves a “mismatch” between child characteristics and the instruction that is provided.” (Foorman & Torgesen, 2001; p. 206).

Source: Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.

Response to Intervention

www.interventioncentral.org 49

“direct instruction in letter-sound correspondences practiced in controlled vocabulary texts (direct code)” (Foorman & Torgesen, 2001; p. 204)

Source: Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.

“Literature-based instruction emphasizes use of authentic literature for independent reading, read-alouds, and collaborative discussions. It stands in contrast to skills-based programs that are typically defined as traditional programs that use a commercially available basal reading program and follow a sequence of skills ordered in difficulty.” (Foorman & Torgesen, 2001; p. 204)

Direct / Indirect Instruction Continuum

“less direct instruction in sound-spelling patterns embedded in trade books (embedded code)” (Foorman & Torgesen, 2001; p. 204)

“implicit instruction in the alphabetic principle while reading trade books (implicit code)” (Foorman & Torgesen, 2001; p. 204)

Response to Intervention

www.interventioncentral.org 50

RTI Core Literacy Instruction: Elements

Use Benchmarking/Universal Screening Data to Verify that the Current Core Reading Program is Appropriate. The school uses benchmarking/universal screening data in literacy to verify that its current reading program can effectively meet the needs of its student population at each grade level.

• In grades K-2, if fewer than 80% of students are successful on phonemic awareness and alphabetics screenings, the core reading program at that grade level is patterned after direct instruction (Foorman & Torgesen, 2001).

• In grades K-2, if more than 80% of students are successful on phonemic awareness and alphabetics screenings, the school may choose to adopt a reading program that provides “less direct instruction in sound-spelling patterns embedded in trade books (embedded code)” (Foorman & Torgesen, 2001; p. 205).

Response to Intervention

www.interventioncentral.org 51

Source: DIBELS Website. Retrieved on May 8, 2007, from https://dibels.uoregon.edu/

Comparison of Sunnyside & Baylor Schools: Winter Benchmarking: Gr 1

Response to Intervention

www.interventioncentral.org 52

Nonsense Word Fluency: 34% of students fell below ‘Deficient’ level (<30 NWF)

Oral Reading Fluency: 35% of students fell below ‘Deficient’ level (<8 DORF)

Phoneme Segmentation Fluency: 28% of students fell below ‘Deficient’ level (<10 PSF)

Sunnyside Central School District

District Student Population: 986

Eligible for Free/Reduced-Price Lunch: 43%

Number of Students in Grade 1: 69

Winter Benchmarking: Gr 1

On all literacy screening measures, Sunnyside fell below the 80% success level:

PSF: 72% ‘emerging/established’

NWF: 66% ‘emerging/established’

DORF: 65% ‘some risk/low risk’

Response to Intervention

www.interventioncentral.org 53

Nonsense Word Fluency: 9% of students fell below ‘Deficient’ level (<30 NWF)

Oral Reading Fluency: 14% of students fell below ‘Deficient’ level (<8 DORF)

Phoneme Segmentation Fluency: 6% of students fell below ‘Deficient’ level (<10 PSF)

Winter Benchmarking: Gr 1

Baylor Unified Free School District

District Student Population: 1452

Eligible for Free/Reduced-Price Lunch: 6%

Number of Students in Grade 1: 106

On all literacy screening measures, Baylor exceeded the 80% success level:

PSF: 94% ‘emerging/established’

NWF: 91% ‘emerging/established’

DORF: 86% ‘some risk/low risk’

Response to Intervention

www.interventioncentral.org

Winter Benchmarking: Gr 1

Response to Intervention

www.interventioncentral.org 55

“direct instruction in letter-sound correspondences practiced in controlled vocabulary texts (direct code)” (Foorman & Torgesen, 2001; p. 204)

Source: What Works Clearinghouse. Retrieved April 15, 2009, from http://ies.ed.gov/ncee/wwc/

Direct / Indirect Instruction Continuum

“less direct instruction in sound-spelling patterns embedded in trade books (embedded code)” (Foorman & Torgesen, 2001; p. 204)

“implicit instruction in the alphabetic principle while reading trade books (implicit code)” (Foorman & Torgesen, 2001; p. 204)

Sunnyside Elementary Core Reading Program

Baylor Elementary Core Reading Program

Response to Intervention

www.interventioncentral.org

Individual Student Case Example: Collin: Letter Identification

Response to Intervention

www.interventioncentral.org

Case Example: Letter Identification

The Concern

• In a mid-year (winter) school-wide screening for Letter Naming Fluency, a first-grade student new to the school, Collin, was found to have moderate delays when compared to peers. In his school, Collin fell at the 15th percentile compared with peers (local norms).

• Screening results, therefore, suggested that Collin has problems with Letter Identification. However, more information is needed to better understand this student’s academic delay.

57

Response to Intervention

www.interventioncentral.org

Case Example: Letter Identification

Instructional Assessment

• Collin’s teacher, Ms. Tessia, sat with him and checked his letter knowledge. She discovered that, at baseline, Collin knew 17 lower-case letters and 19 upper-case letters. (Ms. Tessia defined ‘knows a letter’ as: “When shown the letter, the student can correctly give the name of the letter within 2 seconds.”)

• Based on her findings, Ms. Tessia decided that Collin was in the ‘acquisition’ phase of this letter identification skill. He needed direct-teaching activities to learn to identify all of the letters.

58

Response to Intervention

www.interventioncentral.org

Case Example: Letter Identification

59

Response to Intervention

www.interventioncentral.org

Case Example: Letter Identification

Intervention

• Ms. Tessia decided to use ‘incremental rehearsal’ (Burns, 2005) as an intervention for Collin. This intervention benefits students who are still acquiring their math facts, sight words, or letters.

Students start by reviewing a series of ‘known’ cards. Then the instructor adds ‘unknown’ items to the card pile one at a time, so that the student has a high ratio of known to unknown items. This strategy promotes near-errorless learning. (A minimal sketch of this sequencing appears after the source note below.)

• Collin received this intervention daily, for 10 minutes.

• NOTE: A paraprofessional, adult volunteer, or other non-instructional personnel can be trained to deliver this intervention.

60

Source: Burns, M. K. (2005). Using incremental rehearsal to increase fluency of single-digit multiplication facts with children identified as learning disabled in mathematics computation. Education and Treatment of Children, 28, 237-249.
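The card-sequencing idea described above can be made concrete with a short sketch. Incremental rehearsal is commonly described as re-presenting each unknown item with an expanding set of known items folded in one at a time; the Python below illustrates that pattern only and is not the exact protocol from Burns (2005). Function and variable names are assumptions.

```python
def incremental_rehearsal_sequence(unknown, knowns):
    """Generate one rehearsal round for a single unknown item.

    Sketch of the commonly described sequencing: present the unknown,
    then re-present it with an expanding set of known items folded in
    one at a time, keeping the known-to-unknown ratio high so practice
    stays near-errorless. See Burns (2005) for the actual protocol.
    """
    sequence = [unknown]
    for i in range(1, len(knowns) + 1):
        sequence.append(unknown)
        sequence.extend(knowns[:i])
    return sequence

# Illustrative example: letter 'Q' is unknown; a few known letters are rehearsed with it.
knowns = ["A", "M", "S", "T"]
print(incremental_rehearsal_sequence("Q", knowns))
# ['Q', 'Q', 'A', 'Q', 'A', 'M', 'Q', 'A', 'M', 'S', 'Q', 'A', 'M', 'S', 'T']
```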

Response to Intervention

www.interventioncentral.org

East Carolina University Evidence-Based Intervention Project
http://www.ecu.edu/cs-cas/psyc/rileytillmant/EBI-Network-Homepage.cfm

Incremental Rehearsal Guidelines

61

Response to Intervention

www.interventioncentral.org

Case Example: Letter Identification

Goal-Setting and Data Collection

• Ms. Tessia set the goals that, within 4 instructional weeks, Collin would:

– identify all upper-case and lower-case letters.

– move above the 25th percentile in Letter Naming Fluency when compared to grade-level peers.

• The teacher collected two sources of data on the intervention:

– At the end of each tutoring session, the tutor logged any additional formerly unknown letters that were now ‘known’ (that the student could now accurately identify within 2 seconds).

– Each week, the teacher administered a one-minute timed Letter Naming Fluency probe and charted the number of correctly identified letters.

62

Response to Intervention

www.interventioncentral.org

Case Example: Letter Identification

Outcome

• Ms. Tessia discovered that Collin attained the first goal (‘able to identify all upper-case and lower-case letters’) within 2 weeks.

• Collin attained the second goal (‘move above the 25th percentile in Letter Naming Fluency when compared to grade-level peers’) within the expected four instructional weeks.

63

Response to Intervention

www.interventioncentral.org

Individual Student Case Example: Angela: Reading Fluency

Response to Intervention

www.interventioncentral.org 65

DIBELS Case Example: Angela

• Angela is a 3rd grade student.

• Angela struggled in her classroom with reading fluency. Her teacher tried a series of classroom strategies to promote fluency for the student, including providing Angela with additional opportunities to listen to fluent text modeling from an adult and opportunities to read aloud with corrective feedback.

Response to Intervention

www.interventioncentral.org 66

DIBELS Case Example: Angela

• In the mid-year schoolwide literacy screening in January, Angela read 77 words per minute on the DIBELS Oral Reading Fluency measure.

• According to DIBELS benchmark guidelines, Angela falls within the ‘strategic intervention’ range (between 67 and 92 WPM).

Response to Intervention

www.interventioncentral.org 67

Source: Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Available: http://dibels.uoregon.edu/.

Response to Intervention

www.interventioncentral.org 68

DIBELS Case Example: Angela: Cont.

• After the mid-year screening, the 3rd grade teachers, building administrator, and reading teacher gathered for a ‘data meeting’.

• At that meeting, the group considered the screening results and discussed how to improve core literacy instruction to assist those students who fell within the ‘some risk’ and ‘at risk’ categories.

• The group next sorted students from the ‘some risk’ and ‘at risk’ categories into supplemental (Tier 2) groups, according to intervention need. Teacher knowledge of the student, classroom assessments, state test results, and other information were used to supplement the DIBELS data during this sorting process.

Response to Intervention

www.interventioncentral.org 69

DIBELS Case Example: Angela: Cont.

• At the data meeting, it was decided that Angela and other students in the 3rd grade needed supplemental intervention support to increase their reading fluency, as well as to build their phonics (alphabetics) skills.

• The reading teacher agreed to start a Corrective Reading group that would meet 4 days per week in 45-minute sessions. (The Corrective Reading program met the school’s guidelines as an ‘evidence-based’ program, based on findings from the What Works Clearinghouse website.)

• Angela and 5 other children were placed in this Corrective Reading group.

Response to Intervention

www.interventioncentral.org 70

Corrective Reading: Description“Corrective Reading is designed to promote reading accuracy (decoding), fluency, and comprehension skills of students in third grade or higher who are reading below their grade level. The program has four levels that address students' decoding skills and six levels that address students' comprehension skills. All lessons in the program are sequenced and scripted. Corrective Reading can be implemented in small groups of four to five students or in a whole-class format. Corrective Reading is intended to be taught in 45-minute lessons four to five times a week. For the single study reviewed in this report, only the word-level skills components of the Corrective Reading program were implemented.

…Corrective Reading was found to have potentially positive effects on alphabetics and fluency and no discernible effects on comprehension.”

Source: What Works Clearinghouse. Retrieved on October 6, 2009 from http://ies.ed.gov/ncee/wwc/reports/beginning_reading/cr/c

Response to Intervention

www.interventioncentral.org 71

DIBELS Case Example: Angela: Cont.

• BASELINE: Before Angela began the Corrective Reading group, her reading teacher collected baseline data. The teacher used grade 3 progress-monitoring probes supplied by DIBELS. The student was administered Oral Reading Fluency probes across three separate days.

• At baseline, Angela was found to be reading 76 words per minute in grade 3 text. This became the starting point for setting a student goal for intervention.

Response to Intervention

www.interventioncentral.org 72

[Progress-monitoring chart with goal line]

Response to Intervention

www.interventioncentral.org 73

DIBELS Case Example: Angela: Cont.

• GOAL-SETTING. Because Angela would be monitored using grade 3 ORF probes, it was decided to select an ambitious rate of progress. Using research norms, the reading teacher estimated that Angela should increase her reading rate by 1.5 additional words per week. Because the intervention would be in place for 6 instructional weeks, the teacher estimated that the student should read an additional 9 words per minute at the end of 6 weeks. Because the student’s baseline reading rate was 76 words per minute, her goal at the end of the 6 weeks was 85 words per minute. In other words, if the group intervention was successful, Angela should read at least 85 WPM at the end of the intervention period.
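The goal calculation above is straightforward arithmetic that can be reused for any student. The short Python sketch below reproduces it; the function name and parameters are illustrative, not from the source.

```python
def cbm_fluency_goal(baseline_wpm: float, weekly_growth: float, weeks: int) -> float:
    """Project an end-of-intervention oral reading fluency goal:
    goal = baseline + (expected words-per-week growth x number of weeks)."""
    return baseline_wpm + weekly_growth * weeks

# Angela: baseline of 76 WPM, ambitious growth of 1.5 words per week, 6-week intervention.
print(cbm_fluency_goal(76, 1.5, 6))  # -> 85.0 WPM
```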

Response to Intervention

www.interventioncentral.org 74

Table 2: Predictions for Reading Growth by Grade

Response to Intervention

www.interventioncentral.org 75

DIBELS Case Example: Angela: Cont.

• IMPLEMENTATION OF INTERVENTION. When the Corrective Reading program began, Angela was assessed weekly (progress-monitoring) using grade 3 ORF probes from DIBELS.

• After six instructional weeks, the data team and reading teacher met to consider Angela’s progress.

Response to Intervention

www.interventioncentral.org 76

[Progress-monitoring chart with goal line]

Response to Intervention

www.interventioncentral.org 77

DIBELS Case Example: Angela: Cont.

• INTERVENTION CHECK-UP. At the end of 6 weeks, Angela had made ‘promising’ progress but had not quite hit her intervention goal of 85 WPM.

• The school kept Angela in the Corrective Reading program but decided to add an intervention component. A high school student was recruited and trained in Paired Reading. The tutor met with Angela 3 times per week for 25 minutes and used the Paired Reading strategy. Additionally, Angela’s parent was recruited to use Paired Reading at home at least 2 times per week. The intervention goal was reset to 94 WPM.

Response to Intervention

www.interventioncentral.org 78

Paired Reading

The student reads aloud in tandem with an accomplished reader. At a student signal, the helping reader stops reading, while the student continues on. When the student commits a reading error, the helping reader resumes reading in tandem.

Response to Intervention

www.interventioncentral.org 79

Response to Intervention

www.interventioncentral.org 80

DIBELS Case Example: Angela: Cont.

• INTERVENTION CHECK-UP 2. At the end of the second 6-week intervention, the reading teacher examined the student’s monitoring data and discovered that she had met her intervention goal of 94 words per minute.

Response to Intervention

www.interventioncentral.org 81

[Progress-monitoring chart with goal lines]

Response to Intervention

www.interventioncentral.org 82

DIBELS Case Example: Angela: Cont.

• While the student had attained success, the school continued the intervention (Corrective Reading group and Paired Reading) for 3 more weeks to continue to strengthen Angela’s reading fluency. The school then discontinued the Tier 2 intervention.

• Although Angela’s teacher admitted that she was a bit anxious about the student’s ability to maintain success without the Tier 2 intervention, she was reassured that Angela would immediately be given RTI intervention support again if she were to be flagged as ‘at risk’ in a future grade-wide reading screening.

Response to Intervention

www.interventioncentral.org 83

Formative Assessment: Culminating Team Activity

As a team:

• Discuss the formative assessment concepts, tools, and resources reviewed in this workshop.

• What are the key ‘next steps’ that your team will take to follow up on this workshop?

• What additional questions do you have on the topic of formative evaluation?