
Page 1

Evaluating the ‘RTI Readiness’ of School Assessments

Jim Wright, www.interventioncentral.org

Page 2

Page 3

Interpreting the Results of This Survey
• YES to Items 1-3. Background. The measure gives valid general information about the student's academic skills and performance. While not sufficient on its own, the data can be interpreted as part of a larger collection of student data.
• YES to Items 4-5. Baseline. The measure gives reliable results when given by different people and at different times of the day or week. Therefore, the measure can be used to collect a current 'snapshot' of the student's academic skills prior to starting an intervention.
• YES to Items 6-7. Goal-Setting. The measure includes standards (e.g., benchmarks or performance criteria) for 'typical' student performance (e.g., at a given grade level) and guidelines for estimating rates of student progress. Schools can use the measure to assess the gap in performance between a student and grade-level peers, and also to estimate expected rates of student progress during an intervention (see the sketch below).
• YES to Items 8-11. Progress Monitoring. The measure has the appropriate qualities to be used to track student progress in response to an intervention.
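The gap analysis and goal-setting described under Items 6-7 amount to simple arithmetic once a measure supplies a benchmark and an expected weekly growth rate. Below is a minimal illustrative sketch in Python; the function names and all numbers are placeholders of mine, not published norms from any measure.

```python
# Hypothetical sketch (not from the original handout): turning a measure's
# benchmark and growth norms into a gap analysis and an intervention goal.
# All numbers are placeholder values, not published norms for any measure.

def performance_gap(student_score: float, benchmark: float) -> float:
    """Gap between the grade-level benchmark and the student's baseline score."""
    return benchmark - student_score

def projected_goal(baseline: float, weekly_growth: float, weeks: int) -> float:
    """Goal = baseline plus the expected growth over the monitoring period."""
    return baseline + weekly_growth * weeks

baseline_wcpm = 42.0       # student's baseline (e.g., words correct per minute)
benchmark_wcpm = 72.0      # 'typical' grade-level performance from the measure's norms
growth_per_week = 1.2      # expected weekly gain taken from the measure's guidelines
weeks_of_intervention = 8

print(performance_gap(baseline_wcpm, benchmark_wcpm))                         # 30.0
print(projected_goal(baseline_wcpm, growth_per_week, weeks_of_intervention))  # 51.6
```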

Page 4

Background: Validity
• Content Validity. Does the measure provide meaningful information about the academic skill of interest?
• Convergent Validity. Does the measure yield results that are generally consistent with other well-regarded tests designed to measure the same academic skill?
• Predictive Validity. Does the measure predict student success on an important future test, task, or other outcome?

Page 5

Baseline: Reliability
• Test-Retest/Alternate-Form Reliability. Does the measure have more than one version or form? If two alternate, functionally equivalent versions of the measure are administered to the student, does the student perform about the same on both?
• Interrater Reliability. When two different evaluators observe the same student's performance and independently use the measure to rate that performance, do they come up with similar ratings?

Page 6

Benchmarks & Goal-Setting
• Performance Benchmarks. Does the measure include benchmarks or other performance criteria that indicate typical or expected student performance in the academic skill?
• Goal-Setting. Does the measure include guidelines for setting specific goals for improvement?

Page 7

Progress-Monitoring and Instructional Impact
• Repeated Assessments. Does the measure have sufficient alternative forms to assess the student weekly for at least 20 weeks?
• Equivalent Alternate Forms. Are the measure's repeated assessments (alternative forms) equivalent in content and level of difficulty?
• Sensitive to Short-Term Student Gains. Is the measure sensitive to short-term improvements in student academic performance?
• Positive Impact on Learning. Does research show that the measure gives teachers information that helps them to make instructional decisions that positively impact student learning?

Page 8

Team Activity: Evaluate the ‘RTI Readiness’ of Your School’s Academic Measures

Directions: Select one important literacy measure used by your school. On the form Evaluate the ‘RTI Readiness’ of Your School’s Academic Measures (next page), evaluate the ‘RTI readiness’ of this measure. Be prepared to share your results with the group.

Page 9

A Review of RTI Literacy Assessment/Monitoring Tools
Jim Wright, www.interventioncentral.org

Page 10

RTI Literacy: Assessment & Progress-Monitoring (Cont.) To measure student ‘response to instruction/intervention’ effectively, the RTI Literacy model measures students’ reading performance and progress on schedules matched to each student’s risk profile and intervention Tier membership.

• Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of literacy assessments.

• Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention.

• Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 reading intervention are assessed at least once per week.

Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
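As a quick illustration of how the three schedules above might be encoded in a school's data system, here is a small sketch; the tier labels and wording are mine, summarizing the bullets above rather than quoting Burns & Gibbons (2008).

```python
# Illustrative only: the tier-to-schedule mapping described above, expressed as
# a simple lookup. Labels and wording are mine, not a published specification.

MONITORING_SCHEDULE = {
    "Tier 1": "Benchmarking/universal screening at least 3 times per year",
    "Tier 2": "Strategic monitoring 1-2 times per month",
    "Tier 3": "Intensive monitoring at least once per week",
}

def schedule_for(tier: str) -> str:
    """Return the assessment frequency matched to a student's tier membership."""
    return MONITORING_SCHEDULE.get(tier, "Unknown tier")

print(schedule_for("Tier 3"))  # Intensive monitoring at least once per week
```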

Page 11

Curriculum-Based Measurement: Advantages as a Set of Tools to Monitor RTI/Academic Cases

• Aligns with curriculum goals and materials
• Is reliable and valid (has 'technical adequacy')
• Is criterion-referenced: sets specific performance levels for specific tasks
• Uses standard procedures to prepare materials, administer, and score
• Samples student performance to give objective, observable 'low-inference' information about student performance
• Has decision rules to help educators interpret student data and make appropriate instructional decisions
• Is efficient to implement in schools (e.g., training can be done quickly; the measures are brief and feasible for classrooms)
• Provides data that can be converted into visual displays for ease of communication

Source: Hosp, M.K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.

Page 12

Page 13

CBM Literacy Measures: Sources
• DIBELS (https://dibels.uoregon.edu/)
• AimsWeb (http://www.aimsweb.com)
• Easy CBM (http://www.easycbm.com)
• iSteep (http://www.isteep.com)
• EdCheckup (http://www.edcheckup.com)
• Intervention Central (http://www.interventioncentral.org)

Page 14

Reading: 5 Big Ideas
• Phonemic Awareness/Specific Subskill Mastery
• Alphabetics
• Fluency with Text
• Vocabulary
• Comprehension

Page 15

Initial Sound Fluency (ISF)
• A "standardized, individually administered measure of phonological awareness that assesses a child's ability to recognize and produce the initial sound in an orally presented word. The examiner presents four pictures to the child, names each picture, and then asks the child to identify (i.e., point to or say) the picture that begins with the sound produced orally by the examiner."
• Time: About 3 minutes

SOURCE: Good et al. (2002) DIBELS administration and scoring guide. https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf

Page 16

Reading: 5 Big Ideas
• Phonemic Awareness/Specific Subskill Mastery
• Alphabetics
• Fluency with Text
• Vocabulary
• Comprehension

Page 17

Phoneme Segmentation Fluency (PSF)
• "Assesses a student's ability to segment three- and four-phoneme words into their individual phonemes fluently. The PSF task is administered by the examiner orally presenting words of three to four phonemes. It requires the student to produce verbally the individual phonemes for each word."
• Time: 1 minute

SOURCE: Good et al. (2002) DIBELS administration and scoring guide. https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf

Page 18

Reading: 5 Big Ideas
• Phonemic Awareness
• Alphabetics/Specific Subskill Mastery
• Fluency with Text
• Vocabulary
• Comprehension

Page 19

Letter Naming Fluency (LNF)
• "Students are presented with a page of upper- and lower-case letters arranged in a random order and are asked to name as many letters as they can."
• Time: 1 minute

SOURCE: Good et al. (2002) DIBELS administration and scoring guide. https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf

Page 20

Reading: 5 Big Ideas
• Phonemic Awareness
• Alphabetics/Specific Subskill Mastery
• Fluency with Text
• Vocabulary
• Comprehension

Page 21

Reading: 5 Big Ideas
• Phonemic Awareness
• Alphabetics/Specific Subskill Mastery
• Fluency with Text
• Vocabulary
• Comprehension

Page 22

Nonsense Word Fluency (NWF)
• Tests the "alphabetic principle – including letter-sound correspondence and of the ability to blend letters into words in which letters represent their most common sounds. The student is presented a sheet of paper with randomly ordered VC and CVC nonsense words (e.g., sig, rav, ov) and asked to produce verbally the individual letter sound of each letter or verbally produce, or read, the whole nonsense word."
• Time: 1 minute

SOURCE: Good et al. (2002) DIBELS administration and scoring guide. https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf

Page 23

Reading: 5 Big Ideas
• Phonemic Awareness
• Alphabetics/Specific Subskill Mastery
• Fluency with Text
• Vocabulary
• Comprehension

Page 24

Reading: 5 Big Ideas
• Phonemic Awareness
• Alphabetics
• Fluency with Text/General Outcome Measure
• Vocabulary
• Comprehension

Page 25

Oral Reading Fluency (ORF)
• "Student performance is measured by having students read a passage aloud for one minute. Words omitted, substituted, and hesitations of more than three seconds are scored as errors. Words self-corrected within three seconds are scored as accurate. The number of correct words per minute from the passage is the oral reading fluency rate."
• Time: 1 minute

SOURCE: Good et al. (2002) DIBELS administration and scoring guide. https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf
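The ORF scoring rule quoted above reduces to simple arithmetic: words read in one minute, minus errors, gives words correct per minute (WCPM). A minimal sketch of that calculation follows; the function and variable names are mine, and this is not DIBELS scoring software.

```python
# Illustrative sketch of the oral reading fluency arithmetic described above;
# names are my own, and this is not DIBELS scoring software.

def words_correct_per_minute(words_read: int, errors: int, minutes: float = 1.0) -> float:
    """WCPM = (words read - errors) / minutes; ORF passages are read for 1 minute."""
    if minutes <= 0:
        raise ValueError("minutes must be positive")
    return (words_read - errors) / minutes

# Example: a student reads 63 words in 1 minute with 5 scored errors
# (omissions, substitutions, or hesitations longer than 3 seconds).
print(words_correct_per_minute(63, 5))  # 58.0
```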

Page 26

Reading: 5 Big Ideas
• Phonemic Awareness
• Alphabetics
• Fluency with Text
• Vocabulary
• Comprehension/General Outcome Measure

Page 27

Comparison of RTI Assessment/Monitoring Systems

DIBELS [Dynamic Indicators of Basic Early Literacy Skills]
• Initial Sound Fluency: Preschool > Middle K
• Letter Naming Fluency: Beginning K > Beginning Gr 1
• Phoneme Segmentation Fluency: Middle K > End Gr 1
• Nonsense Word Fluency: Middle K > Beginning Gr 2
• Oral Reading Fluency: Middle Gr 1 > Gr 6

Page 28

Comparison of RTI Assessment/Monitoring Systems

Easy CBM
• Letter Naming Fluency: K > Gr 1
• Letter Sound Fluency: K > Gr 1
• Phoneme Segmentation Fluency: K > Gr 1
• Word Reading Fluency: K > Gr 3
• Oral Reading Fluency: Gr 1 > Gr 8

Page 29

Comparison of RTI Assessment/Monitoring Systems

AimsWeb
• Letter Naming Fluency: Beginning K > Beginning Gr 1
• Letter Sound Fluency: Middle K > Beginning Gr 1
• Phoneme Segmentation Fluency: Middle K > Middle Gr 1
• Nonsense Word Fluency: Middle K > End Gr 1
• Oral Reading Fluency: Gr 1 > Gr 8
• Maze (Reading Comprehension Fluency): Gr 1 > Gr 8

Page 30

Comparison of 2 RTI Assessment/Monitoring Systems

DIBELS
• Initial Sound Fluency: Preschool > Middle K
• Letter Naming Fluency: Beginning K > Beginning Gr 1
• Phoneme Segmentation Fluency: Middle K > End Gr 1
• Nonsense Word Fluency: Middle K > Beginning Gr 2
• Oral Reading Fluency: Middle Gr 1 > Gr 6

AimsWeb
• Letter Naming Fluency: Beginning K > Beginning Gr 1
• Letter Sound Fluency: Middle K > Beginning Gr 1
• Phoneme Segmentation Fluency: Middle K > Middle Gr 1
• Nonsense Word Fluency: Middle K > End Gr 1
• Oral Reading Fluency: Gr 1 > Gr 8
• Maze (Reading Comprehension Fluency): Gr 1 > Gr 8

Page 31

‘Elbow Group’ Activity: ‘RTI-Ready’ Literacy Measures

In your ‘elbow groups’:

• Review the set of CBM literacy assessment tools in the handout.

• Select a 'starter' set of literacy measures by grade level that you would like your school to adopt. (If your school already has a standard set of CBM literacy tools, discuss ways to optimize its use.)

Page 32

CBM: Developing a Process to Collect Local Norms
Jim Wright, www.interventioncentral.org

Page 33

RTI Literacy: Assessment & Progress-Monitoring To measure student ‘response to instruction/intervention’ effectively, the RTI model measures students’ academic performance and progress on schedules matched to each student’s risk profile and intervention Tier membership.

• Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of academic assessments.

• Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention.

• Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 intervention are assessed at least once per week.

Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Page 34

Local Norms: Screening All Students (Stewart & Silberglit, 2008)

Local norm data in basic academic skills are collected at least 3 times per year (fall, winter, spring).

• Schools should consider using ‘curriculum-linked’ measures such as Curriculum-Based Measurement that will show generalized student growth in response to learning.

• If possible, schools should consider avoiding ‘curriculum-locked’ measures that are tied to a single commercial instructional program.

Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
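Once all students in a grade have been screened, each score can be expressed as a local percentile rank against grade-level peers. The sketch below illustrates that computation with made-up scores; in practice, districts typically get these ranks from their screening vendor's reports.

```python
# Rough sketch: converting one student's screening score into a local percentile
# rank against grade-level peers. Scores below are made-up example data.

def local_percentile(score: float, peer_scores: list[float]) -> float:
    """Percent of peer scores at or below the student's score."""
    if not peer_scores:
        raise ValueError("peer_scores must not be empty")
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

fall_orf_scores = [12, 25, 31, 40, 44, 52, 58, 63, 70, 85]  # grade-level fall screening
print(local_percentile(31, fall_orf_scores))  # 30.0 -> 30th local percentile
```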

Page 35

Local Norms: Using a Wide Variety of Data (Stewart & Silberglit, 2008)

Local norms can be compiled using:
• Fluency measures such as Curriculum-Based Measurement.
• Existing data, such as office disciplinary referrals.
• Computer-delivered assessments, e.g., Measures of Academic Progress (MAP) from www.nwea.org.

Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

Page 36

Measures of Academic Progress (MAP)
www.nwea.org

Page 37

Applications of Local Norm Data (Stewart & Silberglit, 2008)

Local norm data can be used to:
• Evaluate and improve the current core instructional program.
• Allocate resources to classrooms, grades, and buildings where student academic needs are greatest.
• Guide the creation of targeted Tier 2 (supplemental intervention) groups.
• Set academic goals for improvement for students on Tier 2 and Tier 3 interventions.
• Move students across levels of intervention, based on performance relative to that of peers (local norms).

Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

Page 38

Local Norms: Supplement With Additional Academic Testing as Needed (Stewart & Silberglit, 2008)

“At the individual student level, local norm data are just the first step toward determining why a student may be experiencing academic difficulty. Because local norms are collected on brief indicators of core academic skills, other sources of information and additional testing using the local norm measures or other tests are needed to validate the problem and determine why the student is having difficulty. … Percentage correct and rate information provide clues regarding automaticity and accuracy of skills. Error types, error patterns, and qualitative data provide clues about how a student approached the task. Patterns of strengths and weaknesses on subtests of an assessment can provide information about the concepts in which a student or group of students may need greater instructional support, provided these subtests are equated and reliable for these purposes.” p. 237

Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

Page 39

Steps in Creating Process for Local Norming Using CBM Measures

1. Identify personnel to assist in collecting data. A range of staff and school stakeholders can assist in the school norming, including:
• Administrators
• Support staff (e.g., school psychologist, school social worker, specials teachers, paraprofessionals)
• Parents and adult volunteers
• Field placement students from graduate programs

Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Page 40

Steps in Creating Process for Local Norming Using CBM Measures

2. Determine method for screening data collection. The school can have teachers collect data in the classroom or designate a team to conduct the screening:

• In-Class: Teaching staff in the classroom collect the data over a calendar week.

• Schoolwide/Single Day: A trained team of 6-10 sets up a testing area, cycles students through, and collects all data in one school day.

• Schoolwide/Multiple Days: Trained team of 4-8 either goes to classrooms or creates a central testing location, completing the assessment over multiple days.

• Within-Grade: Data collectors at a grade level norm the entire grade, with students kept busy with another activity (e.g., video) when not being screened.

Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Page 41

Steps in Creating Process for Local Norming Using CBM Measures

3. Select dates for screening data collection. Data collection should occur at minimum three times per year, in fall, winter, and spring. Consider:
• Avoiding screening dates within two weeks of a major student break (e.g., summer or winter break).
• Coordinating the screenings to avoid state testing periods and other major scheduling conflicts.

Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Page 42

Steps in Creating Process for Local Norming Using CBM Measures

4. Create a preparation checklist. Important preparation steps are carried out, including:
• Selecting the location of the screening
• Recruiting screening personnel
• Ensuring that training occurs for all data collectors
• Lining up data-entry personnel (e.g., for rapid computer data entry)

Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Page 43

Team Activity: Draft a Plan to Conduct an Academic Screening in Your School or District

Directions:
• Discuss a process for collecting screening data three times per year in your school.
• What resources in your school can assist with these screenings?
• What challenges do you anticipate, and how can you overcome them?

Page 44

Monitoring Student Progress at the Secondary Level

Jim Wright, www.interventioncentral.org

Page 45

Universal Screening at Secondary Schools: Using Existing Data Proactively to Flag ‘Signs of Disengagement’

“Across interventions…, a key component to promoting school completion is the systematic monitoring of all students for signs of disengagement, such as attendance and behavior problems, failing courses, off track in terms of credits earned toward graduation, problematic or few close relationships with peers and/or teachers, and then following up with those who are at risk.”

Source: Jimerson, S. R., Reschly, A. L., & Hess, R. S. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 1085-1097). Bethesda, MD: National Association of School Psychologists. p.1090

Page 46

Mining Archival Data: What Are the ‘Early Warning Flags’ of Student Drop-Out?

A sample of 13,000 students in Philadelphia was tracked for 8 years. These early warning indicators, measured during the sixth-grade year, were found to predict student drop-out:
• Failure in English
• Failure in math
• Missing at least 20% of school days
• Receiving an 'unsatisfactory' behavior rating from at least one teacher

Source: Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle grades schools: Early identification and effective interventions. Educational Psychologist, 42, 223-235.

Page 47

What is the Predictive Power of These Early Warning Flags?

Number of 'Early Warning Flags' in Student Record > Probability That Student Would Graduate:
• None: 56%
• 1 flag: 36%
• 2 flags: 21%
• 3 flags: 13%
• 4 flags: 7%

Source: Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle grades schools: Early identification and effective interventions. Educational Psychologist, 42, 223-235.
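A school data team could apply these findings by counting the flags present in a student record and reading off the tabled graduation probability. The sketch below does exactly that; the record fields and thresholds are hypothetical, while the probabilities are those reported by Balfanz et al. (2007).

```python
# Hypothetical sketch: flagging sixth-grade 'early warning' indicators from a
# student record and looking up the graduation probabilities tabled above.
# Record fields and threshold names are assumptions, not from the source.

GRADUATION_PROBABILITY = {0: 0.56, 1: 0.36, 2: 0.21, 3: 0.13, 4: 0.07}

def count_flags(record: dict) -> int:
    """Count the four Balfanz et al. (2007) early warning flags present in a record."""
    flags = [
        record.get("failed_english", False),
        record.get("failed_math", False),
        record.get("attendance_rate", 1.0) <= 0.80,   # missed at least 20% of school days
        record.get("unsatisfactory_behavior_ratings", 0) >= 1,
    ]
    return sum(flags)

student = {"failed_math": True, "attendance_rate": 0.75}
n = count_flags(student)
print(n, GRADUATION_PROBABILITY[n])  # 2 0.21
```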

Page 48

Breaking Down Complex Academic Goals into Simpler Sub-Tasks: Discrete Categorization

Page 49

Identifying and Measuring Complex Academic Problems at the Middle and High School Level

• Students at the secondary level can present with a range of concerns that interfere with academic success.

• One frequent challenge in working with these students is the need to break complex, global academic goals down into discrete sub-skills that can be individually measured and tracked over time.

Page 50

Discrete Categorization: A Strategy for Assessing Complex, Multi-Step Student Academic Tasks

Definition of Discrete Categorization: 'Listing a number of behaviors and checking off whether they were performed' (Kazdin, 1989, p. 59).

• The approach allows educators to define a larger 'behavioral' goal for a student and to break that goal down into sub-tasks. (Each sub-task should be defined in such a way that it can be scored as 'successfully accomplished' or 'not accomplished'.)
• The constituent behaviors that make up the larger behavioral goal need not be directly related to each other. For example, 'completed homework' may include as sub-tasks 'wrote down homework assignment correctly' and 'created a work plan before starting homework'.

Source: Kazdin, A. E. (1989). Behavior modification in applied settings (4th ed.). Pacific Grove, CA: Brooks/Cole.
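Because each sub-task is scored as either accomplished or not, daily checklist data reduce to a count or a percentage. A minimal sketch of that scoring follows; the checklist items and function names are illustrative, not drawn from Kazdin (1989).

```python
# Minimal illustrative sketch of discrete categorization scoring: each sub-task
# is checked off as accomplished (True) or not (False), then summarized as a
# percentage. Item wording here is illustrative only.

def percent_accomplished(checklist: dict[str, bool]) -> float:
    """Percentage of listed sub-tasks checked off as accomplished."""
    if not checklist:
        raise ValueError("checklist must not be empty")
    return 100.0 * sum(checklist.values()) / len(checklist)

monday = {
    "Wrote down homework assignment correctly": True,
    "Created a work plan before starting homework": False,
    "Reviewed class notes": True,
}
print(round(percent_accomplished(monday), 1))  # 66.7
```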

Page 51

Discrete Categorization Example: Math Study Skills
General Academic Goal: Improve Tina's Math Study Skills

Tina was struggling in her mathematics course because of poor study skills. The RTI Team and math teacher analyzed Tina's math study skills and decided that, to study effectively, she needed to:
• Check her math notes daily for completeness.
• Review her math notes daily.
• Start her math homework in a structured school setting.
• Use a highlighter and 'margin notes' to mark questions or areas of confusion in her notes or on the daily assignment.
• Spend sufficient 'seat time' at home each day completing homework.
• Regularly ask math questions of her teacher.

Page 52

Discrete Categorization Example: Math Study Skills
General Academic Goal: Improve Tina's Math Study Skills

The RTI Team, with teacher and student input, created the following intervention plan. The student Tina will:
• Approach the teacher at the end of class for a copy of the class notes.
• Check her daily math notes for completeness against a set of teacher notes in 5th period study hall.
• Review her math notes in 5th period study hall.
• Start her math homework in 5th period study hall.
• Use a highlighter and 'margin notes' to mark questions or areas of confusion in her notes or on the daily assignment.
• Enter into her 'homework log' the amount of time spent that evening doing homework and note any questions or areas of confusion.
• Stop by the math teacher's classroom during help periods (T & Th only) to ask highlighted questions (or to verify that she understood that week's instructional content) and to review the homework log.

Page 53

Discrete Categorization Example: Math Study Skills
Academic Goal: Improve Tina's Math Study Skills

General measures of the success of this intervention include (1) rate of homework completion and (2) quiz & test grades.

To measure treatment fidelity (Tina's follow-through with sub-tasks of the checklist), the following strategies are used:
• Approached the teacher for a copy of class notes: teacher observation.
• Checked her daily math notes for completeness, reviewed math notes, and started math homework in 5th period study hall: student work products; random spot check by the study hall supervisor.
• Used a highlighter and 'margin notes' to mark questions or areas of confusion in her notes or on the daily assignment: review of notes by the teacher during the T/Th drop-in period.
• Entered into her 'homework log' the amount of time spent that evening doing homework and noted any questions or areas of confusion: log reviewed by the teacher during the T/Th drop-in period.
• Stopped by the math teacher's classroom during help periods (T & Th only) to ask highlighted questions (or to verify that Tina understood that week's instructional content): teacher observation; student sign-in.