Supporting Students with Additional Needs in an RTI System
Jon Potter, Ph.D., Lisa Bates, Ph.D., David Putnam, Ph.D.
Oregon RTI Project
OSPA Conference, Fall 2012
Afternoon Targets
Tier 2/3: Using data to place students in interventions (literacy) & evaluating intervention effectiveness
Tier 3: Individual Problem Solving
What is your role in ensuring the right students receive the right support at the right time?
School Psychologists’ Role
Assessment • Consultation • Program Evaluation
“RTI calls for early identification of learning and behavioral needs, close collaboration among classroom teachers and special education personnel and parents, and a systemic commitment to locating and employing the necessary resources to ensure that students make progress in the general education curriculum.”
– NASP School Psych Role and RTI Fact Sheet
Using screening data to match interventions to student need (Literacy)
Which students receive interventions?
• Schoolwide/Districtwide decision rules should determine which students will receive additional support
  – Based on schoolwide screening data (DIBELS, easyCBM, AIMSWEB, etc.)
  – Based on available resources and system capacity
    • Lowest 20%? 30%?
    • All students well below benchmark?
Assessment

Decision rules guide placement in interventions
Example (60 2nd grade students), using DIBELS Next or easyCBM screening data: lowest 20%, all high risk, lowest 25%, or all below and well below benchmark
Linking Assessment to Intervention
Screening data → Instructional need → Intervention program
Instructional needs map to the Big 5 of Reading: Phonemic Awareness, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, Vocabulary, Reading Comprehension
Some students will need more.
Logistics
• When do these types of discussions typically take place?
  – Initial intervention placement meetings after schoolwide screenings (3x per year)
  – May also discuss every 6-8 weeks when reviewing student progress
Consultation
Ensuring an Instructional Match
Question 1: What is the skill deficit?
Question 2: How big is that deficit?
Question 3: What interventions address that deficit?
Question 4: How do we implement the program?
Question 1: What is the skill deficit?
The Big 5 of Reading: Phonemic Awareness, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, Vocabulary, Reading Comprehension
Assessment
The Big 5 of Reading: Phonemic Awareness, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, Vocabulary, Reading Comprehension
Common Screening Data Sources

Reading Comprehension – DIBELS Next: RTF, Daze | easyCBM: MC Reading Comp | AIMSWEB: Maze, Reading CBM
Oral Reading Fluency & Accuracy – DIBELS Next: ORF CWPM, ORF Acc % | easyCBM: PRF, WRF | AIMSWEB: Reading CBM
Phonics (Alphabetic Principle) – DIBELS Next: ORF Acc %, NWF WWR, NWF CLS | easyCBM: PRF Acc %, Letter Sounds | AIMSWEB: R-CBM Acc %, NWF, LSF
Phonemic Awareness – DIBELS Next: PSF, FSF | easyCBM: Phoneme Segmenting | AIMSWEB: Phoneme Segmentation

*easyCBM includes a Vocabulary measure

CBM measures are linked to the Big 5 of Reading
Assessment
[DIBELS Next Class List Report (2nd Grade – Fall), with measures mapped to the Big 5 skill areas]
[easyCBM Class List Report (2nd Grade – Fall), with measures mapped to the Big 5 skill areas]
How skills build on each other

Activity: Oral Reading Fluency Assessment
• Find a partner
  – Partner 1 (person with the next birthday) – Reader
  – Partner 2 – Test Administrator
• Administer the reading assessment, and have the reader answer the questions
Words missed per page when accuracy is…

                                         95%     98%     99%
The Secret Life of Bees (7th grade)      18.5    7.4     3.6
My Brother Sam is Dead (5th-6th grade)   15      6       3
The Magic School Bus (2nd-3rd grade)     6       2.4     1.2
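The rows above are just page length times error rate: words missed per page ≈ words per page × (1 − accuracy). A minimal sketch of that arithmetic, assuming illustrative page lengths of roughly 370, 300, and 120 words (the slides do not give the actual counts):

```python
def words_missed_per_page(words_per_page: int, accuracy: float) -> float:
    """Expected misses per page = words on the page * error rate."""
    return words_per_page * (1.0 - accuracy)

# Page lengths are assumptions chosen only to illustrate the table above.
books = {
    "The Secret Life of Bees (7th grade)": 370,
    "My Brother Sam is Dead (5th-6th grade)": 300,
    "The Magic School Bus (2nd-3rd grade)": 120,
}

for title, length in books.items():
    misses = [round(words_missed_per_page(length, acc), 1) for acc in (0.95, 0.98, 0.99)]
    print(title, misses)   # e.g. [18.5, 7.4, 3.7] for the first book
```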
Phonics and accuracy are important
Accuracy is more important than fluency
Accurate at Skill → Fluent at Skill → Able to Apply Skill
• If not accurate, teach the skill. If accurate, move to fluency.
• If not fluent, teach fluency/automaticity. If fluent, move to application.
• If not able to apply, teach application. If able to apply, move to a higher-level skill/concept.
Adapted from
The Big 5 of Reading: Phonemic Awareness, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, Vocabulary, Reading Comprehension
Within each skill: Accuracy → Fluency → Application
Phonics Example: Nonsense Word Fluency
Accuracy – Accurate at Skill: Student knows all letter sounds and makes few, if any, mistakes
Fluency – Fluent at Skill: Student knows all letter sounds AND provides letter sounds fluently
Application – Able to Apply Skill: Student automatically blends letter sounds into whole words
[Sample NWF probe scores illustrating the continuum: 35/56 letter sounds correct = 63%; 35/36 letter sounds correct = 97%; 54/54 letter sounds correct = 100%]
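One way to see how these probe numbers feed the accuracy → fluency → application logic is a small decision helper. This is a sketch only: the 95% accuracy and 40 correct-letter-sound cut points, and the whole-word counts, are placeholder assumptions rather than criteria from the slides:

```python
def nwf_accuracy(correct: int, attempted: int) -> float:
    """Percent of attempted letter sounds produced correctly."""
    return 100.0 * correct / attempted

def next_instructional_focus(correct: int, attempted: int, whole_words_read: int,
                             accuracy_cut: float = 95.0, fluency_cut: int = 40) -> str:
    """Walk the accuracy -> fluency -> application sequence for one 1-minute NWF probe.

    correct/attempted count letter sounds; whole_words_read counts words the student
    blended as whole units. The cut points are illustrative assumptions only.
    """
    if nwf_accuracy(correct, attempted) < accuracy_cut:
        return "teach the skill (accuracy)"
    if correct < fluency_cut:
        return "build fluency/automaticity"
    if whole_words_read == 0:
        return "teach application (blending sounds into whole words)"
    return "move to a higher-level skill/concept"

# The three sample probes from the slides (whole-word counts are assumed):
print(nwf_accuracy(35, 56))                  # 62.5 -> ~63% as shown on the slide
print(next_instructional_focus(35, 56, 0))   # teach the skill (accuracy)
print(next_instructional_focus(35, 36, 0))   # 97% accurate but slow -> build fluency
print(next_instructional_focus(54, 54, 18))  # accurate, fluent, blending -> move on
```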
Validating the deficit
• CBM measures (DIBELS, easyCBM, AIMSWEB, etc.) are “indicators”
• What does your other data tell you?
  – In-curriculum assessments
  – Other CBM data
  – OAKS
Assessment
Question 2: How big is that deficit?
Is the skill low or significantly low?
• You must define what is low and what is significantly low:

Examples      Low                                Significantly low     Compared to…
DIBELS Next   Below benchmark                    Well below benchmark  a research-based standard
easyCBM*      Between 11th and 20th percentile   ≤10th percentile      other students
AIMSWEB**     Between 11th and 25th percentile   ≤10th percentile      other students or a standard you set

*easyCBM default percentile rank settings
**AIMSWEB default percentile rank settings
Question 3: What interventions address that deficit?
Program Evaluation
What intervention programs does your school have that address the skill need(s)?

Skill areas: Phonemic Awareness | Phonics | Oral Reading Accuracy & Fluency | Vocab | Reading Comp
Programs: Triumphs, Phonics for Reading, Read Naturally, STARS, SFA Tutoring, Reading Mastery, Language for Thinking, Horizons
Additional resources for evaluating interventions
• What Works Clearinghouse – http://ies.ed.gov/ncee/wwc/
• Florida Center for Reading Research – http://stage.fcrr.org/fcrrreports/CReportsCS.aspx?rep=supp
• Oregon Reading First – http://oregonreadingfirst.uoregon.edu/inst_curr_review_si.html
• Best Evidence Encyclopedia – http://www.bestevidence.org/
Question 4: How do we implement the program?
Consultation
Placement Tests
Once an intervention program that addresses the instructional need is identified, placement tests should be used to form instructional groups of students.
Other considerations
• Available resources (time, staff, materials) will guide how many groups are created
• Consider the behavioral and social/emotional needs of the students
Additional Diagnostic Data
• Diagnostic assessment in the critical area of need:
  – Quick Phonics Screener
  – Curriculum-Based Evaluation
  – CORE Multiple Measures
  – DIBELS booklets error patterns
  – Running Records
  – Other?
With your partner
• What other data sources do you currently use, or are available to you, to help match interventions to student need?
  – Reading
  – Math
  – Writing
  – Behavior
Documentation
[Example documentation form entry for Johnny: Phonics (in text), Quick Phonics Screener, Oral Reading Fluency, Reading Mastery 2]
Evaluating Interventions
What’s the Big Idea(s)!?
• Use appropriate progress monitoring tools
• Set goals
• Establish decision rules
• Analyze data, apply decision rules, and determine what to change
Progress Monitoring Tools
• Brief & easy
• Sensitive to growth
• Frequent
• Equivalent forms!!!
What are some commonly used progress monitoring tools?

Reading
– AIMSWEB: Reading CBM, Maze
– DIBELS Next: FSF, PSF, NWF, ORF, Daze
– easyCBM: PSF, LSF, WRF, PRF, MC Reading Comp, Vocab

Math
– AIMSWEB: M-Computation, M-Concepts & Applications, CBM-Early Numeracy
– easyCBM: Numbers & Operations, Measurement, Geometry, Algebra

Written Language
– Writing-CBM (Total Words Written, Correct Writing Sequences, Words Spelled Correctly)
What are NOT good progress monitoring tools?

Reading: phonics screeners, report cards, OAKS, DRA, running records, reading curriculum weekly or monthly tests or fluency passages
Math: curriculum weekly tests, teacher-created math probes*, OAKS
Written Language: writing rubrics*, OAKS

* when not administered and scored in a standardized and reliable way, or not checked for consistency across multiple probes
• Oral Reading Fluency and Accuracy in reading connected text is one of the best indicators of overall reading comprehension (Fuchs, Fuchs, Hosp, & Jenkins, 2001)
Fluent & accurate reading is not the end goal… but a child who cannot read fluently and accurately cannot fully comprehend written text.
Do we have the right “indicators”?
Additional Progress Monitoring Tools
For more info and a review of available tools, visit www.rti4success.org (Progress Monitoring Tools Chart)
Goal Setting: Things to Consider
1. What is the goal?
   – Criterion-based: research-based benchmarks/proficiency
   – Norm-based: minimum of the 25th percentile (bottom limit of average); school, district, state, or national norms
How do you define success?
Goal Setting: Things to Consider
2. By when will they get there?
   – Long-term goals: always at proficiency (i.e., grade placement benchmark)
   – Short-term goals: may be an incremental step toward proficiency (i.e., instructional-level material)
Does your goal close the gap?
Goal Setting: Things to Consider
3. What does reasonable growth look like?
   – National growth rates (Fuchs, AIMSWEB, Hasbrouck & Tindal)
   – Local growth rates (district, school, classroom, intervention group)
What progress can we expect?
National Growth Rates: Reading

Grade   Average ORF Growth (WCPM)*   Ambitious ORF Growth (WCPM)*   Average Maze Growth (WCR)**
1       2.0                          3.0                            0.4
2       1.5                          2.0                            0.4
3       1.0                          1.5                            0.4
4       0.85                         1.1                            0.4
5       0.5                          0.8                            0.4
6       0.3                          0.65                           0.4

*Fuchs et al (1993), **Fuchs & Fuchs (2004)
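These per-week rates turn directly into a goal: projected score = baseline + (weeks of intervention × expected weekly growth). A minimal sketch of that arithmetic; the 2nd-grade student reading 32 WCPM and the 18-week timeline are hypothetical values chosen for illustration:

```python
# Weekly ORF growth rates (WCPM per week) from the table above (Fuchs et al., 1993).
AVERAGE_ORF_GROWTH = {1: 2.0, 2: 1.5, 3: 1.0, 4: 0.85, 5: 0.5, 6: 0.3}
AMBITIOUS_ORF_GROWTH = {1: 3.0, 2: 2.0, 3: 1.5, 4: 1.1, 5: 0.8, 6: 0.65}

def orf_goal(baseline_wcpm: float, grade: int, weeks: int, ambitious: bool = False) -> float:
    """Project an oral reading fluency goal from a baseline score and expected weekly growth."""
    rate = (AMBITIOUS_ORF_GROWTH if ambitious else AVERAGE_ORF_GROWTH)[grade]
    return baseline_wcpm + weeks * rate

# Hypothetical 2nd grader reading 32 WCPM with 18 weeks of intervention remaining.
print(orf_goal(32, grade=2, weeks=18))                  # 59.0 WCPM with average growth
print(orf_goal(32, grade=2, weeks=18, ambitious=True))  # 68.0 WCPM with ambitious growth
```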
“Using national normative samples allows comparisons to be made with the performance levels expected of typical performing students from across the country and equates more closely with data sets that are used in well developed, published, norm-referenced tests.”
Shapiro, 2008
Local Growth Rates
What does typical growth look like in…
…your district?…your school?…your classroom?…your intervention group?
“…use of the combination of local and national norms provides the user of these data with opportunities to evaluate how student performance compares with a national sample of same-grade peers, as well as against the local peers within the particular school.”
Shapiro, 2008
Setting Appropriate Goals Is Important
[Progress monitoring graph, y-axis: Oral Reading Fluency (Words Correct Per Minute); goals of 36 WCPM vs. 18 WCPM shown relative to the benchmark]
Decision Rules
• Decision rules guide how we decide if our interventions are working and when to make a change
• Your decision rules create consistency across grade levels and schools
• Determine how to intensify and individualize interventions
• Standardize the process for eligibility decision making
Key features of decision rules
• Set the grade levels for the decision rules (K, 1-6)
• Number of points below the aimline (see the sketch after this list)
• Give direction if the data is highly variable (trendline analysis)
• Duration of intervention / frequency of monitoring (length of time between meetings, e.g., 6 to 8 weeks)
• Define success
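To make the aimline and the "points below the aimline" idea concrete, here is a minimal sketch. The four-point rule, the weekly monitoring schedule, and the example scores are illustrative assumptions, not values prescribed by the slides:

```python
def aimline(baseline: float, goal: float, total_weeks: int):
    """Expected score for each week: a straight line from baseline to goal."""
    weekly_gain = (goal - baseline) / total_weeks
    return [baseline + weekly_gain * week for week in range(total_weeks + 1)]

def needs_change(scores, expected, consecutive_points: int = 4) -> bool:
    """Simple decision rule: flag the intervention if the last N
    progress-monitoring scores all fall below the aimline."""
    recent = list(zip(scores, expected))[-consecutive_points:]
    return len(recent) == consecutive_points and all(s < e for s, e in recent)

# Hypothetical 18-week plan: baseline 32 WCPM, goal 59 WCPM, weekly monitoring.
expected = aimline(32, 59, 18)
observed = [32, 33, 35, 34, 36, 36, 37, 38]   # first 8 weeks of scores (made up)
print(needs_change(observed, expected[:len(observed)]))  # True -> consider a change
```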
Evaluating Interventions: Is What We Are Doing Working?
AAA
• Apply decision rules: Is the student making adequate progress based on the decision rules?
• Analyze: Is it an individual or a group problem?
• Action: Determine what to change
Apply: Is the Student Making Adequate Progress?
[Progress monitoring graph: Chase’s December–June scores plotted against the aimline]
Analyze: Is it an Individual or a Group Problem?
Cohort group analysis (see the sketch below): compare students who have similar literacy programming:
– Grade level
– Intervention program
– Time
– ELD level
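One hedged way to operationalize the individual-vs-group question is to compare the cohort’s median score against the aimline, then look at who falls below it. The scores and the 40 WCPM aimline value below are illustrative only; the student names come from the cohort graph that follows:

```python
from statistics import median

# Latest progress-monitoring scores (WCPM) for one cohort -- values are made up.
cohort = {"Amy": 48, "Mary": 45, "Isaiah": 42, "Chase": 28}
aimline_now = 40   # where the aimline says the group should be this week (assumed)

group_median = median(cohort.values())
if group_median < aimline_now:
    print("Group problem: most of the cohort is below the aimline; revisit the intervention itself.")
else:
    behind = [name for name, score in cohort.items() if score < aimline_now]
    print(f"Individual problem: cohort median {group_median} is on track; follow up with {behind}.")
```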
[Cohort data graph: December–June scores for Amy, Chase, Mary, and Isaiah plotted against the aimline]
Action: Determine What to Change
• Listen to the data
• Gather additional data if necessary
• Focus on instructional variables that you can control!
Focus on what we can control
What do we change?
• Time/Engagement
• Group Size
• Different Program
• Individual Problem-Solving
A Final Thought
“It’s better to shoot for the stars and miss than aim at the gutter and hit it.” – Anonymous
Break Time
Individual Problem Solving
OSPA Fall Conference, Oregon RTI Project, October 12th, 2012
Targets
• Provide a framework for how to individually problem-solve for students with the most intensive needs
“It is better to know some of the questions than all of the answers.” – James Thurber
Problem-Solving Non-example
Who are students with the most intensive needs?
• Students with identified disabilities
• Students who may have a disability
• Students with significant literacy deficits
If there was a problem…
Why proactive problem solving?
“Problem solving assessment typically takes a more direct approach to the measurement of need than has been the case in historical special education practice” -Reschly, Tilly, & Grimes (1999)
“Intervention studies that address the bottom 10-25% of the student population may reduce the number of at-risk students to rates that approximate 2-6%” -Fletcher, Lyon, Fuchs, & Barnes (2007)
The Problem Solving Process

1. Problem Identification: What is the problem?
2. Problem Analysis: Why is the problem occurring?
3. Plan Development: What are we going to do about the problem?
4. Plan Implementation & Evaluation: How is it working?

→ Improved Student Achievement
Problem Solving Form
Step 1: Problem Identification
What is the problem?
Step 1: Problem Identification
A problem is defined as a discrepancy between expected performance and current performance (the problem definition).
Step 1: Problem Identification
• Expected performance is based on data:
  – Performance of typical/average peers
  – Research-based benchmarks
  – Proficiency scores
• Actual performance is based on current student data
Step 1: Problem Identification
• Calculating the magnitude of discrepancy

Absolute discrepancy: Expected performance vs. Current performance
  72 wcpm (Winter 2nd Grade expectation) vs. 32 wcpm current = a 40 wcpm gap (−40 wcpm relative to expectation)

Discrepancy ratio: Larger number ÷ Smaller number
  72 wcpm ÷ 32 wcpm = 2.25 times discrepant

Discrepancy between Current Performance & Expected Performance
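A minimal sketch of these two calculations using the slide’s 72/32 wcpm example (the function names are ours, not part of any RTI toolkit):

```python
def absolute_discrepancy(expected: float, current: float) -> float:
    """How far the current score falls below the expected score (negative = below)."""
    return current - expected

def discrepancy_ratio(expected: float, current: float) -> float:
    """Larger number divided by smaller number, per the slide's definition."""
    hi, lo = max(expected, current), min(expected, current)
    return hi / lo

expected, current = 72, 32   # Winter 2nd grade expectation vs. the student's score (wcpm)
print(absolute_discrepancy(expected, current))   # -40 wcpm
print(discrepancy_ratio(expected, current))      # 2.25 times discrepant
```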
Step 1: Problem Identification
Problem definitions should be:
1. Objective – observable and measurable
2. Clear – passes “the stranger test”
3. Complete – includes examples (and non-examples when necessary) and baseline data
Problem Definition: Example
Harry (2nd grader) is currently reading a median of 44 words correct per minute (wcpm) with 83% accuracy when given 2nd grade level text. He also answers an average of 3/10 comp questions correct on weekly in-class tests. 2nd grade students in his school are reading an average of 85 wcpm with 97% accuracy on 2nd grade text and answering 9/10 comp questions correct.
Problem Definition: Non-Example
Harry struggles with being a fluent reader and is not meeting the 2nd grade reading benchmark. He makes a lot of mistakes and is currently reading at a 1st grade level. He also has difficulties answering comprehension questions at grade level and does poorly on his weekly reading tests.
Step 1: Problem Identification
• Replacement skill or target behavior
  – What would it look like if this student were successful?
  – What would we prefer the student do, instead of the problem behavior?
Problem Definition & Target Skill
The Problem Solving Process
2. Problem Analysis: Why is the problem occurring?

The WHY should always drive the WHAT: problem analysis (the why) drives plan development (the what).
Step 2: Problem Analysis
The Water…
Focus on “the water”:
• Instruction
• Curriculum
• Environment

ICEL
I – Instruction, C – Curriculum, E – Environment, L – Learner
Instruction: how you teach
Curriculum: what you teach
Environment: where you teach
Learner: who you teach
→ Student Learning
We can control the how, what, and where.
We don’t have much control over the who.
When it comes to problem analysis, just remember… ICE, ICE baby:
I – Instruction, C – Curriculum, E – Environment
then
L – Learner
What impacts student achievement? (John Hattie, Visible Learning, 2009)

Effective teaching variables                        Effect size    Other variables                          Effect size
Formative Evaluation                                +0.90          Socioeconomic Status                     +0.57
Comprehensive interventions for students with LD    +0.77          Parental Involvement                     +0.51
Teacher Clarity                                     +0.75          Computer-based instruction*              +0.37
Reciprocal Teaching                                 +0.74          School Finances                          +0.23
Feedback                                            +0.73          Aptitude by Treatment Interactions*      +0.19
Teacher-Student Relationships                       +0.72          Family Structure                         +0.17
Direct Instruction                                  +0.59          Retention                                -0.16
Hypothesis Development (ICEL)
Instruction: ?    Curriculum: ?
Environment: ?    Learner: ?
Assessment
Instruction, Curriculum, & Environment
• What should appropriate instruction, curriculum, and environment look like?
• Video: Early Reading Intervention
  – 3 students receiving direct instruction on phonemic awareness & phonics
  – Observe and note effective teaching practices with regard to instruction, curriculum, and environment
Instruction, Curriculum, Environment
Talk time
• What effective teaching practices did you see related to instruction, curriculum, & environment?
• What questions/concerns/suggestions might you have for this teacher?
Assessment ≠ Testing ≠ Evaluation
*Testing – “administering a particular set of questions to an individual to obtain a score”
*Assessment – “the process of collecting data for the purpose of making decisions about students”
**Evaluation – “procedures used to determine whether the child has a disability, and the nature and extent of the special education and related services that the child needs.”
Assessment
*Salvia & Ysseldyke, 2004
**Oregon Administrative Rules, 581-015-2000
Assessment: RIOT
R – Review, I – Interview, O – Observe, T – Test
Hypothesis Development (ICEL × RIOT)
Instruction: RIOT    Curriculum: RIOT
Environment: RIOT    Learner: RIOT
Instruction
• Thinking about RIOT procedures, what are some ways we can gather information about Instruction?
R – Review: Examine lesson plans, attendance, permanent products for instructional demands
I – Interview: Talk to teachers about expectations, instructional strategies used
O – Observe: Observe instruction in the classroom for effective instructional practices
T – Test: Aggregate test scores of the classroom
Instruction: Examples – Targets for Intervention
Explicitness: “Who knows…?” → I do, we do, y’all do, you do
Pacing: 1-2 OTRs/min → 8-12 OTRs/min
Corrective feedback: <50% of errors corrected → 95-100% of errors corrected
Is this effective instruction?
When it comes to interventions…
“It is clear that the program is less important than how it is delivered, with the most impressive gains associated with more intensity and an explicit, systematic delivery” Fletcher & Colleagues, 2007
Instruction Resources
• Explicit Instruction – Archer & Hughes (2011) – www.explicitinstruction.org
• Teaching Reading Sourcebook – CORE – http://www.corelearn.com/
• Classroom Instruction that Works: Research-Based Strategies for Increasing Student Achievement – Marzano et al. (2001)
Curriculum
• Thinking about RIOT procedures, what are some ways we can gather information about Curriculum?
R – Review: Examine permanent products for skills taught, scope & sequence, instructional match
I – Interview: Talk to teachers and administrators about philosophy of curriculum, coverage, etc.
O – Observe: Student success rate
T – Test: Readability of textbooks
Curriculum: Examples – Targets for Intervention
Skills taught: Not matched to need → Matched to need
Level of difficulty: Frustrational (<80%) → Instructional (>80-90%)
Fidelity to the core: Weak (<80%) → Strong (>80%)
Reading Skills Build on Each Other
Phonemic Awareness → Phonics (Alphabetic Principle) → Oral Reading Accuracy & Fluency → Vocabulary → Reading Comprehension
Environment
• Thinking about RIOT procedures, what are some ways we can gather information about Environment?
R – Review: Examine school rules, attendance, class size
I – Interview: Talk to teachers about expectations, rules, behavior management system, classroom culture; talk to parents
O – Observe: Observe in the classroom
T – Test: Aggregate test scores of the classroom
Environment: Examples – Targets for Intervention
Behavior expectations/management: Not defined → Explicitly taught & reinforced
Teacher-student interactions: Low rate of reinforcement → Mostly positive (4:1)
Physical setup: Chaotic & distracting → Organized & distraction-free
Academic Learning Time: Typical School (hours)

  1170  School year (6.5 hours x 180 days)
 -  65  Absenteeism (1 day/month x 10 months)
 = 1105 Attendance time (time in school)
 - 270  Non-instructional time (1.5 hrs/day for recess, lunch, etc.)
 = 835  Allocated time (time scheduled for teaching)
 - 209  Administration, transition, discipline (25% of allocated time; 15 minutes/hour)
 = 626  Instructional time (time actually teaching)
 - 157  Time off task (engaged 75% of time)
 = 469  Engaged time (on task)
 -  94  Unsuccessful engaged time (success rate 80%)
 = 375  Academic Learning Time

Efficiency rating = 32% (Education Resources Inc., 2005)
Academic Learning Time: Effective School (hours)

  1170  School year (6.5 hours x 180 days)
 -  65  Absenteeism (1 day/month x 10 months)
 = 1105 Attendance time (time in school)
 - 270  Non-instructional time (1.5 hrs/day for recess, lunch, etc.)
 = 835  Allocated time (time scheduled for teaching)
 - 125  Administration, transition, discipline (15% of allocated time; 9 minutes/hour)
 = 710  Instructional time (actually teaching: 710 vs. 626)
 -  71  Time off task (engaged 90% of time)
 = 639  Engaged time (639 vs. 469 on task)
 -  64  Unsuccessful engaged time (success rate 90%)
 = 575  Academic Learning Time

Efficiency rating = 49% (Education Resources Inc., 2005)
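Both breakdowns are the same chain of subtractions and multiplications with different rates. A minimal sketch that reproduces the arithmetic (the function and parameter names are ours; the rates come from the slides):

```python
def academic_learning_time(non_instructional_rate: float,
                           engagement_rate: float,
                           success_rate: float,
                           school_year_hours: float = 6.5 * 180,
                           absent_hours: float = 65,
                           non_teaching_hours: float = 270) -> float:
    """Reproduce the slides' Academic Learning Time chain of deductions."""
    attendance = school_year_hours - absent_hours              # time in school
    allocated = attendance - non_teaching_hours                # time scheduled for teaching
    instructional = allocated * (1 - non_instructional_rate)   # minus admin/transition/discipline
    engaged = instructional * engagement_rate                  # time on task
    return engaged * success_rate                              # successful engaged time

typical = academic_learning_time(0.25, 0.75, 0.80)
effective = academic_learning_time(0.15, 0.90, 0.90)
print(round(typical), round(typical / 1170 * 100))       # ~375 hours, ~32% efficiency
print(round(effective), round(effective / 1170 * 100))   # ~575 hours, ~49% efficiency
```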
The Difference: Typical vs. Effective Schools

Variable                           Typical School    Effective School   Time gained      How the time is gained
Allocated non-instructional time   25% (15 min/hr)   15% (9 min/hr)     +84 more hours   Teaching expectations, teaching transitions, managing appropriate and inappropriate behavior efficiently
Engagement rate                    75%               90%                +86 more hours   Better management of groups, pacing
Success rate                       80%               90%                +30 more hours   Appropriate placement, effective teaching
Academic learning time             375 hours         575 hours          = 200 more hours (53% more), OR 95 more school days (4-5 months!)
Learner
• Thinking about RIOT procedures, what are some ways we can gather information about the Learner?
R – Review: Examine cumulative file, health records, developmental history, etc.
I – Interview: Talk to teachers, parents, and the student about perceptions of the problem
O – Observe: Observe the student in the classroom
T – Test: Direct assessment
Learner: Examples
Educational history: Poor attendance → Great attendance
Current CBM scores: Well below benchmarks → At benchmarks
Student behavior: Off-task, disruptive, disengaged → Focused & attentive
Before considering additional testing
• Start with existing data:
  – Screening data
  – Progress monitoring data
  – State testing data (OAKS)
  – In-curriculum data
• Is additional data needed?
  – What additional questions do you have?
  – Which diagnostic assessments can answer those questions?
Assessment
Additional Resources
• Curriculum-Based Evaluation: Teaching & Decision Making – Howell & Nolet
• CORE Assessing Reading: Multiple Measures
• Quick Phonics Screener
• DIBELS Deep
Hypothesis Development (ICEL × RIOT)
Instruction: RIOT    Curriculum: RIOT
Environment: RIOT    Learner: RIOT
Hypothesis Development
• What can we do that will reduce the problem (decrease the gap between expected performance and current performance)?
Problem Hypothesis
• Why is the problem occurring?
• Example: Harry’s reading fluency and comprehension problems occur because he lacks strategies for decoding silent-e words and vowel digraphs (oa, ea, ae, ou, etc.). His current instruction does not provide enough explicit modeling of these skills. He also currently has a low level of engagement and is highly distracted in both his classroom and intervention room.
Prediction Statement
• What will make the problem better?
• Example: Harry will improve if he receives explicit instruction in his identified missing skills. He also needs instruction that uses high pacing and effective active engagement strategies to keep him highly engaged, and an environment that is quiet and free of distraction from other students.
Problem Hypothesis & Prediction
Step 3: Plan Development
What are we going to do about the problem?
Consultation

• Intervention Plan
• Progress Monitoring Plan
• Fidelity Monitoring Plan (fidelity checklist)
Importance of Feedback
• Wickstrom et al. studied 33 intervention cases.
• Teachers agreed to do an intervention and were then observed in class.
• 0/33 teachers had fidelity above 10%.
• 33/33 indicated on a self-report measure that they had used the intervention as specified by the team.
Slide taken from a presentation by Joseph Witt
Consultation
Importance of Feedback“Among the most powerful of interventions is feedback or formative evaluation – providing information to the teacher as to where he or she is going, how he or she is going there, and where he or she needs to go next” Hattie, 2012 (Visible Learning for Teachers)
“Feedback is the breakfast of champions”
Kevin Feldman
Consultation
Step 4: Plan Implementation & Evaluation
How is it working?

• Attendance
• Fidelity data
• Progress monitoring data…
  …as compared to peers/expected growth
[Cohort data graph: December–June scores for Amy, Chase, Mary, and Isaiah plotted against the aimline]
Magnitude of Discrepancy
Next Steps: Based on data & district policies & procedures
Final Thought: Data, Data, Data
Questions/Comments
Jon Potter – [email protected]
Lisa Bates – [email protected]
David Putnam – [email protected]