USING SCHOOL DATA FOR PROBLEM-SOLVING Don Kincaid, Ed.D. Brian Gaunt, Ph.D. Shelby Robertson, Ph.D.


Page 1: Using School Data for  Problem-Solving

USING SCHOOL DATA FOR PROBLEM-SOLVING

Don Kincaid, Ed.D.
Brian Gaunt, Ph.D.
Shelby Robertson, Ph.D.

Page 2: Using School Data for  Problem-Solving

Questions
• Why are data essential for problem-solving at the school level?
• How do we identify the critical data needed for solving academic and behavior concerns?
• How can the 4-step problem-solving model guide the use of academic and behavioral data in a systematic and common way across a district (i.e., guiding questions)?

Page 3: Using School Data for  Problem-Solving

WHY ARE DATA ESSENTIAL FOR PROBLEM-SOLVING?

Page 4: Using School Data for  Problem-Solving

…for planning?

“Research has found that up-front planning helped make data collection and use more efficient in many case study schools by clarifying what data were needed, aiding with integration of multiple data sources, and ensuring that data collection processes were on track (Keeney, 1998; Lachat, 2001).”

- Kerr et al. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons learned from three urban districts. American Journal of Education, 112, 496-520.

Page 5: Using School Data for  Problem-Solving

…for legal reasons?
• Some data we are required to collect (e.g., NCLB, 2002):
  – But why?
  – What questions do these “required data” potentially help answer?
  – Can these data be used in our problem-solving efforts?
• Schools are required to use “evidence,” collect and analyze data, and use those data to make decisions for education improvement (Coburn & Talbert, 2006; Halverson et al., 2005; Honig & Coburn, 2008; Mills, 2011; Young, 2006).

Page 6: Using School Data for  Problem-Solving

…to identify solutions to problems!!!

• Not all data we are asked to collect will be useful for solving local problems.

• Compliance vs. problem-solving
• Main question: What data should we collect for problem-solving?
• Tied to what questions are being asked
  – Types of data: screeners, diagnostic measures, progress monitoring measures, outcome or summative measures.
  – Each has a function and the potential to answer particular types of questions.

Page 7: Using School Data for  Problem-Solving

Decision-making – Need Data

• Assumptions:
  – The appropriate data are needed for school-level problem-solving.
  – No matter how useful the data may be, they are NOT useful if they are not used.
• Data Chaos!
• To teach others about data we need to separate:
  – Management (infrastructure/data system)
  – Use (analysis and decision-making)

Page 8: Using School Data for  Problem-Solving

HOW DO WE KNOW WHAT CRITICAL DATA ARE NEEDED?

Page 9: Using School Data for  Problem-Solving

Consensus on Data? (Coburn & Talbert, 2002)

• What does “data-based problem-solving” mean to you?
• Do you know what to collect?
• Do you know why it is being collected?
• Do you know what the data are used for?
• Do you know what to do after you have the data? After you analyze them?
• How aligned is your assessment system with answering key questions, and how valuable are your district’s data for solving which problems?

Page 10: Using School Data for  Problem-Solving

Are your data useful? For what?

Data source(s) should:
• provide sufficient information to select appropriate services and supports,
• allow you to group students with similar needs, and
• match the nature of the problem, the target responses/knowledge identified for change, and key problem-solving questions.

Page 11: Using School Data for  Problem-Solving

Data Management: Let’s Redefine “Data System”

• Prepare to collect
• Collect the data
• Organize the data
• Summarize the data
• Share the data
• Support the team to use the data
• Analyze the data
• …Make a decision!

Page 12: Using School Data for  Problem-Solving

WHAT QUESTIONS SHOULD WE BE ASKING TO SOLVE PROBLEMS?

Page 13: Using School Data for  Problem-Solving

Problem Solving Process

Define the Problem: What Do We Want Students to KNOW and Be Able to DO?

Problem Analysis: Why Can’t They DO It?

Implement Plan: What Are WE Going To DO About It?

Evaluate: Did It WORK?

(Response to Intervention – RtI)

Page 14: Using School Data for  Problem-Solving

Florida’s Guiding Questions

Step 1 – Problem Identification
• What do we expect our students to know, understand, and do as a result of instruction?
• Do our students meet or exceed these expected levels? (How sufficient is the core?)
• Are there groups for whom core is not sufficient?

Step 2 – Problem Analysis
• If the core is NOT sufficient for either a “domain” or group of students, what barriers have or could preclude students from reaching expected levels?

Step 3 – Plan Development and Implementation
• What strategies or interventions will be used?
• What resources are needed to support implementation of the plan?
• How will sufficiency and effectiveness of core be monitored over time?
• How will fidelity be monitored over time?
• How will “good,” “questionable,” and “poor” responses to intervention be defined?

Step 4 – Plan Evaluation of Effectiveness
• Have planned improvements to core been effective?

Page 15: Using School Data for  Problem-Solving

Step 1: Problem Identification – Tier 1

• What do we expect our students to know, understand, and do as a result of instruction?

• Do our students meet or exceed these expected levels? (How sufficient is the core?)

• Are there groups for whom core is not sufficient?

Page 16: Using School Data for  Problem-Solving

Expectations for Behavior
• 80% of students have 1 or fewer ODRs (office discipline referrals)
• Are the # of ODRs, ISS (in-school suspensions), and OSS (out-of-school suspensions) per 100 students higher than the national or district average?
• Are the # of ODRs, ISS, and OSS per 100 students decreasing?
• Is attendance steady?
(A minimal sketch of these checks follows this list.)
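A minimal sketch of how a team might run these Tier 1 behavior checks against a discipline log. The file name, column names, enrollment figure, and the thresholds applied in code are illustrative assumptions, not values from the presentation.

```python
import pandas as pd

# Hypothetical discipline log: one row per office discipline referral (ODR)
odr = pd.read_csv("odr_log.csv")      # assumed columns: student_id, referral_date
enrollment = 950                      # hypothetical school enrollment

# Expectation: 80% of students have 1 or fewer ODRs
odrs_per_student = odr.groupby("student_id").size()
students_with_two_plus = (odrs_per_student >= 2).sum()
pct_zero_or_one = 100 * (enrollment - students_with_two_plus) / enrollment
print(f"Students with 0-1 ODRs: {pct_zero_or_one:.1f}% (expectation: at least 80%)")

# Rate per 100 students, for comparison with district or national averages
odr_per_100 = 100 * len(odr) / enrollment
print(f"ODRs per 100 students: {odr_per_100:.1f}")
```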

Page 17: Using School Data for  Problem-Solving

Expectations for Literacy & Math

• Sunshine State Standards (SSS)
• Grade-level expectations (GLE)
• Objectives and Goals of GLEs
• The standards are the curriculum.
• Tier 1 data: AYP (state test, NCLB); state reading test (FCRR/FAIR)
• State assessments based on SSS
• Additional, district-specific?

Page 18: Using School Data for  Problem-Solving

Adequate Yearly Progress (AYP)

Page 19: Using School Data for  Problem-Solving

Step 1: Problem Identification – Tier 1

• What do we expect our students to know, understand, and do as a result of instruction?

• Do our students meet or exceed these expected levels? (How sufficient is the core?)

• Are there groups for whom core is not sufficient?

Page 20: Using School Data for  Problem-Solving

Do 80% of students exhibit appropriate behavior?

Page 21: Using School Data for  Problem-Solving

Do 80% of students exhibit appropriate behavior?

Page 22: Using School Data for  Problem-Solving

During the current year, does the school have students with 2 or more ODRs by October 1?
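One way a team might run this early-warning screen, sketched under assumptions: the log file, column names, and the example cutoff date are hypothetical.

```python
import pandas as pd

# Hypothetical ODR log with one row per referral
odr = pd.read_csv("odr_log.csv", parse_dates=["referral_date"])
cutoff = pd.Timestamp("2011-10-01")   # October 1 of the current school year (example)

early = odr[odr["referral_date"] <= cutoff]
counts = early.groupby("student_id").size()
flagged = counts[counts >= 2].sort_values(ascending=False)
print(f"{len(flagged)} students have 2 or more ODRs by October 1")
```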

Page 23: Using School Data for  Problem-Solving

Are the # of ODRs, ISS and OSS per 100 students higher than the national or district average?

• National Average for middle schools (MS) is .05 per 100 students

Page 24: Using School Data for  Problem-Solving

Are the # of ODRs, ISS and OSS per 100 students decreasing?

Page 25: Using School Data for  Problem-Solving

Are the # of ODRs, ISS and OSS per 100 students decreasing?

Page 26: Using School Data for  Problem-Solving

Is attendance steady?

Page 27: Using School Data for  Problem-Solving

Sources of Data

Page 28: Using School Data for  Problem-Solving

Utilizing Common Assessment Data to Understand Student Needs

Page 29: Using School Data for  Problem-Solving

Class Recommended Level of Instruction Report

This report provides a summary of the students’ overall progress. It can be used to get an overall sense of instructional levels in the class and to calculate the Effectiveness of Core Instruction (ECI) index and the three Effectiveness of Intervention (EI) indices.
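The presentation does not give the ECI formula, so the sketch below assumes a common working definition: the percentage of students whose recommended level of instruction is core (Tier 1) only. The counts and the benchmark used for comparison are illustrative.

```python
# Hypothetical counts taken from a Class Recommended Level of Instruction Report
recommended_level = {
    "core": 118,          # core instruction only (Tier 1)
    "supplemental": 42,   # core plus supplemental intervention (Tier 2)
    "intensive": 15,      # core plus intensive intervention (Tier 3)
}
total = sum(recommended_level.values())
eci = 100 * recommended_level["core"] / total
print(f"Effectiveness of Core Instruction (ECI): {eci:.0f}% of {total} students")
# A team would compare this figure to its own benchmark (often about 80%).
```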

Page 30: Using School Data for  Problem-Solving

How sufficient is the core?

Page 31: Using School Data for  Problem-Solving

How sufficient is the core?

Page 32: Using School Data for  Problem-Solving

Academic Sufficiency

Average Scores    Grade 6   Grade 7   Grade 8
Statewide         57%       62%       68%
District          55%       58%       59%
Sunshine Middle   36%       43%       52%
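A small sketch that restates the table above in code and computes how far Sunshine Middle sits below the district and statewide averages at each grade level; the figures are those shown in the table.

```python
averages = {
    "Statewide":       {"6": 57, "7": 62, "8": 68},
    "District":        {"6": 55, "7": 58, "8": 59},
    "Sunshine Middle": {"6": 36, "7": 43, "8": 52},
}

for grade in ("6", "7", "8"):
    school = averages["Sunshine Middle"][grade]
    district_gap = averages["District"][grade] - school
    state_gap = averages["Statewide"][grade] - school
    print(f"Grade {grade}: {school}% at school, "
          f"{district_gap} points below district, {state_gap} points below state")
```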

Page 33: Using School Data for  Problem-Solving

Step 1: Problem Identification – Tier 1

• What do we expect our students to know, understand, and do as a result of instruction?

• Do our students meet or exceed these expected levels? (How sufficient is the core?)

• Are there groups for whom core is not sufficient?

Page 34: Using School Data for  Problem-Solving

Are there groups of students for whom the Tier 1 Core is not sufficient?

Page 35: Using School Data for  Problem-Solving

Are there groups of students for whom the Tier 1 Core is not sufficient?

Page 36: Using School Data for  Problem-Solving

Are there groups of students for whom the Tier 1 Core is not sufficient?

Page 37: Using School Data for  Problem-Solving

Are there groups for whom core is not sufficient?

Page 38: Using School Data for  Problem-Solving

Are there groups for whom core is not sufficient?

Page 39: Using School Data for  Problem-Solving
Page 40: Using School Data for  Problem-Solving

Step 2 – Problem Analysis: Tier 1

• If the core is NOT sufficient for either a “domain” or group of students, what barriers have or could preclude students from reaching expected levels?
  – Why are some students not successful (initial hypotheses)?

Page 41: Using School Data for  Problem-Solving

What potential barriers have precluded us from achieving expected performance levels?

Lack of…
• Common Assessments
• Common Planning
• Ongoing Progress Monitoring
• Curriculum Mapping Aligned with NGSSS and Common Assessments
• Resource Availability
• Administrative Support
• Professional Development

Page 42: Using School Data for  Problem-Solving

Domains to examine: Instruction, Curriculum, Environment, Learner

• Alignment with Standards and Across Grade/School Levels; Relevancy to Students’ Personal Goals; Content, Pacing, Progression of Learning; Differentiation
• Cognitive Complexity of Questions and Tasks; Gradual Release of Responsibility; Appropriate Scaffolding; Connection to Students’ Personal Goals, Interests and Life Experiences
• Reward/Consequence System; Visual Cues; Climate/Culture; Quality of Student/Adult Relationships; Quality of Peer Relationships; High Expectations for ALL Students; Collaboration and Voice
• Reinforcement Preferences; Perceptions of Competence and Control; Perceived Relevancy of Instruction/Education; Integration and Affiliation with School; Academic/Social-Emotional Skill Development

Page 43: Using School Data for  Problem-Solving

Hypothesis = Instructional

Page 44: Using School Data for  Problem-Solving

Step 2: Problem Analysis – Tier 1

1. Instruction
• Are best practices in instruction being delivered to those students?
• Is instruction being delivered in sufficient amounts or as often as necessary?

2. Curriculum
• Are lesson plans in alignment with the appropriate core standards/expectations?
• Are the curricular materials being used with fidelity or as designed?
• Does staff have the knowledge and skills to utilize the curricular materials in alignment with grade-level/school-wide standards or expectations?

3. Environment
• Do all staff and students know the school-wide behavioral expectations?
• Are they being used consistently across all settings (e.g., school climate)?
• Are the school-wide behavioral expectations in alignment with the school/district missions?
• Are best practices in classroom management being utilized and in alignment with the school-wide behavioral expectations?

4. Learner
• Are students accessing the available instruction (e.g., attendance)?
• Are students “actively engaged” in classroom instruction?
• Do students perceive having a positive relationship with their school/teachers?

Page 45: Using School Data for  Problem-Solving

Step 3: Plan Development and Implementation – Tier 1

• What strategies or interventions will be used?

• What resources are needed to support implementation of the plan?

• How will sufficiency and effectiveness of core be monitored over time?

• How will fidelity be monitored over time?

• How will “good”, “questionable,” and “poor” responses to intervention be defined?

Page 46: Using School Data for  Problem-Solving

Key Considerations

• Utilize existing tools and resources whenever possible.

• Align strategies and interventions specifically to identified barriers which preclude student success within core instruction.

• Select research-based strategies and interventions to address identified barriers.

• Communicate the “compelling why” of interventions with teachers, parents, and students.

Page 47: Using School Data for  Problem-Solving

Intervention Linked to Underlying Barrier

Disengaged Learners:
• Mentoring programs
• Goal setting & career planning support
• Frequent progress reports
• Targeted rewards
• Mandatory study hall
• Mandatory homework help
• Study skills classes

Failed Learners:
• Targeted, differentiated instruction
• Additional instructional time
• Pre-teach essential skills, content, and vocabulary
• Review/reteach prerequisite skills to address the learning gap
• Prevention (requires vertical articulation with middle/elementary school and early identification of at-risk students)

CAUTION: Failed learners often become disengaged over time and may require both categories of intervention support.

Page 48: Using School Data for  Problem-Solving

Tier 1/Universal PBS: Specific RtI:B Action Plan (blank template)

Columns of the action-plan form:
• Critical Element
• Step 1: What is the problem/issue/task to be addressed?
• Step 2: Why is it occurring?
• Step 3: What are we going to do about it? (To-Do List; Persons Responsible; Follow-Up or Completion Date)
• Step 4: How will we know when we’ve been successful?

Critical Elements: PBS Team; Faculty Commitment; Discipline Procedures; Data Entry & Analysis; Expectations & Rules; Reward/Recognition Program; Lesson Plans; Implementation Plan; Classroom Systems; Evaluation

Page 49: Using School Data for  Problem-Solving

Sources of Data

Page 50: Using School Data for  Problem-Solving

Selecting Research-Based Strategies and Interventions for Academics and Behavior

Page 51: Using School Data for  Problem-Solving

Selecting Research-Based Strategies and Interventions

Page 52: Using School Data for  Problem-Solving

Technology Resources

Page 53: Using School Data for  Problem-Solving

Planning for Step 4

• How will fidelity of interventions be monitored over time?

• How will sufficiency and effectiveness of strategies and interventions be monitored over time?
  – How will the data be displayed?

• How will “good”, “questionable,” and “poor” responses to intervention be defined?

Page 54: Using School Data for  Problem-Solving

How will fidelity be monitored over time?

• Fidelity of implementation is the delivery of instruction in the way in which it was designed to be delivered.

• Fidelity must also address the integrity with which screening and progress-monitoring procedures are completed and an explicit decision-making model is followed.

• Fidelity also applies to the problem solving process…bad problem solving can lead to bad decisions to implement otherwise good interventions.

Page 55: Using School Data for  Problem-Solving

Fidelity?

Page 56: Using School Data for  Problem-Solving

Fidelity Tier 1-3

Page 57: Using School Data for  Problem-Solving

Step 4: Plan Evaluation – Tier 1

• Have planned improvements to core been effective?

Page 58: Using School Data for  Problem-Solving

Are the # of ODRs, ISS and OSS per 100 students higher than the national or district average?

• National Average for middle schools (MS) is .05 per 100 students

Intervention in August produced immediate and sustained change.

Page 59: Using School Data for  Problem-Solving

Are the # of ODRs, ISS and OSS per 100 students decreasing?

Over 50% reduction in two years.

Page 60: Using School Data for  Problem-Solving

Are the # of ODRs, ISS and OSS per 100 students decreasing?

Implementation produced immediate and sustained change.

Page 61: Using School Data for  Problem-Solving

Are there groups of students for whom the Tier 1 Core is not sufficient?

Do these bar graphs level out indicating no disproportionality?

Page 62: Using School Data for  Problem-Solving

Fidelity?

Does the school see improvements next year in these areas?

Page 63: Using School Data for  Problem-Solving

How will “good”, “questionable,” and “poor” responses to intervention be defined?

Decision Rules:
• Positive Response
  – Gap is closing
  – Can extrapolate the point at which target student(s) will “come in range” of the target, even if this is long range
• Questionable Response
  – Rate at which the gap is widening slows considerably, but the gap is still widening
  – Gap stops widening but closure does not occur
• Poor Response
  – Gap continues to widen with no change in rate
(A minimal classification sketch follows this list.)
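A hedged sketch of how these decision rules could be applied to benchmark data: the gap between expected and observed performance is tracked across Fall, Winter, and Spring, and its trend decides the category. The function name, the tolerance value, and the example scores are assumptions for illustration only.

```python
def classify_response(expected, observed, tolerance=0.1):
    """Label an RtI response as positive, questionable, or poor (sketch only)."""
    gaps = [e - o for e, o in zip(expected, observed)]
    early_rate = gaps[1] - gaps[0]          # change in the gap, Fall -> Winter
    late_rate = gaps[-1] - gaps[-2]         # change in the gap, Winter -> Spring
    if gaps[-1] < gaps[0] - tolerance:
        return "positive"        # gap is closing
    if late_rate < early_rate - tolerance or abs(late_rate) <= tolerance:
        return "questionable"    # widening slows considerably, or stops without closing
    return "poor"                # gap keeps widening with no change in rate

print(classify_response([40, 55, 70], [32, 50, 68]))   # gaps 8 -> 5 -> 2: positive
print(classify_response([40, 55, 70], [30, 42, 57]))   # gaps 10 -> 13 -> 13: questionable
print(classify_response([40, 55, 70], [30, 40, 50]))   # gaps 10 -> 15 -> 20: poor
```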

Page 64: Using School Data for  Problem-Solving

Positive Outcomes in Tier 1
• Positive
  – Continue intervention with current goal
  – Continue intervention with goal increased
  – Fade intervention to determine if student(s) have acquired functional independence.

Page 65: Using School Data for  Problem-Solving

[Chart: Positive Response to Intervention – performance across Fall, Winter, and Spring; Expected Performance vs. Observed Performance. The gap is closing, and the point at which target student(s) will “come in range” of the target can be extrapolated, even if this is long range.]
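To make the “extrapolate” idea concrete, here is a minimal sketch that fits a straight-line growth rate to the observed benchmark scores and projects how long until the expected level is reached. The function, the 12-week spacing between benchmarks, and the example numbers are illustrative assumptions.

```python
def weeks_until_in_range(observed, expected_level, weeks_between_benchmarks=12):
    """Rough linear projection of when observed performance reaches the expected level."""
    periods = len(observed) - 1
    growth_per_period = (observed[-1] - observed[0]) / periods
    if growth_per_period <= 0:
        return None                  # no growth: closure cannot be extrapolated
    remaining_gap = expected_level - observed[-1]
    if remaining_gap <= 0:
        return 0                     # already at or above the expected level
    return remaining_gap / growth_per_period * weeks_between_benchmarks

# Observed Fall/Winter/Spring scores of 30, 42, 58 against an expected level of 70
print(weeks_until_in_range([30, 42, 58], 70))   # about 10 more weeks at the current rate
```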

Page 66: Using School Data for  Problem-Solving

Questionable Outcomes Tier 1

• Questionable
  – Was our DBPS (data-based problem-solving) process sound?
  – Was the intervention implemented as intended?
• If no – employ strategies to increase implementation integrity
• If yes –
  – Increase the intensity of the current intervention for a short period of time and assess impact. If the rate improves, continue. If the rate does not improve, return to problem solving.

Page 67: Using School Data for  Problem-Solving

[Chart: Questionable Response to Intervention – performance across Fall, Winter, and Spring; Expected Performance vs. Observed Performance. The rate at which the gap is widening slows considerably, but the gap is still widening; or the gap stops widening but closure does not occur.]

Page 68: Using School Data for  Problem-Solving

Poor Outcomes Tier 1
• Poor
  – Was our DBPS process sound?
  – Was the intervention implemented as intended?
• If no – employ strategies to increase implementation integrity
• If yes –
  – Is the intervention aligned with the verified hypothesis? (Intervention Design)
  – Are there other hypotheses to consider? (Problem Analysis)
  – Was the problem identified correctly? (Problem Identification)

Page 69: Using School Data for  Problem-Solving

[Chart: Poor Response to Intervention – performance across Fall, Winter, and Spring; Expected Performance vs. Observed Performance. The gap continues to widen with no change in rate.]

Page 70: Using School Data for  Problem-Solving

Questions?

• Do you have questions?

Page 71: Using School Data for  Problem-Solving

Contact Information
• Don Kincaid, Ed.D. ([email protected])
• Brian Gaunt, Ph.D. ([email protected])
• Shelby Robertson, Ph.D. ([email protected])