
EE Coaches, August 2015

Welcome!

• Introductions
  – Who you are
  – Where you are from
  – An EE success from last year

Warm-up

• Read the two-page ‘The Change Process’ document.

• Silently and alone, reflect:
  – How have you seen these stages play out with EE?
  – What stage would you gauge your school at today?
  – How can you support forward movement?

• 80% of teachers disagree with the statement that the WI Educator Effectiveness System gives them the tools to improve their practice.

Why are you called an Educator Effectiveness Coach and not an EE Expert, or Guru, or Smarty Pants?

Page 1


Beginning of Year Working collaboratively with their evaluator or a peer, educators draw upon the SLO and Outcome Summary Process Guide (see page 2) to develop a minimum of one SLO. The development of the SLO now must include the review of teacher and principal value-added, as well as graduation rates or schoolwide reading value-added (as appropriate to the role of the educator). Educators continue to document the goal within the appropriate online data management system (e.g., Teachscape or MyLearningPlan). Collaborative learning-focused conversations are required as part of the process, but flexibility exists in whom educators collaborate with in Supporting Years. However, in Summary Years, educators must conduct this process with their evaluators.


Page 2


What is different from last year?

Finco


Page 3


TEACHERS: Teacher Value-Added and Schoolwide Reading: When developing SLOs, teachers must review value-added data individually, as well as with teacher teams at both the grade level and across the content area (e.g., schoolwide reading value-added), to identify trends (i.e., strengths and areas for growth) across time. These trends can inform SLOs or professional practice goals, based on areas of need. Working in teams with other teachers could inform the development of a team SLO that may align to a School Learning Objective identified by the principal. Value-added trends may also illuminate strategies that have worked well, based on areas of strength, and can support ongoing instructional efforts. Working in teams could also provide the opportunity to share best practices and successful strategies that support school improvement plans and/or goals.

Let’s walk through this…



Graduation Rate: When developing SLOs, high school teachers must review graduation rate data across time to identify positive or negative trends regarding the matriculation of their school’s students. During this review, teachers should reflect on how their practice has supported the trends within the graduation rate data. Teachers should also review the data in vertical and horizontal teams to review school (and district) practices which positively and negatively impact graduation rates. This analysis can inform the development of SLOs, as well as professional practice goals, to support the improvement of graduation rates of the educator’s students. This review can also illuminate the success of various college and career ready strategies implemented by teachers and across the school to be modified or duplicated.



Educators are not required to develop a goal based on these data, or with the intention to improve these data, unless the data indicate that is necessary.

As always, the purpose of the Educator Effectiveness System is to provide information that is meaningful and supports each individual educator’s growth in their unique roles and contexts. By reviewing multiple data points, including those listed above, the educator has access to a more comprehensive view of their practice and a greater ability to identify areas of strength and need—both of which can inform the development of goals, as well as instructional/leadership strategies which can support progress towards goals.

Note: Due to the lag in data provided by DPI to districts, as well as the date in the year in which the data is provided to the districts (i.e., the following year), educators should only use the data to review trends across time when developing an SLO. Educators should not use the data to score SLOs.

Turn and Talk: What do you know about value-added data?

There are two general ways to look at student assessment data:


• Attainment model: a “point in time” measure of student proficiency that compares the measured proficiency rate with a predefined proficiency goal.

• Growth model: measures the average gain in student scores from one year to the next, accounting for students’ prior knowledge.
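The contrast between the two models can be sketched in a few lines of Python. All scores and the proficiency cutoff below are made-up illustrative values, not real WKCE data:

```python
# Hypothetical scale scores for four students; the 2400 proficiency
# cutoff is an illustrative value, not the actual WKCE standard.
PROFICIENCY_CUT = 2400

last_year = [2380, 2410, 2440, 2390]  # prior-year scores
this_year = [2405, 2430, 2465, 2395]  # same students, current year

# Attainment model: a point-in-time proficiency rate compared with a goal.
proficient = sum(score >= PROFICIENCY_CUT for score in this_year)
attainment_rate = proficient / len(this_year)

# Growth model: average gain per student, which accounts for where
# each student started.
gains = [now - prior for prior, now in zip(last_year, this_year)]
average_growth = sum(gains) / len(gains)

print(f"Attainment: {attainment_rate:.0%} proficient")
print(f"Growth: average gain of {average_growth:.2f} scale-score points")
```

Note how the fourth student misses the proficiency cut but still gained points: the growth model credits that progress while the attainment model does not.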

What is Value-Added?

• It is a type of growth model that measures the contribution of schooling to student performance on the WKCE in reading and in mathematics

• Uses statistical techniques to separate the impact of schooling from other factors that may influence growth

• Focuses on how much students improve on the WKCE (or our new assessment) from one year to the next as measured in scale score points


Teachers

Last Year:
  – Educator Practice (1–4): 100% based on Danielson or Stronge
  – Student Outcomes (1–4): 5% VA, 95% SLO

This Year:
  – Educator Practice (1–4): 100% based on Danielson or Stronge
  – Student Outcomes (1–4): 100% SLO

Principals

Last Year:
  – Educator Practice (1–4): 100% based on WI Principal Rubric or Stronge
  – Student Outcomes (1–4): 5% VA, 45% Principal Value-Added, 50% SLO

This Year:
  – Educator Practice (1–4): 100% based on WI Principal Rubric or Stronge
  – Student Outcomes (1–4): 100% SLO
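A quick sketch of how last year's weightings combined into a Student Outcomes score for a principal, versus this year's 100% SLO. The component scores here are hypothetical values on the 1–4 scale, not real ratings:

```python
# Hypothetical 1-4 component scores for one principal.
va_score = 2.5        # schoolwide value-added
principal_va = 3.0    # principal value-added
slo_score = 3.5       # SLO score

# Last year's Student Outcomes weighting for principals:
# 5% VA + 45% Principal Value-Added + 50% SLO.
last_year_outcomes = 0.05 * va_score + 0.45 * principal_va + 0.50 * slo_score

# This year: 100% SLO.
this_year_outcomes = slo_score

print(last_year_outcomes, this_year_outcomes)
```

With these illustrative inputs, the weighted composite lands at 3.225, while this year's score is simply the SLO score of 3.5.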

A Clearer Data Picture

Many data pieces give us a fuller picture…

STAR

WKCE or Badger

AIMSweb

ACT

WorkKeys

Classroom Assessments

AP

Surveys

Aspire

Observation Data

PALS

VA Data

Why would we care about Value-Added data?

• VA allows for fairer growth comparisons to be made (in contrast to pure achievement)


School A: 90% Proficiency, 6% Free and Reduced

School B: 86% Proficiency, 90% Free and Reduced

VA allows for fairer growth comparisons to be made (in contrast to pure growth)

• We know that in Wisconsin, certain groups of students do not grow (or achieve) at the same rate as others.

• This can be due to the achievement level of a child (lowest students can grow the most)

• This can also be related to demographics such as:
  – Special Ed status
  – ELL
  – Race/ethnicity
  – Economically Disadvantaged
  – etc.

4th grade:
“Hi! I’m a 4th grade boy. I got a scale score of 2418 on my WKCE in reading this year! And these are all the other boys in WI who had the exact same scale score as me.”

5th grade:
“Now I’m in 5th grade and just got a scale score of 2449 on my reading WKCE! I grew 31 points. All of the other boys took the test again, too. Their average scale score was 2443. Their average growth was 25 points.”

“So we would say that my teachers in 4th grade had a higher Value-Added than would be expected: I grew 31 points, while their average growth was 25 points.”
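The story above reduces to a simple comparison between one student's growth and the average growth of peers who started at the same scale score, sketched here:

```python
# The student's scores from the example above.
my_grade4_score = 2418
my_grade5_score = 2449

# Peers across WI who had the exact same 4th grade scale score.
peer_grade4_score = 2418
peer_grade5_average = 2443

my_growth = my_grade5_score - my_grade4_score              # 31 points
expected_growth = peer_grade5_average - peer_grade4_score  # 25 points

# Growing more than comparable peers is the higher-than-expected
# value-added signal described on the slide.
difference = my_growth - expected_growth
print(my_growth, expected_growth, difference)  # 31 25 6
```

The real model controls for many more factors than starting score and gender, but the core idea is this same "actual growth minus expected growth" comparison.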

Outside the school’s influence

Race/Ethnicity

Gender

Section 504

Economic Status

Disability (by type)

Prior Year Score (reading and math)

English Proficiency (by category level)

Mobility

Using the same process, VA controls for these factors

How do they decide what to control for?

Check 1: Is this factor outside the school or teacher’s influence?

Check 2: Do we have reliable data?

Check 3: If not, can we pick up the effect by proxy?

Check 4: Does it increase the predictive power of the model?

Checking for Understanding

• What would you tell a 5th grade teacher who said they wanted to include the following in the Value-Added model for their results?

A. 5th grade reading curriculum

B. Their students’ attendance during 5th grade

C. Education level of the parents

D. Student motivation

Check 1: Is this factor outside the school or teacher’s influence?

Check 2: Do we have reliable data?

Check 3: If not, can we pick up the effect by proxy?

Check 4: Does it increase the predictive power of the model?

Reporting Value-Added

In the latest generation of Value-Added reports, estimates are color coded based on statistical significance. This represents how confident we are about the effect of schools and teachers on student academic growth.

Green and Blue results are areas of relative strength. Student growth is above average.

Gray results are on track. In these areas, there was not enough data available to differentiate this result from average.

Yellow and Red results are areas of relative weakness. Student growth is below average.

[Sample report: Grade 4, 30 students, Value-Added estimate of 3.]

Value-Added is displayed on a 1–5 scale for reporting purposes. About 95% of estimates will fall between 1 and 5 on the scale, and most results will be clustered around 3.

3.0 represents meeting predicted growth for your students. Since predictions are based on the actual performance of students in your state, 3.0 also represents the state average growth for students similar to yours.

Numbers lower than 3.0 represent growth that did not meet prediction; students are still learning, but at a rate slower than predicted. Numbers higher than 3.0 represent growth that beat prediction; students are learning at a rate faster than predicted.

[Sample reading report: Grade 4, 30 students, Value-Added estimate of 3.8 with a 95% confidence interval.]

Value-Added estimates are provided with a confidence interval.

Based on the data available for these thirty 4th Grade Reading students, we are 95% confident that the true Value-Added lies between the endpoints of this confidence interval (between 3.2 and 4.4 in this example), with the most likely estimate being 3.8.
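A minimal sketch of how such an interval can arise under a normal approximation. The standard error below is a made-up value chosen to reproduce the example's endpoints; real reports derive it from the underlying student data:

```python
# Reported estimate from the example above; the standard error is
# hypothetical, not taken from an actual report.
estimate = 3.8
standard_error = 0.306

# 95% confidence interval under a normal approximation (z = 1.96).
z = 1.96
lower = estimate - z * standard_error
upper = estimate + z * standard_error

print(f"95% CI: ({lower:.1f}, {upper:.1f})")
```

With more students, the standard error shrinks and the interval narrows, which is why the amount of data matters as much as the estimate itself.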


Confidence Intervals

[Sample reports: in Reading, Grades 3, 4, and 5 (13, 36, and 84 students) each show an estimate of 4.5; in Math, the same grades each show an estimate of 1.5. The fewer the students, the wider the confidence interval around the same estimate.]

Color coding is based on the location of the confidence interval.

The more student data available for analysis, the more confident we can be that growth trends were caused by the teacher or school (rather than random events).

Let’s look at the Value-Added reports. These are housed in the School Access File Exchange (SAFE).

We begin with some caveats!

• VA is one data source among many that provides a different perspective on student growth.

• VA should never be the sole data source to identify effective/ineffective schooling!

• Taking VA out of the Student Outcome score allows each educator to decide how (or if) this data informs the SLO process.

Page 1

Introduction to VA Color Coding

Page 2

With a partner: How is this school growing students in reading and in math?

How is this school growing students across grades?

Share your thinking

Pages 3 & 4

With a partner: How is this school growing subgroups in reading? In math?

Share your thinking

Page 5

Introduction to VA Scatter Plots

With a partner: What does this data tell you? How does this school compare to others in the state?

Grade Level VA & Achievement Plots

With a partner: What does this data tell you? How might a grade-level team use this data to inform the focus of their SLO?

Pages 6 & 7

Share your thinking

How do (or don’t) these reports add to our total data picture?


What are your take-aways about VA Data?

Peer Review is required for those in Supporting Years!

Feedback requested…

Time for Technology

• MLP and Teachscape updates…

Our Next Meeting…

• December 3rd

• April 7th

Closing…

• What are you taking from today?
