Data Guided Instruction




TechTuneUP 2008, Monday presentation


Page 1: Data Summer

Data Guided Instruction

Page 2: Data Summer

Why Data?

• We assess to see if our students are learning.

• Data gathering and assessment involves going beyond the grade or score a student receives on a test.

• Instead it seeks to help a teacher and student understand better what was learned and what comes next.

Page 3: Data Summer

What are your primary job responsibilities?

• To teach the content?

• To ensure that students learn?

• What is the difference between the two?

Page 4: Data Summer

21st Century Skills

Page 5: Data Summer

Don’t confuse data with NCLB!

• Data gathering/analysis is an important part of NCLB.

• You might not like NCLB, but don’t hate data gathering/analysis because of NCLB.

• Data is data…we need it to see what is happening in our classrooms.

• We need to make use of this data to help us get better.

Page 6: Data Summer

Formative Assessment

• Black and Wiliam published a groundbreaking article in 1998 called “Inside the Black Box: Raising Standards Through Classroom Assessment”.

Systems Engineering Model: How Politicians See School

INPUTS (students, teachers, requirements, standards, resources $$$, curriculum, testing, parent needs, etc.) → Classroom (the black box) → OUTPUTS (competent students, satisfied teachers, high test results)

Page 7: Data Summer

Black & Wiliam

• Synthesized 250 studies from all over the world. (Yes, other countries know something about teaching and learning.)

• Found that using formative assessment as a way to promote learning had a positive impact on student learning.

• The effect size (amount of growth) was .4 to .7.

• Further research showed the impact was even greater in low-achieving students.

Page 8: Data Summer

Amount of Growth

• A .4 effect size would be the equivalent of an average student moving to the top 35% of a group not involved in the innovation.

• A .7 effect size would move the US in a recent worldwide math test from the middle of the 40 countries to the top 5.
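For the spreadsheet-inclined: an effect size of this kind (Cohen’s d) is simply the difference between two group means divided by their pooled standard deviation. A minimal sketch below — the scores are invented for illustration and are not from the Black & Wiliam studies:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Cohen's d: difference in group means divided by the
    pooled sample standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_sd = (((n1 - 1) * stdev(treatment) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical end-of-unit scores for two classes
with_fa = [78, 85, 90, 72, 88, 81, 93, 76]     # class using formative assessment
without_fa = [70, 75, 82, 65, 79, 73, 84, 68]  # comparison class
print(round(cohens_d(with_fa, without_fa), 2))
```

A d of .4 puts the average treated student at roughly the 65th percentile of the comparison group — which is exactly the “top 35%” claim above.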

Page 9: Data Summer

Formative vs. Summative

• Formative is assessment along the way…

• Summative is an end of year, end of semester, end of quarter, end of unit test.

• Formative assessment comes before the summative assessment and should help the student perform better on the summative assessment as a result of classroom instructional practices.

Page 10: Data Summer

Black & Wiliam findings cont.

• The research indicates that improving learning through assessment depends on five, deceptively simple, key factors:

1. the provision of effective feedback to pupils;

2. the active involvement of pupils in their own learning;

Page 11: Data Summer

Black & Wiliam findings cont.

3. adjusting teaching to take account of the results of assessment;

4. a recognition of the profound influence assessment has on the motivation and self-esteem of pupils, both of which are crucial influences on learning;

5. the need for pupils to be able to assess themselves and understand how to improve.

Page 12: Data Summer

Providing effective feedback to pupils

• What is effective feedback?

Page 13: Data Summer

Active involvement of pupils in their own learning

• How can we involve students in their own learning?

Page 14: Data Summer

Adjusting teaching to take account of the results of assessment

• This goes back to our instructional cycle!

• What do we do with the assessment data?

Page 15: Data Summer

Recognition of the influence assessment has on motivation and self-esteem of pupils, both of which are crucial influences on learning

• How can we use assessment to make students feel successful?

• How can we use assessment to motivate students?

• How can assessment damage a student?

Page 16: Data Summer

Students need to be able to assess themselves and understand how to improve

• How do we as teachers make this possible for our learners?

Page 17: Data Summer

Just 5 things...we can do that, right?!

Page 18: Data Summer

• At the same time, several inhibiting factors were identified. Among these are:

1. teachers tend to assess quantity of work and presentation rather than the quality of learning;

2. greater attention given to grading, much of it tending to lower the self-esteem of pupils, rather than to providing advice for improvement;

Page 19: Data Summer

Inhibiting factors cont.

3. comparing students with each other, which demoralizes the less successful learners;

4. teachers’ feedback to students often serves social and managerial purposes rather than helping them to learn more effectively;

5. teachers not knowing enough about their students’ learning needs.

Page 20: Data Summer

Teachers tend to assess quantity of work and presentation rather than the quality of learning

• We grade work…what is the grade based on?

• Doing it, not doing it?

• Quality of learning?

• Are all of our assignments (grades) standards-aligned?

• UBD: Is the assignment related to the end result we seek?

• Crayola curriculum?

Page 21: Data Summer

Grading practices tend to lower self-esteem of pupils, rather than provide advice for improvement

• Do you grade students the way you were graded when you went to school?

• Do students have a chance to improve upon a grade?

• Does our grading allow student growth?

Page 22: Data Summer

We use assessment to compare pupils with each other, which demoralizes the less successful learners

• What is the first thing a student does when you pass back a test?

• How can we get students to measure their learning not based on what others have done, but what the student has done themselves?

Page 23: Data Summer

Teachers’ feedback to pupils often serves social and managerial purposes rather than helping them to learn more effectively

• What does feedback look like in your classroom?

• How could you make it more specific and meaningful?

• What if we assign less “gradable” work and increase the quality of the feedback?

Page 24: Data Summer

Teachers not knowing enough about their pupils’ learning needs.

• What do you know about your students’ learning needs?

• How do you keep track of those needs?

• Do those needs influence your interactions with them?

Page 25: Data Summer

Assessment that promotes learning…

• is embedded in a view of teaching and learning of which it is an essential part;

• involves sharing learning goals with pupils;

• aims to help pupils to know and to recognize the standards they are aiming for;

• involves pupils in self-assessment;

• provides feedback which leads to pupils recognizing their next steps and how to take them;

• is underpinned by confidence that every student can improve;

• involves both teacher and pupils reviewing and reflecting on assessment data.

Page 26: Data Summer

Instructional Cycle

• What if they already know it? Differentiate!

• Bloom’s update!

Page 27: Data Summer

Data Collection

• Item analysis without technology is almost impossible (time-wise).

• Bubble tests – Remark OMR software

• ARS – clickers

• Respondus/Bb tests

• Rubrics (complex tasks, subjective)

• Exit cards (self-assessment with bubbles, subjective)
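Once bubble-test responses are in a spreadsheet or CSV, a basic item analysis — percent correct per question — takes only a few lines. A sketch with made-up answers (real Remark or clicker exports will be formatted differently):

```python
# Hypothetical answer key and student responses for a 5-question bubble test
key = ["A", "C", "B", "D", "A"]
responses = {
    "student1": ["A", "C", "B", "C", "A"],
    "student2": ["A", "C", "B", "A", "A"],
    "student3": ["C", "C", "B", "D", "A"],
    "student4": ["A", "B", "B", "C", "A"],
}

def item_analysis(key, responses):
    """Percent of students answering each item correctly."""
    n = len(responses)
    return [round(100 * sum(ans[i] == correct for ans in responses.values()) / n)
            for i, correct in enumerate(key)]

for q, pct in enumerate(item_analysis(key, responses), start=1):
    flag = "  <-- reteach?" if pct < 70 else ""
    print(f"Q{q}: {pct}%{flag}")
```

An item most of the class missed (here, Q4) points to content to reteach, not students to blame.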

Page 28: Data Summer

Analyze the Results

• Did the student(s) learn?

• Can they demonstrate the skill?

• What do the results of the analysis tell you in terms of what to do next? (Inform your instruction.)

• Ideally we have teams (PLC) to discuss the findings.

• What lessons, techniques, and practices worked? (Tell the story.)

• Retest/assess to measure growth (PLC) with questions that measure the same objectives.

Page 29: Data Summer

Pinnacle Analytics

• District’s data warehousing software

• OAT, OGT, Stanford, Early Literacy, grades, other

• Still slow moving…data from spring testing is not in PA because SASI rosters aren’t built yet.

• This means we can look at the data, but not in the most useful ways.

• We need admin to get class placements set earlier.

Page 30: Data Summer

Excel

• PA allows you to export data easily (really!) into Excel.

• What do you want to do with Excel?

• Rank order, compare your class to others

• Analyze growth from one grade to the next

• Means, Standard Deviation?

• Share results with students…allow them to play along with the data results.
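The summaries this slide suggests doing in Excel — rank order, mean, standard deviation — are equally easy to check in a few lines of code. A sketch with invented names and scores:

```python
from statistics import mean, stdev

# Hypothetical scores, as if pasted from a PA export
scores = {"Avery": 82, "Blake": 91, "Casey": 74, "Drew": 88, "Emery": 79}

# Rank order, highest first (like sorting a column in Excel)
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (name, score) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {score}")

values = list(scores.values())
print(f"Mean: {mean(values):.1f}")      # like =AVERAGE(...)
print(f"Std dev: {stdev(values):.1f}")  # like =STDEV(...)
```

Sharing a summary like this with students lets them see where the class stands without singling anyone out.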