
Evaluating Outcomes Across the Partnerships

Tom Loveless

Director, Brown Center on Education Policy

The Brookings Institution

[email protected]

Saturday, June 14, 2003

What is evaluation?

• Project to determine a program’s effectiveness

• Find out if money has been spent as intended

• Offer feedback on effective and ineffective elements of a project

Challenges to Effective Evaluation

• Complex outcomes

• Complex causes

• Politically controversial

Complex outcomes

Example: professional development

• Satisfaction

• Behavioral change

• Systemic change

• Learning-- teacher or student

Complex causes

Example: an innovation’s effect on student learning

• Factors outside school (e.g. parents, peers)

• Factors inside school--composition of classroom, change in personnel

• Other reforms occurring simultaneously

• Teacher effects (e.g. skills, attitudes)

Politically controversial

• Every education program has a constituency

• High stakes--continued funding, reputations

Why evaluate?

• Determine whether you are using your money effectively

• Parts of the program may work better than others-- re-tool

• Are you meeting the goals of your program?

– Can you get the same results with fewer activities?

– How do participants in your program fare against a comparable group of non-participants?

Examples of Good and Bad Evaluations in Education

• Annenberg Grant-- case studies of systemic reforms

• STAR Tennessee-- randomized field trials of reduced class size

Benefits of Randomized Field Trials

• Gold standard of evaluation and evidence in health sciences--drugs, treatments--and increasingly in policy fields

• Equalizes treatment and control groups

• Controls for unobserved characteristics--selection effects

• Isolates treatment effects
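In potential-outcomes terms (an added note, not from the original slides): because random assignment makes treatment status independent of teachers' observed and unobserved characteristics, the simple difference in mean outcomes,

    \hat{\tau} = \bar{Y}_{\text{treatment}} - \bar{Y}_{\text{control}},

is an unbiased estimate of the average treatment effect; a selection effect would require assignment to depend on those characteristics, which randomization rules out.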

Evaluation of the MSPs

• I will focus on one of the MSP components:

– Summer institutes offering professional development to increase teacher content knowledge

• Goals (outcomes to assess):

– Short term: increase teachers’ content knowledge in mathematics

– Long term: increase student achievement in mathematics

Key Qualities of Model Summer Institutes for Middle School Math Teachers

• Expertise: conducted by mathematicians from college and university math departments

• Duration: 4- to 6-week summer sessions with regular follow-up during the school year; approximately 300 hours total

• Content

– Basics I (whole numbers, integers, decimals)

– Basics II (fractions, rates, ratios, proportions, percents)

– Algebra

– Geometry

Design of Evaluation

• Use lottery to assign over-subscribed institute slots, setting up randomized field trials (work with math departments)

• Pre- and post-testing of participating teachers and their students

• Express learning gains in effect sizes (SD units of the pre-test); a computational sketch follows below
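A minimal sketch of the two steps above, assuming a list of applicant identifiers and paired pre/post scores (the function names, seed, and toy numbers are illustrative assumptions, not part of the evaluation plan):

import random
import statistics

def lottery_assignment(applicants, slots, seed=2003):
    """Assign over-subscribed applicants to institute slots by lottery.
    Applicants not drawn form the control group."""
    rng = random.Random(seed)
    pool = list(applicants)
    rng.shuffle(pool)
    return pool[:slots], pool[slots:]          # (treatment, control)

def effect_size(pre_scores, post_scores):
    """Express the average pre-to-post gain in SD units of the pre-test."""
    gain = statistics.mean(post_scores) - statistics.mean(pre_scores)
    return gain / statistics.stdev(pre_scores)

# Toy usage (made-up numbers):
# treatment, control = lottery_assignment(range(100), slots=40)
# effect_size([50, 55, 60], [55, 60, 65])   # gain of 5 points / SD of 5 = 1.0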

Teacher Test -- Key elements

• Content of test matched to materials in summer institutes-- mathematically sound

• No pedagogy or other extraneous topics

• Criterion-referenced-- the purpose is not to find out where teachers fall on the distribution of scores, but to establish minimal levels of proficiency to teach mathematics (a scoring sketch follows below)
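A minimal sketch of what criterion-referenced scoring means in practice, under assumed inputs (the score format and cutoff are illustrative; the real instrument and standard would be set with the participating math departments):

def proficiency_report(scores, cutoff):
    """Criterion-referenced scoring: compare each teacher's score with a
    fixed proficiency cutoff instead of ranking teachers against one another."""
    proficient = {teacher: score >= cutoff for teacher, score in scores.items()}
    share_proficient = sum(proficient.values()) / len(proficient)
    return proficient, share_proficient

# e.g. proficiency_report({"t1": 82, "t2": 58}, cutoff=70) -> ({"t1": True, "t2": False}, 0.5)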

Comparison Groups

• Randomized field trials

• Matched pairs

• Participants vs everyone else

• Pre- and post-treatment, change over time (a difference-in-gains sketch follows below)
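One way to combine the last two comparisons-- participants vs. a comparison group, measured as pre-to-post change-- is the difference in average gains. The sketch below is an illustration under an assumed data layout (lists of (pre, post) score pairs), not a prescribed analysis:

from statistics import mean

def gain_difference(participants, comparison):
    """Difference in average pre-to-post gains between participants and a
    comparison group (matched pairs, or non-participants)."""
    def avg_gain(group):
        return mean(post - pre for pre, post in group)
    return avg_gain(participants) - avg_gain(comparison)

# e.g. gain_difference([(50, 60), (55, 70)], [(52, 57), (48, 53)]) -> 7.5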

The Problem of Middle School Math

• Middle school math teachers with elementary teaching certificates, insufficient math training, and inadequate content knowledge

• The push to increase the percentage of 8th graders taking algebra means an increasing number of teachers with inadequate content knowledge

• The problem is not simply mastery of algebra, but of the mathematical content leading up to algebra and how algebra connects with other mathematical fields (e.g. geometry, trigonometry, calculus)

Tom Loveless, The 2001 Brown Center Report on American Education, The Brookings Institution.
