Research, Policy & Evaluation: If I Knew Then What I Know Now: Building Successful Evaluation

This workshop focused on evaluation tips and tools, lessons learned, and mistakes to avoid. It was designed for those charged with leading evaluation at their organizations.

1. If I Knew Then What I Know Now: Building a Successful Evaluation

  • Roblyn Brigham, Brigham Nahas Research Associates
  • Andy Hoge, New Jersey SEEDS
  • Janet Smith, The Steppingstone Foundation
  • Danielle Stein Eisenberg, KIPP Foundation
  • April 8, 2010

2. Overview and Introduction

  • The workshop focus
    • If I knew then what I know now
  • Presentation Outline
  • Introduction of Panelists

3. Internal and External Evaluation

  • The What, Why, and Who
  • New Jersey SEEDS Alumni Follow-up Study
    • The story from inside and out

4. Evaluation Planning: Factors to Consider

  • Organizational Characteristics
  • Data Collection Capacity
  • Data Analysis

5. Organizational Characteristics and Evaluation Design

  • Size and Structure of Organization
  • Culture of Organization
  • Age of Organization
  • Nature of the Program Offering(s)

6. Designing Evaluation: Non-negotiables

  • Identify programmatic or evaluation goals upfront during the program design stage
  • Involve key stakeholders at all phases, including analysis
  • Articulate Evaluable Questions
  • Articulate Action Plan for Using Results
    • Short-term & Long-term Evaluation Plan

7. Data Collection: Capacity and Commitment

  • What Skills for What Aspects of Collection?
  • Standardize Terminology: e.g., enrollment, placement
  • Monitor Data Integrity and Accuracy

8. Data Analysis: Capacity and Action

  • Who is Involved in Analyzing the Data?
    • Skills
    • Key Points of View
    • What jumps out? What is missing?
  • Prioritize Action to be Taken in Response to Analysis

9. Presenting Results: Know Your Audience

  • The Presenter and the Audience
  • Lessons learned:
    • Making Claims, Issues of Accuracy
    • Multiple Audiences: Most Effective Format
    • Audience Response

10. Test Results Over Time

11. Test Results Over Time

13. Additional Slides

  • Evaluation Design Tool (KIPP)
  • Vision Mapping (KIPP)

14. The KSS core team articulated specific goals, objectives, and metrics for the event (which mapped back to the overall vision). Strand leads did the same.

    • The Board Strands Evaluation Planning Tool

Strand: Boards

  • Goal: Board members should be inspired by KIPP's mission and energized to contribute as Board members
    • Objective: Board members will feel inspired to continue their work with KIPP
    • Metric: 95% of board members will indicate that they feel somewhat or very inspired to continue their work with KIPP
    • Evaluation Tool: Strand Survey
  • Goal: Board members should feel part of a network-wide Board community, and national reform movement, rather than just a supporter of a local KIPP effort
    • Objective: Board members will feel like part of a national network
    • Metric: 90% of board members will indicate that they feel somewhat or very connected to a national community
    • Evaluation Tool: Strand Survey
  • Goal: Board members should learn practical skills and/or obtain tools that will enhance their Board's effectiveness
    • Objective: Board members should leave KSS with at least one tool or practical skill they can immediately put to use
    • Metric: Can name 1 tool or skill they used
    • Evaluation Tool: Strand Survey and Follow-Up Survey
  • Goal: Board members should learn about KIPP initiatives that are meaningful to their Board service, e.g., KIPP share
    • Objective: Board members will leave KSS knowing about national initiatives
    • Metric: Can name 2 KIPP initiatives that are relevant to their region or school
    • Evaluation Tool: Strand Survey

15. Executive summary: KSS 2009 successfully delivered against our vision; per-participant costs were at their lowest level ever

    • 1,810 people attended KSS 2009, up 16% from last year and the 7th consecutive year of record attendance
    • 94% of respondents strongly (70%) or somewhat agree (24%) that KSS enhances their sense of belonging to the KIPP community

[Slide graphic: KSS vision themes - Collective Power; Intro/Reconnection; Network: Share, Reflect, and Learn; Personal Learning; Kick off school year with high energy; Renew collective commitment; Per-participant costs were lowest level ever]

Please see the appendix for the supporting data for the bullet points below.

    • 91% of attendees strongly (50%) or somewhat agree (41%) that KSS provides opportunities to learn helpful strategies from colleagues in other schools or organizations.
    • The top two reasons why teacher respondents attend KSS are "I value the Professional Development opportunities KSS provides" and "I came to learn new instructional strategies"
    • 90% of all respondents strongly (59%) or somewhat agree (31%) that they learned new ideas and strategies at KSS that they could directly apply to their
    • 94% of all overall session ratings were either excellent (34%) or good (60%)
    • 94% of respondents strongly (70%) or somewhat agree (24%) that KSS renewed their sense of purpose in their work
    • KSS was fun!
    • KSS 2009 costs were 11% higher than originally budgeted, due primarily to attendance being 13% higher than projected, resulting in KSS 2009 per-participant costs being our lowest ever, down 5.5% from the previous low in 2007
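The goal-objective-metric pattern in the planning tool slide lends itself to a simple mechanical check: compare each metric's target share against the share observed in the strand survey. A minimal sketch of that pattern; all names and numbers here are illustrative, not actual KIPP survey data:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One row of an evaluation planning tool: a metric with its target."""
    description: str
    target: float    # required share of respondents, e.g. 0.95
    observed: float  # share actually measured by the strand survey

    def met(self) -> bool:
        return self.observed >= self.target

# Illustrative values only (not actual KIPP results).
board_metrics = [
    Metric("feel somewhat or very inspired to continue their work", 0.95, 0.96),
    Metric("feel somewhat or very connected to a national community", 0.90, 0.88),
]

for m in board_metrics:
    status = "MET" if m.met() else "MISSED"
    print(f"{status}: {m.description} (target {m.target:.0%}, observed {m.observed:.0%})")
```

Keeping the target alongside the observed value makes the "Prioritize Action to be Taken in Response to Analysis" step concrete: missed metrics surface immediately rather than being buried in a survey report.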

16. The Lie Factor (The Visual Display of Quantitative Information, 2nd ed., E.R. Tufte, 2001); Los Angeles Times, Aug. 5, 1979, p. 3
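Tufte's Lie Factor is a concrete formula: the size of the effect shown in the graphic divided by the size of the effect in the data, with values near 1 indicating an honest display. A quick sketch using the figures Tufte reports for that Los Angeles Times fuel-economy chart:

```python
def lie_factor(graphic_effect_pct: float, data_effect_pct: float) -> float:
    """Tufte's Lie Factor: (size of effect shown in graphic) / (size of effect in data)."""
    return graphic_effect_pct / data_effect_pct

# Tufte's figures for the LA Times chart: fuel-economy standards rose
# from 18 to 27.5 mpg (about a 53% change in the data), but the line
# depicting that change grew by 783% on the page.
data_change = (27.5 - 18.0) / 18.0 * 100   # ~52.8%
graphic_change = 783.0

print(round(lie_factor(graphic_change, data_change), 1))  # 14.8, Tufte's reported value
```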