How’s it Working? Evaluating Your Program. MAAPS Conference, 7 May 2010. Debra Smith & Judah Leblang, Program Evaluation & Research Group, School of Education, Lesley University


  • Slide 1
  • How’s it Working? Evaluating Your Program. MAAPS Conference, 7 May 2010. Debra Smith & Judah Leblang, Program Evaluation & Research Group, School of Education, Lesley University
  • Slide 2
  • PERG: founded in 1976; over 600 program evaluation and research studies in various educational settings; also offers professional development and consultation.
  • Slide 3
  • Session participants will: be introduced to the basics of program evaluation through an example; define a question or questions about their own program; identify methods for collecting data that would help to answer their question(s); and discuss next steps.
  • Slide 4
  • What is program evaluation? A type of applied research focused on systematically collecting and analyzing data to help answer questions about a program, or some aspect of a program, in order to make decisions about it.
  • Slide 5
  • Purposes: accountability; program development; generating knowledge.
  • Slide 6
  • Formative vs. summative: formative evaluation offers feedback along the way to improve programs; summative evaluation sums up the results of a program at the end of a period of development or implementation.
  • Slide 7
  • Audiences: funders; program leaders; program participants; organizational partners; others.
  • Slide 8
  • Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model; 2. Questions; 3. Evaluation plan; 4. Data collection; 5. Data analysis; 6. Reporting.
  • Slide 9
  • An example: Evolutions. An after-school program begun in 2005, connected with the Peabody Museum of Natural History at Yale University; it initially involved approximately 40 low-SES/minority students.
  • Slide 10 (image-only slide)
  • Slide 11
  • Evolutions program goals. To provide opportunities for students to: prepare for post-secondary (college) education; learn about scientific and other careers; expand their knowledge of and interest in science (science literacy); develop transferable skills for the future; and learn about the Peabody Museum and museum careers.
  • Slides 12-17 (image-only slides)
  • Logic models: map a coherent chain of connections between goals, resources, activities, and what you expect (short term), want (over an intermediate period), and hope (in the long term) to happen. They also reflect your assumptions and theory of action or change.
  • Slide 18
  • Logic Model Key Concepts (general information for each category):
    Resources or inputs: staff, funds, materials, space, etc.
    Activities: what we plan to do and who we will do it for.
    Outputs: the direct results of our program.
    Short-term outcomes: outcomes (changes) at completion of the project year or soon after.
    Long-term outcomes: outcomes (changes) several years beyond completion of the project.
  • Slide 19
  • EVO examples for each category:
    Resources or inputs: a full-time project director; funds from the Peabody Museum and other funders; classroom space, etc.
    Activities: in-depth exploration of science topics; tours of Peabody collections; visits to Yale scientists' labs.
    Outputs: students will meet at least 6 scientists; students will visit no fewer than 2 natural history museums.
    Short-term outcomes: students will learn skills associated with producing a museum exhibition and will understand key science themes.
    Long-term outcomes: students will understand different types of careers within disciplines, understand the college application process, and be inspired to pursue a career in the sciences.
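Kept in a machine-readable form, a logic model is easy to revise as the program evolves. Below is a minimal sketch in Python that condenses the EVO entries above into a plain dictionary; the variable name and field labels are illustrative choices, not a PERG format.

```python
# A logic model as a plain dictionary: one key per category, each
# holding a list of entries condensed from the EVO example above.
evo_logic_model = {
    "resources or inputs": ["full-time project director", "Peabody Museum funds", "classroom space"],
    "activities": ["explore science topics", "tour Peabody collections", "visit Yale labs"],
    "outputs": ["meet at least 6 scientists", "visit at least 2 natural history museums"],
    "short-term outcomes": ["learn exhibition skills", "understand key science themes"],
    "long-term outcomes": ["understand science careers", "understand college applications"],
}

# Print the chain from inputs to long-term outcomes.
for category, entries in evo_logic_model.items():
    print(category.title())
    for entry in entries:
        print(f"  - {entry}")
```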
  • Slide 20
  • Logic models may look different: for example, goal, rationale, assumptions, resources, activities, outputs, short-term outcomes, mid-term outcomes, long-term outcomes.
  • Slide 21
  • Develop a logic model for your own program/project.
  • Slide 22
  • Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model; 2. Questions; 3. Evaluation plan; 4. Data collection; 5. Data analysis; 6. Reporting.
  • Slide 23
  • Questions: think Goldilocks. Specific, but not too detailed; important, but not too broad in scope.
  • Slide 24
  • Key Questions: Part One How does EVO prepare students for college or high school? How are EVO students involved in developing an exhibit at the museum? Do students develop increased science literacy, as defined by EVO staff?
  • Slide 25
  • Key Questions: Part Two How (if at all) do students express more confidence about and interest in doing science? Are students more aware of careers in science? How (if at all) do students demonstrate increased knowledge of the college application process, and develop criteria for choosing a college that meets their needs?
  • Slide 26
  • What questions do you want to answer about your program?
  • Slide 27
  • Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model; 2. Questions; 3. Evaluation plan; 4. Data collection; 5. Data analysis; 6. Reporting.
  • Slide 28
  • Data collection methods: observation; interviews/focus groups; surveys; document/artifact review.
  • Slide 29
  • PERG Evaluation Matrix, Evolutions 2005-06: a grid pairing each evaluation question with the data collection activities that address it.
    Data collection activities: observe EVO students; student focus groups; interview the project director; review project docs and artifacts; examine the pre/post survey.
    Evaluation questions: student prep for college/academic planning; student involvement in the museum exhibit; students' development of science literacy; student learning; students' interest in science/the environment; students' confidence in doing science; students' interest in/knowledge of science careers.
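Outside of a spreadsheet, one way to work with such a matrix is as a mapping from each question to the activities that address it. The sketch below is hypothetical: the slide does not preserve which cells of the grid were checked, so the particular pairings are invented for illustration.

```python
# Hypothetical evaluation matrix: each question maps to the data
# collection activities assumed to address it (pairings invented).
matrix = {
    "student prep for college/academic planning": ["student focus groups", "interview project director"],
    "student involvement in the museum exhibit": ["observe EVO students", "review project docs and artifacts"],
    "students' development of science literacy": ["examine pre/post survey", "observe EVO students"],
}

for question, activities in matrix.items():
    print(f"{question}: {', '.join(activities)}")
```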
  • Slide 30
  • Technical considerations: validity. Will the data answer the questions? Are we asking the right questions?
  • Slide 31
  • Triangulation: Is there adequate triangulation (use of multiple methods and/or data sources) to ensure validity?
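That check can be automated: flag any question that rests on a single method or source. A minimal sketch, assuming the same dictionary shape as the hypothetical matrix above.

```python
# Hypothetical question-to-sources mapping, in the same shape as above.
matrix = {
    "students' development of science literacy": ["pre/post survey", "observation"],
    "student learning": ["pre/post survey"],  # one source only: weak triangulation
}

# Flag questions backed by fewer than two methods/sources.
for question, sources in matrix.items():
    if len(sources) < 2:
        print(f"Consider another data source for: {question}")
```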
  • Slide 32
  • Drafting your own matrix: What data will help you answer your questions?
  • Slide 33
  • Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model; 2. Questions; 3. Evaluation plan; 4. Data collection; 5. Data analysis; 6. Reporting.
  • Slide 34
  • Collecting data: Make sure your plan is doable given the time and resources available. Design instruments to focus your data collection, ensure consistency, and avoid bias. Be organized: take notes and develop a system for tracking/filing your data.
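A tracking system need not be elaborate; a running log with a consistent schema goes a long way. Here is a minimal sketch that appends one row per data collection event to a CSV file; the file name and fields are just one possible scheme, not a PERG recommendation.

```python
import csv
from datetime import date

# One row per data collection event: when, what, who collected it,
# and where the raw notes live. Schema and file name are illustrative.
with open("data_log.csv", "a", newline="") as f:
    csv.writer(f).writerow([
        date.today().isoformat(),
        "student focus group",
        "evaluator A",
        "notes/focus_group_session3.txt",
    ])
```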
  • Slide 35
  • Collecting data: Communicate clearly about what you are doing, why, and how the findings will be shared and used. Be mindful of human subjects protections. Does your organization have an institutional review board (IRB)?
  • Slide 36
  • The first year: site visit. On-site data collection; focus groups with students; interviews with the director and project staff; observation of the end-of-year event; parent interviews.
  • Slide 37
  • Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model; 2. Questions; 3. Evaluation plan; 4. Data collection; 5. Data analysis; 6. Reporting.
  • Slide 38
  • Analyzing data: What stands out? What are the patterns? What are the similarities? What are the differences? Is more information needed?
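For the pre/post survey in the matrix, a first pass at "what stands out?" can be simple descriptive statistics. A minimal sketch with invented 1-5 ratings; the survey items and numbers are made up for illustration.

```python
from statistics import mean

# Invented pre/post ratings (1-5 scale), one list of student responses per item.
pre = {
    "confidence in doing science": [2, 3, 2, 4],
    "interest in science careers": [3, 3, 2, 3],
}
post = {
    "confidence in doing science": [4, 4, 3, 5],
    "interest in science careers": [4, 3, 3, 4],
}

# Mean change per item: large shifts are candidates for patterns worth probing.
for item in pre:
    print(f"{item}: mean change {mean(post[item]) - mean(pre[item]):+.2f}")
```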
  • Slide 39
  • Reliability: Are the patterns in the data, or judgments about the data, consistent?
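One concrete reliability check is whether two people coding the same data reach the same judgments. A minimal sketch computing percent agreement between two hypothetical raters; a chance-corrected statistic such as Cohen's kappa would be a natural next step.

```python
# Codes assigned independently by two raters to the same ten
# focus-group excerpts (labels invented for illustration).
rater_a = ["engaged", "engaged", "neutral", "engaged", "neutral",
           "engaged", "neutral", "engaged", "engaged", "neutral"]
rater_b = ["engaged", "neutral", "neutral", "engaged", "neutral",
           "engaged", "neutral", "engaged", "engaged", "engaged"]

# Percent agreement: the share of excerpts on which the raters concur.
matches = sum(a == b for a, b in zip(rater_a, rater_b))
print(f"Agreement: {matches / len(rater_a):.0%}")  # prints "Agreement: 80%"
```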
  • Slide 40
  • Validity, again: Is the data helping you answer the questions? Is the data credible?
  • Slide 41
  • Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model; 2. Questions; 3. Evaluation plan; 4. Data collection; 5. Data analysis; 6. Reporting.
  • Slide 42
  • Reporting: consider purpose and audience(s); report relevant findings, questions, and recommendations; engage stakeholders in discussion; use findings to inform next steps.
  • Slide 43
  • Results of the first-year evaluation. The impact of the evaluation on EVO: a more focused program, clearer objectives, and suggestions for sustainability. Evidence of program success: retention, student engagement, and positive changes in students' views of doing science and of scientists.
  • Slide 44
  • The ongoing evaluation, shaping the program: implementation of evaluator suggestions. Examples: informational interviewing, developing a smaller exhibit, refining requirements for students.
  • Slide 45
  • EVO, 2006 to today: continued development and expansion of the program from approximately 40 to more than 80 students; introduction of internships and Sci Corps; different areas of science focus (environmental awareness, geoscience), depending on funding sources.
  • Slide 46
  • Evaluation resources: W.K. Kellogg Foundation Evaluation Handbook, www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf; Kellogg Logic Model Development Guide, www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf; Basic Guide to Program Evaluation, www.managementhelp.org/evaluatn/fnl_eval.htm
  • Slide 47
  • Evaluation resources: Program Evaluation & Research Group, Lesley University, 29 Everett St., Cambridge, MA 02138; www.lesley.edu/perg.htm; 617-349-8172; [email protected]