
How’s it Working? Evaluating Your Program


Page 1: How’s it Working?  Evaluating Your Program

How’s it Working? Evaluating Your Program

MAAPS Conference, 7 May 2010
Debra Smith & Judah Leblang

Program Evaluation & Research Group
School of Education, Lesley University

Page 2: How’s it Working?  Evaluating Your Program

PERG

Founded 1976

Over 600 program evaluation and research studies in various educational settings

Also offers professional development and consultation

Page 3: How’s it Working?  Evaluating Your Program

Session participants will:

Be introduced to the basics of program evaluation through an example

Define a question or questions about their own program

Identify methods for collecting data that would help to answer their question/s

Discuss next steps

Page 4: How’s it Working?  Evaluating Your Program

What is program evaluation?

A type of applied research focused on systematically collecting and analyzing data to help answer questions about a program, or some aspect of a program, in order to make decisions about it.

Page 5: How’s it Working?  Evaluating Your Program

Purposes

Accountability

Program development

Generating knowledge

Page 6: How’s it Working?  Evaluating Your Program

Formative vs. Summative

Formative evaluation offers feedback along the way to improve a program.

Summative evaluation "sums up" the results of a program at the end of a period of development or implementation.

Page 7: How’s it Working?  Evaluating Your Program

Audiences

Funders
Program leaders
Program participants
Organizational partners
Others

Page 8: How’s it Working?  Evaluating Your Program

Evaluation process

1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

PROGRAM

Page 9: How’s it Working?  Evaluating Your Program

An example: Evolutions

• After-school program begun in 2005, connected with the Peabody Museum of Natural History at Yale University; initially involved approximately 40 low-SES/minority students

Page 10: How’s it Working?  Evaluating Your Program
Page 11: How’s it Working?  Evaluating Your Program

Evolutions program goals

To provide opportunities for students to:
• Prepare for post-secondary (college) education;
• Learn about scientific and other careers;
• Expand their knowledge of and interest in science (science literacy);
• Develop transferable skills for the future; and
• Learn about the Peabody Museum/museum careers.

Page 12: How’s it Working?  Evaluating Your Program
Page 13: How’s it Working?  Evaluating Your Program
Page 14: How’s it Working?  Evaluating Your Program
Page 15: How’s it Working?  Evaluating Your Program
Page 16: How’s it Working?  Evaluating Your Program
Page 17: How’s it Working?  Evaluating Your Program

Logic models

Map a coherent chain of connections between goals, resources, activities and what you expect (short term), want (over an intermediate period) and hope (in the long term) to happen.

They also reflect your assumptions and theory of action or change.

Page 18: How’s it Working?  Evaluating Your Program

Logic Model Key Concepts

Resources or Inputs: Staff, funds, materials, space, etc.
Activities: What we plan to do and who we will do it for
Outputs: The results of our program (direct outputs)
Short-term outcomes: Outcomes (changes) at completion of the project year or soon after
Long-term outcomes: Outcomes (changes) several years beyond completion of the project
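A logic model is usually kept as a simple chart or spreadsheet, but the categories above amount to a small data structure. As a rough illustration only (not part of the original slides), here is a minimal Python sketch; the field names and the sample entries, loosely drawn from the EVO example on the next slide, are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """One program's logic model, using the categories from the slide above."""
    resources: List[str] = field(default_factory=list)    # staff, funds, materials, space
    activities: List[str] = field(default_factory=list)   # what we plan to do / for whom
    outputs: List[str] = field(default_factory=list)      # direct results of the program
    short_term_outcomes: List[str] = field(default_factory=list)  # changes by end of project year
    long_term_outcomes: List[str] = field(default_factory=list)   # changes several years out

# Hypothetical example, loosely based on the EVO slide that follows.
evo = LogicModel(
    resources=["Full-time project director", "Peabody Museum funding", "Classroom space"],
    activities=["In-depth exploration of science topics", "Tours of Peabody collections"],
    outputs=["Students meet at least 6 scientists", "Students visit at least 2 natural history museums"],
    short_term_outcomes=["Students learn exhibition-production skills", "Students understand key science themes"],
    long_term_outcomes=["Students understand careers within disciplines", "Students understand the college application process"],
)
print(evo.outputs)
```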

Page 19: How’s it Working?  Evaluating Your Program

An EVO example

Resources or Inputs: Full-time project director; funds from the Peabody Museum and other funders; classroom space, etc.
Activities: In-depth exploration of science topics; tours of Peabody collections; Yale scientist labs
Outputs: Students will meet at least 6 scientists; students will visit no fewer than 2 natural history museums
Short-term outcomes: Students will learn skills associated with producing a museum exhibition; understand key science themes
Long-term outcomes: Students will understand different types of careers within disciplines; understand the college application process; be inspired to pursue a career in the sciences

Page 20: How’s it Working?  Evaluating Your Program

Logic models may look different...

Goal
Rationale
Assumptions
Resources / Activities / Outputs
Short-term outcomes
Mid-term outcomes
Long-term outcomes

Page 21: How’s it Working?  Evaluating Your Program

Develop a logic model for your own program/ project

Page 22: How’s it Working?  Evaluating Your Program

Evaluation process

1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

PROGRAM

Page 23: How’s it Working?  Evaluating Your Program

Questions: Think Goldilocks

Specific but not too detailed

Important but not too broad in scope

Page 24: How’s it Working?  Evaluating Your Program

Key Questions: Part One

How does EVO prepare students for college or high school?

How are EVO students involved in developing an exhibit at the museum?

Do students develop increased “science literacy,” as defined by EVO staff?

Page 25: How’s it Working?  Evaluating Your Program

Key Questions: Part Two

How (if at all) do students express more confidence about and interest in doing science?

Are students more aware of careers in science?

How (if at all) do students demonstrate increased knowledge of the college application process, and develop criteria for choosing a college that meets their needs?

Page 26: How’s it Working?  Evaluating Your Program

What questions do you want to answer about your program?

Page 27: How’s it Working?  Evaluating Your Program

Evaluation process

1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

PROGRAM

Page 28: How’s it Working?  Evaluating Your Program

Data collection methods

Observation

Interviews/ focus groups

Surveys

Document/artifact review

Page 29: How’s it Working?  Evaluating Your Program

PERG Evaluation Matrix: Evolutions 2005-06

Data collection activities (columns): Observe EVO students | Student focus groups | Interview project director | Review project docs and artifacts | Examine pre-post survey

Evaluation questions (rows), with checkmarks showing which data collection activities address each:
• Student prep for college/academic planning: √ √ √ √ √
• Student involvement in museum exhibit: √ √ √
• Students' development of science literacy: √ √ √ √
• Student learning: √ √ √ √ √
• Students' interest in science/environment: √ √ √ √ √
• Students' confidence in doing science: √ √ √
• Students' interest in/knowledge of science careers: √ √ √ √
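The matrix is simply a cross-reference of evaluation questions against data sources. As a hedged sketch (not the evaluators' actual tooling), it can be kept as a mapping from each question to the set of methods that address it, which also makes it easy to flag questions that rely on too few sources, the triangulation concern raised a few slides later. The method abbreviations, and the method assignments for the partially checked row, are assumptions for illustration.

```python
# Question -> set of data collection methods, mirroring the matrix above.
# Method names abbreviate the column headers; the row with three checks uses
# an assumed (hypothetical) assignment, since the flattened slide does not
# say which columns those checks fall under.

METHODS = {"observation", "focus_groups", "director_interview", "documents", "pre_post_survey"}

matrix = {
    "Student prep for college/academic planning": set(METHODS),   # all five checked
    "Student learning": set(METHODS),                              # all five checked
    "Students' interest in science/environment": set(METHODS),     # all five checked
    "Students' confidence in doing science": {                     # three checks (assignment assumed)
        "focus_groups", "observation", "pre_post_survey"},
}

# Simple triangulation check: flag any question answered by fewer than two sources.
for question, methods in matrix.items():
    if len(methods) < 2:
        print(f"Needs more data sources: {question}")
    else:
        print(f"{question}: {len(methods)} data sources")
```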

Page 30: How’s it Working?  Evaluating Your Program

Technical considerations: Validity

Will the data answer the questions?

Are we asking the right questions?

Page 31: How’s it Working?  Evaluating Your Program

Triangulation

Is there adequate triangulation (use of multiple methods and/or data sources) to ensure validity?

Page 32: How’s it Working?  Evaluating Your Program

Drafting your own matrix: What data will help you answer your questions?

Page 33: How’s it Working?  Evaluating Your Program

Evaluation process

1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

PROGRAM

Page 34: How’s it Working?  Evaluating Your Program

Collecting data

Make sure your plan is doable given time and resources available.

Design instruments to focus your data collection, ensure consistency and avoid bias.

Be organized: take notes, develop a system for tracking/ filing your data.
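One simple way to "develop a system for tracking/filing your data" is a running log of every data collection event. The sketch below is an assumption for illustration, not PERG's actual practice; the filename and field names are hypothetical.

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("data_collection_log.csv")  # hypothetical filename

def log_event(method: str, source: str, location: str, notes: str = "") -> None:
    """Append one data collection event to the running CSV log."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "method", "source", "location", "notes"])
        writer.writerow([date.today().isoformat(), method, source, location, notes])

# Example entries drawn from the first-year site visit described later in the deck.
log_event("focus group", "students", "Peabody Museum", "end-of-year cohort")
log_event("interview", "project director", "Peabody Museum")
```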

Page 35: How’s it Working?  Evaluating Your Program

Collecting data

Communicate clearly about what you are doing, why and how the findings will be shared and used.

Be mindful of human subjects protections. Does your organization have an institutional review board (IRB)?

Page 36: How’s it Working?  Evaluating Your Program

The First Year: site visit

On-site data collection:
• Focus groups with students
• Interviews with director, project staff
• Observation of end-of-year event
• Parent interviews

Page 37: How’s it Working?  Evaluating Your Program

Evaluation process

1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

PROGRAM

Page 38: How’s it Working?  Evaluating Your Program

Analyzing data

What stands out?

What are the patterns?

What are the similarities?

What are the differences?

Is more information needed?
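For the "Examine pre-post survey" data source in the matrix above, one basic way to look for patterns is simply comparing mean responses before and after the program. Below is a minimal sketch, assuming a 1-5 Likert-style item on confidence in doing science; the item and the numbers are made up for illustration.

```python
from statistics import mean

# Hypothetical 1-5 ratings for "I feel confident doing science," pre and post program.
pre = [2, 3, 3, 2, 4, 3, 2, 3]
post = [3, 4, 4, 3, 4, 4, 3, 4]

change = mean(post) - mean(pre)
print(f"Pre mean:  {mean(pre):.2f}")
print(f"Post mean: {mean(post):.2f}")
print(f"Mean change: {change:+.2f}")

# Treat a shift like this as a pattern worth probing further (triangulate with
# focus groups and observation) rather than a definitive finding, given the small sample.
```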

Page 39: How’s it Working?  Evaluating Your Program

Reliability

Are the patterns in the data, or judgments about the data, consistent?

Page 40: How’s it Working?  Evaluating Your Program

Validity, again

Is the data helping you answer the questions?

Is the data credible?

Page 41: How’s it Working?  Evaluating Your Program

Evaluation process

1. Goals/logic model
2. Questions
3. Evaluation plan
4. Data collection
5. Data analysis
6. Reporting

PROGRAM

Page 42: How’s it Working?  Evaluating Your Program

Reporting

Consider purpose and audience/s

Report relevant findings, questions, and recommendations

Engage stakeholders in discussion

Use findings to inform next steps

Page 43: How’s it Working?  Evaluating Your Program

Results of the first-year evaluation

• The impact of the evaluation on EVO—more focused program, clearer objectives, suggestions for sustainability.

• Evidence of program success: retention, student engagement, and positive changes in students' views of doing science and of scientists.

Page 44: How’s it Working?  Evaluating Your Program

The Ongoing Evaluation: shaping the program

• Implementation of evaluator suggestions—examples: informational interviewing, developing a smaller exhibit, refining requirements for students

Page 45: How’s it Working?  Evaluating Your Program

EVO: 2006-Today

• Continued development and expansion of EVO from 2006 until today: growth of the program from approximately 40 to more than 80 students, and introduction of internships and Sci Corps.

– Different areas of science focus—environmental awareness, geoscience, depending on funding sources.

Page 46: How’s it Working?  Evaluating Your Program

Evaluation resources

W.K. Kellogg Foundation Evaluation Handbook www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf

Kellogg Logic Model Development Guide www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf

Basic Guide to Program Evaluation
www.managementhelp.org/evaluatn/fnl_eval.htm

Page 47: How’s it Working?  Evaluating Your Program

Evaluation resources

Program Evaluation & Research Group
Lesley University
29 Everett St.
Cambridge, MA 02138
www.lesley.edu/
[email protected]