
Page 1: Street Jibe Evaluation Workshop 2

A Conversation about Program Evaluation: Why, How and When?

Uzo Anucha, MSW, PhD
Associate Professor – School of Social Work
Director – Applied Social Welfare Research and Evaluation Group
York University

Page 2: Street Jibe Evaluation Workshop 2

2

Presentation Outline

• Setting the Context for our Program Evaluation Work
• Our Evaluation Principles
• Why Evaluate?
• Who is an Evaluation For?
• Types of Evaluation
• Outcome Evaluation
• Planning a Program Evaluation
  – Engage Stakeholders
  – Focus the Evaluation
  – Collect Data
  – Analyze & Interpret
  – Use the Information
• Ready, Set, Go? Some Things to Consider

Page 3: Street Jibe Evaluation Workshop 2

Setting the Context for our Program Evaluation Work

Page 4: Street Jibe Evaluation Workshop 2

4

Our Evaluation Principles…..

We are committed to the following principles/values in our evaluation work:
• Strengthen projects
• Use multiple approaches
• Design evaluation to address real issues
• Create a participatory process
• Allow for flexibility
• Build capacity

(W.K. Kellogg Foundation Evaluation Handbook, 1998)

Page 5: Street Jibe Evaluation Workshop 2

5

Our Evaluation Approach….

A Critical Approach: Question the questions. Some questions to consider:

• How does this program work?
• Why has it worked or not worked? For whom and in what circumstances?
• What was the process of development and implementation? What were the stumbling blocks faced along the way?
• What do the experiences mean to the people involved? How do these meanings relate to intended outcomes?
• What lessons have we learned about developing and implementing this program?
• How have contextual factors impacted the development, implementation, success, and stumbling blocks of this program?
• What are the hard-to-measure impacts of this program (ones that cannot be easily quantified)? How can we begin to effectively document these impacts?

(W.K. Kellogg Foundation Evaluation Handbook, 1998)

Page 6: Street Jibe Evaluation Workshop 2

6

Our Evaluation Approach….

We acknowledge the influence of paradigms, politics, and values and are willing to deal with these by:

• Getting ‘inside’ the project
• Creating an environment where all stakeholders are encouraged to discuss their values and philosophies
• Challenging our assumptions
• Asking stakeholders for their perspectives on particular issues
• Listening
• Remembering there may be multiple “right” answers
• Maintaining regular contact and providing feedback to stakeholders
• Designing specific strategies to air differences and grievances
• Making the evaluation and its findings useful and accessible – early feedback and a consultative relationship with stakeholders and project staff lead to a greater willingness by staff to disclose important and sensitive information
• Being sensitive to the feelings and rights of individuals
• Creating an atmosphere of openness to findings, with a commitment to considering change and a willingness to learn

(W.K. Kellogg Foundation Evaluation Handbook, 1998)

Page 7: Street Jibe Evaluation Workshop 2

What is Not Program Evaluation?

What is Program Evaluation?

Page 8: Street Jibe Evaluation Workshop 2

8

• Program evaluation is not an assessment of individual staff performance. The purpose is to gain an overall understanding of the functioning of a program.
• Program evaluation is not an audit – evaluation does not focus on compliance with laws and regulations.
• Program evaluation is not research. It is a pragmatic way to learn about a program.

What is Not Program Evaluation?

Page 9: Street Jibe Evaluation Workshop 2

9

Program evaluation is not one method. It can involve a range of techniques for gathering information to answer questions about a program.

Most programs already collect a lot of information that can be used for evaluation. Data collection for program evaluation can be incorporated in the ongoing record keeping of the program.
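To make this concrete: a minimal Python sketch of how routine record keeping could double as evaluation data. The CSV file and column names (client_contacts.csv, client_id, service_type) are hypothetical, not drawn from any actual program's records.

```python
# Minimal sketch: turning routine record keeping into evaluation data.
# Assumes a hypothetical CSV file "client_contacts.csv" with one row per
# contact and columns client_id and service_type (names are illustrative).
import csv
from collections import Counter

contacts_per_client = Counter()
service_types = Counter()

with open("client_contacts.csv", newline="") as f:
    for row in csv.DictReader(f):
        contacts_per_client[row["client_id"]] += 1
        service_types[row["service_type"]] += 1

print("Clients served:", len(contacts_per_client))
print("Average contacts per client:",
      round(sum(contacts_per_client.values()) / len(contacts_per_client), 1))
print("Contacts by service type:", dict(service_types))
```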

What is Not Program Evaluation?

Page 10: Street Jibe Evaluation Workshop 2

10

Program evaluation means taking a systematic approach to asking and answering questions about a program.

Program evaluation is a collection of methods, skills and sensitivities necessary to determine whether a human service is needed and likely to be used, whether it is sufficiently intensive to meet the unmet needs identified, whether the service is offered as planned, and whether the human service actually does help people in need at reasonable cost without undesirable side effects (Posavac & Carey, 2003, p. 2).

What is Program Evaluation?

Page 11: Street Jibe Evaluation Workshop 2

Why Evaluate?

Page 12: Street Jibe Evaluation Workshop 2

12

• Verify that resources are devoted to meeting unmet needs
• Verify that planned programs do provide services
• Examine the results
• Determine which services produce the best results
• Select the programs that offer the most needed types of services

Why Evaluate?

Page 13: Street Jibe Evaluation Workshop 2

13

• Provide information needed to maintain and improve quality
• Watch for unplanned side effects
• Create program documentation
• Help to better allocate program resources
• Assist staff in program development and improvement

Why Evaluate?

Page 14: Street Jibe Evaluation Workshop 2

14

Evaluation can….

• Increase our knowledge base
• Guide decision making
  – Policymakers
  – Administrators
  – Practitioners
  – Funders
  – General public
  – Clients
• Demonstrate accountability
• Assure that client objectives are being achieved

Page 15: Street Jibe Evaluation Workshop 2

Who is an evaluation for?

Page 16: Street Jibe Evaluation Workshop 2

16

What do they want to know?

What do we want to tell them about the program?

How can they contribute to the evaluation?

Possible audiences:
• Program participants
• Family members and caregivers
• Program staff
• Volunteers
• Partner agencies and professionals
• Referral sources
• Funders
• Others

Who is an evaluation for?

Page 17: Street Jibe Evaluation Workshop 2

Types of Evaluation….

Page 18: Street Jibe Evaluation Workshop 2

18

Types of evaluations

• Needs assessment
• Evaluability assessment
• Process evaluation
• Outcome evaluation
• Efficiency evaluation (cost evaluation)

Page 19: Street Jibe Evaluation Workshop 2

Process Evaluation….

Page 20: Street Jibe Evaluation Workshop 2

20

Process Evaluation

• Sometimes referred to as “formative evaluation”.
• Documents and analyzes how a program works and identifies key factors that influence the operation of the program.
• Allows for a careful description of a program’s actual implementation and services, thereby facilitating replication of the program.
• Emphasis is on describing activities and the characteristics of clients and workers.
• Allows for an investigation of whether services are delivered in accordance with the program design and makes it possible to study the critical ingredients of a model.

Page 21: Street Jibe Evaluation Workshop 2

21

Process Evaluation

• Findings of a process evaluation are critical in shaping the further development of a program’s services and assist in explaining why program objectives are (or are not) being met.
• Focuses on verifying program implementation: the approach to client service delivery and day-to-day operations.
• Two major elements:
  – 1) How a program’s services are delivered to clients (what workers do, including frequency and intensity; client characteristics; satisfaction)
  – 2) The administrative mechanisms that support these services (qualifications; structures; hours; support services; supervision; training)

Page 22: Street Jibe Evaluation Workshop 2

22

Process Evaluation:

Examples of process evaluation questions:
– Is the program attracting a sufficient number of clients?
– Are clients representative of the target population?
– How much contact does staff actually have with clients?
– Does the workload of staff match what was planned?
– Are there differences in effort among staff?

Page 23: Street Jibe Evaluation Workshop 2

Outcome Evaluation….

Page 24: Street Jibe Evaluation Workshop 2

24

Outcome Evaluation

Outcomes are benefits or changes for individuals or populations during or after participating in program activities. Outcomes may relate to behavior, skills, knowledge, attitudes, values, condition, or other attributes.

They are whatever is different following the program: what participants know, think, or can do; how they behave; or what their condition is.

Outcome evaluation helps us to demonstrate the nature of the change that took place.

Page 25: Street Jibe Evaluation Workshop 2

25

Outcome Evaluation

Outcome evaluation tests hypotheses about how we believe clients will change after a period of time in our program.

Evaluation findings are specific: they apply to a particular group of clients, experiencing a particular condition, in one particular program, over a particular time frame, at a particular time.

Page 26: Street Jibe Evaluation Workshop 2

26

For example:

For a program that counsels families on financial management, outputs (what the service produces) include the number of financial planning sessions and the number of families seen. The desired outcomes (the changes sought in participants' behavior or status) can include their developing and living within a budget, making monthly additions to a savings account, and having increased financial stability.
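A hedged sketch of the output/outcome distinction in this example, using a few invented participant records (family labels, session counts, and savings figures are made up purely for illustration):

```python
# Sketch: separating outputs from outcomes for the financial-counseling
# example. The participant records below are invented for illustration.
participants = [
    {"family": "A", "sessions": 6, "has_budget": True,  "monthly_savings": 50},
    {"family": "B", "sessions": 4, "has_budget": True,  "monthly_savings": 0},
    {"family": "C", "sessions": 8, "has_budget": False, "monthly_savings": 20},
]

# Outputs: what the service produced.
print("Sessions delivered:", sum(p["sessions"] for p in participants))
print("Families seen:", len(participants))

# Outcomes: changes in participants' behavior or status.
budgeting = sum(p["has_budget"] for p in participants)
saving = sum(p["monthly_savings"] > 0 for p in participants)
print(f"Families living within a budget: {budgeting}/{len(participants)}")
print(f"Families making monthly savings: {saving}/{len(participants)}")
```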

Page 27: Street Jibe Evaluation Workshop 2

27

Uses of Outcome Evaluation

• Improving program services to clients
• Generating knowledge for the profession
• Estimating costs
• Demonstrating the nature of change: evaluating program objectives, e.g. what we expect clients to achieve
• Guiding major program decisions and program activities

Page 28: Street Jibe Evaluation Workshop 2

28

Outcome Evaluation

Describe program effects:
– Is the desired outcome observed?
– Are program participants better off than non-participants?
– Is there evidence that the program caused the observed changes?
– Is there support for the theoretical foundations underpinning the program?
– Is there evidence that the program could be implemented successfully elsewhere?

Page 29: Street Jibe Evaluation Workshop 2

29

Program-Level Evaluations

• Program-level evaluations vary on a continuum and are fundamentally made up of three levels:
  – Exploratory
  – Descriptive
  – Explanatory


Page 31: Street Jibe Evaluation Workshop 2

31

Exploratory Outcome Evaluation Designs

Questions here include:
– Did the participants meet a criterion (e.g. treated vs. untreated)?
– Did the participants improve (i.e. change in the appropriate direction)?
– Did the participants improve enough (statistical vs. meaningful difference)?
– Is there a relation between change, service intensity, and participant characteristics?
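The minimal sketch below walks through these exploratory questions with invented pretest/posttest scores and service hours for a single program group. The criterion value is an assumption for illustration, and statistics.correlation requires Python 3.10+.

```python
# Sketch of the exploratory questions above, using invented pre/post scores
# and service hours for a single program group (no comparison group).
from statistics import mean, stdev, correlation

pre   = [10, 12,  9, 14, 11, 13]   # hypothetical pretest scores
post  = [13, 15, 10, 18, 12, 17]   # hypothetical posttest scores
hours = [ 5, 10,  2, 12,  4, 11]   # hypothetical service intensity

change = [b - a for a, b in zip(pre, post)]
CRITERION = 15                      # illustrative "success" threshold

print("Met criterion:", sum(p >= CRITERION for p in post), "of", len(post))
print("Mean change:", round(mean(change), 2))
# "Improved enough?": report a standardized effect size (not a significance test).
print("Effect size (mean change / SD of change):",
      round(mean(change) / stdev(change), 2))
print("Correlation of change with service hours:",
      round(correlation(change, hours), 2))
```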

Page 32: Street Jibe Evaluation Workshop 2

32

Exploratory Designs

• One-group posttest only
• Multi-group posttest only
• Longitudinal case study
• Longitudinal survey

Page 33: Street Jibe Evaluation Workshop 2

33

Strengths of Exploratory Designs

• Less intrusive and inexpensive
• Assess the usefulness and feasibility of further evaluations
• Can correlate improvement with other variables

Page 34: Street Jibe Evaluation Workshop 2

34

Descriptive Designs

• To show that something causes something else, it is necessary to demonstrate:
  1. That the cause precedes the supposed effect in time, e.g. that the intervention precedes the change.
  2. That the cause covaries with the effect – the change covaries with the intervention: the more the intervention, the more the change.
  3. That no viable explanation of the effect can be found except for the assumed cause, e.g. there can be no other explanation for the change except the intervention.
• Both 1 and 2 can be achieved with exploratory designs… but not 3.

Page 35: Street Jibe Evaluation Workshop 2

35

Descriptive Designs

• Randomized one-group posttest only
• Randomized cross-sectional and longitudinal survey
• One-group pretest-posttest
• Comparison group posttest only
• Comparison group pretest-posttest
• Interrupted time series
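As a rough illustration of one design from this list, the sketch below estimates a program effect from a comparison-group pretest-posttest layout; all scores are invented.

```python
# Sketch of a comparison-group pretest-posttest analysis (one of the
# descriptive designs listed above). All scores are invented.
from statistics import mean

program    = {"pre": [10, 11, 9, 12, 13], "post": [14, 15, 12, 16, 17]}
comparison = {"pre": [10, 12, 9, 11, 13], "post": [11, 12, 10, 12, 13]}

def mean_change(group):
    return mean(b - a for a, b in zip(group["pre"], group["post"]))

prog_change = mean_change(program)
comp_change = mean_change(comparison)
print("Program group mean change:   ", round(prog_change, 2))
print("Comparison group mean change:", round(comp_change, 2))
# The difference in changes is the descriptive estimate of program effect;
# without random assignment it is still open to alternative explanations.
print("Difference (program effect estimate):",
      round(prog_change - comp_change, 2))
```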

Page 36: Street Jibe Evaluation Workshop 2

36

Explanatory Designs

The defining characteristic is the observation of people randomly assigned to either a program or a control condition.

• Considered much better at addressing threats to internal validity.

• Program group vs. control group: if the groups are formed randomly, there is no reason to believe they differ in rate of maturation, that clients self-selected into groups, or that the groups began at different levels.
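A minimal sketch of this defining feature, with simulated posttest scores standing in for real measurements: random assignment forms the two groups, and the same posttest is then compared across them.

```python
# Sketch of the defining feature of an explanatory design: random
# assignment to a program or control condition, followed by a posttest
# comparison. Client names and scores are invented for illustration.
import random
from statistics import mean

clients = [f"client_{i}" for i in range(20)]
random.shuffle(clients)                       # chance alone forms the groups
program_group, control_group = clients[:10], clients[10:]

# After the program runs, collect the same posttest measure for both groups.
# (Simulated scores here, with a built-in difference purely for illustration.)
posttest = {c: random.gauss(70 if c in program_group else 60, 10)
            for c in clients}

print("Program mean:", round(mean(posttest[c] for c in program_group), 1))
print("Control mean:", round(mean(posttest[c] for c in control_group), 1))
# Because assignment was random, a difference between the groups is not
# attributable to maturation or self-selection; a significance test would
# assess whether it could plausibly be chance.
```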

Page 37: Street Jibe Evaluation Workshop 2

37

Explanatory Designs

• Classical experimental
• Solomon four-group
• Randomized posttest-only control group

Page 38: Street Jibe Evaluation Workshop 2

38

Explanatory Designs

• Strengths/limitations:
  – counter threats to internal validity
  – allow interpretations of causation
  – expensive and difficult to implement
  – frequent resistance from practitioners who “already know what is best”
• Suggested times to use:
  – when a new program is introduced
  – when stakes are high
  – when there is controversy over efficacy
  – when policy change is desired
  – when program demand is high

Page 39: Street Jibe Evaluation Workshop 2

Planning a Program Evaluation

Page 40: Street Jibe Evaluation Workshop 2

40

Planning a Program Evaluation

1. Engage Stakeholders
2. Focus the Evaluation
3. Collect Data
4. Analyze & Interpret
5. Use the Information

Page 41: Street Jibe Evaluation Workshop 2

41

Engage Stakeholders

• Who should be involved?
• How might they be engaged?
• Identify and meet with stakeholders: program director, staff, funders/program sponsors, and clients/program participants.

Page 42: Street Jibe Evaluation Workshop 2

42

Focus the Evaluation

• What are you going to evaluate? (Describe the program logic model / theory of change.)
• What is the evaluability of the program?
• What is the purpose of the evaluation?
• Who will use the evaluation? How will they use it?
• What questions will the evaluation seek to answer?
• What information do you need to answer the questions?
• When is the evaluation needed?
• What evaluation design will you use?
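One simple way to describe a program logic model is to set out inputs, activities, outputs, and outcomes explicitly. The sketch below does this for the financial-counseling example from the earlier slide; the inputs listed are illustrative assumptions, not drawn from that example.

```python
# Sketch: describing a program logic model as a simple data structure,
# reusing the financial-counseling example. Categories follow the usual
# inputs/activities/outputs/outcomes layout; the inputs are illustrative.
logic_model = {
    "inputs":     ["counseling staff", "meeting space", "funding"],
    "activities": ["financial planning sessions with families"],
    "outputs":    ["number of sessions held", "number of families seen"],
    "outcomes":   ["families develop and live within a budget",
                   "families make monthly additions to savings",
                   "increased financial stability"],
}

for stage, items in logic_model.items():
    print(f"{stage.upper()}:")
    for item in items:
        print(f"  - {item}")
```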

Page 43: Street Jibe Evaluation Workshop 2

43

Collect Data

• What sources of information will you use?
  – Intended beneficiaries of the program (program participants, artifacts, community indexes)
  – Providers of service (program staff, program records)
  – Observers (expert observers, trained observers, significant others, evaluation staff)
• What data collection method(s) will you use?
• When will you collect data for each method you’ve chosen?

Page 44: Street Jibe Evaluation Workshop 2

44

Analyze & Interpret

• How will the data be analyzed?
  – Data analysis methods
  – Who is responsible?
• How will the information be interpreted – by whom?
• What did you learn?
• What are the limitations?

Page 45: Street Jibe Evaluation Workshop 2

45

Use the Information

How will the evaluation be communicated and shared?

To whom? When? Where? How to present?

Next steps

Page 46: Street Jibe Evaluation Workshop 2

Ready, Set, Go?

Some things to consider…..

Page 47: Street Jibe Evaluation Workshop 2

StreetJibe: Summary of Process and Outcome Evaluation Questions

Page 48: Street Jibe Evaluation Workshop 2

48

Things to Consider…..

• Planning an evaluation follows steps similar to those of more basic research, with some additional considerations

• More effort needs to be expended in engaging and negotiating with stakeholder groups

• There needs to be a keener awareness of the social/political context of the evaluation (e.g. differing and competing interests)

Page 49: Street Jibe Evaluation Workshop 2

49

Important to consider…

• Internal or external evaluators?
• Scope of evaluation?
  – Boundary
  – Size
  – Duration
  – Complexity
  – Clarity and time span of program objectives
  – Innovativeness

Page 50: Street Jibe Evaluation Workshop 2

50

Challenging Attitudes toward Program Evaluation…….

• Expectations of slam-bang effects
• Assessing program quality is unprofessional
• Evaluation might inhibit innovation
• Program will be terminated
• Information will be misused
• Qualitative understanding might be lost
• Evaluation drains resources
• Loss of program control
• Evaluation has little impact