Designing Influential Evaluations – Session 4: Approaches, Methods & Tools
Uganda Evaluation Week Pre-Conference Workshop, 19th and 20th May 2014


Page 1

Designing Influential EvaluationsSession 4Approaches, Methods & Tools

Uganda Evaluation Week - Pre-Conference Workshop19th and 20th May 2014

Page 2

Sequence of planning:

Questions → Theory/Approach → Methods → Tools

Page 3

Four contrasting key questions

To what extent can a specific (net) impact be attributed to the intervention?
• Conditions that suit experiments and statistical models

Did the intervention make a difference?
• Contributory causes and causal packages

How has the intervention made a difference?
• Explanation and importance of theory

Will the intervention work elsewhere?
• External validity, transferability and generalisation

Page 4


THEORIES & APPROACHES

Page 5

Why are theories and approaches important?

Different approaches to evaluation help to:
• define the evaluation purpose and process
• determine different levels of participation
• draw boundaries for the role of the evaluator

Theories and approaches:
• are linked to a specific philosophical orientation, design and methodology
• stress the importance of socio-political and contextual factors
• have considerable overlap
• have strengths and weaknesses

(c) Helen Simons and Georgie Parry-Crooke

Page 6

Evaluation theories give guidance on:
• the nature of what we evaluate
• how evaluators should practice in the real world
• how to assign value to programmes and their performance
• how to construct knowledge
• how to use knowledge generated by evaluation

They “offer a set of rules, prescriptions, prohibitions, and guiding frameworks that specify what a good or proper evaluation is and how evaluations should be done” (Alkin

Page 7

Theories and approaches: a personal selection

• Utilization-focused
• Results-based
• Theory-based
• Democratically oriented
• Realist
• Empowerment
• Participatory
• Goal-oriented
• Case study
• Responsive
• Meta-evaluation
• Experimentalism
• Value for money
• Cost-benefit analysis

Page 8

Results-based evaluation
• Public management agenda
• Assessment against planned targets
• Helps establish key goals and outcomes
• Permits managers to identify and take action to correct weaknesses
• Supports a development agenda that is shifting towards greater accountability

Page 9

Participatory evaluation
• Joint enterprise – engages participants at all stages of planning and implementation
• Akin to empowerment evaluation (different claims)
• Reduces the threat of evaluation – a transparent process
• Recognizes the value of all participants’ contributions
• Responsive to the specific cultural/political context
• Shifts the focus of who does evaluation and how – collaborative roles/responsibilities
• Promotes understanding between partners
• Increases evaluation capacity and use – involving those affected by outcomes
• Close links to utilization-focused evaluation

Page 10

Theory-based evaluation
• Increasingly common for development evaluation
• Close links to the managing-for-results agenda
• Examines how a programme is expected to work
• Central to many donors’ evaluation policy
• Brings together elements of:
  ◦ social science theory – the principles that shape social behaviour, e.g. social cognitive learning theory, the theory of health behaviour change
  ◦ programme theory – the assumptions that guide the way specific programmes, treatments or interventions are implemented and expected to bring about change
• Helps identify the performance dimensions most critical to a programme’s success

Page 11


Discussion Exercise

Consider two contrasting approaches: Participatory and Results-based Evaluation.

Working in small groups, try to identify settings in which one or the other would be the preferable approach.

Set out your thinking on a flipchart, with notes on the strengths and weaknesses of each approach.

Page 12


Discussion exercise

Work in groups of two or three and discuss the following question.

Why have theories? What benefits do they bring to evaluators?

List your ideas for discussion in plenary.

Page 13

Discussion feedback – why have theories?
• Common language among evaluators
• Unique knowledge base
• Distinct identity
• Reflect underlying concerns
• Facilitate communication among evaluators
• Help understand and share good practices
• Provide a rationale for evaluation procedures
• Help fit the needs of stakeholders
• Choice for appropriateness in different settings

© Derek Poate, Helen Simons and Georgie Parry-Crooke

Page 14


SELECTING METHODS

Page 15

What is ‘Quality’ & ‘Rigour’?

All evaluation and research faces quality challenges:
• Is it biased?
• Is it precise?
• Are the methods sound?
• Are they well used?
• Can I trust the findings?

(c) Eliot Stern for UKES

Page 16

Why definitions matter: common definitions of “impact”

“Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.” (OECD/DAC)
• Search for any effect
• Longer-term change
• Effects may be +ve and –ve
• Effects are somehow ‘produced’ (a weaker notion of causality)

“The difference in the indicator of interest with the intervention and without the intervention... tackling attribution by rigorously identifying a counterfactual value.” (e.g. 3ie / J-PAL)
• Search for a given effect
• Tends to be methods-led
• Emphasis on attribution (direct link between cause and effect)
• Reliance on counterfactual logic (see the sketch below)

Sources: Stern et al (2012: 6); OECD-DAC (2002); White (2010).
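To make the counterfactual definition concrete, here is a minimal sketch, not from the workshop itself: impact is estimated as the mean outcome with the intervention minus the mean outcome without it. All names and figures are hypothetical.

```python
# Counterfactual impact as "with intervention minus without intervention",
# estimated naively as a difference in group means. Hypothetical data.

def estimated_impact(with_intervention, without_intervention):
    """Impact estimate: mean outcome with the intervention minus without."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(with_intervention) - mean(without_intervention)

# Hypothetical household incomes (USD/month) in programme and comparison villages.
programme = [54, 61, 58, 63, 59]
comparison = [50, 52, 49, 55, 51]
print(estimated_impact(programme, comparison))  # 7.6: the estimated net effect
```

The credibility of such an estimate rests entirely on how the comparison group is constructed, which is why the design options discussed below matter.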

Page 17

Broadening Impact Evaluation

“Impact evaluations are evaluations that assess the contribution of an intervention towards some outcome or goal. The contribution may be intended or unintended, positive or negative, long-term or short-term. Impact evaluations attempt to identify a clear link between causes and effects, and explain how the intervention worked and for whom.”

• Different approaches to causal inference
• Power relations – who counts?
• Explanations – mixed methods

Page 18

Appropriate design

Factors to consider:
• Evaluation purpose: who will use the evaluation, and what for?
• Evaluation questions: what needs to be measured? Is it ‘net effect’? Is it ‘how the impact occurred’?
• Attributes of the intervention (+ context): small-N interventions, long-term and intangible effects, etc.
• ‘Right rigour’: how much certainty is needed? (See the sketch after this list.)
• Resources available, and time constraints

Sources: Stern et al (2012); Clemens & Demombynes (2013); White & Philips (2012).
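As an illustration of how ‘right rigour’ interacts with resources, here is a sketch of the standard sample-size formula for detecting a given effect with a two-group comparison of means. The 5% significance and 80% power levels, and all figures, are illustrative assumptions rather than workshop material.

```python
from math import ceil

def n_per_group(effect, sd, z_alpha=1.96, z_beta=0.84):
    """Sample size per group for a two-sided test at 5% significance, 80% power."""
    return ceil(2 * ((z_alpha + z_beta) * sd / effect) ** 2)

# Detecting a 5-point gain on an outcome whose standard deviation is 15
# demands roughly 142 respondents per group; halving the detectable
# effect roughly quadruples the required sample.
print(n_per_group(effect=5, sd=15))    # 142
print(n_per_group(effect=2.5, sd=15))  # 565
```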

Page 19

Impact Evaluation Design Options

Design options for assessing causation (including non-experimental designs):
• Counterfactual frameworks – depend on the difference between two otherwise identical cases (experimental and quasi-experimental designs)
• Regularity frameworks – depend on the frequency of association between cause and effect (statistical approaches)
• Comparative frameworks – depend on combinations of causes that lead to an effect (‘case-based’ approaches, simulations and network analysis)
• Generative frameworks – depend on identifying the causal links and mechanisms that explain effects (theory-based approaches)

Sources: Mayne (2013); Stern et al (2012).

Page 20

Two defining features of ‘Impact Evaluation’

Impact evaluation =
1. The need to demonstrate some causal effect (i.e. that the intervention caused an effect or impact to happen).
2. The need to provide explanatory analysis by answering ‘how’ and ‘why’ this effect came about – as well as for whom.

Impact evaluation is NOT the same as process evaluation, which considers a broader range of questions on programme relevance, the attainment of objectives, plus the quality and efficiency of implementation.

Page 21

DATA COLLECTION

Page 22

Choices in data collection

• Case study
• RCT / quasi-experiment
• Purposive or random sampling
• Economic modelling
• Document review
• Participatory / qualitative / FGD
• Direct measurement
• Individual / group key informant interviews
• Questionnaire survey

Page 23

QUANT vs. QUAL designs

Strengths of QUANT designs:
• Statistical control of selection bias
• Statistical confidence when generalising from a sample to a population (illustrated below)
• Quantification (numbers) for outcomes and impacts
• Ability to replicate data collection and analysis
• Accepted standards for sampling, data collection and analysis

Common weaknesses of QUANT designs:
• Difficulty capturing sensitive information, and hard-to-reach groups
• Data reduction loses information
• No analysis of context

Strengths of QUAL designs:
• Examines the broader context within which a programme operates
• Flexibility to evolve, and provides a more holistic view
• Multiple sources provide complex understanding and interpretation
• Ability to capture views on sensitive issues, and from the marginalised
• Accessible to non-specialists

Common weaknesses of QUAL designs:
• Lack of generalizability
• Hard to reach consensus with multiple views
• Subjective, and prone to bias

Source: Bamberger (2013).
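The QUANT strength of statistical confidence when generalising can be made concrete with a normal-approximation confidence interval for a proportion. A minimal sketch; the survey figures are hypothetical.

```python
from math import sqrt

def proportion_ci(successes, n, z=1.96):
    """Approximate 95% confidence interval for a population proportion."""
    p = successes / n
    se = sqrt(p * (1 - p) / n)  # standard error of the sample proportion
    return p - z * se, p + z * se

# Hypothetical: 240 of 400 surveyed households report using the new service.
low, high = proportion_ci(240, 400)
print(f"Adoption 60%, 95% CI {low:.1%} to {high:.1%}")  # ~55.2% to 64.8%
```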

Page 24

Design options

• Counterfactual – inference: difference between identical or matched cases. Methods: RCT, difference in differences (sketched below), propensity score matching.
• Generative (theory-based) – inference: identify and confirm causal processes. Methods: theory of change, contribution analysis, realist evaluation, congruence analysis.
• Comparative (case-based) – inference: comparison across and within cases of causal factors. Methods: grounded theory, ethnography, QCA.
• Participatory – inference: validation by participants of effects caused by interventions. Methods: participatory or democratic evaluation, learning by doing, most significant change.
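Of the counterfactual methods listed, difference in differences is the easiest to sketch: compare the before/after change in the treated group with the change in a comparison group, so that shared trends cancel out. A minimal illustration with hypothetical school test scores, not an example from the workshop.

```python
def diff_in_diff(t_before, t_after, c_before, c_after):
    """DiD estimate: (treated group's change) minus (comparison group's change)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(t_after) - mean(t_before)) - (mean(c_after) - mean(c_before))

# Hypothetical mean test scores in treated and comparison schools.
impact = diff_in_diff(t_before=[52, 48, 50], t_after=[61, 58, 60],
                      c_before=[51, 49, 50], c_after=[54, 52, 53])
print(impact)  # ~6.67: gain beyond the trend seen in comparison schools
```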

Page 25

Evening exercise

Read the handout texts dealing with arguments for and against the use of RCTs and quasi-experimental designs.

Prepare for a short debate on the case for and against the use of experimental approaches in evaluation. You will be asked to argue a specific point of view.

Page 26

Summary

Theories and approaches: provide the generic orientation that influences the choice of methods and the spirit in which methods and tools are selected and used.

Methods: a way of carrying out an evaluation, or a type of evaluation (using one or more tools), in line with an evaluation approach (research design).

Tools: standard techniques for collecting and analysing data.

Evaluation design needs to deal explicitly with questions of attribution and the nature of causal inference.

Page 27

END