J-PAL Executive Education Course Outcomes, Indicators and Measuring Impact 19 January 2015 Isaac M. Mbiti University of Virginia

Page 1:

J-PAL Executive Education Course

Outcomes, Indicators and Measuring Impact

19 January 2015

Isaac M. Mbiti, University of Virginia

Page 2:

Course Overview

1. What is Evaluation?

2. Outcomes, Impact, and Indicators

3. Why Randomise?

4. How to Randomise?

5. Sampling and Sample Size

6. Threats and Analysis

7. Project from Start to Finish

8. Scaling up and Cost effectiveness

Page 3:

Lecture Overview

• Setting & some background information on interventions

• Goals of Measurement
– Practical issues
– Design decisions
– Human subjects

Page 4:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 5:

Relevance

• How do we ensure we measure the relevant outcomes?

Page 6:

Relevance

• We want to answer more than:
– How effective is the intervention?

• We also want to answer:
– Why is it effective?

• We want to draw the link:

Intervention → Intermediate Outcomes → Primary Outcomes

• Defining and measuring intermediate outcomes will enrich our understanding of the program, reinforce our conclusions, and make it easier to draw general lessons

Page 7:

Relevance: The Critical Role of Theory

1. Map out a theory of change
2. Use the theory to generate hypotheses that you can test in your project
3. What (theoretical) final outcomes and indicators are needed to demonstrate the validity of a hypothesis?
4. What (theoretical) intermediary indicators are needed to distinguish between various hypotheses?

Page 8:

Relevance: Defining Key Hypotheses

• What might be examples of a few key hypotheses to test in the following?
– Malaria bed-nets
– Conditional cash transfers
– Micro-credit

• Which variables, or combinations of variables, might you use to test these key hypotheses?

Page 9:

Relevance: An Example Using Vocational Training

• What is the possible chain of outcomes for the vocational training program?
– Information?
– Vouchers?

Page 10:

Relevance: Theory of Change for the Voucher Treatment

• Youth would like to enroll in vocational training, but they are poor and credit constrained → enrollment is low

• Providing vouchers of any kind (e.g. scholarships) will allow youth to overcome credit constraints and enroll → large increases in enrollment

Page 11:

Relevance: Theory of Change for information treatment

• Youth do not know the returns to vocational training (maybe voc-ed is stigmatized)

• Youth also don’t know which trades are lucrative

• Providing information could boost demand for vocational training and change course selection.

Page 12:

Relevance: Hypotheses from the Vocational Training Intervention

• What variables should we try to obtain to improve our understanding of the program?
– Intermediary?
– Final?

Page 13:

Relevance to Reality

• Our theory and hypotheses help us define the set of outcomes and variables

• Often these are "basic" indicators which are relatively straightforward (e.g. height and weight)

• In many cases it is difficult to translate theoretical measures into practical/real-life measures
– Cognitive ability, non-cognitive ability
– Risk aversion, impatience, selfishness
– Happiness
– Income, savings
– Discrimination

Page 14:

Relevance to Reality

• Good news: often no need to reinvent the wheel.

• Bad news: often not clear whether such measures are implementable in your setting and context. They need to be properly adapted to the context.

Page 15:

Hints on Outcomes and Indicators

• Choose those with a reasonable chance of being "moved" within the evaluation timeline

• Choose those that are not too difficult to collect and measure

• Choose those that occur with enough frequency to detect an impact given your sample size

• Ensure your sample size is large enough to detect changes in the chosen measures (i.e. make sure you have statistical power)
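The statistical power point above can be sketched with the standard normal-approximation sample-size formula for a two-arm comparison of means. This is a minimal illustration, not from the slides: the 0.2 standard deviation minimum detectable effect, 5% significance level, and 80% power are conventional but assumed values.

```python
# Sketch: minimum sample size per arm for a two-sample difference in means,
# using the normal approximation. All parameter values are illustrative.
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(mde_sd: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Minimum n per arm to detect an effect of `mde_sd` standard deviations."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = z.inv_cdf(power)            # quantile for the desired power
    n = 2 * ((z_alpha + z_beta) / mde_sd) ** 2
    return ceil(n)

if __name__ == "__main__":
    print(sample_size_per_arm(0.2))  # a 0.2 SD effect needs 393 respondents per arm
```

Note how sharply the requirement grows as the detectable effect shrinks: halving the effect size quadruples the required sample.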

Page 16:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 17:

Sources of Measures

• First task is to define your study sample
– Who should be in your sample?
– How representative is the sample? (representative of whom?)

• Sampling issues depend on the research project

• Our sample was drawn from youth who were in the KLPS panel data
– Pros?
– Cons?

Page 18:

Sources of Measures

• Where do you obtain such data?

• Two main sources:
– Administrative data
– Survey data

Page 19:

Sources of Measures

• Administrative data:
– Advantages?
• Often cheap, fast, easy, and "clean" (although you would be surprised how often there are errors)
– Disadvantages?
• Data may not be relevant
– Variables of interest may not be in the data
– The data may not correspond to the relevant time period
• May be difficult to get permission
• May not come in the format you want
– May be aggregated

Page 20:

Sources of Measures

• Survey data (collected by you)
– Advantages:
• Control: you can ensure it is relevant, comprehensive, and collected at the right time
– Disadvantages:
• Heavy time investment required: management, designing, piloting, and refining
• Expensive

Page 21:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 22:

Reliability and Accuracy

• Regardless of data sources, we need to make sure that the data are reliable and accurate
– Recall the cliché: junk in, junk out
– Unreliable data → false or misleading conclusions

Page 23:

Reliability

• Administrative data:
– General perception is that this source should be reliable, but it is often full of errors
• Important to understand HOW the data is collected and the CONTEXT
• "Automated" data collection is generally high quality (e.g. M-Pesa transfer records, phone records)
• Important to think or ask whether the data could be manipulated for someone's gain
– E.g. school enrollment records when schools get a per capita student grant

Page 24:

Reliability

• Ensuring reliability in survey data collection is very important and time consuming

• Probably the most important part is piloting

Page 25:

Piloting

• Good surveys are developed by trial and error

• Often it's good to start with a very basic set of questions asked in an open-ended way
– More of a qualitative or focus group style

• Over time, the lessons from this can be refined into a survey

• Good to find out that respondents don't understand certain issues before it's too late

Page 26:

Piloting

• Good piloting will almost always raise questions that weren't thought of before

• Sometimes the research design can be changed slightly to capture these issues

Page 27:

Piloting

• As the survey becomes more formalized, working through the details becomes important
– Phrasing of questions
– Skip codes ("If no, skip to…")
– Translation issues

Page 28:

Reliability: Paper, Phone, PDA, Tablet or Netbook

• With prices of electronics falling, more surveys are collected via PDAs, tablets, etc.

• Regardless of the data collection method, it is necessary to implement relevant protocols and procedures to ensure data quality

Page 29:

Reliability: Paper Survey

The paper survey process:

Survey printed on paper → filled in by enumerator → data entry → electronic dataset

Where can this go wrong?

Page 30:

Reliability: Paper Survey

The paper survey process:

Survey printed on paper → filled in by enumerator → data entry → electronic dataset

Where can this go wrong? Everywhere!

Page 31:

Reliability: Paper Surveys

• Include the ID number on all pages in case pages get separated

• The survey should look nice: a clean layout is easier for the enumerator to fill out

• Minimize complicated skip patterns:
– i.e. don't have instructions like "if the answer to the question 5 pages ago was 'no', skip this question."

• Minimize data entry errors: use double entry and reconcile differences
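The double-entry step above can be sketched in a few lines: two clerks key in the same paper forms, and any cell where the passes disagree is flagged for checking against the original. The record structure, field names, and IDs here are hypothetical.

```python
# Sketch of double data entry reconciliation: flag (survey_id, field) pairs
# where two independent entry passes disagree. Field names are hypothetical.
def reconcile(entry_a, entry_b):
    """Compare two data-entry passes; return the (survey_id, field) pairs that disagree."""
    discrepancies = []
    for survey_id, record_a in entry_a.items():
        record_b = entry_b.get(survey_id, {})
        for field, value_a in record_a.items():
            if record_b.get(field) != value_a:
                discrepancies.append((survey_id, field))
    return discrepancies

if __name__ == "__main__":
    pass_1 = {"hh_101": {"age": 34, "owns_tv": 1}}
    pass_2 = {"hh_101": {"age": 43, "owns_tv": 1}}  # "34" mis-keyed as "43"
    print(reconcile(pass_1, pass_2))  # [('hh_101', 'age')]
```

Each flagged pair is then resolved by a supervisor looking at the paper form, rather than by trusting either entry pass.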

Page 32:

Reliability: Electronic Survey

The electronic survey process:

Paper survey created → programmed into netbook → filled in by enumerator → electronic dataset

Where can this go wrong?

Page 33:

Reliability: Electronic Survey

The electronic survey process:

Paper survey created → programmed into netbook → filled in by enumerator → electronic dataset

Where can this go wrong? Everywhere!

Page 34:

Reliability: Electronic Surveys

• Follow the paper design tips
• Thoroughly test the electronic version before launch (esp. skip patterns, etc.)
• Make sure it is easy to fill out (stylus vs. typing)
• Program logical consistency checks
– A male should not be a mother, daughter, etc.
– Test these checks
• Establish a protocol to transfer data from the PDA/laptop to a central database efficiently and routinely
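The programmed consistency checks above can be sketched as simple rules run against each record as it is entered. The rule set and field names below are hypothetical examples in the spirit of "a male should not be a mother, daughter, etc.".

```python
# Sketch of logical consistency checks for an electronic survey.
# Rules and field names are illustrative, not from the slides.
def consistency_errors(record):
    """Return a list of rule violations for one survey record."""
    errors = []
    # A male respondent cannot be the mother or daughter of the household head
    if record.get("sex") == "male" and record.get("relation_to_head") in ("mother", "daughter"):
        errors.append("male respondent recorded as mother/daughter")
    # Ages outside a plausible human range are flagged for re-entry
    if not 0 <= record.get("age", 0) <= 110:
        errors.append("implausible age")
    return errors

if __name__ == "__main__":
    bad = {"sex": "male", "relation_to_head": "mother", "age": 40}
    print(consistency_errors(bad))  # one violation flagged
```

As the slide says, the checks themselves need testing: a check that never fires (or fires on valid records) quietly corrupts the data it was meant to protect.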

Page 35:

Reliability: Electronic vs Paper

• A trade-off that should be evaluated case by case

• Electronic surveys:
– Huge upfront investment (time, etc.)
• Makes more sense for surveys with many respondents
– "Less flexible" (harder to change things on the go)
– Faster (no need to do data entry)
– Less error (if programmed well)
– Risky for enumerators to carry?

Page 36:

Reliability: Survey Design

• Though it's hard to come up with a firm set of rules for designing a survey, there are a few important points to keep in mind

• Some apply to paper-based surveys only

Page 37:

Reliability: General Tips

• Collect accurate tracking information
– Directions to home, GPS location, etc.
– It might be a different person doing the follow-up
– Get the correct mobile phone number, including those of parents, siblings, and friends

• Surveys should be clear and should not leave room for interpretation by the enumerator

Page 38:

Reliability: Surveys

• Training enumerators in these procedures is essential. Create manuals for all survey instruments.

• Good to do regular "back checking": re-surveying respondents to make sure that they were actually interviewed.

• Re-survey a sample of respondents on a random basis.
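Drawing the random back-check sample described above can be sketched as follows. The 10% back-check rate and the respondent ID format are illustrative assumptions, not from the slides.

```python
# Sketch: randomly select a share of completed interviews for re-survey
# ("back checking"). Rate and ID format are hypothetical.
import random

def back_check_sample(survey_ids, share=0.10, seed=0):
    """Randomly select `share` of respondents for re-interview."""
    rng = random.Random(seed)  # fixed seed makes the selection reproducible/auditable
    k = max(1, round(share * len(survey_ids)))
    return sorted(rng.sample(survey_ids, k))

if __name__ == "__main__":
    ids = [f"resp_{i:03d}" for i in range(1, 201)]
    print(back_check_sample(ids))  # 20 respondents flagged for back checking
```

Fixing the seed matters in practice: the field team can be handed the exact list, and anyone auditing the survey can regenerate it.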

Page 39:

Reliability: General Tips

• Enumerators have to understand why these questions are being asked

• This is especially true for questions that aren't as simple

• For example, we might be asking about a respondent's assets and she'll say she doesn't have a TV even though the enumerator can see it right there

Page 40:

Reliability: General Tips

• It is often easy for a respondent to say "I didn't do something because I don't have enough money" even when there is actually a deeper reason as well

• It is important for the enumerator to understand what we're trying to get at with all these questions

Page 41:

Reliability: The Right Respondent

• It is important to ensure that you obtain the information from the "right respondent"
– Sometimes this is simple: in the voc-ed example, the individual youth would be the right person
– Suppose you are collecting household data or farm data
• Women in the household may know more about children
• Men may know more about household assets

Page 42:

Survey Design: Social Desirability Bias

• Similarly, people might feel reluctant to tell the truth about some socially undesirable subjects
– How many drinks did you have last week?
– Do you always use a condom when having sex?

• We call this "social desirability bias"
– Almost 99% of the voc-ed sample wants to start their own business

• Framing effects?

Page 43:

Reliability: Manipulation of Data

• Often occurs when there is something valuable at stake
– CCT eligibility on the basis of a poverty score
– CCT payouts on the basis of attendance
– Teacher incentives on the basis of attendance measured by the head teacher
– Voc-ed voucher winners "selling" their vouchers

• Can lead to false conclusions

Page 44:

Reliability: Manipulation

• Various strategies can be used to circumvent this problem
– CCTs, enrollment, and teacher attendance: unannounced spot checks can be used
– Voc-ed: took pictures of students and also did spot checks

• Lesson: measures should be designed to be free of manipulation

• In some cases we can use the cross-checks to measure the extent of manipulation
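Using cross-checks to measure the extent of manipulation can be sketched like this: compare attendance reported in the register against unannounced spot checks for the same days. The attendance data below are hypothetical.

```python
# Sketch: estimate how often a self-reported register is inflated relative
# to unannounced spot checks. Data are illustrative.
def manipulation_rate(reported, spot_checked):
    """Share of spot-checked days marked 'present' (1) in the register but found 'absent' (0)."""
    inflated = sum(1 for r, s in zip(reported, spot_checked) if r == 1 and s == 0)
    return inflated / len(spot_checked)

if __name__ == "__main__":
    register = [1, 1, 1, 1, 1]  # head-teacher register: always present
    checks = [1, 0, 1, 0, 1]    # spot checks: absent on two of five visits
    print(manipulation_rate(register, checks))  # 0.4
```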

Page 45:

Reliability: Recall Issues

• Some variables are easier to remember than others, e.g. the birth of a child

• Often have to collect data that involves recall where things aren't so easy:
– Time use, consumption, health, finances, exam scores

• How to ensure reliability?

Page 46:

Reliability: Recall Issues

• Easiest: shorten the recall period
– One strategy is to do high-frequency short surveys where you ask respondents to recall events from the prior day

• Diaries: e.g. time-use or financial diaries
– Can be difficult to implement: respondents don't always fill them out (or fill them out only when you ask for them)

• Administrative data: e.g. for test scores

• One survey on nutrition had the enumerator live with the family and weigh each individual's meals prior to eating, as well as the amount wasted

Page 47:

Reliability: Sensors, Biomarkers

• Proliferation of new technologies can assist in the collection of high-quality data

• New technologies can rapidly test for various health issues
– Hemoglobin tests (we used these in voc-ed)

• Rain sensors
• Remote sensing from satellites

Page 48:

Measuring Hard/Abstract Stuff?

• How do you measure discrimination?
– Implicit association tests?
• Is this real discrimination?
– By looking at how people perceive names? E.g. in a resume study?
• Is there something else they are inferring from the name?

Page 49:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 50:

Innocuous

• The act of measurement can actually influence the behavior of respondents
– Live-in enumerator measuring each person's food
– Financial diaries

• If the treatment and control groups respond differently to measurement, then it would be problematic

Page 51:

Innocuous

• Recall the random spot check method for school attendance in CCTs

• Baird et al.: compare UCTs and CCTs
– Outcome of interest is school attendance
– Suppose they conducted random spot checks: could this affect the integrity of their research?

Page 52:

Innocuous

• Recall the random spot check method for school attendance in CCTs

• Baird et al.: compare UCTs and CCTs
– Outcome of interest is school attendance
– Suppose they conducted random spot checks: could this affect the integrity of their research?
– YES: if the spot checks send a signal to the UCT group that they should go to school

Page 53:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 54:

Feasibility

• Budget adequately
• Things go wrong
– Exchange rate movements
– Enumerator downtime
– Re-surveys needed

• New opportunities

Page 55:

Feasibility

• Financial resources: trade-off between sample size and the amount of information obtained from each household

• Human resource capacity of the organization implementing the survey: research coordinators, interviewers, data entry staff

• Willingness and ability of respondents to provide the desired information

Page 56:

Feasibility

• As with any job, managing surveyors creates its own problems

• Sometimes it's possible to contract out survey work to a firm which specializes in it
– Less hassle, but less control over data quality
– Important to find a really competent firm

Page 57:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 58:

Timely

• When should you collect outcomes and variables?

• Related to theory and hypotheses

• Crucial question is “how long do you need for intervention effects to materialize?”

Page 59:

Research Design: Baseline

• Although you can sometimes avoid collecting baseline data, it is very risky to skip this step

• The sample might not be balanced, especially if the sample isn't very big

• Having a baseline allows the researcher to control for baseline characteristics
– E.g. initial test scores are very important to have in education

• A baseline allows you to look at effects for subgroups

• Also contributes to reliability
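The balance concern above is usually checked by comparing mean baseline characteristics across arms. A minimal sketch, using a two-sample t statistic with pooled variance; the baseline test scores below are hypothetical.

```python
# Sketch of a baseline balance check: two-sample t statistic (pooled variance)
# for one baseline covariate. Data are illustrative.
from statistics import mean, variance

def balance_t_stat(treat, control):
    """Two-sample t statistic for a baseline covariate, equal-variance version."""
    n_t, n_c = len(treat), len(control)
    pooled = ((n_t - 1) * variance(treat) + (n_c - 1) * variance(control)) / (n_t + n_c - 2)
    se = (pooled * (1 / n_t + 1 / n_c)) ** 0.5
    return (mean(treat) - mean(control)) / se

if __name__ == "__main__":
    base_scores_t = [52, 48, 55, 50, 47, 53]
    base_scores_c = [51, 49, 54, 50, 48, 52]
    # |t| well below ~2 suggests the arms are balanced on this covariate
    print(round(balance_t_stat(base_scores_t, base_scores_c), 2))
```

In practice this is run for every baseline covariate (a "balance table"); with small samples, some imbalance by chance is expected, which is exactly why controlling for baseline values helps.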

Page 60:

Research Design: Follow-up

• The amount of data that needs to be collected at endline really depends on the type of experiment that is being run

• Sometimes take-up (not adoption or some other similar outcome) is itself the outcome of interest

• Voc-ed: measure enrollment each semester, completion, test performance, and then labor market outcomes

Page 61:

Research Design: Follow-up

• Other times, getting follow-up data is crucial

• Theory suggests that voc-ed impacts may take time to materialize
– This required multiple follow-ups over several years and also tracking of individuals

• Need to plan appropriately for follow-up, as timing is key
– Will it take some time for the intervention to take off?
– Will people migrate?

Page 62:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 63:

Human Subjects

• An oft-neglected part of research is human subjects approvals

• Respondents do not have to answer any questions, and their rights have to be respected above all else

• It is important to remember that human subjects research is meant to be designed such that respondents are volunteering their time to the survey
– They can't be induced to answer questions they don't want to with punishments (or huge rewards)

• Treating subjects with dignity / complying with all review panels MUST come before the research

Page 64:

Human Subjects

• Check what approvals are needed
– Country IRBs
– Research permits
– University IRBs

• Permissions
– National government
– Local authorities
– Relevant ministry

• Remember: this takes time!

Page 65:

Consent

• Respondents must be read or given a consent form which explicitly tells them what will be asked of them, how much time it will take, what the risks are, and what the rewards are

• Include contact information so respondents can complain if something goes wrong

• Usually written consent is needed; sometimes oral consent is allowed

• This can be intimidating but is necessary, and it is a good discipline device for the researcher too

Page 66:

Keys to Successful Measurement

• Relevance
• Appropriate source
• Reliability and Accuracy
• Innocuous
• Feasible
• Timely
• Ethical

Page 67:

Summary

• Randomised evaluations allow you to establish a causal relationship between your outcome and your intervention because of the statistical equivalence of the control and treatment groups

• In order to identify important causal relationships, you need to measure the right things

• The theory of change helps you identify what you need to measure

• Correct measurement methodology is essential for establishing a causal relationship

• If your measurement is incorrect, you jeopardize identifying the correct causal relationship
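The "statistical equivalence" point in the summary can be sketched as a simple difference in means: with randomisation, the control-group mean stands in for the treatment group's counterfactual, so the gap between the two estimates the impact. The enrollment outcomes below are hypothetical.

```python
# Sketch: difference-in-means estimate of the average treatment effect.
# Outcome values are illustrative (1 = enrolled, 0 = not enrolled).
from statistics import mean

def impact_estimate(treatment_outcomes, control_outcomes):
    """Difference-in-means estimate of the average treatment effect."""
    return mean(treatment_outcomes) - mean(control_outcomes)

if __name__ == "__main__":
    enrolled_treat = [1, 1, 0, 1, 1, 0, 1, 1]  # e.g. voucher group enrollment
    enrolled_ctrl = [0, 1, 0, 0, 1, 0, 0, 1]   # comparison group enrollment
    print(impact_estimate(enrolled_treat, enrolled_ctrl))  # 0.375
```

This estimate is only meaningful because randomisation makes the groups comparable, and only credible if the outcome itself was measured with the care the rest of this lecture describes.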