A joint initiative of the OECD and the European Union, principally financed by the EU
Institutional development of the Office of the Minister of State for Administrative Reform (OMSAR)
Introduction to M&E of Public Policies: concepts and methods
Jaime Blasco, 7-8 December 2015, Beirut
On doubt and skepticism
1. Observation/experience
2. Intuition/common sense
3. Theory
4. Eminences
5. Consensus
… good will
… very useful, but not an ex-ante guarantee of success
Stating the obvious
1. Policy success consists of significantly improving social, economic or environmental conditions
2. Success should not be conceived in terms of activity, satisfaction, survival, budget, adherence to rules, professional recognition or good relations with other actors (though these may be crucial for success, they are not success)
3. Success hinges upon policy design, implementation and context
4. Success is a hypothesis to be tested, not an a priori assumption (even if you have resources, consensus, experts, time to reflect, etc.)
5. Without hypothesis testing (evaluation), it is difficult to improve policy design, reform and management (flying blind)
Program theory
1. Public intervention is motivated by the existence of a condition or problem
2. The program is expected to ameliorate the problem (from A to B, where B is better than A)
3. The program is expected to do so through a set of mechanisms (not wishful thinking)
Program theory: the logic model

Needs → Inputs → Activities → Outputs → Impacts (Outcomes)

(Needs sit in society; inputs through outcomes belong to the public intervention. Reading the chain backwards asks "what for?"; reading it forwards asks "how?".)

Example:
• The problem: high infant mortality in region X
• What is needed: mosquito nets; transportation
• What the program does: free mosquito net distribution
• The products: nets distributed
• Short-, medium- and long-run benefits: reduction in infant mortality

The chain can break at any link: implementation failure, flawed theory, or flawed diagnosis.
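The chain above can be written down explicitly. A minimal Python sketch (the dict layout and function are illustrative, not from the source; only the contents come from the mosquito-net example) that encodes the logic model and renders it stage by stage:

```python
# The mosquito-net example encoded as a logic model; the dict layout is
# illustrative, only the contents come from the slide.
logic_model = {
    "problem": "High infant mortality in region X",
    "needs": ["Mosquito nets", "Transportation"],
    "activities": ["Free mosquito net distribution"],
    "outputs": ["Nets distributed"],
    "outcomes": ["Reduction in infant mortality"],
}

def describe(model: dict) -> str:
    """Render the causal chain from the problem to the expected outcomes."""
    lines = [f"Problem: {model['problem']}"]
    for stage in ("needs", "activities", "outputs", "outcomes"):
        lines.append(f"{stage.capitalize()}: " + "; ".join(model[stage]))
    return "\n".join(lines)

print(describe(logic_model))
```

Writing the model down like this forces each link in the chain to be explicit, which is exactly the point of program theory.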
Program theory: eHealth (Misuraca et al. 2013)
• Inputs (cost): ICT capital expenditure; ICT operational costs; re-organisation and other intangible costs
• Type I outputs (supply, readiness): deployment of eHealth applications in primary and secondary care (eHR, telemonitoring, ePrescription, etc.); availability of eServices for citizens (information, consultation, booking, reimbursement)
• Type II outputs (use, intensity): use of eHealth applications by professionals; use of eHealth applications by citizens; cases/procedures/transactions handled online
• Outcomes (usefulness, impact, B>C): clinical outcomes; safety and quality of care; input/output efficiency; spillover on the ICT sector; health capital; healthcare sustainability
Program theory: eGovernment (Misuraca et al. 2013)
• Inputs: ICT capital expenditure; ICT operational costs; re-organisation and other intangible costs
• Outputs (supply): availability of eServices for citizens and business; eServices/applications for employees and for cross-government exchanges; open data available online for re-use
• Outputs (use): use of eServices; re-use of open data; cases/procedures/transactions handled online
• Outcomes: cost savings; effectiveness for constituencies; input/output efficiency; spillover on the ICT sector; GDP and productivity; good governance; trust in government and participation
Uses of program theory
1. Making program design and expected mechanisms explicit
2. Focusing on outcomes ("what for")
3. Identifying weaknesses → evaluation questions
4. Planning the evaluation (indicators, sources)
5. Non-evaluative uses: program design, advocacy
Evaluation questions
Each stage of the logic model (needs, inputs, activities, outputs, outcomes) prompts its own type of evaluation: needs assessment, design evaluation, process evaluation, economic evaluation and impact evaluation.
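The stage-to-evaluation correspondence can be expressed as a lookup table. The exact pairing below (e.g. inputs with design evaluation, outputs with economic evaluation) is an assumption read off the slide's layout, not something the source states explicitly:

```python
# Illustrative mapping from logic-model stage to evaluation type.
# The pairing of middle stages is an assumption based on the slide's layout.
EVALUATION_TYPES = {
    "needs": "needs assessment",
    "inputs": "design evaluation",
    "activities": "process evaluation",
    "outputs": "economic evaluation",
    "outcomes": "impact evaluation",
}

def evaluation_for(stage: str) -> str:
    """Return the evaluation type that interrogates a given stage."""
    return EVALUATION_TYPES[stage.lower()]

print(evaluation_for("outcomes"))  # impact evaluation
```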
Evaluation questions: needs assessment
1. How many unemployed workers lack basic ICT skills?
2. What are the existing levels of computing and communications infrastructure at the national colleges?
3. What specific project management capabilities do managers lack and need most?
Evaluation questions: process evaluation
1. Do actual users correspond to the intended target population?
2. What are municipalities doing with the subsidies?
3. Is the program covering all the regions?
4. What changes have been introduced to the plan during implementation? (What is the actual policy design?)
Evaluation questions: implementation gaps
• Displacement (micropolitics)
• Incompleteness
• Adjustment
"Designing by doing"
Evaluation questions: impact evaluation
1. Does participation in direct employment schemes increase the probability of finding and retaining a job?
2. Do tax deduction schemes for older workers increase savings for retirement?
3. Does parole reduce recidivism?
Evaluation questions: economic evaluation
1. Do savings from reduced administrative burdens outweigh the costs of a new e-service?
2. Is care for the dependent elderly at home more cost-effective than institutionalisation?
Evaluation questions: design evaluation
1. Do telecommuting schemes for public officers increase productivity?
2. What features of a telecommuting scheme increase the chances of raising public officers' productivity?
Evaluation questions: design evaluation resources
http://ies.ed.gov/ncee/wwc/
https://educationendowmentfoundation.org.uk/toolkit/
http://webarchive.nationalarchives.gov.uk/20120919132719/http://communities.gov.uk/documents/localgovernment/pdf/142922.pdf
The elusive search for impacts
[Figure: distrust in bureaucracy, before (21.40%) and after (17.10%) the intervention]
[Figure: citizen satisfaction with the standard procedure vs. the e-procedure (64.1% and 53.9%)]
[Figure: the same before/after chart of distrust in bureaucracy (21.40% before, 17.10% after), now with the counterfactual (20.40%) marked; the impact is the gap between the observed 17.10% and the counterfactual 20.40%]
[Figure: citizen satisfaction with the standard procedure vs. the e-procedure (64.1% and 53.9%), with the counterfactual (74.0%) marked]
The elusive search for impacts
Policy impact may be expressed as the subtraction:
Impact = Y1 − Y0
where:
• Y1 is the factual: the outcome attained after the public policy
• Y0 is the counterfactual: the outcome that would have been attained in the absence of the intervention
Estimating impacts implies causal inference.
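In code, the subtraction itself is trivial; all the difficulty lies in Y0, which is never observed and must be estimated. A sketch using the figures from the distrust-in-bureaucracy chart, contrasting the naive before/after change with the counterfactual-based impact:

```python
# Figures from the distrust-in-bureaucracy chart; the counterfactual (20.4)
# is what the slides assume would have happened without the policy.
y_before = 21.4  # outcome before the policy (%)
y1 = 17.1        # factual: observed outcome after the policy (%)
y0 = 20.4        # counterfactual: estimated outcome absent the policy (%)

naive_change = y1 - y_before  # "what has happened"
impact = y1 - y0              # Impact = Y1 - Y0

print(f"Observed change:  {naive_change:+.1f} pp")
print(f"Estimated impact: {impact:+.1f} pp")
```

The observed change (-4.3 points) and the estimated impact (-3.3 points) differ because part of the improvement would have happened anyway.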
The counterfactual hypothesis
[Figure: a group exposed to the program yields the factual Y1; a comparable group (≈) without the program yields Y0, the estimate of the counterfactual]
[Figure: youth labor market participation by quarter (Q1 2011 to Q1 2012), with the introduction of a regulation marked; the series shows only "what has happened"]
The elusive search for impacts: conclusions
1. "What has happened", by itself, is useful to know, but tells us nothing about the program's impact.
2. "Nothing" is "nothing": the very same evolution of a problem ("what has happened") may mean that the impact has been positive, zero, or negative.
3. "Impact" stems from the comparison of "what has happened" with "what would have happened".
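Conclusion 2 can be made concrete with numbers: the same observed outcome yields a different impact depending on the counterfactual. Only the first counterfactual value below appears in the slides; the other two are made up for illustration:

```python
# Same observed outcome, three possible counterfactuals, three different verdicts.
y1 = 17.1  # factual: distrust in bureaucracy after the program (from the chart)

# Counterfactual scenarios; only 20.4 comes from the slides,
# the other two values are illustrative.
counterfactuals = {
    "distrust would have risen": 20.4,
    "distrust would have stayed flat": 17.1,
    "distrust would have fallen anyway": 15.0,
}

for scenario, y0 in counterfactuals.items():
    impact = y1 - y0  # Impact = Y1 - Y0
    print(f"If {scenario}: impact = {impact:+.1f} pp")
```

The observed series never changes; only the assumed counterfactual does, and with it the sign of the impact.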
[Figure: the same youth labor market participation series annotated with the main characters: the program, the factual, the counterfactual, the outcome, the impact, the decision and the measurement]
The elusive search for impacts: consequences
1. There is no such thing as an "impact indicator". Estimating impacts entails analysis.
2. Most often, robust analysis requires planning the evaluation before the program starts, in order to collect the data you need.
3. Sometimes it requires modifying certain program procedures (selection procedures, not necessarily criteria).
4. Some impact questions do not have a robust answer, since there is no feasible identification strategy.
The elusive search for impacts: evaluation perspectives
• Retrospective (ex-post): conducted in the present, informs about the past. The most common; data restrictions; mistakes already made.
• Ex-ante: conducted in the present, informs about the future. A pre-decision forecast; policy design benefits from the evaluation rationale; uncertainties about the future undermine robustness.
• Prospective retrospective evaluations: planned in the present, for a future retrospective evaluation. Sets the stage for a future robust evaluation; policy design benefits from the evaluation rationale.
Methods: impact estimation designs

Is it possible to use a comparison group?
• No: before/after; time series
• Yes: is random assignment possible?
  - Yes: social experiment
  - No: regression discontinuity; matching; difference-in-differences; instrumental variables
Garicano & Heaton, 2013:
"The data do not suggest that agencies that substantially increased IT over the sample period had superior improvements in clearance rates compared with those that made little IT adjustment."
Garicano & Heaton, 2013 (MPI = 1 if high level of IT + high specialization + highly skilled workers):
"An agency implementing a combination of IT, specialization and skilled workers experiences a roughly 5% lower offending rate than does an agency with similar levels of IT that has not implemented these other practices. [This] is consistent with the presence of complementarities."
COMPSTAT NYC
[Figure: murder rate in New York City (per 100,000 people) over time]
COMPSTAT NYC: real-time mapping of crime (IT). But also…
• A statement of the measurable goals of the NYPD
• Internal accountability through weekly COMPSTAT meetings (officers accountable for understanding and reacting, not for results)
• Decentralization of command (authority and resources to precinct commanders)
• Empowerment of middle managers
• Data-driven problem identification and strategy design
• Innovative problem-solving tactics
Weisburd et al., 2003
Methods: micro or macro evaluations? Accuracy vs. scope
• Each concrete "program" has its own goals and peculiarities
• Aggregate measurements and counterfactuals are difficult to find
• Impact evaluations are more feasible at the micro level
• But the micro level has a very limited scope
• Start small, replicate gradually, and try to scale up (with a better grasp of theory and measurement issues)
• Estimate impacts at the micro level, and monitor at the macro level
From counterfactualism to pragmatism
(Each row lists counterfactualism/experimentalism first, pragmatism second.)
• Epistemology: successionist causation / knowledge validity depends on its pragmatic acceptability
• Method: experimental and quasi-experimental designs / enlightening policy using information structured with best-of-breed techniques
• Ontology: social complexity coped with by controlling covariates / the social world as power-play and chains of influence that research tries to penetrate to promote change
• Objects: programs as instruments to solve social problems / programs as institutional realities (such as services, targets, clients)
• Progress: replication to accumulate evidence on what works / incremental improvements within the political feasibility of policy development
• Utility: informing policy making for rational choices amongst alternatives on the basis of robust evidence / evaluation geared to advancing the practical goals of policy making
Pawson & Tilley, 1997; Misuraca et al. 2013
Pragmatism
1. Outcome measurement and monitoring (factual description)
2. Implementation analysis
3. Process tracing of impact structures
4. Qualitative interpretation of facts
5. Case studies
6. Precedent evaluations in the literature
7. Triangulation of imperfect and incomplete evidence
Adapt!
The scale does not make you thinner.
• Evaluation is instrumental: success is not a robust report, but an improvement in policy effectiveness, efficiency, legitimacy, equity…
• The step from knowing to learning, and from learning to action, is anything but automatic.
Ask questions, get answers, adapt.
Adapt!
• Involve key stakeholders and decision-makers in the evaluation process
• Make knowledge trickle down to the whole organization
• Keep the knowledge accrued recorded and available, and enhance institutional memory
• Make knowledge penetrate the policy process
Closing remarks
• Think of programs as hypotheses, and test them (evaluate them)
• Think of wicked problems as challenges: innovate, test, (often fail), learn and adapt
• Escape from the illusion of completeness. Plan your analysis and prioritize programs, questions and outcomes.
• Do not get frustrated with incomplete and imperfect knowledge. Not being able to know everything does not mean that we should not know anything.
• Escape from the illusion of an evaluation algorithm, model or template. Evaluation requires ad hoc approaches.
• Strive for the best knowledge you can produce given your constraints: a car is better than a bike, but a bike is better than a crashed car (whatever you do, do it properly).
• Think ex ante about the ex post evaluation: small changes in datasets and program procedures from the beginning allow for a robust, fast and cheap evaluation in the future.
• Impact evaluation is a powerful tool, but it has strict requirements: well-specified programs, tangible and measurable goals, a feasible identification strategy and good-quality data.
• Do not think only of impact evaluation: there are other forms of evaluation and knowledge production that are useful for decision-making.
• Ask not what you can do for your evaluation, but what evaluation can do for you: look out there for robust, relevant and summarized knowledge about what works, but pay attention to external validity. Context matters.
• Use pilot programs to generate robust knowledge.
• Look for windows of opportunity to generate robust knowledge.
• Link evaluation with management, planning and decision-making. Knowledge is not an end in itself.
Closing remarks: an evaluation roadmap
• Integrate and improve administrative datasets for evaluation use
• Perform ex-ante evaluations in the design and reform of major policies
• Standardise a common monitoring and reporting system for all programs
• Perform impact evaluations for the major programs at least once every four years
• Run pilot programs with experimental designs
• Plan and prioritise evaluations
• Empower the unit for knowledge production coordination and management
• Link knowledge to decision-making processes