Using knowledge utilisation theory to demonstrate how commissioned evaluations can influence program design and funding decisions
Case studies from consultancy
Wendy Hodge, Principal Consultant
This paper
1. Knowledge utilisation
2. The case studies
A very brief potted history of knowledge utilisation
• “The results of research are worthless if they are not used” (Last, 1989)
Focus on evidence-based policy
• Assumes that using knowledge will lead to better policy and programs
Initiatives:
• Topic-specific centres of excellence with structural links to government
• Systematic reviews of evidence, e.g. the Cochrane Collaboration
• Clearing houses, e.g. evidence and practice guidelines and summaries
Operating assumptions
• Knowledge is transferred from individual to individual and through organisational structures
• All knowledge is taken up subjectively
• It is not just users and researchers who influence the use of knowledge
• Knowledge is refined and adapted by the user
• Use and generation of knowledge are interdependent and complicated to improve
• Not all knowledge is intended to be directly applicable to policy development or program design
Ways evidence is used
Instrumental or direct use
When findings/data are used in specific or direct ways, e.g. to directly influence program design or delivery, or to inform policy directions or professional practice
Conceptual use
Involves using research evidence for general enlightenment. Users are exposed to new information and ideas but may not use the information directly
Symbolic or strategic use
Using findings/data to legitimise policy directions or to justify actions taken for other reasons.
Predictors of use
“Data are no use if the report on them is too late. They are precious little good if the relevant audience does not comprehend them” (Cronbach, 1977)
Predictors of use (dissemination model)
• Decision-makers know about the research
• Interdependence of policy makers and evaluators, e.g. organisational links exist or there is joint planning
• Good personal relations between key players
• The right people – a credible source
• The right evidence at the right time
• The inherent quality of the evidence
• Whether the evidence conforms to commissioners’ beliefs and previous knowledge
• Whether data are interpreted in a way that suits the needs of the user
• Tells the story – clear, succinct formats that are understood and user friendly
“Context matters – values matter – politics matter.” (Brewer, 1983)

“The interplay between science and policy is commonly neither purely instrumental nor purely political” (Hertin et al., 2007)
Organisational and political predictors of use
• Structure, culture and politics of the user organisation, including assumptions about a program or policy’s worth and service models
• Rewards and incentives for dissemination activity in both the “user” and “researcher” context
• Value placed on evaluation or research evidence in the user context
• Other inputs to policy development or program design, e.g. lobbying and negotiations
• Boundaries of policy assessment analysis
ARTD cases considered
• Evaluation of a drink driver education program
• Evaluation of a carers program
• Evaluation of a drug education program
Case 1 – Evaluation of a drink driver education program
Predictor of use – The evaluation
Findings known to decision makers: High-level interagency senior officer committee + report tabled in parliament
Organisational links: Contract + interagency steering committee for the project, involved in planning discussion of findings
Good relations: None at start, but built over 2 years
Credible source: Us + academic advisor + guru
The right evidence – the right time: Quasi-experimental design + mixed methods; timed to meet the budget cycle
User-friendly report: Evidence synthesised and report structured around the evaluation questions
Assumptions of program worth: Program valued; design based on best evidence
Value placed on evaluation: Highly valued; direct client research background
Case 2 – Evaluation of a carers program
Predictor of use – The evaluation
Findings known to decision makers: Responsible officers commissioned the evaluation; able to drive changes to delivery models
Organisational links: Contract provided formal structure
Good relations: Fostered by regular informal reporting of progress and findings
Credible source: Sought evaluation specialists; previous knowledge of the area and experience in conducting large reviews
The right evidence – the right time: Extensive consultation with carers and service delivery organisations; findings delivered in time to inform renewal of 3-year contracts
User-friendly report: Executive summary identified deficiencies of the service model and suggested changes; report told the story of carers and what respite was needed
Assumptions of program worth: High, given the vulnerable nature of the carers; fits with national priorities
Value placed on evaluation: Moderate to high; previous bad experience
Case 3 – Evaluation of an action enquiry as professional development
Predictor of use – The evaluation
Findings known to decision makers: Responsible officers commissioned the evaluation; able to drive changes to program structure
Organisational links: Contract provided formal structure, plus the Premier’s Panel
Good relations: Fostered by regular informal reporting of progress and findings
Credible source: Long-standing clients
The right evidence – the right time: Qualitative methods; findings reported verbally initially to inform stage 2 planning
User-friendly report: Answered the evaluation questions; placed in the context of adult learning principles
Assumptions of program worth: New approach, still being tested
Value placed on evaluation: High; value independence
In summary
• Evaluators generate knowledge
• Our clients, policy officers and program designers are users and disseminators of knowledge in their own sphere
• As evaluators, we need to pay attention to the predictors of use under our control
• Policy officers transform evidence to meet their needs
• Policy officers could also actively pay attention to predictors of use within the agency context