
Scoring Goals or Changing the Game: What Impacts Should We Measure?

Jonathan Lomas, Canadian Health Services Research Foundation

Presentation to ESRC Symposium: ‘New Approaches to Assessing the Non-Academic Impact of Social Science’, London, May 12-13, 2005

My question is, are we having an impact?

Is Research Ready for Action?

Medline search (1966-2002) to identify articles stating:

a. “need more research” or “need less research”
   Need more: 161/162
   Need less: 1/162

b. “more questions than answers” or “more answers than questions”
   More questions: 163/166
   More answers: 3/166

David, AS. BMJ 2002; 323:1462-3

Assessing the Impact of What?

The Research Produced?

- A single research study published in a journal
- A summary of some research studies written in plain language and posted on a web site
- A systematic review with targeted dissemination of key messages to potential users
- A body of research knowledge developed and discussed face-to-face with potential users

Assessing the Impact of What? (cont)

The Research Production Process?

- All the activities of a research commissioning or granting agency (including training)
- All the activities of a research production facility (e.g. institute, department, university)
- The activities of a potential research user organization and its staff
- The entire ‘research regime’ in a country

Where I Work

CHSRF’s mission:

“To support evidence-based decision-making in the healthcare system”

Our ultimate desired impact is cultural change in the research and healthcare systems

Changing the game, not just scoring goals

The Discipline of Objectives & Logic Models

Yogi Berra on objectives: “If you don’t know where you’re going, you might not get there”

Lewis Carroll on logic models: “If you don’t know where you’re going, any road will take you there”

Decision Maker Diversity

Why Social Science isn’t IBM or General Electric

[Diagram: decision makers include policy makers, organized interests (e.g. drug companies, professional associations), service professionals, managers, and patients, ‘clients’ & the public; researchers include university-based, stakeholder-based and system-based researchers, alongside management consultants]

Evidence-Based Decision-Making

[Diagram: evidence-based decision-making, connecting decision makers and researchers; elements include problems and solutions, issues & priorities, priority-setting structures, priority topics, ideas, funding and training vehicles, research funders, research evidence, knowledge purveyors, synthesis and influence, receptor capacity, critical evaluation, and linkage and exchange; other influences include personal experience, anecdote, wants, interests, myths, assumptions, etc.]

CHSRF’s Objectives

1. To increase health system decision-makers’ appreciation of the value of research

2. To increase the production of research relevant to the needs of health system decision-makers

3. To increase the availability and acquisition of needed research by health system decision-makers

4. To increase the appraisal and application of needed research by health system decision-makers

Increased Appreciation of Value of Research

Programs and activities:
- overall ‘linkage & exchange’ approach
- consultations with users for priorities
- case study presentations of value

Measurable outcomes:
- # of decision-makers participating in research
- expenditures on research commissioning by system
- % of employees in system with research training

Increased Production of Relevant Research

Programs and activities:
- priority theme-based program funding
- commissioned syntheses on current issues
- applied training programs
- encourage university incentives for applied research

Measurable outcomes:
- amount of research in priority theme areas
- self-reported awareness/use of research syntheses
- # of graduates with applied research skills

Increased Availability/Acquisition of Research

Programs and activities:
- plain-language research summaries (1:3:25, Mythbusters)
- face-to-face exchanges on timely topics
- creation/support of knowledge networks
- creation/support for knowledge brokering

Measurable outcomes:
- self-reported awareness of disseminated research
- self-reported follow-up contact with researchers
- self-reported use of web-based and other resources for research evidence acquisition (audit computer bookmarks)

Increased Appraisal/Application of Research

Programs and activities:
- training users in research appraisal and application
- organisational best practices in research use

Measurable outcomes:
- self-reported application of research
- changes in organisational structures and processes to better accommodate research
- increased sense of ‘decision-certainty’ where synthesised research is available

The Attribution Challenge

What, if any, is/was the role of research versus all the other influences on behaviour?

Types of Research Evidence (adapted from Philip Davies, 2005)

[Diagram: types of research evidence - program or intervention effectiveness, implementation, organizational, economic/financial, ethics, forecast and attitudinal evidence - each drawing on methods such as experimental and quasi-experimental designs, counterfactuals, surveys, administrative data, comparative and qualitative studies, cost-benefit, cost-effectiveness and cost-utility analysis, econometrics, theories of change, public consultation, distributional data, and multivariate regression]


Combining Research and Colloquial Evidence for Information

[Diagram: research evidence combined with colloquial evidence - professional experience & expertise, political judgement, resources, values, habits & tradition, lobbyists & pressure groups, and pragmatics & contingencies]

A Final Zany Idea

The ‘impact file’ - a deductive approach to assessing impact

[Photo of impact file]

What’s In the Impact File?

- T F H M & Y D G W - testimonials
- Altered career trajectory (researcher or decision-maker)
- Changes in similar organization’s programs & processes - ‘lateral impact’
- Better communication of research and its implications - dissemination
- Awareness of research by decision-makers - acquisition
- Changed decisions based on research - application
- Changes in researchers’ or decision-makers’ structures and processes - ‘cultural change’

Maybe Settling for Second Best is OK!

Using an impact file is:
- Motley, ad hoc and not comprehensive or systematic
- Biased to individual not organizational responses
- Qualitative and potentially non-generalizable

But, as Churchill said of democracy: “Many forms of government have been tried … No one pretends that democracy is perfect … Indeed, it has been said that democracy is the worst form of government, except all those other forms that have been tried from time to time.” House of Commons, 1947

THANK YOU!

www.chsrf.ca or www.fcrss.ca