SYSTEMATIC REVIEW Open Access

The effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare: a systematic review

Mitchell N. Sarkies1*, Kelly-Ann Bowles2, Elizabeth H. Skinner1, Romi Haas1, Haylee Lane1 and Terry P. Haines1

Abstract

Background: It is widely acknowledged that health policy and management decisions rarely reflect research evidence. Therefore, it is important to determine how to improve evidence-informed decision-making. The primary aim of this systematic review was to evaluate the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. The secondary aim of the review was to describe factors perceived to be associated with effective strategies and the inter-relationship between these factors.

Methods: An electronic search was developed to identify studies published between January 01, 2000, and February 02, 2016. This was supplemented by checking the reference list of included articles, systematic reviews, and hand-searching publication lists from prominent authors. Two reviewers independently screened studies for inclusion, assessed methodological quality, and extracted data.

Results: After duplicate removal, the search strategy identified 3830 titles. Following title and abstract screening, 96 full-text articles were reviewed, of which 19 studies (21 articles) met all inclusion criteria. Three studies were included in the narrative synthesis, finding that policy briefs including expert opinion might affect intended actions, and that intentions persist to actions for public health policy in developing nations. Workshops, ongoing technical assistance, and distribution of instructional digital materials may improve knowledge and skills around evidence-informed decision-making in US public health departments. Tailored, targeted messages were more effective in increasing public health policies and programs in Canadian public health departments, compared to access to an online registry of research evidence and a knowledge broker. Sixteen studies (18 articles) were included in the thematic synthesis, leading to a conceptualisation of inter-relating factors perceived to be associated with effective research implementation strategies. A unidirectional, hierarchal flow was described from (1) establishing an imperative for practice change, (2) building trust between implementation stakeholders and (3) developing a shared vision, to (4) actioning change mechanisms. This was underpinned by the (5) employment of effective communication strategies and (6) provision of resources to support change.

Conclusions: Evidence is developing to support the use of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. The design of future implementation strategies should be based on the inter-relating factors perceived to be associated with effective strategies.

Trial registration: This systematic review was registered with Prospero (record number: 42016032947).

Keywords: Implementation, Translation, Health, Policy, Management

* Correspondence: [email protected]
1 Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia
Full list of author information is available at the end of the article

© The Author(s). 2017 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Sarkies et al. Implementation Science (2017) 12:132 DOI 10.1186/s13012-017-0662-0


Background
The use of research evidence to inform health policy is strongly promoted [1]. This drive has developed with increased pressure on healthcare organisations to deliver the most effective health services in an efficient and equitable manner [2]. Policy and management decisions influence the ability of health services to improve societal outcomes by allocating resources to meet health needs [3]. These decisions are more likely to improve outcomes in a cost-efficient manner when they are based on the best available evidence [4–8].

Evidence-informed decision-making refers to the complex process of considering the best available evidence from a broad range of information when delivering health services [1, 9, 10]. Policy and management decisions can be influenced by economic constraints, community views, organisational priorities, political climate, and ideological factors [11–16]. While these elements are all important in the decision-making process, without the support of research evidence they are an insufficient basis for decisions that affect the lives of others [17, 18].

Recently, increased attention has been given to implementation research to reduce the gap between research evidence and healthcare decision-making [19]. This growing but poorly understood field of science aims to improve the uptake of research evidence in healthcare decision-making [20]. Research implementation strategies such as knowledge brokerage and education workshops promote the uptake of research findings into health services. These strategies have the potential to create systematic, structural improvements in healthcare delivery [21]. However, many barriers exist to successful implementation [22, 23]. Individuals and health services face financial disincentives, lack of time or awareness of large evidence resources, limited critical appraisal skills, and difficulties applying evidence in context [24–30].

It is important to evaluate the effectiveness of implementation strategies and the inter-relating factors perceived to be associated with effective strategies. Previous reviews on health policy and management decisions have focussed on implementing evidence from single sources such as systematic reviews [29, 31]. Strategies that involved simple written information on accomplishable change may be successful in health areas where there is already awareness of evidence supporting practice change [29]. Re-conceptualisation or improved methodological rigor has been suggested by Mitton et al. to produce a richer evidence base for future evaluation; however, only one high-quality randomised controlled trial has been identified since [9, 32, 33]. As such, an updated review of emerging research in this topic is needed to inform the selection of research implementation strategies in health policy and management decisions.

The primary aim of this systematic review was to evaluate the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. A secondary aim of the review was to describe factors perceived to be associated with effective strategies and the inter-relationship between these factors.

Methods
Identification and selection of studies
This systematic review was registered with Prospero (record number: 42016032947) and has been reported consistent with the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines (Additional file 1). Ovid MEDLINE, Ovid EMBASE, PubMed, CINAHL Plus, Scopus, Web of Science Core Collection, and The Cochrane Library were searched electronically from January 01, 2000, to February 02, 2016, in order to retrieve literature relevant to the current healthcare environment. The search was limited to the English language, and terms relevant to the field, population, and intervention were combined (Additional file 2). Search terms were selected based on their sensitivity, specificity, validity, and ability to discriminate implementation research articles from non-implementation research articles [34–36]. Electronic database searches were supplemented by cross-checking the reference list of included articles and systematic reviews identified during the title and abstract screening. Searches were also supplemented by hand-searching publication lists from prominent authors in the field of implementation science.
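As an illustration of how field, population, and intervention term blocks can be combined into a single boolean query, the sketch below OR-joins synonyms within each concept and ANDs the concepts together. The terms and the `or_block` helper are illustrative assumptions only; the review's actual search strategy is in Additional file 2.

```python
# Illustrative sketch: assembling a boolean search string from three
# concept blocks. Terms are hypothetical examples, not the review's
# actual strategy (see Additional file 2).
field = ["knowledge translation", "implementation science", "research utilization"]
population = ["policy maker*", "health manager*", "decision maker*"]
intervention = ["knowledge broker*", "workshop*", "policy brief*"]

def or_block(terms):
    """OR-join synonyms within one concept block."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Concept blocks are AND-ed, so a record must match all three concepts.
query = " AND ".join(or_block(block) for block in (field, population, intervention))
print(query)
```

Broadening a block (adding OR terms) raises sensitivity, while adding an AND-ed concept raises specificity, which mirrors the sensitivity/specificity trade-off the authors describe.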

Study selection
Type of studies
All study designs were included. Experimental and quasi-experimental study designs were included to address the primary aim. No study design limitations were applied to address the secondary aim.

Population
The population included individuals or bodies who made resource allocation decisions at the managerial, executive, or policy level of healthcare organisations or government institutions. Broadly defined as healthcare policy-makers or managers, this population focuses on decision-making to improve population health outcomes by strengthening health systems, rather than individual therapeutic delivery. Studies investigating clinicians making decisions about individual clients were excluded, unless these studies also included healthcare policy-makers or managers.

Interventions
Interventions included research implementation strategies aimed at facilitating evidence-informed decision-making by healthcare policy-makers and managers. Implementation strategies may be defined as methods to incorporate the systematic uptake of proven evidence into decision-making processes to strengthen health systems [37]. While these interventions have been described differently in various contexts, for the purpose of this review, we will refer to these interventions as 'research implementation strategies'.

Type of outcomes
This review focused on a variety of possible outcomes that measure the use of research evidence. Outcomes were broadly categorised based on the four levels of Kirkpatrick's Evaluation Model Hierarchy: level 1—reaction (e.g. change in attitude towards evidence), level 2—learning (e.g. improved skills acquiring evidence), level 3—behaviour (e.g. self-reported action taking), and level 4—results (e.g. change in patient or organisational outcomes) [38].
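The four-level categorisation above can be made concrete as a simple lookup. This is a minimal sketch: the level labels and examples come from the paragraph above, while the `categorise` function name is a hypothetical illustration, not part of the review's methods.

```python
# Kirkpatrick's four evaluation levels as used in this review,
# with the example outcomes quoted above.
KIRKPATRICK_LEVELS = {
    1: "reaction (e.g. change in attitude towards evidence)",
    2: "learning (e.g. improved skills acquiring evidence)",
    3: "behaviour (e.g. self-reported action taking)",
    4: "results (e.g. change in patient or organisational outcomes)",
}

def categorise(level: int) -> str:
    """Return the review's description for a Kirkpatrick level (1-4)."""
    if level not in KIRKPATRICK_LEVELS:
        raise ValueError(f"Kirkpatrick levels run 1-4, got {level}")
    return KIRKPATRICK_LEVELS[level]
```

Under this scheme, a self-reported change in policy action would be tagged as level 3 (`categorise(3)`), while a measured change in organisational outcomes would be level 4.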

Screening
The web-based application Covidence (Covidence, Melbourne, Victoria, Australia) was used to manage references during the review [39]. Titles and abstracts were imported into Covidence and independently screened by the lead investigator (MS) and one of two other reviewers (RH, HL). Duplicates were removed throughout the review process using Endnote (EndNote™, Philadelphia, PA, USA), Covidence and manually during reference screening. Studies determined to be potentially relevant or whose eligibility was uncertain were retrieved and imported to Covidence for full-text review. The lead investigator (MS) and one of two other reviewers (RH, HL) then independently assessed the full-text articles for the remaining studies to ascertain eligibility for inclusion. A fourth reviewer (KAB) independently decided on inclusion or exclusion if there was any disagreement in the screening process. Attempts were made to contact authors of studies whose full-text articles were unable to be retrieved, and those that remained unavailable were excluded.

Quality assessment
Experimental study designs, including randomised controlled trials and quasi-experimental studies, were independently assessed for risk of bias by the lead investigator (MS) and one of two other reviewers (RH, HL) using the Cochrane Collaboration's tool for assessing risk of bias [40]. Non-experimental study designs were independently assessed for risk of bias by the lead investigator (MS) and one of two other reviewers (RH, HL) using design-specific risk-of-bias critical appraisal tools: (1) the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies from the National Heart, Lung, and Blood Institute (NHLBI) [41] and (2) the Critical Appraisal Skills Program (CASP) Qualitative Checklist for qualitative, case study, and evaluation designs [42].

Data extraction
Data was extracted using a standardised, piloted data extraction form developed by reviewers for the purpose of this study (Additional file 3). The lead investigator (MS) and one of two other reviewers (RH, HL) independently extracted data relating to the study details, design, setting, population, demographics, intervention, and outcomes for all included studies. Quantitative results were also extracted in the same manner from experimental studies that reported quantitative data relating to the effectiveness of research implementation strategies in promoting evidence-informed policy and management decisions in healthcare. Attempts were made to contact authors of studies where data was not reported or clarification was required. Disagreement between investigators was resolved by discussion, and where agreement could not be reached, an independent fourth reviewer (KAB) was consulted.

Data analysis
A formal meta-analysis was not undertaken due to the small number of studies identified and high levels of heterogeneity in study approaches. Instead, a narrative synthesis of experimental studies evaluating the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare and a thematic synthesis of non-experimental studies were performed to describe factors perceived to be associated with effective strategies and the inter-relationship between these factors. Experimental studies were synthesised narratively, defined as studies reporting quantitative results with both an experimental and comparison group. This included specified quasi-experimental designs, which report quantitative before and after results for primary outcomes related to the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. Non-experimental studies were synthesised thematically, defined as studies reporting quantitative results without both an experimental and control group, or studies reporting qualitative results. This included quasi-experimental studies that do not report quantitative before and after results for primary outcomes related to the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare.

The thematic synthesis was informed by an inductive thematic approach for data referring to the factors perceived to be associated with effective strategies and the inter-relationship between these factors. The thematic synthesis in this systematic review was based on methods described by Thomas and Harden [43]. Methods involved three stages of analysis: (1) line-by-line coding of text, (2) inductive development of descriptive themes similar to those reported in primary studies, and (3) analytical themes representing new interpretive constructs undeveloped within studies but apparent between studies once data is synthesised. Data reported in the results section of included studies were reviewed line-by-line and open coded according to meaning and content by the lead investigator (MS). Codes were developed using an inductive approach by the lead investigator (MS) and a second reviewer (TH). Concurrent with data analysis, this entailed constant comparison, ongoing development, and comparison of new codes as each study was coded. Immersing reviewers in the data, reflexive analysis, and peer debriefing techniques were used to ensure methodological rigor throughout the process. Codes and code structure were considered finalised at the point of theoretical saturation (when no new concepts emerged from a study). A single researcher (MS) was chosen to conduct the coding in order to embed the interpretation of text within a single immersed individual to act as an instrument of data curation [44, 45]. Simultaneous axial coding was performed by the lead investigator (MS) and a second reviewer (TH) during the original open coding of data to identify relationships between codes and organise coded data into descriptive themes. Once descriptive themes were developed, the two investigators then organised data across studies into analytical themes using a deductive approach by outlining relationships and interactions between codes across studies.
To ensure methodological rigor, a third reviewer (JW) was consulted via group discussion to develop final consensus. The lead author (MS) reviewed any disagreements in descriptive and analytical themes by returning to the original open codes. This cyclical process was repeated until themes were considered to sufficiently describe the factors perceived to be associated with effective strategies and the inter-relationship between these factors.

Results
Search results
The search strategy identified a total of 7783 articles: 7716 were identified by the electronic search strategy, 56 from reference checking of identified systematic reviews, 8 from reference checking of included articles, and 3 articles from hand-searching publication lists of prominent authors. Duplicates (3953) were removed using Endnote (n = 3906) and Covidence (n = 47), leaving 3830 articles for screening (Fig. 1).

Of the 3830 articles, 96 were determined to be potentially eligible for inclusion after title and abstract screening (see Additional file 4 for the full list of 96 articles). The full-text of these 96 articles was then reviewed, with 19 studies (n = 21 articles) meeting all relevant criteria for inclusion in this review [9, 27, 46–64]. The most common reason for exclusion upon full-text review was that articles did not examine the effect of a research implementation strategy on decision-making by healthcare policy-makers or managers (n = 22).
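The PRISMA flow counts above can be reconciled arithmetically. This small sketch uses only the counts reported in the paragraph above to confirm they are internally consistent:

```python
# Reconciling the PRISMA flow counts reported in the Results.
identified = {
    "electronic search": 7716,
    "systematic review references": 56,
    "included article references": 8,
    "hand-searched publication lists": 3,
}
duplicates = {"EndNote": 3906, "Covidence": 47}

total_identified = sum(identified.values())
total_duplicates = sum(duplicates.values())
screened = total_identified - total_duplicates

assert total_identified == 7783   # total articles identified
assert total_duplicates == 3953   # duplicates removed
assert screened == 3830           # titles and abstracts screened
```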

Characteristics of included studies
The characteristics of included studies are shown in Table 1. Three experimental studies evaluated the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare systems. Sixteen non-experimental studies described factors perceived to be associated with effective research implementation strategies.

Study design
Of the 19 included studies, there were two randomised controlled trials (RCTs) [9, 46], one quasi-experimental study [47], four program evaluations [48–51], three implementation evaluations [52–54], three mixed methods [55–57], two case studies [58, 59], one survey evaluation [63], one process evaluation [64], one cohort study [60], and one cross-sectional follow-up survey [61].

Participants and settings
The largest number of studies were performed in Canada (n = 6), followed by the United States of America (USA) (n = 3), the United Kingdom (UK) (n = 2), Australia (n = 2), multi-national (n = 2), Burkina Faso (n = 1), the Netherlands (n = 1), Nigeria (n = 1), and Fiji (n = 1). Health topics where research implementation took place were varied in context. Decision-makers were typically policy-makers, commissioners, chief executive officers (CEOs), program managers, coordinators, directors, administrators, policy analysts, department heads, researchers, change agents, fellows, vice presidents, stakeholders, clinical supervisors, and clinical leaders, from the government, academia, and non-government organisations (NGOs), of varying education and experience.

Research implementation strategies
There was considerable variation in the research implementation strategies evaluated; see Table 2 for a summary description. These strategies included knowledge brokering [9, 49, 51, 52, 57], targeted messaging [9, 64], database access [9, 64], policy briefs [46, 54, 63], workshops [47, 54, 56, 60], digital materials [47], fellowship programs [48, 50, 59], literature reviews/rapid reviews [49, 56, 58, 61], consortium [53], certificate course [54], multi-stakeholder policy dialogue [54], and multifaceted strategies [55].


Quality/risk of bias
Experimental studies
The potential risk of bias for included experimental studies according to the Cochrane Collaboration tool for assessing risk of bias is presented in Table 3. None of the included experimental studies reported methods for allocation concealment, blinding of participants and personnel, and blinding of outcome assessment [9, 46, 47]. Other potential sources of bias were identified in each of the included experimental studies, including (1) inadequate reporting of p values for mixed-effects models, results for hypothesis two, and comparison of health policies and programs (HPP) post-intervention in one study [9], (2) pooling of data from both intervention and control groups, which limited the ability to evaluate the success of the intervention in one study [47], and (3) inadequate reporting of analysis and results in another study [46]. Adequate random sequence generation was reported in two studies [9, 46] but not in one [47]. One study reported complete outcome data [9]; however, large loss to follow-up was identified in two studies [46, 47]. It was unclear whether risk of selective reporting bias was present for one study [46], as outcomes were not adequately pre-specified in the study. Risk of selective reporting bias was identified for one study that did not report p values for sub-group analysis [9] and another that only reported change scores for outcome measures [47].

Non-experimental studies
The potential risk of bias for included non-experimental studies according to the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies from the National Heart, Lung, and Blood Institute, and the Critical Appraisal Skills Program (CASP) Qualitative Checklist is presented in Tables 4 and 5.

Fig. 1 PRISMA Flow Diagram


Table

1Characteristicsof

includ

edstud

ies

Autho

r,year,

coun

try

Stud

yde

sign

Health

topic

Health

organisatio

nsetting

Decision-maker

popu

latio

nCon

trol

grou

pResearch

implem

entatio

ngrou

pOutcomemeasure

Beynon

etal.2012,

multi-natio

nal[46]

Rand

omised

controlled

trial

Health

inlow-

andmiddle-

income

coun

tries

Publiche

alth

Profession

sfro

mgo

vernmen

tandno

n-go

vernmen

torganisatio

nsandacadem

ia(n

=807)

Existin

gInstitu

teof

Develop

men

tStud

ies

publication

from

theIn

FocusPo

licy

Briefingseries

Basic3-page

policybrief

Basic3-page

policybrief

plus

anexpe

rtop

inionpiece

Basic3-page

policybrief

plus

anun

named

research

fellow

opinion

piece

Onlinequ

estio

nnaires

(immed

iately,1

weekand

3mon

thspo

st)

Semi-structuredinterviews

(in-between1weekand3

mon

thsandafter3

mon

thqu

estio

nnaires)

Brow

nson

etal.

2007,U

SA[47]

Quasi-

expe

rimen

tal

Guide

lines

for

prom

oting

physicalactivity

Stateandlocalh

ealth

departmen

ts(n

=8)

Health

departmen

tprog

ram

managers,administrators,

division

,bureau,or

agen

cyhe

ads,and‘other’p

osition

se.g.

prog

ram

planne

r,nu

trition

ist

(State

n=58)

(Localn=55)

(Other

n=80)

Remaining

states

andthe

Virgin

Island

sserved

asthe

comparison

grou

p)

Worksho

ps,ong

oing

technicalassistanceand

distrib

utionof

aninstructional

CD-ROM

25-item

questio

nnaire

survey

(2years)

Bullock

etal.

2012,U

K[48]

Prog

ramme

evaluatio

ncase

stud

y

Non

-spe

cific

NHShe

alth

service

deliveryorganisatio

ns(n

=10)

Managem

entfellows

(n=11)

Chief

investigators(n

=10)

Add

ition

alco-app

licants

from

theresearch

team

s(n

=3)

Workplace

line-managers

(n=12)

(Totaln=36)

Non

eUKServiceDeliveryand

Organisation(SDO)Managem

ent

Fellowship

prog

ramme

Semi-structuredface-to-

face

interviews

Cam

pbelletal.

2011,A

ustralia

[49]

Prog

ram

evaluatio

nRang

eof

topics

related

topo

pulatio

nhe

alth,health

services

organisatio

nandde

livery,

andcost

effectiven

ess

State-levelp

olicy

agen

cies,including

both

theNew

South

Wales

andVictorian

Dep

artm

entsof

Health

(n=5)

Policym

akers(n

=8)

Non

e‘Evide

ncecheck’rapidpo

licy

relevant

review

andknow

ledg

ebrokers

Structured

interviews

(2–3

years)

Chambe

rset

al.

2012,U

K[58]

Casestud

yAdo

lescen

tswith

eatin

gdisorders

Prim

arycare

LocalN

HScommission

ers

andclinicians

(n=15)

Non

eCon

textualised

eviden

cebriefing

basedon

system

aticreview

Shortevaluatio

nqu

estio

nnaire

Champagn

eet

al.

2014,C

anada[59]

Casestud

ies

Non

-spe

cific

Acade

miche

alth

centres(n

=6)

Extrafellows,SEARC

Hers,

Colleagues,Supe

rvisors,

Vice-preside

ntsandCEO

s(n

=84)

Non

eExecutiveTraining

forResearch

App

lication(EXTRA

)prog

ram

Swift,Efficien

t,App

licationof

Research

inCom

mun

ityHealth

(SEA

RCH)Classicprog

ram

Semi-structuredinterviews

anddata

from

available

organisatio

nald

ocum

ents

Cou

rtne

yet

al.

2007,U

SA[60]

Coh

ortstud

ySubstance

abuse

Com

mun

ity-based

treatm

entun

its(n

=53

units

from

Directorsandclinical

supe

rvisors(n

=309)

Non

e2-dayworksho

p(entitled

“TCUMod

elTraining

-making

itreal”)

Com

pliancewith

early

step

sof

consultin

gand

Sarkies et al. Implementation Science (2017) 12:132 Page 6 of 20

Page 7: The effectiveness of research implementation strategies ... · The effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions

Table 1 Characteristics of included studies (Continued)

Author, year, country | Study design | Health topic | Health organisation setting | Decision-maker population | Control group | Research implementation group | Outcome measure

(Courtney et al. 2007, continued) treatment programs; n=24 multisite parent organisations) | planning activities (1 month); Organisational Readiness for Change (ORC) assessment (1 month)

Dagenais et al. 2015, Burkina Faso [52] | Implementation evaluation | Maternal health, malaria prevention, free healthcare, and family planning | Public health | Researchers; knowledge brokers; health professionals; community-based organisations; and local, regional, and national policy-makers (n=47) | None | Knowledge broker | Semi-structured individual interviews and participant training session questionnaires

Dobbins et al. 2001, Canada [61] | Cross-sectional follow-up survey | Home visiting as a public health intervention, community-based heart health promotion, adolescent suicide prevention, community development, and parent-child health | Public health units (n=41) | Public health decision-makers (n=147) | None | Systematic reviews | Cross-sectional follow-up telephone survey

Dobbins et al. 2009, Canada [9] | Randomised controlled trial | Promotion of healthy body weight in children | Public health departments (n=108) | Front-line staff 35%, managers 26%, directors 10%, coordinators 9%, other 20% (n=108) | Access to an online registry of research evidence | Tailored, targeted messages plus access to an online registry of research evidence; Knowledge broker plus tailored, targeted messages plus access to an online registry of research evidence | Telephone-administered survey (knowledge transfer and exchange data collection tool) 1–3 months post completion of intervention (intervention lasted 12 months)

Dopp et al. 2013, Netherlands [55] | Mixed methods process evaluation | Dementia | Home-based community health | Managers (n=20), physicians (n=36), occupational therapists (n=36) | None | Multifaceted implementation strategy | Semi-structured telephone interviews with managers (3–5 months); semi-structured focus groups with occupational therapists (2 months)

Flanders et al. 2009, USA [53] | Implementation evaluation | Patient safety | Teaching and nonteaching, urban and rural, government and private, as well as academic and (row continues on the following page) | Hospitalists or quality improvement staff, representatives from each institution's department of quality or department of patient safety (n=9) | None | The Hospitalists as Emerging Leaders in Patient Safety (HELPS) Consortium | Web-based survey (post meetings)


(Flanders et al. 2009, continued) community settings (n=9)

Gagliardi et al. 2008, Canada [56] | Mixed methods exploratory | Colorectal cancer | Not specified | Researchers (n=6), clinicians (n=13), managers (n=5), policy-makers (n=5) (Total n=29) | – | Review of Canadian health services research in colorectal cancer based on published performance measures; 1-day workshop to prioritise research gaps, define research questions and plan implementation of a research study | Participant survey (prior to workshop); observation of workshop participants (during workshop); semi-structured interviews and observation of workshop participants (during workshop)

Kitson et al. 2011, Australia [50] | Project evaluation | 7 clinical topic areas identified in The Older Person and Improving Care (TOPIC 7) project | Large tertiary hospital (n=1) | Clinical nursing leaders (n=14), team members (n=28), managers (n=11) | None | Knowledge translation toolkit | Semi-structured interviews and questionnaires

Moat et al. 2014, multi-national [63] | Survey evaluation | Health in low- and middle-income countries | Public health | Policy-makers, stakeholders and researchers (n=530) | None | Evidence briefs; deliberative dialogues | Questionnaire surveys

Traynor et al. 2014, Canada [57] | Single mixed-methods study and a case study | Child obesity | Canadian public health departments (n=30) (case studies n=3) | Health department staff (RCT n=108; Case A n=258; Case B n=391; Case C n=155) | Access to an online registry of research evidence | Knowledge brokering | Knowledge broker journaling (baseline, interim, follow-up); qualitative interviews n=12 (1 year); case study interviews n=37 (baseline, interim and 22 month follow-up)

Uneke et al. 2015, Nigeria [54] | Implementation evaluation | Low- and middle-income country health | Public health | Directors from Ministry of Health (n=9); senior researchers from the university (n=5); NGO executive director (n=1); director of public health in the local government service commission (n=1); executive secretary of the AIDS control agency (n=1); state focal person of Millennium Development Goals (n=1) (Total n=18) | None | Training workshop (HPAC); certificate course (HPAC); policy brief and hosting of a multi-stakeholder policy dialogue (HPAC) | Semi-structured interviews (end of each intervention); group discussions

(cells belonging to the following row) Senior managers (n=20) | None | Semi-structured interviews


Waqa et al. 2013, Fiji [51] | Process evaluation | Overweight and obesity | Public health government organisations (n=6); NGOs (n=2) | Middle managers (n=22), junior managers (n=7) (Total n=49) | Policy brief and hosting of a multi-stakeholder policy dialogue (HPAC) | Process diaries

Wilson et al. 2015, Canada [64] | Process evaluation | Nonspecific | Health department units (n=6) | Policy analysts (n=9): senior analysts (n=8), junior analysts (n=1) | None | Access to an online registry of research evidence | Semi-structured telephone interviews


Table 2 Implementation strategy summary description

Study (author, year) | Implementation strategy | Theoretical framework | Summary description

Dobbins 2009 [57, 9] | Access to online registry of research evidence | Dobbins framework | Reference offered a link to a short summary and full text of each review

Dobbins 2009 [57, 9] | Tailored, targeted messages and access to online registry of research evidence | Dobbins framework | Title of systematic review and link to full reference, including abstract, sent via email; reference offered a link to a short summary and full text of each review

Dobbins 2009 [57, 9] | Knowledge broker, tailored messages, and access to online registry of research evidence | Dobbins framework | Knowledge brokers ensured relevant evidence was transferred in useful ways to decision-makers to assist skills and capacity development for translating evidence into local healthcare delivery. Activities included regular electronic and telephone communication, one face-to-face site visit, and invitation to a workshop. Title of systematic review and link to full reference, including abstract, sent via email; reference offered a link to a short summary and full text of each review

Beynon 2012 [46] | Basic 3-page policy brief | A simple theory of change for a policy brief | Link to policy brief sent via email

Beynon 2012 [46] | Basic 3-page policy brief plus an expert opinion piece | A simple theory of change for a policy brief | Same basic 3-page policy brief plus an expert opinion piece credited to and written by a sector expert, Lawrence Haddad; link to policy brief sent via email

Beynon 2012 [46] | Basic 3-page policy brief plus an un-credited expert opinion piece | A simple theory of change for a policy brief | Same basic 3-page policy brief and expert opinion piece but credited to an unnamed research fellow; link to policy brief sent via email

Brownson 2007 [47] | Workshops, ongoing technical assistance, and distribution of instructional digital materials | Framework for a systematic approach to promoting effective physical activity programs and policies | Workshops included formal presentations, case study applications, and 'real-world' examples. Ongoing technical assistance included strategic planning, grant writing, tuition waivers, consultation for effective strategy planning, and dissemination guidance. Digital materials included additional information, prominent public health leader interviews, and resource tools

Courtney 2007 [60] | Workshop | The change book | Pre-workshop completion of organisational readiness for change assessment. Workshop included conceptual overview presentations, personalised feedback, comparison with other agencies, and group work

Bullock 2012 [48] | Fellowship program | Programme evaluation framework (adapted from Kirkpatrick) | Practicing managers work within research teams for the duration of a funded project

Campbell 2011 [49] | 'Evidence check' rapid policy-relevant review and knowledge brokers | Van Kammen et al.'s approach to knowledge brokering | Pre-meeting commissioning tool completed prior to knowledge broker meetings, which clarified the research question; a rapid review summary of evidence on the policy area is then performed

Chambers 2012 [58] | Contextualised evidence briefing based on systematic review | Facilitators of the use of research evidence identified by a systematic review (Innvaer et al. [28]) | Researcher attended meeting to clarify the research question and prepared a concise evidence briefing on the policy area

Champagne 2014 [59] | Executive Training for Research Application (EXTRA) program | Knowledge creation logic model | Program included residency sessions, projects, educational activities, networking, and post-program activities

Champagne 2014 [59] | Swift, Efficient, Application of Research in Community Health (SEARCH) Classic program | Knowledge creation logic model | Program included modules, inter-module work, and application of knowledge to practice-based projects

Dagenais 2015 [52] | Knowledge broker | Theoretical models for understanding health behaviour | Knowledge broker tasks included liaison, information management and support, partner meetings, developing documentary research strategies, database set-up for relevant information, drafting summary documents, workshops, and developing and monitoring action plans

(following row, continued on the next page) Systematic reviews | –


Table 2 Implementation strategy summary description (Continued)

Study (author, year) | Implementation strategy | Theoretical framework | Summary description

Dobbins 2001 [61] | Systematic reviews of the effectiveness of public health interventions disseminated to public health decision-makers

Dopp 2013 [55] | Multifaceted implementation strategy | The model of Grol and Wensing | Educational materials, educational meetings, outreach visits, newsletters, and reminders

Flanders 2009 [53] | The Hospitalists as Emerging Leaders in Patient Safety (HELPS) Consortium | – | Meetings on quality improvement methodology and substantive patient safety-related topics, and a final half-day session drawing out learnings and next steps

Gagliardi 2008 [56] | Comprehensive review and workshop | Author's conceptual model of factors influencing effectiveness of knowledge exchange | Comprehensive review of Canadian health services research in colorectal cancer based on published performance measures, and workshop to prioritise research gaps, define research questions, and plan implementation of a research study

Kitson 2011 [50] | Knowledge translation toolkit | – | Team recruitment, clarification, stakeholder engagement, pre-strategy evaluation, training, support meetings, communication and feedback, process evaluation, dissemination (e.g. posters and presentations), future planning, and program evaluation

Moat et al. 2014, multi-national [63] | Evidence briefs; deliberative dialogues | Theory of planned behaviour | Evidence briefs and deliberative dialogues across a range of issues and low- and middle-income countries

Uneke 2015 [54] | Training, workshop, certificate course, policy brief, and hosting of a multi-stakeholder policy dialogue | – | Workshop featuring training on the role of research evidence, preparation of policy briefs, how to organise and use policy dialogues, and how to set priorities. Certificate course aimed to foster research capacity, leadership, enhanced capacity for evidence-informed decision-making, and health policy monitoring/evaluation. Policy briefs were produced, and the multi-stakeholder policy dialogue between key stakeholders was then held

Waqa 2013 [51, 62] | Knowledge broker capacity building | – | Knowledge broker coordinated organisation recruitment, mapped the policy environment, analysed organisational capacity and support for evidence-informed policymaking, developed evidence-informed policymaking skills, and facilitated development of evidence-informed policy briefs

Wilson et al. 2015, Canada [64] | Access to online registry of research evidence | Framework for assessing country-level efforts to link research to action | The 'self-serve' evidence service consisted only of database access

Wilson et al. 2015, Canada [64] | Access to online registry of research evidence, email alerts, and full-text availability | Framework for assessing country-level efforts to link research to action | The 'full-serve' evidence service included (1) database access for research evidence addressing questions about governance, financial and delivery arrangements within which programs, services and drugs are provided, and about implementation strategies; (2) monthly email alerts about new additions to the database; and (3) full-text article availability

Table 3 Risk of bias of included experimental studies using the Cochrane Collaboration tool for assessing risk of bias


Narrative synthesis results: effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare

Definitive estimates of implementation strategy effect are limited due to the small number of identified studies and heterogeneity in implementation strategies and reported outcomes. A narrative synthesis of results is described for changes in reaction/attitudes/beliefs, learning, behaviour, and results. See Table 6 for a summary of study results.

Randomised controlled trials

Interestingly, the policy brief accompanied by an expert opinion piece was thought to improve both level 1 change in reaction/attitudes/beliefs and level 3 behaviour change outcomes. This was referred to as an "authority effect" [46]. Tailored, targeted messages also reportedly improved level 3 behaviour change outcomes. However, the addition of a knowledge broker to this strategy may have been detrimental to these outcomes. When organisational research culture was considered, health departments with a low research culture may have benefited from the addition of a knowledge broker, although no p values were provided for this finding [9].

Non-randomised studies

The effect of workshops, ongoing technical assistance, and distribution of instructional digital materials on level 1 change in reaction/attitudes/beliefs outcomes was difficult to determine, as many measures did not change from baseline scores and the direction of change scores was not reported. However, a reduction in perceived support from state legislators for physical activity interventions was reported after the research implementation strategy. All level 2 learning outcomes were reportedly improved, with change scores larger for local than state health department decision-makers in every category except methods in understanding cost. Results were less clear for level 3 behaviour change outcomes. Only self-reported individual-adapted health behaviour change was thought to have improved [47].

Thematic synthesis results: conceptualisation of factors perceived to be associated with effective strategies and the inter-relationship between these factors

Due to the relative paucity of evidence from effectiveness studies, a thematic synthesis of non-experimental studies was used to explore the factors perceived to be associated with effective strategies and the inter-relationship between these factors. Six broad, interrelated, analytic themes

Table 4 Risk of bias of included non-experimental studies using the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies

n/a not applicable


emerged from the thematic synthesis of data captured in this review (Fig. 2). We developed a conceptualisation of how these themes interrelated from data captured both within and across studies. Some of these analytic themes were specifically mentioned in individual papers, but none of the papers included in this review identified all of them, nor developed a conceptualisation of how they interrelated. The six analytic themes were conceptualised as having a unidirectional, hierarchical flow from (1) establishing an imperative for practice change, (2) building trust between implementation stakeholders, and (3) developing a shared vision, to (4) actioning change mechanisms. These were underpinned by (5) employment of effective communication strategies and (6) provision of resources to support change.

Establish imperative

Organisations and individuals were driven to implement research into practice when there was an imperative for

Table 5 Risk of bias of included non-experimental studies using the Critical Appraisal Skills Program (CASP) Qualitative Checklist


practice change. Decision-makers wanted to know why change was important to them, their organisation, and/or their community. Imperatives were seen as drivers of motivation for change to take place and were evident both internal to the decision-maker (personal gain) and external to the decision-maker (organisational and societal gain).

Personal gain Individuals were motivated to participate in research implementation projects where they could derive personal gain [48, 50, 56]. Involvement in research was viewed as an opportunity rather than an obligation [56]. This was particularly evident in one study by Kitson et al., where all nursing leaders unanimously agreed the potential benefit of supported, experiential learning was substantial, with 13 of 14 committing to leading further interdisciplinary, cross-functional projects [50].

Organisational and societal gain Decision-makers supported research implementation efforts when they aligned with an organisational agenda or an area where societal health needs were identified [48, 50, 53, 55, 59, 64]. Practice change was supported if it was deemed important by decision-makers and aligned with organisational priorities, whereas knowledge exchange was impeded if changes had questionable relevance to the workplace [48, 53, 64]. Individuals reported motivation to commit to projects they felt would address community needs. For example, in one study, nursing leaders identified their passion for health topics as a reason to volunteer in a practice change process [50]. In another study, managers were supportive

Table 6 Summary of study results

Study (author, year) | Implementation strategy | Level 1: change in reaction/attitudes/beliefs | Level 2: learning | Level 3: behaviour

Randomised controlled trial

Beynon 2012 [46] | Basic 3-page policy brief | High-quality ratings. Opinion about evidence strength or intervention effectiveness varies by health topic | – | Less likely to source other information and research related to the topic than control

Beynon 2012 [46] | Basic 3-page policy brief plus an expert opinion piece | High-quality rating. Opinion about evidence strength or intervention effectiveness varies by health topic. Increased intention to send policy brief to someone else and tell someone about key messages | – | Less likely to source other information and research related to the topic than control. Trend towards intentions persisting to actions. More likely to send policy brief to someone else

Beynon 2012 [46] | Basic 3-page policy brief plus an un-credited expert opinion piece | High-quality rating. Opinion about evidence strength or intervention effectiveness varies by health topic | – | Less likely to source other information and research related to the topic than control

Dobbins 2009 [9] | Tailored, targeted messages | – | – | Improved use of public health policies and programs compared to control

Dobbins 2009 [9] | Tailored, targeted messages plus a knowledge broker | – | – | Addition of knowledge broker potentially reduced use of public health policies and programs. However, improvements may have occurred in organisations with low research culture

Non-randomised controlled trial

Brownson 2007 [47] | Workshops, ongoing technical assistance, and digital resources | Change in whether heard of recommendations and attended training. Less likely to report state legislators were supportive of physical activity interventions. No change in other outcomes from baseline | All knowledge and skill measurements improved. Change larger for local than state health department decision-makers in every category except methods in understanding cost. The largest change related to attitudes | Improvement in self-reported individual-adapted health behaviour change. No difference in other behaviour change outcomes


of practice change to improve care of people with dementia, as they thought this would benefit the population [55].

Build trust

Relationships, leadership authority, and governance constituted the development of trust between stakeholder groups.

Relationships The importance of trusting relationships between managers, researchers, change agents, and staff was emphasised in a number of studies [48, 50, 54, 59, 64]. Developing new relationships through collaborative networking and constant contact reportedly addressed mutual mistrust between policy-makers and researchers, and engaged others to change practice [54, 59]. Bullock et al. described how pre-existing personal and professional relationships might facilitate implementation strategy success by utilising organisational knowledge and identifying workplace "gatekeepers" with whom to engage. In the same study, no real link between healthcare managers and academic resources was derived from fellows who were only weakly connected to healthcare organisations [48].

Leadership authority The leadership authority of those involved in research implementation influenced the development of trust between key stakeholders [50, 52, 55, 59, 61]. Dagenais et al. found recommendations and information were valued if credited to researchers and change agents whose input was trusted [52]. The perception that individuals with senior organisational roles reduce perceived risk and resistance to change was supported by Dobbins et al., who reported that seniority of individuals is a predictor of systematic review use in decision-making [50, 59, 61]. However, professional seniority should be related to the research implementation context, as a perceived lack of knowledge in the content area was a barrier to providing managerial support [55].

Governance A number of studies expressed the importance of consistent and sustained executive support in order to maintain project momentum [48, 50, 52, 53, 59, 64]. In the study by Kitson et al., individuals expressed concern and anxiety around reputational risk if consistent organisational support was not provided [50]. Organisational capacity was enhanced with strong management support and policies [57]. In their study, Uneke et al. identified good stewardship in the form of governance as providing accountability and protection for individuals and organisations. Participants in this study unanimously identified the need for performance measurement mechanisms for the health policy advisory committee to promote sustainability and independent evidence-to-policy advice [54]. Bullock et al. found that managers view knowledge exchange in a transactional manner and are keen to know and use project results as soon as possible. However, researchers and change agents may not wish to apply results due to the phase of the project [48]. This highlighted the importance of governance systems that support confidentiality and limit the release of project results before stakeholders are confident of findings.

Fig. 2 Conceptualisation of inter-related themes (analytic themes) associated with effective strategies and the inter-relationship between these factors


Develop shared vision

A shared vision for desired change and outcomes can be built around a common goal through improving understanding, influencing behaviour change, and working with the characteristics of organisations.

Stakeholder understanding Improving the understanding of research implementation was considered a precursor to building a shared vision [50, 52, 55, 56]. Policy-makers reported that lack of time prevented them from performing an evidence review, and desired experientially tailored information, education, and avoidance of technical language to improve understanding [52, 55, 58]. It was perceived that lack of clarity limited project outcomes in the study by Gagliardi et al., which emphasised the need for simple processes [56]. When challenges arose in Kitson et al., ensuring all participants understood their role from the outset of implementation was suggested as a process improvement [50].

Influence change Knowledge brokers in Campbell et al. were able to elicit well-defined research questions if they were open, honest, and frank in their approach to policy-makers. Policy-makers felt that knowledge brokering was more useful for shaping the parameters, scope, budget, and format of projects, providing guidance for decision-making rather than being prescriptive [49]. However, conclusive recommendations that aim for a consensus are viewed favourably by policy-makers, which means a balance must be achieved between providing guidance and being too prescriptive [63]. Interactive strategies may allow change agents to gain a better understanding of evidence in organisational decisions and guide attitudes towards evidence-informed decision-making. Champagne et al. observed fellows participating in this interactive, social process, and Dagenais et al. reported that practical exercises and interactive discussions were appreciated by knowledge brokers in their own training [52, 59]. Another study reported that challenges to work practices could be viewed as criticism; despite this, organisation staff valued leaders' ability to inspire a shared vision and identified 'challenging processes' as the most important leadership practice [50].

Characteristics of organisation Context-specific organisational characteristics such as team dynamics, change culture, and individual personalities can influence the effectiveness of research implementation strategies [50, 53, 56, 59]. Important factors in Flanders et al. were clear lines of authority in collaborative and effective multidisciplinary teams. Organisational readiness for change was perceived as both a barrier and a facilitator to research implementation, but higher staff consensus was associated with higher engagement in organisational change [60]. Strategies in Dobbins et al. were thought to be more effective if they were implemented in organisations with a learning culture and practices, or facilitated an organisational learning culture themselves, whereas Flanders et al. reported that solutions to hospital safety problems often created more work or change from long-standing practices, which proved a barrier to overcome [53, 61]. Individual resistance to change in the form of process concerns led to higher levels of dissatisfaction [50].

Provide resources to support change

Individuals were conscious of the need for implementation strategies to be adequately resourced [48–50, 55, 56, 58, 59, 61]. There was anxiety in the study by Döpp et al. around promoting research implementation programs, due to the fear of receiving more referrals than could be handled with current resourcing [55]. Managers mentioned service pressures as a major barrier to changing practice, with implementation research involvement dependent on workload and other professional commitments [50, 56]. Lack of time prevented evidence reviews from being performed, and varied access to human resources such as librarians was also identified as a barrier [58, 59]. Policy-makers and managers appreciated links to expert researchers, especially those who had infrequent or irregular contact with the academic sector previously [49]. Managers typically viewed engagement with research implementation as transactional, wanting funding for time release (beyond salary costs), while researchers and others from the academic sector considered knowledge exchange inherently valuable [48]. Vulnerability around leadership skills and knowledge in the study by Kitson et al. exposed the importance of training, education, and professional development opportunities. Ongoing training in critical appraisal of research literature was viewed as a predictor of whether systematic reviews influenced program planning [61].

Employ effective communication strategies

Studies and study participants expressed different preferences for the format and mode of contact for implementation strategies [48, 51, 52, 55, 56, 59, 64]. Face-to-face contact was preferred by the majority of participants in the study by Waqa et al. and was useful in acquiring and accessing relevant data or literature to inform the writing of policy briefs [51]. Telephone calls were perceived as successful in Döpp et al. because they increased involvement and the opportunity to ask questions [55]. Electronic communication formats in the study by Bullock et al. provided examples of evidence-based knowledge transfer from academic settings to the clinical setting. Fellows spent time reading literature at the university and would then send that information to the clinical workplace in an email, while managers stated that the availability of website information positively influenced its use [48]. Regular contact in the form of reminders encouraged actions, with the study by


Dagenais et al. finding that a lack of ongoing, regular contact with knowledge brokers in the field limited research implementation programs [52].

Action change mechanism

Reviewers interpreted the domains (analytical themes) representing a model of implementation strategy success as leading to a change mechanism. Change mechanisms refer to the actions taken by study participants to implement research into practice. Studies did not explicitly measure the change mechanisms that led to the implementation of research into practice. Instead, implicit measurements of change mechanisms were reported, such as knowledge gain and intention-to-act measures.

Discussion
This review found that there are numerous implementation strategies that can be utilised to promote evidence-informed policy and management decisions in healthcare. These relate to the ‘authority effect’ from a simple low-cost policy brief and knowledge improvement from a complex multifaceted workshop with ongoing technical assistance and distribution of instructional digital materials [46, 47]. The resource intensity of these strategies was relatively low. It was evident that providing more resource-intensive strategies is not always better than providing less, as the addition of a knowledge broker to a tailored targeted messaging strategy was less effective than the messages alone [9]. Due to the paucity of studies evaluating the effectiveness of implementation strategies, understanding why some implementation strategies succeed where others fail in different contexts is important for future strategy design. The thematic synthesis of the wider non-effectiveness literature included in our review has led us to develop a model of implementation strategy design that may action a change mechanism for evidence-informed policy and management decisions in healthcare [48–61, 63, 64].

Our findings were consistent with change management theories. The conceptual model of how themes interrelated both within and across studies includes similar stages to ‘Kotter’s 8 Step Change Model’ [65]. Leadership behaviours are commonly cited as organisational change drivers due to the formal power and authority that leaders have within organisations [66–68]. This supports the ‘authority effect’ described in Beynon et al. and the value decision-makers placed on information credited to experts they trust [46]. Authoritative messages are considered a key component of an effective policy brief, and therefore, organisations should consider partnering with authoritative institutions, research groups, or individuals to augment the legitimacy of their message when producing policy briefs [69]. Change management research proposes that change-related training

improves understanding, knowledge, and skills to embed a change vision at a group level [70–72]. The results of our review support this view that providing adequate training resources to decision-makers can improve understanding, knowledge, and skills, leading to desired change. The results of our thematic synthesis appear to support knowledge broker strategies in theory. Multi-component research implementation strategies are thought to have greater effects than simple strategies [73, 74]. However, the addition of knowledge brokers to a tailored targeted messaging research implementation strategy in Dobbins et al. was less effective than the messages alone [9]. This may indicate that in some cases, simple research implementation strategies may be more effective than complex, multi-component ones. Further development of strategies is needed to ensure that a number of different implementation options are available, which can be tailored to individual health contexts. A previous review by LaRocca et al. supports this finding, asserting that in some cases, complex strategies may diminish key messages and reduce understanding of information presented [10]. Further, the knowledge broker strategy in Dobbins et al. had little or no engagement from 30% of participants allocated to this group, emphasising the importance of tailoring strategy complexity and intensity to organisational need.

This systematic review was limited both in the quantity

and quality of studies that met inclusion criteria. Previous reviews have been similarly limited by the paucity of high-quality research evaluating the effectiveness of research implementation strategies in the review context area [10, 29, 32, 75]. The limited number of retrieved experimental, quantitatively evaluated effectiveness studies means the results of this review were mostly based on non-experimental qualitative data without an evaluation of effectiveness. Non-blinding of participants could have biased qualitative responses. Participants could have felt pressured to respond in a positive way if they did not wish to lose previously provided implementation resources, and responses could vary depending on the implementation context and what changes were being made; for example, if additional resources were being implemented to fill an existing evidence-to-practice gap, versus the disinvestment of resources due to a lack of supportive evidence. Despite these limitations, we believe our comprehensive search strategy retrieved a relatively complete identification of studies in the field of research. A previous Cochrane review in the same implementation context area recently identified only one study (also captured in our review) using their search strategy and inclusion criteria [33, 76]. A meta-analysis could not be performed due to the limited number of studies and high levels of heterogeneity in study approaches; as such, the results of this synthesis should be interpreted with caution. However, synthesising


data narratively and thematically allowed this review to examine not only the effectiveness of research implementation strategies in the context area but also the mechanisms behind inter-relating factors perceived to be associated with effective strategies. Since our original search strategy, we have been unable to identify additional full-texts from the 11 titles excluded due to no data reporting (e.g. protocol, abstract). However, the Developing and Evaluating Communication strategies to support Informed Decisions and practice based on Evidence (DECIDE) project has since developed a number of tools to improve the dissemination of evidence-based recommendations [77]. In addition, the relationship development, face-to-face interaction, and focus on organisational climates themes in our conceptual model are supported by the full version [78] of an excluded summary article [79], identified after the original search strategy.

Studies measured behaviour changes considered on the

third level of the Kirkpatrick Hierarchy but did not measure whether those behaviour changes led to their intended improved societal outcomes (level 4, Kirkpatrick Hierarchy). Future research should also evaluate changes in health and organisational outcomes. The conceptualisation of factors perceived to be associated with effective strategies and the inter-relationship between these factors should be interpreted with caution, as it was based on low levels of evidence according to the National Health and Medical Research Council (NHMRC) of Australia designations [80]. Therefore, there is a need for the association between these factors and effective strategies to be rigorously evaluated. Further conceptualisation of how to evaluate research implementation strategies should consider how to include health and organisation outcome measures to better understand how improved evidence-informed decision-making can lead to greater societal benefits. Future research should aim to improve the relatively low number of high-quality randomised controlled trials evaluating the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. This might allow formal meta-analysis to be performed, providing indications of which research implementation strategies are effective in which contexts.

Conclusions
Evidence is developing to support the use of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare. A number of inter-relating factors were thought to influence the effectiveness of strategies through establishing an imperative for change, building trust, developing a shared vision, and actioning change mechanisms.

Employing effective communication strategies and providing resources to support change underpin these factors, which should inform the design of future implementation strategies.

Additional files

Additional file 1: PRISMA 2009 checklist. (DOCX 26 kb)

Additional file 2: Search Strategy. (DOCX 171 kb)

Additional file 3: Data extraction 1 and 2. (XLSX 884 kb)

Additional file 4: Full list of 96 articles and reasons for full-text exclusion. (DOCX 125 kb)

Abbreviations
CEO: Chief executive officer; NGO: Non-government organisation; NHMRC: National Health and Medical Research Council; PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analysis; RCT: Randomised controlled trial; UK: United Kingdom; USA: United States of America

Acknowledgements
The authors would like to acknowledge the expertise provided by Jenni White and the support provided by the Monash University library staff and the Monash University and Monash Health Allied Health Research Unit.

Funding
No funding.

Availability of data and materials
Data are available from the corresponding author on reasonable request.

Authors’ contributions
MS was responsible for the conception, organisation and completion of this systematic review. MS developed the research question and search strategy, conducted the search, screened the retrieved studies, extracted the data, performed the analysis and quality appraisal, and prepared the manuscript. KAB was responsible for the oversight and management of the review. KAB contributed to the development of the inclusion and exclusion criteria; resolved screening, quality, and data extraction discrepancies between reviewers; and assisted with the manuscript preparation. ES was also responsible for the oversight and helped develop the final research question and inclusion criteria. ES assisted with selecting and using the quality appraisal tool, developing the data extraction tool, and preparing the manuscript. RH and HL were responsible for performing independent screening of identified studies and deciding upon inclusion or exclusion from the review. RH and HL also performed independent quality appraisal and data extraction for half of the included studies and contributed to the manuscript preparation. TH was responsible for the oversight and management of the review, assisted with data analysis and interpretation, and contributed to the manuscript preparation. All authors read and approved the final manuscript.

Authors’ information
Mitchell Sarkies is a physiotherapist from Melbourne, Victoria, Australia, with an interest in translating research into practice. He is currently a Ph.D. candidate at Monash University.

Ethics approval and consent to participate
Not applicable.

Consent for publication
Not applicable.

Competing interests
The authors declare that they have no competing interests.


Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author details
1Kingston Centre, Monash University and Monash Health Allied Health Research Unit, 400 Warrigal Road, Heatherton, VIC 3202, Australia. 2Monash University Department of Community Emergency Health and Paramedic Practice, Building H, McMahons Road, Frankston, VIC 3199, Australia.

Received: 20 February 2017 Accepted: 1 November 2017

References
1. Orton L, Lloyd-Williams F, Taylor-Robinson D, O’Flaherty M, Capewell S. The use of research evidence in public health decision making processes: systematic review. PLoS One. 2011;6(7):e21704.

2. Ciliska D, Dobbins M, Thomas H. Using systematic reviews in health services. In: Reviewing research evidence for nursing practice: systematic reviews; 2007. p. 243–53.

3. Mosadeghrad AM. Factors influencing healthcare service quality. Int J Health Policy Manag. 2014;3(2):77–89. https://doi.org/10.15171/ijhpm.2014.65.

4. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201.

5. Jernberg T, Johanson P, Held C, Svennblad B, Lindback J, Wallentin L. Association between adoption of evidence-based treatment and survival for patients with ST-elevation myocardial infarction. JAMA. 2011;305. https://doi.org/10.1001/jama.2011.522.

6. Davis D, Davis ME, Jadad A, Perrier L, Rath D, Ryan D, et al. The case for knowledge translation: shortening the journey from evidence to effect. BMJ. 2003;327(7405):33–5.

7. Madon T, Hofman K, Kupfer L, Glass R. Public health: implementation science. Science. 2007;318. https://doi.org/10.1126/science.1150009.

8. Chalmers I. If evidence-informed policy works in practice, does it matter if it doesn’t work in theory? Evid Policy. 2005;1. https://doi.org/10.1332/1744264053730806.

9. Dobbins M, Hanna SE, Ciliska D, Manske S, Cameron R, Mercer SL, et al. A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies. Implement Sci. 2009;4(1):1–16. https://doi.org/10.1186/1748-5908-4-61.

10. LaRocca R, Yost J, Dobbins M, Ciliska D, Butt M. The effectiveness of knowledge translation strategies used in public health: a systematic review. BMC Public Health. 2012;12.

11. Lavis JN, Ross SE, Hurley JE. Examining the role of health services research in public policymaking. Milbank Q. 2002;80(1):125–54.

12. Lavis JN. Research, public policymaking, and knowledge-translation processes: Canadian efforts to build bridges. J Contin Educ Health Prof. 2006;26(1):37–45.

13. Klein R. Evidence and policy: interpreting the Delphic oracle. J R Soc Med. 2003;96(9):429–31.

14. Walt G. How far does research influence policy? Eur J Public Health. 1994;4(4):233–5.

15. Bucknall T, Fossum M. It is not that simple nor compelling!: comment on “translating evidence into healthcare policy and practice: single versus multi-faceted implementation strategies—is there a simple answer to a complex question?”. Int J Health Policy Manag. 2015;4(11):787.

16. Bowen S, Erickson T, Martens PJ, Crockett S. More than “using research”: the real challenges in promoting evidence-informed decision-making. Healthc Policy. 2009;4(3):87.

17. Macintyre S, Petticrew M. Good intentions and received wisdom are not enough. J Epidemiol Community Health. 2000;54(11):802–3.

18. Chalmers I. Trying to do more good than harm in policy and practice: the role of rigorous, transparent, up-to-date evaluations. Ann Am Acad Pol Soc Sci. 2003;589(1):22–40. https://doi.org/10.1177/0002716203254762.

19. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7(1):50.

20. Peters DH, Adam T, Alonge O, Agyepong IA, Tran N. Implementation research: what it is and how to do it. BMJ. 2013;347:f6753. https://doi.org/10.1136/bmj.f6753.

21. Stone EG, Morton SC, Hulscher ME, Maglione MA, Roth EA, Grimshaw JM, et al. Interventions that increase use of adult immunization and cancer screening services: a meta-analysis. Ann Intern Med. 2002;136(9):641–51.

22. Paramonczyk A. Barriers to implementing research in clinical practice. Can Nurse. 2005;101(3):12–5.

23. Haynes B, Haines A. Barriers and bridges to evidence based clinical practice. BMJ. 1998;317(7153):273–6.

24. Lavis JN. How can we support the use of systematic reviews in policymaking? PLoS Med. 2009;6(11):e1000141.

25. Wilson PM, Watt IS, Hardman GF. Survey of medical directors’ views and use of the Cochrane Library. Br J Clin Gov. 2001;6(1):34–9.

26. Ram FS, Wellington SR. General practitioners use of the Cochrane Library in London. Prim Care Respir J. 2002;11(4):123–5.

27. Dobbins M, Cockerill R, Barnsley J. Factors affecting the utilization of systematic reviews: a study of public health decision makers. Int J Technol Assess Health Care. 2001;17(2):203–14.

28. Innvær S, Vist G, Trommald M, Oxman A. Health policy-makers’ perceptions of their use of evidence: a systematic review. J Health Serv Res Policy. 2002;7(4):239–44.

29. Murthy L, Shepperd S, Clarke MJ, Garner SE, Lavis JN, Perrier L, Roberts NW, Straus SE. Interventions to improve the use of systematic reviews in decision-making by health system managers, policy makers and clinicians. Cochrane Database Syst Rev. 2012;(9):CD009401. https://doi.org/10.1002/14651858.CD009401.pub2.

30. Tetroe JM, Graham ID, Foy R, Robinson N, Eccles MP, Wensing M, et al. Health research funding agencies’ support and promotion of knowledge translation: an international study. Milbank Q. 2008;86(1):125–55. https://doi.org/10.1111/j.1468-0009.2007.00515.x.

31. Perrier L, Mrklas K, Lavis JN, Straus SE. Interventions encouraging the use of systematic reviews by health policymakers and managers: a systematic review. Implement Sci. 2011;6:43. https://doi.org/10.1186/1748-5908-6-43.

32. Mitton C, Adair CE, McKenzie E, Patten SB, Waye Perry B. Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007;85(4):729–68. https://doi.org/10.1111/j.1468-0009.2007.00506.x.

33. Armstrong R. Evidence-informed public health decision-making in local government [PhD thesis]. Melbourne: University of Melbourne; 2011.

34. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a tower of Babel? Implement Sci. 2010;5:16. https://doi.org/10.1186/1748-5908-5-16.

35. McKibbon KA, Lokker C, Wilczynski NL, Haynes RB, Ciliska D, Dobbins M, et al. Search filters can find some but not all knowledge translation articles in MEDLINE: an analytic survey. J Clin Epidemiol. 2012;65(6):651–9. https://doi.org/10.1016/j.jclinepi.2011.10.014.

36. Lokker C, McKibbon KA, Wilczynski NL, Haynes RB, Ciliska D, Dobbins M, et al. Finding knowledge translation articles in CINAHL. Stud Health Technol Inform. 2010;160(Pt 2):1179–83.

37. Implementation Science. About. 2016. http://implementationscience.biomedcentral.com/about. Accessed 19 Jun 2016.

38. Kirkpatrick DL. Evaluating human relations programs for industrial foremen and supervisors. Madison: University of Wisconsin; 1954.

39. Covidence. 2016. https://www.covidence.org/. Accessed 18 Nov 2016.

40. Higgins JP, Altman DG, Gøtzsche PC, Jüni P, Moher D, Oxman AD, et al. The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.

41. NHLBI. Background: development and use of study quality assessment tools. US Department of Health and Human Services; 2014. http://www.nhlbi.nih.gov/health-pro/guidelines/in-develop/cardiovascular-risk-reduction/tools/background.

42. CASP. NHS Critical Appraisal Skills Programme (CASP): appraisal tools. NHS Public Health Resource Unit. 2017. http://www.casp-uk.net/casp-tools-checklists.

43. Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Med Res Methodol. 2008;8:45. https://doi.org/10.1186/1471-2288-8-45.

44. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72.

45. Janesick V. The choreography of qualitative research: minuets, improvisations, and crystallization. In: Denzin N, Lincoln Y, editors. Strategies of qualitative inquiry. 2nd ed. Thousand Oaks: Sage Publications; 2003. p. 46–79.


46. Beynon P, Chapoy C, Gaarder M, Masset E. What difference does a policy brief make? Full report of an IDS, 3ie, Norad study. New Delhi: Institute of Development Studies and the International Initiative for Impact Evaluation (3ie); 2012. http://www.3ieimpact.org/media/filer_public/2012/08/22/fullreport_what_difference_does_a_policy_brief_make__2pdf_-_adobe_acrobat_pro.pdf.

47. Brownson RC, Ballew P, Brown KL, Elliott MB, Haire-Joshu D, Heath GW, et al. The effect of disseminating evidence-based interventions that promote physical activity to health departments. Am J Public Health. 2007;97(10):1900–7.

48. Bullock A, Morris ZS, Atwell C. Exchanging knowledge through healthcare manager placements in research teams. Serv Ind J. 2013;33(13–14):1363–80.

49. Campbell D, Donald B, Moore G, Frew D. Evidence check: knowledge brokering to commission research reviews for policy. Evid Policy. 2011;7. https://doi.org/10.1332/174426411x553034.

50. Kitson A, Silverston H, Wiechula R, Zeitz K, Marcoionni D, Page T. Clinical nursing leaders’, team members’ and service managers’ experiences of implementing evidence at a local level. J Nurs Manag. 2011;19(4):542–55. https://doi.org/10.1111/j.1365-2834.2011.01258.x.

51. Waqa G, Mavoa H, Snowdon W, Moodie M, Schultz J, McCabe M. Knowledge brokering between researchers and policymakers in Fiji to develop policies to reduce obesity: a process evaluation. Implement Sci. 2013;8. https://doi.org/10.1186/1748-5908-8-74.

52. Dagenais C, Some TD, Boileau-Falardeau M, McSween-Cadieux E, Ridde V. Collaborative development and implementation of a knowledge brokering program to promote research use in Burkina Faso, West Africa. Glob Health Action. 2015;8:26004. https://doi.org/10.3402/gha.v8.26004.

53. Flanders SA, Kaufman SR, Saint S, Parekh VI. Hospitalists as emerging leaders in patient safety: lessons learned and future directions. J Patient Saf. 2009;5(1):3–8.

54. Uneke CJ, Ndukwe CD, Ezeoha AA, Uro-Chukwu HC, Ezeonu CT. Implementation of a health policy advisory committee as a knowledge translation platform: the Nigeria experience. Int J Health Policy Manag. 2015;4(3):161–8. https://doi.org/10.15171/ijhpm.2015.21.

55. Döpp CM, Graff MJ, Rikkert MGO, van der Sanden MWN, Vernooij-Dassen MJ. Determinants for the effectiveness of implementing an occupational therapy intervention in routine dementia care. Implement Sci. 2013;8(1):1.

56. Gagliardi AR, Fraser N, Wright FC, Lemieux-Charles L, Davis D. Fostering knowledge exchange between researchers and decision-makers: exploring the effectiveness of a mixed-methods approach. Health Policy. 2008;86(1):53–63.

57. Traynor R, DeCorby K, Dobbins M. Knowledge brokering in public health: a tale of two studies. Public Health. 2014;128(6):533–44.

58. Chambers D, Grant R, Warren E, Pearson S-A, Wilson P. Use of evidence from systematic reviews to inform commissioning decisions: a case study. Evid Policy. 2012;8(2):141–8.

59. Champagne F, Lemieux-Charles L, Duranceau M-F, MacKean G, Reay T. Organizational impact of evidence-informed decision making training initiatives: a case study comparison of two approaches. Implement Sci. 2014;9(1):53.

60. Courtney KO, Joe GW, Rowan-Szal GA, Simpson DD. Using organizational assessment as a tool for program change. J Subst Abuse Treat. 2007;33(2):131–7.

61. Dobbins M, Cockerill R, Barnsley J, Ciliska D. Factors of the innovation, organization, environment, and individual that predict the influence five systematic reviews had on public health decisions. Int J Technol Assess Health Care. 2001;17(4):467–78.

62. Waqa G, Mavoa H, Snowdon W, Moodie M, Nadakuitavuki R, McCabe M. Participants’ perceptions of a knowledge-brokering strategy to facilitate evidence-informed policy-making in Fiji. BMC Public Health. 2013;13. https://doi.org/10.1186/1471-2458-13-725.

63. Moat KA, Lavis JN, Clancy SJ, El-Jardali F, Pantoja T. Evidence briefs and deliberative dialogues: perceptions and intentions to act on what was learnt. Bull World Health Organ. 2014;92(1):20–8.

64. Wilson MG, Grimshaw JM, Haynes RB, Hanna SE, Raina P, Gruen R, et al. A process evaluation accompanying an attempted randomized controlled trial of an evidence service for health system policymakers. Health Res Policy Syst. 2015;13(1):78.

65. Kotter JR. Leading change—why transformation efforts fail. Harv Bus Rev. 2007;85(1):96–103.

66. Whelan-Berry KS, Somerville KA. Linking change drivers and the organizational change process: a review and synthesis. J Chang Manag. 2010;10(2):175–93.

67. Trice HM, Beyer JM. Cultural leadership in organizations. Organ Sci. 1991;2(2):149–69.

68. Downs A, Besson D, Louart P, Durant R, Taylor-Bianco A, Schermerhorn Jr J. Self-regulation, strategic leadership and paradox in organizational change. J Organ Chang Manag. 2006;19(4):457–70.

69. Jones N, Walsh C. Policy briefs as a communication tool for development research. Overseas Development Institute (ODI); 2008.

70. Schneider B, Gunnarson SK, Niles-Jolly K. Creating the climate and culture of success. Organ Dyn. 1994;23(1):17–29.

71. Whelan-Berry K, Alexander P. Creating a culture of excellent service: a scholar and practitioner explore a case of successful change. Paper presented at the Academy of Management, Honolulu, August 2005.

72. Bennett JB, Lehman WE, Forst JK. Change, transfer climate, and customer orientation: a contextual model and analysis of change-driven training. Group Org Manag. 1999;24(2):188–216.

73. Mansouri M, Lockyer J. A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007;27(1):6–15. https://doi.org/10.1002/chp.88.

74. Marinopoulos SS, Dorman T, Ratanawongsa N, Wilson LM, Ashar BH, Magaziner JL, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess. 2007;149:1–69.

75. Chambers D, Wilson PM, Thompson CA, Hanbury A, Farley K, Light K. Maximizing the impact of systematic reviews in health care decision making: a systematic scoping review of knowledge-translation resources. Milbank Q. 2011;89(1):131–56.

76. Armstrong R, Waters E, Dobbins M, Lavis JN, Petticrew M, Christensen R. Knowledge translation strategies for facilitating evidence-informed public health decision making among managers and policy-makers. Cochrane Database Syst Rev. 2011;(6):CD009181. https://doi.org/10.1002/14651858.CD009181.

77. Treweek S, Oxman AD, Alderson P, Bossuyt PM, Brandt L, Brożek J, et al. Developing and evaluating communication strategies to support informed decisions and practice based on evidence (DECIDE): protocol and preliminary results. Implement Sci. 2013;8(1):6.

78. Phillips SJ. Piloting knowledge brokers to promote integrated stroke care in Atlantic Canada. In: Evidence in action, acting on evidence; 2008. p. 57.

79. Lyons R, Warner G, Langille L, Phillips S. Evidence in action, acting on evidence: a casebook of health services and policy research knowledge translation stories. Canadian Institutes of Health Research; 2006.

80. Merlin T, Weston A, Tooher R. Extending an evidence hierarchy to include topics other than treatment: revising the Australian 'levels of evidence'. BMC Med Res Methodol. 2009;9:34. https://doi.org/10.1186/1471-2288-9-34.
