Community Child Health research group | Murdoch Childrens Research Institute
located at The Royal Children’s Hospital Centre for Community Child Health
The meaning of ‘evidence’ in the child and family services field
Myfanwy McDonald & Tim Moore
The elephant in the room: CRFAA
(the conference-related Friday afternoon affect)
Posing questions rather than providing
answers
“Questions are more
transformative than
answers… Questions
create the space for
something new to
emerge… Answers…
while satisfying, shut down
the discussion.” Peter Block
The focus of our work
Early intervention
Universal prevention
Source: Protecting Children is Everyone’s Business (2009)
Why is the question of evidence
important?
• Programs and initiatives designed to improve
outcomes for children often have only modest effects
- or don't work at all - so
• We need a better understanding of what works so
that our resources are used more effectively and
efficiently, and for this
• We need evidence to demonstrate what works, and
• We need evidence derived from rigorous, valid,
reliable research designs, but
• There are inherent challenges in this
Questions guiding this
presentation
• What are the challenges?
• Is there a new way (or ways) of
thinking about the issues pertaining
to evidence?
• Where might we go from here?
Outline
• Highlight 3 issues:
1. getting the evidence we need
2. implementation
3. ‘program-centred’ mindset
• Highlight some questions for each
• Some concluding considerations
Issue 1: Getting the evidence we need
Why are Cochrane reviews so
boring?
“Five thousand (mostly) high-quality
Cochrane reviews notwithstanding, the
troubling aspect of this enterprise is not
the few narrow questions that the
reviews answer but the many broad
ones they leave unanswered...
“... The reason why Cochrane reviews
are boring — and sometimes
unimplementable in practice — is that the
technical process of stripping away all but
the bare bones of a focused experimental
question removes what practitioners and
policymakers most need to engage with:
the messy context in which people get ill,
seek health care (or not), receive and
take treatment (or not), and change their
behaviour (or not).”
Greenhalgh, 2013
What is the real problem?
• The people doing the research
aren’t asking the right questions?
• The evidence isn’t translated in a
way that can be utilised in practice?
• We’re trying to bring order to a
phenomenon that is fundamentally
chaotic / disorganised
(i.e. the ‘messy’ context)?
Issue 2: Implementing the findings
of research
The rise of Implementation Science
• A deliberate, purposeful attempt to
introduce evidence-based interventions
into practice
• Typically involves a purveyor who
oversees the process of implementation
• Balancing fidelity with flexibility –
delivering the intervention as intended while
adapting it to the unique context
Challenges of Implementation
• In organisations with no capacity for
ongoing implementation, whose
responsibility is it to keep up to date with
the evidence?
• Practitioners?
• Managers?
• Organisations?
• Where does the boundary between
fidelity and flexibility lie? Who makes the
decision?
Alternatives to Implementation Science
• Continuous Quality Improvement (CQI):
• Practitioners are engaged at the point a
problem is identified
• Working through the problem together
• Using ‘localised’ data
• Rapid implementation of a practice change
• Reviewing outcomes
• Adjusting as required
So what?
• Is one approach superior?
• Challenges with CQI, e.g.:
• may encourage a focus on small practice
issues rather than child and family outcomes
• Is one more suited to the ‘messy’ context of
practice?
• Which one should be used when?
Issue 3: The ‘program-centred’
mindset
The ‘program-centred’ mindset
• Focusing on programs as the ‘answer’ to poor
outcomes amongst children and families
• A program-centred mindset as opposed to a
focus on:
- Process: how programs are delivered (rather
than what program is delivered)
- Surrounding contextual factors (e.g. service
system structure, community environments,
government policies)
Why do we like programs?
• Programs are easier to evaluate using gold
standard methodologies
• The evidence therefore is easy to interpret
and compare
• We don’t have the resources / time / capacity
to focus on the bigger issues (e.g. government
policies that impact negatively upon families)
• It’s not our role to focus on the bigger issues
• Touching upon the bigger issues is risky –
who will we get ‘offside’?
Questions
• Could it be that programs typically have moderate
effects because of the impact of surrounding contextual
factors?
• If we continue to focus on programs rather than
processes and broader contextual factors, will they only
ever have moderate effects?
• Whose role is it to address the broader contextual
issues that impact upon children and families? Child and
family services? NGOs? Policy-makers? Government?
Advocacy groups?
• How much is the nature of the evidence (and our views
about ‘gold standard’ evidence) driving our interest in
programs?
Where might we go from here?
For research synthesis projects
• A ‘realist approach’ to research synthesis:
• systematic reviews of RCTs, and
• a broad-based review of research, theory and
practice-based evidence from a range of
different disciplines
• Doesn’t discount the importance of gold
standard evidence, but also takes into
account a broader range of ‘evidence’
Questions for consideration
• How else can we make research more
relevant to practitioners and policy-makers?
How do we bridge the gap between science and
the ‘messy context’?
• What role should practitioners play in:
• identifying the problems?
• coming up with ‘localised’ solutions?
• How do we keep processes and broader
contextual factors ‘on the table’? Whose
responsibility is that?
Contact details
Myfanwy McDonald
Senior Project Officer
The Royal Children’s Hospital Centre for Community
Child Health
P: (03) 9345 4463
For CCCH Research & Policy papers see:
http://www.rch.org.au/ccch/resources_and_publications/