Introduction to Evaluation
Odette Parry & Sally-Ann
[email protected] [email protected]
Aim and objectives of our presentation
Define evaluation and examine how it differs from research
Briefly introduce different evaluation designs and data collection approaches: strengths and limitations
Developing evaluation plans
Governance and ethical issues
What is evaluation?
Evaluation is “a set of procedures to judge a service’s merit by providing a systematic assessment of its aims, objectives, activities, outputs, outcomes and costs.” (NHS Executive, 1997)
Evaluation is learning about ‘what works’ and lessons for future development
Evaluation determines how a service is doing
So why evaluate? Evaluation helps answer:
- Is what we are doing working?
- What are the benefits and impacts?
- What was successful and what not? And why?
- Have objectives been, or are they being, achieved?
So why evaluate?
Provides evidence for:
- Stakeholders
- Further programme development
- Staff development
- Other organisations
- Funders
What do you want to find out?
What is happening and how often?
How is it happening and why is it happening as it is?
Approaches
The ‘What and How Much’ question in evaluation addresses measurable ‘OUTCOMES’, associated with quantitative approaches (e.g. RCTs and surveys)
The ‘How and Why’ question addresses PROCESS, associated with Qualitative approaches (e.g. semi-structured interviews, observation and focus groups)
Horses for Courses
Evaluation usually requires both Outcome & Process information
Evaluation may be formative and/or summative
Small scale evaluations, while most often qualitative, do collect some quantifiable data
Plan your evaluation
• Evaluation is not a ‘bolt on’
• Evaluation is key to informing project development and delivery
• Plan early
• Involve funders and stakeholders in the planning process
Define your purpose
Clearly define the purpose of the evaluation
What are the main aims and objectives?
This is a key step in the planning process and will guide how the findings will be used
Develop an evaluation plan
• What is being evaluated
• Purpose of the evaluation
• What is known already
• The questions the evaluation will address
• Evaluation method to be used
• Who and where the evaluation will take place
• Timescales
• The resources available
Be realistic about what can be achieved
• Often evaluation is a compromise between the ideal and the achievable:
- the wants of different groups
- constraints of methodology
- evaluator skills
- resource limitations
- time limitations
- ethical and governance issues
Different ways to collect data
Decide which methods to use in order to get the information you need. You may use one or more of the following:
- Existing data
- Document analysis
- Interviews
- Self-completed questionnaires
- Observation
- Focus groups
Using existing data
Use routinely collected data to examine process and outcomes
Can save time, but you need to ensure that the data are collected in a form that can be analysed and are consistent with the evaluation plan
The evaluation of the All Wales Dietetics Scheme used minimum data sets, developed in conjunction with WAG and project dieticians to ensure consistency with the evaluation aims
Collected data on project activity, e.g. courses run, start and completion dates, description of activities, participant details, course outcomes, partnerships built, etc.
Document Analysis
Policy documents, minutes, operational policies
- Track project development and the aims of the project
Whilst documents can provide rich information, they may only provide part of the picture and may be open to subjective interpretation
Observation
Observing participants in activities
- Participant vs non-participant
- Observer effects
- Not suitable for some settings
- Can be difficult to collect & record data
Interviews
Enables in-depth exploration of how people think & feel about certain topics, the effectiveness of your intervention, etc.
- Rich data, allows in-depth understanding
- Can explore more sensitive areas
- Can tailor to the individual
- No group influence
- Resource intensive
Focus groups
Investigation of how groups perceive topics and view the effectiveness of your intervention
- Clarify issues identified in surveys
- Provide solutions to problems
- Less resource intensive than one-to-one interviews
- Group management issues
- Group process issues
Questionnaires
Identification of patterns and trends
Investigate needs, expectations, perspectives, preferences, satisfaction, knowledge
- Large samples, relatively lower cost
- Ease of analysis
- Low response rates
- Respondent bias
- Language and literacy issues
Analysing your data
- Avoid bias: involve more than one person in the task
- Address your key questions
- Combine data types & findings from different sources
- Compare views of different groups
- Don’t anticipate results; look for unexpected findings
Evidence of impact
Focus on whether:
the purpose has been achieved
the needs of those who take part have been met
there are unintended outcomes arising from the intervention
the intervention has resulted in changes in behaviour
there are barriers to and facilitators of successful implementation
Using your findings
- Inform strategy and policy development
- Inform budgets
- Inform funding proposals
- Inform improvement plans & make changes
- Identification of training needs
- Identification of areas for future research/evaluation
Presenting evaluation findings
Present a balanced view; don’t just report the positive
Consider the needs of your audience: different versions of reports or types of presentation might be needed
Reflect on the evaluation process
- Were aims and objectives met? If not, why not?
- Were the methods employed appropriate? (tools, recruitment methods, analysis, etc.)
- Did you reach target groups?
- Were resources sufficient?
- What changes resulted?
- Future impacts?
Governance Issues
Be careful about crossing over from evaluation into the realms of research, which carries additional connotations for ethics approval, etc.
Take advice
Take an ethical/professional approach to the evaluation; consider the participant and yourself
Ethics and the rights of participants
Requirements of the Data Protection Act (1998)
Some evaluations may require ethical approval
Issues to consider:
- Informed consent
- Deception
- Debrief
- Withdrawal
- Confidentiality
Data Protection Act 1998 – your responsibility
- Only hold genuinely required information
- Certain types of sensitive information carry added restrictions and may only be used with permission
- Information should only be collected for the purposes for which it was initially required
- Data must only be processed in accordance with the legislation
- People must not be harmed by how their information is used
- People have a right to view information being held about them
- Information must be accurate, kept up to date, kept secure and deleted when obsolete
In summary
- Evaluation is an essential component
- Need to plan
- Variety of approaches
- Methods used depend on the questions to be asked
- Consideration of governance issues
Resources
Data Protection Act: see the Information Commissioner’s pages http://www.ico.gov.uk/what_we_cover/data_protection.aspx
National Research Ethics Service: if your evaluation takes place on NHS premises or with staff or patients, ethical approval might be needed; check with your local R&D office and LREC administrator http://www.nres.npsa.nhs.uk/applications/apply/
UK Evaluation Society has a useful resource page http://www.evaluation.org.uk/resources/online-resources.aspx