Evidence-Based Evaluation for Victim Service Providers
Anne P. DePrince, Ph.D.
Department of Psychology
Center for Community Engagement and Service Learning
Game Plan: Guiding Questions
1:45-2:45: Goals for gathering evidence; What to measure?
2:55-3:15: Selecting measures (Part 1)
August: Selecting measures (Part 2)
Who to measure? When to measure? Costs (to respondents) of measuring?
Perspectives I bring
Researcher, TSS Group Director, Center for Community Engagement
and Service Learning (CCESL)
Big Picture
- Evidence-based evaluation never happens in a vacuum.
- Evidence goals should be tied directly to strategic planning/program goals.
- Don't let evidence-based evaluation end up in a black hole: think about uses of the data on the front end.
CCESL Sample Goals from Strategic Planning
Monitor (targets):
- New instructors trained: 18
- Advanced practitioners trained: 20

Measure:
- Increase knowledge of service learning pedagogy/practice
- Etc.

Actual counts (#):
- Faculty/staff who participated in trainings for new service learning practitioners: 19
- Faculty/staff who participated in trainings for advanced service learning practitioners: 21
- Continuing Faculty Mini-Grants: 6
- New Faculty Learning Pods: 5
- New Mini-Grants: 8
Goals for gathering evidence
Program evaluation: why do we do what we do, and does it work?
- Do your programs/services/interventions work?
- Funders care about this too…
Goals for gathering evidence
Building evidence-based evaluation capacity helps your agency as a research consumer: understanding measurement issues helps you evaluate the evidence for various practices.
Goals for gathering evidence
Descriptive:
- What new trends need to be addressed?
- Have we accurately characterized the problem?
Goals for gathering evidence
Generative and collaborative: building programs based on theory and data.
What to measure?
What to measure
Monitoring/Process:
- What (how much) did clients receive?
- How satisfied were clients?

Measuring/Outcome:
- What changes occurred because of your program? (knowledge, attitude, skill, behavior, expectation, emotion, life circumstance)
What to measure
Monitoring/Process:
- # clients served
- # calls on crisis line
- # prevention programs delivered
- # clients satisfied with services

Measuring/Outcome:
- Increase in knowledge about safety planning
- Increase in positive perceptions of the criminal justice process
- Increase in engagement with the criminal justice process
General Issues in Measuring
Building blocks to get to your specific evidence-based evaluation question
You start with a theory of change. Example: based on Theory A, we believe that increases in Victim Safety will lead to lower psychological distress and greater engagement with the criminal justice (c.j.) system.
From theory, you identify constructs important to your agency/program
A construct is a variable, not directly observable, that has been identified to explain behavior on the basis of some theory (e.g., Victim Safety in the Theory A statement above).
Construct Example:
Victim Safety
Change at Your Agency
- Pick one program within your agency.
- What is the theory of change that underlies that program? What are you trying to change? What factors lead to the change you want?
- Identify 3-4 of the MOST relevant constructs tied to this program's theory of change.
Back to that construct
What on earth is "Victim Safety"?
Measuring the weight of smoke…
Defining terms in evaluation
Variable: any characteristic or quantity that can take on one of several values.
Different kinds of variables
What comes first: the predictor (independent) variable.
- Program: Program A versus B; Program A versus no program; # sessions.
- Restraining order: yes/no.

What comes next: the outcome (dependent) variable.
- Aggressive behavior: low/high.
- Parenting skills: ineffective to effective.
Different kinds of variables
Confound: any uncontrolled extraneous (or third) variable that changes with your predictor and could provide an alternative explanation of the results.

Example: an earlier SANE exam appears to lead to better mental health, but earlier victim advocacy tends to go along with an earlier exam and could be the real explanation.
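The SANE exam example can be made concrete with a small simulation. This is purely illustrative: the variable names and effect sizes are invented, and in the simulated data mental health depends only on advocacy timing, yet exam timing still correlates with the outcome because both track the confound.

```python
import random
import statistics

random.seed(0)
n = 2000

# Hidden third variable: how early victim advocacy began (standardized units)
advocacy = [random.gauss(0, 1) for _ in range(n)]

# Earlier advocacy tends to come with an earlier SANE exam
exam_delay = [a + random.gauss(0, 1) for a in advocacy]

# Mental health here depends ONLY on advocacy timing, not the exam at all
mental_health = [-a + random.gauss(0, 1) for a in advocacy]

def pearson_r(x, y):
    """Pearson correlation from first principles (no external libraries)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sx, sy = statistics.pstdev(x), statistics.pstdev(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) * sx * sy)

r = pearson_r(exam_delay, mental_health)
print(f"r(exam delay, mental health) = {r:.2f}")  # substantial, despite no causal link
```

The sizeable correlation would tempt you to credit the exam; only measuring the confound (advocacy timing) reveals the alternative explanation.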
Operationalize variables
Operational definitions make your variable specific and limiting: get to something we can observe.
Example: there is a big difference between saying "I want to treat trauma" and "I want to decrease children's hyper-vigilance and avoidance."
Different kinds of measurements…give you different information
Nominal
- Values differ by category; there is no ordering.
- Can't calculate an average.
- a.k.a. qualitative, dichotomous, discrete, categorical.
- Least information.
- Examples: sex, race, ethnicity; anything answered as yes/no or present/absent.
Scales of Measurement
Ordinal
- Values have different names and are ranked according to quantity (e.g., Olympic medals).
- Example: divide people into low, moderate, and high service needs.
- You don't know the exact distance between two values on an ordinal scale; you just know that high is higher than moderate, etc.
Scales of Measurement
Interval and Ratio
- Spacing between values is known, so you know not only that one value is larger or smaller, but by how much.
- Examples: scores on a measure of PTSD symptoms or on a test of knowledge of safety planning; number of calls for service; number of revictimizations.
How to decide on a measurement scale
- Choice of scale affects the amount and kind of information you get.
- Generally, the less information you get, the less powerful the statistics you can use.
- Interval and ratio scales provide the most information, but you can't always use them (e.g., sex).
- GUIDING RULE: when you can, always go for more information (interval/ratio).
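As a sketch of how the scale constrains the statistics you can report (the example data are invented), using only the Python standard library:

```python
from collections import Counter
import statistics

# Nominal: categories only; counts and the mode are meaningful, a mean is not
referral_source = ["police", "hotline", "hospital", "police", "police"]
mode = Counter(referral_source).most_common(1)[0][0]

# Ordinal: ranked categories; the median is meaningful, exact distances are not
need_level = [1, 1, 2, 3, 3]  # 1 = low, 2 = moderate, 3 = high service need
median_need = statistics.median(need_level)

# Interval/ratio: distances are known; means and standard deviations apply
ptsd_scores = [12, 18, 25, 9, 16]
mean_score = statistics.fmean(ptsd_scores)

print(mode, median_need, mean_score)  # police 2 16.0
```

Note the asymmetry: anything you can compute on a nominal variable you can also compute on an interval one, but not the reverse, which is the guiding rule in code form.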
At your agency
How are you currently measuring the 3-4 constructs you identified? What kind of measurement scale?
Relevant Measure Websites
- Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools, Second Edition: http://www.cdc.gov/ncipc/pub-res/measure.htm
- Measuring Intimate Partner Violence Victimization and Perpetration: A Compendium of Assessment Tools: http://www.cdc.gov/ncipc/dvp/Compendium/Measuring_IPV_Victimization_and_Perpetration.htm
- http://mailer.fsu.edu/~cfigley/Tests/Tests.html
- http://vinst.umdnj.edu/VAID/browse.asp
What to look for in a good measure?
Validity and Reliability
What is Validity?
"Truth" (Bryant, 2000): the degree to which our inference or conclusion is accurate, reasonable, and correct.
Examples of Types of Measurement Validity:Face Validity
The measure seems valid "on its face"; a judgment call. Probably the weakest form of measurement validity.
Example: a measure of anxiety includes items that are clearly about anxiety.
Examples of Types of Measurement Validity: Construct Validity
Extent to which an instrument measures the targeted construct (Haynes, Richard & Kubany, 1995).
Construct Validity
Like law: the truth, the whole truth, and nothing but the truth. Here: the construct, the whole construct, and nothing but the construct.
Trochim, 2001
Construct Validity Goal
[Diagram: the target construct surrounded by other constructs A, B, C, and D]
Measure all of the construct and nothing else.
Trochim, 2001
Construct Validity Goal
[Diagram: social anxiety as the target construct, surrounded by depression, guilt, generalized anxiety, and self-esteem]
Measure all of the construct and nothing else (though, in reality, the construct is related to all four).
Reliability
Reliability: you are measuring the construct with little error (versus accuracy; versus validity).
- Something can be reliable but not valid.
- But if something is not reliable, it cannot be valid, because then your measure is only measuring random variability.
How can we improve reliability? By reducing error:
- Standardization: when measurement conditions are standardized, sources of variance become constants and therefore do not influence the variability of scores (Strube, 2000).
- Aggregation: with more items, error might cancel itself out; the error on the first item might be positive and on the second negative.
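A quick simulation (with invented numbers) shows why aggregation helps: averaging ten noisy items shrinks the spread of observed scores by roughly the square root of ten, because opposite-signed item errors cancel.

```python
import random
import statistics

random.seed(1)
TRUE_SCORE = 10.0

def observed_score(n_items):
    # Each item = true score + independent random error; the scale score
    # is the average across items, so opposite-signed errors cancel.
    return statistics.fmean(TRUE_SCORE + random.gauss(0, 2) for _ in range(n_items))

one_item = [observed_score(1) for _ in range(2000)]
ten_items = [observed_score(10) for _ in range(2000)]

print(round(statistics.pstdev(one_item), 2))   # spread of a 1-item measure
print(round(statistics.pstdev(ten_items), 2))  # roughly 1/sqrt(10) as wide
```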
Random Error
[Figure: frequency distributions of X with no random error versus with random error; random error spreads the distribution but does not shift its center]
Trochim, 2001
Systematic Error
Any factor that systematically affects measurement of the variable across the sample. Systematic error = bias. E.g., questions that start "do you agree with right-wing fascists that..." will tend to yield a systematically lower agreement rate. Unlike random error, systematic error does affect average performance for the group.
Trochim, 2001
Systematic Error
[Figure: frequency distributions of X with no systematic error versus with systematic error; the entire distribution shifts]
Notice that systematic error affects the average; this is called a bias.
Trochim, 2001
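The contrast between the two figures can be reproduced in a few lines (values are illustrative): random error widens the distribution of scores but leaves the average at the true value, while systematic error shifts the average itself.

```python
import random
import statistics

random.seed(2)
TRUE_VALUE = 50.0
n = 5000

# Random error only: noisy, but centered on the true value
random_only = [TRUE_VALUE + random.gauss(0, 5) for _ in range(n)]

# Systematic error: the same noise plus a constant bias of +3
biased = [TRUE_VALUE + 3.0 + random.gauss(0, 5) for _ in range(n)]

print(round(statistics.fmean(random_only), 1))  # stays near 50
print(round(statistics.fmean(biased), 1))       # shifted near 53
```

This is why aggregation (more items, more respondents) cures random error but not bias: no amount of averaging removes a constant shift.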
This validity and reliability business is why evaluators make you crazy with…
Worries that you’ve made up your own evaluation instrument
Requests to standardize how evaluations are implemented (a new intern administers an interview vs. a seasoned staff member)
When you can…
- Use existing measures that have some evidence of reliability and validity (and then brag that you are doing so).
- Standardize assessment procedures.
- Be thoughtful about the number of respondents.
Selecting Measures
Self-report surveys/questionnaires
The importance of how you ask what you ask…
Examples:
- Exit polls, 2004
- Open-ended versus structured questions
- "Obamacare" versus "Affordable Care Act," 2013
Wording matters…a lot!
Writing Surveys
- Types of questions
- Decisions about question content
- Decisions about question wording
- Decisions about response form
- Placement and sequence of questions
Keeping in mind: Bias
- Can social desirability be avoided?
- Can interviewer distortion and subversion be controlled?
- Can false respondents be avoided?
Types of Questions
- Unstructured
- Structured
Structured Questions
Dichotomous:
Male / Female
Structured Questions
Nominal:
Occupation: 1 = College Student; 2 = Lawyer; 3 = Veterinary Tech
Structured Questions
Ordinal:
Rank the most helpful things that victim advocates did in your case (from most to least helpful)...
___ Called you at home
___ Went with you to court
___ Went with you to a medical or counseling appointment
___ Referred you for additional services elsewhere
Filter or Contingency Questions
Have you ever smoked cigarettes?
yes / no
If yes, about how many times have you smoked cigarettes?
once / 2 to 5 times / 6 to 10 times / 11 to 20 times / more than 20 times
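The skip logic in a filter/contingency question is just a branch. A minimal sketch, using the smoking example above (the function name and return convention are invented for illustration):

```python
# The follow-up question applies only when the filter answer is "yes".
FOLLOW_UP = "About how many times have you smoked cigarettes?"
FOLLOW_UP_CHOICES = ["once", "2 to 5 times", "6 to 10 times",
                     "11 to 20 times", "more than 20 times"]

def next_question(ever_smoked: str):
    """Return the contingency question, or None to skip it entirely."""
    if ever_smoked == "yes":
        return FOLLOW_UP
    return None

print(next_question("yes"))
print(next_question("no"))
```

Drawing the same logic as a flowchart before fielding the survey (as suggested below under placement issues) catches dead ends and questions respondents should never see.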
Question Content
Is the question necessary/useful?
- Do you need the age of each child, or just the number of children under 16?
- Do you need to ask income, or can you estimate it?
Question Content
Do you need several questions? Always try to avoid combining 2+ questions into 1:
- What are your feelings towards victims and offenders of IPV?
- What do you think of proposed changes in benefits and documentation requirements?
Question Content
Did you get all the desired information?
- E.g., "What were your household earnings last year?" Will participants mention all income (e.g., child support)?
Question Content
Will you have enough information to interpret people's responses?
- If you ask about attitudes towards police, can you interpret this without finding out whether they reported the crime to the police?
Question Content
Do respondents have the information they need?
- "Do you think that the police should have direct-filed the charges in your case?" Respondents can't say if they don't know what that means.
Question Content
Is the question specific enough?
- "How well did you like the treatment you got?" versus "Did you recommend the treatment to others?" or "Did you come back for additional services?"
Question Placement
- Is the answer influenced by prior questions?
- Does the question come too early or too late to arouse interest?
- Does the question receive sufficient attention?
Opening Questions
- Should be easy to answer
- Should not be sensitive material
- Should get the respondent "rolling"
“Sensitive” Questions
- Ask only after trust is developed.
- Should make sense in that section of the survey (not "out of left field").
- Precede with warm-up questions.
Placement Issues to Consider
- Start with easy, non-threatening questions.
- Put more difficult, threatening questions near the end.
- Don't start a mail survey with an open-ended question.
- Put demographics at the end (unless needed to screen).
Placement Issues to Consider
- Ask about one topic at a time.
- When switching topics, use a transition.
- Reduce response set.
- For filter or contingency questions, make a flowchart.
- Keep as short as possible!
Question Wording
Can the question be misunderstood?
- "Did you talk with your victim advocate?" A system-based or community-based advocate? "Talk" may or may not mean used resources, etc.
Question Wording
What assumptions does your question make?
- If you ask what social class someone is in, you assume that they know what social class is and that they think of themselves as being in one.
- Check the assumption in a previous question.
Question Wording
Do you need a timeframe? Do you think you will need additional
services? When? In next week, 6 months, year?
Question Wording
- Does the question contain difficult or unclear terminology?
- Does the question make each alternative explicit?
- Is the wording objectionable?
- Is the wording loaded or slanted?