Evidence-Based Evaluation for Victim Service Providers
Anne P. DePrince, Ph.D., Department of Psychology, Center for Community Engagement and Service Learning

Page 1: Evidence-Based Evaluation for Victim Service Providers
Anne P. DePrince, Ph.D., Department of Psychology
Center for Community Engagement and Service Learning

Page 2: Game Plan: Guiding Questions
1:45-2:45   Goals for gathering evidence; what to measure?
2:55-3:15   Selecting measures (Part 1)
August      Selecting measures (Part 2): Who to measure? When to measure? Costs (to respondents) of measuring?

Page 3: Perspectives I bring
Researcher, TSS Group
Director, Center for Community Engagement and Service Learning (CCESL)

Page 4: Big Picture
Evidence-based evaluation never happens in a vacuum.
Evidence goals should be directly tied to strategic planning/program goals.
Don't let evidence-based evaluation end up in a black hole: think about uses of the data on the front end.

Page 5: CCESL Sample Goals from Strategic Planning
Monitor: new instructors trained (goal: 18); advanced practitioners trained (goal: 20); etc.
Measure: increase knowledge of service learning pedagogy/practice; etc.

Page 6:
Faculty/staff who participated in trainings for new service learning practitioners: 19
Faculty/staff who participated in trainings for advanced service learning practitioners: 21
Continuing Faculty Mini-Grants: 6
New Faculty Learning Pods: 5
New Mini-Grants: 8

Page 7: Goals for gathering evidence

Page 8: Goals for gathering evidence
Program evaluation: Why do we do what we do, and does it work? Do your programs/services/interventions work? Funders care about this too…

Page 9: Goals for gathering evidence
Building evidence-based evaluation capacity helps your agency as a research consumer: understanding measurement issues helps you evaluate the evidence for various practices.

Page 10: Goals for gathering evidence
Descriptive: What new trends need to be addressed? Have we accurately characterized the problem?

Page 11: Goals for gathering evidence
Generative and collaborative: building programs based on theory and data.

Page 12: What to measure?

Page 13: What to measure
Monitoring/Process: What (and how much) did clients receive? How satisfied were clients?
Measuring/Outcome: What changes occurred because of your program? Changes can be in knowledge, attitude, skill, behavior, expectation, emotion, or life circumstance.

Page 14: What to measure
Monitoring/Process: # clients served; # calls on crisis line; # prevention programs delivered; # clients satisfied with services.
Measuring/Outcome: increase in knowledge about safety planning; increase in positive perceptions of the criminal justice process; increase in engagement with the criminal justice process. (A small worked example follows.)
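To make the monitoring/outcome distinction concrete, here is a minimal Python sketch; the client records and field names are invented for illustration:

    # Hypothetical client records: process data plus pre/post outcome scores.
    clients = [
        {"id": 1, "sessions": 4, "pre_knowledge": 2, "post_knowledge": 5},
        {"id": 2, "sessions": 6, "pre_knowledge": 3, "post_knowledge": 4},
        {"id": 3, "sessions": 2, "pre_knowledge": 1, "post_knowledge": 4},
    ]

    # Monitoring/Process: what (and how much) did clients receive?
    clients_served = len(clients)
    total_sessions = sum(c["sessions"] for c in clients)

    # Measuring/Outcome: what changed because of the program?
    # Here: change in safety-planning knowledge from intake to exit.
    changes = [c["post_knowledge"] - c["pre_knowledge"] for c in clients]
    mean_change = sum(changes) / len(changes)

    print(f"Clients served: {clients_served}; sessions delivered: {total_sessions}")
    print(f"Mean knowledge change: {mean_change:+.1f} points")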

Page 15: General Issues in Measuring
Building blocks to get to your specific evidence-based evaluation question

Page 16: You start with a theory of change
Based on Theory A, we believe that increases in victim safety will lead to lower psychological distress and greater engagement with the criminal justice (c.j.) system.

Page 17: From theory, you identify constructs important to your agency/program
A construct is a variable, not directly observable, that has been identified to explain behavior on the basis of some theory.
Based on Theory A, we believe that increases in victim safety will lead to lower psychological distress and greater engagement with the c.j. system.
Construct example: victim safety.

Page 18: Change at Your Agency
Pick one program within your agency. What is the theory of change that underlies that program? What are you trying to change? What factors lead to the change you want? Identify 3-4 of the MOST relevant constructs tied to this program's theory of change.

Page 19: Back to that construct
What on earth is "victim safety"?

Page 20: Measuring the weight of smoke
Victim safety

Page 21: Defining terms in evaluation
Variable: any characteristic or quantity that can take on one of several values.

Page 22: Different kinds of variables
What comes first: the predictor (independent) variable. Examples: program (Program A versus B; Program A versus no program); # of sessions; restraining order (yes/no).
What comes next: the outcome (dependent) variable. Examples: aggressive behavior (low to high); parenting skills (ineffective to effective).

Page 23: Different kinds of variables
Confound: any uncontrolled extraneous (or third) variable that changes with your predictor and could provide an alternative explanation of the results.
Example: an earlier SANE exam appears to predict better mental health, but earlier victim advocacy tends to come along with an earlier exam and could explain the result instead. (See the simulation below.)
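That confounding pattern is easy to simulate. In this hedged Python sketch (all numbers invented), advocacy timing drives both exam timing and mental health, so the exam looks predictive even though it has no effect of its own:

    import random

    # Toy model: early advocacy makes an early SANE exam more likely AND
    # improves mental health; the exam itself has no effect here.
    random.seed(1)
    cases = []
    for _ in range(1000):
        early_advocacy = random.random() < 0.5
        early_exam = random.random() < (0.8 if early_advocacy else 0.2)
        mental_health = 50 + (10 if early_advocacy else 0) + random.gauss(0, 5)
        cases.append((early_exam, mental_health))

    def mean(xs):
        return sum(xs) / len(xs)

    early = [mh for exam, mh in cases if exam]
    late = [mh for exam, mh in cases if not exam]
    # Early-exam cases look healthier only because advocacy timing
    # changes along with exam timing (the confound).
    print(f"Mean mental health, early exam: {mean(early):.1f}")
    print(f"Mean mental health, late exam:  {mean(late):.1f}")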

Page 24: Operationalize variables
Operational definitions make your variable specific and limited: get to something we can observe.
Example: there is a big difference between saying "I want to treat trauma" and "I want to decrease children's hyper-vigilance and avoidance."

Page 25: Different kinds of measurements give you different information
Nominal: values differ by category; there is no ordering, and you can't calculate an average. Also known as qualitative, dichotomous, discrete, or categorical. Provides the least information.
Examples: sex, race, ethnicity; anything answered as yes/no or present/absent.

Page 26: Scales of Measurement
Ordinal: values have different names and are ranked according to quantity (e.g., Olympic medals).
Example: divide people into low, moderate, and high service needs.
** You don't know the exact distance between two values on an ordinal scale; you just know that high is higher than medium, etc.

Page 27: Scales of Measurement
Interval and ratio: the spacing between values is known (so you know not only that one value is larger or smaller, but by how much).
Examples: scores on a measure of PTSD symptoms or a test of knowledge of safety planning; number of calls for service; number of revictimizations.

Page 28: How to decide on a measurement scale
Choice of scale affects the amount and kind of information you get, and generally the less information you get, the less powerful the statistics you can use. Interval and ratio scales provide the most information, but you can't always use them (e.g., sex).
GUIDING RULE: when you can, go for more information (interval/ratio). The sketch below shows what each scale supports.
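One way to see why scale type matters: the summary statistics you can defensibly compute differ by scale. A minimal Python sketch with invented data:

    from statistics import mode, median, mean

    # Hypothetical responses on three scales of measurement.
    ethnicity = ["A", "B", "A", "C", "A"]      # nominal: categories only
    service_need = [1, 3, 2, 3, 1]             # ordinal: 1=low ... 3=high
    ptsd_score = [12, 25, 18, 30, 22]          # interval/ratio

    # Nominal supports only counts/mode; an "average category" is meaningless.
    print("Most common category:", mode(ethnicity))

    # Ordinal supports medians and ranks, but distances between values are unknown.
    print("Median service need:", median(service_need))

    # Interval/ratio supports means and differences: the most information.
    print("Mean PTSD score:", mean(ptsd_score))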

Page 29: At your agency
How are you currently measuring the 3-4 constructs you identified? What kind of measurement scale are you using?

Page 30: Relevant Measure Websites
Measuring Violence-Related Attitudes, Behaviors, and Influences Among Youths: A Compendium of Assessment Tools, Second Edition: http://www.cdc.gov/ncipc/pub-res/measure.htm
Measuring Intimate Partner Violence Victimization and Perpetration: A Compendium of Assessment Tools: http://www.cdc.gov/ncipc/dvp/Compendium/Measuring_IPV_Victimization_and_Perpetration.htm
http://mailer.fsu.edu/~cfigley/Tests/Tests.html
http://vinst.umdnj.edu/VAID/browse.asp

Page 31: What to look for in a good measure?
Validity and reliability

Page 32: What is Validity?
"Truth" (Bryant, 2000): the degree to which our inference or conclusion is accurate, reasonable, and correct.

Page 33: Examples of Types of Measurement Validity: Face Validity
The measure seems valid "on its face." A judgment call, and probably the weakest form of measurement validity.
Example: a measure of anxiety includes items that are clearly about anxiety.

Page 34: Examples of Types of Measurement Validity: Construct Validity
The extent to which an instrument measures the targeted construct (Haynes, Richard, & Kubany, 1995).

Page 35: Construct Validity
Like law: the truth, the whole truth, and nothing but the truth. In measurement: the construct, the whole construct, and nothing but the construct. (Trochim, 2001)

Page 36: Construct Validity Goal
[Diagram: the target construct at the center, surrounded by other constructs A, B, C, and D.]
Measure all of the construct and nothing else. (Trochim, 2001)

Page 37: Construct Validity Goal
[Diagram: social anxiety as the target construct, surrounded by depression, guilt, generalized anxiety, and self-esteem.]
Measure all of the construct and nothing else, though, in reality, the construct is related to all four of the others.

Page 38: Reliability
Reliability means you are measuring the construct with little error (distinct from accuracy, and distinct from validity).
Something can be reliable but not valid. But if something is not reliable, it cannot be valid, because then your measure is only measuring random variability.

Page 39: How can we improve reliability?
By reducing error.
Standardization: when measurement conditions are standardized, sources of variance become constants and therefore do not influence the variability of scores (Strube, 2000).
Aggregation: with more items, error might cancel itself out; the error on the first item might be positive and on the second negative. (See the sketch below.)
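A quick Python illustration of the aggregation point (the true score and error sizes are invented): averaging more noisy items leaves the observed score closer to the true score, i.e., more reliable:

    import random
    from statistics import pstdev

    # Toy model: item score = true score + random error.
    random.seed(0)
    TRUE_SCORE = 10.0

    def observed_score(n_items: int) -> float:
        # Average of n noisy items; errors tend to cancel as n grows.
        items = [TRUE_SCORE + random.gauss(0, 2) for _ in range(n_items)]
        return sum(items) / n_items

    for n in (1, 5, 20):
        scores = [observed_score(n) for _ in range(2000)]
        # The spread of observed scores around the true score shrinks with n.
        print(f"{n:2d} item(s): SD of observed scores = {pstdev(scores):.2f}")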

Page 40: Random Error
[Figure: frequency distributions of X with and without random error. Random error spreads the distribution out but does not shift its center. (Trochim, 2001)]

Page 41: Systematic Error
Any factor that systematically affects measurement of the variable across the sample. Systematic error = bias.
E.g., questions that start "do you agree with right-wing fascists that..." will tend to yield a systematically lower agreement rate.
Systematic error does affect average performance for the group. (Trochim, 2001)

Page 42: Systematic Error
[Figure: frequency distributions of X with and without systematic error. Notice that systematic error shifts the average; this is called a bias. (Trochim, 2001)]
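Both figures can be reproduced in a few lines of Python (a sketch with invented numbers): random error spreads scores out without moving the mean, while systematic error shifts the whole distribution:

    import random
    from statistics import mean, pstdev

    random.seed(42)
    TRUE_X = 100.0
    N = 5000

    # Random error averages to zero: the mean survives, the spread grows.
    with_random = [TRUE_X + random.gauss(0, 10) for _ in range(N)]
    # Systematic error adds a constant bias (here -8) to every score.
    with_systematic = [TRUE_X - 8 + random.gauss(0, 10) for _ in range(N)]

    print(f"Random error:     mean={mean(with_random):.1f}, sd={pstdev(with_random):.1f}")
    print(f"Systematic error: mean={mean(with_systematic):.1f}, sd={pstdev(with_systematic):.1f}")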

Page 43: This validity and reliability business is why evaluators make you crazy with…
Worries that you've made up your own evaluation instrument.
Requests to standardize how evaluations are implemented (e.g., whether a new intern or a seasoned staff member administers an interview).

Page 44: When you can…
Use existing measures that have some evidence of reliability and validity (and then brag that you are doing so).
Standardize assessment procedures.
Be thoughtful about the number of respondents.

Page 45: Selecting Measures

Page 46: Self-report surveys/questionnaires
How you ask what you ask matters. Examples: the 2004 exit polls (open-ended versus structured questions); 2013 polling on "Obamacare" versus the "Affordable Care Act."
Wording matters… a lot!

Page 47: Writing Surveys
Types of questions; decisions about question content; decisions about question wording; decisions about response form; placement and sequence of questions.

Page 48: Keeping in mind: Bias
Can social desirability be avoided? Can interviewer distortion and subversion be controlled? Can false respondents be avoided?

Page 49: Types of Questions
Unstructured versus structured

Page 50: Structured Questions
Dichotomous: Male / Female

Page 51: Structured Questions
Nominal: Occupation: 1 = College Student, 2 = Lawyer, 3 = Veterinary Tech

Page 52: Structured Questions
Ordinal: Rank the most helpful things that victim advocates did in your case (from most to least helpful)...
___ Called you at home
___ Went with you to court
___ Went with you to a medical or counseling appointment
___ Referred you for additional services elsewhere

Page 53: Filter or Contingency Questions
Have you ever smoked cigarettes? (yes / no)
If yes, about how many times have you smoked cigarettes? (once / 2 to 5 times / 6 to 10 times / 11 to 20 times / more than 20 times)
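In a computer-administered survey, a filter question is just a branch. A minimal Python sketch (the question text mirrors the slide; the ask() helper is hypothetical) that poses the follow-up only when the filter is answered "yes":

    def ask(prompt: str, options: list[str]) -> str:
        # Re-prompt until the respondent picks one of the allowed options.
        answer = ""
        while answer not in options:
            answer = input(f"{prompt} {options}: ").strip().lower()
        return answer

    # The filter question gates the contingency question.
    if ask("Have you ever smoked cigarettes?", ["yes", "no"]) == "yes":
        ask(
            "About how many times have you smoked cigarettes?",
            ["once", "2-5", "6-10", "11-20", "20+"],
        )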

Page 54: Question Content
Is the question necessary/useful? Do you need the age of each child, or just the number of children under 16? Do you need to ask income, or can you estimate it?

Page 55: Question Content
Do you need several questions? Always try to avoid combining two or more questions into one, as in:
"What are your feelings towards victims and offenders of IPV?"
"What do you think of proposed changes in benefits and documentation requirements?"
(A rough screening sketch follows.)
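As a rough aid, a heuristic like the following Python sketch can flag candidate double-barreled items for human review. The keyword rule is an assumption: conjunctions have legitimate uses, so this screens but never decides:

    import re

    def maybe_double_barreled(question: str) -> bool:
        # Flag items that may join two objects of opinion with a conjunction.
        # Heuristic only: "and" is harmless in phrases like "now and then".
        return bool(re.search(r"\b(and|or)\b", question, re.IGNORECASE))

    items = [
        "What are your feelings towards victims and offenders of IPV?",
        "How satisfied were you with the crisis line?",
    ]
    for q in items:
        flag = "REVIEW" if maybe_double_barreled(q) else "ok"
        print(f"[{flag}] {q}")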

Page 56: Question Content
Do you need several questions? Did you get all the desired information?
E.g., "What were your household earnings last year?" Will participants mention all income (e.g., child support)?

Page 57: Question Content
Do you need several questions? Will you have enough information to interpret people's responses?
If you ask about attitudes towards the police, can you interpret the answers without finding out whether respondents reported the crime to the police?

Page 58: Question Content
Do respondents have the information they need?
"Do you think that the police should have direct-filed the charges in your case?" Respondents can't say if they don't know what that means.

Page 59: Question Content
Is the question specific enough?
"How well did you like the treatment you got?"
Versus: "Did you recommend the treatment to others?" "Did you come back for additional services?"

Page 60: Question Placement
Is the answer influenced by prior questions? Does the question come too early or too late to arouse interest? Does the question receive sufficient attention?

Page 61: Opening Questions
Should be easy to answer. Should not be sensitive material. Should get the respondent "rolling."

Page 62: "Sensitive" Questions
Only after trust is developed. Should make sense in that section of the survey (not come "out of left field"). Precede with warm-up questions.

Page 63: Placement Issues to Consider
Start with easy, non-threatening questions. Put more difficult, threatening questions near the end. Don't start a mail survey with an open-ended question. Put demographics at the end (unless needed to screen).

Page 64: Placement Issues to Consider
Ask about one topic at a time. When switching topics, use a transition. Reduce response set. For filter or contingency questions, make a flowchart. Keep the survey as short as possible!

Page 65: Question Wording
Can the question be misunderstood? "Did you talk with your victim advocate?" Which advocate: system-based or community-based? And "talk" may or may not mean that the client used resources, etc.

Page 66: Question Wording
What assumptions does your question make? If you ask what social class someone is in, you assume that they know what social class is and that they think of themselves as being in one. Check the assumption in a previous question.

Page 67: Question Wording
Do you need a timeframe? "Do you think you will need additional services?" When: in the next week, 6 months, a year?

Page 68: Question Wording
Does the question contain difficult or unclear terminology? Does the question make each alternative explicit? Is the wording objectionable? Is the wording loaded or slanted?