
Starting With the End in Mind: Capturing Results

Sarah Thach, NC DHHS Office of Healthy Carolinians & Health Education

September 24, 2010

What words or feelings come to mind when you think of evaluation?

Session objectives

Inspire you that evaluation can be fun, creative, illuminating, even joyous!

Discuss purposes of evaluation

Review core steps in evaluation

Share resources and tools

Define evidence-based interventions (EBIs); discuss why to use them, where to find them, and how to adapt them

Practice telling your program’s story

Why evaluate?

Assess → Plan → Implement → Evaluate (a continuous cycle)

Why Evaluate?

1. Prove what you’re doing
2. Improve what you’re doing

According to a recent study, one of the top predictors of nonprofit sustainability is spending time leveraging evaluation data for making meaning, decision making, and planning, not primarily for accountability or validation (Peter York, TCC Group).

With thanks to Jill Fromewick of Summit Research Associates

Key Steps in Evaluation

1. Start with the end in mind: what do you want to accomplish? (Desired outcomes)

2. Figure out how you can achieve it (Logic model, theory of change, evidence-based interventions)

3. Figure out how to measure it (Evaluation plan)

4. Measure it and adjust program as needed

5. Tell your story

Step 1: Start With the End in Mind

Ultimately, what change do you want to see in the health status of the population you’re serving?

Step 2: Figure out how to achieve it

Creating a Road Map

University of Wisconsin-Extension, Program Development and Evaluation

Tool: Logic model

INPUTS (what we invest): resources: $, staff/volunteers, equipment, space

OUTPUTS (what we do, who we reach): activities, participation

OUTCOMES (desired results):
Short-term (1-3 yrs): knowledge, attitudes, skills
Medium-term (4-6 yrs): behaviors, social norms, environment
Long-term (7-10 yrs): ultimate goal
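A purely hypothetical illustration (not from the presentation): a community walking program might fill in the logic model like this.

INPUTS: grant funding, a part-time health educator, volunteer walk leaders, park space
OUTPUTS: weekly walking groups held (activities); 50 older adults attend regularly (participation)
OUTCOMES: short-term: participants know the physical activity guidelines and feel confident walking; medium-term: participants walk most days and invite neighbors (behaviors, social norms); long-term: lower rates of hypertension in the target population (ultimate goal)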

Tool: A Theory of Change helps make connections between activities and desired outcomes

Use Evidence-Based Interventions

Evidence-Based Interventions have been…
Implemented
Evaluated
Found to be effective (at least in one particular setting and population)

Evidence-Based Interventions can include…
Strategies
Pre-packaged “soup-to-nuts” programs

With thanks to NCTraCS

A continuum of evidence:

Weaker… …Stronger

New intervention: "Hey, I have an idea!" "Sure, why not?"

Theory-driven new intervention

“Promising practice”

Evidence-based: Intervention implemented, evaluated, and found to be effective

“Best practice”

Research tested: Intervention found effective in a controlled study (some sites had the intervention, some didn't)

Systematic review of multiple intervention trials shows effectiveness

Why Use EBIs?

Provide a recipe or road map so you don’t have to reinvent the wheel

Are likely to succeed

Help you use scarce resources wisely: $, time, volunteers/partners

Are increasingly required by funders

Tool: Finding & Using EBIs

Finding EBIs

The Community Guide to Preventive Services
Cochrane Reviews
Canadian Best Practices Portal

Databases of journal articles:
PubMed search engine for medical articles
Google Scholar (includes articles that are not peer-reviewed)

Content-specific databases:
AHRQ Health Care Innovations Exchange
US Preventive Services Task Force Recommendations
SAMHSA’s National Registry of Evidence-based Programs and Practices

NC databases of interventions

CareShare Health Alliance’s Knowledge Bank https://www.caresharehealth.org/

Healthy Carolinians: CHA priorities and action plan strategies http://www.healthycarolinians.org/assessment/healthobjectives.aspx

NC Center for Public Health Quality’s Proven and Promising Practices http://www.ncpublichealthquality.org/ctr/

Assessing EBIs for a good fit

Think about:
1. Will it work for your target group’s age, culture, readiness to change?

2. Will it work for your host organization’s budget, staffing capabilities, timeframe?

3. Will it work in your community setting and synergize with existing resources/programs?

Contact program authors for further info

Adapting EBIs for a good fit

You can modify an EBI to make it more culturally relevant to your target audience or community setting without compromising the intervention’s effectiveness, as long as you stay true to the core of the program.

Include community partners and staff

Stay true to the core program

Record reasons for changes

See the National Implementation Research Network for tips on effective implementation: http://www.fpg.unc.edu/~nirn/

Step 3: Figure out how to measure it

University of Wisconsin-Extension, Program Development and Evaluation

What do you measure?

INPUTS (what we invest): resources: $, staff/volunteers, equipment, space

OUTPUTS (what we do, who we reach): activities, participation

OUTCOMES (desired results):
Short-term (1-3 yrs): knowledge, attitudes, skills
Medium-term (4-6 yrs): behaviors, social norms, environment
Long-term (7-10 yrs): ultimate goal

Process evaluation focuses on inputs and outputs; outcome evaluation focuses on outcomes.

Tool: Evaluation plan

Activities and outcomes: using the logic model, identify the activities and outcomes you want to measure.

Indicators (observable measures that describe how well outcomes have been achieved): for each activity/outcome, identify at least one indicator.

Data collection methods and source: how will you collect data? Methods include interviews, focus groups, surveys, participant observation, document review, secondary data, and analysis of existing databases. From whom or what entity?

Frequency and schedule of data collection: how often does data collection take place? One time, twice, multiple times, continuously? When?

A compelling argument appeals to the head, the heart, and the wallet.

Make sure your evaluation includes…

Head: data
Heart: stories, anecdotes, testimonials
Wallet: cost/benefit analysis, return on investment, $ saved, $ generated
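A purely illustrative calculation (the figures are hypothetical, not from the presentation): return on investment can be expressed as (dollars saved or generated minus program cost) divided by program cost. If a program cost $50,000 and averted an estimated $120,000 in emergency department charges, ROI = ($120,000 − $50,000) / $50,000 = 1.4, i.e., roughly $1.40 returned for every dollar invested.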

The hunt for good measures

You know it’s hot in NYC when…

Barron J. (July 22, 2010). “No Matter How You Cut It, a July That’s Too Hot.” New York Times.

What about access to care?

Outcomes:
Decrease in ER use for primary care
Increased appropriate use of primary care
% of the uninsured with a medical home
How long the uninsured wait to get care after symptoms appear

Collection: self-reported vs. provider-reported vs. existing statistics

What about access to care?

Process measures you care about: Collaboration
# of partners
# of lead agencies for various elements of the intervention
# of agencies contributing funding toward the initiative / in-kind contributions (quantified)
# of meetings attended
Self-reported ownership of the network
Participants can articulate what the collaborative is and its goals
Seeing the collaborative as distinct from their own organization
Cohesiveness (how was the collaborative named?)

Evaluation plan: access to care

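As an illustration of how the evaluation plan template might be filled in for the access-to-care outcomes above (the indicator details and schedule here are hypothetical, not from the presentation):

Activity/outcome: increase the % of uninsured residents with a medical home
Indicator: % of uninsured adults reporting a usual source of primary care
Data collection method and source: patient survey and safety-net clinic enrollment records
Frequency and schedule: at baseline and annually thereafter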

Step 4: Measure It and Adjust Program as Needed

Step 5: Tell Your Story

Telling Your Story

We are a storytelling species!

“Facts don’t have the power to change someone’s story. Your goal is to introduce a new story that will let your facts in.” – Annette Simmons

Tool: Storytelling Tips

Components of a good story:
Start with a common assumption
Introduce a point of conflict
Cast the story with clear heroes and villains
Include at least one memorable fact
Point the way to a happy ending

Source: Terrence McNally

“Sticky” (Memorable) Stories:

Simple, Unexpected, Concrete, Credible, Emotional Stories Stick

Source: Heath C & Heath D. 2007. Made to Stick: Why Some Ideas Survive and Others Die.

Core stories to collect

1. Nature of our challenge: reminds staff/volunteers why their time and energy are needed every day

2. How we began: how the group was founded; captures both the need for the work and the specific approach you’ve taken

3. Emblematic successes: demonstrate the group’s effectiveness over time

4. Individual performance: shows the professionalism, creativity, and commitment your group brings to the challenge

5. Lessons learned: remind your group that occasional misfires are inevitable and should be embraced for what they can teach you

Sample story

In pairs, take turns telling a story about your program.

Storytellers, select a story about: nature of our challenge, how we began, emblematic successes, individual performance, or lessons learned.

Listeners, listen for: Simple, Unexpected, Concrete, Credible, Emotional Stories Stick.

Showing your story

“The best statistical graph ever”

Minard’s map of Napoleon’s 1812 Russian campaign

From Swain Partnership for Health annual report

Resources

Evaluation
www.cdc.gov/eval/evalcbph.pdf
http://www.cdc.gov/eval/resources.htm
http://www.uwex.edu/ces/pdande/evaluation/pdf/tobaccomanual.pdf
http://www.uwex.edu/ces/pdande/evaluation/index.html
http://meera.snre.umich.edu/plan-an-evaluation/plonearticlemultipage.2007-10-30.3630902539/participatory-evaluation

Logic Models and Theories of Change
http://www.hfrp.org/publications-resources/browse-our-publications/how-to-develop-a-logic-model-for-districtwide-family-engagement-strategies
http://www.aecf.org/upload/publicationfiles/cc2977k440.pdf
http://www.publichealth.arizona.edu/chwtoolkit/pdfs/Logicmod/chapter3.pdf

Telling Your Story
http://discovery.wcgmf.org/resources/sps_resource_942.pdf
Heath C & Heath D. 2007. Made to Stick: Why Some Ideas Survive and Others Die.
