

The EPISCenter is a project of the Prevention Research Center, College of Health and Human Development, Penn State University, and is funded by the Pennsylvania Commission on Crime and Delinquency and the Pennsylvania Department of Public Welfare as a component of the Resource Center for Evidence-Based Prevention and Intervention Programs and Practices.

Organizational Correlates of Local Outcome Assessment

Introduction

The EPISCenter & The Prevention Research Center, The Pennsylvania State University

Background

• Increasing policy emphasis on evidence-based programs (EBPs) and results-based accountability (Bumbarger et al., 2010)

• Next step in EBP dissemination: demonstrate impact, assess delivery locally

• In context of state-wide natural dissemination, agency assessment of process and outcomes to

-identify areas of program delivery to target for continuous quality improvement

-garner support and buy-in from community stakeholders, an essential component of program sustainability (Rhoades et al., 2012)

Mary H. Geier & Stephanie A. Bradley

Analyses are based on data collected in 2011 as part of the Annual Survey of Evidence-based Programs (ASEP; Doyle et al., 2012) from 93 organizations.

Conducted Local Outcome Assessment, by Phase of Implementation

                  Grant-funded (n=51)   Sustainability (n=42)   Total (n=93)
No (0)            18 (19.4%)            15 (16.1%)              33 (35.5%)
Yes (1)           29 (31.2%)            16 (17.2%)              45 (48.4%)
Missing            4 (4.3%)             11 (11.8%)              15 (16.1%)

Method

Conceptual Model

Philosophy towards data use and evaluation (4 items, α = .86)

Organizational capacity to evaluate (4 items, α = .64)
• Knowledge of data collection and reporting process
• Skill/expertise to analyze data
• Internal or external consultant

Program alignment with agency mission & goals (4 items, α = .81)
• Internal stakeholder buy-in

Training and TA factors (3 items, α = .69)
• Logic model proficiency
• Support surrounding local evaluation

Active planning for sustainability (4 items, α = .80)
• Fiscal plan
• Garnering stakeholder support (understand value)

External stakeholder buy-in & support for program (6 items, α = .90)
• Prioritizing, attitude
• Collaborative board, CTC, etc.
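The internal-consistency values reported for each factor are Cronbach's alphas, α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch of that computation, using made-up item scores rather than the ASEP data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of k lists, each holding one item's scores
    across the same set of respondents.
    """
    k = len(items)
    n = len(items[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total scale score per respondent
    totals = [sum(col[i] for col in items) for i in range(n)]

    item_var_sum = sum(sample_var(col) for col in items)
    total_var = sample_var(totals)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Toy data: 3 items, 4 respondents (illustrative only)
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5], [3, 3, 4, 5]])
print(round(alpha, 2))  # 0.98
```

Items that covary strongly relative to their individual variances push α toward 1, which is why the six-item External Buy-in factor (α = .90) coheres better than the four-item Organizational Capacity factor (α = .64).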

[Conceptual model: capacity & motivation factors hypothesized to influence local outcome assessment → organization assesses outcomes and implementation quality → continuous quality improvement and program sustainability]

Results

Respondents could indicate that they used data to: A) meet grant requirements; B) know if the program is working; C) garner financial support (grant writing, presentations to potential funders).

Future Directions

• To examine factors associated with assessing implementation quality
• To distinguish between different levels of evaluation quality (e.g., “high” vs. “low”), and to subsequently examine how assessment capacity factors are associated with different levels of quality
• To conduct longitudinal analyses to determine a) how capacity to evaluate is related to sustainability and continuous quality improvement, and how these relationships change over time; and b) the direction of effect between program operating level and the choice to assess

Are organizations that assess local outcomes different from those that do not?

[Bar chart: mean scores (0–5) on the six capacity and motivation factors for organizations that conducted vs. did not conduct a local outcome assessment; effect sizes d = .26, d = .37, d = .60, d = .02, d = .49*, d = .70**]

Landscape of Data Use & Reporting

The probability that an organization conducts program assessments is highest in grant-funded years, although the difference across groups is not significant (χ²(1) = .779, p = .377).
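That chi-square statistic can be reproduced from the frequency table above (missing cases excluded). A sketch using only the Python standard library; for df = 1, the chi-square survival function reduces to erfc(√(χ²/2)):

```python
import math

# 2x2 table from the poster (missing cases excluded):
# rows = conducted assessment (No, Yes),
# columns = phase (Grant-funded, Sustainability)
observed = [[18, 15],
            [29, 16]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square, no continuity correction
chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed[i][j] - expected) ** 2 / expected

# Survival function of the chi-square distribution with df = 1
p = math.erfc(math.sqrt(chi2 / 2))

print(round(chi2, 3), round(p, 3))  # 0.779 0.377
```

The same result comes from `scipy.stats.chi2_contingency(observed, correction=False)`; the poster's value evidently omits the Yates continuity correction.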

Agencies that conduct assessments differ from those that do not on: 1) program alignment with agency mission and goals; 2) active planning for sustainability; and 3) philosophy towards data use and evaluation.
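The group differences are reported as Cohen's d, the mean difference scaled by the pooled standard deviation. A hedged sketch with illustrative group scores (not the survey data):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: standardized mean difference using the pooled SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Illustrative factor scores: conducted-assessment vs. did-not groups
d = cohens_d([4, 5, 6], [2, 3, 4])
print(d)  # 2.0
```

By the usual benchmarks, the poster's d = .49 and d = .70 are medium-to-large standardized differences.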

Most organizations report utilizing data in three or more ways, and all organizations reported using data to know if a program is working.


The present study:
• Explores the landscape of data use among local EBP providers
• Examines how capacity and motivation factors are associated with community providers’ ability to conduct local outcome assessments

[Chart notes: **p < .01, *p < .05, †p < .11; factor ns = 45, 45, 45, 44, 29, 38]

[Bar chart: # Different Ways Organizations Use Data (one, two, or three ways), by phase (Grant-funded vs. Sustainability); y-axis: number of organizations]

[Bar chart: # Different Audiences to whom Organizations Report Data (none, one, two, or three), by phase (Grant-funded vs. Sustainability); y-axis: number of organizations]

Respondents could indicate that they presented data to: A) PCCD; B) Coalition/Collaborative Board; C) Community Groups/Organizations.

Correlations Across Factors Hypothesized to Influence Assessment

Most organizations reported sharing data with three audiences. Reporting to multiple audiences has been associated with program sustainability (Rhoades et al., 2012).

                  Data Philosophy   Org. Capacity   Prog. Alignment   Training & TA   Sus. Planning
Org. Capacity          .182
Prog. Alignment        .249*            .273*
Training & TA          .194             .266*            .436**
Sus. Planning          .204             .091            -.070              .276
External Buy-in       -.028             .585**           .098              .075            .107

**p < .01   *p < .05   †p < .11

Each factor is significantly correlated with at least one other factor, except Sustainability Planning.
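The coefficients in the table are Pearson product-moment correlations. A minimal pure-Python sketch of the computation (the function name and toy vectors are illustrative, not the survey data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linear toy data correlates at r = 1
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))  # 1.0
```

With roughly 45 paired observations, correlations near the table's .25 to .27 values fall just past the p < .05 threshold, which is consistent with the starring pattern shown.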

Present Study

The present study is set within the context of a state-wide scale-up of evidence-based programming in Pennsylvania, which has included 200+ replications of a menu of EBPs since 2001.
