
Slide 1

Welcome to the 2016 CNCS Research Summit

Hosted by the CNCS Office of Research and Evaluation

WiFi Network: Wchmeeting

Password: cncs2016

@NationalService

#CNCSResearch

Slide 2

Welcome

Wendy Spencer

CEO, Corporation for National and Community Service (CNCS)

Slide 3

Corporation for National and Community Service:

COST-EFFECTIVE SOLUTIONS FOR OUR COMMUNITIES AND NATION

324,000 members

3 million leveraged volunteers

55,000 locations

• AmeriCorps: 80,000 members in 20,000 sites

• Senior Corps: 244,000 volunteers in 35,000 sites

• Social Innovation Fund (SIF): $93m annual match and 426 organizations in 44 states

• Volunteer Generation Fund: 17 States

• United We Serve and National Days of Service

DISASTER SERVICES | ECONOMIC OPPORTUNITY | EDUCATION | ENVIRONMENTAL STEWARDSHIP |

HEALTHY FUTURES | VETERANS & MILITARY FAMILIES

Slide 4

• Doubled Staff


• Doubled Budget: $2.5 Million to $5 Million

Slide 5

Improving Education (AmeriCorps)

Slide 6

Improving Education

Improvement for Minnesota Reading Corps students was 93% greater than for non-MRC students.

Slide 7

Improving Education

AmeriCorps tutors helped the average first-grade student perform 26% better than the expected level for on-track students.

Slide 8

Evidence into Action

• After Minnesota Reading Corps research study

• Program expanded to 12 States & DC

• 350 School Districts

• Nearly 40K Students

Slide 9

Focus Areas

• Disaster services

• Economic Opportunity

• Education

• Environmental Stewardship

• Healthy Futures

• Veterans and Military Families


Slide 10

Volunteer Employment

Volunteers have 27% higher odds of finding employment than non-volunteers
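
The slide reports odds, not probabilities, and the two are easy to conflate. A minimal sketch of the conversion, assuming an invented 60% baseline employment probability for non-volunteers (an illustrative number, not one from the study):

```python
# Hypothetical illustration of "27% higher odds"; the baseline probability
# is invented for illustration and is NOT a figure from the CNCS study.
baseline_p = 0.60                                    # assumed non-volunteer employment probability
baseline_odds = baseline_p / (1 - baseline_p)        # 0.60 / 0.40 = 1.5
volunteer_odds = baseline_odds * 1.27                # 27% higher odds -> 1.905
volunteer_p = volunteer_odds / (1 + volunteer_odds)  # convert back: ~0.656

print(f"Non-volunteers: {baseline_p:.1%}; volunteers: {volunteer_p:.1%}")
```

Note that a 27% increase in odds corresponds to a smaller increase in probability (here, roughly 5.6 percentage points).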

Slide 11

AmeriCorps Alumni Outcomes Study

• 8/10 alumni say AmeriCorps benefited their career path.

• 7/10 say it helped them achieve their educational goals.

Slide 12

Other Influences

• 86% of alumni reported voting in the 2012 presidential election, compared to a 58% national

voting rate that same year.

• 79% of alumni are or plan to become actively involved in their community post-service.

Slide 13

Pathway to Employment

• 82.7% of organizations surveyed online hired at least one AmeriCorps member since 2012.

• 57.8% hired members from their own sites.

Slide 14

Pathway to Employment

• 64.3% hired AmeriCorps members for full-time positions.

• >50% of those positions were newly created.

Slide 15

VCLA Findings

• 62.8 million Americans volunteered

• 7.9 billion hours volunteered

• $184 billion estimated value of volunteering (based on the Independent Sector estimate of the average value of a volunteer hour, $23.07)


• The value of all that volunteering is almost 20 times more than the amount shoppers spent on

Black Friday this year!
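
The $184 billion figure is the hours total multiplied by the per-hour value estimate. A quick check with the slide's rounded inputs (the small gap from $184 billion presumably reflects rounding in the published hours total):

```python
# Back-of-the-envelope check of the slide's arithmetic, using its rounded inputs.
hours = 7.9e9           # total volunteer hours
value_per_hour = 23.07  # Independent Sector estimate, USD per volunteer hour
total = hours * value_per_hour

print(f"Estimated value: ${total / 1e9:.0f} billion")  # ~$182 billion
```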

Slide 16

Turning Evidence into Action

Slide 17

Thank you!

Slide 18

Keynote

Kathryn Newcomer, PhD

Director of the Trachtenberg School of Public Policy and Public Administration at the George Washington University and President-Elect of the American Evaluation Association

Slide 19

Moving From Learning to Action in the Current Environment

Slide 20

Questions to Address Today

• When and why did the evidence-based imperative become so prevalent in the public and

nonprofit sectors?

• How can evaluators help government decision-makers use evidence to inform decision-making?

• How can we move from generating data for accountability to learning?

Slide 21

“Evidence-based Policy”

• The mantra affecting governmental decision-makers, foundations, nonprofit boards, intermediaries, and evaluation practice itself!

• Myth or reality?


• Advantages and disadvantages for decision-makers and for evaluators?

Slide 22

From Outputs to Evidence: Influential Events Across the Years

• Simon & Ridley focus on outputs (1938)

• Hitch and McKean (1960)

• State & Local Finance Project (1960s)

• Focus on effectiveness within DOD & HEW (1963)

• Hatry Senate Report on measuring effectiveness (1967)

• Urban Institute & ICMA Performance Work Begins (1970s)

• Mental Health Outcomes Measured (1970s)

• Some Federal laws require outcome measures (1970s)

• GASB calls for Service Accomplishments data (1980)

• Workforce training laws require outcome measures (1982)

• Oregon Benchmarks (1989)

• Healthy People “2000” (1990)

• World Bank calls for outcome measures (1990s)

• Reinventing Government published (1993)

• Cochrane Collaboration established (1993)

• GPRA (1993)

• United Way requires Outcomes Measures (1996)

• CHEA establishes Outcomes Standards (1998)

• Millennium Challenge sets impact goals (2000)

• Campbell Collaboration established (2000)

• Coalition for Evidence-Based Policy gains traction + PART (2001)

• What Works Clearinghouse established (2002)

• Moneyball published (2003)

• Call for Key National indicators (2004)

• CDC Promotes DEBIs (2004)

• Community Indicators Consortium established (2004)

• CNCS Social Innovation Fund (2009)

• OMB Guidance on Tiers of Evidence (2010)

• OMB Guidance on Evidence-Based Grants (2010)

• Pew-MacArthur Results First (2011)

• Gates defines results as both outputs and outcomes (2013)

Slide 23

Embracing Evidence-Based Policy

• Healthy People “2000” (1990)

• World Bank calls for evaluation of outcomes (1990s)


• CompStat focuses on Crime Rates in NYC (1992)

• Osborne and Gaebler book Reinventing Government published (1993)

• Cochrane Collaboration established (1993)

• GPRA (1993)

• United Way requires Outcomes Assessment (1996)

• CHEA establishes Outcomes Standards (1998)

• Millennium Challenge sets impact goals (2000)

• Campbell Collaboration established (2000)

• Coalition for Evidence-Based Policy gains traction (2001)

• What Works Clearinghouse established (2002)

• Michael Lewis’ book Moneyball published (2003)

• Call for Key National indicators (2004)

• Community Indicators Consortium established (2004)

• CDC Promotes DEBIs (2004)

• International Initiative for Impact Evaluation (3ie) (2008)

• CNCS Social Innovation Fund (2009)

• OMB Guidance on Tiers of Evidence (2010)

• OMB Guidance on Evidence-Based Grants (2012)

• Pew-MacArthur Results First (2011)

• Moneyball for Government published (2014)

• Congress passes the Evidence-Based Policymaking Commission Act (2016)

Slide 24

Evidence-Based Policy – Made by Whom?

Decisions to be informed by Evidence:

• Political: Basing funding on use of “Demonstrated Evidence-Based Interventions” (DEBIs) and/or cost-effectiveness analysis (CEA)

• Programmatic: Making programmatic decisions based on impact evaluations

• Operational: Analyzing programmatic data – preferably outcomes – to target resources


Slide 25

Contrasting Views on Evidence-Based Policy

Fixed Mindset vs. Growth Mindset

1. Fixed: We need to collect data to test whether programs work or do not work.
   Growth: We need to learn which program mechanisms work for whom, where, and under what circumstances.

2. Fixed: Policy should be made at the top and based on evidence.
   Growth: Policy is “made” through implementation processes at multiple levels by multiple actors with different types of data available to them.

3. Fixed: Program impact can be measured precisely.
   Growth: Measuring program impact is difficult as programs and intended impactees change and evolve.

4. Fixed: Randomized Controlled Trials (RCTs) are the gold standard for research and evaluation design.
   Growth: Research designs must be matched to the question raised; RCTs are appropriate for certain impact questions.

5. Fixed: Proven program models can be replicated in multiple locations as long as they are implemented with fidelity to the original design.
   Growth: Program mechanisms may be replicated in multiple locations as long as they are adapted to meet local conditions.

6. Fixed: Benefit-cost analysis should be used to compare social programs.
   Growth: Benefit-cost analysis is difficult to use to compare social programs given the challenge of costing out benefits, especially those accruing over time.

Note: I expanded upon the notion of mindset from Mindset by Carol Dweck.

Slide 26

What are Challenges for Evidence to Inform Policymaking?

• Expectations regarding:

‒ What constitutes evidence?

‒ How transferable is evidence?

‒ When and where do we underestimate the role played by the “impactees”?

‒ Where is the capacity to support both the demand and supply of evidence?


Slide 27

What are the Opportunities for Evidence to Inform Decision-making?

• Analyses of “performance” data collected by agencies (or delegated service delivery agents such

as grantees)

• Implementation, Outcome and Impact evaluations typically performed by other agents for

government

• Manipulations of services in experiments by agencies – “behavioral economics”

• Syntheses or systematic reviews of impact evaluations by external agents, e.g. websites like

“What Works”

Slide 28

Why isn’t There Agreement About the Quality of Evidence?

• Differing professional standards and “rules” or criteria for evidence, e.g., lawyers, accountants,

engineers, economists

• Disagreements about methodologies within professional groups, e.g., RCTs

• The constancy of change in problems and the characteristics of the targeted impactees

Slide 29

“Evidence-Based” Grant Making

• Grants comprise over $600 billion in the U.S. federal budget

• OMB began urging agencies to use evidence-based grant making in 2010, but with little guidance

• Where are we now?


Slide 30

To what extent is there consensus on what constitutes evidence in the grants environment?

To a great extent/A lot | A moderate amount | A little/Not at all | Number of Respondents
Within your agency: 50% | 30% | 20% | 132
With your legislative branch: 29% | 29% | 32% | 113
With other funders in your field: 30% | 31% | 39% | 112
With academia: 34% | 29% | 39% | 98
Within your grantee network: 51% | 33% | 27% | 113

Source: Dawes and Newcomer Survey, September 2016.

Slide 31

We Underestimate the Evolving Sources of Complexity Affecting the Production of Relevant Evidence

• Change in the nature of problems to be addressed by government, e.g., the nature of national security threats, the use of the internet in crime

• Change in the context in which programs and policies are implemented, e.g., increasingly

complicated service delivery networks, PPPs

• Changing priorities of political leaders (and would-be leaders)


Slide 32

We Overstate the Ease of Flow of Evidence

Study conclusion: “It plays a causal role there.”

Bridge premise: “It plays a wide (enough) causal role.”

Policy prediction: “It will play a causal role here.”

Source: Cartwright, N. (2013). “Knowing what we are talking about: why evidence doesn’t always travel.” Evidence & Policy: A Journal of Research, Debate and Practice, 9(1), 97-112.

Slide 33

How can evaluators help government decision-makers use evidence to inform decision-making?

Slide 34

A simple framework: informed decisions and learning from data rest on three elements:

• Developing and addressing information needs

• Cultivating an organizational learning culture

• Catering to individual information processing


Slide 35

Remember Evaluation Capacity = Both Demand and Supply

• Consider how to generate demand when there is little!

• Carefully think about who is asking for the data/evidence and who might use the information

provided and how and when they may use it

• Probe the extent to which there is a clear understanding between providers and requestors of what sorts of evidence are needed, e.g., brokering

• Assess whether or not sufficient resources are available to meet demand

• Address the lack of interaction and facilitate synergies among the different potential providers

of evidence - such as monitoring and reporting staff, internal evaluation staff, external

evaluation contractors, etc.

Slide 36

Promising Practices from the Obama Administration

Promising Practice | Affects Supply or Demand? | Needed Support Factors
Knowledge Brokers | Both | Brokers have technical expertise, interpersonal skills, and contextual wisdom
Learning Agendas | Demand | Strong leadership backing and encouragement to be innovative
Quarterly Reviews | Supply | Credible data, stress on learning, no punitive actions
Strategic Reviews | Both | Encouragement to be innovative, stress on learning not accountability


Slide 37

What are Evaluation-Receptive Organizational Cultures?

• Engage in self-reflection & self-examination

‒ Deliberately seek evidence on what the organization is doing

‒ Use results information to challenge or support what it is doing

‒ Promote candor, challenge and genuine dialogue

• Engage in evidence-based learning

‒ Make time to learn

‒ Learn from mistakes and failures

‒ Encourage knowledge sharing

• Encourage experimentation and change

‒ Support deliberate risk-taking

‒ Seek out new ways of doing business

(See John Mayne, 2010)

Slide 38

Move To Strategic and Synergistic Use of Evaluation!

Slide 39

Help Information Users Frame Pertinent Questions, and then Match the Questions with the Appropriate Evaluation Approach

Questions relevant to users inform the evaluation design. Candidate approaches include:

• Monitoring

• Impact evaluation

• Implementation evaluation

• Behavioral economics


Slide 40

Match Evaluation Approach to Questions

Objective #1: Describe program activities
Illustrative questions:
• How extensive and costly are the program activities?
• How do implementation efforts vary across sites, beneficiaries, regions?
• Has the program been implemented sufficiently to be evaluated?
Possible designs: Monitoring; Exploratory Evaluations; Evaluability Assessments; Multiple Case Studies

Objective #2: Probe targeting & implementation
Illustrative questions:
• How closely are the protocols implemented with fidelity to the original design?
• What key contextual factors are likely to affect achievement of intended outcomes?
• How do contextual constraints affect the implementation of an intervention?
• How does a new intervention interact with other potential solutions to recognized problems?
Possible designs: Multiple Case Studies; Implementation or Process Evaluations; Performance Audits; Compliance Audits; Problem-Driven Iterative Adaptation

Objective #3: Measure the impact of policies & programs
Illustrative questions:
• What are the average effects across different implementations of the intervention?
• Has implementation of the program or policy produced results consistent with its design (espoused purpose)?
• Is the implementation strategy more (or less) effective in relation to its costs?
Possible designs: Experimental Designs/RCTs; Non-experimental Designs (difference-in-differences, propensity score matching, etc.); Cost-Effectiveness & Benefit-Cost Analysis; Systematic Reviews & Meta-Analyses

Objective #4: Explain how/why programs & policies produce (un)intended effects
Illustrative questions:
• How/why did the program have the intended effects?
• To what extent has implementation of the program had important unanticipated negative spillover effects?
• How likely is it that the program will have similar effects in other communities or in the future?
Possible designs: Impact Pathways and Process Tracing; System Dynamics; Configurational Analysis
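
Of the non-experimental designs listed under Objective #3, difference-in-differences is the most compact to illustrate: take each group's before/after change, then subtract the comparison group's change to net out the common trend. A minimal sketch with invented outcome means (all numbers and labels hypothetical):

```python
import pandas as pd

# Invented group-by-period mean outcomes; rows and values are hypothetical.
means = pd.DataFrame(
    {"before": [50.0, 48.0], "after": [60.0, 52.0]},
    index=["treated", "comparison"],
)

change = means["after"] - means["before"]       # treated: 10, comparison: 4
did = change["treated"] - change["comparison"]  # 10 - 4 = 6

print(f"Difference-in-differences estimate: {did}")
```

The comparison group's 4-point change stands in for what the treated group would have experienced anyway, so the remaining 6 points are attributed to the program, under the usual parallel-trends assumption.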


Slide 41

• Very, very carefully!!

• Signaling matters!

• Funders’ reporting requirements matter, perhaps too much!

Slide 42

A Delicate Balancing Act

There is an ongoing tension between producing evidence to demonstrate accountability and producing evidence to promote learning.

Slide 43

Knowledge Generation and Use

How do we balance accountability with learning from evaluation?


Slide 44

Knowledge Generation and Use

Slide 45

Knowledge Generation and Use


Slide 46

AEA’s Participation in the International Evaluation Agenda

How AEA can strengthen the Enabling Environment, Institutional Capacities, and Individual Capacities, both within AEA and the United States and within other Voluntary Organizations for Professional Evaluation (VOPEs) in other countries.

Slide 47

AEA Will Support All Six Areas

How AEA can strengthen the: | AEA and the United States | Other VOPEs in Other Countries
Enabling Environment | Yes | Yes
Institutional Capacities | Yes | Yes
Individual Capacities | Yes | Yes

Slide 48

AEA’s 13 Specific Actions

AEA will work to strengthen the:

Enabling Environment
• Within AEA and the United States: 1. Ask the Evaluation Policy Task Force (EPTF) to identify 1-2 key gaps in federal legislation, regulations, or practices, then work to correct those gaps
• Within other VOPEs and other countries: 6. Find ways to learn systematically from other VOPEs around the world how they are strengthening their own enabling environments

Institutional Capacities
• Within AEA and the United States: 2. Work to strengthen the demand for evaluation within governments at all levels, the private sector, nonprofits, and foundations; 3. Use the Federal Executive Institute and other courses to train incoming federal-level political officers and SES candidates and leaders in evaluation
• Within other VOPEs and other countries: 7. Identify emerging VOPEs in other parts of the world, and use AEA, Local Affiliates, and/or TIGs to twin/mentor their development (IPP Program)

Slide 49

AEA’s 13 Specific Actions

Individual Capacities
• Within AEA and the United States:
  4. Offer relevant international topics to be featured in AEA’s e-studies programs
  5. Expand in-person training opportunities beyond the conference and summer training institute, including online courses
• Within other VOPEs and other countries:
  8. Waive the conference and workshop fees for any developing-country evaluator awarded conference travel funds by EvalPartners
  9. Solicit webinar speakers from outside the USA; offer relevant topics to be featured in AEA’s e-studies programs (increase access by offering at different times to accommodate time zones)
  10. Step up the marketing for the Silent Auction, including recruiting more corporate donations
  11. Match the travel funds raised during the Silent Auction, doubling the number of evaluators from developing countries AEA supports
  12. Offer access to AEA online services to selected evaluators outside the USA (price/promotion to be determined by management)
  13. Continue to explore other ways AEA can promote the Global Evaluation Agenda 2016-2020 in general, and partnerships with other VOPEs around the world in particular.


Slide 50

The American Evaluation Association’s Competencies Initiative

• The current list, available on our website, was developed through a multi-year, inclusive process in which diverse evaluators vetted competencies across multiple arenas of evaluation practice

• Any end date? The list will be forever evolving, but will be “finalized” by 2018

• Five Domains: Professional, Methodology, Context, Management, and Interpersonal

Slide 51

AEA 2017 Conference - From Learning to Action: Learning from and about Evaluation

• Learning to strengthen evaluation practice

• Learning what works and why

• Learning from others

• Learning about users and use

Slide 52

Relevant References

• Dahler-Larsen, Peter. 2012. The Evaluation Society. Stanford University Press.

• Donaldson, S., C. Christie, and M. Mark (editors) 2015. Credible and Actionable Evidence, 2nd

Edition. Sage.

• Head, B. 2015. “Toward More “Evidence-Informed” Policy Making?” Public Administration

Review. Vol.76, Issue 3, pp. 472-484.

• Kahneman, D. 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux Publishers.

• Mayne, J. 2010. “Building an evaluative culture: The key to effective evaluation and results

management.” Canadian Journal of Program Evaluation, 24(2), 1-30.

• Newcomer, K. and C. Brass. 2O16. “Forging a Strategic and Comprehensive Approach to

Evaluation within Public and Nonprofit Organizations: Integrating Measurement and Analytics

within Evaluation.” American Journal of Evaluation, Vol. 37 (1), 80-99.

• Olejniczak, K., E. Raimondo, and T. Kupiec. 2016. “Evaluation units as knowledge brokers:

Testing and calibrating an innovative framework.” Evaluation, Volume 22 (2)., 168-189.

• Sunstein. C. and R. Hastie. 2015. Wiser: Getting Beyond Groupthink to Make Groups Smarter.

Harvard Business Review Press.

• World Bank Group. Mind, Society and Behavior. 2015.


Slide 53

Thank You!

I can be reached at [email protected]

Slide 54

15 Minute Break

Up next…Morning Plenary

Integrating Evidence into Federal Policy-Making Processes

WiFi Network: Wchmeeting

Password: cncs2016

@NationalService

#CNCSResearch

Slide 55

Morning Plenary: Integrating Evidence into Federal Policy-Making Processes

Facilitated by Mary Hyde, PhD, Director of Research and Evaluation, CNCS

Molly Irwin - U.S. Department of Labor

Naomi Goldstein - U.S. Department of Health and Human Services

Ruth Curran Neild - U.S. Department of Education

Jennifer Bell-Ellwanger - U.S. Department of Education

Jessica White - U.S. Department of Health and Human Services

Slide 56

Molly Irwin, PhD

Chief Evaluation Officer, U.S. Department of Labor


Slide 57

Naomi Goldstein, PhD

Deputy Assistant Secretary for Planning, Research, and Evaluation, Administration for Children and

Families, U.S. Department of Health and Human Services

Slide 58

Ruth Curran Neild, PhD

Director of Policy and Research, Institute of Education Sciences, U.S. Department of Education

Slide 59

Jennifer Bell-Ellwanger

Director of Policy and Program Studies Service, Office of Planning, Evaluation and Policy Development,

U.S. Department of Education

Slide 60

Using Evidence to Strengthen Education Investments

[email protected]

Slide 61

The Basics About Using Evidence

What are the benefits?

• Provides powerful insights into how strategies are working

• Helps people make better decisions: should strategies be continued, adjusted, scaled up, or discontinued?

What are the risks?

• Disconnect between research and practice

• Evidence for evidence’s sake


Slide 62

Evidence in ESSA

“Evidence-based” interventions in Titles I, II, IV, VI

Defines “evidence-based” as having 4 levels

Strong evidence

Moderate evidence

Promising evidence

Evidence that demonstrates a rationale

Higher levels of evidence required for select competitions & school improvement funds (1003)

Education Innovation and Research program, Pay-for-Success initiatives, program evaluations,

pooled evaluation authority

Slide 63

Motivation for Writing Guidance

Clarification – ED received many questions on the evidence provisions

Standard framework – without guidance, SEAs and LEAs would have to create their own

evidence frameworks or use one of several different frameworks

Slide 64

Evidence Guidance

Background

‒ Non-binding, non-regulatory guidance

‒ Applies to all programs in ESSA

‒ Use in conjunction with program-specific guidance

‒ Designed to support SEA/LEA/partner use of evidence

Part I: Strengthening the Effectiveness of ESEA Investments

Part II: Guidance on the Definition of “Evidence-Based”

‒ Informs ED’s technical assistance materials for consistency

Available: http://www2.ed.gov/policy/elsec/leg/essa/guidanceuseseinvestment.pdf


Slide 65

PART I: Strengthening the Effectiveness of ESEA Investments

Slide 66

5 Steps for Decision-making

1. Identify local needs

2. Select relevant, evidence-based interventions

3. Plan for implementation

4. Implement

5. Examine and reflect

Slide 67

Part II: The Definition of “Evidence-based”

Slide 68

General Recommendations

Look at the entire body of research, not just one study

Focus on important outcomes

The relevance of evidence matters

Use more rigorous evidence (e.g. strong or moderate) if available

What Works Clearinghouse (WWC) can be used to find evidence on the effectiveness of a

strategy or intervention

If not in WWC, look for studies of equivalent quality

Slide 69

4 Levels of Evidence

Tiered approach to evidence – reflects that the amount and rigor of evidence varies; not one size fits all

1) Strong evidence

2) Moderate evidence

3) Promising evidence

4) Demonstrates a rationale


Slide 70

Summary Criteria by Level

Strong Evidence
• Study Design: Experimental study
• WWC Standard: Meets WWC Evidence Standards without reservations (or is the equivalent quality)
• Favorable Effects: Shows a statistically significant and positive (i.e., favorable) effect of the intervention on a student outcome or other relevant outcome
• Other Effects: Is not overridden by statistically significant and negative (i.e., unfavorable) evidence from other findings in studies that meet WWC Evidence Standards with or without reservations (or are the equivalent quality)
• Sample Size and Overlap: Includes a large sample and a multi-site sample, overlapping with populations and settings proposed to receive the intervention

Moderate Evidence
• Study Design: Quasi-experimental study
• WWC Standard: Meets WWC Evidence Standards with or without reservations (or is the equivalent quality)
• Favorable Effects: Shows a statistically significant and positive (i.e., favorable) effect of the intervention on a student outcome or other relevant outcome
• Other Effects: Is not overridden by statistically significant and negative (i.e., unfavorable) evidence from other findings in studies that meet WWC Evidence Standards with or without reservations (or are the equivalent quality)
• Sample Size and Overlap: Includes a large sample and a multi-site sample, overlapping with populations or settings proposed to receive the intervention

Promising Evidence
• Study Design: Correlational study with statistical controls for selection bias
• WWC Standard: N/A
• Favorable Effects: Shows a statistically significant and positive (i.e., favorable) effect of the intervention on a student outcome or other relevant outcome
• Other Effects: Is not overridden by statistically significant and negative (i.e., unfavorable) evidence from other findings in studies that meet WWC Evidence Standards with or without reservations (or are the equivalent quality)
• Sample Size and Overlap: N/A

Demonstrates a Rationale
• Study Design: Provides a well-specified logic model informed by research or evaluation
• WWC Standard: N/A
• Favorable Effects: Relevant research or an evaluation suggests that the intervention is likely to improve a student outcome or other relevant outcome
• Other Effects: An effort to study the effects of the intervention, ideally producing promising evidence or higher, will happen as part of the intervention or is underway elsewhere
• Sample Size and Overlap: N/A
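
Read together, the four levels behave like a decision hierarchy keyed primarily to study design. A deliberately simplified, hypothetical sketch of that logic (the function and its inputs are invented for illustration; a real determination also checks favorable effects, overriding negative findings, and sample size/overlap, per the criteria above):

```python
def essa_evidence_level(design: str, meets_wwc_without_reservations: bool) -> str:
    """Hypothetical, simplified mapping of one study to ESSA's four levels.
    Omits the favorable-effects, other-effects, and sample-size checks that
    the summary criteria above also require."""
    if design == "experimental" and meets_wwc_without_reservations:
        return "strong evidence"
    if design in ("experimental", "quasi-experimental"):
        return "moderate evidence"
    if design == "correlational with statistical controls":
        return "promising evidence"
    return "demonstrates a rationale"  # e.g., a well-specified logic model


print(essa_evidence_level("quasi-experimental", False))  # -> moderate evidence
```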


Slide 71

Published Guidance & Regulations

Guidance

Preparing, training, and recruiting high quality teachers and principals (Title II)

English Learners (Title III)

Early learners and student support and academic enrichment (Title IV)

Regulations

Accountability, state plans, and data reporting

Slide 72

Next Steps: Technical Assistance

New IES Find What Works Tools

Regional Educational Laboratories

Comprehensive Centers

State Support Network

Slide 73

For more information

Main ESSA Web Page: www.ed.gov/ESSA

ESSA Resources, including link to the Notice, Fact Sheet, and other ESSA resources:

http://www2.ed.gov/policy/elsec/leg/essa/index.html

Email Inbox: [email protected]

Contact Information: [email protected]

Slide 74

Jessica White

Social Science Analyst and CDC Team Lead, Division of Science Policy, Office of the Assistant Secretary for

Planning and Evaluation, U.S. Department of Health and Human Services

Slide 75

Morning Plenary: Integrating Evidence into Federal Policy-Making Processes

Facilitated by Mary Hyde, PhD, Director of Research and Evaluation, CNCS


Molly Irwin - U.S. Department of Labor

Naomi Goldstein - U.S. Department of Health and Human Services

Ruth Curran Neild - U.S. Department of Education

Jennifer Bell-Ellwanger - U.S. Department of Education

Jessica White - U.S. Department of Health and Human Services

Slide 76

15 Minute Break

Moving to…First Set of Concurrent Sessions

1A: Evidence for the Health Benefits of Volunteering for Seniors Grand Ballroom

1B: Service Learning Research: Lessons Learned About Institutionalization and Benefits Ashlawn

1C: Using Research and Evaluation to Improve PreK-3 Reading Executive

1D: Improving Individual and Family Well-Being: Research, Practice, and Measurement (Panel 1 of 2)

Hermitage

1E: New Local Strategies for Measuring Civic Engagement and Social Capital Sagamore Hill

Slide 77

Lunch Break

Returning at 2pm for…Afternoon Plenary

Evaluating Innovation: A Catalyst for Organizational Change and Program Innovation

WiFi Network: Wchmeeting

Password: cncs2016

@NationalService

#CNCSResearch


Slide 78

Afternoon Plenary: Evaluating Innovation: A Catalyst for Organizational Change and Program

Innovation

Facilitated by Melissa Bradley, Professor of Practice at the McDonough School of Business, Georgetown

University

Michelle Gilliard - Venture Philanthropy Partners (VPP)

Lori Kaplan - Latin American Youth Center (LAYC)

Seung Kim - Local Initiatives Support Corporation (LISC)

Deb De Santis - Corporation for Supportive Housing (CSH)

Slide 79

Michelle Gilliard, PhD

Partner, Venture Philanthropy Partners (VPP)

Slide 80

Lori Kaplan

President and CEO, Latin American Youth Center (LAYC)

Slide 81

Latin American Youth Center (LAYC)

• Strong Youth

• Strong Families

• Strong Communities

• Strong Futures

Slide 82

Promotor Pathway®

Launched in August 2008, LAYC’s Promotor Pathway® is a long-term client management intervention model for disconnected and disengaged youth facing obstacles such as

• Lack of education

• Homelessness

• Trauma

• Substance abuse

• Court involvement

that prevent them from accessing resources and achieving educational, employment, and healthy living goals.

Slide 83

Promotor Pathway® Study

The Promotor Pathway® has been rigorously evaluated through a multi-year randomized controlled trial conducted by the Urban Institute.

The evaluation tracked 476 youth over 18 months to analyze the effectiveness of the model and

revealed significant positive outcomes on school engagement, pregnancy prevention, and housing.

“MY PROMOTOR IS LIKE MY PERSONAL 911. I CAN CALL ANYTIME FOR WHATEVER I NEED.” -Jonathan,

age 17

Slide 84

School Engagement

Slide 85

Housing Stability


Slide 86

Child Births

Slide 87

Thank You!

Email: [email protected]

Twitter: @LoriLAYC

Slide 88

Seung Kim

Director of Financial Stability, Local Initiatives Support Corporation (LISC)

Slide 89

Deb De Santis

President and CEO, Corporation for Supportive Housing (CSH)

Slide 90

Afternoon Plenary: Evaluating Innovation: A Catalyst for Organizational Change and Program

Innovation

Facilitated by Melissa Bradley, Professor of Practice at the McDonough School of Business, Georgetown

University

Michelle Gilliard - Venture Philanthropy Partners (VPP)

Lori Kaplan - Latin American Youth Center (LAYC)

Seung Kim - Local Initiatives Support Corporation (LISC)

Deb De Santis - Corporation for Supportive Housing (CSH)


Slide 91

15 Minute Break

Moving to…Second Set of Concurrent Sessions

2A: Innovative Approaches to Evaluating Service Programs Grand Ballroom

2B: Service Learning in Practice Ashlawn

2C: Using Research and Evaluation to Improve K-12 Attendance and Achievement Executive

2D: Community Change through Collective Action and Multi-Sectoral Approaches Hermitage

2E: Stakeholder Participation in Research and Action Sagamore Hill

Slide 92

Networking Reception

Begins at 5:45pm upstairs in Atrium Ballroom

WiFi Network: Wchmeeting

Password: cncs2016

@NationalService

#CNCSResearch

Slide 93

Welcome to Day 2

Starting at 9am…Third Set of Concurrent Sessions

3A: Using Administrative and Alternative Data Sources Grand Ballroom

3B: Using Research and Evaluation to Improve Success in College Ashlawn

3C: Using Research and Evaluation to Learn about AmeriCorps Employment Outcomes Executive

3D: Improving Individual and Family Well-Being: Research, Practice, and Measurement (Panel 2 of 2)

Hermitage

3E: Measuring Outcomes of Civic Engagement and Volunteering Sagamore Hill


Slide 94

Final Plenary: Collaborative Research and Action: Engaging Communities, Universities, and Other

Partners to Tackle Pressing Local Challenges

Facilitated by Andrea Robles, PhD, Research and Evaluation Manager, CNCS

Flint, MI and Richmond, VA

Slide 95

Flint, MI

Yanna Lambrinidou, PhD, Adjunct Assistant Professor, Virginia Tech; President, Parents for Nontoxic

Alternatives; Washington, DC resident during 2001-2004 DC lead-in-water crisis and its aftermath

Anurag Mantha, PhD Candidate, Virginia Tech

Jennifer McArdle, Civic Engagement Manager, United Way of Genesee County, and City of Flint Chief

Service Officer

Sue Peters, Director of Special Projects, Community Foundation of Greater Flint

Slide 96

Richmond, VA

Chanel Bea, Community Engagement Assistant, Center on Society and Health, Virginia Commonwealth

University, Richmond Resident

Gwen Corley Creighton, Director of Richmond Promise Neighborhood, Peter Paul Development Center,

Richmond Resident

Amber Haley, PhD Student, University of North Carolina at Chapel Hill; Former Faculty/Research Epidemiologist, Center on Society and Health, Virginia Commonwealth University, Richmond Resident

Emily Zimmerman, PhD, Director of Community Engagement, Center on Society and Health, Virginia

Commonwealth University

Slide 97

Final Plenary: Collaborative Research and Action: Engaging Communities, Universities, and Other

Partners to Tackle Pressing Local Challenges

Facilitated by Andrea Robles, PhD, Research and Evaluation Manager, CNCS

Flint, MI and Richmond, VA


Slide 98

Closing Remarks

Asim Mishra

Chief of Staff, CNCS

Mary Hyde, PhD

Director of Research and Evaluation, CNCS

Slide 99

Thank you for joining us!

Help us improve for next year – please fill out the survey you’ll be receiving soon by email.

Questions? Contact us at [email protected]

Post-conference participants should see their coordinator for time and room location.