Evaluation 101: A Workshop

Evaluation 101: A Workshop Overview, Concepts and Application

Today:

Review some definitions
Talk about practical concepts for AETC evaluation
Discuss a useful framework
Apply these concepts and the framework to your programs
Identify other evaluation resources

Definition (Rossi, Freeman and Lipsey): “Program evaluation is the use of social research procedures to systematically investigate the effectiveness of … programs.”

Concepts: Rossi et al.’s five program domains

Program evaluation typically involves assessment of one or more of the five program domains:

1. The need for the program
2. Design of the program
3. Program implementation and service delivery
4. Program impact or outcomes
5. Program efficiency

Concepts: Chelimsky’s Perspectives

Accountability: Information collected for decision makers (emphasis: outcomes)

Developmental: Information collected to improve performance (emphasis: process measures)

Knowledge: Information collected in the interest of generating understanding and explanation (emphasis: processes and outcomes)

Scriven and context

Evaluation has two arms:

1. Data gathering
2. Contextualizing results

Some definitions…

Process evaluation:

Addresses questions about how well the program is functioning
Is useful for diagnosing outcomes
Is critical in quality improvement

Some definitions…

Key questions in process evaluation:

Who is served?
What activities or services are provided?
Where is the program held?
When and how long?

Some definitions…

Outcome evaluation:

Gauges the extent to which a program produces the intended improvements it addresses
Addresses effectiveness, goal attainment and unintended outcomes
Is critical in quality improvement

Some definitions…

Key questions in outcome evaluation:

To what degree did the desired change(s) occur?
Outcomes can be initial, intermediate or longer-term
Outcomes can be measured at the patient, provider, organization or system level

Some definitions…

“Impact” is sometimes used synonymously with “outcome.” Impact is perhaps better defined as a longer-term outcome. For clinical training programs, impacts may be improved patient outcomes.

Some definitions…

Indicators or measures are the observable and measurable data used to track a program’s progress in achieving its goals. Monitoring (program or outcome monitoring, for example) refers to ongoing measurement activity.

Some definitions…

CQI/QM: HAB’s definition of quality: the degree to which a health or social service meets or exceeds established professional standards and user expectations.

Some tools for planning…

United Way’s Outcomes Manual
HRSA’s Quality Improvement Handbook
W.K. Kellogg Foundation Evaluation Handbook

Some methods…

Identify some quantitative methods
Identify some qualitative methods
Which is best, qualitative or quantitative?

A framework for AETC evaluation: Kirkpatrick

Measure Reaction
Measure Learning
Measure Behavior
Measure Results

Identify some ways to measure each in an AETC training setting

Application to Your Program:

Identify Program Goals

For each goal:
Identify Process Objectives
Identify Outcome Objectives

For each objective:
Identify Indicators
Identify Data Source
Plan Data Collection
Plan Data Analysis

Exercise:

Goal 1:

For each objective, fill in: Indicator, Data Source, Data Collection, Data Analysis

Process Objective 1:

Outcome Objective 1:

Recommended Reading

Definitely: Kirkpatrick; Cruz; the American Journal of Evaluation (the journal of the American Evaluation Association)

If you have time: Chelimsky; Rossi

If you want a good read: Hunt

Handouts

Table 1.1 from Chapter 1 in Chelimsky: The Coming Transformations in Evaluation

Exhibit 2-P: An Australian Team’s Ten-Step Approach to Program Evaluation (p. 75 in Rossi et al.)

Chapters 1-3 in Kirkpatrick
American Evaluation Association Guiding Principles for Evaluators
Isenberg’s CQI Program Handout