
Page 1: EVALUATION 101:  HOW CAN WE DEMONSTRATE PROGRAM RESULTS?

EVALUATION 101: HOW CAN WE DEMONSTRATE PROGRAM RESULTS?

Jon E. Burkhardt, Westat

Dr. David J. Bernstein, Westat

Prepared for the National Center on Senior Transportation

Easter Seals / n4a

December 9, 2008

Page 2

Presentation Outline

NCST Project Objectives

The transportation improvement process

Performance measurement and evaluation: different activities, different uses

Applying performance measurement and evaluation to transportation programs

Summary

Contact information

Sources and references

Page 3

NCST Project Objectives

1. Increasing coordination between the aging community and the transportation industry

2. Increasing the family of transportation options for older adults at the local level

3. Ensuring that caregivers are educated regarding transportation options

4. Addressing barriers to implementing more transportation services for older adults

Page 4

The Transportation Improvement Process

Analyze existing conditions: determine local transportation needs and resources

Define community goals, objectives, and evaluation strategies

Confirm working relationships

Design and assess alternative services and strategies

Implement service changes

Evaluate and improve the services

Page 5

Performance measures

Evaluations

Which?

What?

When?

Why?

Page 6

Why All the Fuss?

Page 7

The “What”: Performance Measurement

Performance Measurement: periodic but regular monitoring and reporting of program accomplishments, particularly progress towards pre-established goals

Typical measures: inputs (resources applied to a problem), outputs (numeric measures of program products), outcomes (what changed)

Page 8

The “Why”: Evaluation

Evaluations: systematic studies to assess how well a program is working

Some of the possible components:

The process of implementing the program

Report on program objectives achieved [or not achieved]

Attribution of the results: to the program? to events outside of the program?

Recommendations for improving program results

Page 9

The Differences

Performance measurement

will provide regular data for reports about the progress of a program

can explain what is happening within a program

Evaluations

are more in-depth

help explain why program performance did or did not change

help attribute performance measures to program results

Page 10

Performance Measurement Data

Inputs: Measures of the resources that are applied in providing services

Activities: Measures of the services that are being provided

Outputs: Measures of the quantity of services provided or the quantity of a service meeting quality requirements

Outcomes: Measures that address the intermediate or long-term results of a program on those receiving a service

Page 11

Program Outcome Model

Inputs (resources dedicated to or consumed by the program): money, number of staff, staff time, number of volunteers, volunteer time, facilities, equipment, supplies, other

Activities (what the program does with the inputs to fulfill its mission): more staff and volunteers to provide trips, I & R services, new dispatch systems, driver training, new vehicles

Outputs (the direct products of program activities): more rides, more riders, additional volunteers, greater service span, quicker I & R, shorter wait time

Outcomes (benefits for participants during and after program activities): improved access, greater sense of independence, increased customer satisfaction, increased flexibility, improved communication among providers
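The inputs → activities → outputs → outcomes chain above can be read as a simple record that a program keeps per reporting period. A minimal sketch in Python; the class name, field names, and all sample values are illustrative assumptions, not data from any actual NCST program:

```python
from dataclasses import dataclass, field


@dataclass
class OutcomeModelRecord:
    """One reporting period, following the inputs -> activities -> outputs -> outcomes chain."""
    inputs: dict = field(default_factory=dict)      # resources consumed, e.g. {"money": 50_000}
    activities: list = field(default_factory=list)  # what the program does with the inputs
    outputs: dict = field(default_factory=dict)     # direct products, counted numerically
    outcomes: list = field(default_factory=list)    # benefits observed for participants


# Illustrative record for one quarter (all numbers are made up)
q1 = OutcomeModelRecord(
    inputs={"money": 50_000, "staff": 4, "volunteers": 12},
    activities=["driver training", "I & R services", "new dispatch system"],
    outputs={"rides": 1_200, "riders": 310, "additional_volunteers": 5},
    outcomes=["improved access", "greater sense of independence"],
)
```

Keeping the four columns as separate fields makes the later distinction on this deck explicit: performance measurement reports the outputs, while evaluation asks whether the outcomes can be attributed to the activities.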

Page 12

How to Use Performance Measures

Improve decision making: Ensure that programs are being implemented. Provide input to day-to-day program management and funding decisions, and support strategic planning for services to clients.

Monitor service performance: Track resources, program production, and results, and monitor the need for more comprehensive examination and analysis (service interventions, evaluations).

Report results / be accountable: Provide information to various parties, including staff, funders, service providers, program partners, clients, and other stakeholders.

Page 13

Good Performance Measures

Focus on results

Are relevant and useful to program managers and stakeholders

Are readily measurable and countable

Provide valid, verifiable, and reliable information

Are clear and understandable, requiring only minimal explanation

Can be compared to targets, comparable programs, or legal or quality standards

Page 14

Performance Measurement Questions

1. How many people are being served now compared with before we started?

2. How many trips are provided now? How many were provided previously?

3. Is my program meeting its targets?

4. Is my program efficient? Effective? Cost effective?

5. How does my program measure up against other programs?

Page 15

Evaluation Questions

1. Can the results we observe be attributed to our program?

2. How can we meet the needs of our community?

3. Did our program meet its goals? Why or why not?

4. Did the way we implemented our program influence the results we got (or did not get)?

5. How can we improve our program?

Page 16

Primary Data Collection Decisions

Who will collect what kinds of data?

Which data will be collected at which points in time?

How will the data be used?

Page 17

Potential Evaluation Components

System characteristics: Resources (inputs)

Performance measures: Efficiency (cost / mile, etc.), effectiveness (trips / mile), cost effectiveness ($ / trip)

Service quality: Consumer and management data

Service evaluations: Outcomes and impacts

Page 18

Statistics for Performance Measures

Services delivered: vehicle miles of service, vehicle hours of service

Services consumed: unlinked passenger trips, unduplicated persons served

Fully allocated costs: all costs required to provide transportation service
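These statistics supply the numerators and denominators for the efficiency, effectiveness, and cost-effectiveness ratios defined under Potential Evaluation Components. A minimal sketch in Python; the function name and all figures are made-up illustrations, not real program data:

```python
def performance_measures(fully_allocated_cost, vehicle_miles, passenger_trips):
    """Compute the three common ratios: cost per mile, trips per mile, and cost per trip."""
    return {
        "efficiency_cost_per_mile": fully_allocated_cost / vehicle_miles,
        "effectiveness_trips_per_mile": passenger_trips / vehicle_miles,
        "cost_effectiveness_cost_per_trip": fully_allocated_cost / passenger_trips,
    }


# Illustrative annual figures (assumed for the example)
measures = performance_measures(
    fully_allocated_cost=120_000,  # all costs required to provide the service, in dollars
    vehicle_miles=60_000,          # vehicle miles of service delivered
    passenger_trips=15_000,        # unlinked passenger trips consumed
)
# cost/mile = 2.00, trips/mile = 0.25, cost/trip = 8.00
```

Note that the ratios are only as good as the statistics behind them: using partially allocated costs, or counting linked rather than unlinked trips, would make the resulting measures incomparable across programs.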

Page 19

Service Quality Components

Acceptability: reliability, connections, trust, comfort, respect

Accessibility: can physically use, can get information to use, proximity

Adaptability: flexibility, responds to specific requests, meets trip needs and special needs of clients

Affordability: not excessive money, time, or effort required to travel

Availability: frequency, hours / days / places available

Page 20

An Example of Outcome Evaluation

GOAL: Increase seniors’ knowledge and use of transportation options

EVALUATION STEPS: Do seniors know more about the options?

Have seniors increased their use of options?

Which outreach and education activities have been undertaken?

How do we know that the outreach and education activities are responsible for the changes?

Page 21

Sources of Service Quality Data

Dispatch and driver logs

Records of complaints

User surveys

Page 22

Survey: What’s your objective?

Needs analysis: Ask potential riders [or their advocates] to find out their needs

Customer satisfaction: Ask current transportation users about their satisfaction with services

Volunteer driver assessment: Ask volunteer drivers [and passengers] about their experiences

Service provider perspectives: Ask existing service providers what improvements are needed

Page 23

Survey Considerations

Different strategies required for different groups

How to administer the survey (by mail, phone, internet, in person, or some combination?)

When to administer (before, after, or before and after?)

Pilot test your survey and refine it based on the pilot test results

Page 24

Summary

Monitoring and evaluation should

Be based on valid data and replicable analyses

Support day-to-day program management and operations: expand / contract; continue / change

Help measure progress towards goals and objectives

Indicate potential service and program improvements

Provide accountability to funding sources

Support program continuation

Page 25

Contact Information

WESTAT
1650 Research Blvd
Rockville, Maryland 20854

Jon Burkhardt
Phone: 301/294-2806
Email: [email protected]

David Bernstein
Phone: 301/738-3520
Email: [email protected]

Page 26

Sources and References

Slide 6: United Features Syndicate, 3/13/99

Slides 7-9: U.S. General Accounting Office (April 1998). Performance Measurement and Evaluation: Definitions and Relationships. http://www.gao.gov/archive/1998/gg98026.pdf; and Wholey, J., Hatry, H., and Newcomer, K. (2004). Handbook of Practical Program Evaluation. http://www.josseybass.com/WileyCDA/WileyTitle/productCd-0787967130.html

Slide 10: Governmental Accounting Standards Board [GASB] (1994). Concepts Statement No. 2, Service Efforts and Accomplishments Reporting. http://www.seagov.org/resources/glossary.shtml; and Montgomery County, MD (March 2006). Montgomery Measures Up. http://www.montgomerycountymd.gov/content/omb/FY07/mmurec/howtoread.pdf

Page 27

Sources and References (cont.)

Slide 11: United Way of America (2006). Measuring Program Outcomes: A Practical Approach. http://www.liveunited.org/Outcomes/Resources/MPO/model.cfm

Slide 12: Epstein, P.D. (1988). Using Performance Measurement in Local Government: A Guide to Improving Decisions, Performance, and Accountability. New York: National Civic League Press.

Slide 13: Bernstein, D.J. (2000). Local Government Performance Measurement Use: Assessing System Quality and Effects. Washington, DC: George Washington University. Available from ProQuest-University Microfilms Inc., http://www.umi.com/hp/Products/Dissertations.html

Slides 17-19: Burkhardt, J.E. (2004). Critical Measures of Transit Service Quality in the Eyes of Older Travelers. Transportation Research Record No. 1835, Transportation Research Board, Washington, DC.