
Page 1

Evaluating Technology Programs

Philip Shapira

School of Public Policy

Georgia Institute of Technology, Atlanta, USA

Email: ps25@prism.gatech.edu

Evvy Award: The US MEP Program - A System Model?

Workshop on “Research Assessment: What’s Next,” Airlie House, May 18, 2001

Page 2

Overview

Program structure: Fixed programs with clear goals, simple structures
Evaluation mode: “Classic” goal-activity program evaluation (maybe?)
Evaluation orientation: Justification (WHAT DID PROGRAM DO?)

Program structure: Multi-year programs with multiple goals, changes in goals, multiple stakeholders (practical norm?)
Evaluation mode: Evaluation systems of some complexity: “networked” evaluations (hopefully?)
Evaluation orientation: Justification + learning and improvement (HOW CAN PROGRAM DO BETTER?)

Page 3

Evaluation case: The US Manufacturing Extension Partnership (MEP)

MEP program aims:
“Improve the technological capability, productivity, and competitiveness of small manufacturers.”
“Transform a larger percentage of the Nation’s small manufacturers into high performance enterprises.”

Policy structure: federal-state collaboration
Management: decentralized partnership (70 MEP centers)
Services: 25,000 firms assisted per year (assessments 18%; projects 60%; training 22%)
Revenues, 1999-2000: ~$280m total; federal $98m (35%), state $101m (35%), private $81m (29%)

Page 4

MEP Program Model

[Diagram: program logic model linking Centers, Projects, and Companies to Intermediate Actions, Business Outcomes, and Development Outcomes]

Page 5

MEP Evaluation System

[Diagram: evaluation activities surrounding the MEP Program]
NIST telephone survey of customers of projects, based on center activity data reports
Panel reviews of centers and staff oversight
National Advisory Board review of program
Special studies
Federal oversight (e.g., GAO)
State evaluations
Independent researchers and consultants
3rd-party sponsors

Page 6

Complex Management Context for Evaluation

[Diagram: the MEP program model (Centers, Projects, Companies, Intermediate Actions, Business Outcomes, Development Outcomes) overlaid with management and evaluation instruments: center reviews, needs assessments, activity reporting, customer surveys, center benchmarking, special evaluation studies, center boards, center plans, NIST plans and GPRA goals, and the National Advisory Board]

Page 7

30 MEP Evaluation Studies, 1994-99: Multiple Methods

Number of studies using each method*:
Customer survey: 9
Survey with comparison group: 8
Case study: 5
Benefit-cost: 5
Longitudinal study: 3
Simulation model: 3
Center study: 4
Total: 37

*Some studies used more than one method.
Source: Analysis of 30 empirical evaluation studies of manufacturing extension services conducted between 1994 and 1999. Updated from Youtie and Shapira (1998).

Page 8

30 MEP Evaluation Studies, 1994-99: Varied Performers

Performers and number of studies:
Census Bureau: 2
Cosmos: 2
GA Tech: 7.5
GAO: 1
ITI: 3
MEP: 3.5
Nexus: 5.5
Others: 5.5
Total: 30

Source: Analysis of 30 empirical evaluation studies of manufacturing extension services conducted between 1994 and 1999. Updated from Youtie and Shapira (1998).

MEP revenues, 1994-99: ~$1.0B to $1.2B
Evaluation expenditures: ~$3m to $6m (?), roughly 0.25%-0.50% of revenues
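A minimal arithmetic sketch of how that share range follows, assuming the upper revenue figure (~$1.2B) as the denominator (the choice of denominator is my assumption; using ~$1.0B instead gives roughly 0.3%-0.6%):

\[
\frac{\$3\,\text{m}}{\$1200\,\text{m}} = 0.25\%, \qquad \frac{\$6\,\text{m}}{\$1200\,\text{m}} = 0.50\%
\]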

Page 9

Utility of Evaluative Methods (with schematic ranking, based on GaMEP experience, 1994-2000)

Methods rated against three uses: program justification to the state, program justification to the federal government, and program management and improvement:
Management information system
Client valuation surveys; customer follow-ups
Program impact analysis
Cost-benefit analysis
Longitudinal controlled surveys
Case studies
External reviews

Note: Rankings are schematic, based on experience: 5 = extremely important; 3 = somewhat important; 1 = not important.

Page 10

30 MEP Evaluation Studies, 1994-99: Summary of Key Findings

More than two-thirds of customers act on program recommendations.
Enterprise staff time committed exceeds staff time provided (leverage).
More firms report impacts on knowledge and skills than are able to report hard dollar impacts.
Networked companies using multiple public and private resources have higher value-added than more isolated firms (raises issues of attribution).
Robust studies show skewed results: important impacts for a few customers, moderate impacts for most.
Service mix and duration matter in generating impacts.
Case studies show that management and strategic change in companies is often a factor in high-impact projects.
In comparative studies, there is evidence of improvements in productivity, but these improvements are modest; impact on jobs is mixed.
Cost-benefit analyses show moderate to strongly positive paybacks (a generic benefit-cost ratio is sketched below).
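For reference, the standard form of benefit-cost ratio that such analyses report, shown here as a generic sketch rather than the specific model used in the studies summarized above ($B_t$ and $C_t$ are benefits and costs in year $t$, $r$ the discount rate, $T$ the time horizon):

\[
\mathrm{BCR} = \frac{\sum_{t=0}^{T} B_t/(1+r)^t}{\sum_{t=0}^{T} C_t/(1+r)^t}, \qquad \mathrm{BCR} > 1 \ \text{indicates a positive payback.}
\]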

Page 11

30 MEP Evaluation Studies, 1994-99: Assessment

Advantages:
Multiple methods and perspectives
Encourages methodological innovation
Discursive: findings promote exchange and learning
Can signpost improved practice

Challenges:
Fragmented: many results, some contradictory
Program justification still prime
Variations in quality; reliable measurement often a problem; dissemination
Different evaluation methods are received and valued differently by particular stakeholders
Agency interest in sponsorship may be waning: fear of “non-standard” results

Page 12

Insights from the MEP case (1)

Technology program evaluation should not focus exclusively on narrow economic impacts; it should also assess knowledge transfer and strategic change, and stimulate learning and improvement.
Multiple evaluation methods and performers are key to achieving this goal.
Strong internal dynamic to promote assessment, benchmarking, and discursive evaluation.

Illustrates a “networked evaluation partnership”:
Balancing of federal and state perspectives, with the federal role adding resources and consistency to the evaluation system
Local experimentation is possible and can be assessed
Emergence of an evaluation cadre and culture; development of methodologies
Highly discursive: signposts improved practice
Evaluation becomes a forum to negotiate program direction

Page 13

Insights from the MEP case (2)

Also illustrates threats:
Variations in the robustness, effectiveness, and awareness of the multiple evaluation studies
Oversight “demand” for a complex evaluation system is weakly expressed; GPRA is a “low” hurdle to satisfy
Agency push for “results” and performance measurement (rather than evaluation); fear of non-standard results
Vulnerability to fluctuations in agency will to support independent outside evaluators
Translating evaluation findings into implementable program change is a challenge, especially as the program matures
Threats to the learning mode? Maturation; bureaucratization; expectations of standard results; political support