Defining and Measuring Success in Technology-based Economic Development
Catherine Searle Renault, Ph.D.
RTI International
Center for Technology Applications
Science, Technology and Economic Growth: A Practicum for States
March 23, 2004
Overview
• Principles of Evaluation and Measurement
• Theory of Technology-based Economic Development
• Indicators
• Ways to Collect Data
• Example – Evaluation of Maine’s Public Investment in R&D
• Challenges and Opportunities
• Lessons Learned
Why Measure and Evaluate?
Evaluation is the collection, analysis, interpretation, and communication of information about the effectiveness of programs undertaken for the public good.
• Aids decisions about whether a program should be expanded, continued, improved, or curtailed
• Increases the effectiveness of program management
• Satisfies calls for accountability
• Measures the program’s impact on the core problem
Key Concepts
• Evaluation is a process, not an event.
• Evaluation is for practical use, not to sit on the shelf.
• The questions to be answered are derived from the program itself.
• Compares “what is” with “what would have been” and “what should be.”
• Takes place in a setting where work and programs are ongoing.
Definitions
• Program: a complete initiative, e.g., the Advanced Technology Program
• Project: one interaction with a client, e.g., a single ATP award
• Input: resources used to produce outputs and outcomes
• Outputs: products and services delivered; the completed products of internal activities
• Outcome: an important event, occurrence, or condition outside the program itself
• Intermediate outcome: an important outcome, but not an end in itself
• End outcome: the sought-after result
General Logic Model
INPUTS (Personnel, Facilities, Funding) → OUTPUTS (Activities, Clients Served, Awards Made) → INTERMEDIATE OUTCOMES (Mid-Point Events) → END OUTCOMES (Ending Events)
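The four-stage chain above can be captured in a simple data structure. The sketch below is illustrative only; the class name, fields, and example values are hypothetical, not part of any program described here.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic model; stage names mirror the general model above."""
    inputs: list = field(default_factory=list)                 # personnel, facilities, funding
    outputs: list = field(default_factory=list)                # activities, clients served, awards made
    intermediate_outcomes: list = field(default_factory=list)  # mid-point events
    end_outcomes: list = field(default_factory=list)           # ending events

# Hypothetical example for a research-grant program
model = LogicModel(
    inputs=["program staff", "grant funding"],
    outputs=["awards made", "clients served"],
    intermediate_outcomes=["SBIR awards won by clients"],
    end_outcomes=["high-quality jobs created"],
)
```

Keeping the four stages as separate fields makes it explicit which indicators belong to which stage when data collection is designed later.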
Definitions
• Measurement/Monitoring: what are the outcomes of the program/project?
• Impact Measurement: calculate the economic impact of outcomes
• Program/Project Evaluation: involves causality
Causality and Attribution
• To prove causality, you need four conditions:
— The outcomes exist
— The inputs precede the outcomes
— The inputs, outputs, and outcomes are related
— All other explanations are accounted for
• Attribution is weaker, but easier to prove:
— The outcomes exist
— The inputs precede the outcomes
— The inputs, outputs, and outcomes are related
— Clients say (attribute) that their results are due to the program/project
Principles for Evaluation
• If there is more than one program, establish a consistent approach for all programs
• Ensure clear articulation of goals in terms as concrete as possible
• Be as rigorous as possible in design and analysis to increase validity and credibility, but make tradeoffs reflecting operational issues
• Produce evaluation at the state level as well as data for individual program management
Program Theory-based Evaluation
• Use the theory behind the intervention to design appropriate indicators of intermediate and end outcomes:
— Identify the goals and objectives of the program
— Construct a model of what the program is supposed to accomplish
— Collect data to compare goals, actual observed outcomes, and what would have happened otherwise (i.e., without intervention)
— Analyze and interpret results
Goals and Objectives of Technology-based Economic Development
Improve citizens’ quality of life by:
• Creating and retaining high-quality jobs (defined as higher-paying), generally in technology-based businesses
• Creating and retaining (and in some cases, recruiting) high-quality companies (defined as high-growth, high-paying), generally in technology-based industries
• Improving the stability and/or competitiveness of the local and regional economy through innovation
Logic Model for Technology-Based Economic Development
[Diagram: Research Institutions (Basic Research, Applied Research), supported by Government Funding and Foundation Funding, feed a Technology Transfer Office and, ultimately, the Market Innovation Economy. An R&D-Driven Industry, shaped by Government R&D Grants, Competition, Workforce, Debt & Equity Funding, and the Cost of Doing Business, develops market opportunities.]
Product/Company Life Cycle Model
Basic Research → Applied Research → Product Launch → Enhance Product → Product Maturity (over time t)
Interventions to Build an Innovation Economy
Life-cycle stages: Build Research Capacity → Company Basic/Applied Research → Design for Manufacturing → Product Launch → Enhance Product → Product Maturity

Interventions, by type:
• Technical Assistance: Centers of Excellence, Advanced Manufacturing Centers, MEP, Sea Grant, CSREES
• Business Assistance: incubators, business development, science parks
• Funding: EPSCoR, federal funding (ATP, SBIR, STTR), state research grants, state-sponsored seed funds, SBA loans
Intermediate Indicators
• Researchers:
— S&E graduate students
— Federal R&D grants
— R&D expenditures
— Patents
— Publications
— New sponsored R&D with local companies
• Companies:
— Patents
— Venture capital raised
— SBIR and STTR awards won
— Other federal programs (e.g., ATP) won
— M&A activity
— IPO activity
End Outcome Indicators
• Average annual earnings of employees
• Number of high-technology companies in the state/region
• Number of scientists and engineers employed in the state/region
• Number of company births, especially high-technology
• Percent of revenue from outside the state
• Revenue per employee (productivity)
Collecting Data
Three possible methods; use one or all:
• Annual survey of all recipients of (all) programs
— Use with a control group to assess causality
— Potentially split companies and research institutions
• Indicator data for the state and benchmark states to assess changes in competitiveness
• Case studies to understand detailed trends
Key Decisions for Annual Survey
• Who to survey: the universe of companies and researchers; develop a single list; sample or survey all?
• What is the unit of analysis? Company? Project?
• How frequently: annually? Keep respondents on the list for 5 years
• When to survey: July–August is a good match for government reporting, poor for companies
• When to analyze data and report: driven by state budget cycles
• What methods to use: develop innovative, low-cost methods to collect data (mail and web)
• How to assess causality: establish a control group for statistical comparison
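At its simplest, the control-group comparison is a difference-in-means test between program clients and non-client firms. The sketch below uses Welch's t-statistic; the growth figures are invented for illustration and are not survey data.

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(treated, control):
    """Welch's t-statistic for two independent samples with unequal variances."""
    v1, v2 = stdev(treated) ** 2, stdev(control) ** 2
    n1, n2 = len(treated), len(control)
    return (mean(treated) - mean(control)) / sqrt(v1 / n1 + v2 / n2)

# Hypothetical annual revenue growth (%) for surveyed clients vs. a control group
clients = [12.0, 8.5, 15.0, 10.2, 9.8, 14.1]
controls = [6.0, 7.2, 5.5, 8.0, 6.8, 7.5]
t_stat = welch_t(clients, controls)  # positive values favor the clients
```

A large positive t-statistic suggests the client group outperformed the control group, but it only supports causality if the control firms are genuinely comparable on other dimensions.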
Issues for Indicator Analysis
• Linkage with Innovation Index activities:
— Same or related indicators?
— Degree of analysis
— Sources that will be consistent over time
• Data availability: data are not always available by state, region, county, or locality
• Timeliness of data
• What are appropriate comparison states/regions?
Decisions to Make about Case Studies
• How to choose which ones to do
• Who to interview? We suggest program managers, clients, and other stakeholders, e.g., board members, trade associations, related programs
• How to ensure reliability and replicability of data:
— Protocol based on indicators
— Maintain a database
— Consistency of process
Analysis and Interpretation
• Be descriptive
• Note trends
— Especially useful to note a benchmark year, e.g., before the beginning of the program
• Normalize by population, gross state product, etc.
• Graph for easy interpretation
• Acknowledge limitations of the data and research design
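Normalizing by population or gross state product takes only a few lines. The state names and figures below are invented placeholders, not data from any evaluation.

```python
# Hypothetical raw indicator values: R&D expenditures ($M) and population (millions)
raw = {
    "State A": {"rd_expenditures_m": 250.0, "population_m": 1.3},
    "State B": {"rd_expenditures_m": 900.0, "population_m": 4.1},
}

# Per-capita R&D spending makes states of different sizes directly comparable
per_capita = {
    state: v["rd_expenditures_m"] / v["population_m"]
    for state, v in raw.items()
}
```

Here the smaller state's raw total is far lower, but normalization shows whether its spending intensity is actually competitive.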
Example – Maine Evaluation of Public Investments in R&D
Context of Evaluation:
• Maine has substantially increased its ongoing investments in R&D, starting in 1996
• Evaluation legislatively mandated, funded by a “tax” on R&D investments
• Required outside experts to perform the evaluation of all public R&D investments
• An ongoing process:
— Initial evaluation and process design in 2001
— Annual data collection
— Five-year evaluation in 2006
The Three Questions
1. How competitive is Maine’s sponsored R&D and has it improved over time?
2. What is the impact of Maine’s R&D investment on the development of Maine’s R&D industry?
3. What is the impact of Maine’s R&D investment on the level of innovation and innovation-based economic development?
Results Reported for 2003
How competitive is Maine’s sponsored R&D and has it improved over time?
Maine started from a lagging position and is making some gains … but generally just keeping up since other states are also investing heavily. Maine appears to be gaining on other EPSCoR states in SBIR/STTR awards and in venture capital investments.
Results Reported for 2003
What is the impact of Maine’s R&D investment on the development of Maine’s R&D industry?
Maine made sizeable investments in research capacity in the late 1990s and the intermediate outcomes are evident: more faculty, more research equipment and facilities, more proposals submitted, more publications. However, there is little change in intellectual property and joint research with industry or commercial outcomes.
Results Reported for 2003
What is the impact of Maine’s R&D investment on the level of innovation and innovation-based economic development?
The state’s R&D investments are reaching the appropriate targets: the clients are overwhelmingly small R&D companies (fewer than 10 employees, revenues under $1 million, less than five years old).
The companies are reporting better-than-average results in employment growth, revenue growth, per capita income, and productivity.
We detect many elements of causality for gains in SBIR/STTR, intellectual property and venture capital investments.
Challenges and Opportunities
• Faces at the table change constantly over a six-year period
• General distrust of the evaluation process
• Many programs don’t keep good records and/or contacts with past clients
• Research design for technology-based economic development is challenging because of long lead times for outcomes to develop, difficulty in assessing causality, and the lack of good measures for innovation per se
Lessons Learned
• Doing evaluation correctly is not cheap … Surveying, in particular, is time consuming.
• Excellent tool for program management; less effective, but may be required, for accountability
• Credibility is linked to your program’s overall positioning; a good evaluation can help, but not necessarily. A bad evaluation is a bad evaluation.
• The work we are doing in technology-based economic development pays off in the mid- to long-run.