
Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Page 1: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties

Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties

AEA Evaluation 2006, “The Consequences of Evaluation”
RTD TIG, Think Tank Session
Portland, Oregon
November 4, 2006

Rosalie T. Ruegg
Managing Director, TIA Consulting, Inc.
[email protected]

Connie K.N. Chang
Research Director, Technology Administration, U.S. Department of Commerce
[email protected]

Page 2: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


The 4th in a Series of Think Tanks on Barriers to Evaluation

2003: Identification of 6 Types of Barriers to Evaluation
2004: Focus on Institutional and Cultural Barriers—Feedback Loops
2005: Focus on Methodological Barriers
2006: Focus on Data and Measurement Difficulties

Page 3: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Overview: Six Categories of Barriers Identified -- 2003

1. Institutional/cultural -- 2004
2. Methodological -- 2005
3. Resources
4. Communications
5. Data/measurement -- 2006
6. Conflicting stakeholder agendas

Page 4: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


The 2003 Think Tank found … a striking commonality of evaluation barriers among programs and across countries.

Page 5: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


These barriers were said to impede …
- Demand for evaluation
- Planning and conducting evaluation
- Understanding of evaluation studies
- Acceptance and interpretation of findings
- Use of results to inform program management, budgetary decisions, and public policy

Page 6: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


2006 Think Tank focus – data and measurement difficulties

Data difficulties
- Trail gone cold
- Missing data
- Data quality
- Other?

Measurement difficulties
- Incommensurable effects
- Forecasts for prospective analysis
- Aggregation across studies
- Inadequate treatment of uncertainty and risk
- Accounting for additionality and defender technologies
- Other?

Page 7: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Data difficulties
- Trail gone cold
- Missing data
- Data quality
- Other?

Page 8: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Trail gone cold

With long time gaps between research, results, and evaluation, evaluators find …
- Memory lapses
- “Over the transom” effect with trail broken
- Mixture of funding sources
- Distinctiveness lost with technology integrations
- Departure of key employees
- Acquisition, merger, or death of companies
- Other? (Generated by Think Tank discussion)
  - Use of a financial instrument that does not have a legal requirement or dedicated budget to stimulate reporting/cooperation with evaluators
  - Reliance on a partner to report without regard to capability
  - Mechanics of surveying may be a problem

Page 9: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with trail gone cold

Your thoughts? (Generated by Think Tank discussion)
- Build in a requirement to report
- Be proactive in conducting surveys a few years post project (e.g., Tekes surveys 3 yrs out; 67% response rate; no big changes between 3 yrs and 5 yrs out)
- Incentives for high response rates are key, because 5% of projects account for 95% of economic gains, which random sampling would miss

Page 10: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with trail gone cold—Overview

Third best: Conduct “archeological digs”
Second best: Use clues to focus “detective work”
Best: Be proactive; track and document data in real time

Page 11: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Missing data
- Data collection is spotty
- Responses are incomplete
- Files are corrupted
- Not all paper records have been converted to electronic files
- Other?

Page 12: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with missing data

Your thoughts? (Generated by Think Tank discussion)
- Combine data from several sources to fill gaps.
- If questions are too difficult to respond to – unreasonable or confidential – you end up with missing data; so do trial testing before survey launch.
- If the program is too young to have data, use proxy data; e.g., ATP used Japanese data to test a firm productivity model, and later ran it with ATP data and got similar results.
- Explore statistical techniques to impute missing data (see the sketch after this list).
- Use security measures to ensure staff are not taking or corrupting data.
- Look for patterns of missing data.
- Look for biasing effects in data collection.
- Check for errors in transcribing data from paper to electronic records (e.g., a check of one survey found it had tabulated 200 pregnant men!).
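Imputation can be illustrated with a small, hedged sketch. The example below assumes hypothetical survey columns ("sector", "employees", "revenue") in a pandas DataFrame and fills numeric gaps with group medians; a real study would first examine the missingness pattern and test how sensitive results are to the imputation choice.

```python
# Minimal imputation sketch; column names and data are illustrative placeholders.
import pandas as pd

def impute_numeric(df: pd.DataFrame, cols=("employees", "revenue")) -> pd.DataFrame:
    """Fill numeric gaps with the median of each sector group,
    falling back to the overall column median if a whole group is missing."""
    out = df.copy()
    for col in cols:
        out[col] = out.groupby("sector")[col].transform(lambda s: s.fillna(s.median()))
        out[col] = out[col].fillna(out[col].median())
    return out

if __name__ == "__main__":
    data = pd.DataFrame({
        "sector": ["bio", "bio", "it", "it", "it"],
        "employees": [12, None, 40, 35, None],
        "revenue": [1.5, 2.0, None, 8.0, 7.5],
    })
    print(impute_numeric(data))
```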

Page 13: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with missing data—Overview
- Prevention is the best approach: implement sound data collection
- Find the missing data using multiple strategies
- Use proxy data
- Use techniques for dealing with partial data
- Use techniques to impute data

Forthcoming book: Missing Data by Patrick McKnight and Katherine McKnight (George Mason U); Souraya Sidani (U of Toronto); and Aurelio Jose Figueredo (U of Arizona)

Page 14: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Data quality issues
- Are the data valid for the intended use?
- Other? (Generated by Think Tank discussion)
  - Source is key to data quality (i.e., are we asking the right people; are they motivated to answer truthfully?)
  - Are data used for making comparisons actually comparable?
  - Has a thorough definitional effort been conducted prior to data collection to ensure the right data are collected?

Page 15: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with data quality issues

Your thoughts? (Generated by Think Tank discussion)
- Think through up front and scope out what you want to collect and how you will collect it (e.g., NEDO thought through data collection; at the same time, flexibility is important – to avoid locking in too early and to allow for adjusting/modifying/correcting – i.e., to support an iterative process in data design)
- Avoid overly complex data collection instruments
- Check data entry to detect human error
- Provide means of calibrating answers and spotting outliers (see the sketch after this list)
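As a concrete illustration of spotting outliers, the sketch below flags values outside the Tukey fences (1.5 x IQR beyond the quartiles). The column of reported values is a made-up example; flagged entries would be sent back for review or calibration rather than dropped automatically.

```python
# Minimal outlier-screening sketch; the data are illustrative placeholders.
import pandas as pd

def flag_outliers(values: pd.Series, k: float = 1.5) -> pd.Series:
    """Boolean mask marking values outside the Tukey fences (k * IQR past Q1/Q3)."""
    q1, q3 = values.quantile([0.25, 0.75])
    iqr = q3 - q1
    return (values < q1 - k * iqr) | (values > q3 + k * iqr)

if __name__ == "__main__":
    reported_revenue = pd.Series([1.2, 0.9, 1.5, 1.1, 48.0, 1.3])  # $M, hypothetical
    print(flag_outliers(reported_revenue))  # flags the 48.0 entry for follow-up
```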

Page 16: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with data quality issues—Overview
- Understand what is required by detailed up-front exploration
- Assess reliability of data processes
- Use data quality assurance tools
- Monitor data quality over time
- “Clean” data
- Standardize data to conform to quality rules
- Verify calculations

[Possible sources: International Association for Information and Data Quality (IAIDQ); Data Management Association (DAMA)]

Page 17: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Measurement difficulties
- Incommensurable effects
- Forecasts for prospective analysis
- Aggregation -- across studies, projects, levels, etc.
- Inadequate treatment of uncertainty and risk
- Accounting for additionality and defender technologies
- Other?
  - Instrumentation: who does the measurement? How does calibration occur?
  - The problem of attribution is profound; scientists argue over it (discovery, use), and the problem increases exponentially closer to commercialization (lack of acknowledgement of the significance of competitors’ work). There is also reluctance to give government credit (e.g., companies don’t want the government to recoup, and want to deny that government support helped create today’s success).
  - Double counting
  - Difficulties in measuring innovation

Page 18: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Incommensurable effects

Presenting effects measured in different units in a single study:
- Knowledge – # papers, patents
- Economic – $
- Environmental – level of emissions
- Safety – # accidents
- Employment – # jobs
- Energy security – barrels of oil imports
- …

Page 19: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with incommensurables—some ideas
- In some cases, you can express different effects in a common measure (i.e., make them commensurable)
- In other cases, decision makers may want effects to be expressed separately in their own units, in which case:
  - the decision maker must make trade-offs subjectively, or
  - the evaluator weights and combines the different effects using index values that are easier to compare, e.g., ATP’s Composite Performance Rating System (CPRS) (a weighted-index sketch follows this list)
- Or, other?
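To make the weighting-and-combining idea concrete, here is a minimal sketch of a normalize-then-weight index. The effect names, scales, and weights are invented for illustration; this is not ATP’s actual CPRS formula.

```python
# Minimal composite-index sketch; effects, scales, and weights are illustrative,
# not ATP's actual Composite Performance Rating System.

def normalize(value: float, low: float, high: float) -> float:
    """Rescale a raw effect onto a common 0-1 scale."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def composite_score(effects: dict, scales: dict, weights: dict) -> float:
    """Weighted sum of normalized effects; weights are assumed to sum to 1."""
    return sum(weights[k] * normalize(effects[k], *scales[k]) for k in effects)

if __name__ == "__main__":
    effects = {"papers": 14, "jobs": 120, "emissions_cut_pct": 8.0}
    scales = {"papers": (0, 50), "jobs": (0, 500), "emissions_cut_pct": (0, 20)}
    weights = {"papers": 0.3, "jobs": 0.4, "emissions_cut_pct": 0.3}
    print(f"Composite score: {composite_score(effects, scales, weights):.2f}")
```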

Page 20: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Forecasts for prospective analysis

More uncertainties compared with ex-post analysis:
- technical uncertainties
- resource uncertainties
- market uncertainties (market size, timing, speed of commercialization)
- other …

Other?

Page 21: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with forecasts for prospective analysis—some ideas
- Build uncertainty into the estimations
- Consider all important alternatives:
  - different levels of technical success
  - different market applications
- Revise the forecast as additional information becomes available (see the scenario sketch below)
- Other?
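One simple way to build uncertainty into a prospective estimate is to enumerate scenarios (level of technical success, market application) with probabilities and compute a probability-weighted benefit, then re-run the calculation as information improves. The scenarios and dollar figures below are invented for illustration.

```python
# Minimal scenario-weighting sketch; probabilities and benefits are illustrative.

scenarios = [
    # (description, probability, present-value benefit in $M)
    ("full technical success, primary market", 0.25, 120.0),
    ("full technical success, niche market only", 0.35, 40.0),
    ("partial technical success", 0.30, 10.0),
    ("technical failure", 0.10, 0.0),
]

def expected_benefit(rows) -> float:
    assert abs(sum(p for _, p, _ in rows) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * b for _, p, b in rows)

if __name__ == "__main__":
    print(f"Expected benefit: ${expected_benefit(scenarios):.1f}M")
    # As new information arrives, revise the probabilities/benefits and re-run.
```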

Page 22: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Aggregation across studies
- Different base years
- Different time periods
- Different methods
- Differences in underlying assumptions, models, algorithms
- Different types of effects measured
- Other?

Page 23: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with aggregation across studies—some ideas
- Best: reduce incompatibility through standardization, and require transparency and replicability
- Where there is internal consistency, combine common measures across studies (a base-year adjustment sketch follows this list)
- In place of aggregation, summarize across studies in terms of a single measure (e.g., a table of IRRs), at your own risk!
- Other?
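Before combining dollar measures from studies with different base years, one common first step is to restate them in a single reference year with a price index. The index values and study figures below are placeholders, not actual deflator data.

```python
# Minimal base-year restatement sketch; index values and figures are placeholders.

price_index = {2000: 0.87, 2002: 0.90, 2004: 0.95, 2006: 1.00}  # reference year: 2006

def to_reference_year(amount_m: float, base_year: int, ref_year: int = 2006) -> float:
    """Restate a dollar amount from its base year into the reference year."""
    return amount_m * price_index[ref_year] / price_index[base_year]

studies = [("Study A", 2000, 25.0), ("Study B", 2004, 40.0)]  # ($M, base-year dollars)

if __name__ == "__main__":
    for name, year, benefit in studies:
        print(f"{name}: {to_reference_year(benefit, year):.1f} $M in 2006 dollars")
```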

Page 24: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Inadequate treatment of uncertainty and risk, leading to --
- Overstatement of results
- Unrealistic expectations
- Faulty decisions
- Other?

Page 25: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with inadequate treatment of uncertainty and risk

Use techniques such as the following (a simulation sketch follows this list):
- Sensitivity analysis
- Statistical tests of variation (e.g., confidence intervals)
- Expected value analysis
- Decision trees
- Risk-adjusted discount rates
- Certainty equivalent technique
- Computer simulations using random draws across a range of values
- Other?
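The last item in the list, simulation using random draws, can be illustrated with a small Monte Carlo run over uncertain inputs. The distributions and parameters below are made up; a real analysis would fit them to evidence and report the full spread of outcomes, not just the mean.

```python
# Minimal Monte Carlo sketch: propagate input uncertainty into net benefit by
# drawing repeatedly across plausible ranges. Parameters are illustrative only.
import random
import statistics

def simulate_net_benefit(n_draws: int = 10_000, seed: int = 42) -> list:
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        benefit = rng.triangular(20.0, 150.0, 60.0)  # $M: low, high, mode
        cost = rng.uniform(30.0, 50.0)               # $M
        draws.append(benefit - cost)
    return draws

if __name__ == "__main__":
    draws = sorted(simulate_net_benefit())
    p5, p95 = draws[len(draws) // 20], draws[-(len(draws) // 20)]
    print(f"Mean net benefit: {statistics.mean(draws):.1f} $M")
    print(f"Approximate 5th-95th percentile range: {p5:.1f} to {p95:.1f} $M")
```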

Page 26: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Difficulty: Accounting for additionality and defender technology
- Incorrectly attributing all observed changes to a program’s effect
- Partial identification or double counting of additionality effects
- Problems in defining reference groups for control studies
- Recognizing limitations of counterfactual questions (i.e., non-experimental design)
- Ignoring or incorrectly modeling the defender technology
- Other?

Page 27: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with additionality and defender technology—some ideas
- Always consider effects with and without the program (e.g., counterfactuals, before/after comparisons, control groups)
- Break out additionality effects into component parts

Page 28: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Dealing with additionality and defender technologies, cont’d

Systematic comparison (diagram): a “Program--yes” branch and a “Program--no” branch, each with alternative success rates (A, B, C vs. D) and defender-technology improvement rates (X, Y), weighted by scenario probabilities of 10%, 50%, and 40% -- “Dynamic modelling of defender tech”. An illustrative calculation follows.
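The diagram’s structure can be approximated in code as a probability-weighted comparison of outcomes with and without the program, where the no-program branch still improves at the defender technology’s own rate. All rates and probabilities below are placeholders, not the values behind the slide.

```python
# Minimal with/without-program sketch; rates and probabilities are placeholders.

scenarios = [
    # (probability, outcome rate with program, outcome rate without program,
    #  i.e., the defender technology improving on its own)
    (0.10, 0.90, 0.40),
    (0.50, 0.60, 0.30),
    (0.40, 0.30, 0.20),
]

def expected_incremental_effect(rows) -> float:
    """Expected difference between the with-program and without-program branches."""
    assert abs(sum(p for p, _, _ in rows) - 1.0) < 1e-9
    return sum(p * (with_prog - without_prog) for p, with_prog, without_prog in rows)

if __name__ == "__main__":
    print(f"Expected additional effect: {expected_incremental_effect(scenarios):.2f}")
```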

Page 29: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


2006 Think Tank focus – data and measurement difficulties

Data difficulties
- Trail gone cold
- Missing data
- Data quality
- Other?

Measurement difficulties
- Incommensurable effects
- Forecasts for prospective analysis
- Aggregation across studies
- Inadequate treatment of uncertainty and risk
- Accounting for additionality and defender technologies
- Other?

Page 30: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Summary: Six Categories of Barriers Identified -- 2003

1. Institutional/cultural -- 2004
2. Methodological -- 2005
3. Resources
4. Communications
5. Measurement/data -- 2006
6. Conflicting stakeholder agendas

Nov 2007 … what’s next?

Page 31: Meeting an Evaluation Challenge: Identifying and Overcoming Data and Measurement Difficulties


Contact information

Rosalie T. Ruegg
Managing Director, TIA Consulting, Inc.
[email protected]

Connie K.N. Chang
Research Director, Technology Administration, U.S. Department of Commerce
[email protected]