
The Use of Social Network Analysis in Evaluation Design

Brian Zuckerman, Bhavya Lal, Alexis Wilson, Nathaniel Towery

Science and Technology Policy Institute

American Evaluation Association

October 27, 2005


Overview

• Using Social Network Analysis (SNA) in evaluation design

• SNA for sampling
• SNA for pipeline evaluation design
• Tentative conclusions


Using Social Network Analysis in Evaluation Design

• Social network analysis is increasingly being used as a tool for evaluating program outcomes
– Interdisciplinarity, collaboration, partnerships

• The features that make SNA an effective outcome evaluation tool are also valuable in designing evaluations
– Sampling for surveys, interviews, and site visits
– Visualization of networks for pipeline evaluations


SNA for Sampling

• Evaluations of R&D programs often require data collection from individual scientists/grants
– Surveys, interviews, site visits

• The sampling frame is often stratified by demographic characteristics
– Organizational affiliation, department, gender

• Scientists, however, are not atomized; they form “invisible colleges” through working relationships


SNA for Sampling (2)

• Some evaluation designs require controlling for working relationships
– Spread of information through a field or fields
– Programmatic influence that may vary by subfield/subdiscipline

• Using SNA as the basis for stratifying the population may be superior to relying solely on the demographics of individual scientists (see the sketch below)
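
To make the stratification idea concrete, here is a minimal sketch in Python, assuming a co-authorship edge list is available. The toy edges, the networkx/greedy-modularity choice, and the 30% per-stratum fraction are all illustrative assumptions, not the authors' method (their figures were produced in UCINET).

```python
import random

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy co-authorship edge list: each pair has published together.
# A real frame would come from bibliometric data on the field.
edges = [("A", "B"), ("B", "C"), ("C", "A"), ("D", "E"), ("E", "F")]
G = nx.Graph(edges)

# Detect "invisible colleges" via modularity-based community detection.
colleges = greedy_modularity_communities(G)

# Stratify by community: draw roughly 30% of scientists from each
# college so every working group is represented in the survey.
random.seed(42)  # reproducible draws
sample = []
for college in colleges:
    members = sorted(college)
    k = max(1, round(0.3 * len(members)))
    sample.extend(random.sample(members, k))

print(sample)
```

Each detected community becomes a stratum, so the survey reaches every working group rather than only those that dominate a demographic cell.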


Program Evaluation Example: Spread of Methods Through Community

• Formative evaluation partner of an NSF-funded, large-scale Center award

• The Center/program goal is to develop tools and methods that will diffuse throughout the community
– Evaluation design includes longitudinal surveys of scientists to assess use of concepts/tools and the Center’s influence
– Using SNA to visualize the “community” and select the survey sample (a sketch follows)
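
The slides do not show how the co-authorship network behind the "community" is built; one plausible sketch derives ties from publication author lists (the records below are invented placeholders):

```python
from itertools import combinations

import networkx as nx

# Author lists, one per publication; real input would come from a
# bibliographic database covering the Center's field.
papers = [
    ["Smith", "Jones", "Lee"],
    ["Jones", "Garcia"],
    ["Lee", "Garcia", "Chen"],
]

# Connect every pair of co-authors; edge weight counts joint papers.
G = nx.Graph()
for authors in papers:
    for a, b in combinations(sorted(set(authors)), 2):
        prev = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=prev + 1)

print(G.number_of_nodes(), "authors,", G.number_of_edges(), "ties")
```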


Sampling Frame: Demographic Representation

[Network diagram: the sampling frame displayed by demographic strata. Produced in UCINET (Borgatti, S.P., M.G. Everett, and L.C. Freeman. 1999. UCINET 5.0 Version 1.00. Natick: Analytic Technologies).]


Sampling Frame: Representation by Co-Authorship

[Network diagram: the same sampling frame displayed by co-authorship ties. Produced in UCINET (Borgatti, S.P., M.G. Everett, and L.C. Freeman. 1999. UCINET 5.0 Version 1.00. Natick: Analytic Technologies).]


Sampling Frame: Key Nodes Identified

[Network diagram: the co-authorship network with key nodes identified. Produced in UCINET (Borgatti, S.P., M.G. Everett, and L.C. Freeman. 1999. UCINET 5.0 Version 1.00. Natick: Analytic Technologies).]
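
The deck does not specify how the key nodes were identified; degree and betweenness centrality are the standard candidates. A sketch on a stock networkx graph standing in for the real co-authorship frame:

```python
import networkx as nx

# Stand-in for the real co-authorship network.
G = nx.karate_club_graph()

# Two common "key node" measures: degree centrality (how many direct
# collaborators) and betweenness (how often a node bridges others).
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# Flag the top five on each measure for oversampling or interviews.
top_degree = sorted(degree, key=degree.get, reverse=True)[:5]
top_betweenness = sorted(betweenness, key=betweenness.get, reverse=True)[:5]
print("by degree:", top_degree)
print("by betweenness:", top_betweenness)
```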


SNA for Pipeline Evaluation Design

• “Pipeline” evaluation concept
– Programs may be locally optimal but globally suboptimal, leading to “leaks” from the pipeline, because of:
• Poor articulation across programs/pipeline segments
• Misdistribution of resources
– Need instead to assess the contribution of programs/organizations to overall flow through the pipeline rather than the contribution of each program individually

• Network analysis, as a tool for visualizing the pipeline, is the first step in evaluation design (see the sketch below)
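
One way to make the "leak" notion concrete: model programs as nodes tagged with their pipeline stage, add directed edges where one program's outputs feed another, and flag programs with no forward connection. A minimal sketch; the program names, stages, and edges are invented:

```python
import networkx as nx

# Programs tagged by pipeline stage; a directed edge means one
# program's outputs feed into another (e.g., candidates handed off
# from discovery to pre-clinical work).
stages = {"P1": "discovery", "P2": "discovery",
          "P3": "preclinical", "P4": "clinical"}
G = nx.DiGraph()
G.add_nodes_from(stages)
G.add_edges_from([("P1", "P3"), ("P3", "P4")])

order = ["discovery", "preclinical", "clinical"]
# A "leak": a program whose outputs reach no later-stage program.
for node, stage in stages.items():
    later = {n for n, s in stages.items()
             if order.index(s) > order.index(stage)}
    if later and not later & set(G.successors(node)):
        print(f"{node} ({stage}) has no forward connection")
# -> P2 (discovery) has no forward connection
```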


Pipeline Evaluation Example: “Translational” Biomedical Research

• Flow of science from “bench to bedside”

• Assessment of “translational” research by an NIH Institute for strategic planning purposes
– Balance between individual-investigator awards and large-scale Center-like programs
– Design of the future/desired pipeline
• Portfolio of programs
• Relationships between the Institute and other stakeholders
• Policies/regulations/structures to enhance translation

• Past reviews have used portfolio assessment methods to provide quantitative data for evaluation


Examples of Portfolio-Based Analyses the Institute Historically Uses in Reviews

[Matrix figure: portfolio analyses arrayed by award type (individual-investigator awards; large-scale Center-like programs) against pipeline stage (Discovery, Pre-Clinical, Clinical Trials).]

We are using SNA to visualize the entire translational research pipeline.

SNA Allows for Representation of Pipeline

[Figure series (three build slides): network diagrams of programs arrayed by location on the translational research continuum (Discovery, Pre-clinical, Clinical Trials), with “end to end” Centers programs highlighted as spanning the continuum.]
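
A sketch of the stage-aligned layout these build slides suggest: x-coordinates pinned to pipeline stage so the drawing reads left to right along the continuum. networkx/matplotlib and the toy programs are assumptions; the tooling behind the original diagrams is not stated.

```python
import matplotlib.pyplot as plt
import networkx as nx

# Stage index per program: 0 = Discovery, 1 = Pre-clinical,
# 2 = Clinical Trials.
stages = {"P1": 0, "P2": 0, "P3": 1, "P4": 1, "P5": 2}
G = nx.DiGraph([("P1", "P3"), ("P2", "P3"), ("P3", "P5"), ("P4", "P5")])

# Pin x to the stage; spread programs within a stage along y.
pos = {}
for stage in set(stages.values()):
    members = [n for n, s in stages.items() if s == stage]
    for i, n in enumerate(members):
        pos[n] = (stage, i)

nx.draw_networkx(G, pos, node_color="lightsteelblue")
plt.xticks([0, 1, 2], ["Discovery", "Pre-clinical", "Clinical Trials"])
plt.yticks([])
plt.show()
```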


Visualized Networks Will Be Assessed for Productivity and Efficiency

[Figure: the pipeline network arrayed by location on the translational research continuum (Discovery, Pre-clinical, Clinical Trials), to be scored for productivity and efficiency.]
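
The deck does not define its productivity and efficiency measures; plausible network-level stand-ins are density (how tightly programs are linked), connected components (fragmentation into disjoint sub-pipelines), and average shortest-path length (how far results must travel). A sketch on the toy pipeline from the layout example:

```python
import networkx as nx

# Undirected view of the toy pipeline network from the layout sketch.
G = nx.Graph([("P1", "P3"), ("P2", "P3"), ("P3", "P5"), ("P4", "P5")])

print("density:", nx.density(G))
print("components:", nx.number_connected_components(G))
print("avg path length:", nx.average_shortest_path_length(G))
```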


Tentative Conclusions: SNA Methods Can Contribute to Evaluation Design

• Potentially broad applications for sampling
– Evaluations where communication, diffusion of information, or collaboration is a key program element
– Cross-sectional designs that require identification of "key nodes" of research activity
– Growing sophistication of databases and tools is reducing barriers

• More specialized applications for pipeline evaluation
– Assessment of flow across multiple stages/levels
• Large-scale, mission-oriented, vertically integrated R&D organizations (e.g., DoD, NASA, large private companies)
• Evaluation of STEM education/workforce programs
– Requires integration of data across programs and levels; data are generally not collected with pipelines or integration in mind