Relationships between Involvement and Use in the Context of Multi-site Evaluation

American Evaluation Association Conference

November 12, 2009

Beyond Evaluation Use

• Four-year NSF grant to study the relationships between involvement in program evaluation and use/influence

• Research team (2 co-PIs and 8 graduate students) based at the University of Minnesota

• Context of four NSF-funded multi-site programs
• Involvement and use by users who were not directly intended (unintended users)

Framework for Involvement

• Cousins and Whitmore’s (1998) Systematic Collaborative Inquiry
  – Control of the Evaluation
  – Stakeholder Selection
  – Depth of Participation

• Burke’s (1998) Key Decision Points
  – Evaluation Stages
  – Activities
  – Levels of Control

Framework for Use

Type | Use For | Definition: The Use of Knowledge . . .
Instrumental | Action | . . . for making decisions
Conceptual or Enlightenment | Understanding | . . . to better understand a program or policy
Political, Persuasive, or Symbolic | Justification | . . . to support a decision someone has already made or to persuade others to hold a specific opinion

Framework for Use and Influence

Term | Definition
Evaluation use | The purposeful application of evaluation processes, findings, or knowledge to produce an effect
Influence ON evaluation | The capacity of an individual to produce effects on an evaluation by direct or indirect means
Influence OF evaluation | The capacity or power of evaluation to produce effects on others by intangible or indirect means

(from Kirkhart, 2000)

More Recent Developments

• Kirkhart (2000)
  – Evaluation influence = the capacity of persons or things to produce effects on others by intangible or indirect means
  – Maps influence along three dimensions: source, intention, and time

• Mark & Henry (2003); Henry & Mark (2004)
  – Intangible influence on individuals, programs, and communities
  – A focus on direct use of evaluation results or processes is not adequate

“Beyond Evaluation Use” NSF Programs

Name of Program | Years of Evaluations
Local Systemic Change through Teacher Enhancement (LSC) | 1995 – present
Advanced Technological Education (ATE) | 1998 – 2005
Collaboratives for Excellence in Teacher Preparation (CETP) | 1999 – 2005
Building Evaluation Capacity of STEM Projects: Math Science Partnership Research Evaluation and Technical Assistance Project (MSP-RETA) | 2002 – present

Four Programs and their Evaluations

• ATE: Advanced Technological Education—mainly community college–level projects to enhance the workforce—evaluation included site visits and a yearly survey

• LSC: Local Systemic Change—professional development for STEM in K-12 school districts—evaluation included observations, interviews, and surveys

Four Programs and Their Evaluations (cont.)

• CETP: Collaboratives for Excellence in Teacher Preparation—projects to improve STEM teacher education—evaluation included surveys and observations

• MSP-RETA: Math Science Partnerships, Research Evaluation and Technical Assistance—evaluation technical assistance included national meetings and provision of consultants

Methods

• Surveys of project PIs and evaluators in the four projects (645 respondents, 46% response rate)
• Document review
• Interviews with key informant project personnel (29)
• Citation analysis (246 documents, 376 citations)
• Survey of NSF PIs (191 respondents, 54.7% response rate)
• In-depth analytic case studies

Results

• Perception of Evaluation Quality
  – Ability to conduct high-quality evaluation
  – Be recognized as capable
• Interface with NSF
  – Evaluators as brokers and negotiators
  – NSF leveraging involvement and use
  – Importance of dissemination
• Life Cycles
  – Program
  – Projects
  – Individuals

Results

• Project Control
  – Complete choice
  – Required involvement
  – Balance affects use

• Community and Networking
  – Outreach
  – Development of a community of practice
  – Mutual respect
  – Skill sharing
  – Process use

Results

• Tensions
  – Where best to spend time and money
  – Balance local and national evaluation
  – Balance project and evaluation goals

• Uniqueness
  – Complex context
  – Individual responses

Implications

• Participants were differentially affected by the depth and breadth of involvement in evaluation activities.

• Neither breadth nor depth was consistently predictive of perceived level of involvement.

• The lack of consistency in perceived involvement and use makes measuring involvement challenging.

• Any investigation is likely to be substantially affected by the nature of the evaluation and the characteristics of the individual.

Limitations

• Only four instances of large, multi-site NSF evaluations were studied, so generalizations to other settings are not possible, although potentialities can be suggested.

• The case studies themselves are based on self-report data along with some archival records.

• The numbers of people surveyed and interviewed are small but appear to be at least representative of the groups included.

• The instruments used for data gathering were developed as part of the project and therefore might not be valid as measures of involvement and use in other contexts.

Future Research

• Research on the causal relationship between involvement and evaluation use

• Themes presented here provide fruitful areas for more investigation

• Cross-case analysis provides a strong baseline for more positivistic research

• Examine the issues raised here through quantitative path analytic procedures (see the sketch after this list)

• Develop strong theories about the relationship between involvement and use that could form the basis for hypothesis formulation
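
To make the path-analytic suggestion concrete, here is a minimal sketch of a simple mediation-style path model, assuming hypothetical variable names (involvement_depth, perceived_involvement, reported_use), illustrative data, and the statsmodels library; it illustrates the general technique only and is not an analysis from this study.

```python
# Minimal mediation-style path sketch (illustrative only; variables and data are assumed).
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical coded survey responses
df = pd.DataFrame({
    "involvement_depth":     [1, 2, 3, 4, 2, 5, 3, 4, 1, 5],
    "perceived_involvement": [2, 2, 4, 4, 3, 5, 3, 5, 1, 4],
    "reported_use":          [1, 3, 4, 5, 2, 5, 4, 4, 2, 5],
})

# Path a: depth of involvement -> perceived involvement
path_a = smf.ols("perceived_involvement ~ involvement_depth", data=df).fit()

# Paths b and c': perceived involvement and depth of involvement -> reported use
path_bc = smf.ols("reported_use ~ perceived_involvement + involvement_depth", data=df).fit()

a = path_a.params["involvement_depth"]         # effect of depth on perceived involvement
b = path_bc.params["perceived_involvement"]    # effect of perceived involvement on use
c_prime = path_bc.params["involvement_depth"]  # direct effect of depth on use

print(f"indirect (mediated) effect a*b = {a * b:.3f}; direct effect c' = {c_prime:.3f}")
```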

Note

This material is based upon work supported by the National Science Foundation under Grant No. REC 0438545. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

For Further Information

Online: http://cehd.umn.edu/projects/beu/default.html

E-mail: Lawrenz@umn.edu

Research Team:
– Dr. Frances Lawrenz
– Dr. Jean A. King
– Dr. Stacie Toal
– Kelli Johnson
– Denise Roseland
