Building an Information Community: IT and Research Working Together
Responsive Evaluation in the Community College: An Alternative Approach to Evaluating Programs
Nathan R. Durdella, PhD
Monterey, California
April 10, 2006




Page 1:

Building an Information Community: IT and Research Working Together

Responsive Evaluation in the Community College: An Alternative Approach to Evaluating Programs

Nathan R. Durdella, PhD
Monterey, California
April 10, 2006

Page 2:

Presentation Overview

• Background, Design, & Methods
• Results: Project HOPE & MESA
• Findings & Conclusions

Page 3:

Background, Design, & Methods

Page 4:

Research Context and Problem

• Increasing institutional, accreditation requirements to document student outcomes

• Dominant model: systematic evaluation (Rossi, 1993)
  – Program objectives, outcomes
• Alternative evaluation models
  – Recently used successfully (Shapiro, 1988)
  – Responsive evaluation

Page 5:

Evaluation Models: Systematic vs. Responsive Evaluation

• Stake’s problem with systematic evaluation:
  – Systematic evaluation’s narrow focus on assessing a program’s goals, measurements, and standards (Shadish et al., 1991)
  – Systematic evaluations best suited for summative evaluations
• Responsive evaluation’s focus:
  – The primary purpose should be “to respond to audience requirements for information” (Guba, 1978, p. 34)
  – Process-oriented issues
    • Program implementation
  – Stakeholder-based
    • Locally generated criteria

Page 6:

Stake’s Responsive Evaluation

• Responsive Evaluation’s prescriptive steps:

1. Program staff/participants “are identified and solicited for those claims” (Guba & Lincoln, 1989, p. 42)

2. Issues of program staff and participants are organized and brought to staff members for comment

3. Issues not resolved are used as “organizers for information collection” (Guba & Lincoln, 1989, p. 42)

4. The evaluator approaches each audience member with the evaluation results to resolve all issues

Page 7:

Research Questions

• Two research questions:
  1. How effectively does responsive evaluation theory work as a way to evaluate instructional support programs?
  2. How does responsive evaluation articulate with systematic evaluation approaches?

Page 8:

Research Design and Methods

• Design: Comparative, qualitative case study

• Data sources and sampling:
  – Interviews and journals
  – 2-step procedure: purposeful and random
• Case selection
  – Institutions: Cerritos College & Santa Ana College (both Hispanic-Serving Institutions)
  – Programs: Project HOPE & MESA
• Data collection
  – Interviews: 19 total subjects, 23 total interviews
    • Per program: 3 students, 2 staff, 2 faculty, and 2-3 administrators
    • Program directors were interviewed 3 times each
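(A quick consistency check on these counts, assuming, as the slide implies, that the two program directors are counted among the subjects above and that one site had 2 administrators and the other 3:)

$$
2 \times (3 + 2 + 2) + (2 + 3) = 19 \ \text{subjects}, \qquad
19 + 2 \times (3 - 1) = 23 \ \text{interviews}
$$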

Page 9:

Results: Project HOPE & MESA

Page 10:

Results: Project HOPE

1. Faculty resisted cultural pedagogy

Project HOPE faculty:
“It’s a method of learning where you would approach your teaching looking at culture.”
“They don’t feel like it would have any impact on their students.”

Faculty and administrators:
“We need to serve all of our students equitably.”
“Well, we’re not really a minority any more.”

2. Campus did not value Project HOPE

Project HOPE staff:
“There are issues of, I’d say, with respect to this program and the college in general about the value of it, the need for it because I think there’s a prevailing thought that we do already all we can for students of color just by default because we have such a diverse student population to have programs like these.”

Page 11:

Results: Project HOPE

• Guidance counseling
  “Well now I know exactly what am I supposed to be taking for every, every semester and everything.”
• Parent, family participation
  “[My mom] was telling my dad, ‘We have to do our taxes because they have to file.’ So now she knows what we’re talking about when we have to do our financial aid paperwork.”
• Health Occupations 100 as central
  “I definitely know I want to stay in L.A. and really serve those communities in need.”
• Program communication, coordination
  “There was nothing said or nothing exchanged.”
• Lack of faculty buy-in, participation
  “The only things I ever hear is why aren’t we part of this.”

Page 12:

Results: MESA Program

• MESA staff: central to students
  “I know you really want to go, call me. If you can’t make it, call me. If you can’t come to class, tell me why. If you think you’re doing bad in class, just talk to me. We can work something out.”
• Major issue: program impact
  – In general, MESA students outperform other SAC math/science students
• Successful program coordination
  “We have an organized system.”

Page 13:

Results: MESA Program

• Other emerging themes:
  – Student finances: book loans & more
    “I then use the money I saved to attend events sponsored by the Transfer Center.”
  – MESA Study Center
    “The MESA Study Center is a good place if one wants to share a friend’s company and eat lunch while one studies.”
  – Program focus: no parent participation
    “A big obstacle for me as well was the lack of information available to my parents.”
  – Course scheduling, engineering
    “These classes are not offered every semester.”

Page 14:

Findings & Conclusions

Page 15:

Findings: Responsive Evaluation

Responsive evaluation works best with:

• Ongoing programs, categorically funded or institutionalized

• Program staff: cooperation, participation

• Programs: challenges, underlying problems

• Program processes, improvement
• Programmatic or institutional need
  – Not solely program impact

Page 16:

Further Findings: Responsive Evaluation

• Politically charged context
• Personality and power conflicts
  – Project HOPE: preexisting conflicts
  – MESA: UC-affiliated, well-established programs
• Responsiveness: no assurance the model responds to all stakeholders
  – Identification, development of issues

Page 17:

Findings: Responsive & Systematic Models

• Models articulate well
  – Project HOPE: prior evaluations vs. responsive evaluation
  – MESA: program impact
• Results meaningful
  – Project HOPE: new “face”
    • But: reinforce perceptions
  – MESA: few surprises but useful
    • Student voices

Page 18:

Findings: Responsive Evaluator

• Balance between encouraging participation and maintaining control
  – Stakeholder-based models

• Initial phases: conditions present to conduct evaluation

• Key: understanding programs as an insider while maintaining checks

• Presentation of results: critical

Page 19:

Conclusions: Responsive Evaluation in the Community College

• Institutional charge: respond to students, faculty, staff, stakeholders
• Responsive evaluation: powerful tool for community college programs
• Community colleges: limited resources
• Research offices: overburdened

Page 20:

Building an Information Community: IT and Research Working Together

Thank you for attending…
Questions or comments?

Nathan R. Durdella, PhD
Cerritos College

[email protected]