Using Action Research to Ensure Relevance and Excellence


Dan Friedman, Ph.D.
Director, University 101 Programs

University of South Carolina

Student Success Symposium

Does it work?

Academic achievement – GPA and hours earned

Retention – persistence to second year

Utilization of campus resources (Paul Fidler)

A movement created

87% of institutions have a first-year seminar (FYS)

Padgett & Keup (2011). 2009 National Survey of First-Year Seminars.

High-Impact Educational Practices (AAC&U)

My Focus for Today

Using assessment/action research to demonstrate the value of our programs, and to continually improve what we do by understanding why our programs work and for whom.

FAITH-BASED?

“Estimates of college quality are essentially ‘faith-based,’ insofar as we have little direct evidence of how any given school contributes to students’ learning.”

Richard Hersh (2005). What does college teach? Atlantic Monthly.

Assessment Cycle

1) Identify Outcomes

2) Gather Evidence

3) Interpret Evidence

4) Implement Change

Maki, P. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus Publishing.

Easy Stuff!

Friedman, D. (2012). Assessing the first-year seminar.

Two Types of Assessment

1) Summative – used to make a judgment about the efficacy of a program.

2) Formative – used to provide feedback in order to foster improvement.

The Prescription

Relevance (doing the right things)

Excellence (doing things right)

Astin’s Value-Added I–E–O Model (Astin, A., 1991)

Inputs (Students) → Environment (College) → Outcomes

“outputs must always be evaluated in terms of inputs”
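A minimal sketch of what an I–E–O analysis can look like in practice, assuming a student-level data file; the file and column names (took_u101, pred_gpa, first_year_gpa) are hypothetical, written here in Python with pandas and statsmodels:

# Sketch of an I-E-O style analysis: estimate the effect of an
# environmental variable (E, e.g., taking U101) on an outcome (O,
# e.g., first-year GPA) while controlling for an input (I).
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_data.csv")

# Naive E-O comparison: ignores input differences between the groups.
print(students.groupby("took_u101")["first_year_gpa"].mean())

# Value-added (I-E-O) comparison: the took_u101 coefficient is the
# group difference in outcome holding predicted GPA (the input) constant.
model = smf.ols("first_year_gpa ~ took_u101 + pred_gpa", data=students).fit()
print(model.summary())

The contrast between the two print statements is exactly the point of the model: the naive comparison is the “E–O only” mistake named on the next slide.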

Common Mistakes

1) Just looking at inputs

2) Just looking at environment

3) Just looking at outcomes

4) E–O only (no control for inputs)


Disaggregating the Inputs

What does this tell us?

1-Year Retention by Year, U101 vs. Non-U101 (percentage)

Year      2002  2003  2004  2005  2006  2007  2008  2009  2010  2011
U101      84.5  85.2  83.8  86.2  87.4  88.1  87.6  86.3  86.3  88.0
Non-U101  83.5  84.3  81.3  84.6  85.4  84.3  84.6  85.1  86.8  84.4

What does this tell us?

One-year retention by predicted GPA (PGPA) quintile:

PGPA Quintile   UNIV 101         Non-101          p-value
5 (High)        93.8% (n=497)    94.6% (n=297)    NS
4               90.5% (n=652)    84.4% (n=154)    .02
3               88.3% (n=656)    86.5% (n=163)    NS
2               87.4% (n=657)    79.1% (n=148)    .008
1 (Low)         83.1% (n=628)    70.8% (n=182)    .001
All             88.0% (n=3483)   84.4% (n=1086)   .002
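The slides do not say which test produced these p-values; a two-proportion z-test is one plausible reconstruction of a single row. This sketch rebuilds the counts for the lowest quintile from the percentages above:

# Hedged sketch: two-proportion z-test for quintile 1 (lowest PGPA),
# with retained counts reconstructed from the slide's percentages.
from statsmodels.stats.proportion import proportions_ztest

retained = [round(0.831 * 628), round(0.708 * 182)]  # U101 vs. non-U101
n = [628, 182]

z, p = proportions_ztest(retained, n)
print(f"z = {z:.2f}, p = {p:.4f}")  # in the neighborhood of the slide's p = .001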

Positive Impact on Graduation

5-Year Graduation Rates (2007 Cohort)

U101 (all): 71.1%
Non-101 (all): 64.9%
U101 (lowest ability): 56.9%
Non-101 (lowest ability): 36.9%

Need to Disaggregate

Disaggregate data by input variables (see the sketch after this list):

Predicted GPA (SAT/ACT, high school grades)
Race/Ethnicity
Family income (Pell eligible)
First generation
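A small sketch of this kind of disaggregation, assuming a student-level DataFrame; the file and column names (pred_gpa, took_u101, retained) are hypothetical:

# Bin predicted GPA into quintiles, then cross-tabulate retention
# by quintile and U101 participation. Hypothetical file and columns.
import pandas as pd

students = pd.read_csv("student_data.csv")
students["pgpa_quintile"] = pd.qcut(students["pred_gpa"], 5,
                                    labels=[1, 2, 3, 4, 5])

retention = students.pivot_table(index="pgpa_quintile", columns="took_u101",
                                 values="retained", aggfunc="mean",
                                 observed=True)
print(retention.round(3))  # retention rate per quintile, per group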

Student (I) → College (E) → Outcome (O), with the High-Impact Practice as part of the environment

How to Do It Well

Disaggregating Environmental Variables

Which factors predict persistence?

Used the FYI survey data set merged with variables from the student data file (persistence and GPA); 2,014 responses (72% response rate)

A series of logistic regressions was conducted, controlling for gender, race, and high school grades

A standard deviation increase in Sense of Belonging & Acceptance increased the odds of persisting into the second year by 38% (p < .001), holding all other variables constant.
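A hedged sketch of what one of those regressions could look like, with the predictor standardized so that exp(coefficient) gives the change in odds per one-standard-deviation increase; the file and column names (belonging, persisted, hs_grades) are hypothetical:

# Logistic regression of persistence on a standardized "sense of
# belonging" factor score, controlling for gender, race, and high
# school grades. exp(coef) of about 1.38 would correspond to the
# 38% increase in odds reported above. Names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fyi_merged.csv")
df["belonging_z"] = (df["belonging"] - df["belonging"].mean()) / df["belonging"].std()

model = smf.logit("persisted ~ belonging_z + C(gender) + C(race) + hs_grades",
                  data=df).fit()
print(np.exp(model.params["belonging_z"]))  # odds ratio per SD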

Assessing Educational Methods

Compare methods to determine if one approach is better than another

Continual Improvement

Identifying and replicating best practices

Structural Variables


Fact or Crap?

Student Affairs professionals had higher ratings on overall course effectiveness than other instructors.

No statistically significant differences were found on any of the fifteen FYI factors or course evaluation factors for Division of Student Affairs employees versus non-division employees.

CRAP

Fact or Crap?

Sections that met 3 days a week (MWF) had significantly higher course effectiveness ratings than sections that met twice a week.

CRAP

Overall Course Satisfaction by Days Per Week (data from fall 2011 course evaluations)

                   Mean (n)       p
2 Days Per Week    4.14 (2153)    .286
3 Days Per Week    4.19 (668)
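The slides do not name the test behind these p-values; a two-sample (Welch’s) t-test on the ratings is one plausible choice. A self-contained sketch with simulated ratings sized and centered like the table above (the real per-section scores are not in the slides):

# Welch's t-test comparing mean course-effectiveness ratings for
# 2-day vs. 3-day sections. Ratings are simulated; group sizes and
# means mirror the table, the spread (0.6) is assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratings_2day = rng.normal(4.14, 0.6, size=2153).clip(1, 5)
ratings_3day = rng.normal(4.19, 0.6, size=668).clip(1, 5)

t, p = stats.ttest_ind(ratings_2day, ratings_3day, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")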

Fact or Crap?

Sections with a Peer Leader had significantly higher course effectiveness ratings than sections without a Peer Leader.

FACT

Overall Course Satisfaction by Peer Leader Status (data from fall 2011 course evaluations)

                  Mean (n)       p
Peer Leader       4.22 (1585)    .004
No Peer Leader    4.05 (530)

“You can’t fatten a pig without weighing it.”

Need to use assessment data to drive continual improvement

Contact Information

Dan Friedman
friedman@sc.edu
University 101 Programs
1728 College Street
Columbia, South Carolina 29208
(803) 777-6029
www.sc.edu/univ101
