
Developing an Outcomes Assessment Program: The CCBC Approach

Dr. Irving Pressley McPhail, Chancellor

The Community College of Baltimore County

Delgado Community College – February 18, 2005

LearningFIRST 2.0

The Strategic Plan For CCBC

FY 2004 to FY 2008

Learning First 2.0 sits at the center of nine strategic directions:

Student Learning
Learning College
Infusing Technology
Learning Support
Effective Communication
Organizational Excellence
Valuing Diversity
Community & Institutional Advancement
Enrollment Management

CCBC: A Vanguard Learning College

One of twelve community colleges from across the U.S. and Canada chosen to participate in the League’s Vanguard Learning College Project

Five areas of concentration:
Organizational Culture
Staff Recruitment and Development
Learning Outcomes Assessment
Technology
Underprepared Students

Middle States Standard 14: Assessment of Student Learning

Assessment of student learning demonstrates that the institution’s students have knowledge, skills, and competencies consistent with institutional goals and that students at graduation have achieved appropriate higher education goals.

Middle States Standard 14: Assessment of Student Learning

In order to carry out meaningful assessment activities, institutions must articulate statements of expected student learning at the institutional, program, and individual course levels, although the level of specificity will be greater at the course level. Course syllabi or guidelines should include expected learning outcomes.

SACS – Principles of Accreditation

3.3 Institutional Effectiveness

3.3.1 The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

Council on Innovation and Student Learning (CISL)

How it all began!

Chancellor’s Charge to CISL

Serve as a college-wide think tank
Lead the transformation of CCBC into a premier, learning-centered college
Serve as change agents
Help to frame the policies, procedures, and infrastructure needed to become a learning college

Getting Started - Fall 1998

Council on Innovation and Student Learning: membership included trustees, the chancellor, faculty, professional staff, and classified staff (35-40 members)

Assessment subcommittee (8 members) created the Learning Outcomes Assessment Plan for the College

Guide to Learning Outcomes Assessment and Classroom Assessment (updated Spring 2003): www.ccbcmd.edu/loa/index.html

Measuring Student Learning

Non-measures:
Student satisfaction surveys
Program evaluation
MHEC reporting
Student grades
Retention rates
Graduation rates
Transfer rates

Indirect measures:
Exit interviews of graduates
Employer surveys
Transfer studies

Measuring Student Learning

Direct measures include:
Standardized tests
Portfolio assessment
Capstone experience
Locally developed tests
Externally reviewed exhibitions and performances

CCBC’s Learning Outcomes Assessment Program

CCBC’s Assessment Plan Principles

Primary reason for assessment is to improve and expand student learning

Development of an effective program is a long-term, dynamic process

Must involve a multi-method approach
Must include training and support for faculty and staff
Results are not used punitively for students or faculty

CCBC’s Assessment Plan Principles

Seek to use the most reliable, valid methods and instruments available

Never an end in itself, only a means to an end – the improvement of student learning

Continuous Quality Improvement

Design → Implement → Assess (a repeating cycle)

CCBC’s Outcomes Assessment Philosophy

Assessment is:
A natural and on-going part of instruction
Consistent with traditional instructional practices
Designed to meet specific objectives
Conducted in a risk-free environment

LOA Project Collaboration

Learning Outcomes Associate
Planning, Research and Evaluation Liaison
Learning Outcomes Assessment Advisory Board
Deans’ Council
General Education Review Board
Developmental Education Advisory Committee

LOA Project Collaboration

Learning Outcomes Associate:
Development of design
Survey/tool development
Staff development
Logistical plan

Institutional Research:
Data entry
Data analysis
Data interpretation

Types of Designs

Portfolio assessment
Standardized tests
External graders/experts
Pre- and post-tests
Cooperation with other schools
Creativity abounds!

Project Elements

Stage 1: Design Course Project
Determine Measurable Objectives
Select Assessment Instrument
Include External Validation
Control Important Variables
Develop Full Proposal

Stage 2: Implement Design
Administer Assessment Instrument(s)
Collect Data
Analyze Data

Project Elements

Stage 3: Test Objectives

Met? → Write Final Report

Not Met? → Design Course Improvements; Write Interim Report

Project Elements

Stage 4: Implement Course Improvements
Collect Data
Analyze Data

Stage 5: Final Report
Discipline / Campus / College

A code sketch of this five-stage flow appears below.
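A minimal sketch of the five stages just outlined, assuming hypothetical callables for each team activity (none of these function names come from CCBC materials); the key point is the loop back through improvements until objectives are met:

```python
# Sketch of the five-stage LOA project cycle; stage names follow the slides.
def run_loa_project(design, implement, objectives_met, improve, report):
    """Drive one LOA project through Stages 1-5.

    Each argument is a callable supplied by the project team (hypothetical):
      design()         -- Stage 1: objectives, instrument, full proposal
      implement()      -- Stage 2: administer instrument, collect/analyze data
      objectives_met() -- Stage 3: test results against the objectives
      improve()        -- Stage 4: design and implement course improvements
      report(final)    -- interim report, or Stage 5 final report
    """
    design()                     # Stage 1: Design Course Project
    implement()                  # Stage 2: Implement Design
    while not objectives_met():  # Stage 3: Test Objectives
        report(final=False)      #   Not met -> write interim report ...
        improve()                # Stage 4: Implement Course Improvements
        implement()              #   ... then re-collect and re-analyze data
    report(final=True)           # Stage 5: Final Report
```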

Project Elements

In Summary

Establish Climate of Continuous Improvement

Empower Faculty
Provide Substantial Administrative and Fiscal Support
Share Results - Celebrate Successes

Course Level Assessment

High Impact LOAs

CINS 101, HLTH 101, PEFT 101, SDEV 101, ENGL 101
MGMT 101, SOCL 101, PHIL 101, BIOL 110, SPCM 101
ARTS 104, ENVS 101, PSYC 105, MATH 082

(phased in across FY ’03, FY ’04, and FY ’05)

HLTH 101: Health and Wellness

High impact project
Pre-test/post-test design
100-item, four-option multiple-choice exam
Topics included subjects commonly covered in introductory health textbooks
Draft instrument was reviewed by a health/curriculum expert at a four-year university
Two- and four-year colleges and universities have been invited to share our instrument so that we might develop external comparisons

HLTH 101 LOA Results

Matched-pair t-test analysis (sketched in code below)
Students’ scores improved significantly from pre- to post-test
At least 75% of students scored 75% or higher on the post-test
Scores varied by campus; small difference in scores based on race
Reassessment will be conducted after recommendations have been implemented
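A minimal sketch of the matched-pair (paired) t-test analysis named above, using scipy and invented scores; the actual HLTH 101 instrument, sample, and results are not reproduced here:

```python
import numpy as np
from scipy import stats

# Invented pre/post percentages for 200 matched students.
rng = np.random.default_rng(0)
pre = rng.normal(60, 10, size=200).clip(0, 100)
post = (pre + rng.normal(18, 8, size=200)).clip(0, 100)

# Paired t-test: each student serves as their own control.
t, p = stats.ttest_rel(post, pre)
print(f"paired t = {t:.2f}, p = {p:.3g}")

# Criterion check from the slides: at least 75% scoring 75% or higher.
share = (post >= 75).mean()
print(f"{share:.0%} of students scored 75% or higher on the post-test")
```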

HLTH 101 LOA Recommendations

Three topic areas were identified as needing improvement: nutrition, heart disease, and human sexuality

The campus with students who consistently scored higher on the pre-test will offer more honors sections

A faculty guide, “Strategies for Teaching Health Education,” has been developed and shared with full- and part-time faculty

RDNG 052: College Reading

Instruments:
Used the Nelson-Denny Reading Test (Forms G and H) pre and post for the 2001 and 2003 assessments
Used the Learning and Study Strategies Inventory (LASSI) for the two semesters of pilot projects and the 2001 assessment

Design:
Pre-test/post-test project design
Matched-pair data analysis (see the pairing sketch below)
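The matched-pair design counts only students who completed both the pre- and the post-test. A minimal sketch of assembling those pairs with pandas, using invented records and column names (student_id, score):

```python
import pandas as pd

# Invented rosters; in practice these come from test-score files.
pre = pd.DataFrame({"student_id": [1, 2, 3, 4], "score": [31, 45, 28, 39]})
post = pd.DataFrame({"student_id": [1, 2, 4, 5], "score": [36, 44, 47, 33]})

# Inner join keeps only students with both a pre- and a post-test score.
pairs = pre.merge(post, on="student_id", suffixes=("_pre", "_post"))

pairs["gain"] = pairs["score_post"] - pairs["score_pre"]
print(f"matched pairs: {len(pairs)}")
print(f"mean raw-score gain: {pairs['gain'].mean():.1f}")
print(f"share improved: {(pairs['gain'] > 0).mean():.0%}")
```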

RDNG 052: College Reading - Fall 2001 Results

Mean pre-post difference of 3.5 raw points was statistically significant at the .001 level

Post-test score was greater than pre-test score in 74% of cases

Resulting increase of .7 grade level (in a 15-week semester)

RDNG 052: College Reading - Fall 2001 Results

Exiting students read at a 9.6 mean grade level

Differences among campuses; learned from best practices on each campus

LASSI results showed significant improvement in 10 of 12 scales from pre to post

RDNG 052: College Reading – Fall 2001 Interventions

Interventions included:
Using more Internet-based learning opportunities
Enhancing instructional activities that focused on literal and inferential comprehension
Greater focus on working with adjunct faculty

RDNG 052: College Reading – Fall 2003 Results

Mean pre-post difference of 3.7 raw points was statistically significant at the .001 level

Post-test score was greater than pre-test score in 75% of cases (larger sample)

Resulting increase of 2.0 grade levels (in a 15-week semester)!

Exiting students read at a 10.1 mean grade level

Differences among campuses; every campus improved

RDNG 052 - 2001 to 2003: Improvements in Student Learning

Significant improvements in students’ reading levels from 2001 to 2003 indicate increased learning in RDNG 052.

RDNG 052 College Reading - Final Recommendations

The Cycle of Continuous Improvement:
Research Accuplacer cut scores between RDNG 051 and RDNG 052
Continue to close the performance gap
Continue to share best practices
Continue to study post-Reading success rates (success in subsequent courses)

Making the Transition from Course to Program Level Assessment

Program Review
General Education
Developmental Education

General Education

The Academic Profile and the “GREATs”

The Academic Profile

Norm-referenced, externally developed test of general knowledge
Widely used
Sample size: 1,017 students
Purpose: to establish baseline data upon implementation of the new General Education Program in Fall 2001

Academic Profile - Findings

The Academic Profile provides CCBC with a baseline measure of how our students are acquiring academic skills developed through General Education courses, compared to national norms.

CCBC students scored at or close to the national norms on the total mean score and all sub-scores the first time the test was administered. (A sketch of this kind of norm comparison appears below.)

The Academic Profile will be administered again in Fall 2004 (sample size 2,000+).
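One common way to test “at or close to the national norms” is a one-sample comparison of the local mean against the published norm. A minimal sketch with invented numbers; the Academic Profile’s actual scale and norm values are not shown here:

```python
import numpy as np
from scipy import stats

# Invented local scores for a sample of 1,017 students.
rng = np.random.default_rng(1)
scores = rng.normal(440, 20, size=1017)
NATIONAL_NORM_MEAN = 442.0  # invented published norm mean

# One-sample t-test: does the local mean differ from the national norm?
t, p = stats.ttest_1samp(scores, NATIONAL_NORM_MEAN)
print(f"local mean = {scores.mean():.1f} vs norm {NATIONAL_NORM_MEAN}")
print(f"t = {t:.2f}, p = {p:.3f}")
```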

Academic Profile – Data Implications

Work on Critical Thinking skills
Reinforce skills learned in one class in other classes
Provide Culturally Mediated Instruction
Possible statewide initiatives

GREAT Projects

General Education Assessment Teams (GREAT) Projects for all Maryland Higher Education Commission general education categories

Common Graded Assignments (CGAs) and accompanying scoring rubrics, created by faculty teams
External Consultant
Faculty Training
Three semesters of Pilot Projects
Full implementation began in Fall 2003
Every course assessed once every three years

GREAT Data (Fall 2003) Implications

Increased faculty awareness of how General Education courses are defined by the six criteria

Midrange-to-higher scores in Content may indicate a traditional approach to conveying “Content,” with little expectation for higher-level use of content in application, critical thinking, analysis, and synthesis activities (a rubric-score summary is sketched below)

Workshops on how to use the GREAT data to improve student learning
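A minimal sketch of the kind of rubric-score summary behind the Content observation above; the criterion names other than Content and the score scale are assumptions for illustration:

```python
from statistics import mean

# One dict per scored Common Graded Assignment: criterion -> rubric score.
# "Content" appears in the slides; the other names and 1-4 scale are invented.
scored_cgas = [
    {"Content": 3, "Critical Thinking": 2, "Synthesis": 3},
    {"Content": 4, "Critical Thinking": 2, "Synthesis": 2},
    {"Content": 3, "Critical Thinking": 3, "Synthesis": 3},
]

# Mean score per criterion across all scored assignments.
for criterion in scored_cgas[0]:
    avg = mean(cga[criterion] for cga in scored_cgas)
    print(f"{criterion}: mean rubric score {avg:.2f}")
```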

LOA Successes

Creating a “culture of assessment” with increased faculty participation and buy-in

Using outcomes assessment for Program-level assessment

Forging new partnerships between faculty teams, institutional research staff, the Vice Chancellor for Learning and Student Development’s office, and the Outcomes Associate in analyzing data and making curricular and pedagogical recommendations for change

LOA Successes

Creating a newly updated Guide for Learning Outcomes Assessment and Classroom Learning Assessment, a model guide for assessment

www.ccbcmd.edu/loa/index.html

Establishing the Learning Outcomes Assessment Advisory Board

Recognition by the League for Innovation in the Community College and Middle States Commission on Higher Education as a national leader in Learning Outcomes Assessment

For Further Information:

Dr. Irving Pressley McPhail

Chancellor

The Community College of Baltimore County

800 South Rolling Road

Baltimore, Maryland 21228-5317

Telephone: 410-869-1220

Fax: 410-869-1224

Email: imcphail@ccbcmd.edu

Website: www.ccbcmd.edu

For Further Information:

Dr. Alvin Starr, Acting Vice Chancellor for Learning and Student Development

Email: astarr@ccbcmd.edu

Dr. Rose Mince, Assistant to the VCLSD
Email: rmince@ccbcmd.edu

Professor Tara Ebersole, Outcomes Associate
Email: tebersol@ccbcmd.edu

Questions / Comments
