
Baltimore City Community College
Comprehensive Learning Outcomes Assessment
A Practical Guide

Authors
Nevada Winrow, Ph.D.
Terry Doty, M.S.
Malathi Radhakrishnan, Ph.D.

Editing
Institutional Advancement, Marketing and Research Division
Melvin Brooks, M.Ed.
Michael Butler, J.D.
Karen Shallenberger, Ph.D.
Cynthia Webb, M.Ed.

Design/Layout
Ed Towles Graphic Design


BCCC Culture of Assessment Vision

Through a college-wide culture of assessment, Baltimore City Community College provides a quality education for all who enter our doors. Assessment is by nature a goal-driven, evidence-based, and improvement-oriented process that involves all stakeholders working collaboratively. This ongoing process promotes excellence in teaching and learning by assessing all elements of the educational process. Our culture of assessment provides institutional resources, training, and support. Continual assessment is an integral component of BCCC’s commitment to excellence as an institution of higher education.

Assessment is a celebration of student learning. It guides our curriculum and instruction and provides an overall sense of intrinsic validation that we have transformed the lives of our students.

SLO/CA Task Force 2011


Table of Contents

BCCC Culture of Assessment Vision
Table of Acronyms
List of Figures
List of Tables
Executive Summary

Part I:
Introduction
BCCC’s Guiding Principles for Student Learning Outcomes Assessment
Frequently Asked Questions about Assessment

Part II:
The Comprehensive Assessment Model and Processes (BCCC)
BCCC Assessment Model
Program Student Learning Outcomes Assessment
Approach to the Assessment Process
Outcomes Assessment Methodology
Course-Level Student Learning Outcomes
The Outcomes Assessment Methodology
Outcomes Assessment Tracks
Planning
Implementation
Submission and Reporting
Closing the Loop

Part III:
Developing a Plan for Program and Course-Level Outcomes Assessment
How to Write a Program Assessment Plan
Defining the Program Mission Statement
Characteristics of a Well-Written Mission Statement
General Structure of a Mission Statement
Aligning Program Goals with Courses
How to Develop a Course-Level Outcomes Assessment Plan
Developing and Writing a Student Learning Outcomes Assessment Plan
Learning and Teaching Activities
Selecting an Assessment Method
Developing an Assessment Instrument
Developing a Scoring Tool
Determining Success and Meeting the Mark

Part IV:
Putting It All Together
Bibliography
Appendix A: Student Learning Outcomes Data Protocol
Appendix B: Ethical Use of Student Learning Outcomes Data
Appendix C: General Education Statement
Appendix D: Forms, Templates, and Worksheets
Worksheet for Creating a Mission Statement
General Guide for Writing a Mission Statement
Goal Definition Worksheet
Program Learning Outcomes Worksheet
Program Learning Outcomes Assessment Report
Student Learning Outcomes Pilot Survey
Recommendations and Observations
Outcomes Assessment Observations Form
Outcomes Assessment Recommendations Form
Student Learning Outcomes Assessment Report
Samples of Outcomes Assessment Plans
Appendix E: Additional Resources
Comparative Analysis of Assessment Strategies


Table of Acronyms

CAP Core Competency Assessment Project

CWAC College-wide Assessment Council

DL Discipline Liaison

IR Institutional Research

ISLO Institutional Student Learning Outcomes

LIP Learning Improvement Plan

OAP Outcomes Assessment Project

OARF Outcomes Assessment Recommendations Form

PAP Program Assessment Plan

PLO Program Learning Outcomes

POAP Program Outcomes Assessment Project

SEC Senate Executive Committee

SLO Student Learning Outcomes

SLOA Student Learning Outcomes Assessment

SLOAP Student Learning Outcomes Assessment Plan

SLOAR Student Learning Outcomes Assessment Report

SLOCA Student Learning Outcomes and Curriculum Assessment (Task Force)


List of Figures

1. Outcome Assessment Process

2. BCCC Assessment Process

3. Curriculum Mapping

4. Bloom’s Taxonomy

5. BCCC Outcomes Assessment: Structural Organization and Functional Processes (TBD)

List of Tables

1. Forms and Submission Deadlines for Program Outcomes Assessment Project

2. Assessment Process

3. Example of Outcome Assessment Schedule

4. Forms and Submission Deadlines for Student Learning Outcomes Assessment

5. Summary of Assessment Methods

6. Analysis for Test Question Development

7. Rubrics in Higher Education

8. Perspectives in Meeting Criteria


Executive Summary

Baltimore City Community College (BCCC) has developed a comprehensive framework for the assessment of student learning that not only addresses accountability to external constituents, but also speaks to our own institutional effectiveness and continuous self-improvement. This framework establishes a clear alignment between our college’s mission and academic programs and courses to ensure that students have acquired the knowledge, skills, and competencies expected by the college and within the student’s program of study. The framework systematically addresses three Characteristics of Excellence as defined by the Middle States Commission on Higher Education (MSCHE): Institutional Assessment (Standard 7), General Education (Standard 12), and Assessment of Student Learning (Standard 14).

The approach to our assessment process is a “design backward and assess forward” model. We first examined the institution’s mission statement and identified the institutional learning goals that address the mission. BCCC adopted the core competencies from the general education curriculum to serve as the institutional student learning outcomes (ISLO). We then linked these ISLO to program learning outcomes, which are in turn aligned with course learning outcomes. In the forward assessment, course learning outcomes assess content knowledge, skills, and competencies to ensure the achievement of program outcomes. Program outcomes assess program effectiveness and link to institutional learning outcomes.

Under the guidance of the Director of Assessment and Curriculum, the College-Wide Assessment Council (CWAC) monitors all assessment activities. The assessment of student learning outcomes occurs in four phases: planning, piloting, full implementation, and reporting; it takes four semesters to complete. Reporting occurs on a five-year cycle.


Part I: Introduction

Academicians are by nature curious and motivated to undertake critical and creative inquiry. We are innovators, constantly engaging in recursive thinking to improve pedagogical practices and student learning. This characteristic of evolutionary thinking and continual innovation permeates higher education. In 2002, MSCHE revised the Characteristics of Excellence for accreditation to address Assessment of Student Learning. This change fostered a major transformation within institutions of higher education. Assessment of student learning is not new, but the manner in which student learning is documented and the way in which the information is used are central to this transformation. The changes have been met with a variety of responses in academic institutions, ranging from resistance, misunderstanding, and resentment to heightened faculty awareness of their role in documenting student learning for the good of the institution and, more importantly, for the betterment of students.

BCCC seeks to move all stakeholders into a space of self-actualization; however, what does it take to get there, stay there, and not fail? The Senate Executive Committee (SEC) charged the Student Learning Outcomes and Curriculum Assessment (SLOCA) Task Force with exploring these questions and making recommendations for a sustained, useful, and meaningful assessment process. Like other institutions, BCCC has experienced many hurdles and challenges with regard to student learning outcomes assessment. To sustain a culture of assessment in a changing climate, BCCC must engage in authentic, critical inquiry and evaluation, making a clear distinction between the real and the perceived self.


Assessment as an innovation and the process of change

Gray (1997) discusses innovation in the context of assessment and the process of getting people on board. Rogers (1995) defines innovation as “an idea, practice or object perceived as new by an individual.” He further adds that “it matters little whether or not an idea, practice or object is objectively new in the sense of the time lapse since its first use or discovery.” If an individual perceives an idea or practice as new, then it is an innovation. It is within this context that BCCC views assessment as an innovation. Assessment of student learning is omnipresent, but the practice (i.e., process) by which the outcomes are assessed is changing, and we are embarking upon a new approach. What is the relative advantage of this change, or the degree to which developing a new assessment process is perceived as better than the practice it supersedes?

There is a consensus among faculty and administration that innovation is preeminent. BCCC engages in outcomes assessment, but because of our existing processes, we failed to meet the mark. What caused this failure, and how can we move forward with new processes that have buy-in from all stakeholders? To answer these questions, we first had to evaluate why existing assessment efforts fell short. Assessment efforts generally fail following a systemic paralysis. The factors paralyzing our assessment initiatives were: (1) no sense of urgency; (2) lack of a clear vision and poor communication; (3) allowing obstacles to block the new vision; (4) not systematically planning for and creating short-term wins; and (5) declaring victory too soon. Kotter (1995) speaks extensively about these errors in his discussion of why transformation efforts fail in organizations.

Too often, organizations fail to communicate a sense of urgency for change and do not produce deliverables that result from the change process; in this instance, the product is the result of reactive rather than proactive responses. Secondly, when the vision is poorly communicated and lackluster, the end result is many people doing many different things. In our case, we had many people doing good things, but there was a disconnect between our activities and a vision aligned with Middle States expectations.

Not removing obstacles to a vision is equally damaging. Obstacles can be individuals or processes. Without a clear leader in assessment, the outcome can prove extremely detrimental. Progress is further impeded when there is no systematic planning for and creating of short-term wins. Assessment is an ongoing process whose gains must be celebrated on a regular basis. The lack of a frame of reference to demonstrate how far we have come and what we have achieved negatively impacts morale and goal attainment. More importantly, it can lead to declaring victory too soon. For instance, we engaged in a full-throttle assessment of learning and completed outcomes assessment templates. There was a declaration of victory at having finished, but no sense of success in having a sustained, valued, and meaningful assessment process.

Since the 1980s, there have been discussions and recommendations for transparency showing “the quality of the product in which tuition dollars were being invested” (Middaugh, 2010). However, in recent years, there has been a push towards more powerful assessment reform mandated by accrediting bodies, as well as some boards and state legislatures (Middaugh, 2010; Rouseff-Baker and Holm, 2004; Walvoord, 2010). Colleges and universities must have a full commitment to teaching and learning that includes assessing and documenting what students are learning, then using that information to improve learning and the services they provide. While there is a strong external push for outcomes assessment, BCCC is dedicated to providing positive educational experiences to our students. At BCCC, the focus of outcomes assessment is primarily to improve student learning.

Walvoord (2010) defines assessment as “the systematic collection of information about student learning, using the time, knowledge, expertise, and resources available, in order to inform decisions that affect student learning.” Outcomes assessment is something that already occurs at the course level: when an educator makes changes to a course based on student performance, that is student learning outcomes assessment. Our goal at BCCC is to formalize this process.

BCCC’s Guiding Principles for Student Learning Outcomes Assessment (SLOA)

Based on a review of multiple resources, BCCC has adopted the following eight guiding principles for student learning outcomes assessment:

Principle One: Faculty engagement and active involvement in SLOA development is essential. Faculty members are best suited to develop the educational outcomes for their programs and courses and to determine how to assess the outcomes and use the collected data for development and improvement. Therefore, the primary responsibility of the faculty is to develop assessment tools and determine the uses of the resulting data.

Principle Two: All appropriate participants at each level of the college, not just select groups or individuals, should be committed to following the process of assessment. All academic departments must be involved. This means including adjunct faculty in discussions and decisions as much as possible.

Principle Three: The results of outcomes assessment should be used to evaluate the effectiveness of academic programs and activities, as well as student services, but not the performance of individual faculty or staff. SLOA is a process that must be separate from faculty evaluation.


Principle Four: College administrators must provide leadership and accountability to the process, as well as enable the assessment process by providing adequate resources, staff, and other forms of support.

Principle Five: Assessment data do not exist in a vacuum and must be analyzed alongside all other factors that may affect achievement of outcomes. Other factors, some of which are out of our control, must be considered during the analysis phase. Factors such as personal problems, home situations, work hours, study habits, and incoming skill levels, to name a few, can affect the assessment results.

Principle Six: The SLOA process should be as simple and manageable as possible. The process cannot become so complicated that it interferes with the educational experience rather than assessing and improving it.

Principle Seven: Faculty and all others involved at the college should engage in SLOA development and assessment because it is good professional practice that benefits programs and students, not because it is an accreditation requirement. If outcomes assessment is used primarily as a reporting tool, then the process has failed.

Principle Eight: Outcomes assessment must be ongoing and performed on a regular basis in each academic area. It must become an academic habit.

Frequently Asked Questions about Assessment

Question: What is the difference between a behavioral objective and a student learning outcome?

Answer: Behavioral objectives for a course are usually more numerous than outcomes and detail the course content, while outcomes refer to what students should be able to do after they master the objectives and complete the course or module. Outcomes give a different perspective than the individual objectives used for teaching. Fulks (2004) quotes a Bakersfield College chemistry professor as saying, “Outcomes demonstrate an understanding and application of a subject beyond the nuts and bolts which hold it together; objectives represent the nuts and bolts.” For example, a module may have ten behavioral objectives used to teach the material that students need to master in order to be able to perform a task, the outcome.

Question: Will outcomes assessment interfere with my academic freedom?

Answer: Outcomes assessment is only meant to determine if students are learning what they should be learning in a course and to make improvements. The process of outcomes assessment at BCCC requires instructors to assess three to five outcomes using a common instrument (when there is more than one section of the same course), to report data, and to make changes as needed. This procedure is required every five years. Outcomes assessment by no means dictates how an instructor chooses to present the content of a course or how students receive their grades.

Question: If an outcome is not met and negative data are collected, will this be used to reprimand or evaluate faculty members?

Answer: No. Data collected for an outcome will only be used to determine if students are learning what they should be learning. The process of outcomes assessment is meant to determine how effective the course is, not to evaluate faculty. Instructor and student names should not be included when reporting the results of an outcome.

Question: Do I only test some of my students for outcomes assessment?

Answer: The least invasive way to include outcomes assessment is to incorporate it into your course as a regular assignment for all students to complete. For the purpose of data analysis, a random sample of the completed assessment tools will be selected. In the case where there is only one section of the course, all of the completed assessment tool results may be used for your data analysis.

Question: Will the process of outcomes assessment create more work for faculty members?

Answer: At first, it may seem to be a little more work. However, as teaching and learning improve, outcomes assessment may actually free up time spent on concepts that are easier for students to understand, allowing more time for those topics with which they have trouble. In addition, since effective faculty members are already assessing their students’ learning, it is only a matter of formalizing the process.

Question: Is the only purpose of outcomes assessment to find things that need to be corrected?

Answer: No. One goal of assessment is to make improvements, but favorable assessment results can be used to promote a program and motivate students and faculty, as well as to justify the allocation of necessary resources.

Question: Is outcomes assessment just a passing fad that will fade away in a few years?

Answer: There has been a push for outcomes assessment for more than a decade, and it is getting stronger. Along with MSCHE, all other accrediting bodies for higher education across the country have added requirements for outcomes assessment to their standards.


Part II: The Comprehensive Assessment Model and Processes (BCCC)

Definition of Terms

Outcomes Assessment Project (OAP): The overall outcomes assessment activity (i.e., goal setting, resource allocation, protocol).

Assessment Plan (AP): A comprehensive protocol that identifies what is being measured, how it will be measured, and who is responsible for assessing and reporting results.

Discipline Liaison (DL): A representative trained in outcomes assessment who serves as a peer-mentor for a respective discipline.

Faculty Assessment Team (FAT): Faculty involved in developing and implementing the OAP.

Program Coordinator: Responsible for the day-to-day activities of an academic program.


BCCC Assessment Model

MSCHE states that an assessment process must be useful, cost-effective, reason-

ably accurate and truthful, carefully planned, organized, systematic, and sustained.

The BCCC outcome assessment processes are useful in that data obtained assists fac-

ulty and staff in making decisions in improving programs, refining course objectives,

and allocating resources that support student learning. Assessment processes are pur-

poseful and prospective in its planning to ensure that: (1) there is a clear alignment

among institutional, program, and course-level outcomes, and supported student

learning activities and (2) there are ongoing, organized, and sustained assessment ac-

tivities. The process to ensure accuracy and truthfulness is achieved by using a myriad

of assessment tools (i.e., direct and indirect assessments) to assess learning outcomes.

The approach to our assessment process is a “design backward and assess forward” model. More specifically, designing backward, we first examine our institution’s mission statement and identify the institutional learning goals that address the mission. Thereafter, we link institutional student learning outcomes to program outcomes, and program outcomes to courses. Assessing forward, course learning outcomes assess content knowledge, skills, and competencies to ensure the achievement of program outcomes that link to institutional learning outcomes. There is also a concerted effort to ensure a clear alignment between course outcomes and course learning activities. As such, there is a link between supported student learning activities and student learning outcomes.


Program Student Learning Outcomes Assessment

Program-Level Student Learning Assessment

Program-level student learning assessment consists of linking program goals and learning outcomes with the curriculum. This type of linking is referred to as curriculum mapping. Curriculum mapping provides a conceptual model of where within the current curriculum program learning goals and learning outcomes are addressed. A program goal is a broad statement of what students are expected to know at the completion of the program. A program learning outcome is a specific behavior students are expected to demonstrate that reflects the broader program goals. Program learning outcomes assessment focuses on determining whether students have acquired the knowledge, skills, and competencies associated with their program of study. This includes general education core competencies as well as discipline-related competencies (see General Education Statement, Appendix C).

Approach to the Assessment Process

Students in academic programs should understand that their curriculum is not just a litany of courses they must complete to receive a degree, but rather a set of coherent and transformational experiences carefully created to provide them with the opportunities to develop the knowledge, skills, and characteristics they will need to become the people they aspire to be. Program-level student learning assessment is key to assisting faculty in answering the question, “Upon completion of the program, how do we know that our students have developed the knowledge and skills to be successful?”


Expectations and Requirements for a Program Assessment Project

In an effort to adhere to the principles of usability, accuracy, truthfulness, sustainability, and cost-effectiveness, the minimum expectations and requirements for completing an outcomes assessment project for programs are as follows:

• Consensus: Faculty and the program’s advisory board should reach a consensus on the program’s mission, goals, and program learning outcomes.

• Selection of program learning outcomes: Faculty should select at least four learning outcomes.

• Closing the loop: In an effort to plan and improve the state of the curriculum and student learning, a discussion of results among the users is essential. When expectations are not met, a plan of action is required to ensure that students are making expected gains in their program of study.

Program Assessment Projects

There are two types of outcomes assessment projects a coordinator may select in collaboration with faculty in the discipline: the Program Outcomes Assessment Project (POAP) or the Core Competency Assessment Project (CAP), or a combination of the two. A POAP assesses program-specific content and skills. A CAP focuses on transferable skills such as written and oral communication, problem solving, informational and computer literacy, numerical reasoning, multicultural diversity, and social responsibility (see General Education Statement, Appendix C).


Outcomes Assessment Methodology

Program learning outcomes assessment occurs over three distinct phases: planning, implementation, and reporting.

Program Assessment Planning

The FAT will convene to review and discuss the program mission, goals, and outcomes; determine what assessment information is currently being collected; and match these data sources to the learning goals and objectives. Once existing assessment processes have been identified, faculty will pinpoint central questions that are not being answered by existing data sources.

The FAT will develop the Program Assessment Plan (PAP; see Appendix D). All participants will agree upon the assessment method (e.g., project, essay, or exam) and scoring tool (e.g., rubric, score on a multiple-choice test). The program coordinator will meet with Institutional Research (IR) to develop the data spreadsheet and discuss specific analyses. All key constituents (CWAC liaison, Dean, and program coordinator) will endorse the PAP. The following semester, the plan will be implemented.

Implementation

During this phase, the program coordinator begins collecting data. Program learning outcomes are not collectively assessed at one discrete time; different outcomes are assessed at different periods across multiple semesters. The program coordinator determines which outcomes are assessed and when.


Reporting and Closing the Loop

The program coordinator will complete the data spreadsheet form and submit it to IR, which will analyze the data. IR will provide a data analysis report to the program coordinator. The report will contain a distribution of responses by Program Learning Outcome (PLO) and by measure. It is the responsibility of the discipline faculty to review and discuss the results and to make observations and recommendations about the learning outcomes, assessment instrument, scoring tool, process, and results. For those program student learning outcomes that are not achieved, a Learning Improvement Plan (LIP) must be submitted containing a plan of action to ensure that students achieve the desired learning outcome for the program. The LIP will be completed the following semester. Step 2 (Data Collection) and Step 3 (Data Analysis and Reporting) will be repeated until the learning outcome is achieved.

Minimum Requirement for a Learning Improvement Plan (LIP)

• Re-evaluate or revise the curriculum (i.e., curriculum mapping). The program coordinator needs to identify possible deficiencies that may exist within the curriculum (e.g., student-supported learning activities in a course, usefulness of a practicum or service learning project in ensuring students achieve specific competencies).

• Dates of implementation and reporting.


Course-Level Student Learning Outcomes

Expectations and Requirements

In an effort to adhere to the principles of usability, accuracy, truthfulness, sustainability, and cost-effectiveness, below are the minimum expectations and requirements for completing an Outcomes Assessment Project (OAP).

• Course Characteristics

Multiple delivery: For courses that are offered in various modalities, each modality must be assessed. Distance learning and accelerated courses must be included in the OAP.

Multiple sections (i.e., multiple sections or course offerings over an academic year): Every instructor teaching a multiple-section course will be involved in the OAP.

• Selection of outcomes: An OAP should contain at least three outcomes. If the course is participating in core competency outcomes assessment, at least two of the outcomes should be general education competencies.

• Faculty consensus: There should be faculty consensus about the outcomes and the plan to assess them. The FAT will take the lead in fostering this consensus. In addition, all faculty members who teach the course must share the same operational definition for the selected outcomes and must use a common assessment instrument and score it the same way for the purposes of the OAP.

• Informed students: The assessment process is valuable not only to the faculty and administrators but also to students. Students will be made aware of what is expected of them. This information should be in clear language and made readily available for students to review at any time. As such, all learning outcomes should be clearly stated on the course syllabus. Moreover, should the FAT decide to use a rubric to score the assessment, students should receive the rubric before participating in the assessment. Faculty members participating in the assessment will record and submit student scores electronically on the data spreadsheet to the FAT. The FAT will then forward the spreadsheets to IR (see Appendix A, p. 61 and Appendix B, p. 62).

• Closing the loop: Participants will discuss the results in an effort to plan and improve the state of the curriculum and student learning.

Key Terms

Student Learning Outcome (SLO): What a student is expected to learn through participation in academic activities or experiences at the College. An SLO reflects competency in knowledge gained and demonstration of skills and abilities.

Assessment Method: The approach used to measure whether student learning has occurred (e.g., exam, essay, capstone project, paper, etc.).

Assessment Instrument: A tool disseminated to students to measure student learning.

Scoring Criteria: A tool used to evaluate the results of the assessment instrument (e.g., a rubric). Not all instruments may use a scoring tool; for instance, a multiple-choice exam may use the number of correctly answered items.

Academic Trait Variables: A unique student or course trait, such as the number of credit hours a student completed in the curriculum, course length (i.e., a 7- or 12-week course), or whether it is a distance learning course (i.e., online or hybrid).


The Outcomes Assessment Methodology

We reviewed a series of best practices in assessing student learning outcomes and a myriad of outcomes assessment handbooks from across the country (Cartwright, Weiner, and Streamer-Veneruso, 2009; University of Massachusetts Amherst, 2001). From that review, we selected many innovative and practical ideas based on the value they added to existing structures and processes employed at BCCC.

An outcomes assessment takes two years to complete and runs on a five-year schedule. This approach: (1) allocates time to provide a thoughtful and reflective analysis of and response to the data; (2) does not overwhelm stakeholders; and (3) provides meaningful data from which to draw inferences.

The first semester comprises planning for the OAP and completion of the outcomes

assessment plan. The following semester, the plan will be piloted in selected sections,

undergoing revisions as needed. In the third semester, the plan is fully implemented.

The fourth semester involves data analysis, reporting, and making observations and

developing recommendations based on the results. A fifth semester may be added, if

improvements are needed in instructional design and delivery.

Table 2: Assessment Process

First Semester: Outcomes Assessment Planning
Second Semester*: Pilot of Plan
Third Semester: Full Implementation
Fourth Semester: Analysis and Report

*This phase is not included for special projects.


Outcomes Assessment Tracks

Core Competency Assessment Track – Institutional Student Learning Outcomes (ISLO)

The purpose of this assessment track is twofold: (1) to evaluate and improve students’ performance in the general education competencies at the course level and (2) to demonstrate the overall impact of the general education curriculum at BCCC. BCCC has adopted the general education competencies within the general education curriculum to represent the College’s Institutional Student Learning Outcomes (Appendix C). This set of general education competencies aligns with MSCHE and Maryland Higher Education Commission (MHEC) requirements. There are eight competencies, as defined in the General Education Statement. Upon graduation, all BCCC graduates should be able to:

• Communicate effectively in oral and written English (Oral and Written Communication).
• Reason abstractly and think critically (Critical Thinking).
• Understand and interpret numerical data (Numerical Analysis).
• Read with comprehension and draw conclusions based on evidence (Deductive and Inferential Thinking).
• Understand the differences as well as commonalities among people (Multicultural Diversity).
• Understand and utilize the skills needed for living as responsible, ethical, and contributing citizens (Personal Development and Social Responsibility).
• Reflect upon the arts and the role of the arts in the human experience (Arts and Aesthetic Awareness).
• Identify, locate, and effectively use information from various print and electronic sources (Informational and Computer Literacy).


Discipline Assessment Track

This track comprises all courses outside of the general education curriculum. It has two assessment paths: the general course assessment path and special projects. The general course assessment path involves student learning assessment activities for courses that have multiple course sections and dual-semester course offerings. The special projects path involves student learning outcomes assessment of courses offered only once a year or in alternate years.

Planning

Core Competency Assessment Track (ISLO)

The core competencies are embedded in the general education curriculum. As such,

general education courses are selected by the CWAC for outcomes assessment based on

the competency being evaluated for that assessment year. The identified courses must

have at least two learning outcomes that address the competency.

BCCC used the Montgomery College (Cartwright, Weiner & Streamer-Veneruso, 2009) assessment schedule as a model for the assessment of core competencies (i.e., the ISLO). Each core competency is assessed once every five years (see Table 3). Each assessment year, a person from the ISLO committee will be selected to serve as the liaison for that year’s outcomes assessment activities. This liaison will assemble the FAT and will oversee the OAP.

The FAT, through collaboration between teaching faculty and the discipline liaison, will develop an assessment plan (see Appendix D). All participants will agree upon an assessment instrument (e.g., project, essay, or exam) and scoring tool (e.g., rubric, score on a multiple-choice test). The FAT will meet with IR to discuss and develop the data spreadsheet. All key stakeholders (CWAC liaison, department chair, FAT, and Dean) will endorse the plan by signing the OAP. The following semester, the plan will be enacted.


Discipline Assessment Track

Based on faculty discussions and consensus, a course will be identified for an OAP.

The Chair will submit the assessment matrix to the CWAC on the Course Outcomes

Assessment Tracker (COAT) (see Appendix D, p. 68). In considering course selection,

the following guidelines should be used:

• The last time the course was assessed. As a rule of thumb, an OAP for a course should take place every five years.

The general assessment track courses will follow the traditional assessment phase cycle (i.e., planning, piloting, implementing, and reporting). Special project courses are not offered frequently and cannot follow the traditional assessment planning process, in which there is an opportunity to pilot the outcomes assessment plan. As such, special project course assessment will comprise a three-phase process of planning, implementation, and reporting.


Implementation

Piloting the Assessment

With the exception of special projects, every outcomes assessment activity includes

pilot testing. The purpose of the pilot phase is to assess the efficacy and efficiency of the

outcome assessment plan. More specifically, this is where you determine if the plan, as

designed, is feasible. Are there any challenges inherent in the design or implementation

of the plan that could be a potential problem? The primary focus of this phase of the

outcomes assessment activity is on the assessment tool and obtaining feedback (e.g.,

using a survey) on ways to improve the plan before full implementation. This includes

faculty reactions to and reflections on the assessment and psychometric properties of the

tools (see Appendix D, p. 78), such as:

• Clarity of instructions to students.
• Clarity of scoring instructions to faculty and the reliability of the scoring tool (i.e., inter-rater reliability).
• The consistency of faculty data entry with the prescribed data entry protocol.
• Timeline feasibility and appropriateness of the timing of the assessment.
• Evaluation of the quality of communication between participating faculty members in the administration and reporting of information from the outcome assessment.
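One of the pilot checks above, inter-rater reliability, can be quantified before full implementation. The following is a minimal sketch, not part of BCCC's prescribed process; the function names and rubric scores are hypothetical. It computes simple percent agreement and Cohen's kappa (agreement corrected for chance) for two raters scoring the same pilot artifacts:

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of artifacts on which two raters assigned the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement from each rater's marginal score frequencies.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical: two faculty raters scoring the same ten pilot essays on a 1-4 rubric.
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
print(percent_agreement(rater_a, rater_b))  # 0.8
```

A kappa well below the percent agreement signals that much of the raw agreement could be due to chance, which is useful information when deciding whether the rubric's scoring instructions are clear enough for full-scale use.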

Not every section of a course is assessed in a pilot. Moreover, selection of sections or courses should not be based on volunteering to participate. When identifying a section to assess, we recommend the following guidelines:

• Have both full-time and adjunct faculty involved in the pilot.
• Use instructors who will also teach the course during the full-scale implementation.
• Include faculty who are not in the FAT workgroup in order to gain feedback from colleagues who were not as intimately involved in developing the outcomes assessment plan. By doing so, you can determine just how fluidly the process runs when people outside of the FAT participate.


Full-Scale Implementation

The full-scale implementation of an OAP involves effective communication with all participating faculty members teaching the course prior to the start of the semester. This includes communicating:

• The learning outcomes identified for assessment.
• The assessment method and common assessment instrument.
• When the assessment instrument is to be administered.
• The information to be disseminated to students explaining the assessment and its purpose.
• The rubric or answer key to be used in scoring the assessment.
• How the data will be collected and scored.

When there are more than four sections of a course, a random selection of sections will be used for the outcome assessment activity. The number of sections to participate in the outcome assessment activity will be determined by the FAT. Within those sections, a random sample of assessment instruments will be scored. Every student will be administered the assessment tool; however, every student’s score may not necessarily be used for data analysis. For instance, a course that has only one section may want to use all students in the course for data analysis, as opposed to a multiple-section course that may randomly select 35 percent of the students to score and include in a data analysis.
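The random draw described above can be made reproducible and auditable with a fixed seed, so the FAT can document exactly which students were selected for scoring. This is a sketch only; the roster, function name, and seed are hypothetical, and the 35 percent fraction follows the example in the text:

```python
import random

def sample_for_scoring(student_ids, fraction=0.35, seed=2024):
    """Randomly select a fraction of students whose work will be scored."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    k = max(1, round(len(student_ids) * fraction))
    return sorted(rng.sample(student_ids, k))

# Hypothetical roster of 40 students in one section.
section_roster = [f"S{n:03d}" for n in range(1, 41)]
scored = sample_for_scoring(section_roster)
print(len(scored))  # 14
```

The same pattern applies one level up: when more than four sections run, the FAT could first `sample` from the list of section numbers, then sample students within each selected section.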

Submission and Reporting

During the semester following the full-scale assessment, Institutional Research (IR) will analyze the data. IR will provide a data analysis report to the FAT and the Dean for the respective discipline. The report will contain a distribution of responses by Student Learning Outcome (SLO) and by measure. It is the responsibility of the discipline to review and discuss the results and make observations and recommendations about the learning outcomes, assessment instrument, scoring tool, process, and results.
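As a rough illustration of the tally behind such a report, the sketch below groups rubric scores by SLO and measure. The record layout and function name are hypothetical, not IR's actual format:

```python
from collections import defaultdict

def score_distribution(records):
    """Tally rubric scores by (SLO, measure), as in a distribution report."""
    dist = defaultdict(lambda: defaultdict(int))
    for slo, measure, score in records:
        dist[(slo, measure)][score] += 1
    return {key: dict(counts) for key, counts in dist.items()}

# Hypothetical scored records: (SLO, measure, rubric score).
records = [
    ("SLO1", "essay", 3), ("SLO1", "essay", 4), ("SLO1", "essay", 3),
    ("SLO1", "exam", 2),  ("SLO2", "exam", 4),  ("SLO2", "exam", 4),
]
print(score_distribution(records)[("SLO1", "essay")])  # {3: 2, 4: 1}
```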


Observations and Recommendations

During the observation phase, participants identify what worked well and what didn’t. They then make recommendations to strengthen areas of weakness in future assessment cycles. This information is captured on the Outcomes Assessment Observation Form (OAOF, see Appendix D, p. 86). The form allows faculty the opportunity to review and reflect on the assessment process and activities to see what is going well and what needs improvement. Areas needing improvement are captured on the Outcome Assessment Recommendations Form (OARF, see Appendix D, p. 87).

After the completion of the OARF, the FAT completes the Student Learning Outcomes Assessment Report (SLOAR, see Appendix D, p. 88). The SLOAR is the final report of outcomes assessment; it summarizes results and, based on the recommendations made within the OARF, discusses actions taken to improve student learning.

Closing the Loop

For those outcomes that are not achieved, a Learning Improvement Plan (LIP) must be submitted containing a plan of action for ensuring that students achieve the desired learning outcomes. The LIP will be submitted to and reviewed by the FAT. The LIP will need to be implemented the following semester or, in the case of single-semester course offerings, the next time the course is offered. Step 3 (Data Collection) and Step 4 (Data Analysis and Reporting) will be repeated until the learning outcomes are achieved.

Minimum Requirement for a Learning Improvement Plan (LIP)

• Re-evaluate and revise the student-supported learning activities that will be used to reach the desired benchmark for achievement of learning outcomes.
• Dates of implementation and reporting.


Each of the forms referenced in Table 4 (Forms and Submission) comes with specific instructions and is located in Appendix D.


Figure 2. BCCC Assessment Process


III. Developing a Plan for Program and Course-level Outcomes Assessment

This section is designed to assist in the planning and assessment of program and

course student learning outcomes and facilitate the process of creating a good and viable

outcomes assessment plan. The instructional information within this section unfolds in

a process-oriented manner and culminates in a completed outcomes assessment plan.

While the suggestions below represent best practices in outcomes assessment, we know

that some disciplines have specific needs, which may conflict with these suggestions. In

those instances, use this guide as a starting point and design the outcomes assessment

plan you deem appropriate.

Steen (1999) identified several principles of assessment. According to Steen, assessment is not a single event but a continuous cycle; it must be an open process that promotes valid inferences. It should always employ multiple valid measures of performance. It should measure what is worth learning, not just what is easy to measure, and should support every student’s opportunity to learn. Those engaging in assessment activities should keep these principles in mind when developing an outcomes assessment plan.

How to write a Program Assessment Plan

The first step in writing a Program Assessment Plan (PAP) is to ensure that there is

an alignment between the institution’s mission and the program’s mission, goals, and

learning outcomes.

Defining the program mission statement

The first step in program assessment is to develop a program mission statement. Understanding what your program wants to accomplish is important to developing a successful assessment plan. A mission statement should be in narrative form and broadly state why the program exists. What is the program’s purpose, and whom does it benefit?


How will the program contribute to the education and careers of its graduates? If there

are other stakeholders, who are they and how does the program benefit them? What are

the program’s values? A program mission statement should be distinctive while staying

in alignment with the college’s mission.

Characteristics of a well-written mission statement

A well-written mission statement supports the mission of the College and is developed with input from all faculty members in the program. In addition, it should:

• Lead with a brief statement of the educational purpose that is distinctive to the degree and field of study.
• Include the primary functions or activities of the program.
• Identify stakeholders and how they benefit from the program. In addition to students, this component of the statement can include employers, the community, etc.

General structure of a mission statement

“The mission of (name of program or unit) is to (primary purpose) by providing (primary functions or activities) to (stakeholders).”

Note: The order of each component may vary from the above format, but each should be clearly stated within the mission statement.

Example of a mission statement

The Physical Therapist Assistant Program at Baltimore City Community College provides an accessible and affordable quality education in physical rehabilitation to a diverse student population within Baltimore City and its surrounding areas. The program responds to the changing needs of the health care community by providing the academic, technical, and life skills training needed to produce graduates who are qualified to sit for the National Physical Therapy Exam for Physical Therapist Assistants and become successful licensed Physical Therapist Assistants.


Steps in creating a mission statement

• Using the form found in Appendix D, pp. 69-72, program faculty members brainstorm possible components for the program mission statement. Each faculty member can complete the form individually, then come together to brainstorm, if preferred.
• Using information from the brainstorming session, write a mission statement using the general structure in Appendix D, p. 71.
• Using the checklist in Appendix D, p. 72, review the mission statement.

Defining Program Goals

Goals are broad statements that describe the long-term program targets or directions of development. They state in broad terms what the program wants to accomplish (in terms of student outcomes) or to become over the next several years. Goals focus on what the program is designed to accomplish and are stated in general terms, such as clear communication, problem-solving skills, critical thinking, etc. Goals will help everyone involved to have a clear understanding of what the program is designed to accomplish. They will serve as the foundation for the program student learning outcomes.

How to write a program goal

Remember that goals are broad statements of what you want the students to learn and what you want to accomplish in the future for your program. They can include improving student learning outcomes and retention, minimizing the time it takes to earn a degree, maximizing the rate of employment for graduates, improving board examination results, etc.


Examples of Program Goals

• Prepare dental hygiene students to communicate in a manner that promotes positive interpersonal relationships with patients, peers, employers, and the community. (Dental Hygiene)
• Prepare dental hygiene students to provide total patient care by performing dental hygiene services in an effective and efficient manner, which will exemplify high ethical standards and professional conduct. (Dental Hygiene)
• Maintain 100 percent job placement. (Dental Hygiene)
• Improve the pass rate of the Northeast Regional Board examination by 100 percent. (Dental Hygiene)
• Improve the retention rate among first-year Nursing students. (Nursing)
• Educate at least 80 percent of the faculty in online education methods. (Distance Learning)
• Establish and enforce program policies and procedures that ensure quality education and preparation for the skills needed to succeed as a licensed Physical Therapist Assistant. (Physical Therapy)
• Ensure that program students receive educational instruction and content that reflects current physical therapy practice and community standards. (Physical Therapy)
• Prepare a diverse student body with strong critical thinking skills and judgment to practice physical therapy in a safe, ethical, and legal manner. (Physical Therapy)


It is a best practice to have at least three to five goals for your program. However, if

faculty members can only articulate one or two program goals, then focus on those and

brainstorm additional goals at a later date.

Writing Your Program Goals

Using the directions and questions found on p. 73 in Appendix D, write your program goals. This exercise can be completed as a group, or it can begin by having individual instructors answer the questions and come together later as a group to form consensus. After completing the program goals, use the chart in Appendix E, p. 74, to develop your student learning outcomes.

Aligning Program Goals with Courses

In order to align program goals and courses, program faculty must carefully examine their core curriculum to make sure students are given the opportunity to develop competence in program-level student learning outcomes regardless of when they take a course and/or who teaches it. A curriculum map (see Figure 3, Curriculum Mapping, p. 40) will help program faculty examine their curriculum as it relates to the program-level learning outcomes. The map lists the outcomes in the first row and the required courses in the first column. The resulting grid allows faculty to indicate which courses in the curriculum support the achievement of specific program outcomes.

Curriculum mapping is a valuable process. In many cases, it is the first time a curriculum has been systematically examined to see how individual courses function in the curriculum. Ideally, program curricula would be developed after the identification of the program-level student learning outcomes, and courses and the curriculum would be designed to foster the achievement of those outcomes. In reality, program-level learning outcomes are retrofitted to an existing curriculum rather than driving the curriculum.


Sample Program Learning Outcome Statements

The following is a list of generic and specific examples of program learning outcomes. These serve as ideas of what well-stated and measurable program-level student learning outcomes might look like.

Students should be able to:

• Demonstrate knowledge of the fundamental concepts of the discipline.
• Utilize skills related to the discipline.
• Communicate effectively in the methods related to the discipline.
• Conduct sound research using discipline-appropriate methodologies.

Curriculum mapping is an opportunity to check for alignment in the curriculum. It can identify, for example, outcomes that may not be supported adequately by the curriculum, areas of overlap, and outcomes that have been overlooked. It provides a conceptual framework for both faculty and students to translate the list of required courses into a curriculum designed to support achievement of specific outcomes. Creating a curriculum map may reveal that certain outcomes are not supported by the program’s core curriculum.

Figure 3. Curriculum Mapping
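The grid of courses against outcomes can also be represented as a simple data structure so that coverage gaps like those described above are found mechanically. This sketch uses hypothetical course and outcome names, not an actual BCCC curriculum:

```python
# Hypothetical curriculum map: each required course lists the
# program learning outcomes (PLOs) it supports.
curriculum_map = {
    "BIO 101": {"PLO1", "PLO2"},
    "BIO 202": {"PLO2", "PLO3"},
    "BIO 250": {"PLO1", "PLO3"},
}

def unsupported_outcomes(curriculum_map, program_outcomes):
    """Return program outcomes that no course in the core curriculum supports."""
    covered = set().union(*curriculum_map.values())
    return sorted(set(program_outcomes) - covered)

print(unsupported_outcomes(curriculum_map, ["PLO1", "PLO2", "PLO3", "PLO4"]))
# ['PLO4']
```

An outcome returned by this check is exactly the kind of gap curriculum mapping is meant to surface: a stated program-level outcome that the required courses never address.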


• Generate solutions to problems that may arise in the discipline.
• Summarize and critique research reports published in political science journals. (Political Science)
• Identify, describe, and classify common, and some uncommon, earth materials (minerals and rocks); make scientific observations of these earth materials in the field and in the laboratory; and interpret their observations in a scientifically sound manner. (Geology)
• Analyze the impact of social policies on client systems, [social] workers, and agencies. (Social Work)
• Prepare a set of financial statements in accordance with generally accepted accounting principles. (Accounting)

Developing Program Assessment Tools

Choose comprehensive assessment methods that allow you to assess the strengths and weaknesses of the program. Effective methods of assessment can provide both positive and negative feedback. A combination of qualitative and quantitative methods can offer the most effective way to assess goals and outcomes. Both direct and indirect assessment methods can provide useful insight into students’ experiences and learning in a program. Direct and indirect assessment methods each have unique advantages and disadvantages in terms of the type of data and information they can provide. As a result, many program faculty and administrators choose to incorporate both types of assessment into an assessment plan. A brief list of methods is provided below; to review more comprehensive information, with advantages and disadvantages of various assessment methods, see Appendix E, p. 116.

Examples of Direct Assessment Methods:

• Comprehensive exams
• Embedded assignments (projects, papers, presentations, performances, etc.)
• Internal/external juried review of performances and exhibitions
• Internships and/or clinical evaluations
• Locally developed exams
• Portfolio evaluation
• Pre- and post-tests
• Regional or national tests/exams (certification exams, licensure exams, etc.)
• Senior thesis or major project

Examples of Indirect Assessment Methods:

• Exit interviews
• Focus groups
• Job/graduate school placement statistics
• Graduation and retention rates
• Surveys sent to students, faculty, alumni, employers, etc., that assess perceptions of the program

Determining Success and Meeting the Mark

A lot of time can be spent in developing student learning outcomes and gathering

data, and occasionally people stop there. It is important to “close the loop” and make

sure that assessment data for each program learning outcome is reviewed and used to

make improvements to programs that will enhance the quality of student experiences

and learning. In fact, many experts consider this phase to be the most important part

of assessment.

Walvoord (2004) recommends setting aside at least one faculty meeting a year to discuss the program’s student learning outcomes and assessment plan as one of the easiest ways to make the improvement phase a routine departmental function. This meeting should be at least two hours long and focus on the program’s student learning outcomes, assessment data, and improvements that can be made. It is not necessary to wait to schedule this meeting until the assessment plan and data are “perfect.” Assessment is a work in progress, and any meeting held will be beneficial. Some possible agenda items for this meeting include:

• Share assessment data analysis results with program faculty and staff
• Discuss these assessment results as they relate to each student learning outcome
• Review assessment results to determine programmatic strengths and areas for improvement
• Decide if different assessment methods are needed in order to obtain more targeted information
• Determine how assessment results can be used to make improvements to the program (e.g., changes to the curriculum, professional development for teaching personnel, etc.)
• Develop an action plan to implement these improvements
• Craft specific strategies for implementing the action plan
• Review what needs to be done as the assessment cycle heads back to the planning phase (e.g., Do student learning outcomes need to be revised? Are different assessment methods necessary? etc.)

How to Write a Course-Level Outcomes Assessment Plan

Questions to Consider

An instructor must have a vision of what he or she expects students to know and a clear understanding of the purpose of outcomes assessment. From there, the instructor can define specific goals and objectives. When thinking about developing an outcomes assessment plan, three questions should be asked: (1) What should students know upon completing my course? (2) What learning activities will assist students in attaining this knowledge? and (3) How do I measure students’ success? These three questions speak to the pedagogical soundness of a course and to course alignment, in which there is a clear relationship between the course outcomes, teaching and learning activities, and the assessment.

The key to a meaningful assessment process is a sense of fairness and equitable opportunity to demonstrate learning. To this end, assessment needs to match what is taught, comprise multiple measures of assessment, and provide students multiple experiences to become familiar with the methods of assessment chosen.

Developing and Writing a Student-Learning Outcomes Assessment Plan

The first step in developing an outcomes assessment plan is to define and write the student learning outcomes to be assessed. MSCHE views student learning goals or outcomes as a basis for what qualities or attributes will characterize students after they have completed a course or program, or after they have graduated from college.

Questions to Consider

There are many resources available for assisting faculty with setting goals and creating student learning outcomes that revolve around leading questions (Stufflebeam, 2001) and inventories of teaching goals (Angelo & Cross, 1993). Leading questions can serve as the catalyst for developing student learning outcomes. Some questions to start the brainstorming process are:

• What are the most important things a student will gain from completing the course?
• How will information from student learning outcomes assessment be used to improve student learning?
• How do the knowledge, skills, and competencies of people in the course differ from those in other discipline courses?

After considering these questions, you may begin to write your student learning outcomes (SLOs). SLOs should be defined in action terms (e.g., analyze, interpret, synthesize, etc.). Using Bloom’s Taxonomy as a guide (see Figure 4), think about what you want your students to be able to do and how they should be able to demonstrate your stated expectations.


Figure 4. Bloom’s Taxonomy


How do you know that you have a measurable student learning outcome?

The SLO litmus test, first presented to our SLO/CA Task Force by Dr. Jo-Ellen Asbury, was the “Hey Dad Test.” In this test, you complete the sentence, “Hey, Dad, let me show you how I can…” with the outcome. If the response makes no sense, then the outcome is probably not stated in measurable terms.

For Example:

“Hey, Dad! Watch me...appreciate world religion.”

Versus

“Hey, Dad! Watch me…compare and contrast Eastern

versus Western cultural belief systems with 100 percent

accuracy.”

Format and examples:

General format: Students will be able to (verb + __________________________).

Psychology (Introduction to Psychology – Informational and Computer literacy competency)

Less effective: Students will be able to understand how to write a research paper in APA format.

More effective: Students will be able to write a research paper using appropriate primary sources and writing in APA style with 90 percent accuracy.

Philosophy

Less effective: Students will learn to discuss and understand philosophical theories.

More effective: Students will be able to understand, interpret, explain, analyze, and assess representative philosophical texts, teachings, and problems.


Mathematics (Introduction to Statistics)

Less effective: Students will use parametric and nonparametric tests in a sample research problem.

More effective: Students will be able to apply parametric and nonparametric statistical methods appropriately to solve a sample research problem.

English (Introduction to Term Papers)

Less effective: Students will explain how to write a paper.

More effective: Students will be able to write multiple-page essays that meet college-level academic standards for content, organization, style, grammar, mechanics, and format.

Accounting

Less effective: Students will understand how to write an internal audit report.

More effective: Students will be able to develop and write an internal control training report for a company.

Biology (Anatomy and Physiology)

Less effective: Students will be able to understand the organization of the human body.

More effective: Students will be able to recognize, identify, and describe the basic anatomical structures associated with each major organ system.

Business Administration (Business Ethics)

Less effective: Students will be able to understand the ethical dilemmas in business.

More effective: Students will be able to demonstrate an understanding of how the social, economic, political, technological, and ecological dimensions of internal and external environments create a moral and social context for business decision making.

Art (Introduction to Art)

Less effective: Students will be able to explain various art forms.

More effective: Students will be able to differentiate and analyze the differences between functional, decorative, and religious art forms.


Music

Less effective: Students will be able to create music.

More effective (1): Students will be able to create musical compositions and improvisations utilizing technology.

More effective (2): Students will be able to explain the historical developments leading to the current state of musical technology.

Learning and Teaching Activities

The key to achieving a learning goal lies within teaching and learning activities that align with the learning outcome. The concept of alignment conveys the idea that critical course components such as supported student-learning activities should work together to ensure that students achieve the desired learning outcomes. For instance, when addressing informational and computer literacy, where the learning outcome relates to students being able to write a research paper, one must have supported student learning activities that actually teach how to write a research paper (e.g., how to utilize databases and identify relevant information, writing exercises that focus on prose and organizational development, and opportunities for multiple draft revisions).

Example

Student Learning Outcome: Students will be able to write a research paper in APA style using appropriate primary sources.

Supported Student Learning Activity

Less effective:

(1) Teaching activity: Discuss the history of psychology as a science.

(2) Learning activity: An internet assignment that requires students to locate and summarize an article.


More effective:

• Teaching activity: Discussion on various informational databases, search strategies and Boolean logic, and in-class demonstrations on how to navigate through the databases and use search engines.

• Learning activity: Library exercise that provides students with practice using Boolean logic in various research databases and identifying relevant sources.

Selecting an Assessment Method

There are a number of assessment strategies to use for assessing student learning outcomes. These strategies all come with their own unique strengths and weaknesses. The American Psychological Association (APA) Board of Educational Affairs Task Force on Psychology Major Competencies (2002) developed a matrix identifying those strategies. It is summarized below (see Table 5). The pros and cons outlined and discussed by the APA Education Task Force for each assessment method can be found in Appendix E, p. 116. Often it is difficult to separate the assessment method from the assessment instrument. Nevertheless, to ensure that an outcomes assessment is meaningful, you must separate the method from the actual assignment (i.e., assessment instrument) that is distributed to the student.

Formative Versus Summative

All assessment methods will fall into one of two types: formative and/or summative.

A well-balanced assessment plan will include both types of assessment.

The main purpose of formative assessment is to gather information or feedback that

the instructor can use to make ongoing adjustments to teaching in order to meet student

needs. For example, if the instructor finds that the majority of the students are having

trouble grasping material in a particular area, a review assignment can be arranged. This

type of assessment also gives feedback to the students so they can monitor their progress

and make adjustments as well. Formative assessments can be graded or non-graded assignments. Some examples of formative assessment would be periodic quizzes,

lab assignments, a written summary of the main points from a lecture, an outline for a

paper, or proficiencies that can be repeated after receiving feedback. Formative assessment methods are "low-stakes" assignments that, if graded, have a lower weight assigned to them.

The main purpose of summative assessment is to provide a cumulative grade. It usually occurs at the end of a module or unit, or at the end of a course or program. More traditional types of summative assessment would be mid-term and final examinations.

Some other examples of summative assessment are a final project, a senior recital, a product produced by a student, a reflective writing assignment, a capstone assignment or test, or portfolios consisting of a variety of assignments (Carnegie Mellon University, 2011; University of South Florida, 2011; Fenton & Ward Watkins, 2008).

The differences between formative and summative assessment can sometimes be

confusing. When formative assessment methods are given grades that are incorporated

into the final course grade, they may seem to cross the line. Summative methods can

also be used formatively, when the results are used by students and instructors to help

guide their efforts or activities for another course (Carnegie Mellon University). If one

remembers to keep the main purposes of the two types of assessment in mind and to

use a variety of methods, it is not necessary to be concerned with the subtle differences.

Direct Versus Indirect

Direct assessment of student learning includes course assignments that directly assess student learning and all of the methods previously discussed under "Formative Versus Summative." Assignments such as quizzes, examinations, lab assignments, portfolios, capstone assignments or examinations, and written assignments, to name a few, are examples of direct assessment (Miami-Dade College, 2011).


Indirect assessment methods are administered outside of the classroom or course and are used to complement direct methods of assessment. These types of assessments are used to identify learning outcomes that students are expected to achieve, to develop a systematic assessment program to evaluate previous learning and follow student progress through a sequence of courses, and to create a system of review for each discipline that includes faculty and all appropriate stakeholders. The reviews and results of this type of assessment can be used to revise curricula. Some examples of indirect assessment are curriculum mapping, syllabus analysis for courses with several sections taught by different instructors, successful course progression, transfer success by discipline, job placement rates, student surveys, employer surveys, alumni surveys, student focus groups, accreditation reviews, and feedback from advisory boards (Miami-Dade College).


Developing an Assessment Instrument

Alignment is paramount to developing the instrument that will be distributed to students. The instrument provides evidence of student learning and instructional effectiveness. It should be directly related to the learning outcomes and supported by the learning activity. For instance, if the learning outcome addresses a comparative analysis of Romance languages, then a persuasive essay is not applicable as an assessment instrument.

Instrument Development Considerations

Properly constructed instruments not only provide valuable evidence of student learning but also assist students in structuring their studying efforts. For instance, when

creating a valid quiz, the instructor reviews the percentage of time spent on a concept

at various domain levels. This process helps define the types of questions that should

be included in the quiz. Many instructors utilize a concept table to assist with test item

development.
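The proportional allocation behind a concept table can be sketched in a few lines. This is a hypothetical illustration only: the concept names, time shares, and quiz length below are invented, not taken from this guide.

```python
# Hypothetical concept table: share of class time spent on each concept
# and the total number of quiz questions to allocate across them.
time_share = {"cell structure": 0.40, "genetics": 0.35, "scientific method": 0.25}
total_questions = 20

# Allocate questions in proportion to instructional time (rounded).
allocation = {concept: round(share * total_questions)
              for concept, share in time_share.items()}

print(allocation)
```

With uneven percentages, simple rounding can drift away from the intended total, so the resulting counts are worth rechecking by hand before finalizing the quiz.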


Effective Question Writing

How a question is worded is just as important as the question type. We have seen instances in which an ambiguously worded question caused students anxiety and frustration, and possibly caused them to select an incorrect answer. Ambiguous questions do not measure a student's understanding of a concept. Fenton & Ward Watkins (2008) provide valuable tips for writing effective test questions.

Tips for Writing Effective Test Questions

Multiple-Choice

• The question's focus is a single concept or topic.
• The question does not include irrelevant information; distracters must be plausible.
• The question is written in simple and clear language.
• Negatives or double negatives are avoided.

True or False

• The question should be absolutely true or absolutely false.
• Use clear language.
• Avoid negatives or double negatives.

Short Answer

• Direct questions are preferred to incomplete alternatives.
• For definition-type questions, give the term rather than the definition.

Essay

• The question should include specific tasks or descriptions and use terms such as describe, compare, contrast, and evaluate.

The two things to keep in mind when developing an assessment instrument are validity and clarity. The assessment instrument you use (e.g., essay, multiple-choice exam) must truly measure what it purports to measure. In addition, the students must explicitly know what is expected of them prior to receiving the instrument.


Developing a Scoring Tool

A standard rubric is not required to score students' responses, but it is extremely convenient from both a faculty and a student perspective. From a student's perspective, the rubric is a communicative tool, offering specific guidelines for the expected performance of a task leading to a desired learning outcome. It has a positive effect on student achievement by providing the students with the expectations for an assignment and the criteria by which they will be evaluated. The rubric scoring approach also has beneficial effects for instructors. Rubrics provide an objective method of evaluating student performance in a way that eliminates ambiguity about how and why a student earned a grade. Table 7 provides internet resources available for creating and using rubrics.

Table 7. Rubrics in Higher Education

http://rubistar.4teachers.org/

http://www.rubrics4teachers.com/archive.php

http://www.teach-nology.com/web_tools/rubrics/

http://edtech.kennesaw.edu/intech/rubrics.htm

http://www.merlot.org/merlot/viewMaterial.htm;jsessionid=9BD47758EAACF767A0D4F06732FB8312?id=344540

Determining Success and Meeting the Mark

How do we determine achievement of an outcome on an assessment tool? This is a

question many faculty members ask when constructing a scoring tool and determining

what criteria to use. It is always best not to think about a passing score so much as about

what a student should know. Students’ scores must be compared to something else. The

something else is determined by what the test is designed to reveal.


Linda Suskie (2007) identified several possible perspectives at the 7th Annual Texas A&M Conference. They include local or external standards-based perspectives (competency-based or criterion-referenced), benchmarking (peer-referenced or norm-referenced), best practice (i.e., best in class), value-added (i.e., growth, change, pre-post), longitudinal (i.e., improvement), strengths and weaknesses, and capability (i.e., potential). Each has its advantages and disadvantages, but by taking a multi-dimensional approach, a balance can be achieved.

Summarized in Table 8 are these perspectives, the questions they answer, and their

associated challenges.
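Several of these perspectives are quantitative at heart. As a rough, hypothetical sketch (the scores, criterion value, and variable names below are invented for illustration), a value-added (pre-post) view and a criterion-referenced view of the same assessment data might be computed side by side:

```python
# Hypothetical paired pre-test and post-test scores for one outcome.
pre_scores = [55, 62, 48, 70, 66]
post_scores = [72, 75, 63, 82, 80]

# Value-added (growth, pre-post) perspective: average change per student.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_gain = sum(gains) / len(gains)

# Standards-based (criterion-referenced) perspective: share of students
# meeting a locally set benchmark on the post-test.
criterion = 70  # hypothetical local standard
met = sum(1 for score in post_scores if score >= criterion) / len(post_scores)

print(f"Average gain: {avg_gain:.1f} points")
print(f"Met criterion: {met:.0%} of students")
```

Reporting both numbers together reflects the multi-dimensional approach described above: growth can be strong even when some students fall short of the standard, and vice versa.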


IV. Putting it all together

We all have an important role to play in an integral, complex system where the whole is greater than the sum of its parts. The CWAC provides leadership and management oversight for the BCCC assessment effort. The council's purpose is to ensure that outcomes assessment processes are sustained, produce valuable results, and provide information that can be used to improve the curriculum (see Figure 3, BCCC Outcomes Assessment: Structural Organization and Functional Processes, p. 58).

Figure 3 BCCC Outcomes Assessment: Structural Organization and Functional Processes


Bibliography

Carnegie Mellon University. 26 July 2011.

Cartwright, Rebecca, Ken Weiner, and Samantha Streamer-Veneruso. "Student Learning Outcome Assessment Handbook." August 2009. Montgomery College Web site. 10 May 2011 http://www.montgomerycollege.edu/Departments/outcomes/documents/sloa_handbook.pdf.

Cross, K. Patricia, and Thomas A. Angelo. "Learning Goals." In Middle States Commission on Higher Education, Student Learning Assessment: Options and Resources. 2nd Edition. Philadelphia: Middle States Commission on Higher Education, 2007. 21-23.

Fenton, Celeste, and Brenda Ward Watkins. "Learner-Centered Assessment: Real Strategies for Today's Students." The Cross Papers: Number 11. League for Innovation in the Community College, March 2008.

Fulks, Janet. "Assessing Student Learning in Community Colleges." 2004. Bakersfield College Web site. http://online.bakersfieldcollege.edu/courseassessment/Default.htm.

Gray, Peter. "Viewing Assessment as an Innovation: Leadership and the Change Process." New Directions for Higher Education 11 (1997): 5-15.

Kotter, John. "Leading Change: Why Transformation Efforts Fail." Harvard Business Review, March-April 1995: 1-10.

Miami-Dade College. "Direct Measures of Student Learning." 27 July 2011.

Miami-Dade College. "Indirect Measures of Student Learning." 27 July 2011.

Middaugh, Michael F. Planning and Assessment in Higher Education: Demonstrating Institutional Effectiveness. San Francisco: Jossey-Bass, 2010.

Middle States Commission on Higher Education. "Team Visits: Conducting and Hosting an Evaluation Visit." Student Learning Outcome Assessment. 2009.

Rogers, Everett. Diffusion of Innovations. New York: Free Press, 1995.

Rouseff-Baker, Fay, and Andrew Holm. "Engaging Faculty and Students in Classroom Assessment of Learning." New Directions for Community Colleges 126 (Summer 2004): 29-42.

Steen, Lynn Arthur. "Assessing Assessment." In Assessment Practices in Undergraduate Mathematics. Bonnie Gold, et al., editors. Washington, DC: Mathematical Association of America, 1999.

Stufflebeam, Daniel L. Evaluation Models: New Directions for Evaluation, No. 89. San Francisco: Jossey-Bass, 2001.

Suskie, Linda. "A Framework for Good Assessment Practices." 7th Annual Texas A&M Conference. Philadelphia: Middle States Commission on Higher Education, 2007.

University of Central Florida. "Program Assessment Handbook: Guidelines for Planning and Implementing Quality Enhancing Efforts of Program and Student Learning Outcomes." Handbook. 2011.

University of Massachusetts Amherst. "Course Level Outcome Assessment Handbook." University of Massachusetts Amherst Web site. 15 May 2011 http://www.umass.edu/oapa/oapa/publications/online_handbooks/course_based.pdf.

University of South Florida. "Classroom Assessment." 26 July 2011.

Walvoord, Barbara E. Assessment Clear and Simple. 2nd Edition. San Francisco: Jossey-Bass, 2010.


Appendix A
Student Learning Outcome Data Protocol

Data Recording

• A data spreadsheet will be developed by the Faculty Assessment Team (FAT) in collaboration with Institutional Research (IR).

• Faculty involved in the outcome assessment project will be responsible for recording data for their course.

Data Collection

The Lead FAT member will collect all data spreadsheets from all faculty and submit them to IR.

Data Analysis and Report

IR will analyze the data and produce a data report. Before the analysis report is submitted to the FAT, all identifying material will be removed (i.e., identifying information about instructor or student). The analysis report will contain a distribution of responses by Student Learning Outcome (SLO) and by measure.
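To make "a distribution of responses by SLO and by measure" concrete, here is a minimal, hypothetical sketch. The record layout, SLO labels, and 1–4 rubric scale are invented for illustration; they are not BCCC's actual spreadsheet format.

```python
from collections import Counter

# Hypothetical de-identified records: one row per student response, tagged
# with the SLO assessed and the rubric score earned (1-4 scale assumed).
records = [
    {"slo": "SLO-1", "score": 4},
    {"slo": "SLO-1", "score": 3},
    {"slo": "SLO-1", "score": 3},
    {"slo": "SLO-2", "score": 2},
    {"slo": "SLO-2", "score": 4},
]

def score_distribution(rows):
    """Tally how many responses earned each rubric score, per SLO."""
    dist = {}
    for row in rows:
        dist.setdefault(row["slo"], Counter())[row["score"]] += 1
    return dist

for slo, counts in sorted(score_distribution(records).items()):
    total = sum(counts.values())
    breakdown = ", ".join(f"score {s}: {counts[s]}/{total}" for s in sorted(counts))
    print(f"{slo} -> {breakdown}")
```

Because the tallies carry no instructor or student identifiers, a summary like this is consistent with the de-identification step described above.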


Appendix B
Ethical Use of Student Learning Outcome Data

Without question, student learning outcome assessment should be used to improve student learning and serve as the nucleus for innovative teaching strategies and methods. The context of outcome assessment in higher education raises potential ethical concerns when data are used improperly. More specifically, student learning outcome data should never be used to: (1) identify teaching faculty or students; (2) influence faculty evaluations; or (3) play a role in faculty teaching assignments.


Appendix C
General Education Statement

GE Statement

BCCC defines general education as educational experiences that enable students to become informed, independent, critical thinkers. Through the diverse curriculum, students acquire knowledge and skills to communicate effectively, reason abstractly, gather/evaluate/interpret numerical data as well as written information, draw conclusions based on evidence, apply knowledge to real-world situations, develop an appreciation for social and cultural diversity, value the arts, and become individuals prepared for the lifelong journey of learning and responsible citizenship in their communities, the nation, and the world.

• Communicate effectively in oral and written English (Oral and Written Communication).
• Reason abstractly and think critically (Critical Thinking).
• Understand and interpret numerical data (Numerical Analysis).
• Read with comprehension and draw conclusions based on evidence (Deductive and Inferential Thinking).
• Understand the differences as well as commonalities among people (Multicultural Diversity).
• Understand and utilize skills for living as responsible, ethical, and contributing citizens (Personal Development and Social Responsibility).
• Reflect upon the arts and the role of the arts in the human experience (Arts and Aesthetic Awareness).
• Identify, locate, and effectively use information from various print and electronic sources (Informational and Computer Literacy).

Core Competencies

In parentheses at the end of each competency statement are letters that match the newly developed competencies to the GE requirements as found in COMAR 13B-02-02.16.


I. COMMUNICATION

Students will:

Speak, read, and write effectively and access, evaluate, utilize and organize information from a variety of sources; (a,b)

Analyze and reflect on complex issues, and synthesize ideas in clearly written and well-organized standard English; (c,a)

Demonstrate the basic concepts and practices associated with oral presentations. (a,b)

II. ARTS and HUMANITIES

Students will:

Develop an aesthetic sensibility and the intellectual skills of critical analysis; (c, f, g)

Form artistic judgments by exposure to the rich history and diversity of human knowledge and thought; and (f, g)

Understand the achievements of diverse cultures, as they are expressed in the arts, literature, religions, philosophy, foreign languages, and linguistics. (f, g)

III. SOCIAL SCIENCES and CULTURAL AWARENESS

Students will:

Develop an understanding of the commonalities among world cultures and the influence of culture and the environment on individuals and groups; (f)

Understand the complexities of social and psychological relations and human experiences and the ways in which they have changed over time, through the systematic study of societies and human behaviors; and (c, f)

Utilize critical and ethical reasoning to respond to current events and issues. (c)

IV. MATHEMATICAL and SCIENTIFIC REASONING

Students will:

Analyze and evaluate information from mathematical and scientific perspectives, and develop reasoned solutions to real-world problems; (c, d, e)

Demonstrate knowledge of the scientific method; and (e)

Gain competence in mathematical literacy and reasoning along with critical thinking skills necessary for making informed judgments. (c)


V. PERSONAL DEVELOPMENT and SOCIAL RESPONSIBILITY

Students will:

Develop skills that promote personal wellness, teamwork, lifelong learning and personal life-management skills; (c)

Formulate a framework for ethical decision-making and personal responsibility through critical reflection on their own values, the values of others and the values shaping society; and (c)

Demonstrate traits of a well-educated individual by cultivating a sense of personal responsibility and social accountability. (f)

VI. INFORMATIONAL and TECHNOLOGICAL LITERACY

Students will:

Demonstrate these computer skills: use basic software programs, search the Web, use proper etiquette and security safeguards when communicating through e-mail. (h)

Identify, locate, and effectively use information from various print and electronic resources. (h)

Combine aspects of information literacy and consider the ethical, legal and economic implications of information use. (h)

COMAR Regulations (13B-02-02.16)

An in-State institution shall provide to its students, within the required curriculum for graduation, a general education that is designed to provide the student with the skills and knowledge necessary to:

(a) Communicate effectively in oral and written English;

(b) Read with comprehension;

(c) Reason abstractly and think critically;

(d) Understand and interpret numerical data;

(e) Understand the scientific method;

(f) Recognize and appreciate cultural diversity;

(g) Understand the nature and value of the fine and performing arts; and

(h) Demonstrate information literacy.*


Appendix D

Forms, Templates, and Worksheets


Figure 4: Course Outcome Assessment Tracker


General Guide for Writing a Mission Statement

Write your Mission Statement using the following as a guide: "The mission of (your office name) is to (your primary purpose) by providing (your primary functions or activities) to (your stakeholders)." (Additional clarifying statements)


If you have any “No” answers above, go back and review and revise the mission statement.

Once you have answered “Yes” to all of the questions, the mission statement is complete and you are ready for the next step, writing goals and developing the student learning outcomes for the program.

Adopted and modified from “Creating a Mission Statement” from Stanford University and “The Program Assessment Handbook” from the University of Central Florida


Goal Definition Worksheet

Arrange a program meeting to brainstorm and answer the questions below. As a group, from those answers, start to form goals for the department. The final product of this exercise should be a list of at least three to five broad goals that describe what department faculty believe should be characteristic of graduates in the major. (This exercise could start by having each faculty member in the department complete a copy of this worksheet individually. Then meet to compare notes and discuss results.)

1. Review any current program goals and see if they need to be revised. This information can most likely be found in the course catalog, online, in the program brochure, or in the department mission statement. Write your thoughts and concerns about the current goals.

2. Describe your ideal student in terms of strengths, skills, knowledge and values, and identify which of these characteristics are the result of the program experience.

3. Keeping this ideal student in mind, ask what the student

a. knows b. can do c. cares about

4. What program experiences can you identify as making the greatest contributions to producing and supporting the ideal student?

5. What should every graduate of your program know?

6. What career achievements of your alumni are you most proud of?

Adapted and revised from “Program-Based Review and Assessment” from the University of Massachusetts


Program Learning Outcomes Worksheet

This worksheet may help you and others in your department develop Program Learning Outcomes (PLO) from the goals you have identified. Have all faculty members complete the table below. Then meet as a group to discuss your responses and brainstorm to write learning outcomes for the program. You may end up with more than one SLO for each goal.

Adapted and revised from “Program-Based Review and Assessment” from the University of Massa-chusetts


Program Learning Outcome Assessment Report


Summary and Analysis of Data

Use of Results and Modifications

Student Learning Outcome Pilot Survey
Faculty Survey


Student Learning Outcomes Assessment Plan (SLOAP)

Instructions: Complete the following form with a summary and overview of your outcome assessment plan by summarizing data from Parts I to III in this form.


Part I: Selecting a Student Learning Outcome

Instructions: Develop at least three (3) Student Learning Outcomes (SLO) and provide a selection rationale for choosing each SLO. Selected SLO can be from the Master Syllabus for the course or be ancillary outcomes if there is specific reason for assessment and the discipline agrees they are important to assess.

Note: General Education focus courses for ISLOs must select at least one SLO that aligns with one of the course specific outcomes.


Part II: Planning your Outcome Assessment Activity

Student Supported Learning Activities

Instructions: List each SLO selected for the assessment. Thereafter, make a list of instructional activities that may be used to facilitate student achievement of the outcome. Activities can support multiple outcomes and can be listed for each appropriate outcome. Make sure to survey faculty involved in the outcomes assessment activity (i.e., teaching the course) to get a comprehensive list of potential instructional activities.

*Note: The writers of the plan should include a myriad of activities. Course instructors are not limited in their approach to teaching activities. The planning of this activity is in no way designed to limit academic freedom, but serves as a jump start to ensuring that learning activities align with outcomes. Whatever learning activity is chosen, it should be documented.


Part III: The Assessment Tool

Instructions: In this section, you will provide a copy of your assessment tool (e.g., essay or writing prompt, exam, etc.).


Part IV: The Scoring Tool

Instructions: In this section you will provide your scoring tools (i.e., rubric) and/or benchmark criteria.


Recommendations and Observations


Outcomes Assessment Observations Form

Assessment Results Observations:

Assessment Process (Including SLOs, Assessment Instrument, etc.)

Overall Summary Observations

Preliminary Recommendations (make sure to include recommendations for each area of concern & strength; also focus on activities to be replaced or replicated).

Observations:

a. Strengths:

b. Concerns:

1.

2.


Outcome Assessment Recommendations Form

Division___________________Department_______________________________

Type of Assessment Activity

Course: __________________

1. Summary Outcomes Assessment Report: a. Strengths identified in the Outcomes Assessment Project: (Identification of specific activities or best practices that are working and can be replicated)

b. Weaknesses identified in the Outcomes Assessment Project: (Identification of areas of weakness and specific activities or practices that can be used to improve student learning experiences)

2. Recommendations: (must reflect meaningful, action-oriented activities that can be incorporated into practice; should relate to student performance & learning and include a proposed semester and year of implementation)

General Course Outcome Assessment

Core Competency Assessment
Competency: _____________

Special Project***


Samples of Outcome Assessment Plans


Part I: Selecting a Student Learning Outcome

Student Learning Outcome #1:

Students will be able to identify and use appropriate, relevant resources from textbook and electronic sources from a research database.

Student Learning Outcome Selection Rationale:

This is an essential outcome of this course. Effective integration of relevant, appropriate source material and appropriate documentation is a key area in which many students seem to struggle. Having specific data that illustrates whether or not students are performing satisfactorily is helpful to faculty in preparing them to excel in upper-level courses. We can use this data to determine the need to add additional support for this key skill.

Student Learning Outcome #2:

Students will be able to write a multiple-page document that meets college-level academic standards for style, grammar, mechanics, and format.

Student Learning Outcome Selection Rationale:

This skill is an academic component of SLO #1. Students not only need to be able to effectively integrate relevant and appropriate resources and ethically document those sources, but also need to demonstrate how they can logically organize information into a cogent and cohesive paper. Moreover, this outcome will help assess the impact of not requiring a college-level basic English composition course as a prerequisite.

Student Learning Outcome #3:

Students will be able to use the concepts, language, and major theories to account for psychological developmental phenomena.

Student Learning Outcome Selection Rationale:

Understanding the concepts and principles of Developmental Psychology and how to apply these concepts to explain behavior is a key element of this course. This is a subject introduced in a previous course (i.e., PSY 101 – Introduction to Psychology). Using this authentic assessment, in which students apply real-life experiences in the application of Developmental Psychology, will allow Psychology faculty to examine how well students understand and apply theory to explain observations.


Part II: Planning your Outcome Assessment Activity

Student Supported Learning Activities

Student Learning Outcome #1:

Students will be able to identify and use appropriate, relevant resources from textbook and electronic sources from a research database.

Supporting Learning Activities:

• Practice distinguishing between source material and students' commentary on the reference material (in class)

• Practice using appropriate Boolean logic in databases to search for relevant sources (in library)

• Students will read articles and incorporate quotes, paraphrases, and summaries from these articles into an original paragraph in assignments

• Students will produce works cited or bibliographic information in appropriate formal citation format

Student Learning Outcome #2:

Students will be able to write a multiple-page document that meets college-level academic standards for style, grammar, mechanics, and format.

Supporting Learning Activities:

• Students will practice writing essays in APA format

• Students will participate in an APA writing module

• Students will use the writing lab to assist them with document revision

Student Learning Outcome #3:

Students will be able to use the concepts, language, and major theories to account for psychological developmental phenomena.

Supporting Learning Activities:

• Reviewing videos and films that illustrate developmental theories, principles, and concepts of Developmental Psychology (in class)

• Practice raising a virtual child from newborn to eighteen (18) years of age. Students are required to use a software program that requires them to make child-rearing decisions based on principles, concepts, and theories of Developmental Psychology

• Practice applying principles, concepts, and theories of Developmental Psychology to their own development through journaling

Part III: The Assessment Tool

Student Learning Outcome #1:

Students will be able to identify and use appropriate, relevant resources from textbook and electronic sources from a research database.

Student Learning Outcome #2:

Students will be able to write a multiple-page document that meets college-level academic standards for style, grammar, mechanics, and format.

Student Learning Outcome #3:

Students will be able to use the concepts, language and major theories to account for psychological developmental phenomena.

All three learning outcomes will be assessed through a seven (7) page minimum paper assigned in each course. Instructors have the freedom to assign different ages (Childhood, Adolescence, or Young, Middle, or Late Adulthood), but the end result will be a 7-page minimum paper requiring research, use of sources, and appropriate documentation.

Part IV: The Scoring Tool

4 - Clearly a knowledgeable, practiced, skilled pattern

3 - Evidence of a developing pattern

2 - Superficial, random, limited consistencies

1 - Unacceptable skill application

Part I: Selecting a Student Learning Outcome

Student Learning Outcome #1:

Students will display knowledge in identifying and applying key biological concepts and principles related to scientific method, chemistry of life, cell structure, cell dynamics and genetics.

Student Learning Outcome Selection Rationale:

It is important for students of biology to gain knowledge of the concepts and theories that are presently accepted as the basis for understanding life. Since the cell is the basic unit of living things, its structure and functions are essential.

Student Learning Outcome #2:

Students demonstrate knowledge in scientific reasoning by identifying components of the scientific method, carrying out scientific investigations, analyzing data, drawing conclusions, and presenting their investigation in a lab report.

Student Learning Outcome Selection Rationale:

Students need to develop scientific reasoning to understand previous experiments, to think critically and analyze problems, and to develop a logical method of answering questions.

Student Learning Outcome #3:

Students will communicate clearly and effectively by using appropriate electronic and written techniques to present a research paper pertaining to cell and molecular biology.

Student Learning Outcome Selection Rationale:

Students will critically analyze, comprehend, and review primary cell and molecular biology literature, and communicate effectively by writing a research paper related to the principles of biology.

Part II: Planning your Outcome Assessment Activity

Student Supported Learning Activities

Student Learning Outcome #1:

Students will display knowledge in identifying and applying key biological concepts and principles related to scientific method, chemistry of life, cell structure, cell dynamics and genetics.

Supporting Learning Activities:

• Students will read the assigned chapter from the textbook, PowerPoint slides, and outline of the chapter

• Answer the study guide questions

• Review animations and computerized flash card terminologies

• Review the tutorials posted by the publisher

Student Learning Outcome #2:

Students demonstrate knowledge in scientific reasoning by identifying components of the scientific method, carrying out scientific investigations, analyzing data, drawing conclusions, and presenting their investigation in a lab report.

Supporting Learning Activities:

• Students will read the assigned lab exercise

• Identify and mention the steps in the scientific method

• Perform experiments

• Write lab reports

Student Learning Outcome #3:

Students will communicate clearly and effectively by using appropriate electronic and written techniques to present a research paper pertaining to cell and molecular biology.

Supporting Learning Activities:

• Students will select a topic (to be agreed upon by the student and the instructor to avoid repetition) in “Cell or Molecular Biology”.

• The paper will be written in APA format, based on a review of the literature from the most recent five (5) years and three different sources (e.g., scientific journals, newspapers, magazines, textbooks). The format will include an introduction, subject, and conclusion. A minimum of three (3) pages is required (excluding graphs and pictures).

Part III: The Assessment Tool

Student Learning Outcome #1:

Students will be able to answer questions on biological concepts and principles.

This learning outcome will be assessed using embedded questions in the exams.

Student Learning Outcome #2:

Students will be able to perform lab experiments and write a lab report.

This learning outcome will be evaluated through a lab report.

Student Learning Outcome #3:

Students will communicate clearly and effectively by using appropriate electronic and written techniques to present a research paper pertaining to cell and molecular biology.

This learning outcome will be evaluated using a rubric with a 5-point scale for each performance area.

Part IV: The Scoring Tool

Student Learning Outcome #1:

Analysis of Data: Embedded Questions related to Key Biological Concepts & Principles

Embedded Questions:

#3. Which of the following sequences represents the hierarchy of biological organization from the least to the most complex level?

#10. In animal cells, hydrolytic enzymes are packaged to prevent general destruction of cellular components. Which of the following organelles functions in this compartmentalization?

#12. The mass number of an element can be easily approximated by adding together the number of __________ in an atom of that element.

#13. When a plant cell, such as one from a peony stem, is submerged in a very hypotonic solution, what is likely to occur? The cells will become__________________.

#16. Each element is unique and different from other elements because of the number of protons in the nuclei of its atoms. Which of the following indicates the number of protons in an atom’s nucleus?

#19. Lactose, a sugar in milk, is composed of one glucose molecule joined by a glycosidic linkage to one galactose molecule. How is lactose classified?

#31. White blood cells engulf bacteria through what process?

#35. When two atoms equally share electrons, they will interact to form ___________.

#37. What is the chemical mechanism by which cells make polymers from monomers?

#49. Different atomic forms of an element contain the same number of protons but a different number of neutrons. What are these different atomic forms called?

Total Number of Answers = Number of Embedded Questions Multiplied by Number of Students

Total number of Wrong Answers =

Total number of Correct answers =

% of students answered embedded questions correctly =

Criterion Score ≥ 7
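The tally described above can be sketched as a short script. All numbers below (10 questions, 25 students, 180 correct answers) are illustrative placeholders, not actual course data:

```python
# Hypothetical worked example of the embedded-question tally.
num_questions = 10   # embedded questions on the exam (illustrative)
num_students = 25    # students who sat the exam (illustrative)

# Total Number of Answers = Number of Embedded Questions x Number of Students
total_answers = num_questions * num_students

total_correct = 180                          # sample tally of correct answers
total_wrong = total_answers - total_correct  # remaining answers were wrong

# % of students who answered embedded questions correctly
percent_correct = 100 * total_correct / total_answers
print(f"Total answers: {total_answers}, correct: {total_correct}, "
      f"wrong: {total_wrong}, percent correct: {percent_correct:.1f}%")
```

The percentage computed this way is what would be compared against the criterion score when the real class totals are entered.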

Student Learning Outcome #3:

Part I: Selecting a Student Learning Outcome

Student Learning Outcome #1: Students will perform an intraoral and extraoral inspection on a classmate.

Student Learning Outcome Selection Rationale:

This is an essential outcome of this course and the dental hygiene program. Starting the second semester of the program, the students will be required to perform an oral inspection on each patient they treat in the Dental Hygiene clinic. At this level, students perform the inspection and record a description of what they see. Later in the dental hygiene program, they will learn how to determine diseased states.

Student Learning Outcome #2:

Students will identify all of the structures within the oral cavity and surrounding areas of the head and neck on a classmate.

Student Learning Outcome Selection Rationale:

Being able to identify normal structures of the oral cavity and surrounding areas of the head and neck is essential in learning about abnormal structures later in the dental hygiene program.

Student Learning Outcome #3:

Students will distinguish between the different types of teeth by identifying characteristics of morphology.

Student Learning Outcome Selection Rationale:

As a dental hygienist it is important to be able to differentiate between the different types of teeth.

Part II: Planning your Outcome Assessment Activity

Student Supported Learning Activities

Student Learning Outcome #1:

Students will perform an intraoral and extraoral inspection on a classmate.

Supporting Learning Activities:

• Students will have reading assignments for the oral inspection lab module

• Students will observe a demonstration by the instructor

• Students will practice the oral inspection on a lab partner

• Students will answer questions on both lab quizzes and lecture quizzes and exams

Student Learning Outcome #2:

Students will identify all of the structures within the oral cavity and surrounding areas of the head and neck on a classmate.

Supporting Learning Activities:

• Students will label (and color if desired) the structures within the oral cavity and surrounding areas on a picture

• Students will locate the structures within the oral cavity and surrounding areas on a lab partner as the instructor covers the material

• Students will have reading assignments for the structures within the oral cavity and surrounding areas

• Students will answer questions on both lab quizzes and lecture quizzes and exams

Student Learning Outcome #3:

Students will distinguish between the different types of teeth by identifying characteristics of morphology.

Supporting Learning Activities:

• Students will draw (or color) and label landmarks on pictures of teeth.

• Students will have lab and self-study assignments with extracted teeth.

• Students will be given a mini practical quiz with 3 teeth per student.

• Students will answer questions on both lab quizzes and lecture quizzes and exams

Part III: The Assessment Tool

Student Learning Outcome #1: Students will perform an intraoral and extraoral inspection on a classmate. This inspection will be assessed by an instructor in a chairside observation.

Student Learning Outcome #2: Students will identify all of the structures within the oral cavity and surrounding areas of the head and neck on a classmate. This inspection will be assessed by an instructor in a chairside observation.

Student Learning Outcome #3: Students will distinguish between the different types of teeth by identifying characteristics of morphology using extracted teeth on a practical-style examination.

Part IV: The Scoring Tool

SLO #1: Oral Inspection process evaluation form for DS110

Criteria: Twenty-minute maximum. Conducted chairside.

Minimum Competency Level: 80% accuracy

Instructions: An instructor (RDH) will observe a chairside oral inspection demonstration and enter a number for each item based on the attached rubric.
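The scoring step above can be sketched as follows. The rubric item names and point values here are hypothetical stand-ins (the actual rubric is the attached form); only the 80% minimum competency level comes from the criteria above:

```python
# Hypothetical sketch of tallying a chairside rubric against the
# 80% minimum competency level. Item names and point values are
# illustrative only, not the program's actual rubric.
rubric_scores = {"extraoral palpation": 3,    # points the student earned
                 "intraoral inspection": 4,
                 "documentation": 4}
rubric_max = {"extraoral palpation": 4,       # maximum points per item
              "intraoral inspection": 4,
              "documentation": 5}

earned = sum(rubric_scores.values())
possible = sum(rubric_max.values())
accuracy = 100 * earned / possible            # overall accuracy percentage

competent = accuracy >= 80                    # minimum competency level
print(f"Accuracy: {accuracy:.1f}% -> "
      f"{'meets competency' if competent else 'needs remediation'}")
```

Whatever the real rubric items are, the same division of points earned by points possible yields the accuracy figure the instructor compares to 80%.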

SELF-ASSESSMENT/REFLECTION: STUDENT JOURNALS OR SELF-CRITIQUES

Example 1: Roundtable Interactive Peer Feedback Presentation Assessment:

Assessment Procedure: This presentation format varies from the traditional presentation in that students present their ideas to a series of smaller groups of peers in an interactive roundtable format, as opposed to standing in front of the entire class and presenting with little or no interaction. Each roundtable session lasts about 15 minutes. Students are asked to provide a brief introduction, and then peer groups are permitted to ask questions of the presenter. A rubric outlining what constitutes a quality presentation is included in the course syllabus. Teacher assessment is obtained within one of the peer group sessions. In this session, the teacher is often required to ask questions that elicit evidence of both the content of the presentation and the student’s digestion of the critical issues related to their topic. Given that the presenters move from group to group, roughly the same amount of time is required as for traditional presentations.

Example 2: Response Groups

Response groups are opportunities for small numbers of students to discuss books or events in depth with one another. Often these groups are organized and run by students themselves because they all have read the same book or experienced the same event and want to discuss it. Teachers participating in a response group will gain insight into their students’ thinking skills, group behaviors, and affective characteristics.

Example 3: Peer Review

Peer review is having students review each other’s written work and oral presentations. Introductory-level courses can integrate the process of having students review and comment on materials written by their peers. Traditional peer-review exercises can also be modified with the addition of technological tools to facilitate the process. Students can write and review the writings of their peers online. Strengths and weaknesses may be identified earlier than with other methods, and peer review allows learners to develop as teachers and practice constructive feedback skills.

Research Teams and Group Projects: (i.e., written and oral research projects)

Advantages

• student-centered designs promote engagement

• provides opportunity to practice group skills

• time management

• promotes dependent work at a deeper level

• breadth of assignments can address content coverage issue

• simulates how professional activities/achievements transpire

• produces synergy and excitement around project completion

• creates avenue to synthesize content bases from multiple courses

Disadvantages

• students have limited training in group dynamics

• social loafers can tax equitable judgments about grading

• erroneous ideas that are not caught and corrected spread across group members

• challenging for faculty to judge when to redirect or rescue student groups in trouble

• time-consuming

VI. Archival Measures

Learning Logs: A learning log is a kind of journal that enables students to write across the curriculum. The major reason for using them is to encourage students to be in control of their own learning and to promote thinking through writing.

Student record keeping helps students to better understand their own learning as evidenced by their classroom work. This process of students keeping ongoing records of their work not only engages students, it also helps them, beyond a “grade,” to see where they started and the progress they are making toward the learning goal.

Portfolios: Portfolios are personalized long-term documentation of student mastery of course material. An essential element of portfolios is student reflection on their own learning and progression toward mastery of the material documented in the portfolio. As such, portfolios are windows on the metacognitive process of students.

Online Group Activities: Print record of interactions in a chat room or other internet-based contact.

Advantages

• the data already exist as part of the regular course

• records trends in collaborative skill

• tracks process

• cheap and convenient

• demand characteristics may be reduced

• students have equal opportunity to participate

• faculty monitoring can be unobtrusive

• appeals to some students who may have greater difficulty in oral expression

• provides archive through automatic recording

• documents feedback for instructor on what has been covered or what is still unclear

Disadvantages

• content analysis is time-consuming

• privacy issues can be compromised

• students may be handicapped by computer savvy and tech patterns

• faculty need to be computer savvy

Portfolio Advantages

• shows sophistication in student performance; highlights student strengths and identifies student weaknesses for remediation, if timed properly

• can be used to view learning and development longitudinally

• multiple components of the curriculum can be assessed (e.g., writing, critical thinking, technology skills)

• samples are more likely than test results to reflect student ability when planning, input from others, and similar opportunities common to more work settings are available

• the process of reviewing and evaluating is an excellent opportunity for faculty exchange and development, discussion of curriculum goals and objectives, review of criteria, and program feedback

• greater faculty control over interpretation and use of results

• results are more likely to be meaningful at all levels (student, class, program, institution) and can be used for diagnostic and prescriptive purposes as well

• avoids or minimizes test anxiety and other one-shot measurement problems

VII. Surveys & Interviews (student surveys and exit interviews, alumni surveys)

Surveys and interviews are used to gather students’ opinions about their educational experiences and experts’ opinions about the students’ competence.

Interview: An interview is structured or unstructured dialogue with students in which the student reports his/her reaction or response to a single question or a series of questions. This typically provides an opportunity for the teacher to determine the student’s depth of understanding rather than whether the student can provide the “correct” answer. Questioning may follow a period of observation to discover if the student’s perception of a situation is the same as the observer’s.

Survey: Although this is not formative assessment of their learning, you can learn a great deal by surveying students.

Portfolio Disadvantages

• time-consuming and challenging to evaluate

• space and ownership challenges make evaluation difficult

• content may vary widely among students

• students may fail to remember to collect items

• time-intensive to convert to meaningful data

• management of the collection and evaluation process, including the establishment of reliable and valid grading criteria, is likely to be challenging

• if samples to be included have been previously submitted for course grades, faculty may be concerned that a hidden agenda of the process is to validate their grading

• security concerns may arise as to whether submitted samples are the students’ own work or adhere to other measurement criteria

Survey Advantages

• easy to administer

• inexpensive

• easy to score

• quick feedback

Survey Disadvantages

• can be reliable but not valid

Self-Assessment Survey: A process in which students collect information about their own learning, analyze what it reveals about their progress toward the intended learning goals, and plan the next steps in their learning.

Disadvantages

• validity hinges on good design

• may not be valid

• demand characteristics may distort results

• participants may not have good knowledge about their own attitudes

• participants may demonstrate response bias or dishonesty

• labor-intensive to interpret

Baltimore City Community College

2901 Liberty Heights Avenue

Baltimore, MD 21215