General Education Assessment Plan Fall 2014


Johnson County Community College

Office of Outcomes Assessment

2013-2014 Annual Report

Table of Contents

What is Assessment? .................................................................................................................................... 2

Framework of Assessment at JCCC ............................................................................................................... 3

Guiding Principles of the Academic Assessment Process at JCCC ................................................................ 3

JCCC Statement of General Education .......................................................................................................... 5

General Education Designation .................................................................................................................... 6

General Education Assessment ..................................................................................................................... 7

Distribution of General Education Curriculum ............................................................................................. 9

Mapping the General Education Curriculum .............................................................................................. 10

Timelines for Assessing the Curriculum ...................................................................................................... 10

Assessment Benchmarks............................................................................................................................. 13

Glossary of Terms.................................................................................................................................... 14

Appendices/Forms

General Education Assessment Plan

2 | P a g e

What is Assessment?

“Assessment is the ongoing process of:

Establishing clear, measurable expected outcomes of student learning

Ensuring that students have sufficient opportunities to achieve those outcomes

Systematically gathering, analyzing, and interpreting evidence to determine how well student

learning matches our expectations

Using the resulting information to understand and improve student learning.”

Linda Suskie (2005), Assessing Student Learning: A Common Sense Guide

A process of assessment is essential to continuous improvement, and therefore a commitment

to assessment should be deeply embedded in an institution’s activities. For student learning, a

commitment to assessment would mean assessment at the institutional and program level that

proceeds from clear goals, involves faculty at all points in the process, and analyzes the

assessment results.1

1 Higher Learning Commission “Criteria for Accreditation - Guiding Values” document


Framework of Assessment at JCCC

The faculty of JCCC has established curriculum-wide student learning outcomes for all students.

These outcomes are the heart of a continuous cycle of inquiry, assessment, and improvement.

Regular assessment of student achievement of these outcomes is used to develop

improvement strategies and demonstrate our accountability for our students’ learning.

Assessment produces data we can use to make evidence-based decisions related to curriculum,

instruction, and resources.

Guiding Principles of the Academic Assessment Process at JCCC

1. Assessment is a vehicle for improvement of student learning, not an end in itself. As such,

assessment is fueled by thoughtful questions posed by faculty that identify data to be

collected and analyzed in order to illuminate opportunities to develop and implement

initiatives in curriculum, instruction, and support services. For assessment to function

formatively, assessment results must be used appropriately to provide direction and

guidance for improving curricula and related student experiences.

2. Assessment works best when it is ongoing, not episodic, and when it is multi-dimensional,

employing multiple methods. Assessment is a process whose power is cumulative.

Improvement in student learning is a long-range process.

3. Assessment works best when it has clear, shared, implementable goals. Assessment

activities are goal-oriented and involve comparing student performance with the purposes

and expectations of the faculty as expressed in program and course design.

4. Academic assessment is a curricular issue, and therefore is the responsibility of the

faculty. Faculty-driven assessment is instigated, designed, conducted, analyzed, interpreted

and acted upon by the faculty. Regular assessment of student achievement of student

learning outcomes is used to develop improvement strategies and demonstrate our

accountability for our students’ learning.

5. The independence of instructors, departments, and divisions in approaches to assessment

is crucial to the assessment process. Just as individual faculty value autonomy in assessing

course-level objectives, the faculty value independence in assessing curriculum-wide

student learning outcomes at JCCC. For assessment to be meaningful, it should be localized

in the departments, measure student success using tools the faculty in a discipline respect


and generate information a department can use to improve student learning or update the

curriculum.

6. Assessment results will not be used for evaluation of individual faculty.

7. Assessment data will not be used to make comparative judgments across departments or

divisions. Assessment data is intended to facilitate student, curricular, and college development and is not intended for comparative judgments. For assessing student learning outcomes across the college, results are examined in the aggregate.

8. Successful assessment requires institutional support and resources. Planning and

implementing assessment activities depend on the availability of faculty time, resources and

support to produce meaningful results over a sustained period of time. Ongoing assessment

efforts within departments and divisions require institutional funding for appropriate

staffing, faculty time, and materials.


JCCC Statement of General Education

General education at JCCC combines essential thinking skills with knowledge from areas such as

the arts, communication, humanities, language, mathematics, natural sciences, and social

sciences. It prepares students to become lifelong learners capable of making informed, ethical

decisions in an increasingly complex and diverse global community.

The eight student learning outcomes (SLOs) adopted by the college state that students who pursue a course of study at JCCC will be expected to:

1) Access and evaluate information from credible sources.

2) Collaborate respectfully with others.

3) Communicate effectively through the clear and accurate use of language.

4) Demonstrate an understanding of the broad diversity of the human

experience.

5) Process numeric, symbolic, and graphic information.

6) Comprehend, analyze, and synthesize written, visual and aural material.

7) Select and apply appropriate problem-solving techniques.

8) Use current technology efficiently and responsibly.


General Education Designation

The approval process for courses requesting general education designation requires the

offering to address the following criteria:

1. Is the course a college-level survey course?

The course is not remedial/developmental; focuses on foundational concepts, methods

and theories of use to students beyond specific careers; does not require discipline-specific

knowledge; reflects the Johnson County Community College Statement of

General Education.

2. How does the course build on college-level reading, writing, speaking and/or

mathematical skills, including symbolic, information and visual literacies?

3. How does the course present students with concepts, theories or knowledge that can

be applied across disciplines?

The course presents students with opportunities for analysis and problem solving;

enhances students’ ability to conduct critical inquiry, including skills in researching,

evaluating, applying and synthesizing information; examines various methods of inquiry;

and encourages original thinking.

4. How does the course promote an understanding of knowledge and learning as

collaborative and shaped by social interaction and influences?

The course encourages students to consider a variety of opinions and methods of

research when developing an informed personal viewpoint; enhances students’ ability

to participate in their communities as ethical, independent, reflective and responsible

citizens; supports students to recognize personal, cultural and institutional values; and

encourages students’ responsibility to self and others.

5. How does the course enhance students’ understanding of the human experience? The

course expands students’ understanding of those concepts and theories that shape

society, history, and culture, both nationally and globally; and introduces a variety of

historical, cultural, political, aesthetic, scientific, ethical, or analytical perspectives.

6. How does the course encourage students to consider and construct multiple

perspectives?

The course provides students multiple critical frameworks for considering contemporary

social, ethical, technological, and scientific topics; teaches methods of applying


information across disciplines; encourages students to develop an informed personal

viewpoint; provides students with the opportunity to analyze and interpret information

from a variety of perspectives.

General Education Assessment

The General Education Assessment Plan of the college was built upon the overall assessment

culture of the institution, the work of the General Education Task Force and the Assessment

Council. The assessment of general education courses at the college includes both direct and

indirect forms of assessment taken over multiple classes and over multiple points in time.

Direct Assessment

The direct form of assessment takes place in the classroom in the form of embedded

assessments. The tools of embedded assessment fall into three broad categories and are tied

to specific general education student learning outcomes: 1) Pre/Posttest of content

knowledge; 2) Rubrics designed to measure student artifacts; 3) Questions or Assignments

embedded within coursework.

Embedded assessment is a particularly efficient approach to measuring student learning because it makes use of tasks instructors already assign in their courses. Because the results reflect what actually takes place in the course, they can be used with confidence to drive improvement in the curriculum. Faculty use a variety of assessment methods for embedded assessments, and these vary across disciplines; they can include student artifacts such as writing assignments, exam questions, original artwork, and performances.

Indirect Assessment

The indirect form of assessment used for the purposes of general education student learning

outcomes is a series of questions within the Community College Survey of Student Engagement

(CCSSE). Survey items from CCSSE represent empirically confirmed "good practices" in

undergraduate education. That is, they reflect behaviors by students and institutions that are

associated with desired outcomes of college. CCSSE does not assess student learning directly, but

survey results point to areas where the college is performing well and aspects of the

undergraduate experience that could be improved.

CCSSE is administered in the spring semester to mostly returning students and asks them to

reflect on institutional practices and student behaviors. Specifically, the college will examine


student perceptions of their experience related to over-arching general education skills and

compare these results to national norms for community colleges. The survey questions to be

used include the following specific responses from CCSSE:

“How much has your experience at this college contributed to your knowledge, skills, and

personal development in the following areas?

Acquiring a broad general education

Writing clearly and effectively

Speaking clearly and effectively

Thinking critically and analytically

Solving numerical problems

Using computing and information technology

Working effectively with others

Understanding people of other racial and ethnic backgrounds”


Distribution of General Education Curriculum

The general education curriculum is distributed through 33 departments of the college. By

meeting the college’s general education requirements, a student will be exposed to key

concepts in a range of areas, including the sciences, mathematics, social sciences, humanities,

communication and fine arts, physical education and computer skills. A general education

course offers students concepts, frameworks or patterns of thinking that transcend disciplines

or career fields and allow students to think more deeply about issues. The distribution of

courses across these broad categories is reflected in the following chart.

[Chart: General Education Categories, Fall 2013 data. Number of courses (0-60) in each category: Communication, Humanities, Social Sciences/Economics, Sciences/Math, Physical Education, Computer Skills.]


Mapping the General Education Curriculum

In the Fall 2013 semester, faculty teaching courses designated as satisfying general education

requirements were asked to identify the primary student learning outcome addressed in the

course. Provided below is a mapping of the general education curriculum to the 8 student

learning outcomes. The nearly 200 courses were distributed across 33 departments. Student

learning outcomes 4, 5 and 6 account for the majority of the primary learning outcomes chosen by

the departments; however, all student learning outcomes received coverage in the mapping.

Timelines for Assessing the Curriculum

In addition to mapping the general education curriculum to the student learning outcomes,

faculty were tasked with identifying and implementing common assessment instruments in

their disciplines to be used across sections for the purpose of measuring the primary student

learning outcome for the general education courses. This method provides the best

mechanism for obtaining meaningful and consistent results when assessing general education courses.

All faculty members teaching a general education course are required to participate in

assessment activities. For those departments offering a large number of sections, schedules

were established at the department level to rotate participation over a three year period.

[Chart: General Education Courses Mapped to Student Learning Outcomes, Fall 2013 data. Number of courses (0-90) mapped to each of SLO1 through SLO8.]


Departments and Divisions will collect data each academic year and provide feedback on the

implications of the data for the curriculum. Curriculum impact might include changes to

courses, new textbooks, or overall curriculum changes planned for the coming academic year.

Aggregated data reporting will be completed by the Office of Outcomes Assessment. Reports

will be generated at the institutional level by student learning outcome. Aggregated data

reports will be shared with the divisions/departments, Vice President for Academic Affairs,

Assessment Council and external agencies as warranted.

Timeline for General Education Assessment Submission

August (or before): Designate courses to gather general education assessment data for the coming academic year. Report: Form A (pg. 13). Recipients: Division/Department Liaison; Office of Outcomes Assessment.

December/January: Fall data submitted by course to the department/division designee. Report: Form B (pg. 14). Recipient: Division/Department designee.

January: Faculty members discuss implications of the Fall data. Report: Discussion captured and submitted with the final report. Recipient: Department designee.

May/June: Spring data submitted by course to the department/division designee. Report: Form B (pg. 14). Recipient: Division/Department designee.

June/July: Department/course data aggregated from the Fall and Spring terms; implications of the data reported and action plans developed based on the data. Report: General Education Mastery Matrix (Form C, pgs. 17-24). Recipients: Division/Department Liaison; Office of Outcomes Assessment.

July: Institution-wide data aggregated by student learning outcome; reports generated and disseminated. Recipients: Divisions, Departments, Vice President for Academic Affairs, Assessment Council.

August: Faculty members discuss implications of the academic year data. Report: Findings, discussions, and curriculum decisions. Recipients: Division/Department Liaison; Office of Outcomes Assessment.

The cycle then begins again.


General Education Assessment Cycle

1. Fall/Spring: Designated general education courses collect data.

2. Collected data is forwarded to the department/division designee.

3. Results are aggregated and disseminated to the faculty in the department/division.

4. Faculty engage in discussions on the implications of the data.

5. Results of the discussion and aggregated data are sent to the Office of Outcomes Assessment.

6. Faculty and departments plan and act on the results of the data.

7. Aggregated assessment results are disseminated to key stakeholders.


Assessment Benchmarks

The General Education student learning goals provide a framework of assessment to inform curricular

change and to inform the institution on the state of student learning in the general education program.

The benchmarks noted below are at the institutional level and may vary from the benchmarks designated

by the department/division teaching courses in the General Education curriculum. The targets established

below are initial targets for the institution based on performance data from like institutions and may be

adjusted after the initial year of data collection.

Assessment Methodologies

Direct Assessment of Student Learning Outcomes

1) Access and evaluate information from credible sources.

2) Collaborate respectfully with others.

3) Communicate effectively through the clear and accurate use of language.

4) Demonstrate an understanding of the broad diversity of the human experience.

5) Process numeric, symbolic, and graphic information.

6) Comprehend, analyze, and synthesize written, visual and aural material.

7) Select and apply appropriate problem-solving techniques.

8) Use current technology efficiently and responsibly.

Institutional Benchmark (Student Performance)

Mastery: At least 10-15% of students should gain mastery of SLOs.

Progressing: At least 65-70% of students should be progressing on SLOs.

Low/No Mastery: Less than 20% of students should exhibit low or no mastery of SLOs.

NOTE: Departments should discuss and decide on criteria for Mastery, Progressing, and Low/No Skills in the general education courses in your department.
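Once a department has aggregated its counts, checking them against these institutional targets is simple arithmetic. The sketch below is illustrative only: the function name and the example counts are invented for this document, and the thresholds are taken as the low ends of the ranges stated above.

```python
# Illustrative sketch: compare aggregated student counts against the
# institutional benchmarks (>=10% mastery, >=65% progressing, <20% low/no
# mastery). All names and example numbers here are hypothetical.

def benchmark_report(mastery: int, progressing: int, low_no: int) -> dict:
    """Return each level's share of students and whether each
    institutional benchmark is met."""
    total = mastery + progressing + low_no
    pct = lambda n: round(100 * n / total, 1)  # share as a percentage
    return {
        "mastery_pct": pct(mastery),
        "progressing_pct": pct(progressing),
        "low_no_pct": pct(low_no),
        "meets_mastery": mastery / total >= 0.10,
        "meets_progressing": progressing / total >= 0.65,
        "meets_low_no": low_no / total < 0.20,
    }

# Hypothetical example: of 200 students assessed, 30 attain mastery,
# 140 are progressing, and 30 show low/no mastery.
print(benchmark_report(30, 140, 30))
```

Because the plan lets each department set its own scoring criteria, only the final counts feed into a check like this; the institutional benchmarks apply to the aggregated shares, not to any individual course.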

Indirect Assessment of Student Learning Outcomes

Institutional Benchmarks

Survey results from CCSSE question(s): “How much has your experience at this college contributed to your knowledge, skills, and personal development in the following areas?

Acquiring a broad general education

Writing clearly and effectively

Speaking clearly and effectively

Thinking critically and analytically

Solving numerical problems

Using computing and information technology

Working effectively with others

Understanding people of other racial and ethnic backgrounds”

Student Perception benchmark: National 50th percentile of community college students responding to CCSSE questions.


Glossary of Terms

Assessment: A continuous cycle of deliberative activities aimed at monitoring and improving student learning; the intentional collection of data that documents the level of student learning of targeted core learning outcomes.

Direct Assessment: Evidence produced directly from students’ work that serves to measure a student learning goal. For example: portfolios, performances, papers, lab reports, etc.

General Education Assessment Plan: An assessment plan that uses direct and indirect assessments to measure specific learning goals from courses that meet the general education requirements.

General Education Curriculum: A set of courses that all students must successfully complete to graduate. These courses are designed to combine essential thinking skills with knowledge from areas such as the arts, communication, humanities, language, mathematics, natural sciences, and social sciences.

Indirect Assessment: Techniques that indirectly measure student learning goals, such as surveys that measure student perception of learning.

Rubric: Expectations for student work; a set of criteria used to assess a student artifact such as a paper, presentation, or assignment; scoring guidelines, including levels of performance.

Student Artifacts: Examples of academic work, produced by students, that demonstrate their ability to fulfill a student learning goal.

Student Learning Goal: A statement that specifies what students will know, be able to do, or be able to demonstrate when they have completed or participated in a program/activity/course/project.


Form A

Yearly Course Designation Form for Assessment of

General Education Student Learning Outcomes

Form B

Sample Reporting Template for Divisions/Departments

(For internal reporting of data)


Form A

AY ____ General Education Assessment Plan: Courses Designated for Data Collection

Program/Department: _______________________________

Columns: Course # (provide CRN) | Sections Assessed | Semester Data Collected | SLO being Assessed | Common Assessment Activity | Scoring Mechanism | Data Gathering Process

Legend: SLO = Student Learning Outcome; Common Assessment Activity = see page 27 in the Appendix for examples; Scoring Mechanism = rubric, dichotomous answers, or scaled response; Data Gathering Process = for example, all students in X sections, or a sampling of X students in X sections.


Form B

Sample Report for Internal Data Submitted to Departments/Divisions

Columns: Course(s) | Assessment Method (Pre/Post, Rubric, Embedded, or Other) | Mastery (# Mastery / # in Courses) | Progressing (# Progressing / # in Courses) | Low/No Mastery (# Low/No / # in Courses)

Provide scoring parameters for Mastery:

Provide scoring parameters for Progressing:

Provide scoring parameters for Low/No Mastery:

Most significant assessment findings? (Pedagogical, curricular changes.) Please report on actions taken and on ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C

General Education Mastery Matrix (8 matrices, 1 per Student Learning Outcome)

NOTE: Departments should discuss and decide on criteria for Mastery, Progressing, and Low/No Skills in general education courses.


Form C General Education Mastery Matrix

SLO # 1 – Access and evaluate information from credible sources

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Student uses appropriate methods to find and evaluate information in order to distinguish between credible and non-credible sources and select the most appropriate source(s).

Progressing Definition: Student is able to differentiate between credible and non-credible sources, but does not understand all aspects of relevant tools or evaluate the source fully.

Low/No Mastery Definition: Student cannot consistently differentiate between credible and non-credible sources.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 2 – Collaborate respectfully with others

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Student engages team members in ways that facilitate their contributions to meetings by both constructively building upon and synthesizing the contributions of others.

Progressing Definition: Student engages team members in ways that facilitate their contributions to meetings by restating the views of other team members and/or asking questions for clarification.

Low/No Mastery Definition: Student exhibits low commitment to collaboratively work with team members.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 3 – Communicate effectively through the clear and accurate use of language

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Student consistently presents ideas, information and concepts clearly without error that detracts from communicating accurately with the target audience.

Progressing Definition: Student generally presents ideas, information and concepts clearly, but with error that decreases accuracy of the message communicated with the target audience.

Low/No Mastery Definition: Student presents ideas, information and concepts unclearly, with errors that obscure the message communicated with the target audience.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 4 – Demonstrate an understanding of the broad diversity of the human experience and the individual’s connection to society.

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Student demonstrates evidence of adjustment in attitudes and beliefs, learning from the diversity of communities and cultures.

Progressing Definition: Student has awareness that attitudes and beliefs are different from those of other cultures and communities.

Low/No Mastery Definition: Student expresses attitudes and beliefs as an individual and is indifferent or resistant to what can be learned from the diversity of communities and cultures.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 5 – Process numeric, symbolic and graphic information

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Student provides accurate explanations/processing of mathematical forms and makes appropriate inferences based on that information.

Progressing Definition: Student provides somewhat accurate explanations/processing of mathematical forms, but occasionally makes minor errors related to computations or units.

Low/No Mastery Definition: Student attempts explanations/processing of mathematical forms, but draws incorrect conclusions about what the information means, or commits major computational errors.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 6 – Read, analyze and synthesize written, visual and aural material

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Student consistently recognizes and can communicate implications of the material for contexts, perspectives, or issues.

Progressing Definition: Student draws basic inferences about context and purpose of the material.

Low/No Mastery Definition: Student exhibits little to no understanding of material.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 7 – Select and apply appropriate problem-solving techniques

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Most of the time, a student is able to identify specific problems to be solved and apply an appropriate process and sequence to achieve the solution.

Progressing Definition: More than half the time, a student is able to identify specific problems to be solved and apply an appropriate process and sequence to achieve the solution.

Low/No Mastery Definition: Student is rarely or never able to identify specific problems to be solved and apply an appropriate process and sequence to achieve the solution.

Columns: # in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Form C General Education Mastery Matrix

SLO # 8 – Use current technology efficiently and responsibly

Columns: Course(s) (aggregate reporting of all sections of a course) | Assessment Method (Pre/Post, Rubric, or Embedded) | Total Number of Students Assessed in Courses Being Reported

Mastery Definition: Most of the time the student demonstrates the ability to use technology efficiently and responsibly.

Progressing Definition: Some of the time the student demonstrates the ability to use technology efficiently and responsibly.

Low/No Mastery Definition: Student rarely or never demonstrates the ability to use technology efficiently and responsibly.

# in Class Attaining Mastery | # in Class Progressing | # in Class with Low/No Mastery

Most significant assessment findings? (Pedagogical, instructional or curricular changes). Please report on actions taken and ongoing assessment plans. (Assessment reporting templates can be found in the appendix.)


Resources for Assessment


Cycle of Assessment: Question → Plan → Collect & Score → Analyze & Discuss → Report & Act

Step 1. Meaningful assessment starts with a question! What do I want to know about my students, and why is it important? This is an important step whether the assessment will take place at the course, program, or general education level.

Step 2. Plan. How will you answer your question?

What type of data will help you answer your question?
What is a realistic way to collect the data?
How will student work be evaluated?

Step 3. Collect and Score. How will you collect and score the student work?

How will you manage the data?
Overall, from how many assignments/sections will the data be collected?

Step 4. Analyze and Discuss. What do you observe about the work you and your colleagues have scored?

What is the best way to organize the data you have collected?
How will you analyze the data?
What is the data telling you?
o Discuss the data with colleagues to glean meaning from the data and discuss next steps.

Step 5. Report and Act. Determine how you will use the data to improve student learning and report your findings. Act on your plan.

General Education Assessment Plan

29 | P a g e

Timelines for Executing Assessment Plans

1-3 months prior to the semester in which you intend to collect your data:

Decide who will lead the assessment efforts in your area.
Determine which learning outcome(s) to assess.
Survey the potential tools you might use to conduct the assessment for the learning outcome(s), such as rubrics (for scoring projects, presentations, papers), test(s), etc.
Solicit participation from instructors in appropriate courses.
Collaborate in designing your scoring tool and planning, conferring with participants.
Discuss criteria for Mastery, Progressing, and Low/No Skills in general education courses in your department.
Set your expectations of student performance (before you gather data).

During the period designated for data collection:

Distribute required materials and reminders to those instructors involved in collecting the data.
Collect results / raw data from rubrics and / or test(s).
Create a method to organize and store data, e.g. tables, spreadsheets, or a database. This will provide a consistent method of gathering and reviewing data and results over time.

Upon collecting the data:

Sort and tabulate data.
Discuss data with your colleagues:
o Review your objective and identify relevant history regarding your objective.
o Summarize data.
o Record observations about data, e.g., gaps, successes, relationships, anomalies or provocative data, and/or possible future questions suggested by this data.
Consult with the appropriate institutional departments (e.g., Assessment or Institutional Research) for additional insights on your data.
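For departments that record rubric scores electronically, the sorting and tabulating step can be as simple as counting scores per mastery category. The sketch below is a hypothetical illustration only; the 0-4 score scale and the category cutoffs are assumptions for the example, not values prescribed by this plan:

```python
# Hypothetical helper for tabulating rubric scores into the three
# Form C categories. The 0-4 scale and the cutoffs (3 for Mastery,
# 2 for Progressing) are illustrative assumptions only.
from collections import Counter

def tabulate(scores, mastery_cut=3, progressing_cut=2):
    """Count how many scores fall into each mastery category."""
    def categorize(score):
        if score >= mastery_cut:
            return "Mastery"
        if score >= progressing_cut:
            return "Progressing"
        return "Low/No Mastery"
    return Counter(categorize(s) for s in scores)

# Example: pooled rubric scores from one course's sections.
counts = tabulate([4, 3, 2, 1, 3, 0, 2, 4])
print(counts["Mastery"], counts["Progressing"], counts["Low/No Mastery"])
# prints: 4 2 2
```

A spreadsheet COUNTIF formula accomplishes the same tally; the point is simply to map each student's score to one of the three reporting categories before summarizing.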

After collecting the data:

Analyze and act on the findings:
o Determine implications for instruction
o Identify areas for improvement (curriculum, instruction, materials, placement, scheduling, etc.)
o Plan for implementation (prioritize, specify actions, identify immediate as well as long-range plans, etc.)
Evaluate the assessment and assessment tool:
o Identify benefits of the assessment
o Recognize challenges with the assessment
o Determine changes to be made with the assessment tool and process


Assessment Methods

Selecting appropriate means for assessing student learning is a crucial step in the cycle of assessment. There are many different ways to assess student learning; some common approaches are described below.

Direct versus Indirect Measures of Assessment

Direct measures of assessment require students to represent, produce, or demonstrate their learning. Student portfolios, capstone projects, student performances, case studies, embedded assessments, standardized instruments, and oral exams all provide direct evidence of student learning. Indirect measures share information about students' perceptions of their learning experience and attitudes towards the learning process. Informal observations of student behavior, focus groups, alumni surveys, curriculum and syllabi analysis, exit interviews, and evaluation of retention rates are some examples.

Objective versus Performance Assessment

Objective assessments such as short answer, completion, multiple-choice, true-false, and matching tests are structured tasks that limit responses to brief words or phrases, numbers or symbols, or selection of a single answer choice among a given number of alternatives (Miller & Linn, 2005). Objective assessments capture information about recall of factual knowledge and are less useful for assessing higher-order thinking due to their structured response format that allows for only one best answer. Performance assessments allow for more than one correct answer. They require students to respond to questions by selecting, organizing, creating, performing and/or presenting ideas. For this reason, performance assessments are better at measuring higher-order thinking. However, these assessments are often less reliable than objective assessments since they require expert judgment to score responses.

Embedded and Add-On Assessment

Embedded assessments are tasks that are integrated into specific courses. They usually involve classroom assessment techniques but are designed to collect specific information on program learning outcomes. These assessments are typically graded by course instructors and then pooled across sections to evaluate student learning at the program level. Embedded assessments are highly recommended. They are easy to develop and to administer and can be directly linked to the program's curriculum and learning outcomes. Additionally, students are usually more motivated to show what they are learning since embedded assessments are tied to the grading structure in the course.
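Pooling graded results across sections for aggregate reporting can likewise be sketched in a few lines. The section identifiers and counts below are entirely invented for illustration:

```python
# Hypothetical pooling of embedded-assessment results across sections.
# Section identifiers and counts are illustrative only.
from collections import Counter

section_results = {
    "SECTION-001": Counter({"Mastery": 12, "Progressing": 6, "Low/No Mastery": 2}),
    "SECTION-002": Counter({"Mastery": 9, "Progressing": 8, "Low/No Mastery": 4}),
}

# Counters add element-wise, yielding one aggregate row per course,
# matching the "aggregate reporting of all sections" that Form C asks for.
aggregate = sum(section_results.values(), Counter())
print(dict(aggregate))
# prints: {'Mastery': 21, 'Progressing': 14, 'Low/No Mastery': 6}
```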

Local versus Standardized Assessment

Local assessments are instruments developed by faculty members within a program for internal use only. They are helpful in answering standards-based questions (i.e., whether or not students are meeting objectives within the program), because they can be directly linked to program learning outcomes. Standardized assessments are published instruments developed outside of the institution. They rely on a standard set of administration and scoring procedures and because of this are often more reliable. These assessments provide information about how students in a program compare to students at peer institutions or to national/regional norms and standards. Knowing what you want to assess is key in the selection of standardized instruments. This includes making sure that these assessments contain enough locally relevant information to be useful. It also means that norms should be comparable in terms of the institution's size, mission, and student population in order to draw valid conclusions.

Although standardized assessments are primarily used to generate benchmarking information, they are sometimes used to answer standards-based questions. If you decide to use a standardized assessment for this purpose, first make sure that the test content aligns with your learning outcomes; otherwise interpretations will be invalid. Second, make sure results are reported in the form of subscales so that you can identify where improvements need to be made. Testing companies should be able to provide you with this information.


Examples of Direct Assessment Methods

Capstone Projects – Culminating projects that provide information about how students integrate, synthesize, and transfer learning.
o Assess competence in several areas
o May be independent or collaborative
o Focus on higher-order thinking
o Are useful for program-level assessment
o Examples: exams, integrative papers, projects, oral reports, performances
o Typically discipline-based
o Scoring method: pre-specified rubrics

Course-Embedded Assessment – Assessment procedures that are embedded into a course's curriculum.
o May include test items or projects
o May be take-home or in-class
o Usually developed by the faculty member or department
o Can be used to assess discipline-specific knowledge
o Scoring methods: raw scores or pre-specified rubrics

Performance Assessment – Uses student activities to assess skills and knowledge.
o Assesses what students can demonstrate or produce
o Allows for the evaluation of both process and product
o Focuses on higher-order thinking
o Examples: essay tests, artistic productions, experiments, projects, oral presentations
o Scoring method: pre-specified rubrics

Portfolio Assessment – Collection of student work over time that is used to demonstrate growth and achievement.
o Usually allows students to self-reflect on incorporated work
o May include written assignments, works of art, collections of projects, programs, exams, computational exercises, video or other electronic media, etc.
o Focuses on higher-order thinking
o Scoring method: pre-specified rubrics

Standardized Instruments (including national tests for certificate programs or licensure) – Instruments developed outside the institution with standardized administration and scoring procedures, frequently with time restrictions.
o Psychometrically tested based on a norming group
o Sometimes allow for national comparisons
o Scoring methods: answer key, scored by testing company

Localized Instruments – Instruments developed within the institution, usually by the department, for internal use only, potentially to prepare students for a national test.
o Content may be tailored to match outcomes exactly
o Scoring methods: answer key, scored internally


Rubrics

Rubrics can be very helpful in providing a tool to capture data for assessment purposes, as well as in communicating the parameters of an assignment to students. The assessment data that is gathered may reflect only a portion of the rubric, or the entire rubric. In the example rubric below, the only portion of the rubric collected for assessment purposes involved "Solving Problems".

From "Introduction to Rubrics" by Stevens and Levi – Development of a Rubric

Part 1 – Task Description – usually based on a description from the course syllabus. It reflects a task that we are asking the student to perform (paper, lab, poster, performance, etc.)

Part 2 – Scale – describes how well or poorly any given task has been performed by the student

Part 3 – Dimensions – presents the parts of the task simply and completely, clarifying for the student how the task can be broken down into components (these may also be weighted according to importance by the faculty member)

Part 4 – Description of the Dimensions – should, at a minimum, contain a description of the highest level of performance, but is best used with a range of performance descriptions
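When building a scoring tool, the four parts can be captured in a simple data structure. Everything in the sketch below (task text, level names, weights, descriptions) is an illustrative assumption, not content from Stevens and Levi:

```python
# Hypothetical encoding of the four rubric parts described above.
# All field values are illustrative assumptions.
rubric = {
    # Part 1 - Task Description, usually taken from the syllabus.
    "task_description": "Prepare a poster presenting your lab results.",
    # Part 2 - Scale, from strongest to weakest performance.
    "scale": ["Mastery", "Progressing", "Low/No Mastery"],
    # Part 3 - Dimensions, optionally weighted by importance.
    "dimensions": {
        "Solving Problems": 0.5,
        "Organization": 0.3,
        "Mechanics": 0.2,
    },
    # Part 4 - Descriptions; at minimum, describe the highest level.
    "descriptions": {
        ("Solving Problems", "Mastery"):
            "Identifies the problem and applies an appropriate process.",
    },
}

# Dimension weights should account for the whole task.
total_weight = sum(rubric["dimensions"].values())
print(round(total_weight, 10))
# prints: 1.0
```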

Copies of the AAC&U VALUE Rubrics and other rubric examples are available from the Office of Outcomes Assessment (GEB 262).


Additional assessment resources are available from the following sources:

http://blogs.jccc.edu/outcomesassessment

https://infoshare.jccc.edu/communities/slo/default.aspx

Professional Development Days – Assessment workshops are offered during fall and spring professional development days at the college.

Brown Bag Brownie Breaks (BBBB) – Faculty-driven seminars that explore topics on assessment during the semester over brownies and a soda.

World Café – Offered as part of the professional development days in the fall and spring. World Café offers faculty and departments the opportunity to participate in round-table discussions or reserve a table for departments or assessment teams to work on assessment activities such as brainstorming, writing an assessment plan, or analyzing data.

Coffee Breaks – Monthly informal discussions about assessment which are open to all faculty members (complete with a free cup of coffee).

Outcomes Assessment Mini-Grants – Grants available from the Office of Outcomes Assessment for faculty in the instructional branch. Funding is available in support of evidence-based initiatives to assess student learning outcomes in credit courses. Awards are up to $750.

The Office of Outcomes Assessment Resource Library – GEB 262, x7607

Office of Outcomes Assessment | 204A OCB | 913-469-7607
blogs.jccc.edu/outcomesassessment
jccc.edu/faculty-development/outcomes-assessment