
Assessment 101

Elizabeth Bledsoe

Introduction to Assessment

Components of an Assessment Plan: Mission, Outcomes, Measures, Achievement Targets

Question and Answer Session

Agenda

SACS Comprehensive Standard 3.3.1

3.3 Institutional Effectiveness

3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)

3.3.1.1 educational programs, to include student learning outcomes

3.3.1.2 administrative support services

3.3.1.3 educational support services

3.3.1.4 research within its educational mission, if appropriate

3.3.1.5 community/public service within its educational mission, if appropriate

SACS Expectations


“…and provides evidence of improvement based on analysis of the results…”

The Assessment Circle

Develop Program Mission & Outcomes

Design an Assessment Plan

Implement the Plan & Gather Information

Interpret/Evaluate Information

Modify & Improve

Adapted from: Trudy Banta, IUPUI

Develop Mission and Outcomes

Develop Program Mission & Outcomes

The mission statement links the functions of your unit to the overall mission of the institution.

A few questions to consider in formulating the mission of your unit:

What is the primary function of your unit?

What core activities are involved?

What should those you serve experience after interacting with your unit?

Mission Statement

Brief, concise, distinctive

Clearly identifies the program’s purpose

Clearly aligns with the mission of the division and the University

Explicitly articulates the essential functions/activities of the program

Clearly identifies the primary stakeholders of the program: i.e., students, faculty, parents, etc.

Characteristics of a Well-Defined Mission Statement

“The primary purpose of the Office of Academic Advising is to assist students in the development and implementation of their educational plans. To this end the Office of Academic Advising subscribes to the philosophy of developmental advising; advising is a cooperative effort between advisor and student that consists not only of course planning and selection, but the development of the person as a whole. This includes the selection of career and life-long goals.” (University of La Verne)

Example of a Mission Statement

There are two categories of outcomes:

Learning Outcomes

Program Outcomes

Outcomes

When writing Learning Outcomes, the focus must be on the students and what they will think, know, do, or value as a result of participation in the educational environment.

Learning Outcomes

Cognitive Learning

Knowledge - to recall or remember facts without necessarily understanding them

articulate, define, indicate, name, order, recognize, relate, recall, reproduce, list, tell, describe, identify, show, label, tabulate, quote

Comprehension - to understand and interpret learned information

classify, describe, discuss, explain, express, interpret, contrast, associate, differentiate, extend, translate, review, suggest, restate

Application - to put ideas and concepts to work in solving problems

apply, compute, give examples, investigate, experiment, solve, choose, predict, translate, employ, operate, practice, schedule

Analysis - to break information into its components to see interrelationships

analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, distinguish, examine, investigate, interpret

Synthesis - to use creativity to compose and design something original

arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, plan, prepare, propose, set up

Evaluation - to judge the value of information based on established criteria

appraise, assess, defend, judge, predict, rate, support, evaluate, recommend, convince, conclude, compare, summarize

Affective Learning

appreciate, accept, attempt, challenge, defend, dispute, join, judge, praise, question, share, support

Students who participate in career counseling will be able to define the next step(s) in their career development process.

Students will identify various aspects of architectural diversity in their design projects.

Examples of Learning Outcomes

Process statements: relate to what the unit intends to accomplish

• Level or volume of activity

• Efficiency with which you conduct the processes

• Compliance with external standards of “good practice in the field” or regulations

Satisfaction statements: describe how those you serve rate their satisfaction with your unit’s processes or services

Program Outcomes

Process statements

The Registrar’s office will promptly process transcript requests.

Students will utilize the University Writing Center.

Satisfaction statements

Students will report satisfaction with the usefulness of the registration system.

Transfer students will report satisfaction with admissions application processing.

Examples of Program Outcomes

Consider questions such as:

What are the most important results or impacts that should occur as a result of your unit’s activities?

What are your critical work processes, and how should they function?

What does the end user experience through interaction with your unit?

What should students in your degree program know, do, and/or value?

When Writing Outcomes…

Outcomes should be:

linked to the unit’s mission

realistic and attainable

limited in number (manageable)

something that is under the control of the unit

measurable and/or observable

meaningful

When Writing Outcomes…

Outcomes should also:

target key services or change points

use action verbs

When Writing Outcomes…


Student Learning Outcome Example:

Student Learning Outcome A (vague): Students receiving a degree from this program will be effective communicators.

Student Learning Outcome B (specific and measurable): Students receiving a degree from this program will be able to effectively communicate their research findings both verbally and in writing.


Program Outcome Example:

Program Outcome A (specific and measurable): The Office of Student Financial Aid will respond to all meeting requests within two business days.

Program Outcome B (vague): The Admissions Office will process applications in a timely manner.

Design an Assessment Plan

After establishing your outcomes…

Define and identify the sources of evidence you will use to determine whether you are achieving your outcomes.

Detail what will be measured and how.

Identify or create measures which can inform decisions about your program’s processes and services.

Assessment Measures

Measurable and/or observable

You can observe it, count it, quantify it, etc.

Meaningful

It captures enough of the essential components of the objective to represent it adequately

It will yield vital information about your program

Manageable

It can be measured without excessive cost or effort

Characteristics of an Effective Assessment Measure

There are two basic types of assessment measures:

Direct Measures

Indirect Measures

Types of Assessment Measures (Palomba and Banta, 1999)

Direct measures are those designed to directly measure:

what a stakeholder knows or is able to do (i.e., requires a stakeholder to actually demonstrate the skill or knowledge)

the effectiveness and/or value of the program or process

Direct Measures

Participation data

Observation of behavior

Collection of work samples (student work)

Volume of activity

Level of efficiency (average response time)

Measure of quality (average errors)

Common Direct Measures

Indirect measures focus on:

stakeholders’ perception of their level of learning

stakeholders’ perception of the benefit of programming or intervention

stakeholders’ satisfaction with some aspect of the program or service

Indirect Measures

Surveys

Exit interviews

Retention/graduation data

Demographics

Focus groups

Common Indirect Measures

Some things to think about:

How would you describe the end result of the outcome?

How will you know if this outcome is being accomplished?

What will provide you with this information?

Where are you currently delivering the outcome?

Are there any naturally occurring assessment opportunities?

What measures are currently available?

Choosing Assessment Measures

Some more things to think about:

Will the resulting data provide information that could lead to improvement of your services?

Who will analyze the information and how easily will it fit into their regular responsibilities?

How will it fit into your budget and timeline?

Choosing Assessment Measures

An achievement target is the result, target, benchmark, or value that will represent success at achieving a given outcome.

Achievement targets should be specific numbers or trends.

Achievement Targets

95% of our users will be “very satisfied or satisfied” with our services.

90% of the transcripts will be sent within three days.

Each employee will participate in a minimum of two training/development programs per year.

Students will score a 2.5 out of 4 on the writing rubric.

Examples of Achievement Targets
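Because achievement targets are specific numbers, checking one against raw data can be scripted. Below is a minimal Python sketch of the first example target above (95% "very satisfied or satisfied"); the response data and variable names are hypothetical, invented purely for illustration.

    # Hypothetical sketch: score a satisfaction target against raw survey
    # responses. The data below is invented; only the 95% threshold comes
    # from the example target above.
    responses = ["very satisfied", "satisfied", "neutral", "very satisfied",
                 "satisfied", "dissatisfied", "satisfied", "very satisfied"]

    TARGET = 0.95  # "95% of our users will be 'very satisfied or satisfied'"

    hits = sum(1 for r in responses if r in ("very satisfied", "satisfied"))
    rate = hits / len(responses)
    print(f"{rate:.0%} satisfied (target {TARGET:.0%}): "
          f"{'met' if rate >= TARGET else 'not met'}")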

Implement & Gather Information

Implement the Plan & Gather Information

In a WEAVEonline Assessment Report, “Findings” refer to a concise summary of the results you gathered from a given assessment measure.

The language of this statement should parallel the corresponding achievement target.

Results should be described in enough detail to prove you have met, partially met, or not met the achievement target.

Findings

Example 1:

Achievement Target

• The overall mean score of students from the program will meet or exceed the state average score of 79.

Findings

• The achievement target was met. The overall mean score of students from the Teaching, Learning, and Culture program exceeded the state average score on the state certification exam. Results: program overall mean scaled score 91.50; state overall mean scaled score 79.13.

Examples of Findings Statements

Example 2:

Achievement Target

• 90% of the survey results will indicate the highest level of satisfaction with each of the five services provided by the Office of Student Success.

Findings

• 94% of the survey respondents indicated the highest level of satisfaction with services provided by the Office of Student Success.

Examples of Findings Statements
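A finding’s met/partially met/not met status can be derived mechanically once the target and the result are both numeric. WEAVEonline does not define “partially met” as a number, so the 90%-of-target cutoff in this Python sketch is purely an assumption made for illustration.

    # Illustrative only: classify a finding against its achievement target.
    # The "partially met" cutoff (within 90% of the target) is an invented
    # convention, not a WEAVEonline rule.
    def finding_status(actual: float, target: float,
                       partial_cutoff: float = 0.9) -> str:
        if actual >= target:
            return "Met"
        if actual >= target * partial_cutoff:
            return "Partially Met"
        return "Not Met"

    print(finding_status(94.0, 90.0))   # Example 2 above: 94% vs a 90% target -> Met
    print(finding_status(77.06, 85.0))  # poster-session variety, reported later -> Partially Met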

Interpret/Evaluate Information

Reflect on what has been learned during an assessment cycle:

Based on the analysis of the findings, what changes could or should be made to improve the program?

What specific findings led to this decision?

Analyzing Findings

Analyzing Findings

Three key questions at the heart of the analysis:

What did you find and learn?

So What does that mean for your academic program or support unit?

Now What will you do as a result of the first two answers?

In WEAVEonline, the Analysis Question responses provide an opportunity to explain the Findings analysis process.

Analysis Questions

Modify/Improve

Modify & Improve

After reflecting on the findings, you and your colleagues should determine appropriate action to improve the program.

Actions outlined in the Action Plan should be specific and relate to the outcome and the results of assessment.

Action Plans should not be related to the assessment process itself.

Action Plans

Action Plans

Establish a plan for using evidence of student learning achievement or service quality at the program level.

Establish a decision-making process for approving/implementing recommendations.

Clearly identify the parties responsible for implementing the approved recommendations.

Examples of Assessment Plans

Outcome: Demonstrate timeliness in processing admission applications.

Measure 1 (Direct): Random sample audit of 100 Applications for Admission received by both the Reception Desk and electronically.

Achievement Target: 90% of all Applications for Admission will be processed within two weeks of receipt of application.

Putting It All Together
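As a sketch of how the audit in this example could be scored, the Python snippet below takes received and processed dates for a sampled set of applications and reports the share handled within two weeks. The field names and dates are hypothetical, not drawn from any real admissions system.

    # Hedged sketch: compute the on-time processing rate for a random sample
    # of applications. Dates and field names are invented for illustration.
    from datetime import date, timedelta

    sample = [
        {"received": date(2013, 9, 2), "processed": date(2013, 9, 10)},
        {"received": date(2013, 9, 3), "processed": date(2013, 9, 25)},
        {"received": date(2013, 9, 5), "processed": date(2013, 9, 12)},
    ]

    WINDOW = timedelta(weeks=2)
    on_time = sum(1 for a in sample if a["processed"] - a["received"] <= WINDOW)
    print(f"{on_time / len(sample):.0%} processed within two weeks (target: 90%)")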

Outcome: Undergraduate students who attend the "Registration Overview" during Orientation will be able to successfully utilize the University Registration system.

Measure 1 (Indirect): Online post-registration survey gathered, compiled and released to the Registrar's Office in late September.

Achievement Target: 80% of the undergraduate students who participate in the "Registration Overview" presentation will answer “Agree” or “Strongly Agree” to the statement “The ‘Registration Overview’ helped me understand how to register for classes” on an online post-registration survey.

Putting It All Together

Outcome: Faculty and staff members who participate in FERPA Rules and Regulations training will be able to demonstrate fundamental knowledge of FERPA rules and regulations that pertain to their roles.

Measure 1 (Direct): FERPA training posttest given to faculty and staff members who participate in the online FERPA training webcourse.

Achievement Target: 80% of the faculty and staff members will achieve a score of 90% or better on their first attempt at the FERPA training posttest.

Putting It All Together

Outcome: The Office of Admissions provides useful university information to high school counselors.

Measure 1 (Indirect): An Evaluation Survey is distributed to all participants who attend the Counselor Update seminar.

Achievement Target: 90% of high school counselors attending Counselor Update seminars will indicate that the information provided is a “5” (excellent) on a 5-level scale.

Putting It All Together

Outcome: The University has the optimal number of newly enrolled students to achieve our enrollment goals for the total number of students and credit hour production.

Measure 1 (Direct): The number of new freshmen and transfers enrolled at the conclusion of Late Registration will be used to calculate our success in achieving this outcome for the fall of 2013.

Achievement Target: A total of 2200 incoming freshmen and 1600 incoming transfer students are the enrollment targets set by the Committee for fall 2013 with a total enrollment goal of 28,000.

The Enrollment Management Committee sets targets for the optimal number of incoming students. The optimal number of newly enrolled students for each term is determined by projecting the number of continuing students who will enroll, the capacity of the University to provide instructional resources and the needs of the University for tuition revenue.

Putting It All Together

Outcome: Improve the quality of the commencement ceremony for future participants.

Measure 1 (Indirect): An online post-Commencement survey administered in September 2013 (Summer 2013 graduates), January 2014 (Fall 2013 graduates), and June 2014 (Spring 2014 graduates).

Achievement Target: 95% of the recent graduates for Summer 2013, Fall 2013, and Spring 2014 will answer “Yes” to the question “Was the information accurate on these aspects of your diploma: (1) your name, (2) your degree type, (3) your major?” on an online post-Commencement survey.

Putting It All Together

Office of Institutional Assessment

2012-2013 Assessment Report

Continuous Improvement

To fulfill the 2011-12 action plan to address the unmet target of 80% of conference respondents indicating satisfaction with the variety of poster sessions offered, the Office of Institutional Assessment (OIA), along with the Assessment Conference Committee (ACC), sought more variety in the posters for the 2012 Assessment Conference. As a result, the percentage of respondents satisfied with the variety of posters increased from 74% to 78%. Although the 85% target was still not met during the 2012-13 cycle, this result shows improvement towards the target.

To complete the other 2011-12 action plan, OIA enhanced the Assessment Review Guidelines to include more practical and applicable “good practices” for assessment liaisons to pass along to their programs as formative assessment. Additionally, the Assessment Review Rubric was modified to be more exhaustive in its evaluation of assessment reports. As a result, less variance was observed in the quality of assessment reports. Lastly, the Vice Provost of Academic Affairs supplied each dean with a college-specific, personalized memo addressing the strengths and weaknesses of assessment reports in each college. This process was well received and will continue as a service to colleges from the Office of the Vice Provost.

Outcome/Objective: O 5: Provide Excellent Concurrent and Poster Sessions. Provide excellent concurrent and poster sessions for participants at the Annual Assessment Conference.

Measure: M 8: Overall Assessment Conference Survey

Target: 85%, or more, of the Annual Assessment Conference attendees will report satisfaction with the Concurrent and Poster Sessions.

Finding: Status: Partially Met. Following the end of the 13th Annual Texas A&M Assessment Conference, an online conference evaluation survey was sent out to all attendees. Information gained from this survey was organized into the 13th Annual Conference Survey Report and was distributed to the Assessment Conference Committee for review. Results from the survey questions relating to Concurrent and Poster Sessions are below:

Concurrent Sessions

• Question 16: "How satisfied were you with the quantity of Concurrent Sessions?": 90.58% were "Very Satisfied" or "Satisfied"

• Question 17: "How satisfied were you with the variety of Concurrent Sessions?": 83.71% were "Very Satisfied" or "Satisfied"

Poster Sessions

• Question 19: "How satisfied were you with the quantity of Poster Sessions?": 77.78% were "Very Satisfied" or "Satisfied"

• Question 20: "How satisfied were you with the variety of Poster Sessions?": 77.06% were "Very Satisfied" or "Satisfied"

Action Plan: Although we improved from the 2011-2012 finding of 73%, our findings from the 2012-2013 Assessment Cycle show that only 77% of respondents indicated that they were satisfied with the variety of poster sessions offered. In response, the Office of Institutional Assessment will seek posters from each track to provide a greater variety of posters during the 14th Annual Texas A&M Assessment Conference.

Use of Results

Although the satisfaction results from the conference survey related to the variety of poster sessions increased from 74% to 78%, the 85% target was still not met. In response, the Office of Institutional Assessment (OIA) and the Assessment Conference Committee (ACC) will ensure that each of the conference “tracks” has coverage in the poster session. OIA and the ACC have traditionally ensured track coverage in concurrent session offerings but have never paid close attention to track coverage in the poster session offerings. This strategy includes contacting specific authors of concurrent session proposals in underrepresented tracks and inviting them to consider a poster presentation, perhaps in addition to the concurrent session.

Next, as referenced in the “Enhance Workshop Presentations” action plan for this cycle, according to our “One Minute Evaluations” results, some workshop attendees have requested to see more examples of quality assessment during the workshop. In response, OIA is enhancing our workshop presentations to include screenshots of actual assessment plans and reports from WEAVEonline to allow attendees to work through and critique assessment reports with our staff to gain a better understanding of quality assessment and reporting. One Minute Evaluations will be analyzed again next year to ensure the examples added to the workshops improve our attendees’ reported satisfaction.

You do not have to assess everything every year.

Modify something already being done that is meaningful to the program.

Be flexible: this is an iterative process.

Take-Home Messages

What was the most valuable thing you learned?

What is one question that you still have?

What do you think is the next step that your program needs to take in order to implement course-embedded assessment?

One-Minute Evaluation

For more information on the conference and registration, visit http://assessment.tamu.edu/conference

February 16-18, 2014

College Station, TX

SACS Commission on Colleges. (2008). The Principles of Accreditation: Foundations for Quality Enhancement (2008 ed.).

Palomba, C. A., & Banta, T. W. (1999). Assessment Essentials. San Francisco: Jossey-Bass.

Banta, T. W. (2004). Hallmarks of Effective Outcomes Assessment. San Francisco: John Wiley and Sons.

Walvoord, B. E. (2004). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education. San Francisco: Jossey-Bass.

Assessment manuals from Western Carolina University, Texas Christian University, and the University of Central Florida were very helpful in developing this presentation.

“Putting It All Together” examples adapted from the assessment plans of Georgia State University, the University of North Texas, and the University of Central Florida.

References