


Office of University Assessment Page 1 of 12

Student Affairs Student Learning Outcomes (SLO) Assessment

Review Brief

January 2015

Context: The University Assessment Council (UAC) is charged by the Provost to review student learning outcomes assessment reports and provide feedback to the colleges and other units on these reports. Each college and select units at the institution have a representative on this council. These efforts are essential to the University's continuous improvement of student learning. Under SACSCOC Standard 3.3.1.1 [1], all degree/certificate programs must have student learning outcomes and, hence, a student learning assessment report. To fulfill the charge stated above, at least once annually the UAC members undertake an evaluative review of all of these SLO reports.

Purpose of this Brief: Two briefs are prepared to provide input to each college and unit: (1) an institution-wide reporting brief and (2) a brief on the unit's own report.

Review Process: On December 3rd and 4th, 2014, 23 UAC members evaluated 270 total SLO reports from academic year 2013-14, eleven of which were from Student Affairs. The members used a previously validated rubric, the SLO Assessment Report Rubric (Appendix A), to evaluate the 2013-14 reports. The rubric contains five separate criteria: (1) relationship between assessment tools and outcomes, (2) data collection and research design integrity, (3) results, (4) interpretation of results, and (5) improvement action plan. During this review period, the members did not evaluate the 'benchmark/target' criterion because of a change in the reporting format. Each criterion was evaluated on a three-point scale of Meets Expectations, Emerging, or Does Not Meet. The equivalent ratings in 2012 and prior evaluations were Comprehensive, Needs Improvement, and Absent/Does Not Meet, respectively. For both evaluations, UAC members were asked to provide comments to assist the units in making improvements to their student learning process/environment.
Due to the length of the comments, they are not included in this brief; however, they are provided within the evaluated rubric for each degree program.

Description of this Brief: This brief contains:

- Degree Programs Evaluation Status
- Number of Reports Evaluated by Level
- Degree Program Report Status and Compliance Rating
- 2013-2014 Evaluation Overall Scores
- 2013-2014 Evaluation by Rubric Criteria
- Five Year Overall Score Trends
- Five Year Overall Score Trends by Rubric Criteria
- Next Steps

Appendix

[1] SACSCOC Standard 3.3.1.1: The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in: educational programs, to include student learning outcomes (page 27, The Principles of Accreditation: Foundations for Quality Enhancement, 2012).


DEGREE PROGRAMS EVALUATION STATUS

Program | Unit | Status
Campus Recreation | Student Affairs | Submitted
Counseling & Testing Center | Student Affairs | Submitted
Dining Services | Student Affairs | Exempt
Disability Resource Center | Student Affairs | Exempt
Fraternities & Sororities | Student Affairs | Submitted
New Student and Parent Programs | Student Affairs | Submitted
Off Campus Student Services | Student Affairs | Submitted
Residence Life | Student Affairs | Submitted
Student Center | Student Affairs | Submitted
Student Conduct | Student Affairs | Submitted
Student Involvement | Student Affairs | Submitted
Substance Education & Responsibility (SEAR) | Student Affairs | Submitted
Violence Intervention & Prevention Center | Student Affairs | Submitted


NUMBER OF REPORTS EVALUATED BY LEVEL

Bachelor: 0 (0%) | Master: 0 (0%) | Doctor: 0 (0%) | Specialist: 0 (0%) | Certificate: 0 (0%) | Other: 11 (100%)

DEGREE PROGRAM REPORT STATUS

Submitted: 11 | Not Submitted: 0 | Exempt: 2

COLLEGE COMPLIANCE RATING

Submitted: 85% | Not Submitted: 0% | Exempt: 15%


2013-2014 EVALUATION OVERALL SCORES

All programs below are Student Affairs units.

Program | Overall Score
Campus Recreation | Meets Expectations
Counseling Center | Meets Expectations
Student Center | Meets Expectations
Student Involvement | Meets Expectations
Fraternity & Sorority Affairs | Emerging
New Student and Parent Programs | Emerging
Off Campus Student Services | Emerging
Residence Life | Emerging
Student Conduct | Emerging
Substance Education & Responsibility (SEAR) | Emerging
Violence Intervention & Prevention Center | Does Not Meet Expectations

No Student Affairs reports were Not Submitted.


2013-2014 EVALUATION BY RUBRIC CRITERIA

Criterion | Meets Expectations | Emerging | Does Not Meet
Relationship between assessment tool and outcomes | 4 (36%) | 7 (64%) | 0 (0%)
Data collection and research design integrity | 6 (55%) | 5 (45%) | 0 (0%)
Results | 3 (27%) | 8 (73%) | 0 (0%)


Criterion | Meets Expectations | Emerging | Does Not Meet
Interpretation of Results | 6 (55%) | 4 (36%) | 1 (9%)
Improvement Action | 4 (36%) | 7 (64%) | 0 (0%)


FIVE YEAR OVERALL SCORE TRENDS

All units below are in Student Affairs.

Unit | 2010 | 2011 | 2012 | 2013 | 2014
Campus Recreation | Emerging | Comprehensive | Comprehensive | Meets Expectations | Meets Expectations
Counseling & Testing Center | Emerging | Comprehensive | Comprehensive | Meets Expectations | Meets Expectations
Dining Services | Emerging | Emerging | Needs Improvement | N/A | N/A
Disability Resource Center | Comprehensive | Emerging | Emerging | N/A | N/A
Fraternities & Sororities | Emerging | Comprehensive | Emerging | Meets Expectations | Emerging
New Student and Parent Programs | Comprehensive | Comprehensive | Comprehensive | Meets Expectations | Emerging
Off Campus Student Services | N/A | Comprehensive | Comprehensive | Meets Expectations | Emerging
Residence Life | Comprehensive | Comprehensive | Emerging | Meets Expectations | Emerging
Student Center | Comprehensive | Emerging | Comprehensive | Meets Expectations | Meets Expectations
Student Conduct | Emerging | Emerging | Needs Improvement | Meets Expectations | Emerging
Student Involvement | Comprehensive | Comprehensive | Comprehensive | Meets Expectations | Meets Expectations
Substance Education & Responsibility (SEAR) | Emerging | Emerging | Emerging | Meets Expectations | Emerging
Violence Intervention & Prevention Center | Emerging | Comprehensive | Comprehensive | Emerging | Does Not Meet


FIVE YEAR OVERALL SCORE TRENDS BY RUBRIC CRITERIA

Relationship between assessment tools and outcomes

Rating | Nov 2010 | Nov 2011 | Nov 2012 | Nov 2013 | Nov 2014
Meets Expectations | 6 (50.00%) | 7 (58.33%) | 6 (46.15%) | 7 (63.64%) | 4 (36.36%)
Emerging | 6 (50.00%) | 5 (41.67%) | 7 (53.85%) | 4 (36.36%) | 7 (63.64%)
Does Not Meet | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%)
Total | 12 (100%) | 12 (100%) | 13 (100%) | 11 (100%) | 11 (100%)

Data Collection and Research Design Integrity

Rating | Nov 2010 | Nov 2011 | Nov 2012 | Nov 2013 | Nov 2014
Meets Expectations | 3 (25.00%) | 8 (66.67%) | 12 (92.31%) | 11 (100.00%) | 6 (54.55%)
Emerging | 9 (75.00%) | 4 (33.33%) | 1 (7.69%) | 0 (0.00%) | 5 (45.45%)
Does Not Meet | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%)
Total | 12 (100%) | 12 (100%) | 13 (100%) | 11 (100%) | 11 (100%)

Results

Rating | Nov 2010 | Nov 2011 | Nov 2012 | Nov 2013 | Nov 2014
Meets Expectations | 7 (58.33%) | 11 (91.67%) | 7 (53.85%) | 9 (81.82%) | 3 (27.27%)
Emerging | 4 (33.33%) | 1 (8.33%) | 6 (46.15%) | 2 (18.18%) | 8 (72.73%)
Does Not Meet | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%)
Total | 11 (92%) | 12 (100%) | 13 (100%) | 11 (100%) | 11 (100%)


FIVE YEAR OVERALL SCORE TRENDS BY RUBRIC CRITERIA (continued)

Interpretation of Results

Rating | Nov 2010 | Nov 2011 | Nov 2012 | Nov 2013 | Nov 2014
Meets Expectations | 4 (33.33%) | 9 (75.00%) | 8 (61.54%) | 8 (72.73%) | 6 (54.55%)
Emerging | 8 (66.67%) | 3 (25.00%) | 4 (30.77%) | 3 (27.27%) | 4 (36.36%)
Does Not Meet | 0 (0.00%) | 0 (0.00%) | 1 (7.69%) | 0 (0.00%) | 1 (9.09%)
Total | 12 (100%) | 12 (100%) | 13 (100%) | 11 (100%) | 11 (100%)

Improvement Action

Rating | Nov 2010 | Nov 2011 | Nov 2012 | Nov 2013 | Nov 2014
Meets Expectations | 6 (50.00%) | 10 (83.33%) | 10 (76.92%) | 8 (72.73%) | 4 (36.36%)
Emerging | 6 (50.00%) | 2 (16.67%) | 3 (23.08%) | 3 (27.27%) | 7 (63.64%)
Does Not Meet | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%) | 0 (0.00%)
Total | 12 (100%) | 12 (100%) | 13 (100%) | 11 (100%) | 11 (100%)

NEXT STEPS

- Review the data presented in both review briefs.
- Review the individual unit results and comments provided from the UAC evaluative review.
- Share the results with department chairs and/or faculty.
- Encourage departments/units to review the comments provided by the UAC members in order to improve their 2014-2015 SLO reports.
- Remind departments/units to contact the Office of University Assessment for any guidance or assistance in revising their assessment processes and/or reporting efforts.
- Contact Tara Rose, Director of University Assessment, at [email protected], or Brandon Combs, Assistant Director of University Assessment, at [email protected].


APPENDIX A

Student Learning Outcome Assessment Report Rubric

Approved November 2013

Each category will be scored as Meets Expectations (2 points), Emerging (1 point), or Does Not Meet Expectations (0 points). The points for each category will be added together to provide an overall score for the Student Learning Outcome Report. Final scoring categories are as follows:

Meets Expectations: 10-12 points
Emerging: 6-9 points
Does Not Meet Expectations: 0-5 points

Scores will be reported to the Dean and the UAC liaisons for each college or division.
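For illustration, the point arithmetic above can be sketched in a few lines of Python. This is a sketch only, not an official scoring tool; the names `POINTS` and `overall_rating` are not from the rubric, and it assumes six scored categories (giving the 12-point maximum).

```python
# Points per rating, as defined in the rubric above (illustrative sketch).
POINTS = {"Meets Expectations": 2, "Emerging": 1, "Does Not Meet": 0}

def overall_rating(criterion_ratings):
    """Sum the per-category points and map the total to a final category.

    `criterion_ratings` holds one rating string per scored rubric
    category (six categories assumed, so totals range from 0 to 12).
    """
    total = sum(POINTS[r] for r in criterion_ratings)
    if total >= 10:
        return "Meets Expectations"   # 10-12 points
    if total >= 6:
        return "Emerging"             # 6-9 points
    return "Does Not Meet"            # 0-5 points
```

For example, a report rated Emerging on all six categories totals 6 points and falls in the Emerging band.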

I. Method(s)

A. Relationship between assessment tools and outcomes

Meets Expectations (2 points): A general explanation is provided about how the assessment tool relates to the outcome measured (e.g., the faculty wrote test items, essay questions, etc., to match the outcome, or the instrument was selected "because its general description appeared to match our outcome"). May include pass rates for license or certification exams. Assessment tools specified by a program's accrediting body are considered to meet expectations, but it must be made clear to the reader that the tool is chosen by the accrediting body. If more than one outcome is linked to any one assessment tool, an explanation is provided for how each outcome can be measured using only one tool.

Emerging (1 point): At a superficial level, it appears the content assessed by the assessment tools matches the outcome, but no explanation is provided. Assessment tools are primarily indirect, and include things like head counts and course pass rates.

Does Not Meet Expectations (0 points): Seemingly no relationship between outcome and assessment tools.


B. Data collection and research design integrity

Meets Expectations (2 points): Enough information is provided to understand the data collection process, such as a description of the sample, evaluation protocol, evaluation conditions, student motivation, and when and where the data were collected (e.g., were students sampled or was the population evaluated; adequate motivation; two or more trained raters for performance assessment; pre-post design to measure gain; cutoff defended for performance vs. a criterion).

Emerging (1 point): Limited information is provided about data collection, such as who and how many took the assessment, but not enough to judge the veracity of the process (e.g., thirty-five seniors took the test). There appears to be a mismatch with the specification of desired results.

Does Not Meet Expectations (0 points): No information is provided about the data collection process, or data were not collected.

C. Specification of desired benchmark/target (not scored during this evaluation)

Meets Expectations (2 points): A desired benchmark/target is specified (e.g., our students will gain ½ standard deviation from junior to senior year; our students will score above a faculty-determined standard). "Gathering baseline data" is acceptable for this rating. Enough information was provided to understand how the benchmark was determined.

Emerging (1 point): A desired result is stated (e.g., student growth, comparison to previous year's data, comparison to faculty standards, performance vs. a criterion), but it lacks specificity (e.g., students will grow; students will perform better than last year).

Does Not Meet Expectations (0 points): No a priori benchmarks/targets for outcomes.

II. Results

Meets Expectations (2 points): Results are present and directly relate to outcomes. The desired benchmarks for the outcomes are clearly presented and were derived by appropriate analysis. If a rubric or grading scale was used, it is clear how many in the sample scored in each category.

Emerging (1 point): Results are present, but it is unclear how they relate to the outcomes or the benchmark/target for the outcomes, or the presentation lacks clarity or is difficult to follow. Only aggregate totals are given (e.g., 80% of the students met the target).

Does Not Meet Expectations (0 points): No results presented.


III. Interpretation of Results

Meets Expectations (2 points): Interpretations of results seem to be reasonable inferences given the outcomes, benchmarks/targets, and methodology. The interpretation reflects a discussion of the results by pertinent parties. The position of the person or persons involved in the analysis is listed.

Emerging (1 point): Interpretation is attempted, but it does not refer back to the outcomes or benchmarks/targets for the outcomes, or the interpretations are clearly not supported by the methodology and/or results. There is no mention of the person or persons who completed the analysis.

Does Not Meet Expectations (0 points): No interpretation attempted; the analysis simply repeats what was stated in the Results category.

IV. Improvement Action

Meets Expectations (2 points): Examples of improvements (or plans to improve) are documented and directly related to the findings of assessment. These improvements are very specific (e.g., approximate dates of and person(s) responsible for implementation, and where in the curriculum/activities and department/program they will occur). If no improvements are found to be necessary, the program must either increase the benchmark or explain why the benchmark does not need to be increased, state plans to focus on another area of concern for future assessments, and work to monitor and maintain the current level of success for this outcome.

Emerging (1 point): Examples of improvements are documented, but the link between them and the assessment findings is not clear. The improvements lack specificity.

Does Not Meet Expectations (0 points): No mention of any improvements.