
University of Rochester

Report of

Student Learning Assessment

December 2016


UCEEA Annual Report December 2016 Page 1

University Committee

On Educational Effectiveness Assessment

Student Learning Assessment across the University

Report Date: December 2016

Dr. Jane Marie Souza Reporting

Assessment of Student Learning across the University continues to be driven by faculty and staff within the institution's Schools. The individual Schools are best positioned to develop and support useful assessment processes that will result in continuous improvement of their educational effectiveness. However, if communication about assessment practices across the Schools is not planned and supported, there is the risk of missing opportunities for rich conversations about best practices, challenges, and lessons learned. Moreover, without intentional sharing of assessment plans and data, it would be difficult to report on learning outcomes assessment from the University perspective.

Established and charged by the Provost, a university-wide committee brings together personnel from all Schools and the University Dean of Graduate Studies to support sustainable assessment processes, share assessment strategies, report on school-level activity, and participate in professional development activities related to learning outcomes assessment. This committee is named the University Committee on Educational Effectiveness Assessment (UCEEA) to reflect the group's focus on Middle States Commission on Higher Education (MSCHE) Standard V: Educational Effectiveness Assessment.

"Useful assessment processes help faculty and staff make appropriate decisions about improving programs and services, developing goals and plans, and making resource allocations. Because institutions, their students, and their environments are continually evolving, effective assessments cannot be static; they must be reviewed periodically and adapted in order to remain useful."

Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations, http://msche.org/publications.asp


University Committee on

Educational Effectiveness Assessment

Jane Marie Souza, Assistant Provost for Academic Administration, Chair UCEEA
Greg Dobson, Simon Business School, Associate Professor
Linda Lipani, Eastman Institute for Oral Health, Registrar and Director of Student Affairs
John Hain, Eastman School of Music, Associate Dean of Academic Affairs / Director of Assessment
Karen DeAngelis, Warner School of Education, Associate Professor and Associate Dean for Academic Programs
Margaret Kearney, Office of Graduate Studies, Vice Provost and University Dean of Graduate Studies
Edith Lord, School of Medicine and Dentistry, Professor and Senior Associate Dean for Graduate Education and Postdoctoral Affairs
Josephine Seddon, Arts, Science and Engineering, Director of Educational Effectiveness
Christopher Mooney, School of Medicine and Dentistry – Medical Education, Director of Assessment
Bethel Powers, School of Nursing, Professor and Director of Evaluation Office and the PhD Program
James Zavislan, Hajim School of Engineering and Applied Sciences, Associate Dean, Education and New Initiatives

ACADEMIC YEAR 2015-2016

UCEEA ACTIVITIES

During the past academic year there were multiple changes to the UCEEA membership as follows:

Greg Dobson, Simon Business School
Linda Lipani, Eastman Institute for Oral Health
Karen DeAngelis, Warner School of Education
Edith Lord, School of Medicine and Dentistry, Graduate Education
Josephine Seddon, Arts, Sciences and Engineering

These personnel changes, along with significant regional accreditation process revisions, informed many of the agenda topics for the year's UCEEA meetings. Presentations by the veteran members of the committee served to inform new members as well as provide valuable updates to all.

Dr. Margaret Kearney, Vice Provost and University Dean of Graduate Studies, shared her experience at the Association of American Universities Data Exchange meeting. She emphasized the call to post more data publicly with respect to PhD outcomes, including time to degree, completion rates, and career outcomes.

Dr. Christopher Mooney, Director of Assessment for the School of Medicine and Dentistry – Medical Education, shared two presentations he delivered at an Association of American Medical Colleges conference. His presentations offered valuable strategies for formative assessment and peer evaluation. He noted that the UCEEA can serve as a "sounding board" to vet new assessment initiatives.


Dr. John Hain, Associate Dean at the Eastman School of Music, shared the executive summary assessment plan for the School, providing methods for gathering both direct and indirect assessment data. He demonstrated a process for use of assessment data to inform decision making. The committee noted that while assessment methods may vary greatly between Schools (e.g., the School of Music approach is very different from the School of Medicine approach), the executive assessment plans for all schools demonstrate planned, rather than ad hoc, approaches to outcomes assessment.

Dr. Jane Marie Souza, Assistant Provost for Academic Administration, was invited to serve on the Middle States Commission on Higher Education working group tasked with articulating the implementation of the new accreditation processes. Before traveling to Philadelphia to begin that work, she sought the recommendations of the UCEEA regarding what members would consider to be the important outcomes to report to the commission. The committee provided valuable input to share with the MSCHE working group.

Dr. Souza later updated the committee on the information provided at MSCHE Town Hall meetings, which addressed the outcomes of the working groups. She outlined the new Annual Institutional Update requirements and the new eight-year accreditation cycle.

The year's UCEEA meetings concluded with robust discussions regarding the possible uses of an ePortfolio system. This group will take the lead on vetting ePortfolio vendors for an academic year 2017-2018 pilot. The discussions highlighted how the use of such a tool would vary greatly across the schools, and confirmed that any system adopted would need to allow for autonomous use based on each school's priorities.

Additional actions resulting from the work of the UCEEA include updating the Office of the Provost web page with information to support verification of compliance with accreditation-relevant federal regulations. (See http://rochester.edu/provost/compliance/index.html.)

The Office of the Provost web page was also updated to include three frames of the National Institute for Learning Outcomes Assessment Transparency Framework: Student Learning Outcomes Statements, Assessment Plans, and Assessment Resources. (See http://rochester.edu/provost/assessment/index.html.)

SCHOOL ASSESSMENT REPORTS 2015-2016

The schools were asked to provide summaries of changes implemented or in the planning stages to improve teaching and learning during the past academic year. As noted in the previous reporting cycle, the format for the reports should include the articulated goals, the activities supporting them, assessment strategies to determine progress, and how data will be used. These annual reports are not meant to replace more comprehensive reports that may be required by the school deans. The reports provided below reflect each school's unique style and teaching environment. Collectively, they demonstrate the range of strategies employed to continuously improve teaching and learning across the university.


Assessment Reports Table of Contents

Eastman Institute for Oral Health, Linda Lipani
Eastman School of Music, Dr. John Hain
Hajim School of Engineering and Applied Sciences, Dr. James Zavislan
School of Medicine and Dentistry – Graduate Programs, Dr. Edith Lord
School of Medicine and Dentistry – Medical Education, Dr. Christopher Mooney
School of Nursing, Dr. Bethel Powers
University Graduate Studies, Dr. Margaret Kearney
Warner School of Education, Dr. Karen DeAngelis
The College, Josephine Seddon
Simon Business School, Dr. Greg Dobson

Professional Development Activity Reports, 2016 and 2015


EASTMAN INSTITUTE FOR ORAL HEALTH

Assessment Report for Academic Year 2015-2016

Linda Lipani Reporting

GOAL 1: To develop a syllabus template to be used for all core courses offered within the School. Request SMD review of EDD courses to determine if they could be applied to SMD MS degree programs.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL:

An official syllabus was developed to include major components of a standard syllabus (relevant information about the course, statements of course objectives, activities by which the objectives will be met, list of text and required readings, assignments, grading criteria, course policies, schedule, etc.). Course directors were asked to complete each section of the syllabus.

The EIOH Curriculum Committee reviewed each syllabus carefully to ensure compliance with the standard syllabus. Deficiencies were noted and course directors were asked to revise syllabi as needed. Contact hours and out-of-classroom work were also reviewed to ensure credit hours were appropriately assigned to each course.

SMD review of EDD courses did not occur. EIOH developed two new master’s degrees and submitted proposals to NYSED. Courses included in the proposal were reviewed for graduate-level credit worthiness in consultation with the SMD. Two new Master of Science degrees were approved and will be implemented for fall 2017 admission: Master of Science, Dental Science, Clinical and Translational Science and Master of Science, Dental Science, Infectious Diseases.

ASSESSMENT STRATEGIES AND TARGETS:

Questions were incorporated in end-of-semester course evaluations to determine if the syllabus helped students manage their learning in the course, including questions regarding its usefulness and clarity, especially in the areas of course goals and learning outcomes.

Target: Results of course evaluation responses will be used to further develop the standard course syllabus and to improve syllabi for individual courses.


EASTMAN INSTITUTE FOR ORAL HEALTH (continued)

GOAL 2: Ensure continuous integrity of Program of Study manuals and transcripts.

This goal was not achieved in the past academic year. Work to convert the EIOH from "home grown" transcripts to the University official transcript began November 2016 and will continue throughout 2017. Review of program manuals is planned for the future. In the meantime, a school-wide policy and procedure handbook is under development.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL:

Ongoing discussions with program and School leadership to determine content of the policy and procedure handbook.

Review of University policies and procedures to ensure inclusion in the handbook and to establish compliance requirements within the School.

ASSESSMENT STRATEGIES AND TARGETS:

A focus group will be assembled with current residents/trainees to fine-tune the format and content of the handbook.

The handbook will be presented to program and School leadership for review and approval.

Target for completion is June 2017.

GOAL 3: Design an assessment plan for two Master of Science in Dental Science programs approved by NYSED in fall 2016; plan for implementation of the assessment process and for the communication of results.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL:

Work with program leadership to develop an assessment plan that will inform decision making and demonstrate the effectiveness of the program.

Determine what types of data will best serve to inform and support change to the structure and content of the program.

ASSESSMENT STRATEGIES AND TARGETS:

Have the assessment plan reviewed by colleagues and assessment professionals at other universities. Implement changes as suggested.

Target for completion of work and implementation of a plan is fall 2017.


EASTMAN SCHOOL OF MUSIC

Assessment Report for Academic Year 2015-2016

Dr. John Hain Reporting

GOAL 1: Examine the music history core sequence and determine if changes are required

The Musicology department currently oversees the three-semester music history course sequence required of all undergraduate majors at Eastman. These courses, MHS 121, MHS 122, and MHS 123, are all taught in a large lecture format (~70 students per class) and are chronological surveys across the history of music from the year 800 through the present day. Over the past few years, faculty have seen a rise in the number of unexcused absences for their weekly lectures, a substantial decrease in the number of students attending the optional review sessions with TAs, a decrease in the overall quality of student work in the class, and mixed course evaluation feedback from the students.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL

Throughout the 2015-16 academic year, the Musicology faculty will be engaged in the process of reviewing their curriculum for the music history sequence. They will examine the course content as well as the delivery of the instruction to determine if any changes are required.

Members of the Musicology department will work closely with the Undergraduate Curriculum Committee and Office of Academic Affairs to make sure that any proposed revisions to the curriculum adhere to our learning outcomes for the Bachelor of Music program.

ASSESSMENT STRATEGIES AND TARGETS

The department should have their review completed by the end of the spring 2016 semester.

RESULTS, USE OF ASSESSMENT DATA & FOLLOW UP

Proposed revisions to the curriculum will be brought to the Undergraduate Curriculum Committee in fall 2016.


EASTMAN SCHOOL OF MUSIC (continued)

GOAL 2: Implement a systematic method for collecting jury evaluations and comments across the school.

Eastman performance faculty already provide incredibly rich, authentic feedback to students following their juries, and this goal is merely to put in place a system to collect this information for every student in every department. We began work on this goal in the 2014-15 academic year, and the change in culture is slowly beginning to work its way through the performance departments.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL

The Office of Academic Affairs will continue to work closely with department chairs and department secretaries to ensure that jury forms are being used correctly, and that scores and comments are being properly collected and archived for future analysis. Departments and instruments not yet using the new scoring rubrics will continue to receive feedback from the Office of Academic Affairs on why we continue to push for their adoption of the new system.

ASSESSMENT STRATEGIES AND TARGETS

While most departments are using the new forms, there are some notable holdouts. At this point, it is likely that we will wait for some key retirements in these areas to change the culture of the department.

Good examples of departmental rubrics will be highlighted at curriculum committee meetings.

Any changes made based on jury data will be shared across the school, but particularly with the reluctant departments.

RESULTS, USE OF ASSESSMENT DATA & FOLLOW UP

Data collected from the 2015-16 juries will continue to play a key role in the departmental assessment reports.


EASTMAN SCHOOL OF MUSIC (continued)

GOAL 3: A database will be created to replace the "orange card" data currently collected by the theory department.

The Theory department currently tracks undergraduate students in the five-semester core theory sequence using paper files that are maintained by the theory department administrative assistant. This system will be replaced by a FileMaker database that will simplify data entry for the teaching assistants and faculty, and will also allow for a more robust analysis of the data that is collected.
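As an illustration of the kind of structured per-student record that replaces the paper cards, a minimal sketch follows. The actual system is a FileMaker database; the field names and course data below are hypothetical, not the real schema:

```python
# Hypothetical sketch of a per-student theory-sequence record; the real
# system is FileMaker and its actual fields are not described in the report.
from dataclasses import dataclass, field

@dataclass
class TheoryRecord:
    student_id: str
    course: str                 # e.g. "TH 101"
    semester: str               # e.g. "Fall 2015"
    grade: str = ""             # final grade, blank until recorded
    ta_notes: list = field(default_factory=list)  # comments from TAs

# Entering a small test cohort, as in the fall 2015 rollout
records = [
    TheoryRecord("s001", "TH 101", "Fall 2015", grade="B+"),
    TheoryRecord("s002", "TH 161", "Fall 2015", grade="A-"),
]

# Once records are structured, aggregate queries become straightforward
th101 = [r for r in records if r.course == "TH 101"]
print(len(th101), "TH 101 records")
```

The point of the structured form is the last step: analyses that are impractical with paper files (filtering by course, semester, or grade) reduce to simple queries.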

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL

The new FileMaker database was rolled out in the fall 2015 semester, and a select group of students from a couple of TH 101 and TH 161 sections was entered as a test. After gathering feedback from the professors and TAs, changes were made to the database and another set of student information was uploaded in the spring 2016 semester.

As that went well, the plan is to collect information on all freshman and sophomore classes in the 2016-17 academic year, and then have all undergraduate theory courses in the database for the 2017-18 academic year and beyond.

ASSESSMENT STRATEGIES AND TARGETS

As the testing went well, the current plan is to collect information on all freshman and sophomore classes in the 2016-17 academic year and then have all undergraduate theory courses in the database for the 2017-18 academic year and beyond.

RESULTS, USE OF ASSESSMENT DATA & FOLLOW UP

The data collected by this system will be analyzed by the theory department faculty to help measure the effectiveness of the core music theory curriculum.


HAJIM SCHOOL OF ENGINEERING AND APPLIED SCIENCES

Assessment Report for Academic Year 2015-2016

Dr. James M. Zavislan Reporting

I. Overview

The degree programs in the Hajim School of Engineering and Applied Sciences have well-established procedures and assessment plans for monitoring and improving their educational objectives and outcomes. The Hajim School of Engineering and Applied Sciences (Hajim School) currently consists of ten degree programs: Audio and Music Engineering, Biomedical Engineering, Chemical Engineering, Computer Science, Electrical and Computer Engineering, Mechanical Engineering, Optics and Optical Engineering, Interdepartmental Engineering, and Bachelor of Arts in Engineering Science.

Five of the BS degree programs (Biomedical Engineering, Chemical Engineering, Electrical and Computer Engineering, Mechanical Engineering, and Optical Engineering) are currently accredited by the Engineering Accreditation Commission of ABET, http://abet.org. These BS programs were reviewed by the Engineering Accreditation Commission in the fall of 2015. In August 2016 we were notified that these five degree programs are accredited to 30 September 2022. The Audio and Music Engineering BS degree program was not reviewed, but intends to undergo review by the Engineering Accreditation Commission of ABET at the next scheduled review.

All of the Hajim programs have active assessment plans that are published on the University of Rochester Arts, Science and Engineering Assessment web page: https://www.rochester.edu/college/assessment/plans/index.html. The department assessment plans identify the program objectives and also identify methods for evaluation and areas for improvement. The outcomes of the programs' assessments are reviewed annually by the Associate Dean for Education in the Hajim office.

II. Reported Assessment Activities for the Academic Year 2015-2016

ABET Accreditation

The BS degree programs in Biomedical Engineering, Chemical Engineering, Electrical and Computer Engineering, Mechanical Engineering, and Optical Engineering received Draft Statements from the Engineering Accreditation Commission of ABET based on the fall 2015 site visit and program Self-Study review. The Draft Statement highlighted institutional and department strengths as well as program Concerns [1] and Weaknesses [2]. Below is a summary of the Draft Statement along with the corrective actions presented in our Response to the Draft Statement as submitted 14 April 2016.

[1] From the PAF: "A concern indicates that a program currently satisfies a criterion, policy, or procedure; however, the potential exists for the situation to change such that the criterion, policy, or procedure may not be satisfied."

[2] From the PAF: "A weakness indicates that a program lacks strength of compliance with a criterion, policy, or procedure to ensure that the quality of the program will not be compromised. Therefore, remedial action is required to strengthen compliance with the criterion, policy, or procedure prior to the next review."


HAJIM SCHOOL OF ENGINEERING AND APPLIED SCIENCES (continued)

Summary of ABET Draft Statement and Corrective Action

1. Concerns were identified for the BS degree programs in CHE regarding automatic enforcement of prerequisite requirements. This was identified as a Weakness for the BS degree program in ME. Corrective Action: During the fall of 2016 the Hajim School departments reviewed prerequisite requirements for all required classes in each major. These were re-written so that pre-requisite requirements for each required class could be tested against registration information from the student information system (ISIS). After the registration period for the 2016 Spring semester, each department received pre-requisite compliance reports that alerted undergraduate coordinators and individual faculty of students lacking pre-requisites. We will continue this automated reporting until the new student information system is operational. The new system is designed to automatically enforce pre-requisites during registration and is anticipated to be on-line in 2019.
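The compliance-report step described above amounts to cross-checking each student's registrations against the prerequisite rules for the courses they enrolled in. A minimal sketch follows; the record layout, course codes, and prerequisite rules are hypothetical, not the actual ISIS schema or Hajim requirements:

```python
# Hypothetical sketch of a prerequisite compliance check run after
# registration; course codes and prerequisite rules below are invented.

# Prerequisite rules: course -> set of required prior courses
PREREQS = {
    "ME 226": {"ME 121", "MTH 163"},
    "CHE 243": {"CHE 116"},
}

def compliance_report(registrations, completed):
    """Return {student: {course: missing prereqs}} for flagged registrations.

    registrations: {student_id: [courses registered for next term]}
    completed:     {student_id: {courses already completed}}
    """
    report = {}
    for student, courses in registrations.items():
        done = completed.get(student, set())
        for course in courses:
            missing = PREREQS.get(course, set()) - done
            if missing:
                report.setdefault(student, {})[course] = missing
    return report

# Example: s001 registered for ME 226 but has not completed MTH 163
registrations = {"s001": ["ME 226"], "s002": ["CHE 243"]}
completed = {"s001": {"ME 121"}, "s002": {"CHE 116"}}
print(compliance_report(registrations, completed))
# s001 is flagged as missing MTH 163; s002 is compliant
```

The resulting report is what would be forwarded to undergraduate coordinators and faculty, until an enforcing student information system makes the check unnecessary at registration time.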

2. A Concern was identified for the BS degree program in ECE regarding amount of discrete mathematics in its curriculum. Corrective Action: The ECE department added a required course in discrete mathematics: ECE 270: “Discrete Mathematics and Probability”.

3. Weaknesses were identified for BS degree programs in BME, CHE, ECE and ME regarding accreditation attribution and reporting of program enrollment and graduation data on the department websites. Corrective Action: While enrollment and graduation data was provided on the Hajim website, it was not referenced on the department websites. Links to the Hajim information were added to the department websites and all references to our ABET accredited programs now include: “BS Degree program is accredited by the Engineering Accreditation Commission of ABET, http://abet.org.”

4. A Weakness was identified for the BS degree program in CHE regarding ABET Criterion 4, Continuous Improvement: the assessment and evaluation processes for attainment of student outcomes. Corrective Action: The chemical engineering department recognized this concern prior to the site visit and augmented its processes to address this weakness. The revised processes include integrating data from a) surveys of CHE sophomores and seniors, b) feedback from local industry leaders serving on the CHE visiting committee, c) alumni surveys, and d) course reviews. This integrated assessment information is evaluated by the CHE faculty during an annual retreat. From this evaluation, department faculty identify areas of improvement and strengths. Department faculty then decide on changes required for continuous improvement. These plans are then reviewed as needed throughout the year and then again at the next annual retreat.

5. A Weakness was identified for the BS degree program in Optical Engineering regarding Continuous Improvement: the assessment and evaluation processes for attainment of student outcomes. Corrective Action: Being a new degree program, the OPE degree had only assessed its outcomes once, in the 2014-2015 academic year. Furthermore, this assessment had been based on a set of unique outcomes developed by The Institute of Optics and mapped onto the ABET a-k outcomes. Based on feedback from the ABET Program Evaluator and the ABET Draft Statement, the faculty of The Institute of Optics adopted the ABET a-k outcomes for the BS OPE degree and instituted procedures to systematically assess each of these twice in a six-year review cycle.


HAJIM SCHOOL OF ENGINEERING AND APPLIED SCIENCES (continued)

6. Weaknesses were identified for the BS degree programs in ME and OPE regarding Program Educational Objectives, which were found to be written as student outcomes. Corrective Action: The ME degree program reduced the number of Program Educational Objectives to three and related them to what the graduates are expected to attain within a few years of graduation. These objectives were approved by the faculty as well as the Industrial Review Board and are posted on the program's web page. The OPE degree program developed five revised Program Educational Objectives that were reviewed and approved by both the faculty and the External Review Board. These are reviewed annually by the faculty and the External Review Board and updated if necessary.

The Engineering Accreditation Commission reviewed our Responses to the Draft Statement and concluded the corrective actions resolved all Concerns and Weaknesses. The degree programs in BME, CHE, ECE, ME, and OPE are accredited to 30 September 2022. The OPE degree program was awarded retroactive accreditation from 1 October 2014.

Highlights of Areas of Special Emphasis

As reported previously, we have focused on introductory physics and the math preparation of our students. This activity continued during the 2015-2016 academic year. We continue to work closely with the College of Arts, Sciences and Engineering, the Math department, and the Physics department to:

1) Ensure our incoming students are placed in the appropriate math sequences,

2) Ensure students placed in the MTH 141, 142, 143 sequence have the ability to complete this track prior to the fall semester of their sophomore year,

3) Deploy introductory physics classes that are appropriate for students in each of our two math sequences.

Math Placement

For the second year, revised math placement recommendations were used for all incoming students prior to orientation. Students and their advisors were instructed that these placements would be strictly enforced, but students placed into either MTH 140 or MTH 141 would have the option of taking a placement test at the end of orientation, which, if passed, would enable them to enroll in MTH 161. Below we compare the progress of freshman students enrolled in MTH 141 and MTH 161 into their next math class before and after we revised the math placement.

[Figure: Students Unable to Progress to MTH 142; MTH 141 cohorts for Fall 2014 (w/o placement), Fall 2015 (w/ placement), and Fall 2016 (w/ placement); y-axis 0% to 25%]


HAJIM SCHOOL OF ENGINEERING AND APPLIED SCIENCES (continued)

The two graphs show the percentage of students that either withdrew or earned a grade below C- in either MTH 141 or MTH 161. A higher percentage of students were able to progress to both MTH 142 and MTH 162 after the placement criteria were modified. The decrease in the percentage of students unable to progress from MTH 161 to MTH 162 (2014: 18%, 2015: 10%, 2016: 10%) was expected, since students with limited math preparation were shifted away from MTH 161 to MTH 141. However, we also noted a decrease in the percentage of students unable to progress to MTH 142 (2014: 20%, 2015: 12%, 2016: 16%).

The next two graphs track the percentage of students that earned a grade of “B” or above in MTH 141 and MTH 161. As mentioned above, it was expected the new math placement would boost the percentage of students able to progress from MTH 161 to MTH 162 with a grade of C- or above. We noted the percentage of students earning a grade of B or higher in MTH 161 also increased under the revised placement rules (2014: 43%, 2015: 62%, 2016: 73%). However, the percentage of students earning a grade of B or higher in MTH 141 remained essentially constant (2014: 50%, 2015: 51%, 2016: 55%).
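The percentages in these comparisons follow from a simple rate computation: the share of a course's enrollment at or above a grade cutoff (C- or better to progress, B or better for the second pair of graphs), with withdrawals counting as unable to progress. A minimal sketch with invented grade data, not the actual MTH 141/161 records:

```python
# Minimal sketch of the progression-rate arithmetic; the grade list below
# is invented for illustration, not actual registrar data.

GRADE_POINTS = {"A": 4.0, "B+": 3.3, "B": 3.0, "C+": 2.3, "C": 2.0,
                "C-": 1.7, "D": 1.0, "E": 0.0, "W": None}  # W = withdrew

def rate_at_or_above(grades, cutoff):
    """Fraction of students at or above a grade cutoff (W counts as below)."""
    cut = GRADE_POINTS[cutoff]
    ok = sum(1 for g in grades
             if GRADE_POINTS[g] is not None and GRADE_POINTS[g] >= cut)
    return ok / len(grades)

grades = ["A", "B", "B", "C", "C-", "D", "W", "B+"]
print(f"able to progress (C- or above): {rate_at_or_above(grades, 'C-'):.0%}")
print(f"B or higher: {rate_at_or_above(grades, 'B'):.0%}")
```

Note that the "unable to progress" figures in the graphs are the complement of the first rate (withdrawals plus grades below C-).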

[Figure: Students Unable to Progress to MTH 162; MTH 161 cohorts for Fall 2014 (w/o placement), Fall 2015 (w/ placement), and Fall 2016 (w/ placement); y-axis 0% to 25%]

[Figure: MTH 141 Grades B or Higher; Fall 2014 (w/o placement), Fall 2015 (w/ placement), and Fall 2016 (w/ placement); y-axis 0% to 80%]


HAJIM SCHOOL OF ENGINEERING AND APPLIED SCIENCES (continued)

Support for Students Placed in the MTH 141-143 Sequence

During the 2015-2016 academic year, all students placed into MTH 141 in the fall of their freshman year who expressed interest in majors that require PHY 122 were given the option to complete MTH 143 tuition free during the first summer session. Students who complete MTH 143 or the equivalent prior to the fall of their sophomore year are able to follow standard curriculum sequences in the remainder of their math and introductory physics classes. Summer 2016 enrollment in tuition-free MTH 143 was 33 out of 42 eligible students, and 88% of the students earned a "C" or higher. Tuition-free MTH 143 will be offered again in the summer of 2017.

Introductory Physics Classes

As reported last year, we worked with the Physics department to identify an alternative introductory mechanics class that is consistent with the syllabus and course progress in MTH 142. PHY 113 was found to be appropriate for the MTH 142 students, enabling them to progress to PHY 122/122P while also meeting the BME department's requirement that the introductory physics sequence be calculus based. Spring 2016 was the first semester in which students going into MTH 142 were required to take PHY 113 instead of PHY 121/121P. Below is a comparison of Hajim student progress to PHY 122 for PHY 121/121P in spring 2015 and 2016, as well as for PHY 113 in spring 2016 (the first year that PHY 113 could be used as a prerequisite for PHY 122). On average, 86% of PHY 121 students and 85% of PHY 121P students were able to progress to PHY 122. This is significantly higher than the 58% of PHY 113 students able to progress to PHY 122. This difference will be investigated in 2017.

[Figure: MTH 161 Grades B or Higher. Bars for Fall 2014 (MTH 161 w/o placement), Fall 2015 (MTH 161 w/placement), and Fall 2016 (MTH 161 w/placement); vertical axis 0.0%-80.0%.]


HAJIM SCHOOL OF ENGINEERING AND APPLIED SCIENCES (continued)

[Figure: Cohort Eligible to Progress to PHY 122. Bars for PHY 113 (spring 2016), PHY 121 (spring 2015 and spring 2016), and PHY 121P (spring 2015 and spring 2016); vertical axis 0.0%-100.0%.]


SCHOOL OF MEDICINE AND DENTISTRY GRADUATE EDUCATION (MA, MPH, MS, PHD PROGRAMS)

Assessment Report for Academic Year 2015-2016

Dr. Edith Lord Reporting

The graduate programs in the School of Medicine and Dentistry are committed to the continual improvement of assessment practices. This report highlights major assessment activities for the academic year 2015-16.

Goal 1:

To address deficiencies in graduate student writing skills identified through qualifying exam outcomes.

To assist graduate students in learning how to improve fundamental writing skills to support scientific writing and publication.

Activities Supporting Achievement of Goal:

A Life Sciences Writing Specialist was hired in 2015 to offer writing seminars, co-teach a scientific communications course, and provide individual support to graduate students.

A 4-hour "Preparing for the Qualifying Exam Writing Workshop" was offered twice in summer 2015 and again in the summer of 2016. The majority of students required to qualify for candidacy have attended these workshops.

A 2-hour “From Successful Qualifying Exams to NIH F31 Grant Submissions” writing workshop was offered in fall 2015 and fall 2016. Fifteen students attended in 2015 and 13 in 2016.

A communications course originating in the Department of Microbiology & Immunology was piloted in the fall of 2015, approved by the Committee on Graduate Studies in May 2016, and presented again in the fall of 2016.

Assessment Strategies and Targets:

Review student evaluations for each of the workshops.

Review of qualifying exam outcomes.

Target: Results of qualifying exams held between June and December 2015 will be used as a baseline for comparison to subsequent years.

Qualifying exam outcomes will be examined to determine whether a change occurs in the number of students who pass pending modifications to the thesis proposal. We will attempt to correlate workshop participation with qualifying examination outcomes. Data will be shared with the Committee on Graduate Studies and the Life Sciences Learning Specialist.


SCHOOL OF MEDICINE AND DENTISTRY GRADUATE EDUCATION (continued)

Goal 2:

To provide graduate students with professional support for career and internship planning, programming to introduce non-academic career opportunities, resume/CV writing services and coaching for interviewing and salary negotiations.

To increase graduate student satisfaction with career guidance received and career outcomes.

This is an ongoing initiative, to which the School has committed considerable resources.

Activities Supporting Achievement of Goal:

UR Best, an NIH-funded training program, prepares graduate students for careers outside of academia. The program fosters new opportunities for experiential learning through internships and shadowing.

Workshops in leadership, professionalism, transferable and supplemental skills development, and career exploration and planning.

Workshops featuring speakers in specific non-academic disciplines (e.g., industry, management, entrepreneurship, regulatory affairs, and science technology policy).

A new course, "Leadership and Management for Scientists", was developed.

A new website, web resources, and a book-lending library for career exploration and personal growth.

A new weekly "Opportunities to Explore" newsletter for efficient communication of professional development opportunities at SMD and UR, as well as locally, nationally, and internationally.

A Life Sciences Career Coach was hired in October 2015; he accepted a position in Washington, DC in November 2016. A new Career Coach with extensive experience has been recruited and will begin work in January 2017.

Membership in the CIRTL (Center for the Integration of Research, Teaching and Learning) network, a learning community of 43 universities across North America.

Subscriptions to the career planning websites Bio Careers and the Versatile PhD.

Assessment Strategies and Targets:

Annual surveys required by the NIH for the UR Best program.

Student evaluations for each workshop and new course.

Satisfaction survey feedback and testimonials from students who have worked one-on-one with the Career Coach.

Collect first-job-out data from exit surveys.

Target: to offer a wide range of career services and opportunities to meet the needs of a diverse student body.

The development of career services for graduate students is ongoing. Data will be collected to determine impact over time.


SCHOOL OF MEDICINE AND DENTISTRY GRADUATE EDUCATION (continued)

Goal 3:

To reassess our core curriculum (basic interdepartmental courses) for the basic science students.

To devise new courses with a new format, additional content (particularly in bioinformatics and genomics), and a component of active peer or near-peer learning.

To design a new hands-on way to teach biostatistics to all of the basic science students.

Activities Supporting Achievement of Goal:

Evaluations of the current courses were carefully examined and deficiencies noted.

A curriculum committee was formed, with extensive discussion regarding the format of the course. It was determined that a two-semester, modular format (5 modules per semester) would be preferred.

A second curriculum committee was formed in fall 2016 and tasked with fleshing out the course content and format.

Discussions are currently under way to determine how these substantial changes will best be accomplished.

Assessment Strategies and Targets:

The proposed courses will be presented to the leaders of the School and to faculty in each of the involved departments.

Focus groups with current students will be formed to fine-tune the format and content.

Our target date is fall 2017 to initiate these new courses.


SCHOOL OF MEDICINE AND DENTISTRY – MEDICAL EDUCATION

Assessment Report for Academic Year 2015-2016

Dr. Christopher Mooney Reporting

Goal 1: To improve the learning environment within the Obstetrics and Gynecology (OB/Gyn) Clerkship in the third year

Activity: In 2015-2016, implemented OB/Gyn chair rounds; held focus groups with OB/Gyn residents; scheduled regular discussions with faculty and residents at affiliate hospitals regarding clerkship learning objectives, student expectations, and policies; and continued residents-as-teachers educational sessions. In 2016-2017, meetings were held between OB/Gyn administration and medical school leadership.

Assessment and Target: Revised clerkship evaluation forms to include questions that assess the clerkship learning environment. The Office of Curriculum and Assessment (OCA) is monitoring relevant data in clerkship evaluation forms, institutional exit surveys, and the AAMC Graduation Questionnaire (GQ) to indirectly measure the learning environment.

Relevant data in all 2016-2017 clerkship evaluations; third-year exit survey; 2018 AAMC GQ

Results/Evidence of Effectiveness: In process. Data from 2015-2016 confirmed improvements reported by students. The school's efforts as outlined appear to have been successful not only in sustaining but also in advancing improvement, as noted in 2016-2017 data to date. Systems remain in place for the school to monitor the learning environment in all clerkships as part of our continuous quality improvement processes.

Planned Use of the Evidence and Next Steps: Information collected was shared with: the LCME for monitoring purposes (Nov. 2016); OB/Gyn Department Chair and Clerkship Director; Year 3 and 4 Instruction Committee; Curriculum Steering Committee.


SCHOOL OF MEDICINE AND DENTISTRY – MEDICAL EDUCATION (continued)

Goal 2: To increase direct observation of students’ history taking and physical/mental status examination in the core clerkships

Activity: Monitor students' self-reporting of direct observation and provide feedback to Clerkship Directors and the Senior Associate Dean for Medical Education

Assessment and Target: OCA will review relevant questions in end-of-clerkship evaluations on a biannual basis, along with relevant AAMC surveys

Relevant data in all 2016-2017 clerkship evaluations; third-year exit survey; 2018 AAMC GQ

Results/Evidence of Effectiveness: In process. In response to the LCME survey visit, the departments of Surgery, Emergency Medicine, and OB/Gyn have implemented efforts to enhance direct observation of students' physical/mental status examinations.

Although the medical school expected it would take two years for evidence of the school's efforts to observe students taking a history and performing a physical examination to appear on the AAMC GQ, we were surprised to see low rates of student-reported observation in some clerkships on the recent survey conducted at the end of the third year. The school will continue to monitor these data centrally and at the clerkship level. Bolstered efforts to educate students and faculty are ongoing so that results are consistent between surveys and across clerkships.

Planned Use of the Evidence and Next Steps: Information collected was shared with: the LCME for monitoring purposes (Nov. 2016); Clerkship Directors; Senior Associate Dean for Medical Education; Year 3 and 4 Instruction Committee; Curriculum Steering Committee. Additional efforts by the school have included altering end-of-clerkship evaluation questions to clarify student expectations and requiring students to log a "direct observation" in our online system (medsis), which records required clerkship experiences.


SCHOOL OF MEDICINE AND DENTISTRY – MEDICAL EDUCATION (continued)

Goal 3: Monitor URSMD’s transition to MedHub as a means to document students’ required clinical experiences

Activity: Systematically monitored and audited students’ completion of required clinical experiences

Assessment and Target: OCA audited students' documentation of required clinical experiences in MedHub on a biannual basis, expecting that all core clerkships will be compliant: 100% of students document completion of required clinical experiences.

Results/Evidence of Effectiveness: The implementation of MedHub was well planned and smooth, timed to begin with a new academic year. Because our medical center is owned by the University of Rochester, MedHub implementation was planned with the Graduate Medical Education Office to create a single evaluative/monitoring system for all students, residents, fellows, and evaluators in the medical center. We are pleased with the ease of use and the reporting that MedHub offers. Student encounters continue to be monitored both at the clerkship level and centrally by the school, with reports reviewed by the appropriate committees as was done with the previous system. This process is functioning well and is very effective.

Planned Use of the Evidence and Next Steps: Information collected was shared with: the LCME for monitoring purposes (Nov. 2016); Clerkship Directors; Senior Associate Dean for Medical Education; Year 3 and 4 Instruction Committee; Curriculum Steering Committee. There are no further steps.

Goal 4: Improve overall quality and perception of students’ knowledge of health systems content

Activity: Transition to new theme director leadership in the 2015-2016 academic year; review of content in MMI and CHIC courses

Assessment and Target: OCA will review relevant questions in course evaluations, institutional exit surveys, and the AAMC GQ to measure quality and student perception of knowledge.

Relevant data in all 2015-2016 course evaluations; years 2, 3, and 4 exit surveys; 2016 AAMC GQ

Results/Evidence of Effectiveness: In process. The school underwent a successful transition of theme leadership, with a focus on learning where relevant theme content is taught and identifying potential gaps. Student feedback from the MMI course evaluation reveals that delivery of content was well received. Data from exit surveys and the AAMC GQ reveal no substantial shifts in overall ratings.

Planned Use of the Evidence and Next Steps: Information collected will be shared with: Health Systems Theme Co-Directors; Senior Associate Dean for Medical Education; Instruction Committees; Curriculum Steering Committee


SCHOOL OF MEDICINE AND DENTISTRY – MEDICAL EDUCATION (continued)

Goal 5: Improve overall quality of pharmacology course and perception of students’ knowledge of pharmacology content

Activity: Move pharmacology course to end of first year; remove PBL; incorporate simulator experience

Assessment and Target: OCA will review relevant questions in the Pharmacology course evaluation, institutional exit surveys, the AAMC GQ, and student Step 1 scores.

Relevant data in the 2015-2016 course evaluation; year 2 exit survey; 2017 AAMC GQ

Results/Evidence of Effectiveness: In process. Review of the course evaluation and annual exit surveys has shown that student perception of the overall course quality has improved and is trending upward. Students still express a desire for greater integration of content into other courses to supplement the relatively brief course.

NBME board data (Step 1 scores) indicate that, relative to national norms, URSMD students score above the mean on pharmacology-related content.

Planned Use of the Evidence and Next Steps: Information collected will be shared with: Pharmacology Course Director; Pharmacology Department Chair; Senior Associate Dean for Medical Education; First and Second Year Instruction Committee; Curriculum Steering Committee.

The pharmacology course has recently transitioned in leadership. The new course director is eager to improve the course and to see a continued upward trend in its rating and in student perception of pharmacologic knowledge.

Goal 6: Explore the possibility of developing a discipline-specific quality and safety activity in the fourth year curriculum

Activity: Convene a task force to discuss opportunities, structure, timing, etc.

Assessment and Target: Task force recommendations. End of 2016-2017 academic year

Results/Evidence of Effectiveness: The task force has yet to be convened; it will likely be formed as part of the formal review of the curriculum scheduled for 2017.

Planned Use of the Evidence and Next Steps: Information collected will be shared with: Year 3 and 4 Instruction Committee; Curriculum Steering Committee


SCHOOL OF MEDICINE AND DENTISTRY – MEDICAL EDUCATION (continued)

Goal 7: Monitor the overall quality and student perception of the revised Stress, Adaptations, and Transitions (SAT) Course in the second year

Activity: The SAT course was revised in AY 2016-2017, with changes in content and the incorporation of activities that aim to bridge students' transition to the third year. The course has been renamed "Adaptations and Clinical Transitions."

Assessment and Target: OCA will review 2016-2017 end of course evaluations. Potential focus groups

Results/Evidence of Effectiveness: In process.

Planned Use of the Evidence and Next Steps: Information collected will be shared with Course leadership and design team; Year 1 and 2 Instruction Committee; and the Curriculum Steering Committee

Goal 8: Monitor the overall quality and student perception of the revised Successful Interning (SI) course at the end of year 4

Activity: An additional week was added to the SI course in AY 2016-2017 to help students transition to internship and meet specialty-specific competencies.

Assessment and Target: OCA will review 2016-2017 end of course evaluations; 2017 AAMC GQ

Results/Evidence of Effectiveness: In process.

Planned Use of the Evidence and Next Steps: Information collected will be shared with Course leadership and design team; Year 3 and 4 Instruction Committee; and the Curriculum Steering Committee


SCHOOL OF NURSING

Assessment Report Academic Year 2015-2016

Dr. Bethel Powers Reporting

In the School of Nursing (SON), program and course goals are aligned with the accreditation standards related to each program area, operating within the context of the University's mission, goals, and vision. Assessment activities occur within the following organizational context: program assessment is conducted collaboratively by faculty (the Faculty Organization), the Curriculum Committee, the appropriate program subcommittee (baccalaureate, master's, DNP, PhD), and the Senior Academic Leadership Team (SALT).

School-Level Assessment: Curriculum Committee Initiative

Goal 1: Improve Faculty Course Assessment Process

Objectives in Support of Goal 1:

1. Assure individual course alignment with program goals and national discipline-specific professional standards

2. Assure that course content and course assignments support student learning outcomes

3. Implement a user-friendly process to systematically assess student learning at the course level as well as to identify changes to infrastructure, course content, and teaching practices

4. Trend outcomes that result in improved student learning

Activities in Support of Goal 1:

Development of a course-based assessment tool to replace the open-ended faculty end-of-course narrative, which (a) aligns Program Objectives…Student Learning Outcomes…Embedded Course Assessments (teaching and assessment methods) and (b) enables tracking of strategies and changes to improve student learning. [The previous open-ended reporting involved a blank form accompanied by the following instructions: "The faculty course summary includes a personal evaluation of course operations, including context relevant to course evaluations by students and recommendations for operational changes in future semesters." The new course assessment tool includes preset fields and open fields for ongoing and systematic faculty reports.]

Preset fields identify student learning outcomes and the program objectives to which they are mapped

Faculty are directed to fields for reporting specific information related to: teaching methods and activities used to meet the desired student learning outcomes; methods used to assess student learning; and results, recommended changes, and follow-up plan


SCHOOL OF NURSING (continued)

Assessment Strategies and Targets for Goal 1:

The new course assessment approach was designed during the spring 2016 semester by the SON Curriculum Committee and approved by program subcommittees and SON Faculty in October 2016.

Implementation of the new tool is planned for the end of fall semester 2016.

Per usual, faculty course assessments will be collected by and stored electronically in the Office of Evaluation and used for ongoing improvement at both the course and program level.

Results, Use of Assessment Data & Next Steps for Goal 1:

Faculty course assessments will be used by program leadership and program faculty (a) to evaluate achievement of key program goals and (b) to evaluate student learning outcomes at the course level in order to ensure alignment across program curricula.

Faculty course assessments are also used by course faculty, at the individual and program level, to identify best teaching practices and guide needed changes in future iterations of the course.

The following report represents assessment plans and activities of SON Program Subcommittees, the functions of which include ongoing review and evaluation of the overall management and curriculum for the program.

Subcommittee for Undergraduate Programs

Goal 2: Follow-up on goal to improve NCLEX (RN licensure exam) outcomes

Activities in Support of Goal 2:

Activity 1) A comprehensive assessment of the current curriculum, including content mapping to the detailed NCLEX test plan, leveling of content, and adding content found to be lacking based on the content mapping

Activity 2) Partnering with ATI nursing education (a private company that supports pre-licensure programs) to provide student education support and assessments that directly align with the NCLEX test plan. This will expose our students to nationally normed, computer-based testing in preparation for the NCLEX exam. APNN courses have now integrated practice ATI tests and summary end-of-course ATI examinations. The anatomy and physiology ATI test will be administered in the first week of the program beginning January 2017 to better assess student knowledge of this critically important foundational material.

Assessment Strategies and Targets for Goal 2:

1) Careful monitoring and oversight of the APNN curriculum, including review of all faculty-developed exams in order to assess alignment with content mapping and the NCLEX test plan


SCHOOL OF NURSING (continued)

2) Review of student progress and areas of weakness on the nationally normed ATI exams, with strong academic support for students in identified areas of weakness. Our goal is for all students to achieve at least 70% on the nationally normed exams (which is correlated with a >95% pass rate on the NCLEX exam). The latest overall NCLEX pass rate was >90% for the UR SON.

Results, Use of Assessment Data & Next Steps for Goal 2:

Information from these assessment activities will be shared with the APNN faculty and will shape future changes within the curriculum in relation to the APNN NCLEX pass rate. This is the first step in an overall examination of the APNN program.

Subcommittee for Master of Science (MS) Programs

Goal 3: Follow-up on goal to perform a formative assessment of a new Master of Science in Nursing Education program

Activities in Support of Goal 3:

A Master of Science in Nursing Education (MNE) Program was developed in 2014 and 2015 and, following NYSED approval, admitted its first class of 22 students in September 2015. The MNE specialty director, graduate program director, and course faculty initiated a new program assessment plan, beginning at the course level. Following the conclusion of the initial fall 2015 semester, and based on student feedback, a change in methodology was implemented: student course evaluation data were collected not by survey, as originally intended, but in face-to-face sessions or focus groups conducted by course faculty. This assessment meets CCNE Standard III (Program Quality; Teaching and Learning Centered Practices), Element III G (curriculum and teaching-learning practices are evaluated regularly at scheduled intervals to foster ongoing improvement).

The new MNE Program Assessment Plan data are being collected and analyzed at the conclusion of every semester and will support the Annual Program Assessment plan to be implemented following the graduation of the first MNE students. New program assessment goals will be to systematically assess student and faculty input on four key areas: (1) individual course design, (2) overall program design, (3) the alignment of course curricular goals, as well as (4) achievement of individual student learning outcomes per course.

Assessment Strategies and Targets for Goal 3:

As a formative assessment for the new MNE program, focus groups of students taking all MNE courses were conducted at the end of the fall 2015 and spring 2016 semesters. Students were asked for structured feedback on individual course student learning outcomes (SLOs), teaching-learning practices, and program and course infrastructure. Course faculty, along with the MNE Specialty Director, served as internal focus group facilitators. While no formal template was used for the focus groups, questions were identified to evaluate (1) course infrastructure, content, and delivery; (2) teaching-learning practices; and (3) all curricular and co-curricular student feedback to inform decisions on any needed course revisions and the achievement of individual student learning outcomes.


SCHOOL OF NURSING (continued)

Student feedback via the SON Course Evaluation (CE) process will be collected, trended, and reviewed to triangulate assessment data to inform changes to support student learning at the course level.

Results, Use of Assessment Data & Next Steps for Goal 3:

Results of the new program course assessment process are on file in the MS program office and are available for accreditation review.

Information from this assessment was analyzed and discussed with MNE faculty at monthly MNE faculty meetings as well as at the annual spring faculty retreat. Changes made based on this formative assessment were shared with course faculty and with MNE students at "town hall"-style meetings held at the conclusion of every semester.

“Assessment of student learning outcomes” is a standing agenda item for monthly MNE Faculty Meetings;

MNE faculty were heavily involved in the development of the new course-level assessment tool and process development.

Student and faculty feedback as well as quantitative data from course grades, course evaluation, and individual student assignment grades are trended by semester and reviewed at the completion of each semester.

Any changes in program or course infrastructure will be informed by qualitative data obtained from students and course faculty as well as from the quantitative CE data and new course-level assessment tool.

Subcommittee for the DNP Program

Goal 4: Follow-up on goal to evaluate outcomes of 2014-2015 DNP curriculum revisions

Activities in Support of Goal 4:

During the 2015-16 academic year, the approved curriculum revisions were implemented. Progress on related activities includes:

o All new degree plans for incoming students align with the desired course sequencing, based on a logical progression of content and placement of the foundational coursework needed to design and develop the DNP scholarly project, the summative experience for the practice doctorate.

o The revised timeline for completion of program milestones included selection of each student's DNP committee chair in DNP Practicum I (vs. DNP Practicum II) and completion of the DNP Project Proposal Defense one month earlier (September 15 vs. October 15).

o Identifying areas of curriculum enhancement: Leadership development – The addition of the leadership course beginning in summer 2015 has provided students with an opportunity to enhance their own leadership development.


SCHOOL OF NURSING (continued)

Population health/data analytics – An additional credit was added to an existing informatics course in the DNP curriculum in response to the intensifying need for data-driven clinical decision making; the change was implemented in summer 2016. A DNP-prepared informaticist, who oversaw content development, was added to the course faculty.

A graduate-level elective course in the student's identified area of interest – There was limited opportunity to take an additional non-required course due to course availability and student scheduling.

o Creation of DNP Project Proposal Defense and Final Defense evaluation rubrics – A final defense evaluation rubric was developed in accordance with the written DNP scholarly project paper/presentation guidelines. The existing proposal defense rubric was revised to align with the final defense rubric as well as the scholarly paper guidelines.

o Development of a facilitated IRB review process for DNP projects – Working with IRB representatives and the SON, a standardized IRB review process for DNP projects was developed and implemented in summer 2016. This process was developed to reduce the review time for DNP projects in order to better align with a semester-driven program.

Assessment Strategies and Targets for Goal 4:

Amended program milestone timelines – Student and faculty feedback from two cohorts, comparing the existing and revised timelines for selection of the DNP committee chair and project proposal defense dates, was reviewed.

Leadership development – Course feedback from students and faculty has been positive, and ongoing leadership development has been added to each DNP practicum course: students add a leadership development objective to their individualized semester learning objectives in consultation with course faculty and their DNP committee chairs. Associated leadership development activities are reviewed at the end of each semester by the DNP committee chair.

Population health/data analytics – Faculty/student feedback will be used to evaluate early experiences.

Elective – Student/faculty feedback will be used to evaluate current course offerings.

Project rubrics – Participant feedback will be collected during initial implementation beginning in spring 2016.

Facilitated IRB review process – IRB application processing time (from time of submission to approval) will be tracked beginning in summer 2015.

Results, Use of Assessment Data & Next Steps for Goal 4:

Project milestones – These changes gave students additional support with project development as well as additional time to complete project implementation/evaluation, the cumulative scholarly paper, and manuscript preparation during their final year of study. No further changes will be made to this timeline at this time.


Leadership development – To date, nine DNP students have taken the leadership course and have provided positive feedback. Working with leadership course faculty, the DNP PD has been actively involved in identifying potential leadership mentors for the field experience component of the leadership course. This has strengthened communication between the leadership and DNP programs and will be useful, along with ongoing faculty/student feedback, in identifying additional leadership development opportunities in DNP practicum courses.

Population health – Only one student completed the course, so feedback was limited but positive. This will remain an area of ongoing curricular development in response to the rapidly evolving healthcare environment. Faculty/student feedback from the population health/data analytics immersion experience will be shared with DNP program faculty/students, colleagues, and practice partners to inform students' current and future scholarly work and opportunities for expanded learning experiences.

Electives – This curricular change is in the early stages of implementation. With limited experience, additional time will be needed to see whether this curricular offering provides additional meaningful educational experiences for DNP students. Ongoing efforts will be undertaken to identify potential elective courses in each student's area of interest.

Rubrics – DNP committee feedback has been positive, and additional modifications have been made pending the trial experience in spring 2016.

Facilitated IRB review process – The initial experience in summer 2015 was positive, with project approval occurring within one to two weeks of submission. The internal facilitation process within the SON, with an identified IRB faculty liaison who reviews all DNP projects before IRB submission, has been very helpful. Some delays with student project review in the summer of 2016 provided the opportunity for additional refinement of the process and additional communication with the IRB immediately prior to DNP student submission. In addition, reminders to DNP committee chairs about inclusion of the DNP checklist, as well as inclusion of a detailed review process in the DNP handbook, should reinforce the new process.

Subcommittee for the PhD Program

Goal 5: Follow-up on goal to evaluate effectiveness of the PhD Qualifying Examination

Results of Activities Completed:

Student resources were reviewed: Student response to the survey on the topic was largely focused on the best ways to prepare for the exam. Resources in place were reviewed (e.g., documented study tips, a collection of study materials passed on by prior student test-takers, a collection of previous exam questions) and augmented (e.g., suggested use of flash cards to review key concepts derived from themes used in past exams). A faculty-generated checklist of course content to review was added as an additional study aid. Students confirmed that provision of the grading rubric in advance of the exam is helpful to their understanding of expectations and supportive of ways in which to focus their energies.


Written directions to students and faculty were reviewed for parallelism of structure and language. Faculty find that use of the grading rubric is (a) helpful, upon administration, in arriving at consensus about student performance on the exam and (b) useful, following the exam, in guiding improvement based on the resultant picture of individual student strengths and weaknesses.

The rubric was reviewed for its role in evaluating student learning outcomes. The built-in scoring mechanism and wording of rubric dimensions were reviewed for accuracy, clarity, and consistency. The exam has 3 distinct goals: (1) grasp of basic principles and evidence of synthesis, (2) adequate written expression, and (3) adequate oral expression. It is a two-part process (a timed written portion followed by an oral portion) that is viewed as a whole in determining whether a student has passed or failed. Scoring directions are provided on the rubric. If a student's written exam is scored such that it is impossible for the student to pass the whole exam, the student is given the option to take or forgo the oral examination. Students may have one more opportunity to retake the exam. At the time of this evaluation, review of student scores on the exam (2012-2015) revealed that a majority of students received scores ranging from 'outstanding' to 'good' on oral performance and on understanding of basic principles and evidence of synthesis, with none failing to progress to the oral portion of the exam. However, scores on written performance were noticeably lower overall. This finding suggests that support for helping students develop their scholarly writing skills should be sought and intensified.

Use of Assessment Data & Next Steps for Goal 5

Helping students prepare to be successful in their encounters with this first program milestone and tracking of exam outcomes will be ongoing activities.

Results of this assessment direct our next steps toward helping students to improve their writing skills.

Goal 6: Trial of a 3-Paper (Manuscript) PhD Dissertation Option

Activities in Support of Goal 6:

Benchmarking of current practices included (a) a survey of dissertation formats at UR and (b) a survey of policies in other schools of nursing regarding the use of a 3-paper (manuscript) PhD dissertation option.

Guidelines were developed to support a limited trial of the 3-paper dissertation format. Two students and their dissertation committee members participated in the trial and provided regular progress reports.

Both students completed the trial and successfully defended their dissertations.


Assessment Strategies and Targets for Goal 6:

Student and faculty trial participants collaborated with PhD Subcommittee members on the identification of factors affecting student choice to pursue the 3-paper versus the traditional 5-chapter dissertation format.

On the basis of trial outcomes, the PhD Subcommittee drafted a policy and developed written guidelines for the 3-paper dissertation option, as well as procedures governing choice between the two options (3-paper or traditional chapter format). Student and faculty trial participants provided feedback in the form of additions, corrections, and approval of the draft. The policy and procedures governing choice between the two options were incorporated into the 2016-2017 PhD Student Handbook.

Results, Use of Assessment Data & Next Steps for Goal 6:

Follow-up of dissertation option choices made by students and the outcomes of those choices will be ongoing.


UNIVERSITY GRADUATE STUDIES

Assessment Report for Academic Year 2015-2016

Dr. Margaret H. Kearney Reporting

1. Significant changes in formal or informal assessment of student learning activities since the last annual report: No changes were made. I continue to rely on two surveys (one of exiting PhD students, which is administered by many AAU universities, and one of faculty evaluating final PhD defenses, which was developed here). These data are examined year by year and school by school, with results provided to deans of graduate studies in each school. I also rely on verbal input from deans and directors of PhD programs across the University, who constitute the Council on Graduate Studies.

2. Monitoring of key indicators of concern to the quality of student learning: In reviewing data from the past year, the ratings on almost all aspects of our PhD programs remain high, from both faculty and student perspectives. However, two areas (one on PhD students' exit surveys and one on faculty post-defense surveys) are being tracked.

a. Faculty PhD committee members' ratings of soon-to-be PhD graduates' methodological rigor and written and oral skills as demonstrated in the dissertation and defense (items 5 and 6 below) were lower than their ratings of the other four criteria in 2014-15 and are being monitored. The data below (possible range 1-6, with lower being better) show an improvement in all areas, and particularly in these two areas, from 2014-15 (758 raters) to 2015-16 (936 raters):

In the final dissertation and oral defense, the PhD graduate will be able to:     Mean 2014-15   Mean 2015-16

1. Select and defend an important problem or topic for study.                         1.82           1.81
2. Demonstrate mastery of relevant knowledge in the field.                            2.28           2.23
3. Apply rigorous methods of the discipline.                                          2.39           2.33
4. Produce a valuable product and accurately appraise its importance.                 2.27           2.24
5. Communicate effectively in academic writing.                                       2.49           2.34
6. Effectively defend the work when questioned.                                       2.59           2.43


It is important to note that scores between 2 and 3 are not poor results. For writing, 2 = "writing in dissertation was fluent, clear, and effective" and 3 = "writing was correct and easily understood." For oral defense, 2 = "responses to all questions were well-argued, convincing, and engaging" and 3 = "effectively explained reasoning and decisions when questioned." These response options were intentionally written to detect subtle differences at a very high level of achievement.

Seen another way, the percentage of worrisome ratings of 5 or 6 has dropped:

In the final dissertation and oral defense,          2014-2015 (out of 758)        2015-2016 (out of 936)
the PhD graduate will be able to:                  Number of 5 or 6   % 5 or 6   Number of 5 or 6   % 5 or 6

5. Communicate effectively in academic writing.           32            4.2%            29            3.0%
6. Effectively defend the work when questioned.           32            4.2%            38            4.0%

While the faculty continued to indicate slightly lower enthusiasm for PhD graduates’ written and oral skills than for the quality of the dissertation study itself, the scores moved out of a worrisome range.

b. Exiting PhD students’ lower ratings of the quality of advice from their advisors on non-academic career options than on academic career options were first identified as an issue in 2013. While almost all other ratings were very positive, these stood out as lower than desired. Looking at 3 years of data below, we see an improvement in 2014-15, and in 2015-16 our targets for satisfaction were met. Possible explanations are noted below under “closing the loop.”

Target: Both academic and non-academic career advice will be rated on the PhD Exit Survey as "somewhat helpful" or "very helpful" by >60% and as "very helpful" by >30%.

Data Source and Results/Evidence of Effectiveness – PhD Exit Survey: Career Advice from Mentor

                          Late 2013 (N=67)   2014-2015 (N=184)   2015-2016 (N=247)
Academic careers:
  "somewhat helpful"            38%                24%                 30%
  "very helpful"                38%                50%                 58%
  TOTAL                         76%                74%                 88%
Non-academic careers:
  "somewhat helpful"            28%                17%                 31%
  "very helpful"                21%                34%                 40%
  TOTAL                         49%                51%                 71%

(Non-academic career advice fell below target in late 2013 and 2014-2015.)
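The target logic lends itself to a quick arithmetic check. The sketch below is illustrative only: the percentages are transcribed from the table above, and the `meets_target` helper is an assumption about how the stated thresholds combine, not part of the report.

```python
# Percentages transcribed from the PhD Exit Survey table above:
# (somewhat helpful %, very helpful %) per cohort.
ratings = {
    "academic": {"late 2013": (38, 38), "2014-2015": (24, 50), "2015-2016": (30, 58)},
    "non-academic": {"late 2013": (28, 21), "2014-2015": (17, 34), "2015-2016": (31, 40)},
}

def meets_target(somewhat_pct, very_pct):
    """Target: 'somewhat' + 'very helpful' > 60% AND 'very helpful' > 30%."""
    return (somewhat_pct + very_pct) > 60 and very_pct > 30

for track, years in ratings.items():
    for year, (somewhat, very) in years.items():
        status = "met target" if meets_target(somewhat, very) else "below target"
        print(f"{track:12s} {year:10s} {status}")
```

Running the check reproduces the report's conclusion: academic career advice met target in all three cohorts, while non-academic advice missed it in late 2013 and 2014-2015 before meeting it in 2015-2016.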


c. “Closing the loop” activities over the past year (e.g., program changes or teaching/learning strategies) to address student learning needs previously identified:

Since 2013, based on the perceived need to improve students’ writing and oral presentation abilities, SMD instituted a writing coach service. Student comments on the exit survey indicate that this was very helpful. The anticipated launch of a campus-wide English language training service (Cambridge) open to both undergraduates and graduate students may further assist with both written and oral presentation skills. In the realm of career planning and specifically non-academic careers, three programs were launched in 2013-2015: CIRTL, providing training for teaching careers, housed in ASE but open to all; the Center for Academic and Professional Success in the SON; and a number of SMD resources including lectures, workshops, internships, and mentoring facilitated by a federal grant (UR BEST). SMD and ASE also sponsored a University subscription to VirtualPhD.com, an online resource for career options outside of academia, with access on the University Graduate Studies website. ASE is particularly encouraging students’ use of this resource. Although the more positive ratings in both these monitored areas cannot be attributed to a specific program change, we are happy to see satisfaction moving in a positive direction and will continue to monitor them.

d. Current and anticipated assessment challenges from external sources (upcoming discipline-specific accreditation reviews; accreditors’ changes in processes; etc.). The only anticipated challenge is the workload of data management in Institutional Research, to enable university-wide dashboards of grad student data to support plans described below.

e. Planned changes in formal assessment practice or procedures (software changes affecting assessment processes; new staffing; partnerships; etc.). Happily, we do not anticipate any staff or software changes in the coming year. We are enthusiastic about joining others in the Provost's administrative group in using Tableau to display data on graduate students and their outcomes for internal planning purposes. I have also set a goal of posting PhD outcomes, such as time to degree and degree completion rates, on the Graduate Studies website in the coming year. As a future evaluation focus, I am starting to look at some thought-provoking five-year University-wide trends in graduate students' applications, selectivity, and yield. These data reflect our programs' reputation and the competitiveness of admission offers, and they affect program quality through the quality of the student body.


f. Professional development activities by the assessment liaison to remain up-to-date on assessment approaches. As a member of the executive committee of the Association of Graduate Schools (AGS), the AAU's graduate school deans' group, I have been fortunate to be exposed to the assessment practices, concerns, and forecasts of graduate deans across the top schools in the US and Canada. I helped organize and attended conferences in September 2014, 2015, and 2016, and moderated a session in 2016 on making PhD outcomes data public on university websites. This is of particular interest, as AAU is considering making the posting of certain metrics an expectation of AAU members. As the AGS liaison to AAUDE (the AAU Data Exchange), I will be attending their annual meeting in Florida in April. I am also co-chairing a national working group of AGS and AAUDE representatives that is looking at refining AAUDE's metrics for time to degree, completion rates, and the PhD exit survey, and is holding preliminary discussions of an AAUDE-wide approach to tracking PhD career outcomes. This level of national participation will enable me to bring best practices back to UR.


WARNER SCHOOL OF EDUCATION AND HUMAN DEVELOPMENT

Assessment Report for Academic Year 2015-2016

Dr. Karen DeAngelis Reporting

Goal 1: Accreditation

As noted in the 2014-15 annual report, reaccreditation for the Warner School's educator preparation programs occurred during the 2015-16 academic year. In August 2015, the Warner School submitted its NCATE self-study report in preparation for its final NCATE site visit in April 2016. In November 2016, the Warner School received official notification that it had met all of NCATE's standards for reaccreditation, with the following three Areas for Improvement (AFIs):

1. Standard 4, Diversity: The unit has not articulated candidate proficiencies related to diversity identified in the unit's conceptual framework (in its advanced programs).

2. Standard 4, Diversity: Candidates in the K-12 initial and advanced programs have limited opportunities for interaction with faculty from at least two ethnic/racial groups.

3. Standard 5, Faculty Qualifications, Performance, and Development: The unit does not have a systematic, formal process for evaluating university supervisors with consistency (in its initial programs).

A plan to address these AFIs is noted in the Next Steps section below.

In fall 2014, the Warner School submitted Specialty Program Area (SPA) reports in order to be considered for national recognition in each of its licensure areas, as required by New York State and NCATE. As part of the accreditation standards, NCATE requires Educator Preparation Providers (EPPs) in states that require SPA recognition to demonstrate national recognition in 51% or more of their programs. In spring 2015, the Warner School learned that 7 of its reviewed programs were awarded national recognition, whereas 5 programs (Building and District Leadership, Elementary/Childhood, and Reading and Literacy – birth to grade 6 and grades 5 to 12) required revisions. Between spring 2015 and summer 2016, revisions were made to the curriculum, assessments, and/or rubrics in each of these 5 programs in an effort to address the concerns raised by the SPA reviewers. Reports documenting these revisions were submitted by the required deadline in September 2016. The Warner School will receive feedback on these revisions in spring 2017.

Next steps. Following this 2016 NCATE decision, the Warner School is required to transition its educator preparation programs to the new standards and policies of the Council for the Accreditation of Educator Preparation (CAEP), the accreditation body that has replaced NCATE. This transition will take significant planning and effort due to the change in standards and the shift in expectations toward fewer, higher-quality assessments and greater emphasis on students' and graduates' effectiveness in the field.


The Warner School's first CAEP accreditation visit will take place in 2023. In the meantime, CAEP requires annual reporting of common data from its EPPs, including progress made toward addressing their AFIs. To lead this transition and the ongoing reporting and assessment responsibilities associated with CAEP's accreditation process, Dean Borasi of the Warner School created a new, part-time administrative position, Associate Dean for Academic Programs, in July 2016. One of the responsibilities of this Associate Dean is coordinating and overseeing CAEP accreditation. Karen DeAngelis, Associate Professor and Chair of Warner's Educational Leadership programs, was appointed to this position. During the 2016-17 academic year, CAEP-related activities will include:

o Associate Dean DeAngelis will work with Dean Borasi to identify a Warner School accreditation/assessment committee that will be charged with leading the effort to implement the changes and continuous improvement cycle required by CAEP. Associate Dean DeAngelis will begin convening this committee during the spring semester of 2017 in order to create a transition plan and begin working with other relevant Warner School faculty and staff to implement it. At a minimum, the committee will meet twice per semester. Agendas and minutes from the meetings will be used to document the decisions, activities, and progress made by the committee.

o Associate Dean DeAngelis and other members of the aforementioned committee will participate in professional development opportunities offered by CAEP and other accreditation and/or assessment organizations. Associate Dean DeAngelis attended the bi-annual, three-day CAEPCon conference in September 2016 and will be attending the next one in March 2017 with other members of the accreditation/assessment committee.

o In January 2017, Associate Dean DeAngelis will bring together program chairs and relevant faculty from Warner's educator preparation programs to determine strategies for addressing the three AFIs noted above. The first and third AFIs each require the creation of a new assessment tool. These assessments will be embedded in program courses and/or milestones (e.g., portfolio), and the results will be reviewed annually by program area faculty and chairs to inform needed changes to coursework (AFI 1) or supervisor training and/or retention (AFI 3).

o The second AFI requires additional efforts to expand the diversity of faculty and instructional staff in our programs. Following the University’s Report of the Commission on Race and Diversity, Dean Borasi created a Faculty Hiring Task Force that will convene for the first time in January 2017 to review and amend Warner School policies and practices in order to increase the diversity of its faculty and instructional staff. The Task Force includes Warner’s two Faculty Diversity Officers, one Staff Diversity Officer, three program area chairs, and the Warner Center director. As she has been doing throughout her term and as required by Warner’s accreditation body, Dean Borasi will track the number of faculty and instructor positions held by underrepresented individuals, with the information used to assess progress over time and to inform future recruitment efforts.

o Associate Dean DeAngelis will submit CAEP’s required annual report in spring 2017.


Goal 2: “Life Cycle” of Student Support

As noted in the 2014-15 annual report, a primary goal in the Warner School's 2013-2018 Strategic Plan is to provide support throughout Warner students' "life cycle", which extends from their initial contact with the Warner School as applicants through their career development and engagement as alumni. In February 2015, a part-time career services professional, Harriette Royer, was hired as Assistant Director of Admissions and Student Services.

During the 2015-16 academic year, Assistant Director Royer conducted 12 career development workshops for Warner School students on a variety of topics, all focused on preparing students for internship and career opportunities during and following their time at Warner. In addition, she provided 1:1 consultations with students and alumni by request. The ultimate goal of these services is gainful employment of Warner School graduates in their chosen fields. The job market outcomes of graduates are important to track and understand not only for Warner's own internal purposes but also for accreditation, as one of the five CAEP standards focuses on the post-graduate outcomes (i.e., employment and effectiveness) of our students.

Next steps: To date, the career workshops and consultation sessions have not been formally evaluated. Assistant Director Royer has simply worked with program advisors, students, and faculty to understand Warner students' career-related needs and has created the range of current offerings in response. Associate Dean DeAngelis will work with Assistant Director Royer in spring 2017 to develop protocols to evaluate and inform the direction of these services.

Tracking graduates' career outcomes requires a comprehensive database of Warner School alumni, including up-to-date contact information in order to maintain contact with them. In fall 2016, Associate Deans Brent and DeAngelis began meeting with Assistant Director Royer and other Warner staff (from IT, Student Services, Admissions, and Advancement) to determine how best to construct and keep updated such a database. The database needs to be designed to serve multiple Warner School purposes, including enabling us to gather employment information over time, to contact alumni about events and other opportunities, and to survey alumni of our educator preparation programs and their employers on an annual basis as required by CAEP and CACREP (the accreditation body for Warner's counseling programs). One activity in support of this database construction effort, implemented in December 2016, is the administration of a mandatory survey to be completed by graduates just prior to their graduation. The survey collects personal/permanent email and other contact information, as well as employment information, if known. This survey will enable the Warner School to track graduates moving forward. Associate Dean Brent will track survey completion rates to determine the effectiveness of this approach in collecting graduating students' information.

In addition, the Warner School is utilizing the services of an external consulting firm to gather the contact and employment information of Warner School graduates back to 2011 in an effort to populate the database with recent graduates' information. The percentage of graduates whose information is found or updated by the firm will determine whether its services should be continued.


A third activity that has been implemented is an electronic form that faculty and staff can complete to provide updated information on alumni as they obtain it. This form is based on the idea that the program advisors and other faculty and staff who worked most closely with students during their time at Warner are the most likely to maintain contact with them following graduation. The form will allow us to track when and by whom a graduate's record was last updated, thereby providing information regarding faculty participation in the effort and the usefulness of the form in updating the database.

Ultimately, the goal of these efforts is to enable the Warner School in the coming years to analyze, report, and reflect on graduates' employment outcomes and career trajectories and on how our programs might be improved to support these outcomes.


THE COLLEGE

Assessment Report Academic Year 2015-2016

Josephine Seddon Reporting

In October 2016, the Director of Educational Innovation and Assessment position was redefined and named the Director of Educational Effectiveness.

In November and December, numerous meetings with department chairs, program directors, and office/center leaders supported the gathering of information about College initiatives in progress during 2016. Details were also gleaned from a variety of sources, including committee discussions/minutes, email communications, reports, survey research results, and additional artifacts in electronic file archives. This report summarizes findings thus far with respect to curricular and assessment initiatives for the College.

The report provides information about the School of Arts and Sciences, the School of Engineering, and the College offices and centers that lead academic support and co-curricular programs.

Reflection on Achievement of Annual Goals (as noted in College Assessment Plan for 2016)

1) The College STEM Foundation Course Initiative was designed with the goal of improving student learning and performance in the STEM foundation courses (math, physics, and chemistry) and is still in progress. This initiative includes an option to take the required two-course sequence in math (Math161/162) as a three-course sequence (Math141/142/143). The Kearns Center was involved in piloting the alternate three-course sequence, which was then made more generally available last year under Summer Programs. Preliminary results show increased retention among engineering students, especially with individual follow-up by staff from each school's dean's office. Investigations are continuing into the impact on retention across the undergraduate populations, especially with respect to the interventions. This initiative will continue during AY2016-2017.

2) The College Portfolio Initiative is currently on hold due to change in personnel and a university-wide effort to review and vet potential technologies that could best support electronic portfolio related products.

3) The College Curriculum Review Committee has completed its review and provided recommendations, noted in the final section of this report. Follow-up is planned for AY2016-2017 with most immediate attention to focus on the recommended “additional cluster” option.

4) The College Major Innovation and Assessment Initiative is also currently on hold. This goal will be revised with plans to establish a College-wide committee tasked with providing leadership and support to assist faculty in designing and implementing curricula and assessments. Special focus will be placed on “closing the loop” for continual improvement of programs and initiatives. Meeting and action details will be shared regularly with other stakeholders, and plans to collaborate across departments, units, and other committees will be paramount. Department and office/center/unit assessment planning workshops will continue to be offered in the next academic year.

5) The Graduate Studies Professional Skill Development Initiative will need to be revisited due to changes in personnel/staff. The previous Dean for Graduate Studies is now the Dean for the Hajim School of Engineering. Plans are in progress for the new Dean of Graduate Studies and the new Director of Educational Effectiveness for the College to meet and discuss desired/necessary activities for AY2016-2017.


THE COLLEGE (continued)

Assessment of Curricular and Co-Curricular Programs

Program learning outcomes and individual assessment plans for the collection of both direct and indirect evidence exist for degree majors across all departments and programs in the College; many also include course reflection memos that serve as a baseline for future curriculum and assessment initiatives.

Co-curricular programs, directed mainly by the centers and offices that support students in the College, also regularly collect and monitor assessment data, including student participation rates, correlations of participation with academic success indicators, and anecdotal information to support decision making.

The College’s Office of Educational Effectiveness continues to play a critical role in preparing for and analyzing COFHE surveys (New Student Survey, Enrolled Student Survey, Senior Survey, Alumni Survey), UR-specific surveys, and CIRP tools. These results are shared within the Dean’s office and across the College as appropriate.

The plan to develop a College-wide portfolio project, independent of the university-wide effort, has been placed on hold, at least temporarily. Investigations into options for suitable technologies to support electronic assessment strategies are currently taking place across the university, with the College participating in the conversations and in the vetting of the various products.

The key areas of student learning concern addressed in last year’s assessment plan and report, oral and written communication, will require follow-up. A full review and assessment of the processes that support curricular student learning outcomes (discipline-specific and general education) and co-curricular student learning outcomes will be included in the charge for a planned College committee tasked with collaborating and providing leadership and support in planning assessment infrastructure and protocols for the continual improvement of programs.

Assessment of the Rochester Curriculum – Recommendations and Actions

In January 2016, the College Curriculum Committee (CCC) completed its review of the Rochester Curriculum and prepared a report with its findings. The committee states in its report that “The committee believes that the Rochester Curriculum is still successful, and its recommendations were minor. They emphasized areas such as global engagement, experiential learning, and the cluster system” (CCC, 2016, p.3). Excerpts from the executive summary are included below for reference, along with the action taken as a result.


Goal 1: General Education

RECOMMENDATION: “Permit students to complete up to two bonus clusters. These are clusters completed by a student beyond those used to meet the distributional requirements and will be listed on the transcript. This category would include a second cluster taken by Engineering students, as they are required to take only one cluster either in humanities or social sciences. Recognition of bonus clusters encourages students to explore new areas and pursue them in some depth.” ACTION 2015-2016: Students will be permitted to complete up to two bonus clusters.

RECOMMENDATION: “Every introductory-level course should be included in a cluster, so that any course a freshman takes (other than those associated with the primary writing requirement) could be used for a major or a cluster. This would enable students to explore various areas of interest without their having to worry about satisfying general education requirements.” ACTION 2015-2016: Under consideration. Follow-up is planned for Summer 2017 by the College Curriculum Committee.

Goal 2: Writing

RECOMMENDATION: “The Writing, Speaking, and Argument Program has collaborated with several departments in the Hajim School and with some departments in Arts and Sciences in the development of discipline specific upper-level writing courses. Expand the involvement of the Writing Program in upper-level writing courses to other departments, if such departments think this would be beneficial to their programs.” ACTION 2015-2016: The Writing Program is deeply involved in both lower-level and upper-level writing courses, and its involvement will expand where possible. This will need to be revisited; a meeting with the Writing Program Director has been requested to discuss assessment results.

Goal 3: Experiential Learning

RECOMMENDATION: “To facilitate greater opportunities for independent study, encourage department chairs to request funds in their instructional budgets to cover some of the teaching for faculty heavily involved in supervising students doing independent study.” ACTION 2015-2016: The Office of Undergraduate Research continues to provide networking and support for research opportunities for undergraduates. Additional funding is not available at this time.

RECOMMENDATION: “Publicize the availability of Discover Grants, a $50,000 program to expand undergraduate research opportunities. Proposals are accepted from either students or faculty, but must include at least one faculty advisor.” ACTION 2015-2016: The Office of Undergraduate Research continues to provide networking and support for research opportunities for undergraduates. Discover Grants are publicized via the website at http://www.rochester.edu/college/ugresearch/discover.html

RECOMMENDATION: “To ensure that all faculty interested in including community-engaged activities in their courses can do so, continue to publicize and support the Community-Engaged Learning Fund.” ACTION 2015-2016: The Office of Undergraduate Research continues to provide networking and support for research opportunities for undergraduates. The Community-Engaged Learning Fund is publicized at https://www.rochester.edu/college/rccl/faculty/minigrants.html

RECOMMENDATION: “Standardize the numbering of 39x courses across departments.” ACTION 2015-2016: Standardization of 39x course numbering across departments has been approved by the College Curriculum Committee and the Steering Committee, with strong support from the Career Center. Implementation is ongoing and will continue through the coming years.


Goal 4: Global Engagement

RECOMMENDATION: “To facilitate student participation, departments should identify several international programs that it can particularly recommend to its majors for study abroad.” ACTION 2015-2016: Follow-up will be required with respect to further identification of international programs that departments could recommend to their majors. At this time, information regarding international programs affiliated with degree majors can be found on the University of Rochester website and is also posted in many areas across campus. The Director for Educational Effectiveness will work with the Center for Education Abroad and academic departments during 2017 to assist in identifying potential alignment between program goals and international program initiatives.

Goal 5: Career Preparation

RECOMMENDATION: “Appointment of a small committee to oversee and coordinate various initiatives that are currently underway or under consideration, including development of an e-portfolio program, a proposed career and internship center course, additional upper-level writing courses as described above, and the KEY program.” ACTION 2015-2016: This will require additional follow-up in light of the fact that a university-wide committee is reviewing e-portfolio programs. It is possible that the university-wide committee, in collaboration with the IT team, will fulfill the role of the small committee noted in this observation by the Curriculum Review Committee.


SIMON BUSINESS SCHOOL Assessment Report for Academic Year 2015-2016 Greg Dobson Reporting

Goal 1: This goal follows up on a goal expressed in 2014-2015 to develop and implement the Simon EDGE program in fall 2015. The plan was to offer the EDGE program to the incoming class of 2018 in fall 2016, with continued annual reporting.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL: The Simon EDGE program was put in place in 2015-2016 and continued in 2016-2017. There was a long-term understanding in the school that our students needed to improve on a number of dimensions: problem solving, communication, leadership, global awareness, integrity, and team building. The program brought together a number of initiatives that were previously run separately (the Management Communication course, a “coach” program, a workshop program similar to what is done in the College, and activities associated with the CMC) and added additional components (a first-year team project, mentored by business people and faculty). The effort was designed to brand these activities so students would take them more seriously, because much of the effort was “co-curricular” and little of it received academic course credit. Examples of changes: the problem-solving seminar (consulting-style problem solving) was combined with the first-year project to give students practice with such skills. The non-credit leading-teams course, previously offered only to workshop leaders, was expanded to include the coaches as well, because feedback from students who had taken on both roles indicated that the skills taught to workshop leaders were also useful in the coach role. This course was given 1 credit for each of two terms (without tuition payment).

ASSESSMENT STRATEGY: At the end of each academic year, David Tilson, Associate Dean of the Full-Time MBA, who oversaw the Simon EDGE program, organized reviews of the program.

RESULTS, USE OF THE ASSESSMENT: Actions that resulted from the June 2016 meeting:

1. Aligning the interpersonal learning goals of the Management Communications Course (MGC) and the Career Services Center (CMC).

2. Possibly renaming the (student) coach role to eliminate confusion with the executive coaches.

3. Reviewing the writing center’s needs.

4. Reviewing how student teams are formed and reformed throughout the first year.


Goal 2: This goal follows up on another goal expressed in the 2014-2015 academic year: to investigate the benefits of transitioning to a semester calendar.

ACTIVITIES SUPPORTING ACHIEVEMENT OF GOAL: The possibility that the school would move to a semester calendar generated a great deal of discussion among the faculty. Eventually four proposals were put forth: 1) stay on the quarter system as it currently exists; 2) change to a quarter system with a winter quarter split by the Christmas holiday, as RIT did in the past; 3) move to a semester system with the intention of offering most courses as 2-credit minis (half-semester courses); and 4) move to a semester system with the intention of offering most courses as 3-credit minis (half-semester courses). A 3-credit mini would meet 4 hours per week and thus carry the same credit hours as our previous quarter courses. The benefits of moving to a semester calendar are alignment with the rest of the university for the services the university offers our students, and alignment with most other business schools and thus with many firms who hire our students for internships and full-time jobs. The lack of alignment with businesses is causing increasing conflicts for our students and faculty, e.g., requests for early exams in the spring so students can begin internships before classes have ended, and students not being on campus in the fall for the peak of the recruiting season for full-time positions. The benefit of remaining on the quarter system would be that faculty would not have to redesign their courses or take on different teaching assignments. The 3-credit mini plan (#4) also had this benefit, because the current quarter courses would be placed into a 3-credit mini schedule, and teaching assignments and course content would remain the same for most faculty.

There was substantial debate; proposals listing benefits and costs were circulated, a faculty meeting was held at which a spokesperson for each option presented, and then the faculty voted. Option #4 was selected.

ASSESSMENT STRATEGIES AND/OR TARGETS: The target date to move to a semester model that includes mini-courses within each semester is fall 2019. Faculty and Dean’s office discussions around implementation strategies will be ongoing throughout the transition. The details of how the 3-credit mini courses will be offered in the part-time evening program have yet to be determined. Because the vast majority of our elective (non-core) courses are offered in the evening, and full-time students therefore take many of their electives in the evening, how this is implemented will have a substantial impact on all our students.

RESULTS, USE OF ASSESSMENT DATA AND FOLLOW-UP: The Office of Student Engagement (OSE) runs regular surveys of the students, and their satisfaction with the change in calendar will certainly be captured. The class graduating in June 2020 will experience both calendars and can offer insights into the change. Assessment of learning can be used to determine whether the calendar change affected learning outcomes for Simon students, although without a baseline, interpretation of these assessments will be difficult.


Goal 3: Review three programs, make appropriate modifications, and submit the changes to the New York State Education Department for approval (MBA in Public Accounting, MS in Marketing Analytics, MS in Business Analytics).

ACTIVITIES SUPPORTING THE GOAL: The two MS programs were reviewed by faculty, and the determination was made to update coursework to increase the quantitative and analytic content of the courses to qualify them for STEM CIP codes. MS Marketing Analytics: three new courses were developed to achieve STEM qualification: GBA462R Core Statistics for MS Students Using R, MKT412R Marketing Research Using R, and MKT465 Marketing Projects. MS Business Analytics: four new courses were developed to achieve STEM qualification: CIA442E Data Management for Analytics, GBA463 Economics and Marketing Strategy for MS Students, GBA466 Accounting and Finance for MS Students, and GBA4664 Programming for Analytics.

ASSESSMENT STRATEGIES AND/OR TARGETS: The goal will be considered achieved when the programs are approved by the State with STEM CIP codes. MS Marketing Analytics: approved by NYSED 9/8/2015. MS Business Analytics: approved by NYSED 9/8/2015. To review the efficacy of these changes, the school uses the following resources: 1) the Master’s Advisory Council (a student government committee); 2) student evaluations of courses (reviewed by the Associate Dean of Faculty); 3) the MS Student Satisfaction Survey (run by the Office of Student Engagement); and 4) the MS faculty committees (one for each MS program).

RESULTS, USE OF ASSESSMENT DATA AND FOLLOW-UP: Review of the data will inform potential additional changes to coursework.


2016 UCEEA Professional Development Activity

Margaret Kearney (Graduate Studies)
- Mar. 2016: Association of Graduate Schools Executive Committee and Annual Conference
- Apr. 2016: AAU Data Exchange Annual Conference
- Sept. 2016: Association of Graduate Schools Executive Committee and Annual Conference
- Dec. 2016: Council of Graduate Schools Annual Conference

Christopher Mooney (School of Medicine & Dentistry – Med. Ed.)
- July 2016: Directors of Research in Medical Education Annual Meeting

John Hain (Eastman School of Music)
- Nov. 2016: National Association of Schools of Music Annual Meeting

Bethel Powers (School of Nursing)
- Jan. 2016: American Association of Colleges of Nursing Annual Education Conference

Karen DeAngelis (Warner School of Education)
- Sept. 2016: Council for Accreditation of Educator Preparation Biannual Meeting

Jane Marie Souza (Office of the Provost)
- Jan. 2016: Accreditation Team Member for CPME
- Apr. 2016: Served on MSCHE Working Group
- Apr. 2016: Accreditation Team Member for MSCHE
- June 2016: Assessment of Learning in Higher Education Conference (presenter)
- Sept. 2016: Drexel Assessment Conference (presenter/keynote)
- Dec. 2016: MSCHE Annual Conference


2015 UCEEA Professional Development Activity

Margaret Kearney (Graduate Studies)
- Mar. 2015 & Dec. 2015: Association of Graduate Schools Executive Committee and Annual Conference

Christopher Mooney (School of Medicine & Dentistry – Med. Ed.)
- June 2015: Society for Directors of Research in Medical Education Summer Conference
- Nov. 2015: AAMC Medical Education Conference

Barbara Masi (Arts, Science, Engineering)
- June 2015: American Society for Engineering Education assessment workshops

Jane Marie Souza (Office of the Provost)
- June 2015: Association for Assessment of Learning in Higher Education Annual Conference (presenter)
- Aug. 2015: Periodic Report Review Evaluator for Middle States Commission on Higher Education
- Oct. 2015: Assessment Institute in Indianapolis (keynote presenter)
- Dec. 2015: Middle States Annual Conference

Linda Lipani (School of Medicine & Dentistry – Grad. Ed.)
- June 2015: Association for Assessment of Learning in Higher Education Annual Conference
- Fall 2015: Book: Leading Change by John P. Kotter

James Zavislan (Hajim School of Engineering)
- Aug. 2015: ABET Annual Meeting

Amy Bruinooge (Simon Business School)
- Dec. 2015: Middle States Annual Conference

John Hain (Eastman School of Music)
- Dec. 2015: Middle States Annual Conference