As unionized Faculty, we oversee and coordinate the development of a campus-wide program of assessment.
OFFICE OF EDUCATIONAL ASSESSMENT
Session Learning Outcomes
Together we will:
1. Evaluate comprehensive assessment plan processes, designed around a three-year cycle, along with a specific example of evidence-based program improvement.
2. Encounter specific methods of evidence collection from within existing institutional structures, as well as the benefits and challenges of creating electronic tools.
3. Examine consistency with current assessment practices, especially for programs accountable to external accrediting bodies, such as AACSB's assessment requirements for business programs.
Fall 2014: We developed a comprehensive assessment plan guided by the following principles.
1. Program level assessment should be the primary focus.
2. The plan must reflect assessment as a continuous process open to improvement.
3. Although necessarily well-structured, continuous assessment should not impose undue burdens on programs but rather augment existing initiatives to facilitate program development.
Program Level Assessment Plan
Note: For the sake of convenience, it is assumed that there are 12 PLOs for the program.

Year 1
1. Identify PLOs 1-4 to assess.
2. Develop assessment tools for courses linked to PLOs 1-4.
3. Assess PLOs through courses linked to PLOs 1-4.
4. Prepare an assessment report detailing the assessment activities and results, and outlining the steps that may be taken to "close the loop" in the courses linked to PLOs 1-4.

Year 2
1. Identify PLOs 5-8 to assess.
2. Develop assessment tools for courses linked to PLOs 5-8.
3. Assess PLOs through courses linked to PLOs 5-8.
4. Prepare an assessment report detailing the assessment activities and results, and outlining the steps that may be taken to "close the loop" in the courses linked to PLOs 5-8.
5. Implement steps for closing the loop in courses linked to PLOs 1-4 as identified in Year 1.
6. Prepare a short assessment report outlining the changes and the assessment results.

Year 3
1. Identify PLOs 9-12 to assess.
2. Develop assessment tools for courses linked to PLOs 9-12.
3. Assess PLOs through courses linked to PLOs 9-12.
4. Prepare an assessment report detailing the assessment activities and results, and outlining the steps that may be taken to "close the loop" in the courses linked to PLOs 9-12.
5. Implement steps for closing the loop in courses linked to PLOs 5-8 as identified in Year 2.
6. Prepare a short assessment report outlining the changes and the assessment results.
7. Monitor the changes in courses linked to PLOs 1-4 (implemented in Year 2), plan any future changes, and prepare a brief status report.
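The rotating pattern above (assess one PLO group, close the loop on last year's group, monitor the group before that) can be sketched as a small scheduling function. This is an illustrative helper, not part of the plan itself; the function name and grouping of 12 PLOs into three blocks of four follow the example above.

```python
# Sketch of the rotating three-year assessment cycle (hypothetical helper).
# Each year a program assesses one group of PLOs, closes the loop on the
# group assessed the previous year, and monitors the group closed the
# year before that.

PLO_GROUPS = {1: "PLOs 1-4", 2: "PLOs 5-8", 3: "PLOs 9-12"}

def cycle_activities(year: int) -> dict:
    """Return the assessment activities for a given (1-based) year of the cycle."""
    group = (year - 1) % 3 + 1                  # group assessed this year
    activities = {"assess": PLO_GROUPS[group]}
    if year >= 2:                               # close the loop on last year's group
        activities["close_loop"] = PLO_GROUPS[(year - 2) % 3 + 1]
    if year >= 3:                               # monitor the group from two years back
        activities["monitor"] = PLO_GROUPS[(year - 3) % 3 + 1]
    return activities
```

For example, `cycle_activities(3)` yields assessing PLOs 9-12 while closing the loop on PLOs 5-8 and monitoring PLOs 1-4, matching Year 3 of the plan.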
Evidence-Based Program Improvement Example: Assessment of Low Retention in the Physics/EE Department
[Chart: Enrollment by Academic Year]
Physics/EE Department Observations
• …we discovered that by the end of the fall semester we lose about 39.1% of our initial incoming majors, and by the end of their first academic year we lose 52.2% of our incoming majors…
• In the early 2000s, we relaxed math pre/co-requisites, placing all first-year department majors into PHYS 140 – Elements of Physics I in their first semester.
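The attrition figures above are simple cohort ratios. A minimal sketch of the arithmetic, using a hypothetical cohort of 23 chosen only because it reproduces the quoted percentages (the slides do not give the actual head counts):

```python
# Hypothetical cohort counts, chosen only to illustrate the attrition
# arithmetic; the actual department numbers are not given in the slides.
initial_majors = 23   # incoming majors in fall (illustrative)
lost_by_winter = 9    # majors lost by end of fall semester (illustrative)
lost_by_spring = 12   # majors lost by end of first academic year (illustrative)

fall_attrition = lost_by_winter / initial_majors
year_attrition = lost_by_spring / initial_majors

print(f"Fall attrition: {fall_attrition:.1%}")        # 39.1%
print(f"First-year attrition: {year_attrition:.1%}")  # 52.2%
```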
Student Mathematics Aptitude Investigation
• Scranton math placement exam (PT) scores

Group         DAT/26       SAT Math     SAT Verbal   HS GPA       Exam 1       Course Grade
PT ≤ 14/28    x̄ = 17.8     x̄ = 565      x̄ = 552      x̄ = 3.18     x̄ = 59.2     x̄ = 58.7
              s² = 20.8    s² = 3011    s² = 2528    s² = 0.10    s² = 253     s² = 306
PT > 14/28    x̄ = 23.0     x̄ = 661      x̄ = 599      x̄ = 3.62     x̄ = 75.5     x̄ = 78.1
              s² = 11.5    s² = 5193    s² = 5589    s² = 0.12    s² = 267     s² = 278

(x̄ = group mean, s² = group variance)
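The gap between the two placement groups can be summarized with a standardized effect size. A sketch computing Cohen's d from the reported means and variances; the helper name is mine, and because the table does not report sample sizes, the pooled standard deviation here simply averages the two group variances (an assumption of roughly equal group sizes):

```python
import math

def cohens_d(mean1, var1, mean2, var2):
    """Cohen's d using the average of the two group variances.
    Sample sizes are not reported in the table, so equal weighting is assumed."""
    pooled_sd = math.sqrt((var1 + var2) / 2)
    return (mean2 - mean1) / pooled_sd

# Means and variances from the table: PT <= 14/28 group vs. PT > 14/28 group.
print(cohens_d(58.7, 306, 78.1, 278))   # Course Grade: d is roughly 1.1
print(cohens_d(565, 3011, 661, 5193))   # SAT Math: d is roughly 1.5
```

Effect sizes above 0.8 are conventionally considered large, which is consistent with the department's decision to act on the placement-exam results.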
Implementation #1
• Close that loop! The natural next step would be to enforce the outcome of the math placement exam:
• If PT score ≤ 14, start with chemistry and hold off on introductory physics until the spring semester of the first year; place into MATH 103 – Pre-Calculus.
• If PT score > 14, start with introductory physics and place into MATH 114 – Calculus or higher.
• The effect of enforcement is a trailing physics course sequence!
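The enforcement rule above is a simple decision on the placement score; a minimal sketch, with the function name and dictionary shape being illustrative (course codes are taken from the slides):

```python
def first_semester_placement(pt_score: int) -> dict:
    """Hypothetical encoding of the placement-exam enforcement rule:
    PT <= 14 defers introductory physics to spring and starts pre-calculus;
    PT > 14 starts physics immediately alongside calculus or higher."""
    if pt_score <= 14:
        return {"fall": ["Chemistry", "MATH 103 - Pre-Calculus"],
                "spring": ["PHYS 140 - Elements of Physics I"]}
    return {"fall": ["PHYS 140 - Elements of Physics I",
                     "MATH 114 - Calculus (or higher)"],
            "spring": []}
```

Note that the low-PT branch is what produces the "trailing" physics sequence: those students reach PHYS 140 a semester later than their peers.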
Implementation #2
• Students would switch majors because of their struggle with mathematics before they ever saw any real content, so we did something else:
• We created a new foundations course, taken in the first semester, showing them the "fun" side of physics and engineering.
Results of Implementations
[Bar chart: Enrollment after Fall Semester of Each Respective Year, academic years 2012–2013 through 2016–2017. Series: First Years in Fall, First Years in Spring, All Engineering Majors in Spring, Total Department Students in Spring.]
[Chart: Implementations – Enrollment by Academic Year]
Institutional Assessment Data Collection Challenges
• Banner ERP – security issues
• Evisions' Argos – robust tool for accessing Banner data
• Office 365 – cloud-based; many new and emerging tools for collaboration
To assess programs like General Education, you absolutely need faculty buy-in across campus.
• How do you get faculty to participate in assessment across multiple colleges at one university?
• PLAN AHEAD and show them a SUCCESSFUL EXAMPLE.
First Year Oral Communication (FYOC) Project
• Across all three of our colleges: the College of Arts and Sciences, the Kania School of Management, and the Panuska College of Professional Studies
• 14 instructors, 29 course sections, 507 students
• Common rubric and data collection method