
Note: This version of the assessment report has some identifiable course information redacted. For a copy of the appendices, please contact the FYLI director.

Assessment Report for AY 2014-2015
First Year Learning Initiative
Prepared by Michelle Miller, Director, First Year Learning Initiative
June, 2015

Overview

This report uses assessment data gathered in AY 2014-2015 to address four questions about the First Year Learning Initiative (FYLI):

1. What is the scope of the FYLI program?
2. What happens inside FYLI-certified courses?
3. What is the impact of FYLI on student success?
4. What are the impacts of the FYLI Peer TA program?

Implications of the findings for the future of FYLI and recommendations for program refinement will be discussed.

Executive Summary

• What is the scope of the FYLI program?
- FYLI currently includes 78 courses, with 979 sections offered in AY 14-15.
- Total enrollment headcount was 50,150, up 22% from last AY.
- Virtually the entire first-year cohort enrolls in at least one FYLI course (97.9%).
- FYLI funded approximately 450 Peer TAships in 14-15.

• What happens inside FYLI-certified courses?
- Course coordinators report that course design and delivery shift substantially as a function of FYLI certification, especially for effort in the first two weeks, active engagement, and scaffolding.
- Many coordinators report going from little or no systematic use of GPS to consistent GPS use across sections.
- According to a survey of over 800 students in FYLI courses, these preferred course features were evident, especially required attendance and work due in the first few weeks of class; most also reported that resources such as office hours and tutoring were available.
- Systematic in-class observations of 13 FYLI classes showed that approximately 33% of class time was used for activities other than lecturing. Similarly, students spent over half of class time engaged in active learning, such as asking questions or working in groups.

• What is the impact of FYLI on student success?


- In AY 14-15, over 56,000 GPS alerts were sent to students in FYLI courses.
- Of the thirty top GPS-generating sections on the Mountain Campus, every one is part of a FYLI-certified course. Significantly more FYLI sections generate GPS alerts, compared to non-FYLI courses.
- Student survey results indicate high levels of agreement with behaviors and attitudes consistent with being a self-regulated, effective learner, particularly "It is very important to me that I work as hard as I possibly can on my academics" and "I examined how my behavioral choices led to specific results in this course."
- However, the number of FYLI courses taken (0-9) in the first year does not appear to impact DFW rates, GPA, retention, or academic standing.

• What are the impacts of the FYLI Peer TA program?
- Student testimonials emphasize a number of benefits, including learning course material more deeply, developing interpersonal and communication skills, and illuminating possible career paths.
- Peer TAs tend to have higher-than-average GPAs, but matched-sample analyses failed to reveal any impact of being a Peer TA on grades.

• Recommendations for the coming academic year:

1. Revisit the role of student surveying in FYLI's overall assessment plan
2. Identify and implement new ways to document the association between FYLI and student success
3. Gather baseline data prior to certification for new courses going forward
4. Begin working on a comprehensive plan to amplify the direct impacts of FYLI on student success
5. Pursue wider use of a modified version of the COPUS system
6. Amplify the impacts of the Peer TA program and expand assessment of those impacts
7. Continue efforts to improve compliance with required FYLI features, particularly GPS

Question 1: What is the scope of the FYLI program?

Due to budget limitations, we were unable to bring any more courses into the certification process over the last AY, except for one (GSP 130) for which the Dean of SBS committed to providing all funding for development and future Peer TA support. GSP is finalizing its materials for certification at the time of this report. FYLI currently includes 78 unique courses (Honors sections are certified separately but, for purposes of this count, are not considered separate certified courses). Many of these were multi-section courses; there were 979 total sections offered in F14-SP15. The total enrollment headcount was 50,150, with 13,175 individual students enrolled. 97.8% of the first-year cohort enrolled in at least one FYLI-certified course in AY 14-15. Compared to last year, total enrollment is up by 22%. Thus, even within current budget constraints, we are continuing to reach the large majority of first-year students through the program, as well as accommodate enrollment increases.

FYLI's Peer TA program continues to be a major component of the initiative. In 14-15, approximately 450 10-hour-per-week TAships were funded by FYLI.

Question 2: What happens inside FYLI-certified courses?

Several sources of evidence help illuminate the types of course design, teaching strategies, and learning activities that students experience when they take FYLI-certified courses. These include survey responses submitted by course coordinators, survey responses from students about their FYLI courses, and observer data on student and instructor behaviors in class.

A. Survey Responses from Course Coordinators

As described in prior assessment reports, coordinators submit a set of written responses to the questions discussed during the FYLI development process. These questions tap aspects of course design, pedagogy, and delivery organized into three broad areas: socializing students for success (e.g., how the course communicates high standards), best practices in design and pedagogy (e.g., emphasis on active pedagogy), and coordination (e.g., how the different sections of the course will provide a consistent experience). Each query is broken down into "before FYLI" and "after FYLI" to highlight areas where the course was revamped specifically as a result of FYLI participation. In this way, the self-reflection responses capture both important aspects of planned practices within a course and the impact of program participation on practice. Each query is also broken down into open-ended and closed-ended responses, to enable different ways of looking at the data.

This year's data set includes thirteen courses certified over Summer 2014 and one certified in Spring 2015. As described earlier, budget limitations prevented us from adding more than one course over the remainder of the year.

1. Analysis and Discussion of Closed-Ended Responses

The courses reported on include:

GER 101, SOC 101, ARB 101, SPA 101, CIE 100, JPN 101, ANT 101, CS 122 (lecture), CS 122 (lab), HUM 101, CHI 101, FRE 101, SW 220, POS 110

Appendix A shows the frequency of different responses to the closed-ended items. Most responses were formatted with four ordered choices, e.g., "Not at All/Somewhat Well/Often/Very Well/Extremely Well." Others had yes/no response options. The chart below contrasts Before FYLI and After FYLI answers for a selection of the most salient FYLI criteria: explicitly address study skills and time management, lecture strategically*, offer early and formative feedback, maximize student time on task, explain the time and effort needed to succeed, actively engage students, scaffold from less complex to more complex tasks/material, require students to invest effort in the first 2 weeks, and require attendance and/or participation. Data in the chart represent the percent of coordinators reporting the highest or most effective level of practice for each of these criteria. E.g., for lecture strategically, the chart shows the percent reporting that their course did so extremely effectively.

* I.e., do not lecture 100% of the time by default, but only for the proportion of time appropriate for the material, while balancing lecture with some proportion of non-lecture activities.


[Chart: Percent of coordinators indicating the highest level of effective practice, Before FYLI vs. After FYLI, for: Address Study Skills/Time Management; Lecture Strategically; Early/Formative Feedback; Maximize Time on Task; Explain Time/Effort; Actively Engage; Scaffolding; Effort in 1st 2 Weeks; Attendance/Participation.]

As the chart illustrates, FYLI certification is associated with shifts in positive directions for these critical practices. This is particularly apparent for effort in the first two weeks, active engagement, and scaffolding, all of which are particularly important for supporting student learning and setting a tone of high expectations.

Similarly, the chart below contrasts Before FYLI and After FYLI answers for the FYLI criteria gauged on a yes-or-no basis: addresses how to access class materials and academic support, employs frequent low-stakes assessments, uses GPS consistently, and is coordinated across multiple sections. Data in the chart represent the percent of coordinators reporting "yes" for each of these criteria (where applicable). E.g., for multi-section coordination, the chart shows the percent responding "yes" (excluding courses with only a single section, for which the item is not applicable).

[Chart: Percent of coordinators responding "Yes" where applicable, Before FYLI vs. After FYLI, for: How to Access Support; Frequent Low-Stakes Assessments; Consistent GPS; Multi-section Coordination.]


Here as well, coordinators report shifting practice in the desired direction. This is especially evident for "uses GPS consistently," an important finding given that promoting the GPS alerting system is a high institutional priority.

B. Student Survey

As described in prior reports, the student survey was designed as a way to get the student perspective on key practices that coordinators stated would be implemented after certification, e.g., inclusion of forms of instruction besides lecture, required attendance, work due early in the course, and active pedagogy. It was also intended to measure levels of student endorsement of FYLI socialization goals, e.g., taking responsibility for one's own academic success, trying different study strategies, and valuing academics. Lastly, it assesses students' self-reported use of support services and mechanisms, e.g., library resources, coaching, and Peer TAs.

As in prior years, coordinators and instructors were given the final say over whether surveys would be collected from a course, to avoid giving the impression that FYLI was inappropriately surveilling or bypassing instructors. Coordinators who did opt to collect surveys were provided with confidential reports of survey data from their courses for Fall 14 and will be provided with these for Spring 15.

In AY 2014-2015, a total of 886 student surveys† were completed across 28 participating courses. The number of surveys completed within each of these classes varied widely; in a few, there were only 1-2 respondents. The participating courses were:

ACC 205, ANT 102, ARB 101, ARH 270, BIO 181, BIO 182, BIO 192, CENE 253, CINE 101, CIS 120, CS 110, CS 122, CS 122L, EE 110, EE188L, EGR 186, ENG105, FS 111, FS 121, FS 121H, FS 141, FYS 121, GLG 112, HS 200, MAT 136, MAT 136H, PHY 262, PSY 101

† Not all respondents filled out every question, thus the Ns shown for individual items are less than 886.




As in previous iterations of the survey, pedagogy and design features actually observed by students in FYLI courses were queried using a 10-item, closed-ended scale. Each item consisted of one FYLI feature, in the format "Mark how much/how often these things were true of this specific course," with a 5-point Likert-type response option ranging from 5, "Very much/Very frequently," to 1, "Not at all/Never." There were also four Yes/No items, querying whether attendance was required, whether work was due in the first two weeks, whether rubrics were used, and whether the student received GPS alerts for that specific class. Lastly, there were two items directly tapping the proportion of class time spent on lecture and the amount spent on active learning.

Appendix B-1 shows descriptive statistics for the Pedagogy and Design Features scale, ordered according to how frequently/how much students saw those features present in the courses, i.e., from best to worst. Notably, mean ratings for all items were well on the desired side of the midpoint, i.e., above 3.0, suggesting that for these 28 courses, students endorsed the presence of FYLI design features. Even the least-endorsed item (i.e., the one at the bottom of the ordered list) was approximately one-third of a standard deviation above the midpoint of 3.0, in the desired direction. Patterns of results were broadly similar to those found in earlier years, with particularly high ratings for "The instructor was sensitive to different students' cultural backgrounds," "The instructor took into account what students already knew when they started the course," and "The instructor explained the time, effort, and commitment needed to succeed in the course."

Yes/no items also supported the idea that responding courses did employ FYLI practices: 95.9% said that attendance was required, 81.4% said that rubrics were used, and 97.0% said that there was work due in the first two weeks of class. 42.0% stated that they received GPS alerts in the class, which is acceptable given that FYLI does not require that all students receive alerts, only that there be a systematic plan for their use across the course as a whole. Respondents indicated that these 28 courses were not dominated by lecture, but rather that there was usually some role for active learning.
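For readers interested in how the scale summaries above are derived, the following is a minimal sketch of the underlying arithmetic: computing an item's mean and standard deviation on the 1-5 scale and expressing its distance from the 3.0 midpoint in standard-deviation units. The responses shown are hypothetical, and this is not the survey's actual analysis code.

```python
# Minimal sketch (hypothetical data): summarizing one 5-point Likert item
# and expressing its distance from the scale midpoint (3.0) in SD units.
from statistics import mean, stdev

MIDPOINT = 3.0

# Hypothetical responses to one Pedagogy and Design Features item (1-5 scale).
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

item_mean = mean(responses)
item_sd = stdev(responses)

# Positive values indicate endorsement above the neutral midpoint.
distance_in_sds = (item_mean - MIDPOINT) / item_sd

print(f"Mean = {item_mean:.2f}, SD = {item_sd:.2f}, "
      f"distance from midpoint = {distance_in_sds:.2f} SDs")
```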


The chart below illustrates responses for how frequently students said they used different course-related services and resources:

[Chart: Percent of responses for each course-related service/resource, broken down into: 0 times; 1 to 2 times; 3 or more times; Not provided as an option in this course.]

Consistent with FYLI principles, few respondents indicated that these supports were not available within the course. The majority of respondents (50%) reported that they attended SI, contacted their instructor when needed, and obtained Peer TA support at least once. A substantial minority (>20%) reported attending SI and contacting their instructor 3 or more times. By contrast, relatively few students reported visiting their instructor during office hours, using ITS support, using the Student Learning Center, or using tutoring.

C. Systematic In-Class Observations of Teaching and Learning

In Spring 2015, a multi-disciplinary team of volunteers participated in a pilot project to adapt a system for quantitative observation of classroom practices, the Classroom Observation Protocol for Undergraduate Science (COPUS; http://www.cwsei.ubc.ca/resources/COPUS.htm), for use at NAU. Team members completed the COPUS training protocol and gathered observation data in a total of 13 lower-division classes drawn from the First Year Learning Initiative's list of certified courses. After conducting these observations and submitting data to University College for analysis, team members also participated in debriefing meetings aimed at determining whether the system would be practical for wider, multi-disciplinary use, and if so, how it should be adapted and changed. The full project description and recommendations were conveyed in a separate report to the Provost's office, also available on request from FYLI.

Relating to the specific question of what happens in FYLI classes, the COPUS pilot project findings paint a picture of diverse teaching methods (not just lecture) across a variety of different disciplines and class sizes.



COPUS asks trained observers to tally a specific set of instructor behaviors and teaching strategies in two-minute increments across one class meeting (typically 50 minutes or one hour 15 minutes). In parallel, student behaviors are tallied, along with an overall rating of student engagement (low, medium, high). Ratings for the thirteen FYLI classes observed showed the following breakdown of instructor behaviors and teaching strategies:

[Chart: Instructor Behaviors Across All Observations (percent of all tallies): Lecturing, 28%; Real-time writing on board/padcam, 2%; Follow-up/feedback on clicker question or activity, 3%; Posing non-clicker question, 22%; Asking a clicker question, 2%; Answering student questions, 7%; Moving through class guiding student work, 14%; One-on-one extended discussion with one or a few students only, 5%; Showing or conducting demo, experiment, simulation, video or animation, 5%; Administration (assign homework, return tests, etc.), 4%; Waiting, 2%; Other, 6%.]
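For readers unfamiliar with COPUS-style data, the following minimal sketch shows how raw two-minute tallies of this kind can be aggregated into percentages of all tallies. The counts are hypothetical and this is not the pilot team's analysis code.

```python
# Minimal sketch (hypothetical tallies): converting COPUS-style counts of
# instructor behaviors, tallied in two-minute intervals, into percentages
# of all tallies across observed class sessions.
from collections import Counter

# Hypothetical tallies pooled across several observed sessions.
tallies = Counter({
    "Lecturing": 140,
    "Posing non-clicker question": 110,
    "Moving through class guiding student work": 70,
    "Answering student questions": 35,
    "Other": 45,
})

total = sum(tallies.values())
for behavior, count in tallies.most_common():
    print(f"{behavior}: {100 * count / total:.0f}% of tallies")
```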


Lecturing was the most frequently observed behavior but, notably, comprised only just over one-quarter of all tallies. Instructors also commonly posed questions and moved through class guiding student work, along with a range of other non-lecture behaviors that showed up less frequently. Student behaviors were similarly diverse:

[Chart: Student Behaviors Across All Observations (percent of all tallies): Listening, 42%; Individual thinking/problem solving, 1%; Discussing clicker question, 2%; Working in groups on worksheet, 0%; Other assigned group activity, 7%; Student answering instructor's question, 19%; Student asks question, 8%; Whole class discussion, 5%; Making prediction about outcome of demo or experiment, 3%; Student presentation, 2%; Test or quiz, 3%; Waiting on late instructor, AV problems, etc., 3%; Other, 5%.]


Students spent a substantial amount of time listening (42% of tallies) but also frequently answered instructors' questions and asked their own. As in the instructor data, there was also a rich assortment of other activities that came up less frequently, such as group work or making predictions. This trend toward active learning and diversity held up within the individual classes that were part of the project: in no case did Lecturing exceed 50% of all tallies for a class, or Listening exceed 75%. Lastly, the student engagement ratings were very strong:

[Chart: Student Engagement Across All Observations: Low, 3%; Medium, 23%; High, 74%.]

The observations of teaching practices and student behaviors within these FYLI classes document the range of active, non-lecture approaches being used. Future assessment efforts may include an expanded number of classes and some refinements to the rating scheme to better capture the range of activities typical in FYLI.

Question 3: What is the impact of FYLI on student success?

A. GPS Alerts

Historically, one of the most important functions of FYLI has been to steer instructors toward using the Grade Performance System (GPS) to alert students (and their advisors) about performance and attendance issues, as well as for other purposes such as commending good performance. FYLI requires courses to have some systematic plan in place for how GPS will be used in the class. Here are some illustrative responses, drawn from the survey of coordinators at the time of certification, describing their GPS plans:

Instructors will use GPS as the standard instrument to communicate performance and attendance problems at each chapter test (every three weeks) or earlier if necessary.

All … courses will be standardized in the use of GPS, with notifications sent out regarding attendance, and other grade concerns at the end of week 1 and week 2 and then regularly after this, throughout the semester.



We will use GPS during the 2nd week to inform students who haven't completed the required assignments that they are being dropped from the course. We will continue to use GPS throughout the semester after the midterms to inform students of tutoring hours, and availability of instructor to come ask for help.

All instructors will contact specific students through GPS after 4 unexcused absences or if grades are low and the student needs tutoring.
* Direct contact with the student will be attempted first whenever possible.
* The syllabus of practice will emphasize the role of the instructors in their appropriate use of GPS for warning students falling behind in attendance or success in the class.
* Students who are missing in the beginning of the semester will be contacted right away and informed about the withdrawal process.

During each of the first four weeks of class, GPS will be used when and if a student misses a class and/or fails to complete an assignment. After the first four weeks, GPS will be used to notify students who miss classes and/or fail to complete an assignment. Current grades will be available in Bb Learn.

I will deploy GPS 2 and 3 weeks into the term, before the withdrawal date, and at mid-term. I will continue to use GPS during the rest of the semester to inform at-risk students.

Feedback is given through GPS:
a) Between the first and second week, any student who has not completed initial assignments will be reminded that they will be administratively dropped from the course and should meet with their instructor immediately.
b) Notify students who perform poorly on the 1st exam.
c) Notify under-performing students between the 1st exam and midterm.
d) When midterm grades are assigned, students with low attendance (from clicker scores) and/or unsatisfactory grades will again be encouraged to meet with their instructors immediately.

This course will utilize GPS on a semi-regular basis in order to communicate with students and assist them in development and success in the course. GPS will be used every other week for the first month, and then after each major assignment in the course, in addition to after midterm grades.

The FYLI course will utilize the GPS Tool as follows: As part of the new GPS pilot program, at the end of each week during the first three weeks of the semester, instructors will use GPS to notify students of missed classes, lack of participation and/or at-risk class performance. For the rest of the semester, instructors will use GPS to alert students when they miss more than two consecutive classes or two classes in one month, do not complete the weekly quiz or discussion, or their course grade drops below a B. Finally, instructors will explicitly explain the connection between regular class attendance, completion of assigned work and participation and those expected of … professionals.

After the course's FYLI redesign, GPS will be leveraged at strategic points in the semester. More specifically, there are 4 GPS notification opportunities: the first at about the 3rd week mark, after a few homework assignments and at least one quiz; the second right after the midterm exam; the third around the 11th week mark, after at least one more quiz; and the fourth before reading week. This will provide GPS notifications to students that include information about their progress as they move into the midterm, after the midterm, on the way to the final exam, and right before the final exam, at times when changes in their study habits can still be useful.

Another view of the link between GPS and FYLI comes from the following excerpts from FYLI "syllabus of practice" documents (these function as combination master syllabi/faculty handbooks):

GPS: GPS reports serve a valuable function for communicating with students that may be struggling with the course. In order to keep the workload manageable as enrollments continue to grow, GPS reports are scheduled for three "rounds" during the semester: The first GPS report is sent to students with an average of less than 70% after the first two quizzes of the term, the second is sent to students with an average of less than 70% after the midterm, and the third and final GPS report is sent to students with an average of less than 70% following the completion of two quizzes after the midterm exam. In essence, they are sent after the first, second, and third quarters of the course.

Instructors send emails through GPS to communicate about any academic performance or attendance issues students are having during the semester. This communication starts as soon as the first weeks of the semester and can be as frequent as necessary. In these GPS messages, instructors inform their students about problems and suggest possible measures to address the issues (e.g., institutional support through tutoring, conversational lab, extra time with the instructor during office hours, or make-up homework). Once students have been referred to the appropriate service, instructors contact the service to check whether students are taking advantage of the extra help offered to them. Instructors praise students that are working on improving their performance or attendance via GPS messages. It is also recommended that students get GPS messages from their instructors at least twice during the semester with positive feedback praising them about their performance in class.

You must provide feedback in a few ways: online reading quizzes that provide immediate feedback, in- and out-of-class assignments where comments from faculty and TAs are provided, and exams. Students' grades shall be made available to them on Bb Learn (it is beneficial to structure quizzes and exams so that the grades are automatically recorded). In addition, you (or your TAs) should provide early and regular feedback through GPS. For struggling students, you should encourage them to meet with you, your Graduate Assistant, the Peer Teaching Assistant, and/or other campus resources, including tutoring provided at the South Student Learning Center. If you are not familiar with GPS, you shall, at a minimum, provide feedback after exams until you have gained proficiency with the GPS program. By mid-semester, all students will receive GPS feedback, even those students who are performing well, with messages encouraging them to continue their progress. Templates of GPS messages are available in the master Bb Learn shell.
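To make the kind of threshold-based alerting schedule quoted above concrete, here is a minimal, hypothetical sketch of a rule such as "alert students averaging below 70% at a checkpoint." The function, data structures, and student IDs are illustrative only and are not part of GPS itself.

```python
# Minimal sketch (hypothetical): flagging students for a GPS-style alert
# when their running average falls below a threshold at a checkpoint,
# mirroring the "below 70% after the first two quizzes" rule quoted above.
THRESHOLD = 70.0  # percent

# Hypothetical quiz scores keyed by (made-up) student ID.
scores = {
    "student_a": [55, 62],
    "student_b": [88, 91],
    "student_c": [70, 64],
}

def needs_alert(quiz_scores, threshold=THRESHOLD):
    """Return True if the average score so far is below the threshold."""
    return sum(quiz_scores) / len(quiz_scores) < threshold

to_alert = [sid for sid, s in scores.items() if needs_alert(s)]
print("Send GPS alerts to:", to_alert)  # student_a and student_c in this example
```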

In line with these materials from coordinators, substantial numbers of GPS alerts are sent in FYLI courses. In Fall 2014, over 33,600 GPS alerts were sent to students in FYLI courses; in Spring 2015, there were 22,639, for a total of over 56,000 alerts across the year. Of the top 30 GPS-generating sections at NAU, every one is part of a FYLI-certified course. In other words, when all sections are ranked by how many alerts they issue, the top 30 are all FYLI sections. Compared to other 100- and 200-level Flagstaff Mountain courses, FYLI courses are much more likely to send GPS alerts at least once over the course of the semester:

[Chart: Number of 100- and 200-level Flagstaff Mountain Campus course sections that did vs. did not send GPS alerts, shown separately for FYLI and non-FYLI courses in Fall and Spring.]

Across Fall and Spring combined, 22% of sections of non-FYLI courses sent alerts, compared with 63% of sections of FYLI courses. This difference is statistically significant, χ²(2, N = 4025) = 627.81, p < .001. We do note that not all sections of all certified courses sent out GPS alerts at least once. While this technically may be in line with FYLI requirements (since individual students do not have to be messaged any minimum number of times), it suggests that program leadership should improve efforts to communicate GPS requirements to coordinators and perhaps find ways to follow up with them based on the proportion of sections not sending alerts in a given semester.
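For readers interested in how a comparison of this kind is computed, the following minimal sketch runs a chi-square test of independence on a hypothetical 2x2 table of sections that did or did not send alerts. The counts loosely mirror the proportions reported above but are made up, and this is not OCLDAA's analysis code.

```python
# Minimal sketch (hypothetical counts): chi-square test of independence
# comparing the proportion of FYLI vs. non-FYLI sections that sent at
# least one GPS alert during the semester.
from scipy.stats import chi2_contingency

# Rows: FYLI, non-FYLI; columns: sent alerts, did not send alerts.
observed = [
    [630, 370],    # hypothetical FYLI sections
    [660, 2340],   # hypothetical non-FYLI sections
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4g}")
```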

B. FYLI Socialization Outcomes – Student Survey

As in prior years, the end-of-semester student survey discussed earlier also contained a scale tapping FYLI socialization outcomes and attitudes, in the format "How much do you agree with each statement," with a 5-point Likert-type response option ranging from 1, "Strongly Disagree," to 5, "Strongly Agree." Appendix B-2 shows descriptive statistics for the FYLI Socialization Outcomes scale, ordered according to mean agreement ratings, i.e., from best to worst. As we found for the Pedagogy and Design Features scale, mean ratings for all items were on the desired side of the midpoint, i.e., above 3.0, suggesting that on the whole, students reported high levels of agreement that socialization outcomes were achieved.



As has been the case for every semester the survey has been run, the item topping the rank-ordered list was "It is very important to me that I work as hard as I possibly can on my academics." Also highly rated were the items "I examined how my behavioral choices led to specific results in this course" and "This course helped me accept responsibility for the effort I put into my academics." Overall, as in prior analyses, the Socialization Outcomes scale suggested high levels of student reports that they experienced aspects of socialization to the university environment that are specifically part of the FYLI program, including taking responsibility for one's academic work and mindfully examining how different behavioral choices affect academic success. These consistent findings suggest that there are program-specific characteristics built into FYLI courses that persist across courses and across semesters.

These attitudes and outcomes are, of course, not direct measures of student success narrowly defined as course completion or grades. However, given that numerous teaching and learning experts‡ endorse academic self-regulation and self-management as critical to success in college, it is reasonable to assume that high scores on the FYLI Socialization Outcomes scale would predict greater chances of success.

C. Institutional Data on Student Success

Earlier reports have documented significant improvements in course completion rates as a function of certification. If any additional analyses of this kind are run by OCLDAA, they will be added to this report. In consultation with FYLI leadership, OCLDAA ran several other analyses intended to examine the student success question from various angles and potentially uncover any important patterns. One analysis focused on whether the number of FYLI courses taken in the first year correlated with student success metrics. Fall 2012 Mountain Campus cohort students were compared with respect to the number of FYLI courses taken in the first year (from a minimum of 0 to a maximum of 9) and retention, DFW, GPA, and likelihood of being in good academic standing. There was some trend toward improved DFW, GPA, and academic standing for students who took 5 or more FYLI courses; however, these trends did not hold up when FYLI course units (not individual courses) were used as the independent variable. Therefore, it is safer to conclude that we failed to find a connection between these metrics and the "dose" of FYLI courses in the first year in the studied cohort.

‡ See, e.g., Nilson, L.B. (2013). Creating self-regulated learners: Strategies to strengthen students' self-awareness and learning skills. Sterling, VA: Stylus.

Question 4: What are the impacts of the FYLI Peer TA program?

This year, we did not attempt another iteration of the Peer TA survey. However, in Spring 2015 we solicited testimonials from Peer TAs willing to share personal accounts of their experiences in the program. These do not, of course, represent the typical or most common experiences, but they do give a qualitative sense of which specific aspects of the program Peer TAs find most valuable, among those who had positive experiences. Here are some excerpts from these collected testimonials:

The opportunity to work as a Peer TA through the FYLI program has given me more opportunities than I ever expected. As an intermediate between a single professor and one hundred students, I was able to develop my skills as an organized assistant and a tutor. Helping students master concepts and improve their analytical skills was more fulfilling than I had ever expected it would be. I enjoyed working in a classroom so much that I changed my career goals to become a secondary school teacher, and have been accepted into an M.A. program. My experience as a TA has taught me problem solving skills that could never be learned from the other side of the lectern, and allowed me to become a competitor as a student of education. Being a Peer TA is so much more than a job, it is an opportunity to become a leader.

I was a TA for EE188 and it greatly increased my knowledge on the subject. Although I had already taken the course, I still did not have a great grasp on some topics. However, after learning it a second time through a different teacher I have a much more extensive knowledge of circuit analysis. Also, since most of the students in the class were also in my other engineering classes, this led to networking and it helped me succeed in all my other courses. Overall I had a great experience and I look forward to learning more next semester!

As a Computer Science major, the FYLI Peer TA program has granted me the joy to allow me to help with classroom functions and provide a way to generate more resources for students as well as improving multiple areas of my college career. The program has opened doors to generating relationships with faculty within the CS department and students that are learning within the class. Another benefit of the FYLI Peer TA program was the inclusion of allowing me to grow as a person and really appreciate what our faculty does on the individual level by stepping in their shoes every once in a while. Overall, this position is a great opportunity and I am lucky to have been apart of it.

My time spent as a PSY 101 Peer TA proved to be extremely beneficial to my professional and academic careers. I learned a variety of skills that I would have never obtained in college if not for this opportunity. For instance, learning how to grade papers opened my eyes to the variety of skill levels that students have. This benefitted me immensely because I plan on being a college professor in the future. Now I know what to look for in my future students' writing. In addition, I now know which aspects of literacy that students need help with. I can take this knowledge with me and utilize it in my professional career. Another key benefit of my experience as a Peer TA was that it afforded me a number of connections with faculty members. While proctoring for exams, I got to learn about a number of professors' areas of specialty. This helped me determine what specialties that I want to seek out in my future. In addition, these connections helped me with my graduate school search. Without the help of the professors who I met through being a Peer TA, the process would have been much more difficult. Overall, my experience as a Peer TA helped me grow my academic and job-related skills. This opportunity helped shape my future as an educator and provided me with even more inspiration than I previously had. I do not believe that I would have the same opportunities without this job, and I am grateful for this experience every single day.

I have been a Peer TA for two years, and I can honestly say it has been one of the best experiences of my undergraduate career. I started at the beginning of my junior year because I was looking for something to elevate me professionally while also being able to still focus on my courses. The Peer TA position is perfect for this because you are able to have a flexible schedule with your job while also not interrupting your class/coursework schedule. My work as a Peer TA has given me so many benefits over the last two years. One, I was not a very social person beforehand, finding it difficult to talk to those whom I did not know. This job, however, required that I answer questions from students and help them wherever necessary, which brought me out of my comfort zone and allowed me to help these students on their educational path, which was at least as important for me as it was for them. Secondly, I learned a lot about the process of an educator and the difficulties and demands they face every day. My sections always had at least 250 students, which gave way for many, many questions and a lot of time to be spent on each class, which is even more challenging when you teach four or five classes. My job as a Peer TA gave me a whole new respect for educators at every level, and gave me a different viewpoint of the educational system, since I was now at the level of an assistant, instead of just a student absorbing information. Lastly, my job as a Peer TA gave me the skills I needed to time manage successfully. Balancing two jobs (I am a Peer TA for two different courses) and a full-time school load demanded a lot of time commitment on my part, and I am proud to say that this job allowed me to figure out how time management works for me, and prepared me for my graduate school career. I have cherished the last two years of working as a Peer TA and I think it can be extremely beneficial for every student.

Clearly, these testimonials highlight benefits that go well beyond having a paid part-time job. Students offering these testimonials universally characterized being a Peer TA as a powerful learning and development experience, citing everything from developing social and time management skills to discovering new career paths.

OCLDAA and FYLI leadership collaborated to examine other possible impacts associated with being a Peer TA, comparing the GPAs of matched groups of students who were Peer TAs and those who were not, across four different cohorts between Fall 2012 and Spring 2014. GPAs in these matched groups were significantly better overall than for the NAU population at large (3.4 to 3.6, versus 2.9 to 3.0); although FYLI does not set any kind of grade requirements for its Peer TAs, it seems likely that faculty tend to choose higher-performing students for the role. However, no statistical differences were found between the TA and non-TA matched groups. Thus, it is unlikely that the Peer TA experience directly impacts grades.
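As an illustration of the kind of matched-group comparison described here, the following is a minimal sketch of a paired comparison of GPAs between Peer TAs and matched non-TA students. The data and matching are hypothetical, and this is not the actual OCLDAA analysis.

```python
# Minimal sketch (hypothetical data): paired comparison of GPAs for
# Peer TAs and their matched non-TA counterparts.
from scipy.stats import ttest_rel

# Each position i pairs a Peer TA with a matched non-TA student
# (matched on characteristics such as prior GPA and credits earned).
peer_ta_gpas = [3.5, 3.6, 3.4, 3.7, 3.5, 3.3, 3.6, 3.4]
matched_gpas = [3.4, 3.7, 3.5, 3.6, 3.5, 3.4, 3.5, 3.5]

t_stat, p_value = ttest_rel(peer_ta_gpas, matched_gpas)
# For this hypothetical data the difference is non-significant,
# i.e., no detectable GPA effect of being a Peer TA.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```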

Interpretation and Recommendations

FYLI continues to reach nearly all Mountain Campus first-year students and has a thriving Peer TA program. Based on information gathered from faculty, students, and in-class observers, FYLI courses appear to employ a number of favored practices, including active learning rather than straight lecture, early and frequent assessment, scaffolding, and more. FYLI courses generate over 50,000 GPS alerts to students each year, and many certified courses report going from little or no systematic use of GPS prior to FYLI to consistent, course-wide plans for using this powerful feedback system. Students in FYLI courses consistently report that they value putting effort into academics and that the courses help spur them to adopt behaviors and attitudes consistent with being a self-regulated learner. Students serving as FYLI Peer TAs have offered very positive personal accounts of how the program has been an opportunity to build skills and advance personal and professional development.

Prior reports have documented an association between FYLI certification and course completion rates. However, this year's efforts to extend this question to other metrics (such as the relationship between number of FYLI courses taken and GPA) did not reveal significant associations. Similarly, although the Peer TA experience may produce important impacts, these do not appear to include increases in GPA.

The following points revisit last year's recommendations in light of this year's activities and assessment results.

• Recertification. This was listed as the most important future direction for assessment and program refinement in last year's report. The purpose of recertification is 1) to ensure that courses continue to uphold the practices required for FYLI certification, 2) to engage faculty in continuous improvement, professional development, and dissemination connected to FYLI, and 3) to encourage courses to participate in ongoing assessment efforts and use their own assessments to make further improvements to the course.


The working group convened to design the process was successful, and a formal process was rolled out in early Spring 2015. It has so far been well received, and a number of teams have already begun constructing plans to earn recertification in AY 15-16. One course has already submitted its materials, becoming the first officially recertified FYLI course, and its team has graciously agreed to have the report posted as a model for others.

• Peer TA training, communication, and assessment of impact. As planned in last year's report, several events for Peer TAs were held over the last AY. However, attendance continues to be a challenge, and it has also been challenging to provide programming that will be beneficial to Peer TAs across disciplines and across levels of experience. We in FYLI have experimented with multiple approaches and have not yet found one that provides an adequate impact for the time invested.

This issue will continue to be a major area of effort for next year within FYLI. One approach we may pursue is creating online "just in time" modules that faculty can assign to Peer TAs, or that Peer TAs can self-enroll in. These modules would selectively address specific issues or learning objectives, addressing the problem of customizing content across very diverse groups of TAs, and would be fully online with open enrollment, addressing the attendance and scheduling problem.

We have tentatively scheduled a University Town Hall event centered on effective practice with Peer TAs. This event will help disseminate ideas from faculty about how to maximize impact of Peer TAs and help us continue to identify best practices as they relate to this unique program.

We did conduct the analysis of grade impacts associated with the Peer TA program, as planned last year. This revealed no association between Peer TA status and grades, as compared to a matched group. In the upcoming year, we should refine our approach, perhaps examining longer-term impacts such as the likelihood of applying to graduate school, or GPA upon graduation, to determine whether there are measurable academic or professional impacts of the program on Peer TAs themselves.

Recommendations and future directions

1. Revisit the role of student surveying in FYLI's overall assessment plan. Although response rates have never been outstanding for our FYLI student survey, it has produced remarkably consistent results across courses and across semesters. As a centrally organized assessment effort, it may have outlived its usefulness. It should be reframed as a tool that individual courses can take ownership of and use for their own self-assessment and improvement. The survey is now available in a Bb Learn version as well as the main SurveyMonkey version, enabling coordinators to more easily manage their own data gathering and reporting. The recertification process offers other strong incentives for courses to do their own surveying.

2. Identify and implement new ways to document the association between FYLI and student success. With few or no new courses coming into the program, it is difficult to conduct ongoing assessment of course completion before and after FYLI certification. FYLI has also always had a problem identifying appropriate control or comparison groups to enable more direct assessment of impacts. For example, constructing an appropriate group of comparison courses that are not FYLI certified presents many difficulties, as these courses may have other important differences (size, student preparation, etc.) that create confounds. Similarly, there are no student survey data from non-FYLI courses to compare to. FYLI and University College leadership should begin formulating comparisons that are both appropriately rigorous and well aligned to FYLI's strengths. Additional research analysis capacity within the University College would greatly expedite carrying out these comparisons; thus, we should pursue ways to obtain this additional capacity.

3. Gather baseline data prior to certification for new courses going forward. Another way to address the control group problem is to make before-and-after comparisons on metrics such as DFW, GPS rates, COPUS observation data, or test scores. If new courses are brought into the program, FYLI leadership and coordinators should collaborate to get those metrics recorded before FYLI development gets underway, then compare them to the same metrics in the semesters after certification. This would not give a program-wide picture of how certification affects these metrics, but would offer a detailed case-by-case picture for individual courses.

4. Begin working on a comprehensive plan to amplify the direct impacts of FYLI on student success. Recent assessments have suggested that FYLI's impacts on grades, both for students enrolled in the courses and for the Peer TAs, are limited. There could be other impacts not captured by these analyses, but in any case, program leaders and coordinators should collaborate to generate ideas for how to boost impacts on key success metrics (while retaining the flexibility and other aspects of FYLI that make it scalable across diverse disciplines and courses).

5. Pursue wider use of a modified version of the COPUS system. Results of the pilot project were promising, and the data generated by the system are particularly valuable for documenting what goes on in FYLI courses, as well as offering instructors specific, actionable feedback about the in-class experience they provide. Coordinators should be provided with opportunities to have COPUS-style observations done in their sections, and should be particularly encouraged to do so as part of their recertifications.

6. Amplify the impacts of the Peer TA program and expand assessment of those impacts. As described above, the Peer TA program continues to hold untapped potential. There should be more opportunities for Peer TAs to network with other Peer TAs, pursue professional development, and build their skills. FYLI leadership should home in on the specific impacts of the program and expand efforts to do large-scale assessment of those impacts.

7. Continue efforts to improve compliance with required FYLI features, particularly GPS. Recertification, as described earlier, will be an important mechanism to ensure that courses continue to maintain and exceed the standards they set in their original certification. Overall, evidence (e.g., student survey results) suggests that these features are being implemented. However, some extra effort may be warranted in the case of GPS, given the finding that a proportion of sections did not issue alerts over the past AY.