Dot COM Meets Dot EDU: Giving Life to Online Teacher Evaluations
by Kathy Gates
University of Mississippi
Introduction
• UM began a serious assessment of its teacher evaluation process in 1998.
• UM moved to web-based delivery of results in Fall 1999 and web-based collection of responses in Fall 2003.
• While not perfect, the resulting process is generally perceived as being successful, and our experiences hold valuable lessons for others.
UM at a Glance
• Located in Oxford, MS, about 80 miles south of Memphis, TN
• About 14,500 students
• Converted to SAP’s HR, Finance, and Plant Maintenance modules in 1999 & 2000
• North American pilot for SAP’s student system, Campus Management; went live in April 2003
Agenda
• Historical Perspective
• Teacher Evaluations @ UM – Phase 1: Presentation of Results via Web
• Teacher Evaluations @ UM – Phase 2: Collection of Responses via Web
• Summary
Traditionally a Controversial Topic
• First used in North America in the mid-1920s
• Subject of a huge body of literature
  – Nearly 2,000 studies as of 1997
  – Most extensive area of study in higher education
• About 30% of colleges and universities used some form of teacher evaluations 30 years ago, and almost all do now.
• “Many researchers have concluded that the reliability and validity of student ratings are generally good … however, many controversies and unanswered questions persist.”
  – From “Navigating Student Ratings of Instruction” by d’Apollonia and Abrami in the November 1997 issue of American Psychologist
Faculty Concerns
• Grade Inflation – students’ evaluative ratings of instruction correlate positively with expected course grades.
  – Are ratings influenced by grades?
  – Are grades influenced by ratings?
• Opportunity for Retaliation by Students
• Students Rate Different Academic Fields Differently
• May Lead to Superficiality of Course Content
• Ratings Based on Popularity, not Teaching Effectiveness
  – “Instructors who are skilled in the art of impression management are likely to receive high student ratings whether or not their students have adequately mastered course materials.”
    • From “Instructor Personality and the Politics of the Classroom” by John C. Damron, Douglas College, Canada
  – Must keep students entertained
Faculty Concerns, cont.
• “In my office, I have a folder that contains two items: (1) a copy of the Rutgers Student Instructional Rating Survey, and (2) a Customer Satisfaction form I once took away from a Holiday Inn. Making due allowances for the difference in goods and services provided, they are the same form, produced by the same logic of market forces and consumer relations.”
  – From “Why We Should Abolish Teaching Evaluations” by William C. Dowling
More Faculty Concerns
• “Student Teaching Evaluations: Inaccurate, Demeaning, Misused – Administrators love student teaching evaluations. Faculty need to understand the dangers of relying on these flawed instruments.”
  – Article by Gray and Bergmann in the September-October 2003 issue of Academe
Past Chronicle Discussions
• “We must recognize that student evaluations have led to grade inflation and the erosion of academic standards.”
• “When student evaluations contribute to tenure, promotion and salary decisions, the gross injustice of the present system becomes intolerable.”
• “Too many have been grievously wronged by the institutional discrimination that has resulted from the use of such ill-conceived and poorly utilized instruments as student evaluations.”
• “Student evaluations must go, and they must go now.”
  – From “Student Evaluations of Their Professors Rarely Provide a Fair Measure of Teaching Ability” by Louis Goldman in the August 8, 1990 issue of the Chronicle of Higher Education
Responses …
• “Does he assume that all deans, chairpersons, and faculty-committee members are too stupid to read the evaluation results intelligently?”
• “Or does Professor Goldman also think so little of students that the thousands of hours they spend in class do not entitle them to an opinion which is worthy of being taken seriously?”
• “In 30 years in higher education, as a professor and an administrator, I have yet to see anyone ‘grievously wronged’ by the use of student forms about teaching.”
• “After 20 years of undergraduate teaching with careful attention to a variety of evaluation instruments completed by my students, I am convinced that I have improved by working on the inadequacies identified by the students … I value my students’ assessment. In my experience, they have generally been more perceptive than I anticipate and more generous than I deserve.”
• “If there is anything the research is agreed upon, it is that student ratings are statistically reliable.”
Student Perspectives
• “But valid student criticism of a professor is also a factor in my decision to take a class. If four out of six students tell me to avoid a professor and outline specific reasons why, then I will try to avoid the professor’s class.”
  – From “Letters to the Editor” in the April 25, 1997 issue of the Chronicle of Higher Education
Faculty Views on Student Comments
• “Yet, inescapably, life constantly evaluates us: We’re served divorce papers, we learn our cholesterol level, we get turned down for tenure. Or we get promoted, our poem gets published, we win in the over-40 age division in our local 10K race. We have to carry those moments with us, too, however difficult that may be. Too often the good in life seems temporary, while the bad stuff goes on and on. So it is with teaching evaluations. I’m sure I’ve forgotten many wonderful things that students have said about me over the years, but the zingers stick to me like burrs.”
  – From “Why I Stopped Reading My Student Evaluations” by Lucia Perillo in the July 7, 2000 issue of the Chronicle of Higher Education
Responses …
• “Not reading the forms is succumbing to the misguided belief that there is nothing to be learned from … those we say we respect, our students.”
• “Anonymous student evaluations of instruction are a failed experiment. They have no credibility with anyone except those who devise them … These types of evaluations should be discarded immediately.”
Student Comments, cont.
• “Granted spelling and grammar are often woeful. In my 16 years as department chairperson, I had to assess remarks like, ‘he has no patients’ and, in the case of a colleague given to cuss words, ‘he used profound language,’ presumably teaching his ‘coarse.’”
• “Most students are fair-minded, and their kind comments generally make for gratifying reading.”
• “I am a true believer that evaluations generally give an accurate collective assessment of a course by the participants – our customers, if you will.”
• “I still believe that we could learn a lot from each other.”
  – From “What Students Can Teach Professors: Reading Between the Lines of Evaluation” by Douglas Hilt in the March 16, 2001 issue of the Chronicle of Higher Education
Student Comments, cont.
• “Few of us will ever get the sexist but enthusiastic evaluation that one University of Illinois instructor received: ‘My French teacher,’ it read, ‘is drop-dead gorgeous, and I think I’m in love with her.’”
• “Regardless of how colleges evaluate teaching, everyone’s concerned about evaluation.”
  – From “Student Evaluations Deconstructed” by Joel J. Gold in the September 12, 1997 issue of the Chronicle of Higher Education
Usefulness of Student Comments
• “Anecdotal comments are at least as useful as numerical ratings, so the evaluation should be constructed to prompt appropriate comments from students.”
  – From the “Report of the Joint Commission on Evaluation of Instruction” at Kent State University
Web-Based Evaluations
• Benefits
  – Timely Feedback of Evaluation Results
  – Flexibility in Survey Design and Development
  – Convenience for Students
    • Increase in written comments
  – Data Warehousing
• Challenges
  – Response Rates
  – Culture Change
  – Concerns about Privacy and Confidentiality
  – Use of E-mail for Participation Announcements

From “Web-based Student Evaluation of Instruction: Promises and Pitfalls” by McGourty, Scoles, and Thorpe, presented at the 42nd Annual Forum of the Association for Institutional Research, June 2002
Web-Based Evaluation, cont.
• “For this institution, female students were significantly more likely to respond to the web-based course evaluation survey than men.”
• “In conclusion, the results of this study suggest that concerns regarding low response rates and the potential for non-response bias in web-based course evaluation methods may not be warranted …”
  – From “Online Student Evaluation of Instruction: An Investigation of Non-Response Bias” by Stephen Thorpe, presented at the 42nd Annual Forum of the Association for Institutional Research, June 2002
Note the “hotness” factor.
Did you know that your college/university already has web-based teacher evaluations?
Teacher Evaluations @ UM
• Purposes
  – Used to improve teaching (formative)
  – Used in personnel decisions (summative)
  – Serves as an opportunity for students to provide feedback
  – Used as a student consumer guide
• Goal
  – To make evaluations useful to three constituencies
    • Faculty
    • Students
    • Administrators
Phase 1: Presentation of Results via Web
• Deployed in Fall 1999
• Developed in cooperation with a Faculty Committee on Evaluation of Instruction
• Featured graphical display of results with reference groups
• Also featured VIP Reports
• Access limited to “olemiss.edu” domain
• Instructors given opportunity to withhold results from the public view

Note the response rate.
Note the reference group.
Note the user cautions.
Print-friendly format.
Legend: This class; Lower division ~ Liberal Arts
VIP Reports
• Summary Reports for Deans, Chairs, and other Administrators
• Report Types
  – Comparisons by School/College
  – Comparisons by Department
  – Sorted Responses by Question by Department
  – Course Evaluations

Legend: Lower division ~ Liberal Arts; Lower division ~ UM
Technology Choices
• Scantron forms for collection of responses
  – Forms printed from legacy mainframe system
• Scanned results uploaded into MySQL database
• CGI scripts to display results
• Graphs generated in real time using the gd GIF library
  – See http://www.boutell.com/gd/
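The aggregation step behind this reporting pipeline can be sketched in a few lines. This is an illustrative sketch only: the actual system used CGI scripts against a MySQL database with gd-generated graphs, and the row format and function name here are assumptions, not UM's code.

```python
# Hypothetical sketch: scanned responses arrive as (section, question, rating)
# rows and are averaged per question for display in the reporting interface.
from collections import defaultdict

def question_averages(rows):
    """Average the 1-5 ratings for each question number."""
    totals = defaultdict(lambda: [0, 0])  # question -> [sum, count]
    for _section, question, rating in rows:
        totals[question][0] += rating
        totals[question][1] += 1
    return {q: round(s / n, 2) for q, (s, n) in totals.items()}

rows = [("SOC101-1", 11, 5), ("SOC101-1", 11, 4), ("SOC101-1", 12, 3)]
print(question_averages(rows))  # {11: 4.5, 12: 3.0}
```

In the real system these averages were then rendered as bar graphs with reference-group comparisons.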
Phase 1 Results
• Advantages
  – Improved information access for all constituencies
    • Previously, students only had access through paper print-outs in the library.
• Disadvantages
  – Collection of results costly, error-prone, and labor-intensive
  – Slow turn-around time
  – Scanning equipment failures
  – Takes up class time
  – No anonymity for written responses

… but then scandal strikes the teacher evaluation process @ UM …
Scandal leads to progress!
Phase 2: Collection of Responses via Web
• Deployed in Fall 2003
• Timing was influenced by
  – migration to SAP’s Campus Management student system in Spring 2003 and
  – retirement of legacy mainframe in December 2003
• Focus groups with students and instructors in Summer 2003
Fall 2003 Features
• WebID authentication
• Back-end database converted from MySQL to Oracle for better performance
• Student comments in public reporting interface
  – New “Amazon-style” question: “What do you want other students to know about your experience in this course?”
• Overall five-star rating
• Student comments in faculty/VIP reporting interface
• Better support for courses with multiple instructors
• “Last chance to evaluate” opportunity before viewing final grades

Stars filled dynamically to 1/100th point accuracy thanks to UM graduate intern!
Go to comments
Must study and read in order to pass!
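Filling stars to 1/100th-point accuracy amounts to computing the width of a filled overlay from the average rating. A minimal sketch of that arithmetic (the function name and percent-width approach are assumptions, not the intern's actual implementation):

```python
def star_fill_percent(avg_rating, max_rating=5):
    """Percent width of the filled star overlay, to 1/100th of a point.
    E.g., an average of 4.37 out of 5 fills 87.4% of the five-star bar."""
    return round(round(avg_rating, 2) / max_rating * 100, 2)

print(star_fill_percent(4.37))  # 87.4
```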
Fall 2003 Concerns
• Reduced response rate when going from paper process to on-line
  – Dropped to about 30%
• Loss of “middle” sample set due to voluntary participation
  – Will results be skewed if only those with strong feelings one way or the other participate?
• Time frame for capturing responses has changed – will this affect the outcome?
Changes for Spring 2004
• Participation Incentives
  – Students who completed 100% of their Spring 2004 evaluations got to register one day early in Fall 2004 priority registration.
  – For the first seven days of grades viewing, students who had completed at least 50% of their evaluations had the option to go straight to grades viewing; afterwards, all had this opportunity.
• New Features
  – Tool for department chairs to mark courses as exempt/non-exempt
  – Time stamps on responses
  – Variable questions by course ~ pilot project

Link to go straight to final grades made visible or suppressed based on outstanding evaluations.
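The Spring 2004 grades-viewing incentive reduces to a simple gate. A hedged sketch (the seven-day window and 50% threshold come from the slide; the function and parameter names are illustrative):

```python
def can_skip_to_grades(pct_completed, days_since_grades_opened):
    """Spring 2004 rule: during the first seven days of grades viewing,
    only students with at least 50% of their evaluations completed may
    go straight to grades; after that, everyone may."""
    if days_since_grades_opened >= 7:
        return True
    return pct_completed >= 50.0
```

The "link made visible or suppressed" behavior is then just this boolean deciding whether the direct-to-grades link is rendered.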
Wins
• The rate of participation increased to about 58-60%.
  – 29,146 responses in Spring 2004 with the on-line process vs. 25,578 in Spring 2001 with the paper process
  – Remember that Spring 2002 evaluations were shredded, so that comparison group was lost!
• The average scores remained similar to those for previous semesters.
• Very little bias is introduced by the online process.
• “Live” comments feature seems to have value for both students and instructors.
Average Question #11* Responses
* How would you rate the instructor's overall performance in this course?
Spring 2004 Responses By Date
[Chart: number of responses submitted per day, 4/19/2004 through 5/25/2004]
[Timeline: normal evaluation window, window for grades submission, and window for grades viewing; progression runs at midnight. Series shown: Question #11 and narrative question (public).]
* Mutual protection – Instructors can’t submit grades after viewing evaluation results, and students can’t submit evaluations after viewing grades.
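The mutual-protection rule can be expressed as two guards over per-user state. A sketch under assumed state flags (the actual SAP/Oracle representation is not shown in this presentation):

```python
def may_submit_grades(instructor_state):
    """An instructor who has viewed evaluation results can no longer submit grades."""
    return not instructor_state.get("viewed_eval_results", False)

def may_submit_evaluation(student_state):
    """A student who has viewed grades can no longer submit evaluations."""
    return not student_state.get("viewed_grades", False)
```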
Question #11* by Day

Time Frame    A = Superior   B = Excellent   C = Good   D = Marginal   E = Poor
Window 1 #    3245           2584            1537       516            304
Window 1 %    39.64          31.57           18.78      6.30           3.71
Window 2 #    8620           6400            4421       1459           845
Window 2 %    39.64          29.43           20.33      6.71           3.89

* How would you rate the instructor's overall performance in this course?
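The percentage rows above are each window's counts divided by that window's total. Recomputing them confirms the table and shows the two windows produced nearly identical distributions (i.e., responses submitted before and during grades viewing look alike):

```python
def distribution_percent(counts):
    """Convert per-grade response counts to percentages of the window total."""
    total = sum(counts.values())
    return {grade: round(100 * n / total, 2) for grade, n in counts.items()}

window1 = {"A": 3245, "B": 2584, "C": 1537, "D": 516, "E": 304}
window2 = {"A": 8620, "B": 6400, "C": 4421, "D": 1459, "E": 845}

print(distribution_percent(window1)["A"])  # 39.64
print(distribution_percent(window2)["A"])  # 39.64
```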
Spring 2004 % Evaluations Completed
[Chart: distribution of percent of evaluations completed across students]
Responses by Gender: 56% Female, 44% Male
Student Comments
• Review Process
  – “Expletive Deleted”
  – An example of negotiation and compromise between constituencies
• Overwhelmingly positive
• Being used in faculty dossiers
• Give insight into student perspectives
• Give reporting interface “life”
Sample Student Comments
• “Although one of my hardest classes, this was my favorite class to go to. It was challenging but I feel that I have learned so much more from it being that way. Dr. Grisham is an excellent teacher and I would tell anyone to take him for sociology. It may be a little difficult but it is worth it getting to listen to his lectures every day.”
• “Wonderful and exciting class. I learned a great deal that I will take with me and use in the real world.”
• “If you don't study, you will FAIL, again, and again.”
• “Do the homework, no matter how long it takes.”
• “Try not to take this class at the same time as 304 and 405. It's GPA suicide.”
• “DON'T TAKE FROM HIM!!! For the love of God... find another teacher. You'll be so lost from day one that it will be too late. He's a nice guy... but gets himself confused when talking about economics.”
• “He is like listening to PBS. It soooo boring. All he does is show you powerpoints and read from them. He ends up posting them on Blackboard. No point in going to class excpet to take the random quizes he gives. I never go to class and have a B. Pretty easy class if you have ANY knowledge of geology. But if you take... be ready to ZZZZZzzzzz.”
Sample Student Comments, cont.
• “Contemporary fiction will rock your world, it's very different compared to what i've been reading my whole life.”
• “Study hard and turn in your assignments on time.. Go to class as much as possible.”
• “Ms. Murfee is a very demanding teacher. She wants a lot of things done in a short amount of time and wants them to be perfect. I did however learn to be a better writer because of all the assignments.”
• “It was much better than I thought it would be.”
• “The class is not that fun (unless you love literature) and you have to read.”
• “Reading the book helps.”
• “Great teacher.. but you WILL stay the whole length of the class period!!”
• “STAY AWAY!! Course description is false!! This class is an insult to Southerners.”
• “It was very hard and challenging. Don't try to take any other hard classes in the same semester as this one.”
• “Go to the writing center for help on your essays, because what you think is a good paper might be a poor paper to your instructor.”
• “to all the students take Mrs. Wang. She will make sure a person learn how to play a piano before you leave her class.”
• “8 am is early”
• “This is a class for disciplined students who can meet deadlines. I loved it!”
• “It was definitely an experience alright”
How Do Instructors Use the Results?
• “I look at what the students say I need improvement on, or how the course needs improvement. I look for patterns especially, more than one student saying the same thing. I look carefully at their suggestions for new ways to do things. I try to change things and improve each year.”
  – Dr. John O’Haver, Associate Professor of Chemical Engineering
How Do Instructors Use the Results?
• “Actually, after all of these years, the "results" are pretty stable. Once in awhile there might be a surprise, but not very often. The question I pay the most attention to now is about the textbook--it is the most variable. Their written responses are generally humorous.
I think each semester is a data point. If we had a plot of the results over time, we might be able to spot trends that would be more useful than simply viewing the results one semester at a time.”
– Dr. Dawn Wilkins, Associate Professor of Computer and Information Science
How Do Deans Use the Results?
• “This past Spring, we used the data from the VIP reports for each department to compile a single spreadsheet that contained all faculty members in the school. We were then able to compute an "overall * difficulty" index rating of teacher performance which was used to help select the "outstanding teacher of the year" awardee for the School of Business. This would have been unworkable with a paper-based system.”– Dr. Brian Reithel, Dean of the Business School
Areas Still Needing Work
• System instability during morning of first day of grades viewing
  – Caused by heavy system load combined with variable question feature
  – Had to temporarily disable teacher evaluation incentive component
  – Students experienced poor system response the morning of 5/11
1. Student connects.
2. Has this student already signed in?
3. If not, then prompt for webID and password.
4. Sends request to SAP – does this student need to be prompted to complete evals?
5. Are we in a window when evals are taking place? Does this student have any open evals? Has this student already been prompted? In what way do I need to prompt student? Record prompt type for this student.
6. Based on result, either send student to teacher evals or view grades.
7. Teacher Evals: Queries SAP to get student schedule. Queries Oracle to get info about each section, e.g., is it exempt or non-exempt. Queries Oracle to find out what questions should be presented. Generates web form, accepts responses from student and stores results. Updates menu to show that this section has been evaluated.
8. View Grades: Sends academic work request to SAP.
9. SAP gathers data – determines which terms can be displayed, checks to see if each grade has been marked as complete & nulls out if not, sorts results, retrieves related info such as org, academic standing, classification, semester and overall stats.
10. View Grades: Displays results to student. Adds additional info such as links to academic dismissal/suspension letters.
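Steps 4-6 of this flow amount to a small routing decision. A simplified sketch (parameter names are illustrative; the real checks live in SAP and Oracle as described):

```python
def route_after_signin(in_eval_window, has_open_evals, already_prompted):
    """Send the signed-in student to open evaluations first (steps 4-6);
    otherwise go straight to grades viewing."""
    if in_eval_window and has_open_evals and not already_prompted:
        return "teacher_evals"
    return "view_grades"
```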
Goals for Fall 2004
• Incentives that result in a response rate that is as good as or better than Spring 2004’s
• Incentives that are perceived as being fair and reasonable by all constituencies– Students, Faculty, Administrators
• Incentives that balance system load
Student Proposal for Fall 2004 Incentives
• Continue priority registration incentive with minor modification
  – Only available to students who complete 100% of their evaluations before grades viewing begins
• Adjust grades viewing incentive
  – Will be something like: students who have completed 100% of their evaluations are eligible to view grades in an “early window”; others will have to wait 12 hours.
  – Will include allowance for students on dismissal or suspension.
• Allow students to submit evaluations during the short window after finals are over and before grades viewing begins; include “cool-off” period.
• Add additional hyperlinks to make it easy to move from the web registration interface to the teacher evaluation reporting interface.
Phase 2 Results
• Response rate was unacceptable for Fall 2003 (~ 30 %) but is now moving into an acceptable range (~ 60%) with creative student incentives
• Student comments more extensive and thoughtful
• No noticeable difference in results from paper to online process
• Many advantages
Summary
• Communication is key
  – Five focus groups with students over two years
  – Three presentations to the Faculty Senate over two years
  – Numerous memos and status reports
• Creative incentives can lead to acceptable response rates, possibly better than with paper-based methods
• Increased opportunities for comments are valuable to all constituencies
• Choice of SAP Campus Management student system facilitated technical implementation of web-based evaluations
For More Information
• Kathy Gates – [email protected]
• Presentation available at www.olemiss.edu/working/kfg/educause2004

Special thanks to Amelia Rodgers in the UM Faculty Technology Development Center for help with interview videos.