
REPORT OF THE COURSE EVALUATION PROJECT TEAM

April 27, 2017

1.0 Introduction

In 2014, the Associate Vice-President, Academic established the Course Evaluation Project Team (CEPT) to explore the potential for a new, campus-wide, course evaluation[1] model (see Appendix 1). This initiative arises from a commitment to (a) update the mechanism to capture Waterloo student feedback about the quality of the student educational experience, and (b) move toward a system where student feedback is one of several metrics for evaluating instructor performance. The CEPT's mandate was to work on the first initiative: updating the student feedback mechanism so that it aligns with current teaching and learning practices (most Waterloo course evaluation tools were developed in the 1980s). Recognizing that current course evaluation tools are measures of student perceptions, the proposed assessment tool is described as "student course perceptions" (SCP)[2].

Since 2014, the project team has reviewed the literature on course evaluation and conducted consultations across campus with representative stakeholders regarding the possible development and implementation of a new course evaluation model. A draft report was produced on November 7, 2016. In Fall 2016, the team sought opinions from the campus community about its recommendations. A survey was run, with several open-ended questions about the proposed recommendations (see Appendix 2) and the preliminary question set (see Appendix 3). The literature review, extensive project team discussions, and results of the Fall 2016 consultation process have culminated in the recommendations in this report.

Important context

The project team recognizes the limitations of SCPs while also acknowledging the ways in which they serve an important function for university operation and success. Data from SCPs represent one source of evidence to be considered for promotion and tenure, and for annual performance review purposes. While it is beyond the mandate of the CEPT, the team strongly advocates that a subsequent university team be struck to continue the discussion about how methods such as peer evaluation, teaching dossiers and other approaches can be applied in a consistent, systematic manner campus-wide to evaluate teaching, course design and delivery. These other sources of evidence of teaching and course quality should take on a substantially enhanced role (see Policy 77). In order for SCPs to be credible sources of information, they must be validated and recognized as student perceptions of teaching effectiveness and the learning experience in a course.

[1] The term "course evaluation" is commonly used in the research literature and on many Canadian university campuses.
[2] Alternative names include Student Course Feedback, Student Course Evaluations, Student Evaluation of Teaching and Course, or Student Perceptions of Learning (SPLs), among others. The final name would be determined in Phase 2.


2.0 Key Research Themes and Findings

This section provides a summary of key research findings, organized by themes.

2.1 Reasons for a Cascaded Course Evaluation Model

The project team recommends the adoption of a cascaded course evaluation model. In this multi-level model, all Faculties include a common set of standard questions, complemented by optional additional questions chosen by each Faculty, academic unit, and instructor from an established, vetted question bank. (See Figure 1.)

Course evaluation practices and instruments are varied at Waterloo. Adopting a common set of university-wide course evaluation questions would enable us to report institutionally on a key component of our mission: student perceptions of their learning experience. Institutional reporting is fully consistent with the growing expectation from government and the public for transparency and accountability from our post-secondary, publicly funded universities. The Ontario Undergraduate Student Alliance has called for increased student access to course evaluation data, and Waterloo's Federation of Students has publicly advocated for access to aggregate data. A December 2015 report by the Ministry of Training, Colleges, and Universities (MTCU, renamed the "Ministry of Advanced Education and Skills Development" in 2016) identified course evaluation data in its list of additional metrics that could be used in advancing an outcomes-based funding model for post-secondary education in Ontario.[i]

Figure 1. Cascaded Course Evaluation Model: University Questions (≤10 Likert-scale + ≤3 open-ended), Faculty and/or Unit Questions (≤4 Likert-scale), and Instructor Questions (≤2 Likert-scale) combine to form the Course Evaluation Instrument.


This cascaded model also gives Faculties, departments, and individual instructors the ability to select additional questions for more customized feedback. Instructors may select different additional questions over time, such as when instructional practices are changed in a course. University-wide questions are common at other universities, as noted in a recent survey carried out for an MTCU-funded research project on evaluations of teaching. More than 90% of the Ontario universities surveyed (n=20) had institution-wide student evaluations of teaching.[ii] In addition, Canadian universities of comparable size and prominence have already moved to a cascaded course evaluation model (e.g., Toronto, McGill, Simon Fraser).

2.2 Evaluation Instrument Design Principles

The evaluation model is structured on a set of guiding principles. The primary principle is that SCP questions need to connect to a well-grounded, empirically informed definition of effective teaching. The project team's review of research into the elements of effective teaching[iii] shows that effective instructors design and deliver courses that result in meaningful student learning.[iv] While course evaluations do not measure student learning (that is the role of tools such as assignments, tests, and exams), students can provide useful feedback about how well the design and delivery of a course facilitated their learning (or not) and affected their learning experience. This understanding, together with a review of course evaluation instruments[v] used elsewhere and the current literature about such instruments,[vi] allowed the project team to identify three main dimensions through which students provide feedback by way of the instrument: Course Design, Course Delivery, and Learning Experience.[3]

An analysis of Waterloo's in-use course evaluation instruments has revealed that few questions explicitly focus on the student learning experience. The questions also privilege lecture-based instructional practices,[vii] and there is considerable variety in the wording and number of questions asked. Instructional and assessment practices have shifted over the past few decades to embrace an expanded repertoire of options (e.g., collaborative learning, active- and problem-based learning, authentic assessments). There has also been increased focus on learning outcomes and the use of educational technologies.[viii] Evaluation instruments that capture this evolution in course design, delivery, and the student learning experience are considered necessary and valuable. Finally, we wish to reinforce the important message that student course perceptions (SCPs) represent one of several lines of evidence, each of which plays an important and complementary role in establishing a complete picture of effective teaching and the learning experience. (See Figure 2.)

[3] See Appendix 3, which demonstrates the link between these three principles and a preliminary question set.


Figure 2. Three complementary lines of evidence: student feedback (SCP), peer review, and instructor self-report.

Accordingly, the project team recommends using the following principles to guide the development of evaluation questions:

1. SCPs should focus on students' perceptions of the quality of course design, course delivery, and the learning experience
2. SCPs should be designed to provide instructors with helpful, timely student feedback
3. unit chairs/directors should be able to use trends evident in successive SCPs as one means to help ensure high-quality teaching for their academic programs
4. results from scaled questions should be viewed as student perceptions of teaching and a reflection of their learning experiences that may be further illuminated by open-ended comments
5. the selection of indicators of effective teaching and the wording of instrument items should be guided by the research literature as well as by ongoing assessment of evaluation instruments
6. evaluation questions should focus on instructional elements that students can reliably evaluate and avoid ones they cannot reliably evaluate[ix]
7. institution-wide questions should transcend course delivery formats and disciplines, and
8. the instrument should allow for the assessment of diverse teaching approaches with a combination of open-ended and Likert-scale questions.



2.3 New and Existing Model Use

Waterloo's Policy 77 states that "student evaluations are an important source of information" in the assessment of teaching. However, teaching at Waterloo is assigned to a broader community than faculty, and therefore the SCP process needs to consider the entire instructional community. "Instructor", as used in this report, includes all tenured and tenure-track faculty, adjunct appointments, lecturers, sessional instructors, and teaching assistants (TAs) who are in independent instructional roles. Tenured professors and continuing instructors would use the new evaluation model. Instructors whose start date is after the commencement of a new evaluation process should be assessed with the new evaluation instrument. Faculties should offer instructors whose start date was prior to the commencement of a new course evaluation process the option to be assessed with (a) the previous Faculty-based instrument or (b) the new campus-wide instrument, until they have been awarded tenure and promotion to associate professor or attained continuing status.

2.4 Supportive Online Platform

The project team recognizes the benefits of online delivery of course evaluations. These include:

1. lowering resource costs when compared with paper-based approaches
2. easing the work to analyze, share and post data
3. adding flexibility to accommodate a cascaded evaluation model
4. increasing security of student access, and
5. enhancing accessibility by campus stakeholders to the evaluation process and its outcomes.

A locally developed online delivery system, eValuate, has been under development in the Faculty of Science for several years. Five of our six Faculties (Applied Health Sciences, Engineering, Environment, Mathematics and Science) have fully deployed eValuate using their existing instruments, and the Faculty of Arts has conducted an extensive pilot. Faculties have reported that eValuate has largely met expectations, and this software has effectively become the de facto campus solution. The project team concluded that the most reasonable and beneficial course of action would be for all Faculties to adopt eValuate.

An advisory committee has been struck to provide input to priorities for the technical development of eValuate. This committee is accountable to the office of the Associate Vice-President, Academic (AVPA) and the Associate Provost, Graduate Studies (APGS), and provides reports to the University Committee on Information Systems and Technology (UCIST). If the cascaded model of the instrument is adopted, a full review of software requirements to support the model will be initiated. The technical advisory committee would assist with that review.


2.5 Management of the SCP System

This section of the report addresses several high-profile issues that have been raised in project team discussions and through discussions with campus stakeholders.

2.5.1 The Issue of Bias

The team recognizes that every opportunity must be taken to enhance the clarity of each question's intent, and to minimize the potential for inappropriate comments. Research acknowledges that socio-cultural variables, biases, the "halo effect" and other influences can affect course evaluation results.[x] For example, student participation in SCPs can be compromised by factors such as bias (e.g., gender and race) in perceptions of course and instructional quality; indifference to the exercise by students and/or instructors; immaturity of respondents; misunderstandings of the purpose and application of course evaluation results; and instrument questions that are inappropriate or simply cannot be answered in an informed manner by students, among other factors and variables. The inherent bias in evaluation tools is a strong reason for instructor evaluation to be multi-pronged (i.e., SCPs are conceptualized as one evaluation tool).

Institutional and individual bias regarding specific groups is a challenge that we face in our society and in higher education. There is no question that biases (e.g., sexism, racism, ageism) exist on any campus, and that these biases can be expressed in SCPs. For example, in a study carried out at the University of Waterloo, when students received low grades, they gave statistically lower overall ratings of quality (course and instructor quality ratings were combined) to female instructors than to male instructors.[4]

These are serious issues that the project team has discussed extensively. While it is not possible to control individual behaviours and responses to SCPs, it is possible to reduce the potential for bias, in its many potential forms, through careful design of the instrument. In addition, a common set of questions, and data in electronic form, can provide tools to investigate, recognize and address bias, better than is possible with different sets of instruments that cannot be aggregated for broader trends and outliers. Similarly, we note that bias can be a factor in the interpretation of SCP data by university administrators (e.g., academic chairs/directors, staff). A realizable action will be to provide educational opportunities for those who use the data. (See Section 2.5.2.) As noted already, the project team advocates for a multi-pronged evaluation of instructor performance and an investigation of the impacts of bias. Finally, a follow-up investigation of other assessment and evaluation methods, in addition to SCP information, would be a worthy undertaking. Other methods will likely require significant resources to scale up for department, Faculty, and/or campus-wide use.

[4] See Endnote x.


2.5.2 Designing Support for Evaluation Instrument Users

While it is impossible to anticipate every potential factor that could compromise the quality, validity and fairness of evaluation responses, a properly designed and implemented training and orientation program can enhance the utility and validity of these evaluations. Many universities have designed and implemented training and education programs for students, staff and instructors to support and guide the course evaluation process. Accordingly, all University students, faculty, staff, (Faculty and departmental) administrators and system administrators should be trained in, and oriented to, the SCP and the use and interpretation of results. In addition, there should be orientation to the eValuate platform.

Training and orientation content should comprise a generic core of information, plus material that meets the information needs of specific evaluation users. These information needs should be determined following consultation with each evaluation user group. Showcasing, and potentially sharing, the data analysis already occurring in the Faculties (e.g., trends against class sizes) could be another beneficial part of the education program.

With regard to the SCP, training and orientation content must address issues such as the intent of this evaluation tool, how and why these evaluations are used, how to interpret the results, the need to acknowledge the importance/role of bias (especially concerning gender and race) when completing and interpreting evaluations, and ethical obligations generally. In terms of the eValuate platform/technology, training and orientation content should explain the key features of the eValuate system and provide links to useful online resources (e.g., FAQs, instructional videos) that meet the needs of different user groups. Training and orientation content should be accessible "on demand" via a single, dedicated online portal, which would also enable access to the eValuate SCP and useful resources. Mandatory training and orientation content should be presented as a "toolkit," with online sub-sites dedicated to specific SCP user group information needs.

2.5.3 Testing, Monitoring and Evaluation: Instrument and Toolkit

The project team recognizes that validation of the SCP is needed. Testing of the instrument results will determine the reliability and validity of the instrument, including the influence of variables that could bias results at Waterloo. The results of this testing should be used to revise the instrument and/or the educational toolkit as appropriate, both before and following implementation. Refinements to the SCP instrument should be made as necessary, following consultation with key campus stakeholders (including FAUW, GSA, and Feds) and regular expert review of operations and instrument design and performance. Further, a full assessment of the instrument and platform should take place after five years of campus-wide application, with monitoring and evaluation findings reported to Senate annually.


The Office of the Associate Vice-President, Academic (for undergraduate courses) and the Associate Provost, Graduate Studies (for graduate courses) should be responsible for oversight, coordination and reporting of campus-wide SCP assessment through the Quality Assurance Office, with consultation as required from the Centre for Teaching Excellence (CTE). Support for the technical use of the eValuate software would be provided by Information Systems and Technology (IST) and, when required, by the CTE. Quality Assurance Office staff, along with the developers of eValuate (Science Computing) and IST staff, should determine an optimal strategy to ensure appropriate resourcing (sufficient capacity and operational support, user training and support) for eValuate for campus-wide use. Quality Assurance Office staff should also monitor the performance of the SCP instrument and platform on a term-by-term basis, and report findings annually to Senate via the Senate Undergraduate Council (SUC), the Senate Graduate and Research Council (SGRC), and the Course Evaluation Advisory Group co-led by Science Computing and IST.

2.5.4 Data Management

The ownership of SCP data is an important issue. As such, information generated by the SCPs must be managed carefully. The collection, analysis and dissemination of SCP data must be carried out in accordance with best practices concerning privacy of information, transparency and accountability, and must adhere to Policy 46 (Information Security). Numeric information should be made accessible after authentication by the WatIAM system and should be available at the individual course level. These data should provide information generated by the core questions. The SCP data should present information to facilitate comparison with Faculty-wide ratings and program-specific ratings, as determined to be statistically appropriate. Instructors should have access to all of their numeric information. The numeric results from these evaluations should be part of the instructor's record for annual performance review, and for tenure and promotion purposes.

Written comments from students are intended for the instructor's use only. Optional questions regarding TAs should be shared with the TAs; instructors are encouraged to engage in discussions about these results with TAs. Instructors, at their sole discretion, may use the written comments when seeking feedback and improvement. For example, they may show some or all of the comments to members of the Centre for Teaching Excellence (CTE) or colleagues when seeking advice about improving their teaching technique or course design.


2.5.5 SCP Administration Process

The project team believes that best practice for administering the SCP instrument includes the following obligations:

1. Provide students with information about the instrument at the outset of each term in each course so that they are aware of the type of feedback that will be requested. This information should be included in the course outline
2. Orient students to the purpose and applications of the SCP with reference to the toolkit
3. Conduct the SCP during the last two weeks of classes each term
4. Set aside approximately 15 minutes for in-class evaluation (for face-to-face courses); and
5. Close access to the SCP before the start of the exam period.

3.0 Recommendations[5]

This set of recommendations reflects (a) the evolving analysis carried out by the project team, as represented in successive draft reports; (b) considerable debate amongst team members about key issues and responses; (c) the perspectives of stakeholder groups who were briefed about the project; and (d) the suggestions provided by individual respondents and groups in the Fall 2016 campus consultation program.

3.1 Teaching Evaluation

3.1.1 Student course perceptions (SCP)

• All UW course-based learning experiences, in all formats, should be evaluated.
• Students have a unique perspective to contribute regarding the course learning experience and, as such, their feedback should be solicited as part of the evaluation of teaching.
• The recommended nomenclature for this exercise is Student Course Perceptions (SCP).

3.1.2 Use of complementary evaluation methods

• SCP results should be considered one of several potential data sources for annual performance appraisals, and for tenure and promotion purposes.
• As a priority, the university should explore the potential uses of additional, complementary teaching evaluation methods.
• The university should promote the use of additional teaching evaluation methods (e.g., peer evaluations, teaching dossiers, etc.). These complementary methods must be used in a consistent manner across campus.

[5] Note that full consensus by team members was not possible on all recommendations. It is understood that these Recommendations may be refined following testing in Phase 2.


• Faculties should decide which complementary evaluation methods should be used, and how often.
• Significant investment in training for instructors, chairs/directors, and relevant staff should be allocated to ensure consistent and effective use of all evaluation tools.

3.1.3 Teaching Quality Improvement

• Triangulation of teaching evaluation methods (i.e., student course perceptions (SCP), peer evaluations, teaching dossiers, and/or other methods) should be used.
• The resources and expertise of the university's Centre for Teaching Excellence (CTE) should be promoted and endorsed as a valuable and effective means to help instructors enhance their teaching effectiveness (e.g., through workshops, individual consultations, etc.).

3.1.4 Tenure and tenure-track status

• Tenured professors, continuing instructors, and instructors whose start date is after the commencement of a new evaluation process should be assessed with the new SCP instrument.
• Instructors whose start date pre-dates the commencement of a new evaluation instrument could decide to use the new evaluation instrument, or the previous instrument used in the respective Faculty.

3.2 Information Management

3.2.1 Information Management – Instructors

• Instructors should have access to all numeric information generated for their courses by SCP exercises.
• The numeric results from SCP exercises (i.e., core questions, Faculty- and instructor-selected questions) should be part of the instructor's record for annual performance review, and for tenure and promotion purposes.
• Instructors should have the opportunity to place SCP results into context when the results are used for annual performance reviews and for tenure and promotion files.
• Instructors, solely at their discretion, may share SCP data (for example, when seeking feedback and advice).
• Individual Faculties should follow standard and uniform protocols established by the university concerning interpretation of SCP information, and about levels of access by Deans, chairs and directors to SCP comments data.


3.2.2 Information Management – Student Access

• Summary/overall ratings about individual courses (i.e., core questions) should be made available to all students.
• The names of individual instructors should not be listed or accessible.
• Access to SCP results should be limited to those members of the university community with WatIAM credentials.

3.2.3 Managing Offensive Comments

• The eValuate system should be designed to screen the Comments section for potentially offensive words or phrases. These comments should be eliminated from the Comments content that is reviewed by the instructor. The numeric (data) responses would also be deleted in these cases.
• In the event that eValuate does not have the capacity to scan for offensive comments, alternative means would need to be identified and implemented.
• A bank of offensive words or phrases that could be present in the Comments section should be developed with advice provided by FAUW, Feds, GSA and SWEC.
• Anonymity of responses and comments should be ensured, with the exception of comments considered threatening, in which case eValuate should be used to filter these comments and to identify the student. Relevant university policies would then be applied (e.g., Policy 33, 46 and/or 71; Guidelines for Managing Student Information), to be administered by the appropriate Faculty associate dean (undergraduate studies).
• Quality Assurance Office staff should be the responsibility centre for this oversight role.

3.3 Instrument Design and Analysis

3.3.1 Cascaded Model Design

• The three-level cascaded model (i.e., core, course-based and Faculty questions) should be implemented campus-wide.
• Core questions in the student course perceptions (SCP) instrument should identify common elements of effective instruction (i.e., the three dimensions of effective instruction: course design, course delivery, and learning experience).
• Decisions about the selection of complementary questions should take place at the Faculty, program and/or instructor level.
• Complementary questions should be drawn from a bank of validated questions maintained by the Quality Assurance Office.


• The cascaded model should be mandatory for summative (end-of-term) evaluations, and used at the instructor's discretion for formative (mid-term) evaluation purposes.

3.3.2 Number and Types of Questions

• The SCP question set should be finalized following extensive testing and refinement with a representative sample of students and instructors.
• Testing should include examination of the potential for bias in question choice, phrasing and sequence.
• The potential for, or evidence of, bias in instrument question design and responses should be a key element of pre-launch testing and of future assessments.
• The university's Survey Research Centre (SRC) should be engaged to provide advice and help to design the instrument and to manage the testing process.
• An optional question regarding experiential learning should be available in the bank of complementary questions.

3.3.3 Compatibility with existing instruments

• The cascaded student course perception (SCP) model structure should support and extend past and current data capture efforts used in previous course evaluation instruments.

3.3.4 Differentiation between Course and Instructor

• The instrument design should distinguish between course design and delivery elements to address cases when instructors teach courses they did not design.

3.3.5 Analysis of Numeric Data

• Statisticians and data visualization experts on campus should be consulted to determine how best to analyze and represent numeric data.
• Numeric data analyses should include reports on trends once sufficient data have been collected.

3.4 Instrument Implementation

3.4.1 Training and Orientation Toolkit

• A toolkit should be developed to support the SCP instrument. When additional methods of evaluating teaching are implemented, the toolkit should also support these methods.
• The toolkit should emphasize how to interpret the numeric data and comments in the context of the SCP's limitations.


• All students and faculty should be required to complete an online training and orientation module before use of the instrument.
• Experts (i.e., educational psychologists) should design the toolkit.
• Instructors (and the toolkit) should convey to students that course contexts differ (e.g., the time to return machine-scored tests will not be the same as for essays, tests or assignments).
• In-class explanations of course design and intent should be provided to promote clarity and understanding.

3.5 Monitoring and Evaluation

3.5.1 Role of Quality Assurance Office

• The Quality Assurance Office should monitor the SCP instrument and toolkit on an annual basis.
• The Quality Assurance Office should provide annual reports to Senate about the status of the SCP instrument and toolkit. These reports should also be provided to the Senate Undergraduate Council (SUC) and the Senate Graduate and Research Council (SGRC).
• The Quality Assurance Office should carry out a full assessment of the SCP instrument and toolkit on a 5-year cycle.
• Refinements to the SCP instrument and toolkit should be made in consultation with key campus stakeholders (i.e., FAUW, Feds, GSA, SWEC).

4.0 Next Steps: Phase 2

The Vice-President Academic and Provost will determine next steps as they relate to SCPs at Waterloo. If the decision is made to proceed to fully develop the SCP framework, then the following major tasks would need to be accomplished:

• Develop, test, refine and validate a question set (both core and optional)
• Design and test the training/orientation toolkit
• Test the eValuate software and platform to ensure delivery capability using a cascaded model
• Pre-launch test of the prototype in its entirety (i.e., question set, toolkit and eValuate platform).

This work would require the creation of a new project team and sub-groups. It is likely that these project elements would require at least one year to complete. The project could require hiring a project leader and possibly staff resources to conduct research, develop, and test the question set and toolkit.


If there is approval to pursue a cascaded evaluation model, a sub-committee should be struck, and user testing and survey validation should be undertaken on the core questions. Items should also be developed for the additional question bank (i.e., the Faculty/academic unit and instructor questions). The prototype instrument should be field tested through pilots, the results of which would be used to change, refine, and finalize the question set. The eValuate project team and IST will need to work closely with Phase 2 project colleagues to identify and explore issues and opportunities for system design.

There will also be a need to keep the campus community informed regularly about project progress. This communication could include regular briefings for Senate and for campus interest groups. The prototype SCP framework would need to be submitted to key university-level decision-making bodies (e.g., SUC, etc.) and Senate for their review and approval.

Report submitted by:
Mark Seasons
Chair, Course Evaluation Project Team
April 27, 2017


Appendix 1 – Course Evaluation Project: Mandate and Process

In 2014, the Associate Vice-President, Academic established the Course Evaluation Project Team (CEPT) to explore the potential for a new, campus-wide, course evaluation model. For the past two years, the project team has reviewed literature on course evaluation and conducted consultations across campus with representative stakeholders regarding the possible implementation of a new course evaluation model. Specifically, the project team was mandated to accomplish the following:

1. Examine the various administrative, logistical, technological, and cultural issues pertaining to course evaluations at the University of Waterloo
2. Establish best practices concerning all aspects of course evaluations based on a review of the literature
3. Consider the implications of adopting changes to current course evaluation procedures in relation to Policy 77, the MOA (Memorandum of Agreement with the Faculty Association), and faculty annual performance evaluations
4. Assess the feasibility of designing a common institutional survey instrument, with customizable sections at the Faculty, department, or instructor level (referred to here as a "cascaded" model)

The project team is composed of representatives from the major stakeholder groups at the University: faculty representation (academic Faculties; Faculty Association of the University of Waterloo – FAUW); undergraduate students (Federation of Students – Feds); graduate students (Graduate Students Association – GSA); academic support units (Centre for Extended Learning – CEL; the Centre for Teaching Excellence – CTE); and the University's Information Systems and Technology group – IST. The project team and its subgroups have met regularly since May 2014.

Recommendations have been informed by the review of appropriate literature, consultations with colleagues at other universities, and the review of a number of peer university websites to identify best practices and factors to consider when designing, implementing, and interpreting course evaluations. In addition, the team has carefully considered perspectives and advice offered by the university's AccessAbility Services (AAS), the Office of the President (Special Advisor on Women's and Gender Issues), as well as subject matter specialists, including social psychologists, survey design methodologists, and teaching fellows at Waterloo.

A consultation process was undertaken throughout 2015 with the Senate, Deans' Council, FAUW, Feds, GSA and all six Faculties (Applied Health Sciences, Arts, Engineering, Environment, Mathematics, Science). Three key concerns emerged from these consultations: (1) inherent biases in course evaluation, (2) the advisability of university-wide questions, and (3) the privacy of and access to data. The recommendations in this report address these and related concerns.


Appendix 2 – 2016 Survey Process and Results

Survey Design and Management:

In Fall 2016, the course evaluation project team decided that it was time to seek opinions from the campus community about the project team's recommendations. Accordingly, a survey was designed with the following open-ended questions that related to key aspects of the proposed approach to course evaluation:

1. What are the advantages and disadvantages of Waterloo adopting a cascaded model for course evaluation?
2. How well do the sample questions align with the instrument design principles outlined in this report?
3. What are the advantages and disadvantages of access to course evaluation information as presented in this report?
4. What other comments do you have about the recommendations and information presented regarding course evaluations at Waterloo?

The survey launched on November 8, 2016. Two emails were sent from the office of the Associate Vice-President (Academic) to introduce the survey and encourage responses (November 8, 2016 and December 9, 2016). In addition, the survey was highlighted in the university's Daily Bulletin (see November 25, 2016) and by the Registrar's Office (December 9, 2016). All communications to the campus community included a hotlink to the survey that was posted on the Associate Vice-President (Academic)'s website (see: https://uwaterloo.ca/associate-vice-president-academic/course-evaluation-project). All survey responses were collected anonymously. The survey was available online for campus stakeholder response from November 8, 2016 to January 20, 2017. More than 90 individual responses to the survey have been received to date, as well as written submissions from several academic units and campus organizations. These responses have been organized for analysis in an Excel spreadsheet managed by Quality Assurance staff in the office of the Associate Vice-President (Academic).

General Impressions:

Overall, the comments and suggestions are supportive of the recommendations presented in the Draft Report. However, positions vary widely, and some appear fixed regarding specific issues. We can state that, in general, students want as much information as possible about the learning experience and teaching effectiveness. We note that specific groups, such as FAUW, seek strict controls on student course perception data use and access. The majority of critical comments concerned the issues of bias (e.g., gender, race, etc.) in the context of teaching evaluation; the capacity of students to assess teaching quality; the proposed question set (i.e., number and types of questions); and access to student course perception data, specifically the written comments, by students and department chairs/directors. Comments were also made about the validity of the concept of teaching evaluation and the utility of an orientation/training toolkit.


Areas of commonality/consensus included:
• The cascaded model (tri-level: core, course and Faculty-level questions)
• Use of the eValuate course evaluation software and platform
• The merits of a comprehensive system of evaluation of teaching beyond the sole source of student feedback
• The need to test and validate the question set before launch
• Recognition of bias, in its many forms, and implications for teaching evaluation
• The need to build upon historical databases of past course evaluations
• Restrictions on access to SCP data

Areas of difference of opinion/divergence of perspectives included:
• The proposed question set – number and types of questions
• Whether the complex issue of bias could be addressed effectively through instrument design and user training
• Whether an orientation/training toolkit would be an effective way to deal with bias
• Whether students should evaluate teaching effectiveness
• Whether instructor names should be accessible in evaluation databases
• Whether students and/or academic department chairs/directors should be able to access students' written comments

Summary of Organization/Group/Departmental Submissions: Please see Table 1.


Table 1: Summary of Group/Departmental Submissions (by Organization/Group/Department)

English
• Concerns about the phrasing of draft questions
• Comments about what constitutes a "reasonable amount of time" re: assignment return
• Concerns about the concept of "clear communicator"
• Bias: need to map out questions and assess re: bias; ensure that the toolkit is designed with bias as a key issue; how might bias be addressed by chairs?
• Data management: are we taking sufficient steps to ensure privacy?

FAUW
• Cascaded model: the value of campus-wide core questions is questionable, given Faculty cultures
• Sample questions: need to clarify the use and purpose of evaluations before finalizing questions
• Position: valid to have student perceptions, but not for course design or quality of teaching
• Access to information: agrees that evaluation comments should be for the instructor alone; Faculties should decide whether to make numeric scores more widely available
• Additional comments: bias remains a significant concern; rejects the position that the potential for bias can be managed
• Recommendations: clarify that this is about student experiences; should not be used to evaluate teaching for merit or T+P; support consistent use of any new model

Feds
• Overall, Feds is supportive of the draft report (November 7, 2016); while there are areas of concern, happy with the proposed processes and strategies
• Bias issue: acknowledges the potential; notes student evaluations are part of an overall assessment process; students' assessments are an integral and necessary aspect of course evaluation
• Cascaded model supported
• Numeric data: recommends the campus community should have access to these data
• Mid-term evaluations: Feds policy calls for both formative and summative course evaluations
• TAs: encourages greater attention to the role of TAs, given the important role they play; they should be properly evaluated


Feke et al.
• Agreed with much in the report – e.g., the cascaded model
• Issue: systemic biases (explicit and implicit) – grave concerns; concerned that no concrete solutions are provided in the report, so a broader conversation is needed
• Opinion that a student training program would not properly address the potential for bias
• Student course evaluation information should not be used in merit, T+P; a new, more equitable method is needed (e.g., peer teaching evaluations)

GSA
• In agreement with the CEPT report (November 7, 2016)
• Acknowledges the commitment and effort of the project team, an "extraordinary effort" to reach consensus
• Cascaded model: could provide meaningful cross-campus information; must watch for biases and students' reliability in responses
• Sample questions: notes that the province's outcomes-based funding model will likely use course evaluation data
• Course design: noted that students are not in a position to comment knowledgeably about course design; rather, responses should be interpreted as expressions of students' perceptions of the course
• Course delivery: some elements are universal across campus – e.g., timeliness, clarity of communication, etc. – while others could be context-specific
• Learning experience: open-ended questions (comments) are important, a valuable tool
• Training: support for mandatory training plus in-class messages/explanations of the importance of evaluation
• Privacy of information: expects that UW privacy policies and the report's recommendations re: access to data apply to TAs as well
• Ongoing monitoring and evaluation of the instrument will be essential; each stakeholder group must feel respected; academic freedom and integrity must be upheld
• Belief is that students and their organizations want access to student perceptions of the quality of their course experiences, for comparison with experiences elsewhere; data should exclude reference to ratings/rankings of instructors
• Evaluation of courses/teaching quality should be based on multiple evaluation methods


MATH
• Support for the use of evaluations to provide meaningful feedback
• Cascaded model generally supported
• Choice of questions is a key issue; there should be 4-6 core questions maximum; MATH wants a role in the choice of core questions, and these questions should be applicable across disciplines
• Strong desire to maintain historical trend data – any new instrument would need questions that retain, or relate clearly to, existing ones
• Data management/access: more clarity required re: internal vs. external use of data; numeric data should be widely accessible, while comments should be for the instructor's use
• Student evaluations: the instrument should be part of an overall teaching evaluation process

PSYCH
• Extraneous, "biasing" factors make student questionnaires invalid for summative evaluation
• Summative use of student evaluations harms student learning and instructors' integrity and academic freedom
• Proposed remedies for bias will not be effective
• Student evaluations could be useful for formative feedback
• Experiences on other campuses may not be relevant at Waterloo; need to be careful about "best practices"
• Alternatives to student questionnaires can generate less bias and do more to promote effective instruction
• Questionnaire design should be informed by on-campus expert advice

SWEC
• Focus should be on student perceptions of the course and their learning
• Student feedback data should not be published
• Should investigate whether discrimination is apparent in past course evaluation results
• Future instruments should be evaluated regularly (i.e., annually) to support refinement of questions
• Training to minimize opportunities for bias should (a) be used if proven to be effective and (b) be designed by experts
• Mechanisms should be in place to mitigate the impact of sexist, racist, or other inappropriate comments
• There is a need to (a) examine methods that could be used to assess the teaching effectiveness of instructors and student learning and (b) review the weighting/importance of student evaluations for merit and T+P


Appendix 3 - Dimensions of effective teaching and sample questions

Course Design Dimension
• I knew what I was expected to learn in this course
• The graded work assessed what I was expected to learn
• The course activities prepared me for the graded work
• The course work demands were… (Likert scale answer choices will reflect workload intensity – for example, very light to very heavy)

Course Delivery Dimension
• The instructor returned graded work in a reasonable amount of time
• The instructor was a clear communicator
• The instructor created a supportive environment that helped me learn
• The instructor stimulated my interest in this course

Learning Experience Dimension
• The most important thing I learned in this course was*
• Overall, I learned a great deal from this instructor
• Overall, the quality of my learning experience in this course was excellent
• What helped me to learn in this course?*
• What changes, if any, would I suggest for this course?*

* Denotes an open-ended question.


References

i. Ministry of Training, Colleges, and Universities. (2015). Focus on Outcomes, Centre on Students: Perspectives on Evolving Ontario's University Funding Model. Toronto, ON: Queen's Printer for Ontario.

ii. Wright, A. W., Hamilton, B., Mighty, J., Scott, J., & Muirhead, B. (2014). The Ontario Universities' Teaching Evaluation Toolkit: Feasibility Study. University of Windsor: Centre for Teaching and Learning.

iii. Chickering, A. W. & Gamson, Z. F. (1987). Seven Principles for Good Practice in Undergraduate Education. AAHE Bulletin; Feldman, K. A. (2007). Identifying exemplary teachers and teaching: Evidence from student ratings. In R. P. Perry and J. C. Smart (Eds.), The scholarship of teaching and learning in higher education: An evidence-based perspective (pp. 93-143). Springer Online; Gravestock, P. & Gregor-Greenleaf, E. (2008). Student Course Evaluations: Research, Models and Trends. Toronto: Higher Education Quality Council of Ontario; Hativa, N., Barak, R., & Simhi, E. (2001). Exemplary university teachers: Knowledge and beliefs regarding effective teaching dimensions and strategies. The Journal of Higher Education, 72, 699-729; National Survey of Student Engagement (NSSE). (2014). From Benchmarks to Engagement Indicators and High Impact Practices. http://nsse.iub.edu/pdf/Benchmarks%20to%20Indicators.pdf; Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83, 598-642; Undergraduate degree level expectations for the University of Waterloo. Retrieved from https://uwaterloo.ca/centre-for-teaching-excellence/teaching-resources/curriculum-development-and-renewal/program-review-accreditation/8-degree-expectations; Young, S. & Shaw, D. G. (1999). Profiles of effective college and university teachers. The Journal of Higher Education, 70, 670-686.

iv. Weimer, M. (2010). Inspired college teaching: A career-long resource for professional growth. San Francisco, CA: Jossey-Bass.

v. McGill University. (2014). Recommended pool of questions for courses and instructors. Retrieved March 11, 2015 from http://www.mcgill.ca/tls/files/tls/recommended_pool_of_questions_2014-final.pdf; Student Evaluation of Educational Quality (SEEQ) – the example used was employed at the University of Manitoba. Retrieved from http://intranet.umanitoba.ca/academic_support/catl/media/seeq.pdf; University of British Columbia. (2008). University questions. Retrieved from http://teacheval.ubc.ca/introduction/university-questions/; University of Toronto. (2013). Course evaluation question bank. Retrieved from http://www.teaching.utoronto.ca/Assets/Teaching+Digital+Assets/CTSI+1/CTSI+Digital+Assets/PDFs/qb-nov13.pdf; Werhun, C. & Rolheiser, C. (2014, June). Students' motivation for participation in course evaluations outside of the classroom: The role of the instructor in online course evaluations. Presented at the annual Society for Teaching and Learning in Higher Education conference, Queen's University, Kingston, ON.

vi. Gravestock, P. & Gregor-Greenleaf, E. (2008). Student Course Evaluations: Research, Models and Trends. Toronto: Higher Education Quality Council of Ontario; Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83, 598-642.

vii. Marks, P. (2012). Silent partners: Student course evaluations and the construction of pedagogical worlds. Canadian Journal for Studies in Discourse and Writing, 24(1), 1-32. Retrieved from http://www.cjsdw.com/index.php/cjsdw/article/view/16/8

viii. Gravestock, P. & Gregor-Greenleaf, E. (2008). Student Course Evaluations: Research, Models and Trends. Toronto: Higher Education Quality Council of Ontario.

ix. Gravestock, P. & Gregor-Greenleaf, E. (2008). Student Course Evaluations: Research, Models and Trends. Toronto: Higher Education Quality Council of Ontario.

x. Kulik, J. A. (2001). Student ratings: Validity, utility, and controversy. In M. Theall, P. C. Abrami, & L. A. Mets (Eds.), The student ratings debate: Are they valid? How can we best use them? [Special issue]. New Directions for Institutional Research, 109, 9-25; Gravestock, P. & Gregor-Greenleaf, E. (2008). Student Course Evaluations: Research, Models and Trends. Toronto: Higher Education Quality Council of Ontario; MacNell, L., Driscoll, A., & Hunt, A. N. (2014). What's in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education. DOI: 10.1007/s10755-014-9313-4; Ory, J. C., & Ryan, K. (2001). How do student ratings measure up to a new validity framework? In M. Theall, P. C. Abrami, & L. A. Mets (Eds.), The student ratings debate: Are they valid? How can we best use them? [Special issue]. New Directions for Institutional Research, 109, 27-44; Sinclair, L. & Kunda, Z. (2000). Motivated stereotyping of women: She's fine if she praised me but incompetent if she criticized me. Personality and Social Psychology Bulletin, 26, 1329-1342; Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83, 598-642; Theall, M., & Franklin, J. (2001). Looking for bias in all the wrong places: A search for truth or a witch hunt in student ratings of instruction? In M. Theall, P. C. Abrami, & L. A. Mets (Eds.), The student ratings debate: Are they valid? How can we best use them? [Special issue]. New Directions for Institutional Research, 109, 45-56.
