
    Peer Evaluation of Student Generated Content

Jared Tritz
University of Michigan
Department of Physics
Ann Arbor, MI
[email protected]

Nicole Michelotti
University of Michigan
Department of Physics
Ann Arbor, MI
[email protected]

Ginger Shultz
University of Michigan
Department of Chemistry
Ann Arbor, MI
[email protected]

Tim McKay
University of Michigan
Department of Physics
Ann Arbor, MI
[email protected]

Barsaa Mohapatra
University of Michigan
Department of Industrial and Operations Engineering
Ann Arbor, MI
[email protected]

ABSTRACT
We will present three similar studies that examine online peer evaluation of student-generated explanations for missed exam problems in introductory physics. In the first study, students created video solutions using YouTube, and in the second two studies, they created written solutions using Google documents. All peer evaluations were performed using a tournament module that is part of the interactive online coaching system called E2Coach [4] at the University of Michigan. With the theme of LAK 2014 being the intersection of learning analytics research, theory, and practice, we think this poster will provide an accessible example that combines a classroom experiment with rigorous analysis to understand outcomes.

Categories and Subject Descriptors
[Applied computing]: Education

General Terms
peer evaluation, tournaments, videos, Google Docs, blended learning

1. INTRODUCTION
First inspired by ubiquitous screen capture technology, we conducted an experiment that encourages students to create video solutions to an exam problem they got wrong. Knowing that there would probably be a wide range of quality in the content produced, we also wanted to provide an efficient means of filtering the content using a student-powered peer review system. To accomplish these goals, we implemented a process whereby students were assigned a problem they got wrong and asked to submit their redo solution to a tournament-style peer review process.

Developer and presenter.
Principal Investigator of the E2Coach project.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s). LAK '14, March 24-28, 2014, Indianapolis, IN, USA. ACM 978-1-4503-2664-3/14/03. http://dx.doi.org/10.1145/2567574.2567598

Borrowing techniques from cognitive psychology [5] to stimulate learning in the hard sciences, and measuring the effect, is becoming an open area of exploration [2]. To activate meta-cognitive reflection, students were asked to emphasize any mistake(s) they made on the exam in their solution explanation.

For the first study (Table 1), in Physics 240 (Electricity and Magnetism for Engineers), students had only one chance to make a video solution and participate in peer review, for the second midterm. In the later parallel studies in Physics 135 (Mechanics for Life Scientists) and Physics 235 (Electricity and Magnetism for Life Scientists), we altered the submission requirement to use written Google documents instead of videos. The study was also expanded to include all midterms in order to explore student participation and learning effects over time.

Course     Term  Mode   Students  Chances  Submits
Phys 240   W12   video  352       1        206
Phys 135   F13   doc    396       3        647
Phys 235   F13   doc    222       3        284

    Table 1: Description of three similar studies

In all studies, students were given extra credit equivalent to one exam question (5% = 1/20 questions) for each chance they participated, where participation meant creating a solution and reviewing two assigned solutions to select the one they thought was better. Appropriately, the extra credit is not included in the exam data analysis.
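
The poster does not spell out how the tournament module pairs submissions or turns the pairwise votes into a ranking. Below is a minimal sketch of one plausible pairing and vote-counting scheme; the function names, the one-solution-per-student assumption, and the simple win-count ranking are all our assumptions, not the E2Coach implementation.

    import random
    from collections import Counter

    def assign_review_pairs(solution_ids, reviewers, seed=0):
        """Give each reviewer two distinct solutions to compare, never their own."""
        rng = random.Random(seed)
        pairs = {}
        for reviewer in reviewers:
            candidates = [s for s in solution_ids if s != reviewer]
            pairs[reviewer] = tuple(rng.sample(candidates, 2))
        return pairs

    def rank_by_wins(votes):
        """votes is a list of (winner_id, loser_id) comparisons.
        Returns solution ids ordered by number of pairwise wins, most first."""
        wins = Counter(winner for winner, _ in votes)
        return [sid for sid, _ in wins.most_common()]

    # Example: reviewers s1-s3 each pick the better of their two assigned solutions.
    pairs = assign_review_pairs(["s1", "s2", "s3", "s4"], ["s1", "s2", "s3"])
    votes = [("s2", "s4"), ("s2", "s3"), ("s1", "s4")]
    print(rank_by_wins(votes))  # ['s2', 's1']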

2. PROBLEM ASSIGNMENT
We wanted individual students to revisit problems that were most salient to them; therefore, most students were assigned the easiest problem they got wrong. This was to help ensure the best opportunity for learning and reflection at the boundary of their understanding. We had to balance this with the somewhat arbitrary goal of assigning each problem an equal number of times. For voting, we tried to assign students the second easiest problem they got wrong, again in an attempt to make the process most useful for them.
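
As a rough sketch of this assignment rule, assuming each exam problem has a difficulty score derived from class performance: the per-problem cap used to balance coverage is our assumption, since the poster does not say how the two goals were traded off, and the function name is hypothetical.

    def assign_redo_problems(missed, difficulty, max_per_problem):
        """missed: dict mapping student -> set of problem ids that student got wrong.
        difficulty: dict mapping problem id -> difficulty score (lower = easier).
        Each student gets the easiest problem they missed, unless that problem has
        already reached the per-problem cap and another missed problem has not."""
        counts = {p: 0 for p in difficulty}
        assignments = {}
        for student, problems in missed.items():
            easiest_first = sorted(problems, key=lambda p: difficulty[p])
            choice = next((p for p in easiest_first if counts[p] < max_per_problem),
                          easiest_first[0])
            counts[choice] += 1
            assignments[student] = choice
        return assignments

    # Example: with a cap of 1, the second student is pushed off the shared problem.
    print(assign_redo_problems({"a": {3, 7}, "b": {3, 7}},
                               {3: 0.2, 7: 0.6}, max_per_problem=1))
    # {'a': 3, 'b': 7}

For the voting stage, the same rule could be reused with each student's easiest missed problem excluded, which would hand them the second easiest problem they got wrong, as described above.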
