
The Instructional Design Portfolio

Peer Assessment Support System (PASS)

By Lan Li and Allen L. Steckelberg

"PASS is a database-driven website

developed to facilitatethe peer assessment

process."

Background

The Peer Assessment Support System (PASS) was initially designed in the spring of 2002 and further improved and adapted in the spring of 2004 to facilitate a peer assessment procedure in an undergraduate technology applications course for preservice teachers at the University of Nebraska at Lincoln.

One of the class projects in this course was a WebQuest. In this project, students constructed a WebQuest activity and made it available on the internet. A WebQuest is an instructional activity built around internet-based learning materials. To help students gain a better understanding of how WebQuests should be constructed, peer assessment of WebQuests was first introduced into several sections of this course through a paper-based system. Peer assessment was used to encourage deeper thinking about the WebQuest assignment and to facilitate learning of its critical features. The focus of the peer assessment was students' ability to develop a WebQuest meeting the criteria described by Dodge (2001). After students uploaded their WebQuest projects to the internet, each student was randomly assigned to review two peers' projects; one possible assignment scheme is sketched after the list below. The reviewing process included two parts:

1. Students rated and commented upon each project in a paper system according to the rubric (Dodge, 2001), which included 13 marking items ranging from the overall aesthetics of the website to each critical attribute of a WebQuest activity. For each marking item, there were three levels of performance indicators and corresponding points to help students interpret peers' performance and provide feedback. Each student reviewed two peers' work and received feedback from two peers on his or her own performance.

2. Students revised their WebQuests according to peer feedback.
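The article does not say how PASS pairs reviewers with projects, only that each student reviewed two peers and was reviewed by two. One simple scheme with exactly that property is a shuffled circular assignment, sketched below in Python; the function and variable names are ours, not part of PASS.

```python
import random

def assign_reviewers(student_ids, reviews_per_student=2, seed=None):
    """Randomly assign each student to review `reviews_per_student` peers.

    Shuffling the roster and giving each student the next k students in
    circular order guarantees that everyone writes and receives exactly
    k reviews and that nobody reviews his or her own project.
    """
    if len(student_ids) <= reviews_per_student:
        raise ValueError("Need more students than reviews per student.")
    roster = list(student_ids)
    random.Random(seed).shuffle(roster)
    n = len(roster)
    return {
        roster[i]: [roster[(i + k) % n] for k in range(1, reviews_per_student + 1)]
        for i in range(n)
    }

if __name__ == "__main__":
    for reviewer, assessees in assign_reviewers([201, 202, 203, 204, 205], seed=42).items():
        print(f"Student {reviewer} reviews the projects of {assessees}")
```

Because this is a permutation-based rotation rather than independent random draws, no balancing pass is needed afterwards.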

Instructors and students generally acknowledged peer assessment as a valuable process that fosters students' critical thinking skills and promotes meaningful learning. However, two issues associated with this system, anonymity and administrative workload, hindered its widespread acceptance in this course. In a paper-based system, assessors' and assessees' identities could be revealed easily. Potential biases such as friendship, gender or race could cause students to rate good performance down or poor performance up, undermining the reliability and validity of peer assessment. Instructors were also concerned about the excessive workload necessary to maintain the feedback distribution system. Other studies have also reported a substantial workload for managing the peer assessment process: Hanrahan and Isaacs (2001) reported more than 40 person-hours of documentation work in a similar peer assessment study with classes totaling 244 students.


To overcome the anonymity and administrative workload issues and to facilitate peer assessment in this course, we designed and built PASS, a technology-mediated system. In this system, data collection is automated and summarized. Students and instructors have immediate access to reviews as soon as they are generated. The whole process can be conducted anonymously and managed via the internet.

PASS

PASS is a database-driven website developed to facilitate the peer assessment process. The system was first implemented in one section of the course in the summer of 2004. In the fall of 2004, four more sections adopted the procedure. As a multi-user system supporting student interaction, PASS contains separate interfaces for instructors and students (Figure 1).
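The article describes PASS as a database-driven website but does not publish its data model. As a rough illustration of the kind of schema such a system needs, here is a minimal SQLite sketch; every table and column name is our assumption, not PASS's actual design.

```python
import sqlite3

# Hypothetical minimal schema for a PASS-like system: students, their
# projects (coded by number so reviews stay anonymous), and rubric reviews.
SCHEMA = """
CREATE TABLE student (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    email   TEXT NOT NULL UNIQUE,
    section TEXT NOT NULL
);
CREATE TABLE project (
    code       INTEGER PRIMARY KEY,                 -- anonymous review code
    student_id INTEGER NOT NULL UNIQUE REFERENCES student(id),
    url        TEXT NOT NULL                        -- WebQuest web address
);
CREATE TABLE review (
    reviewer_id  INTEGER NOT NULL REFERENCES student(id),
    project_code INTEGER NOT NULL REFERENCES project(code),
    item         INTEGER NOT NULL CHECK (item BETWEEN 1 AND 13),  -- 13 rubric items
    score        INTEGER NOT NULL,
    comment      TEXT,
    PRIMARY KEY (reviewer_id, project_code, item)   -- one row per reviewer, project, item
);
"""

def create_db(path=":memory:"):
    """Create the database and return an open connection."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn

if __name__ == "__main__":
    conn = create_db()
    print([row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")])
```

Keyed this way, a review row is unique per reviewer, project and rubric item, which matches the one-form-per-project workflow described below.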

[Figure 1. Student interface and instructor interface in the Peer Assessment Support System. After logging in, the student interface lets students view the marking criteria, review and provide feedback on peers' projects, edit their ratings and comments, and view peer feedback on their own project. The instructor interface supports two roles: as administrators, instructors manage student accounts; as instructors, they monitor student peer assessment, view the detailed reviews created by each student, and view the detailed peer feedback on each student's own project.]

When we designed the student interface, we were trying to create a format and structure supporting each step of the peer assessment process. This scaffolding guided students through the process from a basic understanding of the rating rubric, to application of the rubric, to the improvement of projects using peer feedback.

[Figure 2. Peer assessment page in the student interface. The page presents the four steps: view the marking criteria to understand the basic elements of good performance and how to apply them in project assessment; rate and comment on two randomly assigned web projects according to the criteria; edit your rating and comments after submission; and view the instantly summarized score and feedback for your own project. Links: Example Projects, Need Help?, Why Peer Assessment?]

Once students logged onto PASS with a username and password, they were directed to the peer assessment page (Figure 2). The four critical attributes of peer assessment are presented on this page:

1. Students could view the marking criteria (Dodge, 2001), which helped them understand what the basic elements of good performance are and how each project should be evaluated.

2. Students had access to two randomly assigned web projects from peers. A form was provided for students to rate and comment on each project. Scores and comments were instantly summarized and transmitted to a database once they were generated.

3. Since this is a multi-level learning process, we believed that students would gain a deeper understanding of the project and rubric after they applied the rubric to rate peers' projects. To accommodate this need, a "Peer Review Editing" function allowed reviewers to go back and modify their grading and comments.

4. Students had instant access to the feedback for their own projects, which provided suggestions for further improvement. (A minimal sketch of this submit, edit and summarize behavior follows this list.)
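Items 2 through 4 amount to an upsert-and-aggregate workflow: a stored rating can be overwritten later, and feedback summaries are computed the moment data arrives. A minimal in-memory sketch of that behavior follows; the names are hypothetical, not PASS's actual code.

```python
from statistics import mean

# reviews[(reviewer_id, project_code)] -> {rubric_item: (score, comment)}
reviews = {}

def submit_or_edit_review(reviewer_id, project_code, item, score, comment=""):
    """Store one rubric item's rating; resubmitting the same (reviewer,
    project, item) overwrites the earlier entry, i.e. the editing function."""
    reviews.setdefault((reviewer_id, project_code), {})[item] = (score, comment)

def feedback_summary(project_code):
    """Aggregate every review of one project: per-reviewer total scores plus
    the mean total, available as soon as a review is stored."""
    totals = {
        reviewer: sum(score for score, _ in items.values())
        for (reviewer, code), items in reviews.items()
        if code == project_code
    }
    return {"per_reviewer_total": totals,
            "mean_total": mean(totals.values()) if totals else None}

if __name__ == "__main__":
    submit_or_edit_review(201, 198, item=1, score=2, comment="Clear layout")
    submit_or_edit_review(201, 198, item=1, score=3, comment="Very clear layout")  # an edit
    submit_or_edit_review(203, 198, item=1, score=2)
    print(feedback_summary(198))  # {'per_reviewer_total': {201: 3, 203: 2}, 'mean_total': 2.5}
```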

Using the student interface, students performed two specific roles: assessor and assessee. As assessors, they rated and commented upon two web projects according to the marking criteria. As assessees, they had instant access to the feedback provided by peers on their own projects.

The instructor interface (Figure 1) was designed to enable instructors to perform two roles: administrator and instructor. As administrators, instructors had the access and authority to overwrite student account information; as instructors, they could monitor the peer assessment process. For each student, instructors had access to the detailed records of the two reviews created by that student as well as the feedback the student's project received from two peers (Figures 3 & 4). The separate instructor interface was designed to provide efficient access to the information needed to manage and monitor the peer assessment process.

[Figure 3. Student record page in the instructor interface, showing a student's name, email, password, section, project web address, assigned reviewee IDs, and the IDs of the peer assessments the student has completed and received.]

PASS has the following major merits:

1. Assuring anonymity: Students were instructed not to input any personal information into their projects. Students' projects were coded as numbers for review, so identities were not revealed during peer assessment. (One possible coding scheme is sketched after this list.)



2. Reducing management workload: All the data was summarized automatically and transmitted from users' computers to a database. The WebQuests, supporting structure and relevant reviews were all available in one location via the web.

3. Stimulating students' interaction: Once assessment data was submitted, students and instructors had immediate access through the PASS web pages, which encouraged students' engagement and stimulated their interaction.

4. Focusing on quality criteria: To assess peers' projects, students needed to get a deep grasp of what was required for good performance.

5. Rethinking projects: Feedback from peers provided another opportunity to reconsider the quality of students' own work and fostered further enhancement.
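The article notes that projects were coded as numbers but not how the codes were produced. One illustrative approach, with hypothetical names, is to draw unique random codes and keep the code-to-student mapping in a table that only instructors can see.

```python
import random

def assign_anonymous_codes(student_ids, low=100, high=999, seed=None):
    """Give each student's project a unique random code in [low, high].

    Reviewers only ever see the code, so ratings cannot be biased by
    knowing whose project is being graded; the returned mapping stays
    on the instructor side of the system.
    """
    if high - low + 1 < len(student_ids):
        raise ValueError("Code range is too small for the roster.")
    codes = random.Random(seed).sample(range(low, high + 1), len(student_ids))
    return dict(zip(student_ids, codes))

if __name__ == "__main__":
    code_of = assign_anonymous_codes(["Chang", "Lee", "Ortiz"], seed=7)
    print(code_of)  # instructor-only table mapping students to review codes
```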

[Figure 4. Student assessment details in the instructor interface, showing a project's rubric items with the ratings and comments the student received from a peer.]

Effectiveness of PASS

Thirty-two of 36 students randomly selected from the four sections of the course utilizing PASS responded to a post-assessment survey replicated from a previous study (Lin, Liu, & Yuan, 2002). The survey consisted of two parts. The first part was a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). The second part consisted of two open-ended questions about students' likes and dislikes. Analysis of the data indicated that students generally had a positive attitude toward the peer assessment process facilitated by PASS (Table 1).

Item                                                                Min  Max   SD   Mean
 1. I am content with my own work.                                   3    5   .66   4.38
 2. I learn more from peer assessment than from traditional
    teacher assessment.                                              1    5   .91   3.41
 3. The procedures on how to do peer assessment are clearly
    outlined.                                                        2    5   .71   4.41
 4. Peer assessment is a worthwhile activity.                        1    5   .84   4.06
 5. Peers have adequate knowledge to evaluate.                       2    5   .63   3.72
 6. I benefited from peers' comments.                                1    5   .86   4.19
 7. The peers' comments on my work were fair.                        3    5   .77   4.16
 8. Peers can assess fairly.                                         3    5   .72   4.06
 9. I have benefited from marking peers' work.                       1    5   .02   3.84
10. I took a serious attitude towards marking peers' work.           3    5   .55   4.56
11. I felt that I was critical of others when marking peers' work.   2    5   .97   3.97

Table 1. Minimum, maximum, standard deviation and mean in the post-assessment survey
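The columns of Table 1 can be reproduced from raw Likert responses with a few lines of standard statistics. The sketch below is a generic illustration, not the authors' analysis code, and the responses in the example are invented; note also that packages differ on whether SD uses the sample (n-1) or population (n) denominator.

```python
from statistics import mean, stdev

def summarize_item(responses):
    """Minimum, maximum, sample standard deviation and mean for one
    survey item, given a list of 1-5 Likert responses."""
    return {
        "min": min(responses),
        "max": max(responses),
        "sd": round(stdev(responses), 2),   # sample SD (n - 1 denominator)
        "mean": round(mean(responses), 2),
    }

if __name__ == "__main__":
    # Invented responses from 32 students for a single item.
    item = [5, 4, 4, 5, 3, 4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 3,
            4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 1, 4, 5, 4, 4, 5]
    print(summarize_item(item))
```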


In answering the first open-ended question, "Please specify what you like most in this peer assessment procedure," students acknowledged the positive impact of this peer assessment process on understanding the content area and improving project quality. Students also noted the comfort brought by anonymous marking and instant feedback.

For the second open-ended question, "How would you change this peer assessment procedure? And why?" some students would have liked more than two peers rating their projects, as they found it difficult to decide what to do when two peers gave them conflicting comments. Some students asked for more critical and constructive feedback. A number of students stated their satisfaction with the procedure and expressed no desire for change.

Instructors found this system valuable in improving student learning and building students' positive perceptions of peer assessment. At the same time, instructors recognized a substantial reduction in management workload. All the data was automatically summarized by the system, and students and instructors had instant access to it as soon as it was generated.

The initial implementation of PASS was a success in general. In the summer and fall of 2004, more than 100 students used this system to facilitate peer assessment, and the system can easily be scaled to an even larger number of users. Since the system provided a format and structure for project review, the peer assessment process was carried out smoothly, with only minimal problems such as lost passwords. We did realize that an automatic email response system for solving such problems would reduce instructor time. Another issue we noticed was timing. In this peer assessment process, timing was critical: each step was linked to the others and indispensable. For example, students needed to understand the rubric and possess basic assessment skills before they could evaluate peers' work, and they needed to complete peer assessments in a timely manner for their peers to receive feedback and improve their projects. Delaying or missing any of the steps would cause problems and disturb the smooth operation of the peer assessment process. PASS could be developed further by adding email reminders that alert students to deadlines at each critical step; a sketch of such a reminder follows below. In the instructor interface, instructors could oversee individual students' peer assessment progress; adding a view of overall student activity could also be helpful. Overall, compared to a paper-based system, PASS is certainly promising and provides clear advantages.
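The email reminders the authors propose could be built on any scheduled job that checks who has not finished the current step. A minimal sketch using Python's standard library follows; the SMTP host, addresses, step names and dates are placeholders, not details of PASS.

```python
import smtplib
from datetime import date
from email.message import EmailMessage

# Placeholder deadlines for the critical steps of one assessment cycle.
DEADLINES = {
    "upload WebQuest": date(2004, 10, 1),
    "submit peer reviews": date(2004, 10, 8),
    "revise project from feedback": date(2004, 10, 15),
}

def send_reminders(pending_emails, step, smtp_host="localhost"):
    """Email every student who has not yet completed `step`."""
    deadline = DEADLINES[step]
    if date.today() > deadline:
        return  # past due; escalation to the instructor could go here
    with smtplib.SMTP(smtp_host) as smtp:
        for address in pending_emails:
            msg = EmailMessage()
            msg["From"] = "pass-noreply@example.edu"
            msg["To"] = address
            msg["Subject"] = f"PASS reminder: {step} due {deadline:%b %d}"
            msg.set_content(
                f"Our records show you have not yet completed '{step}'. "
                f"Please finish by {deadline:%B %d} so your peers receive "
                "their feedback on time."
            )
            smtp.send_message(msg)

if __name__ == "__main__":
    # Requires a reachable SMTP server; shown for illustration only.
    send_reminders(["student@example.edu"], "submit peer reviews")
```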

Project Contributors:

• Lan Li, University of Nebraska at Lincoln, instructor and doctoral student

• Allen L. Steckelberg, University of Nebraska at Lincoln, Assistant Professor

• Marjorie Bisbee, University of Nebraska at Lincoln, College of Education and Human Sciences, Technology Specialist

Lan Li is an instructor and doctoral student at the College of Education and Human Sciences at the University of Nebraska-Lincoln. She received her master's degree in instructional technology from the University of Nebraska in 2002.

Allen L. Steckelberg is an assistant professor in the Department of Teaching, Learning and Teacher Education in the College of Education and Human Sciences at the University of Nebraska-Lincoln. Dr. Steckelberg coordinates graduate and undergraduate instructional technology programs within the College. He received his master's degree from the University of Nebraska in 1978 in special education and his Ph.D. from the University of Nebraska in 1992 in the area of psychological and cultural studies. His areas of teaching and research include teacher education, technology in education, web-based instruction and educational management, and paraprofessionals in school programs. He has served as director, principal investigator or co-principal investigator on 16 federally sponsored grants representing $4.9 million in external funding. These projects have produced a number of widely used web-based training and educational management resources.

References

Dodge, B. (2001). WebQuest rubric. Retrieved November 20, 2004, from http://webquest.sdsu.edu/webquestrubric.html

Hanrahan, S. J., & Isaacs, G. (2001). Assessing self- and peer-assessment: The students' views. Higher Education Research & Development, 20(1), 53-70.

Lin, S. S. J., Liu, E. Z. F., & Yuan, S. M. (2002). Student attitudes toward networked peer assessment: Case studies of undergraduate students and senior high school students. International Journal of Instructional Media, 29(2), 241-254.
