This presentation addresses the findings of an action research study on the use of badging within a graduate course. The course itself studied the theory behind, and the educational use of, emerging technologies. Here you will see how students responded to reviewing the work of their peers (in an anonymous manner).
Badging
Fall Academic Conference – 2013, Empire State College
Amy McQuigge ([email protected])
Eileen O’Connor ([email protected])
For the research section of the presentation: EXPLORING BADGING FOR PEER REVIEW, EXTENDED LEARNING AND EVALUATION, AND REFLECTIVE/CRITICAL FEEDBACK WITHIN AN ONLINE GRADUATE COURSE
Research questions
• Can the badging process itself extend the learning in lateral ways, that is, engage students beyond the specific learning outcomes intended within the course?
• Can the use of peer review and badges within the course serve as an example itself of an emerging conceptual framework for learning, evaluation, and motivation becoming available through the advances possible with web-based interfaces?
• Can the peer review process strengthen student connections?
• Will the process of peer review (towards the generation of badges) be used by the students in a diligent and thoughtful manner?
Methodology
• Students created 4 web-based projects in the 4 modules of the course
• In each module, students reviewed each web-based artifact against the criteria, then:
  • They voted anonymously (although with their names identified to the instructor)
  • Votes were collected through a Google Form and analyzed via an Excel Pivot Table
• Data on the voting outcome was sent to the students at the end of each module, using the detailed categories
• Aggregated data (averaging ratings over the course on all 4 artifacts) was used to assign a final badge that was delivered via a virtual meeting at the end of the semester
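The aggregation step above can be sketched in Python. This is a hypothetical illustration only: the study used a Google Form export analyzed in an Excel Pivot Table, and the exact rounding rule and the example votes below are assumptions, not the authors' actual procedure.

```python
from statistics import mean

# Tier labels matching the presentation's 1-4 badge scale.
TIERS = {1: "No Go", 2: "Pewter", 3: "Silver", 4: "Gold"}

def final_badge(votes_per_module):
    """Average all peer votes (1-4) a student received over the course's
    4 artifacts, then round to the nearest tier for the final badge.
    (Rounding to the nearest tier is an assumed rule.)"""
    all_votes = [v for module in votes_per_module for v in module]
    avg = mean(all_votes)
    tier = min(4, max(1, round(avg)))
    return TIERS[tier], avg

# Illustrative (made-up) votes for one student across the 4 modules:
badge, avg = final_badge([[4, 3, 4], [3, 3, 4], [4, 4, 3], [3, 4, 4]])
```

With the made-up votes above, the average is about 3.58, which rounds to the Gold tier.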
Different rubrics & evaluations – aligned with the different purposes
Rating scale: 0 = no evidence, 1 = little evidence . . . 10 = excellent

Criterion (% of grade / points possible):
• Creates an attractive YouTube that communicates your desired intention in a clear, non-rambling way (40% / 10)
• Has a clear central purpose that is evident in the emphasis within the video and is shown in a bulleted way within the video at some point (15% / 10)
• Uses basic editing skills with cutting, transitions, and perhaps titles / captions (25% / 10)
• Posts the YouTube using either a public or unlisted link (NOT a private link) (5% / 10)
• Puts a link to, or embeds in, your website (optional) (NA / NA)
• Reviews and comments on several colleagues’ YouTubes (5% / 10)
• Posts within the required time (10% / 10)
Total points possible = 100, against which actual points earned are recorded.
Badge levels:
• No Go (1): Won’t even make the grade for the assignment’s minimum criteria
• Pewter (2): Minimally acceptable for the assignment, but nothing noteworthy in this aspect
• Silver (3): Interesting & useful; solid display of expertise on this criterion
• Gold (4): Wow, I am learning and taking notes here – a great job; I’ll have my friends visit here too
How do the evaluations by instructor & peer compare?

Student     Instructor ranking      Peer ranking
Student 1   5 (lowest performer)    4
Student 2   2                       1 (highest performer)
Student 3   2                       3
Student 4   4                       6
Student 5   3                       1
Student 6   2                       7 (lowest performer A)
Student 7   4                       3
Student 8   1 (highest performer)   3
Student 9   3                       7 (lowest performer B)

Instructor ranking = final grade for each student; peer ranking = the average over all the evaluation criteria.
Total votes cast by badge category

Badge category   Number of votes   % in this category
1 – No Go        16                4%
2 – Pewter       43                16%
3 – Silver       190               40%
4 – Gold         181               42%

Skewed towards the high end, but still some differentiation & discrimination.
Students’ ratings of peers

Peer rating averages on the different categories for each student (rating scale 1–4, 4 highest; the original chart’s y-axis was truncated):

Category     Std1   Std2   Std3   Std4   Std5   Std6   Std7   Std8   Std9
Aesthetics   3.3    3.5    3.4    3.2    3.5    3.3    3.4    3.4    2.5
Category 2   3.1    3.5    3.4    3.2    3.6    3.3    3.5    3.1    2.4
Category 3   3.1    3.7    3.3    2.8    3.5    2.9    3.6    3.3    2.4
"Try new"    2.7    3.5    3.2    3.0    4.0    2.0    3.8    3.0    3.0
Student rating of peers – different data representation

Student   Total votes   Average vote score   Lowest vote received   Highest vote received   Std Dev
Std1      57            3.2                  1                      4                       0.9
Std2      43            3.6                  2                      4                       0.6
Std3      62            3.3                  2                      4                       0.7
Std4      44            3.0                  1                      4                       0.9
Std5      34            3.6                  3                      4                       0.5
Std6      59            3.2                  1                      4                       0.8
Std7      53            3.5                  3                      4                       0.5
Std8      43            3.3                  1                      4                       0.7
Std9      35            2.5                  1                      4                       0.9
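A minimal Python sketch of how per-student summary statistics like these could be computed from a list of raw votes. The vote list below is illustrative, and since the slide does not state whether population or sample standard deviation was used, population standard deviation is assumed here.

```python
from statistics import mean, pstdev

def vote_summary(votes):
    """Summarize the 1-4 peer votes one student received, mirroring the
    columns of the table above (population std dev is an assumption)."""
    return {
        "total": len(votes),
        "average": round(mean(votes), 1),
        "lowest": min(votes),
        "highest": max(votes),
        "std_dev": round(pstdev(votes), 1),
    }

# Illustrative (made-up) votes for a single student:
summary = vote_summary([3, 4, 4, 2, 3, 4, 3])
```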
Types of comments given by the different students

No. of votes in category given by this student (– = none recorded):

Comment type   Std1   Std2   Std3   Std4   Std5   Std6   Std7   Std8   Std9
Improvement    2      4      –      –      6      –      –      3      –
Negative       –      2      4      3      2      2      –      3      –
Positive       –      –      12     3      1      12     –      8      1
Types of comments given to the different students

No. of votes in category given to this student (– = none recorded):

Comment type   Std1   Std2   Std3   Std4   Std5   Std6   Std7   Std8   Std9
Improvement    2      –      5      2      1      1      2      2      –
Negative       1      1      –      –      1      3      –      3      5
Positive       8      6      2      5      3      5      5      2      3
Comment comparison
(The charts of comments given to each student and comments given by each student, shown side by side; the data repeats the two tables above.)
Tone of the student comments
The students’ comments were generally specific, sincere, and helpful (whether positive or negative in tone). A sample of a few comments illustrates the general tenor:
• In a YouTube comment: “Amazing! Visual, visual, visual! No matter what type of learner, there was something in the video for them. Made me want to go out and learn an instrument. [Name of student removed] is a natural speaker too. Her voice was soothing, relaxed and real.”
• “I enjoyed the last part and how it tied all the concepts together. Seeing how learning is fun and the effect of learning a second language is positive.”
• In a Facebook comment, useful and kindly written: “A little text editing, I wouldn't start the About section with ‘This is a Page about.’ I would start, ‘Exploring emerging.’”
• Not all comments were positive, but they were all supportive: “A bit long and somewhat repetitive, but would certainly appeal to its intended audience.” Or, “I liked the ‘woman on the street’ segments. The sound was a little off, I liked the concepts!”
Sampling of some positive comments

Students also often specifically indicated what they had learned, which encouraged them to expand their own work in the future:
• “Nice use of video plug in - I did it too after her example”
• “Where in SL did you find the keyboard? The address or a way to find it would be useful for your viewers who might want to try it out.”
• “I loved the pics, pet education links, and the therapy dog link. I needed this info for my dog.”
• “I learned something from viewing it about MY presentation! Totally clear what the environment is.”
• “I also liked the wallpaper. (I tried to find that and couldn't)”
• “Made me want to go out and learn an instrument”
• “I will try and follow the instructions that were detailed here. Thanks”
The range, specificity, and expanded learning suggested by the comments indicate a dedicated, invigorated group of students who are willing to support their colleagues.
Could participating in the badging process itself extend the learning in these lateral ways?
• The very different rankings in the overall web-artifact evaluations suggest that different dimensions were considered;
• An average of 20% more votes than required were cast – overall, a wide range of different artifacts were considered;
• The optional comments were rich and often indicated direct learning from colleagues.
Suggesting that learning was happening beyond the course constraints
Giving evidence of connectedness
• There were many forms of connection in the course – discussion boards, virtual synchronous meetings, virtual field trips, presentations in Second Life, shared video work via YouTube – so it is difficult to ascribe connectedness to badge reviews alone, but . . .
• Ongoing concern for colleagues seemed evident:
  • Students remembered colleagues’ audiences, giving specific references – “I enjoyed the last part and how it tied all the concepts together. You see how learning is fun and the effect of learning a second language is positive.”
  • Students took the extra time to suggest specific improvements; 15 comments, about 20% of the comments offered, focused on concrete and specific recommendations for improvement;
  • More than half of the voting scenarios included specific, useful, critical yet supportive comments (that were shared anonymously with students).
Scaffolding a conceptual framework for an emerging concept:
• As reported by Finklestein (2012) during a webinar in which he considered how instructors could “leverage digital badges to build ongoing relationships with learners, foster engagement, and acknowledge learning that often goes unrewarded or unrecognized,” he explained how the process of engagement itself within the course modeled the application of badging.
• Students did not simply read about badging; they reviewed colleagues’ work, voted on different criteria, and extended recommendations and comments. Furthermore, they observed who received the awards and on what dimensions the awards were received.
Commitment & diligence
• 20% additional voting – an extra time commitment;
• Despite high-end voting, the comments show more discrimination;
• Votes were reasonably close to colleagues’ votes on the same student.
Implications & cautions
• The process of piloting badges as a peer recognition of achievement that goes beyond course objectives appears to be of sufficient value to continue to improve, refine, and re-evaluate effective ways to embed badging in future courses.
• However, embedding a non-instructor evaluation within a course could risk the safety, security, and sense of fairness that students develop within the course.
• Instructors must be very clear that the peer review is detached from any grading consequences, should that be the case, or must explain the role of the badges and peer review within the intended course schema if the peer review is to factor into the course evaluation.