
The use of an Electronic Voting System to enhance student feedback

James H. Davenport¹, Alan Hayes¹, and Nitin R. Parmar²

¹ Department of Computer Science, University of Bath, Bath BA2 7AY, United Kingdom
{J.H.Davenport,A.Hayes}@bath.ac.uk

² e-Learning Team, Learning & Teaching Enhancement Office, University of Bath, Bath BA2 7AY, United Kingdom
[email protected] | http://go.bath.ac.uk/nitin

Abstract. In this paper we present how we have adopted e-Learning technologies in order to enhance our existing practice of both responding to feedback from students and providing them with meaningful, formative feedback. Our method involves the integration of an Electronic Voting System (EVS) into our learning and teaching approach. The integration reflects our emphasis on the higher-order learning skills required of our students. We present how we have applied this approach, in a collaborative learning context, to the tutorial components of a large final year unit on our B.Sc.(hons) Computer Science programme. An EVS is normally thought of as a ‘multiple-choice’ system, which might seem, superficially, inappropriate for the deep learning intended in a final year unit. We have found that, in a formative feedback setting, it is possible to set questions for which there is not necessarily a ‘right’ answer, and which can therefore provoke greater reflection and debate. We describe how we have adopted the EVS to facilitate such feedback. We present a qualitative reflection and perspective from the teaching team on how blending this technique with our traditional approach has modified the preparation, delivery and experience of our tutorial sessions. Additionally, we present the results of surveying our students, together with their reflections upon experiencing the adoption of this technology within the tutorial context. We highlight the peer and collaborative engagement experienced by the student cohort. Finally, we present our conclusions and our plans for further work in this area, with a view to enhancing the feedback process.

1 Introduction

Wishing to enhance student engagement with the material being taught is nothing new, but pressures on staff–student ratios on the one hand, and the National Student Survey (at least in the U.K.) on the other hand, have brought this issue into greater prominence.

Hence the authors tried an experiment with an Electronic Voting System (EVS; referred to as an Audience Response System in the local context). Their respective roles were as follows.

JHD The lecturer for the course chosen for the experiment. An experienced lecturer, who frequently presents off his laptop. Has used PowerPoint before, even though it’s not his tool of choice.

AH The Director of Studies for the course. He persuaded JHD to try the experiment, and attended all the classes at which it was conducted, not least in order to hand out and collect the EVS equipment.

NRP The Learning Technologist who, as part of the e-Learning team, is leading the institutional pilot of the EVS.

In the rest of this paper, we discuss the pedagogic context of our use of EVS, the way we tried to engender deep learning, the costs of the system, and the student attitude to the experiment. It is worth noting that the students did not see this as an experiment, merely as a new way of teaching. We finish with some conclusions and open questions. Details of the local context, which may help the reader decide if these findings are relevant to their own context, are in the appendices.

2 Pedagogic Context

Details of the pedagogic and technological infrastructure are given in the appendices: here we focus on the context.

The module in question is “Advanced Networking”, taught to final year undergraduates³ in the first semester, being worth 20% of that semester, and 7% of the overall degree. The course is 100% assessed by examination, and has to compete with the first tranche of the final-year project (worth 21% of the overall degree), and with other modules, most of which have coursework, ranging up to 75% of the module assessment. Hence students find it difficult to focus on the networking module. The course is taught in October–December, but the examination is in January, which also leads to an “I’ll catch up with it over Christmas” effect.

The course is heavily based on [10], which, while an excellent text⁴, is heavily factual. It is very easy for students to get into the habit of coming to lectures, listening to the lecturer, whose slides are all from the book or on the web⁵, and not engaging with the deeper side of the material.

³ In terms of the Bologna process [3], “First Cycle” students.
⁴ JHD has reviewed many alternative texts over the years, but has stuck with Stevens.
⁵ They are on the web in response to previous student demand!

Much of the relevant pedagogic literature (e.g. [1, 6]) focuses on the use of EVS in large introductory courses. While this is important, the aims of a first-cycle qualification include

can apply their knowledge and understanding in a manner that indicates a professional approach to their work or vocation, and have competences typically demonstrated through devising and sustaining arguments and solving problems within their field of study [3]

and JHD was initially worried that a ‘fact-based’ technology such as EVS might divert students further from the underlying principles in favour of ‘easy-to-learn’ facts, and figure 1 was an initial attempt to counteract this.

The course is taught as three hours/week, two of which are formal lectures, and the third of which is designated as a “problem class”, in which the students are supposed to ask questions. Over the years, as the course has grown from 20 students to 70, these problem classes have become more and more intimidating for the students, and the common question is “can we do past examination questions?”, which tends to deteriorate into another lecture-style explanation by the lecturer, and does little to enhance student engagement.

One possible solution is an Electronic Voting System (EVS).

2.1 Electronic Voting System Technology

At the beginning of the 2008/2009 academic year, the e-Learning team at the University of Bath chose to embark on a pilot of a TurningPoint-based EVS. The intention was to work with a range of stakeholders to identify and evaluate whether an EVS could serve as one mechanism for giving students greater feedback on their learning. The TurningPoint software, which is a Microsoft PowerPoint plugin, can give normal lecture slides greater impact. In fact, the TurningPoint EVS is more than just a simple voting system, as it allows for the collection of student feedback and the immediate sharing of the results. Further analysis of the data generated can be completed using the accompanying reporting software.

Following initial use of the EVS at the University of Bath, NRP identified five areas in which the system could be used to support learning and teaching in a face-to-face context.

– For diagnostic testing at the beginning of a lecture
– For monitoring understanding of the content by students
– For enabling the provision of immediate feedback within the context
– For keeping students actively engaged in their learning
– For promoting peer interaction and support

It should be noted that, in this experiment, the students were told at the start of the first class that the EVS was being used in anonymous mode, and purely as formative feedback for the students. This contrasts strongly with [5], who stated (after the first year)

In particular, we announced at the beginning of the semester that the keypads would count as some substantial part (15–25%) of the final grade.

2.2 Our Use of Electronic Voting System

[9] gives various uses of electronic voting systems, but, while we used (in their terminology)

– student feedback on lecturer/lecture

as a side-effect of having the system available⁶, our main goal was one they do not give directly:

– assisting students to engage with the material.

Table 1. Number of questions per session

Session  Week  Warm-up  Content  Course  Total
   1       5      1        7       5      13
   2       8      1       12       4      17
   3      11      1       22       2      25

The Electronic Voting System was used as the primary means of learning in three “problem class” sessions, in weeks 5, 8 and 11 of the class. Table 1 gives a breakdown of the number of questions asked per session. The growth in the number of content questions is noteworthy, and can be attributed to two factors.

1. Greater facility on the part of the students. Session 1 ran out of time, and some questions were deferred to session 2 — in the table they are counted as session 2. This effect would probably occur every year.

2. Greater facility on the part of the lecturer, in question setting and “answer slide” writing.

An alternative method would have been to use the EVS intermittently in all, or most, of the sessions, as described in [6]:

Modes of implementation are as varied as the instructors who use them, but typically between two and five questions are given per 50 minutes of class instruction

We chose not to do this for a variety of reasons, which seemed good at the time, even though many of them are transient.

1. The course was already under way, and it seemed to us to be too disruptive to change the formal lecturing style, whereas the problem classes had yet to get under way.

2. The lectures were barely sufficient to get through the allocated material, and JHD was reluctant to lose any time from the formal lectures.

3. Technology failures⁷ are much more readily brushed off in a problems class than in a formal lecture, at least for JHD’s style of lecturing.

4. The deployment of the EVS was logistically complicated — see section 4.2.

⁶ See the last few slides of each session.
⁷ Which in fact did not happen, but this was far from guaranteed.

The full sessions can be found at the course URL: http://staff.bath.ac.uk/masjhd/CM30078.html. We reproduce some of the slides here, in the format that the students would see them after they had attempted to answer the questions⁸. It is worth noting that they get real-time feedback, not only as to what the right answer is, but also as to how they compare with their peers. It is one thing to be wrong along with several others, another thing to be distinguished (albeit in your own mind only) by your ignorance.

⁸ However, in practice many of the slides, particularly answer slides, were phased.

3 The Questions used

Following the usual “warm-up” questions⁹, the first “real” slide and the corresponding answer slide are shown in figure 1. The main aim of this was to emphasise the points in the answer slide — the surface content of the question was largely irrelevant from this point of view. Equally, the fact that a majority of the students got it “right” was largely irrelevant to JHD.

⁹ One totally generic and one (figure 7) on the material.

[Figure 1: question and answer slides.
Question: “100 Mbps twisted pair length limit is”. Options: A. 60 metres; B. 100 metres; C. 195 metres; D. 500 metres. (58% chose the correct answer B; the rest split 14%, 16% and 12% across the other options.)
Answer slide: “The correct answer is B: 100m. However, this is the sort of factual answer: a) I don’t expect you to know, and would give you in an exam question; b) You should always check in real life, especially if equipment installation depends on it; c) As I showed in the Library, isn’t what it seems in practice.”]

Fig. 1. Simple question, but pedagogic points (Session 1, slides 7–8)

Bloom’s taxonomy [4] classifies learning into six cognitive levels (knowledge, comprehension, application, analysis, synthesis and evaluation). The taxonomy represents an increasing level of learning abstraction and difficulty, ranging from memory recall (knowledge) through to making critically informed judgements (evaluation). The qualifications framework of [8] contains descriptors that describe the learning outcome expectations for both undergraduate and postgraduate provision in the Higher Education sector. The final year of an undergraduate programme is labelled as level 6 within this framework. The learning outcomes expressed at this level make reference to outcomes that are described as “critical evaluation” and “an appreciation of the uncertainty, ambiguity and limits of knowledge”. It is the enabling of student engagement with these higher-order skills that was the focus of our experiment. Our goal was to utilise an EVS to promote this deep learning within the student.

Figure 2 is an example of how we used the EVS to engage the students at this level. The question has been designed in such a manner that there is not one correct answer. The students need to utilise their analysis and evaluation skills in order to address the question. In observing the class when this was presented, two things were immediately notable. The first was the initial silence within the class as the students began to engage with what was being asked. The second was the readiness with which the students engaged with their peers in order to discuss the issues pertinent to the question. Surveying students who had been grouped together to formulate a collective group response to questions posed within an EVS context, [1] reports that almost 80% of the student cohort felt that such discussion led to a better understanding of the subject matter. From our initial analysis, we would anticipate a similar enhancement in understanding within this deep learning context.

[Figure 2: question and answer slides.
Question: “Internet and Ethernet”. Options: A. An internet is impossible without Ethernet; B. The Internet needs Ethernet; C. Today’s Internet depends on Ethernet; D. Totally independent concepts. (Recorded responses: 2%, 64%, 23% and 11%.)
Answer slide: “I hope that made you think. A. Certainly false. B. Was false in the beginning, and may well be false again. C. True in practice today. D. True, but not as helpful as C.”]

Fig. 2. “Deep thought” question (Session 1, slides 9–10)

It is also possible to ask questions which probe around the area ostensibly being asked about. Routing (i.e. how packets traverse the Internet) is a complex topic, and the standard terminology distinguishes between

Routing Mechanisms
Routing Policies
Routing Protocols
Routing Programs (generally called daemons).

Bearing in mind that this course is assessed 100% by examination, the only way to reinforce this point in the students’ minds in the past was to set examination questions of the form “Explain the differences between routing mechanisms, policies, protocols and programs”.

[Figure 3: question and answer slides.
Question: “The system administrator decides on the routing mechanism”. Options: 1. True; 2. False. (56% answered True, 44% False.)
Answer slide: “False. There is only one routing mechanism: see Stevens sections 9.1, 9.2. Who confused routing mechanism with policy?”]

Fig. 3. A question with a deeper point (Session 1, slides 15–16)

In the EVS context, we can proceed slightly differently, as illustrated in figure 3. It is, from the data presented, impossible to know why a (small) majority of students got the wrong answer, but, judging from the sheepish looks when the last bullet point of the answer slide was revealed, many did indeed have that confusion.
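
For readers outside networking, the distinction the question tests can be made concrete in a few lines of code: the mechanism is simply longest-prefix matching on a forwarding table, while policy (along with protocols and daemons) merely determines what the table contains. The following minimal Python sketch is ours, not part of the course materials, and the example forwarding table is hypothetical.

    import ipaddress

    def route_lookup(table, destination):
        """The single routing mechanism: longest-prefix match on a
        forwarding table. Policy only decides what goes into the table."""
        dest = ipaddress.ip_address(destination)
        matches = [(ipaddress.ip_network(prefix), next_hop)
                   for prefix, next_hop in table
                   if dest in ipaddress.ip_network(prefix)]
        if not matches:
            raise LookupError("no route to host")
        # The most specific (longest) matching prefix wins.
        return max(matches, key=lambda m: m[0].prefixlen)[1]

    # Hypothetical forwarding table: (prefix, next hop).
    table = [("0.0.0.0/0", "default-gateway"), ("138.38.0.0/16", "campus-router")]
    print(route_lookup(table, "138.38.32.14"))  # -> campus-router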

It is perfectly possible to ask questions in this context which have no right answer: figure 4 is one such example from the first session, and figure 5 from the second. This idea is not original to us: [5] write

In more recent applications, we have experimented with partially correct answers and with even more than one correct answer.

Here we discovered that it is important for the lecturer not to amplify the question, but rather say “ask your neighbour — what do you think?”.

[Figure 4: question and answer slides.
Question: “The manager of an autonomous system in the Internet”. Options: A. Can do whatever he likes; B. Must implement the routing policy of whoever his AS is connected to; C. Must satisfy the constraints of whoever his AS is connected to; D. Depends on whether it’s stub/transit. (Recorded responses: 30%, 23%, 40% and 7%.)
Answer slide: “A. True, but he may not stay connected to the Internet for very long. B. It would be rare to be told what routing policy to implement. C. Yes, such constraints must be satisfied. D. In theory no, but in practice a transit AS will have many more constraints.”]

Fig. 4. A question with no right answer (Session 1, slides 17–18)

Between A and PTR records

The D

NS is a

lways c

...

Consi

ste

nt except fo

r...

Mig

ht w

ell b

e inconsi..

.

Consi

ste

ncy

is m

ean...

Consi

ste

ncy

is u

nde...

14%

27%

20%18%

20%

A. The DNS is always consistent

B. Consistent except for deliberate fraud

C. Might well be inconsistent

D. Consistency is meaningless

E. Consistency is undecidable

A “thinking” question: C or E

A. Certainly not true

B. Fraud can certainly work by adding inconsistencies, but it’s not the only way

C. DNS maintainers can easily screw up!

D. An abstract definition is possible

E. But it requires testing infinitely many possible cases. Also the DNS is distributed, so untestable.

Fig. 5. Another question with no right answer (Session 2, slides 15–16)
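
For the curious reader, the (in)consistency that figure 5 probes is easy to observe empirically: a forward (A) lookup followed by a reverse (PTR) lookup need not return to the starting name. The sketch below is ours rather than course material; it uses Python’s standard socket module, and the hostname in the usage comment is purely illustrative.

    import socket

    def check_forward_reverse(hostname):
        """Forward-resolve a name, then reverse-resolve the resulting
        address, and report whether the PTR record agrees with the name.
        Nothing in the DNS guarantees that it will."""
        addr = socket.gethostbyname(hostname)  # A record lookup
        try:
            ptr_name, _aliases, _addrs = socket.gethostbyaddr(addr)  # PTR lookup
        except socket.herror:
            return addr, None, False  # no PTR record at all
        return addr, ptr_name, ptr_name.lower() == hostname.lower()

    # e.g. check_forward_reverse("www.example.org")  -- illustrative hostname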

The examples above serve to illustrate our approach to facilitating deep learning within the students. We have essentially adopted two distinct mechanisms. The first is to present the student with a question for which there is not a single correct answer. In this context the deep engagement comes through the students deliberating upon several distinct, possible and correct responses to the question being asked. The second is to present the students with a question for which there is only one correct solution. However, the question is posed in such broad terms that, in order to determine the correct answer, the students need to employ the higher-order skills of analysis, evaluation and ambiguity resolution to contextualise, choose and apply the technological principles pertinent to the question. In adopting a blended approach which encompassed both types of mechanism, we have observed the students engage in deep learning and peer-supported discussion. We believe that this has led to an enhanced learning experience for our students.

4 Costs of the EVS

4.1 Central Costs

The startup cost for such an initiative is not insubstantial. A single USB RF receiver was costed at an average of £220, with the clickers costed at £35 per unit¹⁰. The total capital expenditure was therefore £8100 (five receivers at £220 plus 200 clickers at £35; see appendix B).

On average, NRP works as a 0.3 Full-Time Equivalent (FTE) on the EVS project, and has spent substantial time in creating a structure for the implementation of the project. Whilst much of the support and training material for the TurningPoint hardware is available to lecturers through the project website, some face-to-face sessions have also been arranged. Hence, even in one year, the staff cost here exceeds the capital cost.

¹⁰ This contrasts with the $CAN30 ≈ £17 at Waterloo. There may be a psychological break point here in terms of student purchase of clickers, since in the UK context it is no longer obviously “cheaper than a book” [2].

The e-Learning team use the EVS (for diagnostic testing and evaluation) in all of their staff development workshops, highlighting one of the contexts in which the EVS can be used. Individual training sessions with staff typically last between 30 and 60 minutes, in which, in addition to some basic instruction on the component parts of the EVS and the software, some pedagogical issues of using the technology are also addressed.

4.2 Costs to the Lecturer

The initial learning curve was not too bad: four hours were spent installing the PowerPoint add-on on JHD’s laptop, writing a few test questions, and then running through them with NRP and the equipment, to check that they worked and to correct a few misconceptions.

Each 50-minute session (which was already timetabled, and would have been spent on something else were it not for the use of EVS) required about five hours of preparation in terms of writing the questions. One question to which we do not yet know the answer is how re-usable these questions are. Some probably are, but in view of the fact that the material has to be topical, the re-use will be significantly less than total, say 50%.

In addition, AH handed out and collected the EVS equipment. JHD was already timetabled for a different class preceding this one, and it would have been physically impossible for him to do both classes and also collect and issue the equipment. Similarly, it would have been very difficult for him to collect up all the equipment and return it, as well as give a subsequent lecture. Hence, at least in the University of Bath context, it seems that use of this equipment either requires an assistant (AH in this case) or blocking out the lecturer for the hours¹¹ either side of the class.

¹¹ Of course, most of each hour would be available for other work, since the EVS would consume only 10 minutes or so either side of the class, but the point is that another timetabled teaching hour would not be possible.

5 Student Opinion

The last few slides of each EVS-enabled class were used to gain student opinion (as has been remarked elsewhere [6], EVS are a simple way of doing so, especially if, as in our case, the devices are anonymous). Opinion at the end of the first session was clearly positive; see figure 6.

Following the first session of EVS use, a short online survey was conducted by JHD in conjunction with the e-Learning team, in an attempt to gauge student reaction. Reaction was generally positive, with one student commenting that: “...it gives a good view on how the rest of the class is doing in comparison and lets you know how much harder you should be working”. Another student added: “It was useful to be able to see my answer in comparison to other people’s. This gave me an easy way to benchmark my learning against others to see how I was doing on the course.”

Further comments were noted in the end-of-unit evaluation, which students were once again asked to complete online. Several commented on the fact that the EVS had been used to support problem classes, and said they would support further embedding of the technology within their learning.

[Figure 6: two student-opinion polls.
(a) “What about this technology”. Options: A. Revolutionises problem classes; B. Useful extra, but shouldn’t dominate; C. OK 1-2 times/term; D. Never again; E. Abstain. (Recorded responses: 66%, 32%, 0%, 0% and 2%.)
(b) “I wish this Audience Response System had been used earlier (i.e. in previous years)”. Options: 1. Strongly Agree; 2. Agree; 3. Neutral; 4. Disagree; 5. Strongly Disagree. (Recorded responses: 62%, 26%, 2%, 0% and 10%.)]

Fig. 6. Student Opinion: (a) Session 1, slide 25; (b) Session 2, slide 33

[Figure 7: question and answer slides.
Question: “An Ethernet address is how long?”. Options: A. 4 bytes; B. 6 bytes; C. 16 bytes; D. Variable length. (Recorded responses: 13%, 9%, 58% and 20%.)
Answer slide: “The answer is B (6 bytes). If you didn’t get that, you’re pretty confused: re-read sections 2.2 and 3.2 of the book.”]

Fig. 7. A factual question displaying lack of comprehension (Session 1, slides 5–6)

6 Conclusion and Further Questions

We draw the following conclusions from this usage of EVS.

– It is possible to convert a relatively sceptical lecturer (JHD) into a strong user of this system in the appropriate context.

[Figure 8: question and answer slides.
Question: “When it comes to END characters, PPP”. Options: 1. Is better than SLIP, as it doesn’t use one; 2. Has one, which can’t be used in IP; 3. Allows it in the middle of IP packets; 4. Allows it anywhere in IP packets. (Recorded responses: 21%, 29%, 14% and 36%.)
Answer slide: “The correct answer is D: anywhere in IP packets. Since PPP doesn’t have a length field, it must have some END marker. IP can contain arbitrary binary data, so B and C can’t be right. If there is an END character in the IP packet, it is ‘escaped’ by the sending PPP, and re-interpreted by the receiver. See figure 2.2.”]

Fig. 8. Another question displaying lack of comprehension (Session 1, slides 11–12)
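
The ‘escaping’ mentioned in the answer slide of figure 8 is ordinary byte-stuffing, as in the HDLC-like framing of RFC 1662: the flag byte 0x7E delimits frames, and any occurrence of the flag or escape byte inside the payload is sent as 0x7D followed by the byte XORed with 0x20. The sketch below is our own minimal illustration (real PPP may also escape further control characters), not part of the course materials.

    PPP_FLAG = 0x7E  # frame delimiter -- the "END" character of the question
    PPP_ESC = 0x7D   # control-escape byte

    def ppp_stuff(payload: bytes) -> bytes:
        """Byte-stuff a payload so the receiver can find the real frame ends."""
        out = bytearray()
        for b in payload:
            if b in (PPP_FLAG, PPP_ESC):
                out += bytes((PPP_ESC, b ^ 0x20))  # escape, then flip bit 5
            else:
                out.append(b)
        return bytes(out)

    def ppp_unstuff(framed: bytes) -> bytes:
        """Invert ppp_stuff on the receiving side."""
        out, escaped = bytearray(), False
        for b in framed:
            if escaped:
                out.append(b ^ 0x20)
                escaped = False
            elif b == PPP_ESC:
                escaped = True
            else:
                out.append(b)
        return bytes(out)

    data = bytes([0x45, PPP_FLAG, 0x00, PPP_ESC])  # an "END" mid-packet is fine
    assert ppp_unstuff(ppp_stuff(data)) == data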

– This technology does enable the lecturer to gauge levels of misapprehension, as seen in figures 7 and 8, in a way that would otherwise be hard in an examination-only course of this size.

– It is possible to use EVS to help students with deeper points than mere factual knowledge.

– When used at this level, the lecturer has to be prepared to improvise. Figure 8 led to a five-minute explanation of the points in the answer slide.

– The students like it (this is certainly not a new point, but it is an important one).

There are various further questions that have arisen.

1. Our usage of the EVS was based firmly around “one student: one handset”. Although we observed student discussion (see the discussion of figure 2 above), our usage was not set up for this. [7] report strong benefits from more formal peer discussion, and we should investigate this.

2. Although popular with the students who attended (about 42 in each session, and apparently with a strong commonality), there were 71 students on the course, and therefore 29 who did not take up these problem classes (but who were largely at the formal lectures). The total anonymity did not permit any correlation of examination results with attendance at the EVS sessions, and this was a distinct experimental drawback.

There is one point of relevance to institutional policy that the first two authors would like to make. The logistic difficulties of issuing and collecting the EVS (section 4.2 — we are referring to the student handsets) did constrain the use of the EVS to intensive use in a small number of sessions, rather than being integrated throughout. Though it is not totally clear, it seems very likely that the facility of [7] was fitted with the relevant equipment. The University of Waterloo (Ontario) gets¹² students to buy such handsets: they can be sold on to the next generation in the same way that textbooks are, and through the campus bookshop. The handsets are about $CAN30 each [2], and less than 5% of students cite ‘cost’ as a reason for not continuing with this technology. If there is to be an institutional commitment to serious use of this technology, then one route or the other probably has to be followed to reduce the logistic difficulties.

¹² The motivation we know about [2] is that EVS work is assessed, at 1 point for answering at all, and 1 point for the correct answer.

References

1. M. Bade, D. Drane, Y.B.-D. Kolikant, and M. Schuller. A Clicker Approach to Calculus. Notices A.M.S., 56:253–265, 2009.
2. B.W. Becker. Private Communication. 20 February 2009.
3. Bergen Conference of European Ministers Responsible for Higher Education. The framework of qualifications for the European Higher Education Area. http://www.bologna-bergen2005.no/EN/BASIC/050520_Framework_qualifications.pdf, May 19–20, 2005.
4. B. Bloom. Taxonomy of Educational Objectives: the Classification of Educational Goals. Longman, London, 1956.
5. R.A. Burnstein and L.M. Lederman. Using wireless keypads in lecture classes. Phys. Teach., 39:8–11, 2001.
6. J.E. Caldwell. Clickers in the Large Classroom: Current Research and Best-Practice Tips. Life Sciences Education, 6:9–20, 2007.
7. D. Nicol and J. Boyle. Peer Instruction versus Class-wide Discussion in Large Classes: a comparison of two interaction methods in the wired classroom. Studies in Higher Education, 28:457–473, 2003. http://dx.doi.org/10.1080/0307507032000122297.
8. The Quality Assurance Agency. The Framework for Higher Education Qualifications in England, Wales and Northern Ireland. http://www.qaa.ac.uk/academicinfrastructure/FHEQ/EWNI08/FHEQ08.pdf.
9. V. Simpson and M. Oliver. Using Electronic Voting Systems in Lectures. http://www.ucl.ac.uk/learningtechnology/assessment/ElectronicVotingSystems.pdf, 2002.
10. W.R. Stevens. TCP/IP Illustrated, Volume 1: The Protocols. Addison-Wesley, 1994.

A Pedagogic Infrastructure at Bath

The University of Bath is physically a compact¹³ 1960s campus, with some later additions. “General Teaching Area” (GTA) is a scarce resource¹⁴, centrally controlled and timetabled, and it is not unusual for a course with three timetabled hours to have them in three different rooms in three different buildings. The lecture theatres are very traditional, and we do not have the benefits of the adaptation described in [7, p. 460] to facilitate group discussion. Audio Visual (AV) support is provided by an organisationally central team, physically centrally located, which maintains and issues the EVS hardware. Technical and pedagogical support is provided by members of the e-Learning team, led by NRP.

¹³ All but one of the teaching spaces are within an easy five-minute walk of each other.
¹⁴ The University is 12th highest out of 132 to report teaching area allocation, with an occupancy rate of 74%, as against a U.K. median of 53%.

Timetabling is in hour-long units, and a “one hour lecture” is actually 50 minutes long, with ten minutes allocated for the change-over, which in practice tends to be relatively tight.

B Technical Infrastructure at Bath

The e-Learning team opted to purchase the TurningPoint ResponseCard RF EVS¹⁵. This radio-based option is perfect for small and large groups alike. With an operating range of 60 metres, it is ideal for a variety of rooms around the university campus. Unlike the IR (Infra-Red) model, the RF model does not require a ‘line of sight’ between the clicker and the USB receiver, which makes it particularly useful in larger rooms.

At the beginning of the 2008/2009 academic year, the e-Learning team purchased 5 bags of 40 clickers, with each bag containing a USB Radio Frequency (RF) receiver. A single RF receiver can support up to 1000 clickers, meaning that several bags can be used together without any problems. The hardware is shown in figure 9. The USB RF receiver is usually plugged into a PC at the front of the teaching room, which in turn is connected to a projector.

The EVS comes with a suite of software, which has two distinct purposes. On the one hand, it can be used when writing the questions, as part of a normal PowerPoint presentation. On the other hand, when run on a computer with the USB receiver, it can be used to set the questions and collect the answers. There is no logical need for the writing and setting computers to be the same, though in fact JHD used his laptop for both.

Following liaison with colleagues in BUCS (Bath University Computing Services), the software to support the EVS was installed on every Public Access Computer (PAC), such as those in the Library and Learning Centre (over 400), and on the PCs on the rostra of General Teaching Area rooms. The intention of installing the software on PACs was to enable students to create their own interactive slides if and when required. JHD in fact installed the software on his own laptop.

For staff using centrally managed PCs on their desktops, the EVS software could be installed on their PC by contacting their departmental IT supporters. In addition, the software, in both PC and Mac versions, could be downloaded from the EVS project website at http://go.bath.ac.uk/evs for installation on personal devices.

¹⁵ Further information about this hardware can be found on the Turning Technologies website: http://www.turningtechnologies.com/

Fig. 9. The hardware: USB receiver (lecturer’s use) and clicker