
University Chemistry Education

The Higher Education chemistry journal of the Royal Society of Chemistry

2001, Volume 5, Issue No 1
ISSN 1369-5614

Contents

Statement of Editorial Policy
Instructions for Authors

Papers
1 Generating Coursework Feedback for Large Groups of Students Using MS Excel and MS Word
Philip Denton

9 Experience with a Random Questionnaire Generator in the chemistry laboratory and for other continuous assessment
Mary R Masson

16 Student misconceptions of the language of error
Jane Tomlinson, Paul J Dyson and John Garratt

Communication
24 Using questions to promote active learning in lectures
William Byers

Perspectives
31 Why Lecture Demonstrations Are ‘Exocharmic’ For Both Students And Their Instructors
George M Bodner

36 Science and the public: Teaching about chemistry on university courses
Jim Ryder

Letters
39 Customising and Networking Multimedia Resources
Anthony Rest

40 On the need to use the Gibbs' phase rule in the treatment of heterogeneous chemical equilibria
Paolo Mirone

40 Questionable questions
John Garratt


Editorial Board

Chairman
Professor M J Pilling, University of Leeds

Editors
Professor P D Bailey, Heriot-Watt University
Dr S W Breuer, University of Lancaster

Editorial Consultant
Dr C J Garratt

Manager of the Web Board
Dr P C Yates, University of Keele

Members
Dr S W Bennett, Open University
Dr J M Gagan, Open University
Dr A Hubbard, Hurtwood House
Dr T L Overton, University of Hull
Dr A J Rest, University of Southampton
Dr R G Wallace, Nottingham Trent University

Editorial Policy for University Chemistry Education (UChemEd)

The journal is aimed at those who teach chemistry in higher education. As a journal for all practising teachers of chemistry at this level, it deals with any topic of practical relevance and use to those involved. It is a place to publish effective methods and ideas for the teaching and learning of chemistry and issues related to the effectiveness of teaching and learning. Contributions are particularly welcome if the subject matter can be applied widely and is concerned with encouraging active and independent learning, with increasing student motivation for learning, with helping them to become effective exploiters of their chemical knowledge and understanding, or with assessment. Contributions should be of clear practical interest to those who teach chemistry.

There are no hard and fast rules for subdividing manuscripts. However, an introduction should provide a clear statement of the relationship of what is described to previous work and opinion (and is likely to include some references to some aspects of educational theory), and also the overall purpose of the article (including, where appropriate, the educational objectives, intended learning outcomes and why these are not satisfactorily achieved by other approaches). Other sections may be equivalent to methods, results, and discussion as used in conventional scientific papers; these sections would describe how the work was carried out, show or illustrate the outcomes (new teaching material etc.) which have been created, and critically evaluate how far the original objectives have been met.

Four types of contributions may be submitted:

Full Papers describe a specific method of or approach to teaching, or some teaching material that has been used by the author; papers should explain the educational objectives that led to the use of the method.

Communications are brief accounts of work still undergoing evaluation and development, but of sufficient interest to merit publication.

Reviews provide for practitioners an up-to-date survey of current methods or approaches to teaching and learning and also show how these relate to our understanding of student learning.

Perspectives provide an opportunity for contributors to present a concise but in-depth analysis of a topic of general interest, with clear conclusions likely to be directly useful to other academics involved in teaching.

Letters: these are a medium for the expression of well-argued views or opinions on any matter falling within the remit of the journal.

All contributions, whether or not they were solicited, are rigorously reviewed. Referees are required to evaluate the quality of the arguments presented, and not to make subjective judgements involving their personal views of what constitutes good or effective teaching. Contributions are judged on:

(i) originality and quality of content;

(ii) the appropriateness of the length to the subject matter;

(iii) accessibility of supporting material.


University Chemistry Education

Guidelines for Authors

Submission of contributions

University Chemistry Education (UChemEd) is aimed at those who teach chemistry in higher education. As a journal for all practising teachers of chemistry at this level, it deals with any topic of practical relevance and use to those involved. It is a place to publish effective methods and ideas for the teaching and learning of chemistry and issues related to the effectiveness of teaching and learning.

1. The original contribution should be submitted electronically, preferably in Word for Windows format. Any associated diagrams should be attached in JPG or GIF format, if possible. Submissions should be made by e-mail as a file attachment, or on a floppy disk, to Dr Stephen Breuer at [email protected] or at School of Physics and Chemistry, Lancaster University, Lancaster, LA1 4YB, United Kingdom, or to Professor Patrick Bailey at [email protected] or at Department of Chemistry, Heriot-Watt University, Riccarton, Edinburgh, EH14 4AS, United Kingdom.

2. Submitted contributions are expected to fall into one of several categories. Authors are invited to suggest the category into which the work should best fit, but the editors reserve the right to assign it to a different category if that seems appropriate. These are the following.

Full papers describe a specific method of or approach to teaching, or some teaching material that has been used by the author.

Communications are brief accounts of work still undergoing evaluation and development, but of sufficient interest to merit publication.

Reviews provide for practitioners an up-to-date survey of current methods or approaches to teaching and learning and also show how these relate to our understanding of student learning.

Perspectives provide an opportunity for contributors to present a concise but in-depth analysis of a topic of general interest, with clear conclusions likely to be directly useful to other academics involved in teaching.

A word count (excluding references, tables, legends etc.) should be included at the end of the document.

3. Presentation should be uniform throughout the article.

Text should be typed in 12pt Times New Roman (or similar), with 1" margins, double-spaced, unjustified, ranged left and not hyphenated.

Always use an appropriate mix of upper and lower case letters: do not type words in upper case letters either in the text or in headings. Bold or italic text, and not upper case letters, should be used for emphasis.

All nomenclature and units should comply with IUPAC conventions.

Tables and figures should be numbered consecutively as they are referred to in the text (use a separate sequence of numbers for tables and for figures). Each should have an informative title and may have a legend.

Equations should be written into the text using the word processing program, either as normal text or using the program’s equation facility.

Structures should, wherever possible, be treated as a figure and not incorporated into the text.

References should be given as superscripts, designated as numbered endnotes.

Footnotes should generally be avoided; important additional information may instead be referenced and included in the reference list.

4. A title page must be provided, comprising:

an informative title;

authors’ names and affiliations, full postal address and e-mail (in the case of multi-authored papers, use an asterisk to indicate one author for correspondence, and superscript a, b, etc. to indicate the associated addresses);

an abstract of not more than 200 words.

5. Wherever possible articles should be subsectioned with headings, subheadings and sub-subheadings. Do not go lower than sub-subheadings. Sections should not be numbered. Headings should be no more than 40 characters in length, and subheadings and sub-subheadings no more than 30.

The introduction should set the context for the work to be described, include references to previous related work, and outline the educational objectives.

A concluding section (which need not be headed ‘conclusion’) will include an evaluation of the extent to which educational objectives have been met. A subjective evaluation may be acceptable.

6. The formatting of references should follow the general practice in the various titles within the Journal of the Chemical Society range. For example:

Books and Special Publications:
• W.G. Perry, Forms of intellectual and ethical development in college years: a scheme; Holt, Rinehart and Winston, New York, 1979, p. 218.
• M. McCloskey in D. Gentner and A.L. Stevens (eds), Mental models; Lawrence Erlbaum, New Jersey, 1983, pp. 147-153.


Journal Articles:
• D.C. Finster, J. Chem. Ed., 1989, 66, 659.
• A.H. Johnstone and K.M. Letton, Educ. Chem., 1990, 27, 9.

References to articles in U.Chem.Ed. on the Web should be identified by paper number until they are incorporated into numbered issues.

7. All contributions submitted will be refereed anonymously. The decision of the Editors on the acceptance of articles is final.

8. Authors grant U.Chem.Ed. the exclusive right to publish articles. They undertake that their article is their original work, and does not infringe the copyright of any other person, or otherwise break any obligation to, or interfere with the rights of, such a person, and that it contains nothing defamatory.

9. Articles will be published on the Web in PDF and HTML formats as soon as the editorial process is complete. In addition, they will be combined into electronic issues as a PDF file in April and October.

10. Letters to U.Chem.Ed. relating to published articles or on any other educational topic should be sent for publication on the U.Chem.Ed. Web Board to its moderator, Dr Paul Yates, at [email protected]. A selection of these letters will be incorporated into the issues of the journal.


U.Chem.Ed., 2001, 5, 1
This journal is © the Royal Society of Chemistry

Generating Coursework Feedback for Large Groups of Students Using MS Excel and MS Word

___________________________________________________________________________

Philip Denton

School of Pharmacy and Chemistry, Liverpool John Moores University, Liverpool, L3 3AF, UK. E-mail: [email protected]

A novel electronic procedure for generating and returning coursework feedback to students has been introduced by tutors at Liverpool John Moores University. The technique uses a combination of Microsoft Excel 97 and Microsoft Word 97 to generate personalised feedback sheets that can include the student's mark, position in the class, and a series of statements selected from a bank of comments written by the tutor. Feedback sheets can be printed off and returned to students with their marked work, or distributed via e-mail. This procedure is particularly suited to classes undertaking the same coursework assignment, a common feature of undergraduate chemistry courses, and can make the assessment of work from large groups considerably less onerous. The operation of the software is described and the responses of staff and students to the procedure are reported.

Introduction

The importance of assessment in learning is well documented.1,2 It is generally accepted that if students are to gain the maximum educational benefit from a written coursework submission, their marked script should be returned with appropriate annotations. In particular, tutors should indicate to students where they have done well, where their misunderstandings are, and what follow-up work might be required.3 Such written comments do more to motivate students than ticks or crosses alone. Indeed, Ramsden4 suggests that “… beneficial information about progress is valued even more by students than qualities such as clear explanations and the stimulation of interest.” Accordingly, studies in this area indicate that an absence of feedback is an important contributory cause of student failure.4

Although educationally sound, the extensive annotation of students’ work requires a considerable investment of time and effort by the assessor. It is understood, however, that marked work should be returned as quickly as possible if students are to pay attention to the marker’s comments. Thus, Gibbs and Habeshaw state that a few weeks after a coursework submission, students have moved on to another topic and, “have neither the time or the interest to take feedback to heart.”3

The introduction of electronic methods can decrease the time taken for feedback to be returned to students. For example, the use of multiple choice question sheets, where student responses are analysed by an optical mark reader,5 enables work to be graded rapidly. Such approaches have been criticised, however, if they give the student no way of knowing why they got a particular question incorrect.6 Advanced software packages that require students to answer a series of questions may provide in-depth explanations of answers and direct the student to further reading.7,8 It is evident that computer assessments that provide immediate feedback can have a positive effect on student attainment.9

The Examine software developed at the University of Nottingham neatly illustrates a drawback of all the computer-assisted methods of assessment that are currently available.10 Although this package will accept multiple-choice answers, numerical responses and written text passages up to 150 words in length, the latter cannot be marked by computer. This is a major limitation, given that a large part of student assessment in chemistry relies on the grading of written work, such as laboratory reports.

Electronic methods can be employed to generate written feedback to students on work that is assessed by tutors. It is suggested, for example, that a word processor is used to build up a bank of feedback comments, which can be copied and pasted into personalised feedback sheets along with general comments relating to the class performance.11 Presumably, however, this would require the tutor to undertake a number of tedious cut and paste operations. Ideally one would want a system that could automatically generate large amounts of individualised feedback, after tutors had entered the minimum possible amount of information relating to the assessed exercise.

Paper



At present, there appears to be a dearth of commercially available software for generating written comments to university students on their assessed work. The development of electronic feedback, a software package based on MS Excel 97 and MS Word 97, is a response to this recognised need. The programme has an additional advantage in that it allows tutors to distribute feedback via e-mail. The purpose of this paper is to describe this method and to report on its effectiveness when marking chemistry coursework at Liverpool John Moores University (JMU). Given that Excel and Word are applications that have a wide user base and that most institutions have well developed e-mail networks, it is thought that this procedure will be readily transferable.

Method

The electronic feedback software consists of two programs: Feedbac.xls, an MS Excel 97 workbook, and Fb.doc, an MS Word 97 document. Together, these programmes can be used by tutors to generate individual word-processed feedback reports that can be printed and/or e-mailed to each member of the class. Each feedback sheet details the student’s name and can include the percentage mark, class rank, a general comment, and a series of comments directed specifically at the student. To illustrate the operation of the software, a fictitious set of data has been created. This data set is smaller in size than one that might typically be considered, but is sufficient to convey the essential details of the procedure. An example feedback sheet that has been generated using this data set is shown in Figure 1.
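As an outline of what such a feedback sheet contains, the following Python fragment assembles the fields described above into one report. This is a hypothetical re-implementation for illustration only: the actual software is an Excel workbook plus a Word document, and every function and field name here is invented.

```python
def make_feedback_sheet(name, mark_pct, rank, class_size, stats,
                        general_comment, annotations, comment_bank, usage_pct):
    """Build the text of one personalised feedback sheet (illustrative sketch).

    annotations  -- comment numbers written on the student's marked script
    comment_bank -- {number: comment text}
    usage_pct    -- {number: % of the class who received that comment}
    """
    lines = [
        f"STUDENT: {name}",
        f"MARK: {mark_pct} % (HIGHEST: {stats['high']} %, "
        f"AVERAGE: {stats['avg']} %, LOWEST: {stats['low']} %)",
        f"RANK: {rank} out of {class_size}",
        f"COMMENT: {general_comment}",
    ]
    # One line per numbered annotation, with its class-wide usage percentage.
    for n in annotations:
        lines.append(f"{n} {comment_bank[n]} ({usage_pct[n]}%)")
    return "\n".join(lines)
```

Fed with the fictitious data of Figure 1, such a routine would reproduce the mark, rank and numbered-comment lines shown there.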

Preparation

The Excel workbook Feedbac.xls is composed of a series of worksheets (Configure, List, Header, Annos, Numbers and Report) that contain a number of blank cells. The feedback sheets are created using data that is entered into these cells by the assessor. Tutors enter student names, e-mail addresses and registration numbers into the List worksheet, Figure 2. Typically, this information is readily transferred from institutional electronic information services. The class list shown in Figure 2 has the format preferred by JMU, in which data appears in the order: e-mail address, forename, surname and registration number. Users can configure the software so that it will accept

FEEDBACK SHEET Created at 13:23 pm on 18/9/2000
Determination of a rate constant for the reaction of I-(aq) and S2O8^2-(aq).
Assessed by Dr. Philip Denton
STUDENT: CATHERINE BAKER 00066329
MARK: 24 % (HIGHEST: 76 %, AVERAGE: 46 %, LOWEST: 24 %)
RANK: 8th out of 8
COMMENT: Satisfactory work. This work was submitted late. A lateness penalty has been applied.
The numbers on your work have the following meanings. Note that the % figures after each comment indicate the % of students who required that particular comment.
3 Your axis is not numbered correctly. Always select chart type XY SCATTER when using MS Excel. (25%)
2 Lab. reports should have the following sub-headings and should be presented in the following order: introduction, method, results, conclusion. (88%)
5 Your graph should display the individual data points, in addition to a best-fit line. The data points should NOT be joined together by a "dot to dot" type line. (63%)
1 When comparing your result with value(s) from the literature, you should state the author, title, year, and publisher of any data sources you refer to. In this experiment k2 = 1.0 x 10^-2 mol^-1 dm^3 s^-1 (J. Chem. Ed. 1997, page 972). (75%)
4 Incorrect units/units not stated clearly. In this experiment, t in s, V in ml, k1 in s^-1, k2 in mol^-1 dm^3 s^-1, ln(Vinf - V) is unitless. Correct units should be stated in all column headings and on graph axes. (38%)
6 A best-fit line (BFL) is required. In a plot of ln(Vinf - V) versus t, the BFL should be linear. In a plot of V versus t, the BFL should be curved and should pass through the values of Vcalc. (50%)
7 Your graph title is unclear/incorrect or absent. As a minimum, it should state the quantities plotted on the Y and X axes. (38%)
8 There is insufficient discussion of experimental error in your work. The main errors in this practical result from the volatilisation of I2 during heating and uncertainties in the end point due to incomplete decolourisation of the starch indicator. (50%)
Electronic Feedback 5. Licenced to Dr. Philip Denton until 01/07/2002.

Figure 1: Example feedback sheet produced using the electronic feedback procedure.



electronic details in the order favoured by their institution.

The marker enters specific details of the activity that is being assessed into the Header worksheet, including the title of the coursework and the maximum mark that can be awarded. The assessor can also put in statements that they wish to appear in the ‘Comment’ section of the feedback sheet, Figure 1. Some of these comments will only appear if the student’s mark falls within certain boundaries. Thus, the example Header worksheet in Figure 3 has been completed so that any student awarded a mark of 60 % or above, but below 70 %,

List

    e-mail    Forename          Surname   Regno
1   PACCBAKE  MISS CATHERINE    BAKER     66342
2   PACJBALE  MISS JOANNE MA    BALEED    206282
3   PACABASS  MR AHMED HAF      BASSI     46634
4   PACRBERA  MISS RAMANDEE     BERAHNEG  283599
5   PACABULL  MR AMIR           BULLOCK   125146
6   PACJCAME  MISS JULIE ELIZA  CAMERON   328365
7   PACACAVE  MR ALASTAIR J     CAVE      133879
8   PACBCHAK  MR BENJAMIN       CHAKRABO  134667
9   PACPEVAN  MR PARTHA PR      EVANS     628045
10  PACHFAZL  MISS HAYLEY       FAZLEE    234042
11  PACMHARR  MR MOHAMMA        HARRIS    265837

Figure 2: Example List sheet from the spreadsheet Feedbac.xls.

Header

Enter filename: Uchemed
Title of Coursework: Determination of a rate constant for the reaction of I}-(aq) and S{2O{8}2}-(aq).
Maximum Mark: 33

Maximum % Mark   Comment
70               Excellent work.
60               Very good work.
50               Good work.
40               Satisfactory work.
30               Unsatisfactory work.
0                Poor work.

Other Comments
Top Mark: Top of the class, well done!
Late Work: This work was submitted late. A lateness penalty has been applied.
All Students: The numbers on your work have the following meanings. Note that the % figures after each comment indicate the % of students who required that particular comment.

Figure 3: Example Header sheet from the spreadsheet Feedbac.xls.



receives the comment, “Very good work.” Additional comments can be directed to those students who are subsequently identified as having handed work in late, and to the student who secured the highest mark.

Tutors are required to enter a series of feedback statements into the Annos worksheet, Figure 4. The statistical information that appears on this sheet will be considered subsequently. Typically, the feedback comments are those which past experience shows are most likely to be needed during marking, e.g. “You have failed to state the correct units”. Clearly, the number of written comments required will depend on the nature of the assessment activity. When a large group of students have submitted practical reports on a particular experiment, for example, the same errors and misunderstandings crop up time and time again, limiting the number of statements that is required. Since Excel does not have the facility readily to format text, comments that contain superscripts, subscripts, line breaks or tab spaces can be entered using a series of special characters. Thus, ‘{’ = convert next character to a subscript, ‘}’ = convert next character to a superscript, ‘^’ = insert line break, ‘¬’ = insert tab space. For example, the formula of the persulphate ion is ‘S{2O{8}2}-’ using this system.
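A minimal decoder for this four-character markup might look as follows. This is a Python sketch for illustration only; the actual software interprets these characters when formatting the Word document, and the HTML-style `<sub>`/`<sup>` output tags are an assumption of this sketch, not part of the published system.

```python
def decode(markup):
    """Decode the feedback-sheet markup described in the paper:
    '{' subscripts the next character, '}' superscripts it,
    '^' inserts a line break and '¬' a tab space."""
    out, i = [], 0
    while i < len(markup):
        c = markup[i]
        if c == "{" and i + 1 < len(markup):
            out.append(f"<sub>{markup[i + 1]}</sub>")   # subscript next char
            i += 2
        elif c == "}" and i + 1 < len(markup):
            out.append(f"<sup>{markup[i + 1]}</sup>")   # superscript next char
            i += 2
        elif c == "^":
            out.append("\n")                            # line break
            i += 1
        elif c == "¬":
            out.append("\t")                            # tab space
            i += 1
        else:
            out.append(c)
            i += 1
    return "".join(out)
```

Applied to the persulphate example, `decode("S{2O{8}2}-")` yields S with subscript 2, O with subscript 8, and a superscript 2 followed by a superscript minus sign, i.e. the ion S₂O₈²⁻.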

Marking

The three worksheets described up to this point, List, Header and Annos, can be completed before the students have submitted their work. During marking itself the students’ scripts are annotated with digits, where each number corresponds to one of the feedback comments on the Annos worksheet. In this way tutors can avoid having to write the

Annotations

Determination of a rate constant for the reaction of I}-(aq) and S{2O{8}2}-(aq).
Dr. Philip Denton

Number of scripts marked (to date): 8
Highest Mark (%): 76
Average Mark (%): 46
Lowest Mark (%): 24
Standard Deviation (%): 16

Meaning of the annotations (% of students with this annotation)
1 When comparing your result with value(s) from the literature, you should state the author, title, year, and publisher of any data sources you refer to. In this experiment k{2 = 1.0 x 10}-}2 mol}-}1 dm}3 s}-}1 (J. Chem. Ed. 1997, page 972). (75)
2 Lab. reports should have the following sub-headings and should be presented in the following order: introduction, method, results, conclusion. (88)
3 Your axis is not numbered correctly. Always select chart type XY SCATTER when using MS Excel. (25)
4 Incorrect units/units not stated clearly. In this experiment, t in s, V in ml, k{1 in s}-}1, k{2 in mol}-}1 dm}3 s}-}1, ln(V{i{n{f - V) is unitless. Correct units should be stated in all column headings and on graph axes. (38)
5 Your graph should display the individual data points, in addition to a best-fit line. The data points should NOT be joined together by a "dot to dot" type line. (63)
6 A best-fit line (BFL) is required. In a plot of ln(V{i{n{f - V) versus t, the BFL should be linear. In a plot of V versus t, the BFL should be curved and should pass through the values of V{c{a{l{c. (50)
7 Your graph title is unclear/incorrect or absent. As a minimum, it should state the quantities plotted on the Y and X axes. (38)
8 There is insufficient discussion of experimental error in your work. The main errors in this practical result from the volatilisation of I{2 during heating and uncertainties in the end point due to incomplete decolourisation of the starch indicator. (50)

Figure 4: Example Annos sheet from the spreadsheet Feedbac.xls.



same comment repeatedly on students’ work. Upon receipt of their marked work, students can appreciate the meaning of the numerical annotations on their work by referring to their feedback sheet.

Before any feedback sheets can be generated it is necessary to provide the Feedbac.xls workbook with details of which feedback comments were assigned to which students and also the marks that were awarded. Tutors can put this information into the Numbers worksheet, Figure 5, which is created automatically when a class list has been entered into the List worksheet. Adjacent to each name are 27 empty cells. The first cell is only filled in if a student handed their work in late. Tutors enter the mark awarded before the imposition of any lateness penalty. The feedback sheet for this student will then include the comment for late work that is specified on the Header worksheet. Into the second cell, tutors enter the final mark that has been awarded to the work. This score is automatically converted into a percentage by dividing it by the maximum mark that has been entered into the Header worksheet. Alternatively, tutors can enter ‘PMC’ into the second cell if it is known that the student is not going to submit any work because they have personal mitigating circumstances. The calculated % mark or ‘PMC’ is displayed on the Numbers worksheet in the second column. In the remaining 25 blank cells that are adjacent to each student’s name, tutors put the numbers that correspond to the feedback statements that have been allocated to that class member on their marked script. Thus, the maximum number of comments that can be assigned to a particular student is 25.
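The mark-handling rules just described (late submissions, PMC entries, and conversion to a percentage of the maximum mark) can be sketched as follows. This is a hypothetical Python rendering of the worksheet's behaviour, not the spreadsheet's own formulas; the function and argument names are invented.

```python
def process_entry(final_mark, max_mark, late_mark=None):
    """Return (display value, extra comments) for one Numbers-worksheet row.

    final_mark -- the awarded mark, or the string 'PMC' for personal
                  mitigating circumstances
    late_mark  -- mark before any lateness penalty; filled in only if the
                  work was handed in late
    """
    comments = []
    if final_mark == "PMC":
        return "PMC", comments
    if late_mark is not None:
        # A non-empty first cell triggers the Header sheet's late-work comment.
        comments.append("This work was submitted late. "
                        "A lateness penalty has been applied.")
    pct = round(100 * final_mark / max_mark)
    return f"{pct} %", comments
```

For the fictitious class of Figures 1 and 5 (maximum mark 33), a final mark of 8 after a lateness penalty displays as 24 %, plus the late-work comment.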

Upon completion of the Numbers worksheet, the Annos worksheet displays statistical details relating to the activity, Figure 4. Thus the maximum, average and minimum percentage marks are reported and this information is reproduced on the feedback sheet, Figure 1. The Annos worksheet also computes the percentage of students who required a particular comment during marking, Figure 4. These values are reproduced on the feedback sheet as a percentage figure in brackets after each comment, Figure 1.

Generating and returning feedback to students

When a mark for a particular student is entered into the Numbers worksheet, the spreadsheet automatically generates the corresponding feedback sheet. Excel does not have the capability to produce large amounts of formatted text. Thus, before printing or e-mailing, the feedback sheets must be copied and pasted into the MS Word 97 document Fb.doc. Both electronic feedback programmes, Feedbac.xls and Fb.doc, incorporate a series of Visual Basic programmes that enable this copying, pasting and formatting procedure to be accomplished via a couple of mouse clicks.
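The class statistics that the Annos worksheet reports can be illustrated with a short sketch. This is a hypothetical Python re-implementation (the worksheet itself uses Excel formulas); the choice of the sample standard deviation is an assumption of this sketch.

```python
from statistics import mean, stdev

def annos_stats(marks_pct, annotation_lists):
    """Class statistics as displayed on the Annos worksheet: number of
    scripts, highest/average/lowest percentage mark, standard deviation,
    and the percentage of students who received each numbered comment."""
    n = len(annotation_lists)
    usage = {}
    for annos in annotation_lists:
        for a in set(annos):  # count each comment once per student
            usage[a] = usage.get(a, 0) + 1
    return {
        "scripts": len(marks_pct),
        "highest": max(marks_pct),
        "average": round(mean(marks_pct)),
        "lowest": min(marks_pct),
        "sd": round(stdev(marks_pct)),
        "pct_with_comment": {a: round(100 * c / n) for a, c in usage.items()},
    }
```

With the eight percentage marks of the fictitious class (24, 52, 48, 39, 45, 76, 52, 30), this reproduces the highest, average and lowest figures shown in Figure 4.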

Evaluation of electronic feedback

The educational benefits of electronic feedback were evaluated by studying the frequency with which feedback comments that relate to a particular error were used during marking. Clearly, one would hope that the frequency of use of such comments would gradually decrease over time as students reacted to their feedback and corrected their mistakes.

The attitudes of students to the electronic feedback strategy were ascertained by their responses to a structured questionnaire that was completed by 58 first year undergraduate students within the JMU

Numbers

Number  %    Name               Cohort  Late (/33)  Final (/33)  Annotations (max. 25)
1       24   CATHERINE BAKER    A       14          8            3 2 5 1 4 6 7 8
2       PMC  JOANNE BALEED      A                   PMC
3       52   AHMED BASSI        A                   17           2 3 1 7
4       48   RAMANDEEP BERAH    A                   16           5 4 1 2
5       39   AMIR BULLOCK       A       16          13           6 8 1
6       45   JULIE CAMERON      A                   15           1 2 5 6
7       76   ALASTAIR CAVE      A                   25           2 5 6
8       52   BENJAMIN CHAKRAB   A                   17           5 8 2
9            PARTHA EVANS       A
10      30   HAYLEY FAZLEE      A                   10           4 7 2 1 8
11           MOHAMMED HARRIS    A

(Column headings on the worksheet: mark awarded to late work before the imposition of a lateness penalty; final mark awarded or PMC; both out of a maximum of 33.)

Figure 5: Example Numbers sheet from the spreadsheet Feedbac.xls.



School of Pharmacy and Chemistry. This was in addition to three focus groups, each consisting of three students chosen at random. Members of staff were also requested to offer their views on the software after it was presented to them during a JMU training session.

Results

The electronic feedback method has been used to assess physical chemistry laboratory reports and worksheet assignments submitted by first and second year undergraduates at JMU. The procedure has been found to work well in practice. Ideally, the bank of feedback statements should be written before the assessor receives the students’ work to enable the marking to be completed as quickly as possible. There is no reason, however, why it cannot be gradually built up during marking itself.

When marking laboratory reports, the same initial bank of general feedback comments can be used. These are then edited and augmented so that they are appropriate to each experiment. Typically, about 25 distinct comments are required. Many of the comments relate to core skills such as report writing and the graphical representation of experimental data. The frequency with which particular comments were used when marking two first year undergraduate physical chemistry experiments, conducted two weeks apart, is shown in Figure 6. By the time the students came to undertake the second practical they had already received e-mailed feedback on the first. As is evident, the ability of the students to present their work in an appropriate scientific manner had improved markedly over this period.

Annotating students’ work with numbers in place of comments ensures that marking is relatively straightforward and rapid. Moreover, the List and Header worksheets of the Feedbac.xls file can be filled efficiently if electronic class list information is available. The time taken to complete the Annos worksheet will depend on the number and complexity of the feedback comments. As discussed above, however, if the assessed assignment is similar to one that has been set previously, tutors may find that it is possible to use an existing bank of feedback comments that has been appropriately modified. In this way, the time taken to enter the Annos worksheet can be considerably reduced. The Numbers worksheet can be completed quickly if marked scripts are first arranged in class list order, before the marks and the numerical annotations that appear on the work are entered.
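The core of the procedure — numerical annotations on a script indexing a bank of reusable comments — can be sketched as follows. This is a minimal illustration in Python rather than the Excel/Word implementation described here, and the comment texts and student record are invented for the example:

```python
# Sketch of the electronic feedback idea: scripts are annotated with
# numbers, and each number indexes a bank of reusable comments that is
# expanded into a printable feedback sheet. The bank below is invented.

FEEDBACK_BANK = {
    1: "Lab. reports should be presented in the following order: "
       "introduction, method, results, conclusion.",
    2: "Incorrect units/units not stated clearly.",
    3: "Your best-fit line is incorrect and/or absent.",
}

def feedback_sheet(name, mark, annotations, bank=FEEDBACK_BANK):
    """Expand a student's numerical annotations into a feedback sheet."""
    lines = [f"Feedback for {name} (mark: {mark}/20)"]
    for n in annotations:
        lines.append(f"{n}. {bank[n]}")
    return "\n".join(lines)

sheet = feedback_sheet("A. Student", 14, [2, 3])
print(sheet)
```

Because the assessor writes each comment once and thereafter enters only its number, the per-script marking cost stays low however detailed the comments become.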

Students reacted positively to the electronic feedback procedure when questioned in the focus groups and in responding to the questionnaire. All the interviewed students felt that the e-mailing of feedback was an efficient way to receive details of their performance in an assessment, as it removed the requirement for them to wait until the next time they met the lecturer. It was evident that students were comfortable with the principle of receiving feedback when they were at a computer

Meaning of the annotation (abridged)                    % of students with this annotation
                                                             23/9/99         7/10/99

Lab. reports should be presented in the following
order: introduction, method, results, conclusion.              60              36

Your graph axis is not numbered correctly. Always
select chart type XY SCATTER when using MS Excel.              36               2

Incorrect units/units not stated clearly.                      98              74

Your graph should display the individual data points,
in addition to a line.                                         34               2

Your best-fit line is incorrect and/or absent.                 91              31

Your graph title is unclear/incorrect or absent.               57               7

Your graph axis is not labelled correctly or is not
labelled at all.                                               21               0

Figure 6: Assessment profiles from two first year undergraduate chemistry practicals.



terminal on their own. The focus groups confirmed that they were more likely to pay attention to electronic feedback that is returned quickly and felt that it was then acceptable to wait 2–3 weeks for marked coursework to be returned.

The questionnaire revealed that 88% of the undergraduates felt that it was useful to have written feedback e-mailed to them in advance of receiving their marked script. The vast majority stated that they appreciated knowing the maximum, average and minimum marks for the activity (91%) as well as their position in the class (88%). All questioned students found the comments on their feedback sheet useful and most of the class (81%) felt that they had received more written feedback than they normally obtained from tutors. It became clear that one of the particular advantages of the electronic feedback procedure is its ability to return lengthy, detailed comments on a particular aspect of the assessment. The responses of two students were typical: “It is a helpful method of marking as it enables you to see how and why mistakes were made...” and, “It offers a more in-depth description of how you have gone wrong.” A number of students also commented that the printed feedback sheets overcame difficulties associated with the legibility of staff handwriting. In response to the question, “should electronic feedback be used more regularly within the School”, 100% of respondents said, “Yes”.

After the staff training session, 31 colleagues returned written sheets to provide feedback on the session. Those staff who were familiar with Excel reported minimal difficulties using the software. One member of staff commented, “A fairly complex piece of software which I will feel more confident of using once I’ve tried it out using my own annotations. Educationally, a very sound method.” Other staff acknowledged that the procedure could become second nature with practice. Five members of staff said they would definitely not use the software in future, either because they had an existing electronic system that they preferred, or because they had experienced difficulties using the software.

Discussion

Up to now, the electronic feedback technique has been used primarily in the grading of chemistry coursework. The procedure is quite general, however, and can be used for any assessment where it is expected that students will make the same errors repeatedly. It is clear that the electronic feedback approach can make the marking process considerably less onerous as it removes the requirement to annotate students’ work with repeated hand-written comments. The package would be of particular interest to those tutors who find that, using conventional methods, they are unable to return as much feedback as they would wish to. Although this approach is perhaps less personal than traditional marking, there is no reason why tutors cannot supplement their printed feedback with hand-written comments to individual students.

The two files that comprise electronic feedback are password protected to prevent the accidental overwriting of essential subroutines. Thus, there are limited opportunities for customisation. Tutors can have some control over the final appearance of the feedback sheets, however, and may choose to omit details of the maximum, average and minimum marks as well as the student’s position in the class. Tutors who prefer to allocate grades instead of marks can choose to hide the percentage marks that are normally displayed on the feedback sheets and can write feedback statements such as ‘Grade B+’. Each statement can have a particular number associated with it and these can be allocated to students in the customary manner.

The software need not necessarily be used exclusively when marking assessments where the same errors crop up repeatedly. If they so wish, tutors can write a single, lengthy feedback comment for each student and use the software to generate the corresponding feedback sheets. In this way the software can be used to e-mail feedback to students on highly individual assessments such as project work. If this approach is adopted, the number that corresponds to the feedback comment, and the percentage of students requiring that statement, may be omitted from the feedback sheet.

Acknowledgement

This work was completed as part of the Postgraduate Certificate in Learning and Teaching in Higher Education 1999/2000. I wish to thank all the tutors associated with this course.

Supporting material

Copies of the requisite software and a user guide are available from the author. Interested persons should send a stamped addressed A4 envelope and a formatted 3½” disk. Please include your e-mail address so that you may be contacted subsequently for your opinions of the software. Respondents should indicate how they wish their title and name to appear on the feedback sheet, as this information cannot be changed subsequently. This is a security precaution that is included so as to prevent the unauthorised proliferation of the software.







Experience with a Random Questionnaire Generator in the chemistry laboratory and for other continuous assessment

___________________________________________________________________________

Mary R. Masson
Department of Chemistry, University of Aberdeen, Meston Walk, Aberdeen AB24 3UE. E-mail: [email protected]

The Random Questionnaire Generator, a suite of programs designed to produce randomised multiple-choice tests for assessment of a first year chemistry class, has now been in use at Aberdeen University for three years. It has proved popular with students and staff and gives a much more reliable mark for each student than the previous system. The Creator program has also been used to generate tests for use in continuous assessment tests for students at Level 2.

Introduction

At Aberdeen University, as at many of the other older Scottish Universities, a large proportion of B.Sc. students take the Level 1 Chemistry course. There is no separation of intending chemists from other science students. All students taking the first year course follow the same laboratory course, which requires attendance at one 3-hour class each week. Several set experiments run on each lab day, and students rotate around these according to a pre-defined rota.

Students make records of their work in their laboratory manual during the laboratory class. The records include calculations, graphs, data analysis and answers to questions. Until three years ago all laboratory manuals were checked and marked by a member of staff before the student was permitted to leave the laboratory. The increase in student numbers during the 1990s meant that there could be up to 120 students in the lab during any lab session. This overloaded the system, caused unacceptable queuing and resulted in some students stopping lab work early so as to get to the front of the queue (a practice which was quickly copied by others).

Staff identified the following problems with the marking process:
• It was impossible to award a meaningful and consistent mark in the time available to assess each student (less than 2 minutes) and as a result the marks did not discriminate between able and less able, or even between conscientious and careless students.
• Opportunities to teach through interaction with students at the bench were significantly reduced by the time spent on marking.
• Student time spent on laboratory work was unacceptably reduced not only by the need to queue, but also by the tendency to stop work early.

In considering how to reorganise marking procedures to alleviate these problems we concluded that a computer-based test offered the most promising way forward. We identified the following characteristics of a satisfactory assessment procedure:
• Each student would be provided with an individualised test to be completed by the end of the laboratory session; this individualisation would both prevent simple copying of answers and allow the test to take account of the fact that in many experiments students are given different samples to analyse.
• The questions would be worded in a friendly way so that students would recognise the test as a valuable learning experience.
• Questions would attempt to ensure that the information provided in the laboratory manual is read and understood (preferably in advance of the class) and would consolidate the theoretical background by asking about new terms and definitions introduced in the experiment, and understanding of general information related to the experiment.
• The test would be designed to provide a mark for the recording of observations made during the experimental work (e.g. the colour of a precipitate formed at a particular point) and for the quality of the results obtained, and would give practice in calculations with dummy data.
• When dealing with observations and results, it should be possible to award fractional marks.
• As far as possible the test would be marked automatically using an optical mark reader.

Our Department has been using multiple-choice testing in examinations since the early 1960s and has followed with interest the literature on the requirements for the design of effective questions. This has been revisited in recent years as a result of the increase in popularity of computer-based assessment.1,2 Also, we have noted since we started our project that other institutions have reported the introduction of computer-based pre-lab and post-lab tests.3,4 However, none of the published work appears to have been concerned with the provision of a single test based on a particular laboratory class in which individual questions are designed to be completed at different times throughout the class. We looked at other testing systems that include randomised selection of questions (for example Turton5), and at commercial systems such as Question Mark Designer,6 but found that these were designed for computer delivery of tests, and did not offer any possibility of customisation to meet all our needs. We therefore decided that the best option was to design our own style of test.

The system we designed has now been in operation for three years.7,8

Methods

Creating the questions

The first year laboratory course consists of 20 sessions, in each of which each student completes a different experiment. Before leaving the laboratory each student completes a test of 20 questions and hands in a coded answer sheet. Each test paper is unique to the student concerned and is generated on paper for each student from a bank of multiple-choice questions. Students collect their test papers at the start of each laboratory session, and answer the questions at appropriate points during and at the end of the experimental procedures. Because all the tests are different, we are able to allow students to discuss the answers to questions. Students are also encouraged to seek help from demonstrators with questions they find difficult.

The answers to the test questions are entered directly onto a copy of the University’s standard, machine-readable form for marking by an optical mark reader; this form simply lists the question number and offers the choice of five boxes corresponding to the alternative responses provided (five boxes are always provided even though some questions are only provided with three or four responses). Individual test papers, each of which is prepared as a Microsoft Word document, are created by the computer from the bank of questions. Typically the question bank for each experiment consists of 24 different basic questions, but the variety is considerably increased because many of these basic questions have a number of variants (2–8). Variants are generated in several different ways. Sometimes it was possible to devise equivalent questions that were totally different, or (in the case of dummy calculations) to provide different data for the same calculation. Sometimes it was possible to use a fixed set of responses, but different question stems, one corresponding to each response. However, in some cases, the only acceptable variation was to change the order of the possible responses. With such questions, it is obviously easier for students to collaborate, but we were satisfied that they would at least have to read the question and its responses carefully — it would not be possible just to find out from a neighbour that the answer to question 3 was B.

The provision of a machine-readable test of observations made and of results was tackled in two ways. The first is used in experiments where all students should in theory obtain the same answer. The style of question used here is:

• From the calibration graph, in which of the following ranges was the concentration of potassium ions in the river water?
A Less than 8 mg l–1
B 8–9 mg l–1
C 9–10 mg l–1
D 10–11 mg l–1
E More than 11 mg l–1

We tried here to avoid making the middle response the correct one, so that students could not use our ranges to guess the answer that we were expecting. The evidence we have from student queries is that they are very keen to code their results correctly, rather than attempt to cheat the system. The correct response is awarded a full 1 mark, but other responses may be awarded partial marks, based on our knowledge of the errors of the experiment.
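A minimal sketch of this partial-credit scoring for coded-result questions follows. The particular fractions assigned to the adjacent ranges are invented for illustration; the paper states only that partial marks are based on the known errors of the experiment:

```python
# Sketch of partial-credit scoring for a result-coding question.
# The correct response earns the full mark of 1; other responses may
# earn a fraction. The fractions below are invented for illustration.

def score_range_question(response, marks_by_response):
    """Return the (possibly fractional) mark for a coded response."""
    return marks_by_response.get(response, 0.0)

# Example: suppose the correct range is C (9-10 mg/l) and the two
# adjacent ranges are judged close enough to earn half a mark.
marks = {"A": 0.0, "B": 0.5, "C": 1.0, "D": 0.5, "E": 0.0}
print(score_range_question("C", marks))  # 1.0
```

Defining the marks per response (rather than a single right answer) is what lets the optical-mark-reader output be scored fractionally without any manual intervention.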

The second method is required for experiments where students are expected to get different results, and where the mark is to be awarded for aspects of experimental work that involve human judgement. For these questions, the student is informed what the marks are awarded for but is instructed not to provide an answer. Instead, responses to be coded on the machine-readable answer sheets are decided by demonstrators after examining the laboratory manual and checking calculations where necessary. This is done throughout the class and does not usually cause delays. Examples of these questions are:

Has the mass of the iron compound been recorded correctly?
Are the titration volumes in good agreement?
Has the percentage yield been calculated correctly?
Has the graph been drawn neatly and correctly?

Demonstrators have a key for each experiment, which details the appropriate response for these questions, which we define as ‘demonstrator-marked’. For each question, the key provides a letter that corresponds to the full mark of 1; the other letters correspond to fractions of a mark. Thus, if the key letter is C, students awarded response C receive a mark of 1, but other responses receive 0.75, 0.5, 0.25, or 0. The letters corresponding to the correct result are different for the different experiments, are kept secret from the students, and are changed from time to time. Amongst other characteristics of student work, these demonstrator-marked questions require demonstrators to check calculations, to examine graphs for neatness and correct plotting, and to examine the appearance of organic products. Marking guidelines leave as little room as possible for personal interpretation so that postgraduate demonstrators are able to assist the staff with marking. Demonstrator-marked questions are included in the same format and in the same place on all the test papers for a particular experiment. They are therefore designated as ‘fixed’.
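One plausible encoding of this key scheme, sketched in Python, maps a per-experiment letter order onto the descending mark fractions. The order 'CBDAE' below is invented (the real keys are secret and change from time to time), as is the assumption that the non-key letters map to the fractions in a fixed order:

```python
# Sketch of the demonstrator-marked key scheme: one secret letter per
# question carries the full mark of 1; the remaining letters carry
# 0.75, 0.5, 0.25 and 0. The particular letter order used here is an
# assumption -- in the real system it is defined per experiment.

FRACTIONS = [1.0, 0.75, 0.5, 0.25, 0.0]

def build_key(letter_order):
    """Map response letters to marks, e.g. 'CBDAE' -> C earns 1.0."""
    return dict(zip(letter_order, FRACTIONS))

key = build_key("CBDAE")   # key letter C, then descending fractions
print(key["C"], key["E"])  # 1.0 0.0
```

Because the demonstrator codes a letter rather than a number, the fractional mark is recovered automatically at scanning time without revealing the marking scale to students.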

Questions dealing with other aspects of the experiment (for example, calculations with dummy data, and questions dealing with background theory) are of a more conventional style. For the purposes of creating test papers these questions are designated as ‘fixed’ (one of the variants appears in every test paper, always in the set position), ‘compulsory’ (one of the variants appears in every test paper, but in any available position) or as ‘optional’ (a question which need not be selected).

Depending on the experiment, a test paper will contain 3–20 fixed questions, 0–10 compulsory questions, and 17–20 optional questions selected from a bank of up to 30 basic questions with a total of up to 80 variants.

Creating the test papers

The test papers are created from the question banks and the corresponding Questionnaire Definition Files, by using the Creator program, which is written in Delphi (like the others in the suite). The program reads a set of daily Excel spreadsheets (giving lists of student names, laboratory numbers, and class codes), another spreadsheet giving the rota of experiments (which relates student laboratory numbers with experiments for each week), and a file which defines all the experiments. The program uses the student’s laboratory number together with a ‘day code’ defining the day of the week and the week number to set the rules by which the algorithm selects the questions for each student in a manner that appears to be random. Each test will include all the fixed questions, in their defined positions, along with the compulsory questions and enough optional questions to give a total of 20; the positions of the “c” and “o” questions are different for each different test. The algorithm also determines which of the optional questions are selected and (for questions with variants) it also determines which variant is selected.
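The "appears to be random" behaviour can be sketched by seeding a generator with the laboratory number and day code, so that the same paper can be regenerated exactly at marking time. This is a Python sketch under that assumption only (the suite itself is written in Delphi and its actual algorithm is not published; the question labels are invented, and fixed questions are simply placed first here rather than in arbitrary set positions):

```python
import random

# Sketch of seeded, reproducible test assembly. Seeding with the
# student's laboratory number and day code means an identical paper can
# be re-derived later by the Marker program. Labels are invented.

def assemble_test(lab_number, day_code, fixed, compulsory, optional,
                  total=20):
    """Return a test paper: fixed questions in set positions (here,
    first), compulsory and sampled optional questions shuffled into
    the remaining slots."""
    rng = random.Random(f"{lab_number}-{day_code}")  # deterministic seed
    n_optional = total - len(fixed) - len(compulsory)
    chosen = compulsory + rng.sample(optional, n_optional)
    rng.shuffle(chosen)            # positions differ from paper to paper
    return fixed + chosen

fixed = ["F1", "F2", "F3"]
compulsory = ["C1", "C2"]
optional = [f"O{i}" for i in range(1, 31)]

paper = assemble_test(2071, "0321", fixed, compulsory, optional)
again = assemble_test(2071, "0321", fixed, compulsory, optional)
print(paper == again)  # True: the same seed re-derives the same paper
```

The design point this illustrates is that no per-student paper ever needs to be stored: the (lab number, day code) pair is enough to reconstruct which questions, variants and positions a student received.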

To simplify the handling and distribution, the font size and margins for the questionnaire are set to permit each test to be printed on a single sheet of paper. An example of a test is shown in Figure 1. Once the student names are available, laboratory numbers have been allocated, and the daily Excel spreadsheets prepared, questionnaires are simple to produce. Normally they are generated for all students for a given day and week number in a single batch (although if necessary a single form can be printed). This is repeated for other days and week numbers. The limiting factor in the process is the printer — but with a fast printer, tests for 20 lab sessions (for 5 weeks, with 4 lab classes each week; up to 3000 tests) can be printed in a single day.

Marking program

The laboratory technicians scan the completed machine-readable forms by using a Scanmark 2000 (http://www.scantron.com/scan/sm2000.htm) optical mark reader (OMR). The machine is set to reject forms if marks are missing or not dark enough to be readable; it can also reject some invalid codes. Any rejected form is returned at once to the student, with advice to make marks darker, or insert or amend codes, and then the form is rescanned. We bought our own reader for this project, but use the University's standard printed forms. The OMR software writes the data to an ASCII text file.

Each week, the four daily files are processed by the Marker program, which uses the same algorithm as the Creator program to determine, for each student, which questions have to be marked. It then checks the answers in the answer file, and awards the appropriate mark. It has not proved possible to make the marking process fully automatic, because students sometimes make errors in entering their laboratory numbers and/or day codes, but the program attempts to flag these, and the flagged entries are corrected manually. When all corrections have been made, the program is instructed to insert the marks into the Excel spreadsheet, which also serves as the register for the class. At the time of marking, the responses are recorded in a text file for later analysis, if desired, for example to calculate the fraction of the class with the correct answer to every variant of every question (the facility value).

A special arrangement had to be made for the "Unknown Samples" that students have to identify in the Inorganic part of the laboratory class. There are 47 different samples, so it was not considered necessary to generate fully randomised tests.




2071 JOHN SMITH     Cations part 2     Day code: 0321

Q1. The colour of the solid and the stock solution of compound T is?
A Colourless / white  B Pink / red  C Green  D Blue

Q2. With which of the following pairs of cations are the colours of the solid and stock solution of compound T compatible?
A Cu2+ and Cr3+  B Cu2+ and Fe2+  C Cr3+ and Co2+  D Ni2+ and Fe2+

Q3. On heating the precipitate a change in colour was noted. What was the final colour observed?
A Blue  B White/colourless  C Green  D Black

Q4. Which of the following is the only cation to fit the result obtained when sodium hydroxide solution (dilute then excess) was added to the compound?
A Ni2+  B Cu2+  C Cr3+  D Fe2+

Q5. Unknown T was copper sulphate 5-hydrate. Which of the following is the formula for this compound?
A (CuSO4)5 H2O  B Cu(SO4)2.5H2O  C CuSO4.5H2O  D Cu2SO4 H2O

Q6. Which of the equations below represents the formation of the precipitate of copper hydroxide?
A Cu2+(aq) + 2OH–(aq) → Cu(OH)2(s)  B 2Cu2+(aq) + OH–(aq) → Cu2OH(s)
C Cu2+(aq) + OH–(aq) → CuOH(s)  D Cu2+(aq) + 2OH–(aq) → CuOH2(s)

Q7. What is the formula of the compound formed on heating copper hydroxide?
A CuS  B CuO  C CuOH  D Cu(NO3)2

Q8. The precipitate redissolved on adding excess ammonia solution owing to the formation of a complex. Is the complex
A An anion  B A cation  C A neutral molecule

Q9. What is the name given to this type of complex?
A Hydroxo  B Ammine  C Amine  D Hydrate

Q10. Which of the following equations represents the formation of the copper complex?
A Cu2+(aq) + 6NH3(aq) → [Cu(NH3)6]2+(aq)  B Cu2+(aq) + 4NH3(aq) → [Cu(NH3)4]2+(aq)
C Cu2+(aq) + 4NH3(aq) → [Cu(NH3)4]4+(aq)  D Cu2+(aq) + 6NH3(aq) → [Cu(NH3)6]6+(aq)

Q11. On addition of excess of concentrated hydrochloric acid to a solution of unknown T a colour change was observed. Which of the following changes in colour best fits your observation?
A Blue to greenish yellow  B Blue to violet  C Blue to red  D Green to blue

Q12. The change in colour is due to the formation of a complex. Is the complex
A An anion  B A cation  C A neutral molecule

Q13. What is the name given to the type of complex formed with hydrochloric acid?
A Hydroxo  B Chloro  C Amine  D Hydrate

Q14. Which of the following represents the formula of the complex formed?
A [CuCl]–  B [Cu2Cl]2+  C [CuCl4]2–  D [Cu2Cl4]2–

Q15. Which of the following is the correct formula for the precipitate formed when sodium sulphide was added to T?
A CuS2  B Cu2S  C CuS  D Cu2S2

Q16. From the colour of compound U and its solution, which of the following groups of cations can you say are definitely not present in the compound?
A Mg2+, Ca2+, Sr2+, Ba2+, Cd2+  B Cr3+, Fe3+, Co2+, Ni2+, Cu2+
C Zn2+, Al3+, Pb2+, Sn2+, Sn4+  D Cd2+, Mn2+, Fe2+, Ag+

Q17. Which of the following groups of cations is compatible with the result from the reaction between compound U and sodium hydroxide solution?
A Mg2+, Ca2+, Sr2+, Ba2+, Cd2+  B Cr3+, Fe3+, Co2+, Ni2+, Cu2+
C Zn2+, Al3+, Pb2+, Sn2+, Sn4+  D Cd2+, Mn2+, Fe2+, Ag+

Q18. Which one of the following cations can be eliminated because it would have given a precipitate if the solid was tested with nitric and sulphuric acids?
A Ca2+  B Mg2+  C Cd2+  D None

Q19. Unknown U was magnesium sulphate 7-hydrate. Which of the following is the formula for this compound?
A (MgSO4)7 H2O  B Mg(SO4)2.7H2O  C MgSO4.7H2O  D Mg2SO4.7H2O

Q20. Which of the following is the formula for the precipitate formed when sodium hydroxide was added to a solution of U?
A Mg(OH)2  B Mg2(OH)3  C MgOH  D Mg2OH

Figure 1. An example of a questionnaire




Instead, a limited number of tests were prepared for each of the samples used. The pre-prepared questionnaires are associated with the samples rather than with individual students. These tests are identified to the Marker program by special codes printed on the test forms in place of the normal day codes. Different students complete different numbers of unknown samples, so all the marks achieved by one student are summed in a single spreadsheet cell.

Providing students with feedback

Feedback to the students is provided in two ways. In the first, marks are made available by the Laboratory Mark Reader program, which runs on two low-specification computers in the laboratory. This program reads copies of the main Excel spreadsheets, and allows students to see the mark achieved in the previous week(s), and also to find out the numbers of the questions they have answered incorrectly. Students are advised to retain their questionnaires so that they can review these questions; they are advised to consult a demonstrator if they do not understand their error.

The second form of feedback is provided only to students who get marks below a given threshold (usually 12 out of 20). These students are offered a printed report showing their incorrect responses and they are particularly recommended to consult a demonstrator for advice about their errors. The printouts include the full text of questions answered incorrectly, but questions related to the student’s experimental data are normally labelled in the answer file so that they are excluded from these reports. (The same applies to the first form of feedback.)

Results

The facility value calculated for each question (and each variant) provides evidence that no questions appear to be either so easy or so difficult that they provide no useful discrimination between students. Typically, the facility values fall between 0.6 and 0.8, which we regard as satisfactory here. In a normal exam, an ‘ideal’ question would have a value of 0.5, because half the class got the question correct; but a test should include a range of facility values in order to discriminate across a range of student abilities (1).
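Since the facility value is simply the fraction of the class answering a question correctly, it can be computed directly from the recorded responses. A short sketch (the responses below are invented for illustration):

```python
# Facility value: the fraction of the class answering a question
# correctly. The response list below is invented for illustration.

def facility(responses, correct):
    """Fraction of responses matching the correct answer."""
    return sum(r == correct for r in responses) / len(responses)

responses = ["B", "B", "C", "B", "A", "B", "B", "D", "B", "B"]
print(facility(responses, "B"))  # 0.7, within the 0.6-0.8 band reported
```

Running this per variant is what allows the comparison, described below, between variants of the same basic question.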

Furthermore, the facility values for variants of the same basic question are essentially the same. These results provide reassuring evidence that the tests are discriminating effectively. A point of particular interest arises from our examination of the facility values for questions which differed only in the order of the distractors. We observed considerable variation in the numbers choosing the various wrong answers, but we found only two for which the correct answer had a higher facility when the nearest distractor immediately preceded it. In view of the large number of questions we examined, this was not statistically significant and our observations are therefore inconsistent with the suggestion that the relative locations of the correct answer and 'nearest' distractor can have a significant effect on the facility of questions.9 A useful reminder that ‘objective testing’ is not fully reliable and reproducible was provided by a question with only 3 responses (A, B and C), but for which 6% (1998) and 4% (1999) of students marked response D.

Figure 2 shows a comparison of the marks awarded during a single semester (1999, semester 2) using the new system with those awarded during the last semester in which the old system was used (1996, semester 2). The distribution of marks is still skewed towards the top end of the marking scale, but the new system has resulted in a much fuller use of the total mark scale. Table 1 provides a comparison of the marks obtained in all the semesters since 1996(2) and shows that 1999(2) is typical of semesters since the introduction of the new system. (Although there is no statistical justification for calculating a standard deviation for data which are clearly not drawn from a Normal population, it is a convenient way of providing a crude comparison between semesters.)

Using the new system, the distribution of marks for laboratory work is similar to the distribution for closed examinations. This is illustrated in Figure 3, which shows a plot of lab marks vs. exam marks for

Table 1. Statistics for Laboratory Marks

Semester              96/2 (Old)  97/1  97/2  98/1  98/2  99/1  99/2
Mean                     17.7     16.0  15.7  16.0  15.9  16.5  16.2
Standard deviation        0.5      2.6   1.8   2.5   3.0   3.0   2.7
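Summary statistics of the kind shown in Table 1 can be computed from the raw per-student totals; the sketch below uses invented lab totals out of 20, not the study's data:

```python
from statistics import mean, stdev

# Invented per-student laboratory totals (out of 20) for one semester.
marks = [16, 17, 15, 18, 14, 16, 17, 19, 13, 16]

# Mean and standard deviation as reported in Table 1. As the text notes, the
# standard deviation is only a crude comparator here, since the marks are not
# drawn from a Normal population.
semester_mean = round(mean(marks), 1)
semester_sd = round(stdev(marks), 1)
print(semester_mean, semester_sd)
```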

Figure 2. Comparison of laboratory marks awarded by the old and new systems


Mary R Masson

U.Chem.Ed., 2001, 5, 14. This journal is © the Royal Society of Chemistry

semester 1999(2). Almost every student obtained a higher mark for lab work than for the closed examination. However, the correlation coefficient is highly significant (R = 0.57, n = 243, P < 0.001).
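The quoted R = 0.57 is a Pearson correlation coefficient. A self-contained sketch with invented exam/lab mark pairs (chosen so that, as in Figure 3, every lab mark sits above the corresponding exam mark while the two remain positively correlated):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# Invented (exam, lab) mark pairs, not the study's n = 243 data set.
exam = [35, 42, 50, 55, 61, 68, 74, 80]
lab = [62, 70, 66, 75, 78, 82, 85, 93]

r = pearson_r(exam, lab)  # strongly positive for these invented data
```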

We are satisfied that the distribution of marks obtained with the new system is a better reflection of our students' performance in the laboratory classes. We find it hard to accept that the data shown in Figure 2 for semester 1996(2) really mean that all our students truly deserve marks in the top quartile of the scale. And we are neither surprised nor concerned that most students achieve a higher mark for lab work than they do in examinations, because it is possible in the laboratory to compensate for mistakes or lack of aptitude by perseverance and hard work. Furthermore, we have noted a marked improvement in student attitude to laboratory work since the introduction of the new system; the great majority actively strive to achieve high marks and are keen to learn how they have lost marks, to the extent that there have been many requests for the feedback printouts to be made available for all students. In official Student Course Evaluation Forms, over 90% of respondents said that they thought the practical work was well organised. In addition, around 85% said they found the laboratory work interesting (in contrast to Johnstone's report10 that the majority of young lecturers interviewed by him had, as students, found labs boring).

Staff and demonstrators also report favourably on the new system. The laboratory is now able to close at the advertised closing time. The automation of the recording of attendance and marks has significantly reduced the administrative work associated with the running of the class.

Discussion

There can be no doubt that we achieved our first objective of removing the problem of queues, with the consequent advantage that the students do not stop work earlier than is justified by their progress.

There are other advantages. The system for marking and recording of marks operates very efficiently, giving demonstrators more time to teach in the laboratory. The use of paper tests means that, for some kinds of experiment, we can completely integrate the tests with the experiments so that they are in-lab rather than post-lab tests. The Master Excel spreadsheet allows us to monitor student attendance and take action if necessary. Our comparisons of the new lab marks with old ones, and with examination marks, have convinced us that we now have a much more reliable mark for the laboratory work than we did in the past. We therefore have no worries about the weighting of 20% given for this mark in the end-of-module degree examinations, and indeed we are giving thought to increasing it.

The original version of the Reader program gave only the mark, and comments from students obtained from course evaluation forms resulted in many requests for the feedback printouts to be made available for all students instead of just those who obtained low marks. We take this to be a positive indication of a genuine wish to improve and, to comply with this request, we have very recently introduced the enhanced version of the Reader program, which informs students of the questions that they answered incorrectly.

The suite of programs we have developed is generic in that it can be readily adapted for a variety of uses. This is illustrated by our use of the Questionnaire Generator program to create homework assignments for level 2 students doing the analytical lab course. Because this is timetabled for the end of the session, students face a conflict between their perceived need to revise and the requirement to prepare lab reports. It was therefore decided to separate the calculations from the experimental work, and issue the calculations early in the semester. We had not felt able to do this previously because of the prevalence of copying when all students receive the same homework tests, but the opportunity to create randomised tests overcame this objection.

For each experiment, six sets of "good" but not perfect data were selected from those obtained by students in previous years, and the Creator program was used to generate randomised tests for each student. There was no intention of using machine marking, so no multiple-choice responses had to be created. The students are no longer required to prepare lab reports at the end of term, because the evidence of their ability to carry out the calculations is obtained from their calculations using dummy data, and their actual data are entered into a Visual Basic program which calculates and records the final answers for subsequent assignment

[Figure 3 (scatter plot): Lab Mark, approximately 30–100, plotted against Exam Mark, 0–100]

Figure 3. An example of correlation between laboratory and examination marks


of marks for accuracy. The course evaluation forms show that this is appreciated.

As far as the staff are concerned, there is a considerable saving in workload. Although the marking of the calculations is done manually, with the help of a key indicating the data given to each student, this is much faster than marking real laboratory data because the answers are already known. More time is saved by the automated calculation of results, even though it requires some human intervention, since there is no need to check the calculation itself. This procedure also serves to prevent students from trying to "fudge" their results.
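The bookkeeping behind the randomised tests and the marking key can be sketched as below. This is a Python illustration, not the authors' Word/Excel/Visual Basic suite; the experiment names and data-set labels are invented:

```python
import random

# Six stored data sets per experiment, as described in the text;
# the labels here are invented.
DATA_SETS = {
    exp: [f"{exp}-set{i}" for i in range(1, 7)]
    for exp in ("titration", "gravimetric", "colorimetry")
}

def build_marking_key(students, seed=2000):
    """Assign each student one data set per experiment, recording the
    assignment so markers know which data each student was given."""
    rng = random.Random(seed)  # fixed seed keeps the key reproducible
    return {
        student: {exp: rng.choice(sets) for exp, sets in DATA_SETS.items()}
        for student in students
    }

key = build_marking_key(["s001", "s002", "s003"])
```

A fixed seed means the same key can be regenerated whenever the marker needs it.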

Our system is an example of the power of computers in teaching with no pretence at being Computer Assisted Learning. We see this as a very positive aspect, since we have long observed that the majority of our students are not very enthusiastic about the use of computer-assisted learning.

Acknowledgements

We are grateful to the University Information Services Committee for providing the original funding which allowed us to develop the database of questions and the software, to Dr Simon Heath of CLUES for provision of programming support, and to the Aberdeen University Learning Technology Unit for support for recent enhancements to the software.

The programs that make up the Random Questionnaire Generator suite are completely generic in nature, although they do require the user to be running Windows 95/98/NT, together with Microsoft Word and Excel. We are willing to make them available at a modest price to anyone who is interested in using them.

References

1. S. Heard, J. Nicol and S. Heath, Setting effective objective tests, MERTaL Publications, Aberdeen, 1997.
2. P. Race, Changing assessment to improve chemistry learning, Project Improve, 1997.
3. B.S. Nicholls, U.Chem.Ed., 1999, 3, 22.
4. P.W.W. Hunter, U.Chem.Ed., 2000, 4, 39.
5. B.C.H. Turton, ALT-J, 1996, 4, 48.
6. Question Mark Designer, Question Mark Computing Ltd, http://www.qmark.com/
7. M. Masson and W. Lawrenson, Computer-aided laboratory assessment, in Research in Assessment XIV: Aspects of First Year Higher Education, RSC, London, 1999, pp. 9–11.
8. M. Masson, W. Lawrenson, G. Cowan and S. Armstrong, Random Questionnaire Generator for objective tests for laboratory practicals, v. 1.1, MERTaL™ Courseware, University of Aberdeen, Aberdeen, 1997, ISBN 187315447 X.
9. J. Handy and A.H. Johnstone, Educ. Chem., 1973, 10, 47.
10. A.H. Johnstone, U.Chem.Ed., 1998, 2, 35.


Jane Tomlinson, Paul J Dyson and John Garratt

U.Chem.Ed., 2001, 5, 16. This journal is © the Royal Society of Chemistry

Student misconceptions of the language of error

________________________________________________________________________________________________

Jane Tomlinson, Paul J Dyson and John Garratt

Department of Chemistry, University of York, Heslington, York, YO10 5DD
E-mail: [email protected]

We have collected student responses to questions designed to establish their understanding of twelve terms used regularly when working with error and uncertainty in quantitative data. The terms are 'reproducible', 'precise', 'accurate', 'sensitive', 'random error', 'systematic error', 'negligible', 'significant difference', 'qualitative procedure', 'quantitative procedure', 'correlated data', and 'transforming data'. In most cases fewer than 50% of the sample of first year chemistry students provided evidence of 'some or good understanding'. We suggest that their misconceptions are most likely to be rectified by persistently challenging the students to make explicit use of key words and concepts such as these whenever they present reports of quantitative data collected during practical work.

Introduction

One of the skills that chemistry graduates are expected to acquire is the "ability to interpret data derived from laboratory observations and measurements in terms of their significance and the theory underlying them".1 Effective data interpretation involves dealing with errors and uncertainty in the measurement of physical quantities. This is an area that requires a particular use of language. Even the word 'error' is a source of confusion to many students, since they commonly regard 'errors' as personal mistakes2 rather than recognising that "every physical measurement is subject to a host of uncertainties that lead to a scatter of results".3 Another simple example is that many dictionaries give 'accuracy' as one meaning of 'precision',4 whereas these two words have distinctly different meanings in the context of scientific measurement.

We recently published evidence that first year chemistry students have not developed an effective use of the language used to handle error and uncertainty.5

The majority of a sample of 65 students believed that the accuracy of their results would be improved by using an objective rather than a subjective method to fit a line to their experimental data, and rather more than half of them showed confusion over the meaning of the term '95% confidence limits'. In retrospect, it seemed to us that these misconceptions were easy to reconcile with their previous experiences: they are accustomed to teachers looking for a high level of 'accuracy', and consequently they readily assume that a 'line of best fit' means that the line gives the most accurate result; their use of 'confidence' in everyday language is sufficiently different from its technical usage in error analysis that this can easily be a source of confusion.

The constructivist view of learning6 leads to the expectation that it is not easy to bring about a reconstruction of a misunderstood concept already embedded in the mind.7 This may help to explain the conclusion that "students find it difficult to grasp the value and purpose of statistical procedures".8 We therefore decided to explore student understanding of key words and concepts used in dealing with error and uncertainty in measurement. We hoped this would lead to a better understanding of the concepts held by first year students, and that this would help us to devise opportunities for learning that would lead to better understanding of those issues generally considered to be important. We had the opportunity to include a set of questions as part of a first year laboratory course on Analysis taken by the same cohort of chemistry students who, in the previous term, alerted us to the possible problem.5 At this stage they had received virtually no relevant instruction in this area, at least not since they left school. Our intention was therefore not to evaluate the course itself, but to gain a better understanding of the concepts related to errors and uncertainty which these students brought to the course. We expected that this would help us to devise opportunities for learning that would address specifically any widely held misconceptions. We report here on the responses received to this set of questions.

Methods

In devising our questions we first attempted to obtain (through discussion with a number of concerned colleagues) a consensus view on the vocabulary with which first year students are expected to be familiar. Some of these terms (such as 'accuracy', 'precision', 'systematic and random error') are routinely defined or described in many standard treatments of error (e.g. refs. 3 and 12); others (such as 'sensitivity', 'negligible') are rarely dealt with in such texts, though

Paper


they are routinely used by scientists and have a special meaning in context.

We attempted to word the questions in a way that would distinguish between the ability to provide a definition (declarative knowledge) and the ability to use words and concepts (procedural knowledge). Several drafts were required before we were satisfied with the set of questions. An important requirement was that we could ourselves prepare, for each question, a short answer that we regarded as demonstrating an adequate understanding of word usage in the context of analysis and error. This requirement caused us to make significant revisions to the wording of our first draft.

Figure 1 shows the final version of the set of five questions; these cover 12 key words or concepts (counting 'significant difference' and 'insignificant difference' as a single concept). Also shown is the explanatory preamble, which draws attention both to the possible differences between the technical and everyday meanings of some words, and to the fact that we were looking for each student's view of how to use the words, given that the meaning may vary with the context. The whole fits conveniently on to one side of a sheet of A4 paper.

The students in our survey were in the second term of their chemistry degree course. Each of the 103 students of the first year cohort received their own copy of the question sheet at the beginning of the six-week course on Analysis that started in the first week of the spring term of 2000. The question sheets were handed out by the Course Organiser, who gave a short verbal explanation to reinforce the points made in the preamble. No responses were received before the end

Questions set to first year chemistry students

The Language of Analysis and Error

Analysis usually involves measuring quantitatively or qualitatively one or more constituents of something. In order to communicate analytical results (including information about the effect of experimental error on their reliability), chemists use words which have technical, or specialist, meanings. Many of these words are also in ordinary use; for example accuracy, precision, random, systematic, significant. The purpose of this exercise is to give you an opportunity to think about and explain or describe how you would use, in a scientific context, some words which have both a technical and a general meaning.

There are five questions for you to answer. Write your answers clearly and unambiguously so that your reader knows exactly what you think. Remember that the questions are asking what you think; they are not asking for the 'correct' answer (in a sense there is no single correct answer, since the meaning varies with the context). Later in the course, when your answers have been handed in, you will be provided with our answers, so that you know how we think the words should be used. You should compare your answers with ours, and reflect on any differences.

Be concise. Each question should be answered in a few lines.

Questions

1. An analytical procedure needs to be reproducible, precise, accurate, sensitive. How would you investigate how well a procedure meets these criteria?

2. Explain why systematic error is harder to detect than random error.

3. Under what circumstances would you describe an error as negligible, and a difference between two values as significant or insignificant?

4. Can a qualitative procedure prove that a constituent is absent from a substance? And can a quantitative method be used to determine exactly how much of a constituent is present?

5. How would you decide whether data are correlated and when would you consider transforming data?

Notes
* in order to be correlated, data must have an x value and a y value; the question here is: under what circumstances are the two values correlated with each other?
* if you measure something (e.g. temperature), you may sometimes wish to transform it (for example to 1/T)

You may use examples to illustrate your answers, if you find this appropriate.

Figure 1


of the course, and any handed in before the end of term were accepted for evaluation. We do not think the responses were affected either by the content of the course or by the four-lecture course on Analytical Procedures which coincided with the beginning of the laboratory course, since neither was designed to deal with the kinds of question we asked in the questionnaire. In order to encourage students to provide answers that reflected their current understanding, and to discourage them from seeking textbook answers, we made it clear that the exercise was voluntary and that answers would not contribute towards the mark for their laboratory work.

We did not attempt to analyse the overall response of each individual, since our intention was not to attempt to map individual understanding of errors but to gain an overview of the sorts of ideas and misconceptions that students in general have about errors. We were concerned to discern not only how much understanding the students have, but also the nature of any misunderstanding. Accordingly, we evaluated how well each response answered the question, and we also looked for answers that demonstrated some understanding of the issues even though the wording was more indicative of declarative knowledge than of procedural understanding. Before attempting to evaluate the student responses, we drew up a table giving a short (one sentence) acceptable answer to questions about each of the thirteen words or concepts. This defined the key points we looked for and helped us to be consistent in our evaluation.

Results

A total of 33 responses were received. This rather low response rate (32%) is almost certainly a consequence of the explicitly voluntary nature of the exercise. However, the average A-level score of the respondents was 22 points (three subjects, excluding General Studies, equivalent to BBC) compared with 24 points (equivalent to BBB) for the whole cohort. Thus, on the basis of the only criterion available, the respondents are a reasonable cross-section of the whole cohort.

We classified all the responses as showing either 'some or good' understanding, or 'little or no' understanding. Our attempts to attain greater precision, for example by classifying under four rather than two headings, were unsatisfactory. It therefore seemed better to present a crude summary followed by a more detailed discussion of the student responses to each of the five questions in turn. Figure 2 shows the summary of our findings. Even though we were generous in attributing 'some understanding' to some responses, fewer than 50% of the respondents were judged to show 'some or good understanding' of most of the terms. In the discussion which follows we first enumerate and describe those responses which show 'some or good understanding', and then those which show 'little or none'.
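The two-category classification behind Figure 2 amounts to a simple tally per term. A Python sketch reconstructing two rows of Figure 2 (the totals 18/15 and 6/27 are taken from the figure; the per-response coding is invented):

```python
from collections import Counter

# One code per respondent per term; "some_good" vs "little_none" mirrors the
# two headings used in the paper.
judgements = {
    "reproducible": ["some_good"] * 18 + ["little_none"] * 15,
    "sensitive": ["some_good"] * 6 + ["little_none"] * 27,
}

# Count of each judgement per term, as summarised in Figure 2.
summary = {term: Counter(codes) for term, codes in judgements.items()}
```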

Question 1. An analytical procedure needs to be reproducible, precise, accurate and sensitive. How would you investigate how well a procedure meets these criteria?

Specimen answer:
• Reproducible: Make multiple measurements on same sample using same procedure.
• Precise: Determine by observation (of replicate measurements) how many significant figures are justifiable.
• Accurate: Use procedure on a standard sample to check closeness to correct value.
• Sensitive: Use decreasing quantities or concentrations until signal cannot be distinguished from noise.

The question of investigating reproducibility was significantly better answered than the other three concepts; eighteen of the respondents based their answers on replicate measurements, thus showing 'good understanding'. Five appeared to have confused the reproducibility of results with the opportunity to repeat an experiment (obviously a prerequisite for determining reproducibility, and one which must surely be understood to apply to any analytical procedure). A typical example of this misconception is "For an experiment to be reproducible it should be easy and affordable to recreate the experiment and experimental conditions". The remaining ten students (30% of the sample) were judged to have no useful understanding of any of the four terms dealt with in question 1, because they either made no attempt to differentiate between them or they dealt specifically with reproducibility but did not distinguish between precision, accuracy, and sensitivity. Examples of these responses are "Repeat the procedure a number of times to see if there is a large error in the accuracy etc. in which case the results would be significantly different" and "In order for an experiment to be reproducible all measurable

Summary of responses from 33 students

Word or concept          Good or some     Little or no
                         understanding    understanding
Reproducible                  18               15
Precise                       13               20
Accuracy                      14               19
Sensitive                      6               27
Random error                  13               20
Systematic error              26                7
Negligible                    16               17
Significant difference        20               13
Qualitative                    6               27
Quantitative                   7               26
Correlated                    31                2
Transforming data             31                2

Figure 2


external stimuli should be measured and taken into account. Precision, accuracy and sensitivity can be obtained using a suitable instrumentation giving an acceptable degree of accuracy and a method of measurement which reacts quickly enough to observe significant changes."

In describing how to investigate the precision of a measurement, only seven respondents gave some indication that the key indicator is the number of significant figures that can be justified. A lower level of understanding was demonstrated by six students who made some reference to repeating the procedure several times to obtain the precision (a point already made by three of them in connection with reproducibility); the weakness of these answers was that none gave any indication of how they would judge the precision, or indeed that they understood its meaning in the context of analysis. In addition to the ten already identified as showing no understanding, a further nine showed little understanding, and one gave no response to this part of the question. Four of the nine suggested that precision is related to the care taken with experimental procedures; undoubtedly this is in a sense correct, but it is not a response which engenders confidence that these respondents have a clear understanding of the meaning of precision in this context, and it may be related to an assumption that variation is the result of their mistakes.2 Two confused precision and accuracy, and three respondents gave answers which defied any attempt at classification.

Nine responses on accuracy showed good understanding by referring to the use of known or standard samples and comparing experimental results with these. Within this group of nine, four were clearly referring to investigating the accuracy of the procedure using some specific standard sample independently of making an experimental measurement (one of these also offered as an alternative the possibility of making the same measurement using a variety of procedures) and five referred to comparing their results with a literature or textbook value (thus illustrating that they are thinking only in terms of analytical exercises to which the answer is already known). Five showed a lower level of understanding by making simple statements about 'closeness to the correct or true value' without giving any indication of how the true value might be known. Nine respondents showed two different sorts of misconception. Five referred to the need to take care either with the procedure or with the choice of equipment, but showed no awareness of the need for calibration or standardisation. The other four confused accuracy with precision (one of them explicitly stating that both accuracy and precision are determined "from the range of values over which repeats lie"). Ten showed no understanding, as described above.

Six respondents showed that they understood sensitivity to be concerned with the ability of an analytical procedure to detect small quantities or low concentrations of the analyte, though (perhaps not surprisingly) none of these referred to the signal-to-noise ratio as a criterion for judging detection limits. The remaining twenty-seven either showed misconceptions (fifteen), or showed no useful understanding (the ten referred to above), or failed to provide an answer (two). Eleven of the fifteen responses with identifiable misconceptions were concerned with the ability of a procedure to detect small differences, or with the effect on the result obtained of small changes in conditions. This is a common meaning of sensitivity in everyday language; however, our view is that, in the context of chemical analysis, it is the precision of a procedure that determines whether small differences can be detected, and that sensitivity is properly reserved to refer to the lower limit of detection. Four responses showed neither useful understanding nor any clear misconception (examples are "ensuring that all likely changes are measured" and "if the reactant is not sensitive to a test, the results will be hard to obtain").

Question 2. Explain why systematic error is harder to detect than random error.

Specimen answer:
Random error: Easy to detect from variability of replicate measurements.
Systematic error: Difficult to detect unless you have a reason for supposing the result is incorrect.

The majority of students' responses (twenty-six) showed an understanding that, when systematic errors are present, they occur in all measurements; twenty-one of these linked this with the difficulty in detecting systematic error. Six respondents submitted statements that could not be interpreted as showing any understanding of the use of either 'random' or 'systematic' error. One response included no reference to systematic error.

The responses showed a much lower appreciation of the meaning of random error. Thus only five respondents (a mere 15% of the sample) made clear statements about random error causing variation in readings. An example of these responses is "random errors are generated by the finite precision of measurement. They affect different readings by different amounts". A further eight could (with more generosity) be interpreted as showing some understanding of the term. Four of these stated that random error can be removed by repetition, as in "random error can be worked out of the experiment by repeating it several times"; these statements could be interpreted as showing that the respondents understand that replicate readings vary as a result of random error, and that the mean value is likely to approximate to the value that would be obtained in the absence of random


error. The other four were more difficult to interpret, as in "random error only occurs in one mode on one variable at one time, and its nature must change on repetition so its location and magnitude can easily be determined". Fourteen made the mistake of assuming that random error occurs in a single or a small number of results, and one of these explicitly described random error as "a human mistake". An example of this style of response is "random error will not affect all results so a result due to random error will look out of place". As stated above, the remaining six showed no understanding of the term.

Question 3. Under what circumstances would you describe an error as negligible, and a difference between two values as significant or insignificant?

Specimen answer:
Negligible: When the error is so small compared with the value of the measurement in question that it does not affect the final result (enough for you to care about).
Significant difference: When a statistical (mathematical, objective) test shows that there is only a small chance that the difference between two values arose by chance.

Sixteen responses showed understanding by making some comment to the effect that an error is negligible when it does not have a (noticeable) effect on the overall result (this includes two who related this to whether the overall result is "close to the expected answer", thus drawing attention to the view of many students that analysis involves looking for an already known answer). Three of these sixteen specified a percentage error that would qualify as negligible, but most made little attempt to describe how they would make their judgement, as in "if it does not affect the final result too much". Eight other responses specified a percentage error that would be regarded as negligible, but gave no indication of why this was negligible, and were therefore judged to be too simplistic to qualify as showing useful understanding. Interestingly, the estimates of what might be considered negligible varied from "several orders of magnitude less than the value" to "10%", with the most common suggestions lying around 1%. One respondent firmly stated that "no error is negligible and should always be stated" – a belief with which we have some sympathy and are inclined to applaud, but it is a very inflexible attitude to apply to the real world of experimentation. One suggested that an error is negligible if it only affects a small proportion of the results – reinforcing the view that some students regard errors as occasional events rather than as an inevitable feature of measurement. Three more used suspiciously similar wording to state that an error is negligible "when the result is unaffected whether or not it is included" (presumably thinking, as the previous response indicated, that the 'error' occurred in one measurement out of a number). The remaining four were so confused as to defy analysis.

In this question we linked the concepts of 'negligible' and 'significance' with the intention of drawing out the point that the latter is almost always concerned with a difference between two values (normally mean values), whereas the former (at least in this context) applies to the error in an individual value. In practice twenty respondents addressed the question of significant difference in a meaningful way, but only five of these showed real understanding of this concept, as in the statement "when a difference between two values can be explained by error, then it can be regarded as insignificant". Fourteen of them stated that a difference would be significant (or insignificant) if it was greater (or less) than a specified percentage of (one of) the values in question. The percentages suggested as a measure of a significant difference varied from 1% to 5% (except for one student who did not give a general rule of thumb, but gave as an example that "the difference between 11032.06 and 11032.91 is insignificant, but that between 1.123 and 1.921 is significant"). One of them simply stated that a difference is insignificant "when it won't affect the results"; we find it hard to assess the level of understanding that this represents. The weakness of all twenty of these answers, which we classified as showing some understanding, is that none of them showed any awareness that the amount of variation in (or precision of) data is crucial in deciding whether a difference is significant. Eleven respondents demonstrated that they had not grasped the key point by failing to refer to a difference, and eight of these specifically referred either to a significant value or to a significant error. One made no attempt to answer this part of the question, and another stated (as an example) that the arithmetic difference between an atomic number and a molar mass is insignificant "because it has no meaning", thus demonstrating at least some concern for sensible handling of units!

Question 4. Can a qualitative procedure prove that a constituent is absent from a substance? And can a quantitative method be used to determine exactly how much of a constituent is present?
Specimen answer:
Qualitative procedure: Can only prove that something is below a certain level (determined by the sensitivity of the method).
Quantitative procedure: Can only determine the quantity within the limits determined by the precision of the method.

Jane Tomlinson, Paul J Dyson and John Garratt
U.Chem.Ed., 2001, 5, 21. This journal is © the Royal Society of Chemistry

Only three students recognised both limitations: that very small amounts (below the sensitivity of the procedure) cannot be detected by qualitative methods, and that experimental error prevents exact measurement. In addition, the first limitation alone was recognised by four and the second by three. This left twenty-three respondents who replied affirmatively to both questions. It is true that the wording of many responses might be regarded as ambiguous in a court of law; thus one of the respondents did not state explicitly that a qualitative method can prove the absence of a substance, and only nine explicitly stated that a quantitative method could determine exactly how much is present. However, wording such as "a qualitative procedure is used to determine what is in a substance", or "a quantitative test can be used to determine the amount of a substance", does not lead us to believe that the respondent was trying to suggest that the determination is in doubt. Indeed we suggest that the reason why more students specified 'proof' for qualitative procedures but not 'exactly (how much)' for quantitative was that their sentence structure did not require them to add the word 'exactly' to their description of a quantitative procedure, whereas it was harder for them to avoid a word such as 'proof' when describing the testing of absence.
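The specimen answer's first point, that a qualitative test can only show an analyte to be below the sensitivity of the method, can be illustrated with a small sketch. The 3×sd(blank) convention and the readings below are illustrative assumptions of ours, not taken from the paper:

```python
import statistics

def detection_limit(blank_signals, sensitivity):
    """Sketch of why a qualitative test cannot prove absence: the smallest
    detectable amount is set by the noise of the blank. A common convention
    (our assumption here) takes LOD = 3 * sd(blank) / sensitivity.
    """
    return 3 * statistics.stdev(blank_signals) / sensitivity

# Hypothetical blank absorbances and a calibration slope of 0.05 AU/ppm:
blanks = [0.0021, 0.0025, 0.0019, 0.0023, 0.0022]
lod = detection_limit(blanks, sensitivity=0.05)
print(round(lod, 4))  # about 0.013 ppm
```

A "not detected" result from this hypothetical test only shows the analyte to be below roughly 0.013 ppm, not that it is absent.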

Question 5. How would you decide whether data are correlated and when would you consider transforming data?
Specimen answer:
Correlated data: Examine a graph of x vs. y to see whether there is evidence of a relationship (correlation).
Transforming data: When the transformation converts a non-linear relationship into a linear one.

Twenty respondents indicated that correlations could be detected from a graph of the data, and a further six said that data would be regarded as correlated if two variables showed some kind of relationship (without specifying how they would detect the relationship). Five said that data are correlated when they fit a mathematical relationship. All of these show some understanding of correlation, and only two did not answer this part of the question. Of the twenty who referred to graphical representation of the data, only eight actually answered the question as set by stating that their decision would involve inspecting a graph of the relevant data. The other twelve wrote their answer more in the form of a definition or a theoretical description of correlation. One respondent accepted data to be correlated "when they show a pattern of increase or decrease or constancy" (our emphasis), and four explicitly restricted the meaning of correlation to linear relationships (though there is evidence that others shared this misunderstanding even though they did not make it explicit). The wording of the responses (especially the unwillingness to deal with the question "how would you…") gave the impression that most students were more familiar with the collection of data known to be correlated than with the concept of investigating whether data are correlated or not.

The question of transforming data gave a similar indication of the majority showing some understanding. Thirty-one referred in a wide variety of ways to the wish to show a relationship more clearly. Many made it clear that the main (or only) purpose was to create a linear relationship from a non-linear one. However the range of answers included a number that indicated more confused objectives, almost certainly based on misconceptions. For example, three respondents seem to believe that transformation can reveal a correlation which does not exist in the raw data (e.g. "if there is no correlation, consider transforming the data to determine whether there is any correlation there"); it may be that these students belong to the group who believe that correlation implies a linear relationship, and several other forms of words suggest that this is the case for a number of others. Many students appear to believe that correlated data can only be analysed when the relationship is linear ("a plot of T vs. P is useless, but a plot of ln T vs. 1/P is useful", or "data should be transformed to give meaningful graphical data"), and this includes two who explicitly stated their assumption that the accuracy of their results will be improved by working with a linear relationship. As with the first part of this question, students were apparently reluctant to personalise their answers by answering "when would you consider…". Only two respondents gave no answer to the question about deciding to transform data.
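The legitimate purpose of transformation described above, converting an existing non-linear relationship into a linear one rather than conjuring a correlation from nothing, can be sketched with invented first-order decay data:

```python
import math

# Hypothetical first-order decay data, c = c0 * exp(-k*t),
# with c0 = 8.0 and k = 0.35 (values invented for illustration):
ts = [0, 1, 2, 3, 4, 5]
cs = [8.0 * math.exp(-0.35 * t) for t in ts]

# A plot of c vs t is curved, but ln(c) vs t is a straight line of
# slope -k and intercept ln(c0), so the rate constant can be read off:
ln_cs = [math.log(c) for c in cs]
slope = (ln_cs[-1] - ln_cs[0]) / (ts[-1] - ts[0])  # crude two-point slope
print(round(-slope, 3))  # 0.35, recovering the rate constant
```

The transformation adds no information; it merely recasts a relationship that was already present in the raw data into a form that is easier to inspect and fit.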

Discussion
Our survey is based on a relatively small sample of students. Nevertheless we believe that our sample is sufficiently representative for us to draw useful conclusions. This confidence is based largely on our comparison of the A-level grades of the respondents and the cohort as a whole. For two of the concepts we tested we are also able to compare our data with the conclusions of Davidowitz et al.9 These authors analysed the reasons given by 135 second year chemical engineering and science students for making repeat measurements. They concluded that 45% of their sample perceive the purpose to be either to identify a recurring (correct) value (20%) or to perfect measuring skills (25%). This is in broad agreement with our finding that students are more inclined to link variation in measurements with 'mistakes' than with random error (in the sense used in quantitative measurement). The same authors found that more than half of their sample of students, when asked to compare two sets of data, regarded the mean value as of much greater significance than the spread. This is also consistent with our finding that none of our sample of first year students made reference to the spread of data when discussing significant differences.

Our data indicate that first year chemistry students would benefit from a considerably better understanding of the language used to deal with error in quantitative measurement. We believe that this is a matter for concern since the handling of quantitative data is of crucial importance to the procedural understanding of science. We suggest that teaching in this area needs to be radically rethought and restructured, because the problem is not simply one of impressing correct ideas about errors on to the blank sheet of a student's mind, but one of reconstructing their misconceptions into mature understanding. The first step must be a careful analysis of the key concepts which students need to understand. We do not claim that our list of five questions covers all of these key concepts. For example, with the benefit of hindsight, we recognise the value of directly probing the students' perception of the origin of variation in measurement. However our study illustrates the value of covering qualitative aspects of the use of language (such as 'negligible') as well as rigorously defined terms (such as 'random error'). Furthermore, the wording of our questions indicates the value of framing meanings and understanding in operational terms like "how would you investigate…" and "explain why…".

It seems unlikely that students will improve their understanding through textbooks, even if they could be motivated to read them. Our pessimism is based on our inability to find guidance on our questions 3–5 in most undergraduate texts. The Open University text on this subject10 provides a rare example of dealing with the inappropriateness of judging the significance of differences between values by reference only to a mean value, and of the benefit of preferring a graph to a table of data in order to discern a correlation between variables. Even for our questions 1 and 2 the guidance given is often contradictory. For example Skoog et al.3 define 'precision' as "the agreement between two or more measurements that have been carried out in exactly the same fashion", thus suggesting that a procedure capable of determining a value to only two significant figures is very 'precise' because it lacks the precision needed to detect significant variation. In comparison Atkins11 defines 'precise measurements' as having "small random error", making the cardinal mistake of failing to specify that the error must be small compared with the size of the measurement. Hanson et al.12 state that "the standard deviation is a measure of the precision of the measurement". They go on to use as an example a measure of the boiling point of water for which they quote a mean value of 400.00 K and a standard deviation of 0.0126. In contrast the Open University10 maintains that it is "rather silly" to quote a standard deviation to so many significant figures, and there certainly seems little justification for quoting a greater number of decimal places for the standard deviation than for the mean.
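The point about quoting a standard deviation to a sensible number of figures can be sketched as follows; the one-significant-figure convention and the hypothetical readings are our illustrative assumptions, not a rule stated in any of the texts above:

```python
import math
import statistics

def report(values):
    """A sketch of a sensible reporting convention (our assumption):
    quote the standard deviation to one significant figure and round the
    mean to the same decimal place, rather than quoting the sd to more
    figures than the mean.
    """
    m = statistics.mean(values)
    sd = statistics.stdev(values)
    # Decimal place of the sd's leading significant figure:
    decimals = max(-math.floor(math.log10(sd)), 0)
    return f"{m:.{decimals}f} ± {sd:.{decimals}f}"

# Five hypothetical boiling-point readings clustered around 400 K:
print(report([400.01, 399.99, 400.00, 400.02, 399.98]))  # 400.00 ± 0.02
```

On this convention a standard deviation of "0.0126" would be reported simply as 0.01, and the mean rounded to match.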

Given the lack of clarity and consistency in the textbooks, most chemistry students will necessarily rely on their course work for information about errors and their treatment. Meester and Maskill reported that most laboratory manuals for first year chemistry courses included some information about error analysis but concluded that "although this indicates the great importance attached to it, generally speaking, error analysis was not a central feature of the courses".13

We do not think the situation has changed significantly in the ten years since the survey was conducted. Furthermore, we have no reason to suppose that the sections on errors in the laboratory manuals are any more likely than the textbook accounts to lead to effective learning; our scepticism is based on anecdotal evidence suggesting that there is no consensus amongst academics either about the correct usage of words and concepts used to describe uncertainty in data or about the best procedures available for interpreting experimental data. We are thus led to the conclusion that there is a need for much careful thought about the best ways to meet the Benchmark objective relating to data interpretation.1 We do not believe that lecturing is likely to be effective because the misconceptions we have documented are unlikely to be corrected by an account (however authoritative) of received wisdom; such accounts rarely involve active participation of the students. Even workshop activities often do little more than introduce the students to routine exercises, which do not really engage their minds. The student perceptions need to be actively challenged in such a way that they reconstruct their own understanding.7

We suggest that these operational learning outcomes are most likely to be achieved by regularly and persistently challenging students, through their laboratory work, to discuss their data in terms of the desired outcomes. Thus they could be required to give evidence of the random error in their data, to indicate precautions they have taken to avoid systematic error, to comment on the comparability of data collected by different individuals (and whether differences are significant), and so on. Our proposal is that these regular challenges should be fully integrated into the laboratory course; there is little advantage in paying lip service to the idea by requiring a bolt-on statement about error at the end of a laboratory report. The familiarity with the terms and concepts gained through such regular usage is likely to lead the students to revise their own understanding through a deep learning process. We do not doubt that such learning could be usefully reinforced by structured class discussions and interactive workshops or even by well-constructed lectures, but we suggest that the primary route for learning should be frequent and explicit challenges to use the relevant words and concepts.

Acknowledgements
We acknowledge financial support from the Royal Society for a University Research Fellowship to PJD, the LTSN Physical Sciences Centre for a project grant to JLT, and Glaxo Wellcome plc for support for CJG.


References

1. General guidelines for the academic review of Bachelors Honours Degree Courses in Chemistry, Quality Assurance Agency, Gloucester, 1998.
2. M.J. Goedhart and A.H. Verdonk, J. Chem. Ed., 1991, 68, 1005.
3. D.A. Skoog, D.M. West and F.J. Holler, Fundamentals of Analytical Chemistry, 5th ed., Saunders College Publishing, Fort Worth, 1996.
4. J.M. Hawkins, J. Weston and J.C. Swannell, The Oxford Study Dictionary, Oxford University Press, Oxford, 1992.
5. J. Garratt, A. Horn and J. Tomlinson, U. Chem. Ed., 2000, 4, 54.
6. G.M. Bodner, J. Chem. Ed., 1986, 63, 873.
7. K.S. Taber, U. Chem. Ed., 2000, 4, 63.
8. J. Garratt, D. Clow, A. Hodgson and J. Tomlinson, Chemistry Education Review, 1999, 14, 51.
9. B. Davidowitz, F. Lubben and M. Rollnick, J. Chem. Ed., 2001, 78, 247.
10. The science foundation team, The handling of experimental data, Open University Press, Milton Keynes, 1982.
11. P.W. Atkins and J.A. Beran, General Chemistry, W. H. Freeman, New York, 1992.
12. J.R. Hanson, J.I. Hoppe and W.H. Prichard, A guide to the reporting of practical work and projects in chemistry, Royal Society of Chemistry, London, 1995.
13. M.A.M. Meester and R. Maskill, First year practical classes in undergraduate chemistry courses in England and Wales, Royal Society of Chemistry, London, 1993.


William Byers

U.Chem.Ed., 2001, 5, 24. This journal is © the Royal Society of Chemistry

Using questions to promote active learning in lectures

___________________________________________________________________________

William Byers

Faculty of Science, University of Ulster, BT37 0QB. E-mail: [email protected]

The first key to wisdom is constant questioning… By doubting we are led to enquiry, and by enquiry we discern the truth.

Peter Abelard (1079-1142)

An attempt has been made to remedy some of the deficiencies of the traditional didactic lecture by enhancing student involvement and learning through the use of focussed questioning within the lecture format. The potential benefits of questioning are considered and the effectiveness of the approach is evaluated through classroom observations, peer observation, an end of module questionnaire and student discussions. Some limitations of the approach are identified and suggestions for future improvements are made. The paper concludes with a brief consideration of the importance of thinking time to the promotion of meaningful learning.

Introduction
30 years ago when I started teaching I believed that I had knowledge to impart and that the better I taught the more my students would learn. When I, like many others,1 came to realise that what my students were learning was not always what I was trying to teach them, I tried to teach better. What I then found, however, was that the better I taught the better my teaching was rated by students but not, alas, the better they learned. It was only when I encountered constructivism2,3,4 and Alex Johnstone's Information Processing Model of Learning (Figure 1)5,6 that I started to think about the learner and realised that I needed to teach not just better but differently. Knowledge, alas, can't simply be transferred from the teacher to the learner, much though we might wish that it were otherwise; meaning must be constructed in the mind of the learner.2 I see an analogy with digestion where, even for a cannibal, ingested proteins are not incorporated directly into body structures but rather are broken down before being reassembled into useful biomolecules. Learning involves the linking and interpretation of incoming information with what is already known by an individual. As we all have different stores of knowledge in our long-term memories (Figure 1) we may all interpret incoming information differently.

[Figure 1. Information processing model for learning: events, observations and instructions pass through a perception filter into working memory (interpreting, rearranging, comparing, preparing for storage); working memory stores to and retrieves from long-term memory (storage sometimes branched, sometimes as separate fragments), and a feedback loop from long-term memory tunes the perception filter.]

If we look into the black box between teaching and learning it seems reasonable that where new information can be satisfactorily linked to pre-existing knowledge and interpreted, this will be likely to happen. Piaget2,7 referred to this process as assimilation, and under low resolution it is indistinguishable from simple information transfer. However, this will not always be the case, particularly throughout the education process; confusion (disequilibrium) will occur whenever new information can't readily be assimilated into existing schemes. What happens now? Such input can simply be ignored, and it is my belief that this is exactly what our students are increasingly choosing to do; it can be linked and interpreted incorrectly, leading to so-called alternative frameworks;8 or the learner may resort to rote memorisation and try to save the information without linking it to existing knowledge. None of these, of course, represents an acceptable learning outcome from the perspective of the teacher. The possibility also exists, however, that the learner will modify the pre-existing schemes until any discrepancies can be resolved. Piaget referred to this process of modifying and developing pre-existing schemes as accommodation.2,7 Education surely involves learning not just more but better, and I believe that the promotion and facilitation of accommodation is of pivotal importance in this.9 The better we organise information and link it to what we already know, the more easily it is likely to be recalled and applied to new situations in the future. However, Herron recently postulated that learning operates on a principle of minimum effort.10 This suggests that whenever possible the learner will resist restructuring cognitive schemes, preferring to ignore data that don't fit, or to make false connections. Only when the learner becomes really dissatisfied with existing structure will modification start to occur. One can go so far as to suggest that a bad lecture, whatever that may be, followed by peer group discussion or outside reading represents a much richer learning experience than a good lecture, whatever that may be, with minimum student follow-up. Anything which produces active involvement of the learner is therefore likely to enhance the quantity and in particular the quality of learning.

Research by Anderson in 1980 found that questions interspersed in text were amongst the most effective aids to help understanding.11 I therefore decided to use focussed questioning to try to promote learning within a lecture format. The aim of questioning was not to assess current knowledge or understanding, though undoubtedly some misconceptions, which need to be rectified quickly, will be identified. Rather, the questions sought to promote active learning through the stimulation of thinking and the creation of disequilibrium. Questions can contribute positively in at least four ways:

(a) They may promote both variety in the presentation and more active student involvement during lectures.

(b) They may stimulate learning by relating new information to knowledge already stored in long-term memory (Figure 1).

(c) They may help to identify what is particularly important to concentrate on, i.e. they may 'tune the filter' (Figure 1).

(d) They may indicate to the learner where further width or depth of knowledge is needed and help stimulate thought by generating disequilibrium in the learner's mind. Over the years many unsuccessful students have told me that they had expected to do much better in my examinations than they did and that they had understood the material when I covered it in my lectures.12 They hadn't, of course, but like passengers in a car who see no problems with the journey because the driver knows the way, they are incapable of retracing the journey on their own at some future date. It therefore seemed particularly important to break through this complacency.

A question that contributes positively in any of these ways can be considered successful, but I was particularly interested in promoting learning through forcing students to reassess their current knowledge and understanding by creating dissatisfaction with current thinking.

The Study
A 12 week, one hour a week sequence of lectures on bioinorganic chemistry was used to evaluate this approach. The lectures, supported by 12 hours of associated laboratory time, represented one third of a final year honours degree module. The module was taken by 36 full-time students on the BSc (Hons) Applied Biochemical Sciences degree and by 10 part-time students studying for Chemistry or Life Sciences degrees. All students had previously studied some biochemistry as well as chemistry modules, all were well known to me and all but three had been taught by me previously. I started the course with a brief introduction to my ideas about how meaningful learning can take place and to metacognition.13 The importance of active engagement with new information and participation in class activities was stressed. This occupied about 20 minutes at the start of the first lecture. It was gratifying to note subsequently, during informal discussion with several students in individual studies advice sessions, that students both understood and appeared to support the approach being taken.

Each lecture commenced with a brief synopsis of what was to be covered and one Big Question that would be developed and considered during the course of the lecture. The aim of these questions was to prompt students to think about a key problem. Examples of such questions included: Why are only certain elements used by life? and How are ion gradients produced and maintained in the body? A significant number of questions, about ten each lecture, were also posed during the lectures. These were either targeted at individuals or at the class as a whole, but were to be answered directly. Research has clearly identified that waiting time must be adequate if questioning in the classroom is to be productive.14 Care was therefore taken to ensure that adequate time was available and multiple contributions from the class were encouraged. Examples of questions used included: (1) What is a Lewis acid? (2) Given the solubility product of Fe(OH)3 as 10−38, what is the maximum concentration of free Fe3+ available in aqueous media at pH 7? and (3) If copper is the catalytic site, what is the likely role of zinc in superoxide dismutase?
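Question 2 is exactly the kind of back-of-the-envelope calculation intended; the sketch below assumes Kw = 10−14 and the standard solubility-product expression Ksp = [Fe3+][OH−]³:

```python
# Back-of-the-envelope sketch of Question 2 (standard equilibrium
# reasoning; the Ksp value is the one quoted in the question).
# At pH 7, [H+] = 1e-7 M, so [OH-] = Kw / [H+] = 1e-7 M:
ksp = 1e-38
oh = 1e-14 / 1e-7          # [OH-] at pH 7, assuming Kw = 1e-14
fe3_max = ksp / oh ** 3
print(f"{fe3_max:.0e} M")  # 1e-17 M: effectively no free Fe3+ at pH 7
```

The point of the question is the conceptual step (relating pH to [OH−] and rearranging the Ksp expression); the arithmetic itself is trivial.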


While the first of these questions does appear to require only simple factual recall, its purpose is not to find out that most students, on hearing the word acid, immediately think of hydrogen ions. Rather it is used to alert the individual that knowledge in long-term memory (Figure 1) which relates to polarisation will be needed. Lewis acidity is a major and recurring theme on the course, and related questions used in subsequent lectures included: Why are metal ions needed when the proton is so effective as a polarising cation? When would 'life' choose to use Mg2+ rather than Zn2+ as a Lewis acid catalyst? and Why do you think 'life' chooses Zn2+ rather than Cu2+ as a Lewis acid catalyst?

Question 2 is more complex, requiring both conceptual understanding and a 'back of the envelope' calculation. Given time, most students should have been able to solve this problem, but in the lecture context progress was only made after some prompting on how best to proceed. In many ways the final question might be considered the most demanding of the three. Here some students had little difficulty in deciding that the most likely role would be structural. Although this question predated any detailed discussion of the biological roles of zinc, a general consideration of roles undertaken by metal ions had previously taken place.

In their own way, each of these three very different questions can be thought to show that what is being presented is more complex than it might at first appear, and hence to create a learning opportunity.

Four approaches were used to evaluate the effectiveness of the approach just described:

(a) Classroom observations.
(b) Peer observation in week 10.
(c) An end of module questionnaire.
(d) A group discussion with three students (four were invited but one was unable to attend) conducted in week 10.

Results

(a) Classroom Observations
Having explained the importance of active participation at the onset, I had hoped that willingness to contribute would improve rapidly once students became familiar with the process. However, while five or six students were regularly prepared to contribute, the vast majority avoided answering unless directly challenged. The lullaby effect15 was apparent, with many answers being shallow and not indicative of deep thinking. For example, to the question "Why is lead not an essential element?" the answer "because it is toxic" seemed to be accepted by all the students. Only after I pointed out that oxygen had been extremely toxic to the earliest forms of life did the class appear to be willing to refocus on cause and effect. Although the level of student contributions fell below what I had hoped for, perhaps it was unreasonable to expect to obtain evidence of meaningful learning concurrent with the teaching (vide infra). As an optimist, I can still hope that the stimulus/disequilibrium resulting from the questions may yet initiate active engagement with the information over a more appropriate time scale.

(b) Peer Observation
As it was the process rather than the content that was of interest, I decided against using a fellow chemist and asked the Coordinator of Learning and Teaching for the Faculty of Business and Management if she would sit in and observe one of my teaching sessions. We met some 15 minutes prior to the lecture and I briefed her on what I was trying to achieve in the session; she made a series of notes throughout the lecture and we met up for a debriefing session later on the same day. Although much good practice was identified, the key observation as far as I was concerned was that I eventually answered all the questions myself. The students knew that I was going to do this and were happy to wait for my answers.

(c) Student Questionnaire
The questionnaire asked students to assess the helpfulness of six aspects of the teaching on a six-point scale and then invited free responses relating to the best aspects of the teaching, the worst aspects of the teaching and any suggestions as to how teaching might be improved. The questionnaire was handed out at the end of the last lecture. One student was asked to collect and return the completed questionnaires to me; 34 were subsequently returned. The questionnaire and responses to the six Likert-scale questions are shown in the Appendix.

All six aspects of the teaching which were evaluated appeared to be well supported by the class, with a significant majority assigning one of the two top grades for each feature. The course booklet, which contained gaps (many of which related to the questioning) to be filled in during the lecture, received outstanding ratings from students returning the questionnaire. They were familiar with the use of structured incomplete handouts as I use this approach throughout my teaching.16

Although no textbook was recommended, students were informed that any modern inorganic text was likely to contain a relevant chapter well worth reading. Sixteen references to original papers were provided, and the lecture to which each related was indicated. Five complete compilations of the 16 references were made available to the class to be used and shared throughout the semester. Several students chose to make their own copies. My discussion on how learning takes place was also generally well received and, as noted above, discussion with students led me to believe that they both understood and supported the ideas outlined.

The usefulness of both the Big Question and frequent questioning appeared to be highly rated. Hopefully this suggests that even where students were reluctant to voice their opinions they were still thinking about the questions. The free response questions (7, 8 and 9) did provide some interesting information. All respondents identified a best aspect of the teaching. A clear majority (19 students) identified the course booklet, which is primarily a simple information transfer technique, as the best aspect of the teaching. Only four students out of the 34 returned questionnaires opted for the questioning. Few students listed any worst aspects, though two suggested that there was a lot of material to cover and that this led to things being rushed. A further two suggested that starting the lecture at 9.15 on Monday mornings was the worst aspect. Only the following suggestions for improvement were made:

(i) More marks for coursework.

(ii) Review answers to past examination papers to enable students to know what will be required.

(iii) Supplement references with appropriate webpage addresses.

While these suggestions all seem reasonable, (i) and (ii) provide further evidence for the assessment-driven motivation for learning which we continue to encounter,17 and (iii) would be more justified if there was evidence that the 16 references provided had been well studied.

(d) Group Discussion
A number of issues, mirroring the questionnaire, which I wanted discussed were considered by three of the students. I was there to introduce the topics but I took no part in the discussion. The students talked about each issue for some minutes and one student wrote a summary of the consensus view. The group considered that the discussion on learning was a good way to start the course and was useful because it prompted students to think about how they learned. The students thought that the introduction of frequent questions during lectures was beneficial because it helped them to realise how well, or how little, they understood the topics being discussed. The Big Questions were also considered useful because they helped to unify each lecture. It was, however, suggested that more active discussion of these questions would help.

Reflective Discussion
Though the approach appeared to attract widespread student support, it clearly did not produce the increased levels of student participation that I had hoped for. It is tempting to suggest that the assessment-driven motivation which directs the behaviour of most students probably means that they did not want to answer my questions; they merely wanted to know my answers. However, perhaps on reflection, my aims were rather ambitious. A majority of students appear likely, initially, to resist any innovative approach to teaching,18 so an attempt to introduce the questioning approach was unlikely to meet instant success. Clearly, however, if progress is to be made students must be coerced into contributing more effort towards developing their own answers and hence enhancing their knowledge creation. Unfortunately it seems likely, as recently suggested by Bahar et al.,19 that learning does not occur simultaneously with, but after, the teaching. This suggests that more success might be encountered if students were required to answer the questions at some time in the future. I have in fact tried to finish lectures with a question which the class is required to answer at the start of the next lecture, but it was evident that only a few students thought about these questions in the meantime. I believe that I have had success with the use of buzz groups,12 but this is a very time-demanding process and thus has to be used sparingly. A recent paper by Hutchinson20 suggests that awarding some marks for participation will encourage interaction. However, I suspect that Hutchinson's success can probably be attributed to the fact that his students are required to study appropriate chapters before coming to the classes. As this approach was common throughout the general chemistry teaching programme, the students appeared comfortable with the requirements and complied with them. My use of questioning, however, differed from what students encountered in other lectures, and more familiarity with the approach is probably needed before progress could be expected. Any attempt to promote more interactive learning will not be straightforward and, unless innovations are introduced with care, may even be counterproductive. I recently heard of a Management module in the third year of an engineering degree where the lecturer asked students to prepare information for oral presentation to the next class. The next class was attended by only 11 of the 110 enrolled students.

The use of student questionnaires for both teaching quality assessments and the evaluation of teaching innovations is now widespread. The present study supports my own experiences over many years that these questionnaires need to be interpreted with caution, particularly the quantitative aspects. I believe that many unwarranted conclusions continue to be drawn from the indiscriminate use of such data.

Peer observation was employed as an assessment tool almost as an afterthought, yet this clearly provided an extremely useful insight into what was actually happening. Perhaps I should have seen what was happening; perhaps I eventually would have; it was certainly clear once the suggestion was made. The experience certainly convinced me that we could all benefit from sympathetic and constructive peer observation and support.

Throughout the course there was constant conflict between the time required for questioning and the demands of the curriculum. I can only agree with the two students who suggested that the course was rushed in parts. It seems certain that increased student interaction will require more time than simple didactic teaching. It is therefore perhaps instructive to consider the importance of time to learning.

William Byers
U.Chem.Ed., 2001, 5, 28. This journal is © the Royal Society of Chemistry

Some Thoughts on the Relevance of Time to the Learning Process
It is clear from earlier comments that meaningful learning requires effort and is therefore likely also to require time. However, the information-processing model (Figure 1), which has been so useful to understanding how we learn, is, at least in the way that I have viewed it, a time-independent model. Yet time is clearly a key variable in determining the quality of learning that can take place. It is surely much easier, and therefore quicker, merely to transfer information to the learner than for meaningful learning to occur. In fact I suggest that getting the information into the mind of the learner is really the first step towards meaningful learning. This corresponds to what is usually called rote memorising or surface learning. Time and effort are then required to link, interpret, possibly correct (if initially misconstrued) and then accommodate this new information to produce deep or meaningful learning. So, far from being alternatives, rote memorising and meaningful learning may be considered as different stages within the learning process (Figure 2). The second step requires both effort from the learner and time for meaningful learning to develop. The model is clearly consistent with recent suggestions that teaching less, i.e. reducing the rate of information transfer, can actually lead to more learning taking place.21-24 The model thus predicts that where new concepts are being taught, sufficient time as well as effort is required to enhance cognitive schemes through accommodation, and therefore questions the pedagogical soundness of the recent move towards widespread semesterisation in UK universities. Students all too often treat each module as an isolated unit and do not have, or do not take, the time to reflect on what they have memorised. It would indeed be tragic if current benchmarking exercises served to increase curricula rather than to embrace and scaffold learning outcomes.

Acknowledgements
The author wishes to thank Kate Greenan for her supportive and insightful comments during the peer observation process, John Garratt for his constructive and encouraging comments on the original manuscript, and David Ruddick for many informative discussions.

Figure 2. Model showing the time dependence of learning: Teaching → (information transfer, FAST) → Rote memorising → (knowledge construction, SLOW, r.d.s.) → Meaningful learning.


Appendix

Bioinorganic Chemistry

Student Opinions on Teaching Approach

Please indicate by ticking the appropriate box how helpful you have found each of the following features of the teaching (starting with 0 to indicate useless, rising to 5 where you would consider the feature indispensable).

Number of responses for each option (on the 0-5 scale) were as follows:

1. Course booklet: 9, 25
2. Recommended references: 1, 5, 10, 11, 7
3. Discussion on how learning takes place: 2, 3, 10, 13, 4
4. Big question: 1, 1, 13, 13, 6
5. Frequent questions during lectures: 1, 10, 15, 8
6. Prelab session: 4, 4, 6, 13, 3

7. Please indicate what you consider to be the best aspects of the teaching.

8. Please indicate what you consider to be the worst aspects of the teaching.

9. Please indicate how you believe the teaching could be improved for you.


References

1 C.J. Garratt, U.Chem.Ed., 1997, 1, 19.
2 G.M. Bodner, J.Chem.Ed., 1986, 63, 873.
3 L.B. Resnick, Science, 1983, 220, 477.
4 J.N. Spencer, J.Chem.Ed., 1999, 76, 566.
5 A.H. Johnstone, J.Chem.Ed., 1997, 74, 262.
6 A.H. Johnstone, Educ.Chem., 1999, 36, 45.
7 R. Good, E.K. Mellon and R.A. Kromhout, J.Chem.Ed., 1978, 55, 688.
8 K.S. Taber, Educ.Chem., 1999, 36, 135.
9 G.J. Posner, K.A. Strike, P.W. Hewson and W.A. Gertzog, Science Education, 1982, 66, 211.
10 D. Herron, The Chemistry Classroom: Formulas for successful teaching, American Chemical Society, Washington, DC, 1996, p. 18.
11 D. Herron, The Chemistry Classroom: Formulas for successful teaching, American Chemical Society, Washington, DC, 1996, p. 21.
12 W. Byers, Don't Throw the Baby out with the Bathwater, Proceedings of the 4th European Conference on Research in Chemical Education, Univ. of York, 1997, p. 52.
13 D. Rickey and A.M. Stacy, J.Chem.Ed., 2000, 77, 915.
14 M.B. Rowe, J.Res.Sci.Teach., 1974, 11, 81.
15 M.L. Bigge and S. Shermis, Learning Theories for Teachers, 6th Ed., Longman, New York, London, 1999, p. 285.
16 W. Byers, in Proceedings of Variety in Chemistry Teaching 1995, eds. J. Garratt and T. Overton, Royal Society of Chemistry, London.
17 A.H. Johnstone, Educ.Chem., 2000, 37, 12.
18 R.F. Orzechowski, J.Col.Sci.Teach., 1995, 347.
19 M. Bahar, A.H. Johnstone and R.G. Sutcliffe, J.Biol.Ed., 1999, 33, 134.
20 J.S. Hutchinson, U.Chem.Ed., 2000, 4, 1.
21 F. Garafalo and V. Lo Presti, J.Chem.Ed., 1993, 70, 353.
22 C.J. Garratt, U.Chem.Ed., 1998, 2, 29.
23 A.B. Arons, J.Col.Sci.Teach., 1980, 10, 81.
24 A.H. Johnstone and W.Y. Su, Educ.Chem., 1994, 31, 75.


George M Bodner

U.Chem.Ed., 2001, 5, 31. This journal is © the Royal Society of Chemistry

Why Lecture Demonstrations Are 'Exocharmic' For Both Students And Their Instructors

____________________________________________________________________________________________

George M. Bodner

Department of Chemistry, Purdue University, West Lafayette, IN 47907, USA
E-mail: [email protected]

A theoretical model is proposed to explain why lecture demonstrations are often popular among both students and their instructors. This model provides hints about selecting demonstrations that are most likely to enhance the learning of chemistry. It also suggests ways in which demonstrations can be used more effectively.

Introduction
There are many reasons for doing lecture demonstrations.
• They are fun to do.
• Students like them.
• They grab the students' attention.
• They provide breaks that help students recover from the deluge of information in a typical class.
• They provide concrete examples of abstract concepts.
• Most importantly, they can teach chemistry.

Demonstrations are so attractive they are sometimes done under conditions where neither the students nor the instructors are adequately protected against injury. In an earlier paper we collected examples of accidents and near accidents that might remind chemists of the need to pay more attention to safety when doing demonstrations, hopefully without frightening them away from demonstrations.1

Some demonstrations are so much fun for both the students and their instructors that the term exocharmic has been used to describe these demonstrations that are so inherently fascinating they "exude charm".2 The thermite reaction might be an example of an exocharmic demonstration.

Fe2O3(s) + 2 Al(s) → Al2O3(s) + 2 Fe(l)

In a lecture manual developed for use at Purdue University, we describe various ways in which this popular demonstration can be used.3 We argue that it can be used as the basis for discussions of the chemistry of the elements; to demonstrate what we mean by the term exothermic; to convince students that aluminium is not 'inert', regardless of their experience with sandwiches wrapped in aluminium foil; or to help students develop an appreciation of what we mean when we say that a reaction gives off approximately 800 kJ/mol.

This paper is based on the assumption that none of these uses satisfactorily explains the enormous attraction that demonstrations of the thermite reaction have for both students and their instructors. Furthermore, it assumes that it might be useful to understand the fascination of 'exocharmic' demonstrations such as the thermite reaction, so that demonstrations can be used more effectively to teach chemistry. It therefore proposes a model based on a theory of motivation, which assumes that these demonstrations fall into the category of phenomena known as discrepant events.

A Theory Of Motivation
When I was a student, the most hated words in the English language were 'intuitively obvious' because they were invariably used to describe things that were never obvious to me. When I became a teacher, the most hated words became: "Is this going to be on the exam?" Often, but not always, students' use of this phrase stems from their questioning the value of material we ask them to learn because they don't think it is important. We've seen this behavior with students of all ages, from elementary school through the final stages of graduate work. It is frequently a sign of the instructor's failure to motivate the students to want to learn.

Motivation is a complex topic.4 One aspect of motivation, however, can be understood in terms of the theory of optimal arousal.5 This theory assumes that we try to attain a state in which we experience some arousal of our senses — not too little, nor too much. At times, we devote considerable effort to raising our arousal level by reading exciting stories; by going to scary movies; by riding roller-coasters; and so on. Other times, we escape situations where too much is happening to seek peace and quiet. It isn't just the frequency and intensity of the input our senses receive that determines arousal. The content of this sensory input is also important. Consider what would happen if you put down that fascinating document on household contents insurance and picked up a novel you found interesting. There would be no change in the frequency or intensity of the input your senses would receive. (No more light would reach your eyes, for example.) But it is likely that there would be a change in your level of arousal.

What makes us respond to an object or event isn't the physical input of our senses as such, but a difference between what we experience and what we expect. In other words, we respond to situations that have an element of surprise.

We tend to like mild surprises, not severe ones. If there is no surprise, there is too little arousal and we feel bored. If there is too much surprise, we feel shocked and disoriented. With due apologies to the author of the story of Goldilocks and the Three Bears, this theory assumes that we avoid extremes of either too much or too little surprise, and tend toward an intermediate stage in which the amount of surprise is 'just right.'

There is abundant evidence that we become habituated to events that occur on a regular schedule to the point that we ignore them. (We no longer respond when the event occurs.) One of the most common occurrences of this phenomenon in the classroom involves rhetorical questions. It has been shown that many teachers fail to give students enough time to develop answers to the questions they ask.6 The students soon become habituated to the teacher's tendency to ask questions for which answers aren't expected — rhetorical questions — and from that point on, they don't even notice that questions are asked.

Educators have long recognized the role of the unexpected in motivating students.7 Curiosity, for example, has been shown to be an important component of learning, particularly among children.8 But what is curiosity if not a drive to investigate and understand situations that evoke surprise? Individuals in all age groups show a marked preference for objects or situations that are novel; that have an element of surprise or incongruity; that generate uncertainty. One of the simplest ways of introducing surprise into the classroom — and thereby taking advantage of students' natural curiosity — is through a phenomenon known as a discrepant event.

Discrepant Events
Discrepant events have two characteristic properties: They are contrary to what we intuitively expect and they are events we experience for ourselves. Being told something that is counterintuitive doesn't constitute a discrepant event because we can resolve the conflict between what we hear and what we expect by questioning the validity of what we are told. This is harder to do when we observe the event ourselves.

The thermite reaction can be a discrepant event. Students know that chemical reactions give off energy. (They might even know how to calculate the amount of energy liberated.) But the magnitude of the energy given off in this demonstration and the speed with which it is liberated are counterintuitive. Even those of us who should know better are still surprised by the vigor with which two seemingly 'inert' solids react.

Young children are often surprised by iodine clock reactions9 when they first encounter them because of the speed with which the solution turns from colorless to deep blue. The 'Old Nassau' demonstration9 — in which the solution first turns orange and then black — is even more surprising because students don't expect the contents of a beaker to change color twice. Oscillating clock reactions,10 however, are better examples of discrepant events. Regardless of the extent to which students have been exposed to the concepts of reversible reactions, equilibria, kinetics, thermodynamics and so on, there is nothing in their prior experience that prepares them for a reaction that cycles between states in which the solution is colorless, then gold, and then blue.

Implications Of This Model Of Exocharmic Demonstrations
The notion that some of the fascination of lecture demonstrations results from the fact that they may be discrepant events has an important implication: Demonstrations don't have to be spectacular to be effective. They should, however, contain an element of the unexpected.

Let me offer an example from my own experience. When I took chemistry for the first time, I was told that equal volumes of different gases contained the same number of particles. Until I took physics, this was the most absurd thing I had heard a teacher claim to be true. I knew that gases contained empty space, but I seriously underestimated the fraction of the space that is empty. It therefore seemed reasonable to expect that equal numbers of gas particles of different size would occupy different amounts of space. I now know I was in good company; Dalton rejected Gay-Lussac's data on combining volumes for the same reason. To me, and to many of my contemporaries, Avogadro's hypothesis was just as counterintuitive as it was to John Dalton.

I am reasonably confident that I could have stated Avogadro's hypothesis, if asked to do so on an exam. I am equally confident that I couldn't have used Avogadro's hypothesis to solve a problem because I didn't really believe it to be true.

About 15 years ago, I learned a lecture demonstration that provides a discrepant event that confronts the intuitive model of gases I brought to my first chemistry course.11 Start with a plastic 50-mL Luer-lok syringe, a syringe cap, and a 10-penny nail. Pull the plunger out of the barrel until the volume reads 50 mL. Now drill a small hole through one of the veins of the plunger into which the nail can be inserted, as shown in Figure 1.

Push in the plunger until no gas remains in the syringe, seal the syringe with a syringe cap, pull the plunger back out of the barrel of the syringe, insert the nail into the hole, and weigh the 'empty' syringe to the nearest 0.001 grams with an analytical balance. Fill the syringe with different gases∗ and determine the weight of 50 mL of each gas. Now use the molar mass of each gas to calculate the number of gas particles in each sample.
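The arithmetic behind this final step is simply the measured mass divided by the molar mass, multiplied by Avogadro's number. A minimal sketch of the calculation, using a few of the masses reported in Table 1 together with standard molar masses (the sketch itself is an illustration and not part of the original demonstration):

```python
# Convert the measured mass of 50 mL of each gas into a number of
# particles via its molar mass. Masses are values reported in Table 1;
# molar masses are standard values in g/mol.

AVOGADRO = 6.022e23  # particles per mole

samples = {
    # gas: (mass of 50 mL in grams, molar mass in g/mol)
    "N2": (0.055, 28.0),
    "O2": (0.061, 32.0),
    "Ar": (0.081, 39.9),
    "CO2": (0.088, 44.0),
}

for gas, (mass_g, molar_mass) in samples.items():
    moles = mass_g / molar_mass
    particles = moles * AVOGADRO
    print(f"{gas}: {particles:.2e} particles in 50 mL")
```

Each sample works out to roughly 1.2 × 10^21 particles, which is the point of the demonstration.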

Typical data obtained with this apparatus are given in Table 1. Within experimental error, the number of gas particles in each sample is the same. It might still seem strange that equal volumes of different gases contain the same number of particles, but it is no longer possible to avoid this conclusion. Although this demonstration isn't as spectacular as the thermite reaction, or one of the oscillating clocks, it can still be exocharmic because it contains an element of surprise for many students.

∗ A simple way to handle gases for lecture demonstrations starts with a 1 or 2 inch length of ¾-inch diameter plastic tube. Plug one end of the tube with a rubber septum and secure the septum to the tube with copper wire. Fill a balloon with the appropriate gas from a gas cylinder and insert the open end of the plastic tube into the mouth of the balloon. A sample of the gas can now be collected with a syringe by inserting the syringe needle through the rubber septum.

Figure 1

Gas       Weight of 50 mL of gas   Number of particles in 50 mL of gas
H2        0.005 g                  1 × 10^21
N2        0.055 g                  1.2 × 10^21
O2        0.061 g                  1.2 × 10^21
Ar        0.081 g                  1.2 × 10^21
CO2       0.088 g                  1.2 × 10^21
C4H10     0.111 g                  1.15 × 10^21
CCl2F2    0.228 g                  1.14 × 10^21

Table 1

Some demonstrations, such as the hydrogen whistle,12 are such excellent sources of surprise that nothing has to be done to enhance their status as discrepant events. The hydrogen whistle is based on the apparatus in Figure 2, which consists of a pair of metal funnels welded together so that there is a small hole at the top and a somewhat larger hole in the bottom. The apparatus is filled with H2, the hole at the top is plugged with a match, and a rubber stopper is used to close the hole at the bottom. The lights are then dimmed, the stopper and match are removed, and the match is used to ignite the H2 that escapes through the hole at the top. Attention is drawn to the small flame at the top of the apparatus and the students are told to listen carefully. As the H2 is consumed, air rushing in through the hole in the bottom makes the apparatus vibrate, and a clear 'whistle' can be heard.

The frequency of the whistle changes with time, as the average molecular weight of the gas in the apparatus increases. With the lights dimmed, and the students paying careful attention to the change in the frequency of the low-intensity whistle the apparatus emits, the demonstration becomes 'striking' when the gas in the container reaches one of the explosive mixtures of H2 and O2 — and a loud detonation is heard as a flame shoots out of the bottom of the apparatus.

Demonstrations And The Theory Of Cognitive Change
Some might argue that demonstrations, by themselves, are sufficiently powerful as a teaching device that all we have to worry about is simply doing them. Sarason, however, suggested the following rule for curriculum development or curriculum reform: "A good idea, whose time has come, is no guarantee of success".13 We'd like to propose a corollary to Sarason's rule: An ideal demonstration, done 'properly', is no guarantee that students will learn what we thought we demonstrated.

Instead of having the students play the role of passive observers of a demonstration, we might use the demonstration as the basis of a phenomenon that White and Gunstone describe as a POE task — from Prediction, Observation, and Explanation.14 The first step in a POE task is to ask students to predict the outcome of some event, such as what might happen during a demonstration. They are then asked to describe what they observe, and finally asked to reconcile any conflict between what they predict and what they observe. Much has been written in recent years about the misconceptions students bring to chemistry15 and the fact that these misconceptions are difficult to change.16

Demonstrations, by themselves, won't overcome misconceptions, but they can provide the basis on which conceptual change is built.

Strike and Posner17 have proposed a model of conceptual change that begins when students become dissatisfied with their present concept. They argue that dissatisfaction is necessary for conceptual change to occur, but not sufficient to induce the change. The student must understand the new concept they have been asked to learn. The new concept must also seem plausible to the students. And, the new concept must seem fruitful — it must seem worth learning. The demonstration of Avogadro's hypothesis in Figure 1 provides the basis for the first step in this model, the stage at which students begin to question the conceptual understanding they bring to the course.

Conclusion
If you accept the arguments in this paper, you can think about demonstrations in terms of the following guidelines.
• There is no evidence that students learn from demonstrations, by themselves.
• There is some evidence that students remember the visual images of a demonstration long after they forget the words.
• Good demonstrations provide a basis on which learning can be built.
• Demonstrations don't have to be spectacular — or dangerous — to be useful.
• Demonstrations that contain an element of surprise, which don't behave the way students might expect, are often the most charming.
• Demonstrations that are the most charming are those that students remember.
• Demonstrations that are charming can therefore facilitate both the learning of chemistry and the retention of this knowledge.
• Demonstrations that students find 'exocharmic' might therefore be those that best teach chemistry.

Figure 2

You might also note that the spectacular (and often dangerous!) demonstrations that attract some students to chemistry might be driving others away by giving students an unrealistic image of what chemists do.

References
1 G.M. Bodner, J.Chem.Ed., 1985, 62, 1105.
2 R.W. Ramette, J.Chem.Ed., 1980, 57, 68.
3 G.M. Bodner, P.E. Smith, K.L. Keyes and T.J. Greenbowe, The Purdue Lecture Demonstration Handbook, 6th Edition, West Lafayette, IN, 1997.
4 R.J. Ward and G.M. Bodner, J.Chem.Ed., 1993, 70, 198.
5 D.G. Mook, Motivation: The Organization of Action, W. W. Norton, New York, 1987.
6 M.B. Rowe, Teaching Science as Continuous Inquiry, McGraw-Hill, New York, 1978.
7 R.L. Shrigley, Sci. and Children, 1984, 21(7), 6.
8 J.J. Koran and S.J. Logino, Sci. and Children, 1982, 20(2), 18.
9 H.N. Alyea and F.B. Dutton, Tested Demonstrations in Chemistry, Journal of Chemical Education, Easton, PA, 1965, p. 19.
10 B.Z. Shakhashiri, Chemical Demonstrations: A Handbook for Teachers of Chemistry, Volume 2, University of Wisconsin Press, Madison, WI, 1985, pp. 232-307.
11 I. Talesnick, 8th Biennial Conference on Chemical Education, Storrs, Connecticut, 1984.
12 R.D. Eddy, J.Chem.Ed., 1959, 36, 256.
13 S.B. Sarason, The Predictable Failure of Educational Reform: Can We Change Course Before It's Too Late?, Jossey-Bass, San Francisco, 1990.
14 R. White and R. Gunstone, Probing Understanding, The Falmer Press, London, 1992, Chapter 3.
15 M.B. Nakhleh, J.Chem.Ed., 1992, 69, 191.
16 G.M. Bodner, J.Chem.Ed., 1991, 68, 85.
17 K.A. Strike and G.J. Posner, A Conceptual Change View of Learning and Understanding, in Cognitive Structure and Conceptual Change, L.H.T. West and A.L. Pines, Ed., Academic Press, New York, 1985.


Jim Ryder

U.Chem.Ed., 2001, 5, 36. This journal is © the Royal Society of Chemistry

Science and the public: Teaching about chemistry on university courses

__________________________________________________________________________________________

Jim Ryder

Centre for Studies in Science and Mathematics Education, School of Education, University of Leeds, Leeds, LS2 9JT, United Kingdom
E-mail: [email protected]

A distinction can be drawn between knowledge of chemistry (the facts, concepts and relationships of chemistry, e.g. the structure of benzene, valency, Raoult's law) and knowledge about chemistry (the practices of chemistry, e.g. how chemists decide which questions to investigate, how new knowledge claims in chemistry are developed and validated, and how disagreements between chemists are resolved). Such knowledge about chemistry is of relevance to all chemistry undergraduate students irrespective of their future employment intentions. Whilst knowledge about chemistry is inevitably an aspect of university chemistry courses, it is suggested that knowledge about chemistry needs to be taught explicitly and should be a recurring feature of university chemistry teaching.

Science and the public
There is growing interest both in the UK and worldwide in the ways in which science interacts with public policy. The relevance of this issue has been highlighted in recent debates, such as those concerning the safety of foodstuffs derived from genetically modified organisms, whether or not depleted uranium used in warheads might be a cause of leukaemia amongst military personnel, and the safety or otherwise of the measles-mumps-rubella (MMR) vaccine. In all these cases appeals have been made to scientists to provide evidence to inform public debate. On the morning I began work on this perspective I had listened to an interviewer on a national radio programme questioning two scientists concerning their work on the toxic and radioactive effects of depleted uranium on humans, an excellent opportunity for contemporary science findings to inform media debate. However, from the listener's point of view, the key feature that emerged from the interviews was that the two scientists disagreed about the conclusions that could be drawn from their work. What is the listener to make of this? Is one of the scientists incompetent or even biased? What the listener, and perhaps also the scientists being interviewed and the interviewer herself, needed was some understanding about how science works, i.e. knowledge about science. This needs to be part of people's general understanding about science to enable them to engage in science-related debates as they arise. In the context of the radio interview the listener should be able to appreciate that the question of the impact of depleted uranium on human health is a complex one. Carrying out empirical work in this area using human subjects is not an option. Whilst empirical investigation might involve non-human subjects or in vitro studies, such work is open to questions about the validity of extrapolating its results to humans. A retrospective investigation might involve a statistical study of the health of military personnel, relating it to their exposure to depleted uranium. Here issues such as sample size, estimating dosage and the location of exposure in the human body become important. Also, cases of leukaemia may have occurred as a result of other causes. How can these be distinguished from those that might follow from exposure to depleted uranium? All these considerations involve knowledge about how science works as much as they do technical knowledge of uranium and its physiological effects on humans.

A number of detailed studies1 have been made of how non-scientists make decisions on issues with a scientific dimension. Examples involving chemistry include local debates about the toxicity of emissions from an industrial site located near to urban housing and the impact and causes of acid rain. As in the depleted uranium case, these studies show that the knowledge important in these issues is not solely, or even significantly, knowledge of the facts of science. It is knowledge about science that often plays the most crucial role as scientists, mediators of science, and the public become involved with science as it affects issues of public policy.

Purposes of university chemistry courses

University chemistry courses provide preparation for three broad areas of employment: as professional research chemists; chemistry-related employment in the media, teaching, the commercial sector, or within local or national government; and employment not directly related to chemistry, such as work in the business and finance sectors. Where science relates to issues of public interest, such as in the examples given earlier, individuals in all three employment areas may be involved. Research chemists generate new findings and are asked to report on these to their peers, their funders and the public. Science journalists, spokespeople for commercial companies and pressure groups, and science policy makers provide their reactions to the findings of the scientists. Those not professionally involved in science react to the findings by making choices as consumers, protesters and/or voters. In many contexts public response can have a significant impact on the direction of future science research as commercial and governmental organisations react to consumer and voter pressure. In this way all graduate chemists have a role to play in public debates about science. Given the crucial role of knowledge about science in such debates, its incorporation into the curriculum would be a service to all chemistry undergraduates.

Additional impacts of knowledge about chemistry

Aside from the science and public policy rationale outlined above, there are additional, perhaps more immediate, reasons for developing students' ideas about how chemistry works. There is growing evidence that encouraging students to think about the structure and purposes of scientific knowledge can support their understanding of science concepts. For example, one study2 designed and evaluated an upper secondary course that included teaching about the general relationships between theory and phenomena in science alongside the teaching of energy transfer in electrical circuits. For many students an understanding of the nature of scientific knowledge enhanced their ability to apply their developing understanding of the concept of energy in electrical circuits in experimental situations. To my knowledge, the interaction between knowledge about science and science concept learning has yet to be examined within university science courses. By contrast, the interaction between ideas about science and university science students' experiences of investigative work has been examined. A study involving chemistry undergraduates found that naïve views about how data and theoretical models interact in science can act as a barrier to students' progress during investigative project work.3 It is likely that emphasising knowledge about chemistry within university courses will enhance students' understanding of chemistry concepts and their actions during investigative work.

Knowledge about chemistry in the curriculum

Associated with continuing concern about the nature of the interaction between science and those not professionally involved in science,4,5 there have been a number of initiatives to emphasise knowledge about science within pre-university science education.6 For example, the current National Curriculum for Science in England has a new section entitled 'ideas and evidence in science' that focuses on 'how science works'. At the post-16 level there is a new AS course, 'Science for Public Understanding';7 a group supported by the Royal Society has begun investigations towards an AS course on the 'History and Philosophy of Science'; and a project funded by the Nuffield Foundation has recently published materials for teaching about science within A level science courses.8 Similar projects have been pursued outside the UK.9

Against this background, new entrants to university chemistry courses will increasingly be aware of discussions about how knowledge in chemistry is developed, how disputes in chemistry arise and are resolved, and what chemistry knowledge can and cannot contribute in complex problems outside the laboratory whenever chemistry interacts with issues of public concern. In part this article aims to contribute to a debate about whether/how university chemistry courses should respond to these developments.

University students' knowledge about chemistry

It might be said that 'how chemistry works' is addressed already in university courses. Indeed any course that requires students to apply chemical knowledge in problem solving tasks, and to undertake science investigations of their own, is inevitably raising issues of knowledge about chemistry. However, many of the areas in which students are asked to solve problems or conduct chemical investigations are far removed from the ways in which chemistry impacts on public policy. For example, many first and second year courses involve investigations in the laboratory in which the answer is already known and the detailed guidance notes limit the chances of things going wrong. Of course such activities provide a legitimate way of developing students' ability to use established empirical techniques. However, they are unlikely to communicate the uncertainties and complexities of applying chemistry to issues of public concern.

Furthermore, despite the inevitable presence of knowledge about chemistry in university courses, several studies have shown that students often leave university with very naïve views about how science works.10,11 Many students see science as capable of providing 'hard facts', believing that it is always possible to obtain data that will provide a single, incontrovertible interpretation. The presence of uncertainty and multiple interpretations, particularly in complex settings, is often not recognised.


Teaching knowledge about chemistry

So how can knowledge about chemistry be communicated within undergraduate courses? The strongest message coming from the few studies that have been conducted to date is that knowledge about how chemistry works needs to be taught explicitly. It is rarely sufficient for students to engage in chemical investigations or chemical problem solving activities for them to develop their knowledge about chemistry. For example, we followed the experiences of 11 undergraduate science students (including 2 chemists) as they undertook final year research projects over a period of 8 months.12 These projects gave students the experience of engaging in authentic research that addressed complex issues. Gathering reliable data was often a real challenge and in most cases only tentative conclusions could be drawn. However, experiencing authentic research was found not to be a sufficient condition for being able to articulate an appropriate view about 'how science works'. Many of these students persisted with their view of science as always involving 'hard facts'.

To make knowledge about chemistry explicit, students need to be encouraged to ask questions about the structure, purpose and limitations of chemistry knowledge. How sure can we be about our conclusions? Do our findings enable us to make any generalisations outside the context of the study? What can our laboratory study tell us about the chemistry of materials in contexts outside the laboratory? What additional issues would need to be considered? Are there other possible interpretations of the data? If so, what should chemists do next in order to resolve the dispute? It might be said that add-on courses on the History and Philosophy of Science/Chemistry could serve this function. I would argue that whilst such courses do serve legitimate aims, they are not best placed to develop students' ideas about science with a view to supporting their consideration of public policy issues and their learning of science concepts and investigative activities. The teaching of knowledge about chemistry needs to be an integrated part of a university chemistry course, with discussions about how chemistry works running through lecture courses, problem solving classes, investigative work, and (critically) assessment activities.

Conclusion

Not everyone would agree that a strengthening of knowledge about chemistry within university courses is desirable, particularly given the other pressures on curriculum time. This perspective aims to contribute to a debate about how/whether university chemistry courses should respond to the increasing focus on knowledge about science within pre-university education. Secondly, lecturers themselves have limited experience in explicit teaching about chemistry, and few resources are available to support such teaching. This perspective is also a plea for university chemistry teachers who recognise the need for knowledge about chemistry teaching to develop such teaching and to communicate to others what works and what doesn't; there are signs that this is beginning to happen.13 Finally, there has been little, if any, research into the impact of knowledge about chemistry teaching on students' ideas about chemistry within university courses. Such studies would provide insights into how students develop new ways of thinking about the practices of chemistry both within chemistry research and also in matters of broad public concern.

References
1. J. Ryder, Studies in Science Education, 2001, 36, (in the press).
2. A. Tiberghien and O. Megalakaki, European Journal of Psychology of Education, 1995, 10, 369.
3. J. Ryder and J. Leach, International Journal of Science Education, 1999, 21, 945.
4. House of Lords, Science and society, Third report of the Select Committee on Science and Technology, UK, 2000.
5. Office of Science and Technology / The Wellcome Trust, Science and the Public: A review of science communication and public attitudes to science in Britain, 2000.
6. R. Millar and J. Osborne (eds.), Beyond 2000: science education for the future, King's College, London, 1998.
7. A. Hunt and R. Millar (eds.), AS Science for Public Understanding, Heinemann, Oxford, 2000.
8. A. Hind, J. Leach and J. Ryder, Teaching about science, a research project funded by The Nuffield Foundation, 2001 (materials are available at: http://www.nuffieldfoundation.org/aboutscience/index.shtml).
9. W. De Vos and J. Reiding, International Journal of Science Education, 1999, 21, 711.
10. N.W. Brickhouse, Z.R. Dagher, W.J. Letts and H.L. Shipman, Journal of Research in Science Teaching, 2000, 37, 340.
11. J. Ryder, J. Leach and R. Driver, Journal of Research in Science Teaching, 1999, 36, 201.
12. J. Ryder and J. Leach, International Journal of Science Education, 1999, 21, 945.
13. E.g. J. Garratt, T. Overton and T. Threlfall, A question of chemistry. Creative problems for critical thinkers, Longman, 1999; J. Garratt, U.Chem.Ed., 1997, 1, 19; S.T. Belt and L.E. Phipps, U.Chem.Ed., 1998, 2, 16; S.B. Duckett, N.D. Lowe and P.C. Taylor, U.Chem.Ed., 1998, 2, 45; S.T. Belt, M.J. Clarke and L.E. Phipps, U.Chem.Ed., 1999, 3, 52.


Letters


Customising and Networking Multimedia Resources

From Antony J Rest
Department of Chemistry, University of Southampton, Southampton SO17 1BJ
[email protected]
http://www.soton.ac.uk/~ajrchem

The article by George McKelvy, 'Preparing for the chemistry laboratory: an Internet presentation and assessment tool',1 which describes networking multimedia images and resources at the Georgia Institute of Technology, prompts the question "Is such practice common in the UK and Europe?"

The short answer is 'no'. The UK is, however, at the forefront of such developments through the CTI Centre for Chemistry, the 'Teaching and Learning Technology Programme' (HEFCE TLTP), the 'Funds for the Development of Teaching' (HEFCE FDTL) projects and Government initiatives to provide computers for schools and colleges, as described in the SOCRATES Open and Distance Learning Report.2 However, even in the UK few universities have attempted schemes as ambitious as that described for Georgia. The prime reasons for this would seem to be lack of time and resources. This raises the question "How can such obstacles be overcome?"

To create the resources the choice is either making one's own or obtaining, customising and networking images and materials. The latter requires the consent of the copyright owner. For this reason Georgia Institute of Technology, which has very large freshman classes, could afford to make its own materials. This is an expensive option. For example it cost £500,000 to make the video images for the "Basic Laboratory Chemistry" laser video discs and VHS tapes, and £40,000 to customise them with interactive materials for the series of CD ROMs, "Practical Laboratory Chemistry" (http://www.emf-v.com). In favour of the other route is that most publishers and copyright holders of images, e.g. the "Chemistry Images" database (http://www.rsc.org/is/cvc/chem_img.htm), are amenable to requests to customise and network materials within an institution, provided that such an institution purchases a copy of the materials, does not 'export' them to other institutions, and that a modest licensing agreement is signed. This makes the customising route much cheaper and saves "re-inventing wheels", but the costs cannot be recouped by selling on the materials.

Customising existing materials is not as difficult as is often imagined, but ensuring quality is at the heart of producing worthwhile resources. For example, compression of video images is best achieved if high quality sources are used (Betacam), good quality software and hardware for capture and compression are used (MPEG), and the compression is not severe but tailored to deliver high quality images and sound. Such images, when captured and digitised, can be stored as files on the hard drive of a PC and incorporated into learning, teaching, and training packages, together with text and animations, using a number of design and management packages, e.g. Toolbook and Macromedia Director. Some examples of what is possible will be published in 2001. These may involve compiling a set of images for a specific course from a variety of sources onto a CD ROM; adding subtitles for students with learning difficulties; changing the level of content, e.g. the CD ROMs on 'Practical Chemistry for Schools and Colleges' (published in 2000) which, in turn, were derived from the 'Practical Laboratory Chemistry' CD ROMs; and adding extra content, as in the series 'Physical Chemistry Experiments'. Adding subtitles and adapting voice-overs can be applied to produce materials in other languages for scientific and language learning purposes, e.g. a French/English version of 'Practical Laboratory Chemistry' will be published in 2001.

To maximise availability, networking within an institution is possible. This is a process which depends on the compatibility of the networking software and the software of the multimedia resource to be networked. Such is the variety of combinations that designing a totally generic package is a daunting task. For this reason most CD ROMs are specified as 'For single user, stand-alone computers'. What is possible, however, is that the video component can be delivered as 'streaming video'3 to any point within the 'firewall' of an institution, using a dedicated server. Having obtained the video on a server, this can be mixed with learning, teaching and training materials to produce resources specific to the needs of a particular institution. The use of an internal server ensures that the material remains within the 'firewall' and the copyright conditions set by producers and publishers are met.

The way is open, therefore, for expansion of the use of multimedia materials for learning, teaching and training. This will become crucial in the future because students will increasingly be able to download the resources they need to their own PCs from central network servers.

References
1. G. McKelvy, U.Chem.Ed., 2000, 4, 46.
2. Multimedia Resources for Chemistry in Europe, http://mrc.chem.tue.nl
3. J. Smith, Viewfinder, 2000, (40), 12; see also http://webhelp.ucs.ed.ac.uk/services/media/bufvc.html. W. Garrison, Viewfinder, 2000, (41), 13. ["Viewfinder" is the magazine of the British Universities Film and Video Council (BUFVC) and is published four times each year. It is designated by its issue number. See also http://www.bufvc.ac.uk]

On the need to use the Gibbs' phase rule in the treatment of heterogeneous chemical equilibria

From Professor Paolo Mirone
Department of Chemistry, University of Modena and Reggio Emilia, Modena, Italy
E-mail: [email protected]

Misconceptions are well known to be serious barriers to effective learning. Unfortunately it is not just students who have misconceptions. In a recent paper1 Thomas and Schwenz investigated third and fourth year students' conceptions of equilibrium and fundamental thermodynamics by means of interviews concerning various aspects of a heterogeneous equilibrium. They considered a system at constant temperature and pressure (T: not specified; p = 1 atm). The initial system consisted of CaCO3 and CO2 only, and the students were asked to consider its evolution to the equilibrium system comprising CaCO3, CaO and CO2. Thomas and Schwenz assumed that the volume of the gas phase at equilibrium would be nearly double that of the initial state, as shown in Figure 1 of their paper.

The behaviour of the system is most easily understood from the Gibbs Phase Rule. The equilibrium state has only two independent components (C), in consequence of the equilibrium condition on the chemical potentials:

µCaCO3 = µCaO + µCO2

Application of the phase rule:

F = C − P + 2

with C = 2 and the number of phases, P = 3, shows that there is only one degree of freedom, F. Thus if the pressure is fixed experimentally (as Thomas and Schwenz do: p = 1 atm) the equilibrium temperature is fixed (by inference) at 898.6°C. From a purely thermodynamic point of view, this equilibrium does not differ from the equilibrium between a pure solid and its vapour.
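For readers who want the counting spelled out, the argument can be set down as a short derivation (this only restates the quantities defined above; nothing new is assumed):

```latex
% Three species are present, but the equilibrium condition
%   CaCO3(s) <-> CaO(s) + CO2(g)
% imposes one constraint on the chemical potentials,
% leaving C = 3 - 1 = 2 independent components:
\[
  \mu_{\mathrm{CaCO_3}} = \mu_{\mathrm{CaO}} + \mu_{\mathrm{CO_2}}
  \qquad\Longrightarrow\qquad C = 3 - 1 = 2
\]
% There are three phases: CaCO3(s), CaO(s) and CO2(g), so P = 3, and
\[
  F = C - P + 2 = 2 - 3 + 2 = 1
\]
% One degree of freedom: fixing p (= 1 atm) fixes T (= 898.6 C),
% exactly as fixing the pressure over a pure liquid fixes its
% boiling point.
```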

Since pressure and temperature are constant, it follows from the above that, provided the values of the physical variables p and T are compatible with equilibrium, then equilibrium is reached as soon as the very first crystals of calcium oxide have formed, namely before the amount of additional carbon dioxide is enough to show any appreciable volume increase. So the process is not "a real spontaneous change" as stated by Thomas and Schwenz.

It is interesting to consider what would happen if the temperature were above 898.6°C, but we have to define the experimental conditions carefully. Suppose the pressure is imposed on the CaCO3 and CO2 by a piston. The pressure imposed by the piston is 1 atm. At the higher temperature, the equilibrium pressure of CO2 is above 1 atm, so that CaCO3 decomposes to generate more CO2 in an attempt to increase its pressure to the new equilibrium value. However, the piston moves back to maintain a pressure of 1 atm, until all of the CaCO3 is converted into CaO and CO2. Under these conditions, i.e. with an imposed external pressure, CaCO3 has a fixed temperature of decomposition (898.6°C at 1 atm), analogous to the boiling point of a liquid.

Nearly half of the interviewees correctly asserted that 'if the temperature were high enough or the pressure low enough, all the CaCO3 would be consumed'. This statement was classified by Thomas and Schwenz as an 'alternative conception' falling into the category 'Using informal prior knowledge from everyday experience to explain the thermodynamics of chemical phenomena'. In other words, they judged that the students had given the wrong answer.

This example shows that, when dealing with heterogeneous equilibria, the first thing to be done is to apply the phase rule. It is simple, powerful and also aesthetically satisfying. It is a pity that it appears so rarely in papers on the teaching of thermodynamics, especially when it can be a key to understanding the system under consideration.

Reference
1. P.L. Thomas and R.W. Schwenz, Journal of Research in Science Teaching, 1998, 35, 1151.

Note: Professors Schwenz and Thomas were offered an opportunity to comment, but declined to take it.

Questionable questions

From John Garratt
Sennotts Farm
Scaynes Hill
West Sussex RH17 7NW

I enjoyed reading Byers' Communication on 'using questions to promote active learning in lectures',1 and am sorry he was disappointed in the results. I hope that others will be encouraged by his report to try something similar and that he perseveres with his own suggestions for improvements. I have two comments that may help to explain why the students participated less actively than he hoped. One relates to habit, and the other to the need for absolute clarity.

Byers introduced his questions in a sequence of twelve lectures attended by 46 students in the final year of their undergraduate course. Compare this with Hutchinson, who refers to a greater level of participation in classes of up to 300 students.2 Significantly, his class is held in the first year, and it runs at the rate of three lectures a week for 15 weeks, during which time it is the students' only exposure to chemistry. Thus Hutchinson's students quickly become accustomed to the idea that learning chemistry at university involves active participation. In contrast, Byers' final year students have already discovered that learning chemistry at university involves sitting passively in lectures and swotting up the notes just before the exam. They would be more likely to respond to questions in class if the practice of asking them were introduced in the first term, and systematically adopted by all members of staff.

My second point relates to the need for clarity. Students respond much better to questions when they are confident that they know what the question means and what sort of answer is appropriate. That is why, in our book,3 most of our questions involve giving reasons for selecting one answer from the several we provide. This strategy gives the students clear and useful signposts and seems to help them to get involved quickly. Byers might find this a useful model to use, at least while the students are getting accustomed to the idea of participating in lectures. Also, he might find that students respond better if they are given a moment to discuss the question and its answer with their neighbours. Whether or not he likes these suggestions, I recommend that, before he asks a question in class, he writes down a model answer in the number of words (perhaps 20 or less) he expects from the students in a lecture and which shows the depth of thinking it is reasonable to expect. If he can't do this, then it is unreasonable to expect the students to provide an answer.

Take the question "why is lead not an essential element?"; the answer "because it is toxic" is little more than a restatement of the question, and in that sense indicates shallow thinking (as Byers agrees). In fact the real problem lies with the question, since the answer cannot be known, nor can it be investigated by observation or by experiment, because it lies in evolutionary history. The best we might be able to do is to suggest chemical reasons why lead is incompatible with life as we know it; the question could easily be rephrased to make it clear that this is the desired response. A different style of answer would be something like "life may have evolved in an environment in which no lead was available, and so there was no opportunity to develop a use for it." But few students are likely to be in a position to adopt that train of thought, given that most of them are ignorant of the fundamental principles of life processes and of evolution. I confess to having fallen many times into the trap of asking questions that are ambiguous or require more thought than can be expected in the middle of a class. I also admit that, when I try to provide written model answers, it often takes me many attempts and usually forces me to rewrite the question. It is a salutary lesson and brings home the value of the exercise. Finally, I would like to stress the need for chemists, when asking questions about life processes, to do so from a position of secure knowledge of biochemistry and of evolution.

References
1. W. Byers, U.Chem.Ed., 2001, 5, 24.
2. J.S. Hutchinson, U.Chem.Ed., 2000, 4, 1.
3. J. Garratt, T. Overton and T. Threlfall, A question of chemistry. Creative problems for critical thinkers, Longman, 1999.