To cite this article: C. E. Beevers, B. S. G. Cherry, D. E. R. Clark, M. G. Foster, G. R. McGuire & J. H. Renshaw (1989) Software tools for computer-aided learning in mathematics, International Journal of Mathematical Education in Science and Technology, 20:4, 561-569. DOI: 10.1080/0020739890200410 (http://dx.doi.org/10.1080/0020739890200410). Publisher: Taylor & Francis; journal page: http://www.tandfonline.com/loi/tmes20

Software tools for computer‐aided learning in mathematics



INT. J. MATH. EDUC. SCI. TECHNOL., 1989, VOL. 20, NO. 4, 561-569

Software tools for computer-aided learning in mathematics

by The CALM Project team:
C. E. BEEVERS, B. S. G. CHERRY, D. E. R. CLARK, M. G. FOSTER,
G. R. McGUIRE and J. H. RENSHAW

Mathematics Department, Heriot-Watt University,
Edinburgh, Scotland, U.K.

(Received 1 April 1987)

This article aims to describe a number of software tools designed for the preparation of computer-aided learning (CAL) materials in mathematics. The CALM Project, based in the Mathematics Department of the Heriot-Watt University in Edinburgh, is one of the projects currently receiving funding from the Computer Board of the United Kingdom as part of the Computers in Teaching Initiative within British universities. This paper will explain the features of software design favoured by the CALM Project team, and it will deal not only with CAL in mathematics but also with CAL in general. The CALM Project provides a practical example of software development in the programming language Pascal within an educational environment.

1. Introduction

The way the computer helps the learning process for humans is not well understood yet. Indeed, some would argue that it does not enhance education at all. In the Mathematics Department of the Heriot-Watt University there is an experiment under way which is considering the role of the computer in the education of large groups of first year engineering undergraduates in a typical Scottish university. This article seeks to describe the project known as the CALM Project, illustrating the description by giving examples of the software so far developed.

After a brief summary of the background to the CALM Project, this article will concentrate on aspects of software design used by the CALM Project team. In particular, section 3 sets out details of screen display, text handling facilities and the use of colour in general. In section 4 we discuss some specific mathematical aspects of the task, with emphasis on routines for comparing formulae and the design of good CAL for mathematics. As well as formula recognition, we have experimented with multiple choice responses and we have employed coloured text strategically to highlight significant steps in a mathematical argument.

Section 5 deals with the software tools needed to make up and mark mathematical tests. The final section sets out some of the lessons learnt after one term's use of the CALM software and it looks ahead to the work in hand.

2. Background to the CALM Project

The Heriot-Watt University is a technological university and its Mathematics Department has a major service teaching commitment. There are large classes of engineering and science undergraduates to be taught. For example, in a population of some 2000 students there are between 300 and 400 first year students taking 6 hours of

0020-739X/89 $3.00 © 1989 Taylor & Francis Ltd.

mathematics per week. The students spend half of this time on calculus at a level above Scottish Higher Grade, at the level of the Certificate of Sixth Year Study (CSYS) Mathematics Paper II in Scotland; and in terms of the English qualifications our first year course runs from roughly mid-way through a typical A-level syllabus.

This large service commitment coupled with limited resources has given birth to a cumbersome and largely ineffective tutorial system for a significant minority of the students. In the past, traditional pen and paper tutorials have operated with the tutoring role taken over mainly by postgraduate students. The better students do not suffer, but inevitably the weaker and less motivated students fail to understand mathematics at an early stage and they struggle with the subject for the rest of their university careers—however long that may be! Large lecture classes compound the problem and some students become despondent and opt out of the system altogether.

Following the publication of the Nelson Report [1] the Computers in Teaching Initiative began. And in 1985, in the early days of the Teaching Initiative, we put forward the idea of a computerized tutorial system to enhance the teaching of first year calculus for engineering students. We aimed to produce material to back up the conventional lecture, to provide instruction for the students and to test their understanding of the material. These categories, then, form the three components of each topic we cover. The basic structure of our software has a theory section made up of a series of screens to consolidate the lecture; a worked example section which instructs the student in ways of doing mathematics; and then, to emphasize that mathematics is an active subject, each week's work (designated a unit) has a test section. This mix of ingredients has proved to be a successful combination, with the final item of major importance. At the beginning we had not realized just how powerful the assessment element can be for students, and they appear to 'enjoy' the weekly challenge to test their knowledge and discover for themselves their own strengths and weaknesses. On questioning the students at the end of one term's use of the CALM software it is clear that at the start many of them were apprehensive about working on a computer, but after nine weeks' experience such worries have largely disappeared.

Most of the topics in the differentiation part of the calculus syllabus have now been covered following this simple formula: theory, worked example and test. Naturally, over the nine week term more than nine topics have been studied and a week's unit of programs may in some weeks comprise three or four topics. In such weeks, although each topic has had a theory and worked example section, our preference has been to set one test only. In such tests a mix of questions from all topics studied that week is ensured by the test section routines (see section 5 for further details). We have written our units with flexibility in mind. One reason for this was so that another teacher may use our material but in a different order. Throughout our preparation of mathematical CAL materials we have been keenly aware of the advice provided by Bajpai et al. [2] for the construction of good mathematical software.

This article now seeks to describe the CALM Project for Computer Aided Learning in Mathematics, with emphasis in the next two sections on the CAL design tools that we have found particularly useful over the last eighteen months.

3. Some CAL design tools

Having assembled a team of six academic and programming staff we made our first major programming decision. We chose to work with Pascal as the language for

our software production. The reason was a natural one based on the availability of Pascal across a wide range of modern micros. So, in theory, our programs can be transported across the micros used in education. In addition, Pascal has an ordered structure that is particularly appealing to mathematicians.

In the course of our software development there have been a number of stages: we started writing self-contained programs but soon realized that there were a number of common elements that could be put into a library of useful routines. Originally this library of procedures was pulled into our programs at compilation time, but now we have an object code version of the library that is linked into the program after compilation. Thus, we obtain executable files ready to run on the network of 32 RM Nimbus microcomputers.

Another of our project objectives, secondary but nonetheless important, is to leave a legacy of software tools for other teachers who want to prepare their own CAL material. This echoes the sentiment in a recent article by Harding [3]. In this context we have identified and written a number of software tools like menu screen design, split screen creation, text handling routines and screen editing facilities. This section will describe some of these features.

The facility of easy access to all parts of a week's unit was one of our first priorities. The unit in week 4, for example, covered three topics: implicit differentiation, second derivatives and the derivatives of inverse functions. It was essential that students be able to jump around these topics easily so that they could go to those parts of the course they found difficult. This direction to specific parts of the software was achieved by use of a printed 'worksheet' given out a week in advance of the computer tutorial session. The worksheet contained examples for the students to try: if they had problems with a particular type of question then the worksheet gave direction to the student on how to find that part of the software in that week's unit. This worked well, with some students opting for the theory of inverse functions, others for the worked examples on implicit differentiation and the more confident going straight for the test. Throughout we have designed our material in small pieces so that the material does not become set in concrete and to enable subsequent changes to be made relatively easily. It may be too that another teacher would come along and prefer to teach topics in a different order, and this we have borne in mind. The access, then, to all parts of the software has been by simple but colourful menu screens with the students in command. To change the order of the topics in the future will require just a few amendments to these menu programs.

The ability to place text on the screen at the right position, in different colours and with variable height and width settings has been most helpful in designing teaching materials. We have a flexible procedure which works via the coordinate system on the screen relative to an origin at the bottom left-hand corner. The Nimbus screen has a horizontal range of 0 to 639 and a vertical range of 0 to 249. A normal-sized character takes up an eight by ten pixel area on the screen. The procedure allows text to be displayed at positions between the character rows or columns of the textual mode, at variable height and width and in a given colour.
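The text-placement procedure itself is not listed in this article. Purely as an illustration of the coordinate scheme just described (a 640 by 250 pixel screen, origin at the bottom left, 8 by 10 pixel character cells), a sketch in Python (CALM itself was written in Pascal) might look like the following; the function name and the returned record format are our own assumptions:

```python
SCREEN_W, SCREEN_H = 640, 250   # Nimbus pixel ranges: 0-639 and 0-249
CHAR_W, CHAR_H = 8, 10          # a normal-sized character cell

def place_text(x, y, text, width=1, height=1, colour="white"):
    """Describe a piece of text anchored at pixel (x, y), measured from
    the bottom-left corner, with integer width/height multipliers and a
    colour choice, rejecting positions that fall off the screen."""
    if not (0 <= x < SCREEN_W and 0 <= y < SCREEN_H):
        raise ValueError("position off screen")
    w = len(text) * CHAR_W * width          # pixel extent of the string
    if x + w > SCREEN_W:
        raise ValueError("text runs off the right edge")
    return {"x": x, "y": y, "text": text,
            "w": w, "h": CHAR_H * height, "colour": colour}
```

A title could then be placed at, say, double height near the top of the screen without reference to the character grid at all, which is the flexibility the procedure provides.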

For display purposes we have found it useful to have a number of procedures for moving students on through a program. Routines that leave a specific message at the bottom of the screen have been written to cater for normal progression through the software, and other routines allow the teacher the chance to point out a specific point before a screen of information is removed. Such routines have been employed extensively throughout the worked examples section and always the program control

remains with the user. For example, a procedure called 'continue' is useful for taking a user stage-by-stage through a developing mathematical argument.

Split screens have been designed to allow important information to remain on part of the screen when other details are removed. Thus, vital equations can be displayed on one part of the screen while a developing argument continues alongside it.

4. CAL in mathematics

One of the main problems that we have encountered in developing the mathematical CAL software has been certain aspects of the computer/student interaction. In the system that we have devised it is a major feature that the students are required to input the results of mathematical problems. This requires the student to put into the computer complicated mathematical formulae involving a range of symbols and expressions, usually in computer format. A fundamental drawback of any computer system is that it expects these inputs in a very precise fashion and, although the computer is a very patient teacher, at the present stage of development it is also a very stubborn one! The solutions to many problems can be expressed in a variety of ways, and it is an obvious drawback, and an additional complication, for students who may be having difficulties solving the problems if they are then required to input their answers in a specific way, or if they are informed that their answer is wrong only when it is arranged differently from the computer expression.

Therefore, we set out to construct an evaluation procedure that would allow users to input their answers in any order (but still within the rules of computing, such as brackets around functions). This procedure initially checks the input string for simple errors, such as mismatched brackets, and for obvious sources of error, such as very long input strings, at which point the user is asked to correct the errors or simplify the answer. The procedure then evaluates the string using an array of previously stored variables and a wide range of mathematical functions, allowing for different ways of inputting parts like multiplication of factors and powers.
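The initial checking pass can be sketched very simply. The following Python fragment (a sketch of the idea, not the project's Pascal; the length limit and message wording are assumptions) rejects over-long answers and mismatched brackets before any evaluation is attempted:

```python
def check_input(expr, max_len=60):
    """First-pass check of a student's answer string: reject answers
    that are too long or have unbalanced brackets, returning a message
    describing the fault, or 'ok' if the string may be evaluated."""
    if len(expr) > max_len:
        return "answer too long - please simplify"
    depth = 0                       # current bracket nesting depth
    for ch in expr:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:           # a ')' with no matching '('
                return "unmatched closing bracket"
    if depth != 0:                  # some '(' was never closed
        return "unmatched opening bracket"
    return "ok"
```

Only strings that pass such checks go on to the more expensive evaluation stage, which keeps the error messages specific and immediate.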

Obviously, this process cannot account for all formats and the users are still required to think carefully about the way in which they record their answers, and if they do this is a bonus! This procedure has removed a major difficulty of CAL in mathematics. It is a routine that has now been thoroughly tested by the rigours of the fertile imaginations of sixty first-year Mechanical Engineers, who have produced some original solutions to the problem of 'What is the derivative of sin(x)?'!

Until very recently the use of coloured text in mathematical textbooks has been negligible and somewhat trivial, due mainly to prohibitive costs. Indeed, even the best American textbooks have done little in this direction beyond highlighting a few significant formulae. The recent advent of coloured graphics, on the other hand, has added a whole new and exciting dimension for the imaginative presentation of mathematical material on a VDU. A flexibility of expression is now available to the teacher which is either very difficult or impossible to achieve in printed book form. This manifests itself in three main ways.

Firstly, not only can whole lines of text or formulae be highlighted in significant colours or modes, but individual variables and/or symbols within a formula can be highlighted very simply. Thereby the overall logic of a mathematical argument can be indicated merely by the imaginative selection of appropriate colours. More specifically, the progress (or location) of a particularly significant symbol (e.g. sign, variable, function or constant) can be labelled or printed on the

screen in an outstanding colour—whence its exact location in, say, a whole row of equations can be followed with ease.

Secondly, and even more exciting, particularly important symbols, equations or instructions may be highlighted dynamically—by means of the intermittent flashing of the symbol or instruction in colour, or even in a sequence of different colours. This 'dynamic' technique represents a radical departure from what is possible in a printed textbook. For example, at the lowest level of sophistication, simply flashing the word 'WORKSHEET' on the screen in a sequence of three colours adds not only an appropriate degree of urgency to the instruction but also a certain element of 'dynamicism' to the system as a whole. At a more advanced level, a particularly significant change of sign (as when like terms are grouped together on one side of an equation), or the removal of a common factor from a collection of terms, can be signalled 'dynamically' by, for example, a flashing '+' or '−' sign or by flashing the common factor in some appropriate colour. Finally, at a yet more subtle level, by consistently colouring one particular variable with the same colour, its progress during the course of a relatively complex piece of mathematical reasoning can be followed with comparative ease.

Thirdly, at any point in the course of the presentation of a piece of mathematics, pertinent comments can be interjected temporarily on the screen. Each comment may be contained within visually interesting or amusingly shaped windows, with moving (so-called 'rubber-band') arrows leading the eye to the precise point of interest, say in an equation.

All three of these techniques have been, and are being, explored in a number of interesting ways within the CALM system. There are other ways too that employ the dynamic features of microcomputers. We have used graphics to stimulate and broaden the student's experience. Graphs of functions can be quickly sketched on the screen to give visual emphasis to what is meant by, for example, a turning point. Moving tangents have been employed to indicate stationary points and points of inflexion. Mathematical 'games' have been constructed with student interaction an important feature. This form of learning appears throughout the CAL literature and was included in the Nelson Report as an example of how the computer may be used in teaching. Students seem to learn when they are enjoying the material, so much of what we do is intended to stimulate and motivate this learning process. Sprites have been constructed to aid this process.

5. The making and marking of the test

The test section software is written to allow the students the chance to test their knowledge of a particular topic or set of topics on offer in each unit. The student can take one or more tests in the privacy of the CALM laboratory. The tests are done after the student has attended lectures on the topics, worked through some examples at home and possibly after a revision of the theory and worked examples available on the computer. The test section is also designed to provide a record of the student's progress throughout the course by filing the answers to the tests, marking the answers and giving a mark for each test. The files containing the answers and marks allow both the teacher and student user to judge how the course is going and thereby provide a continuous assessment of student progress.

At the moment two levels of test (there could be more) are constructed in each week's unit: they are referred to as the easy test and the hard test (think of two examination papers at different standards but covering the same material). Each test

has been set up by constructing the questions (just as an examination paper is set up) and then by providing some instructions for the test (like the rubric on an examination paper).

The test questions are selected from libraries of files held on disk and catalogued using the database Superfile. Each file contains one question consisting of three parts: the statement of the question, some answer preambles to prompt a student's response, and the correct answers with associated information to allow the evaluation routine to operate. The statement of the question may include special mathematical symbols, graphs and other illustrations if necessary. A question may have many parts and hence require a number of answers. The answer preambles are an important feature as they are used to guide the student to input the answer to a particular part. The answers may be constants, formulae (involving one or more variables) or words. The staging of questions is a vital ingredient of the learning process and is of prime significance in the easy test. The information associated with the correct answers gives the variables involved, if any, and the quantities required in the checking and marking of the student's answers.

The set of test questions (which at present may contain up to 20 basic questions), chosen from one or more topics, is assembled in a corresponding set of procedures within one file. Details of these questions are supplied to the test section program by defining the set of hard questions, the set of easy questions and the sets of questions for each of the different topics. This flexible approach allows for future development and for another teacher to present the material in a different order.

In setting up the rules for the tests, the total number of questions to be done is defined along with the number of questions to be chosen from each of the topics being tested. The number of 'attempts' allowed for any answer in a particular test question can also be defined. This is used in conjunction with the checking and marking procedures. We have found that 3 'attempts' during an easy test and 1 'attempt' for a hard test work quite well in practice. To simulate the process of a student crossing out an attempted answer in a written examination paper and starting the question again, we allow the student to restart a question. The number of restarts permitted for any question in a particular test can also be defined. It was found that 2 restarts per question in the easy test were reasonable and that the number of restarts per question in the hard test should be greater than three.
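The rules above amount to a small table of parameters per test. A minimal sketch in Python (not the CALM Pascal; the record names and the question count are assumptions, while the attempts and restarts values are those quoted in the text, taking "greater than three" restarts for the hard test as four):

```python
# Assumed rule records for the two test levels described above.
EASY_RULES = {"questions": 5, "attempts": 3, "restarts": 2}
HARD_RULES = {"questions": 5, "attempts": 1, "restarts": 4}

def attempts_left(rules, attempts_used):
    """How many further tries the student has on the current answer,
    never going below zero."""
    return max(rules["attempts"] - attempts_used, 0)

def restarts_left(rules, restarts_used):
    """How many further restarts of the question are permitted."""
    return max(rules["restarts"] - restarts_used, 0)
```

Keeping the rules in data rather than in code is one way such a program could let a teacher change the rubric without recompiling the test section.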

The test program first picks up the student's name, either from a previous program or by asking for it directly. The student is then invited to choose a test: hard or easy. Some instructions about the test are then displayed. These include the number of questions to be attempted, the number of 'attempts' allowed per answer, and the number of restarts permitted per question. A restart is activated by typing 're', while typing 'qu' at any stage brings the test to an end. On pressing the escape key the student gains access to other facilities such as a calculator, a graphics screen, details of trigonometric identities or information regarding performance in the previous tests. Details of how to obtain these facilities are shown on part of the screen. The list of facilities will be increased shortly. When the student has gained the information sought, the escape key again guides the student back to the point in the program at which the exit was originally made.

The questions are randomly selected, in agreement with the rubric, and given to the student one at a time. This random selection process means that normally (depending on the number of questions in the test) each student gets a different set of

questions to do, and also any one student can be given many different tests in any test section. Even with only twenty questions this can be achieved by building in a random feature within some questions. For example, the derivative of sin(nx) exp(ax) for different values of n and randomly generated a provides the same type of question but is different in detail.
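The randomized-question idea can be sketched directly from that example. The following Python fragment (illustrative only; the parameter ranges and string formats are our assumptions, not the CALM question files) generates one instance of the template together with its correct answer, obtained by the product rule, d/dx[sin(nx) exp(ax)] = (n cos(nx) + a sin(nx)) exp(ax):

```python
import random

def make_question(rng):
    """Generate one instance of 'differentiate sin(nx) exp(ax)' with
    randomly chosen integer n and a, returning the question statement
    and the correct answer as an input-format string."""
    n = rng.randint(2, 5)
    a = rng.randint(1, 4)
    statement = f"Differentiate sin({n}x) exp({a}x) with respect to x"
    # Product rule: n*cos(nx)*exp(ax) + a*sin(nx)*exp(ax)
    answer = f"({n}*cos({n}*x) + {a}*sin({n}*x))*exp({a}*x)"
    return statement, answer
```

Seeding the generator per student would reproduce a given student's paper when marking, while different seeds give each student a different test from the same twenty templates.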

In each question the statement of the question is set out. The student is then prompted, using a preamble, for the answer to each part of the question in turn. After inputting an answer it is checked and/or marked before passing on to the answer for the next part.

After the students have input answers to all the parts of the question, the true answers are displayed so that they can be compared with their own. Finally, the set of answers is marked against the correct ones and the score for the question is shown. At any stage before the final answer is put in, the student may restart the question by typing 're', which results in a complete restart of the question. Also, at any stage before the input of the answer to the final part, the student may abandon the test by typing 'qu'. These processes are repeated for each question.

Once the student has finished all the questions, the percentage mark for the test is displayed. An 'averaged' mark is given and displayed for an abandoned test. The students' answers and the correct answers to the questions are then filed away on the network. The mark recorded in the test is also stored. These mark and answer files are used by both student and lecturer to monitor the student's progress through the course.

In each question, each student answer is checked and/or marked as soon as it is input. The answer is first checked for various possible common faults. If the checking procedure registers one of these errors the student is asked to amend his or her answer accordingly. An answer which is too long, an answer with unmatched brackets or an answer which does not make 'sense' will all lead to an error message. An appropriate message is displayed indicating the fault which is to be put right.

Once an answer passes the checking process it is marked, and in the easy test either a cross for wrong or a tick for right is displayed. The student's answer is compared with the correct answer using a function called 'compare', which in turn employs the evaluation procedure mentioned above. The value of the input string is calculated. The procedure of evaluation is used to find the value of any string of integers, real numbers or mathematical functions. The information associated with the correct answers when the question is set up allows details about the answers to be passed to the evaluation and compare routines. This information tells whether the answers are scalars or functions of one or, perhaps, more variables and what these variables are; details of the points over which the correct answer should be compared with a student answer; and other information regarding the expected format of the student's answer, notably its length. This is an important facet since we are trying to teach the students to simplify their answers as much as possible.
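The essence of 'compare' as described, evaluating both expressions at a set of stored points and accepting the answer if the values agree everywhere, can be sketched in a few lines of Python. Here Python's `eval` over a restricted namespace stands in for the project's Pascal evaluation procedure, and the sample points and tolerance are our assumptions:

```python
import math

def compare(student, correct, var="x", points=(0.3, 0.7, 1.1, 1.9), tol=1e-6):
    """Return True if the two expression strings agree, to within tol,
    at every stored sample point of the variable var."""
    # Only the mathematical functions a question needs are exposed.
    env = {"sin": math.sin, "cos": math.cos, "exp": math.exp,
           "__builtins__": {}}
    for p in points:
        env[var] = p
        if abs(eval(student, env) - eval(correct, env)) > tol:
            return False
    return True
```

This is why an answer arranged differently from the stored one, such as `2*sin(x)` against `sin(x)*2`, is still accepted: only the values at the sample points matter, while the separate length check discourages unsimplified answers.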

Clearly the marking and the filing away of the students' responses to the test questions would be useless if we could not examine these marks at a later date. To this end we have developed a program, called MARK, which consists of a number of procedures to allow us to 'see' the students' attempts and marks and so determine how well they are progressing.

On entering the MARK program one is prompted for the name of a 'class list' file, containing a list of the names of the students in the class. MARK then provides a list of the units on view and invites the user to choose one of these units. A menu giving

the following options is then displayed:

(1) Low Marks
(2) Absentees
(3) Entire Class
(4) Recent Tests
(5) Printer
(6) Change Unit
(7) Individual Student
(8) Quit.

Details of those students with an average mark less than any given value are shown on the screen on request. The information from each unit consists of the number of tests done in that unit, the mean score, the standard deviation, the maximum and the minimum marks obtained in one test, and a 'grade'—a letter A, B, C, D or E—giving us a guide to how 'good' the student's performance has been. The grade is calculated using both the mean and the standard deviation. Each mark is stored on file using the formula

where u is the 'unit' number, m and dare the month and day the test was taken, e is 0 ifan easy test was chosen and 1 if a hard test, and v is the mark. The number v is offsetby 101 if the students quit the test. Being able to look at the low marks gives us aninstant check on the weaker students in the large classes we teach.
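The exact storage formula is not reproduced here, so the packing scheme below is purely illustrative. It records only the fields named in the text (unit u, month m, day d, easy/hard flag e, mark v) and applies the stated offset of 101 to v when the student quits the test.

```python
# Hypothetical packing of one test record into a single integer.
# The field layout is an assumption; only the field names (u, m, d, e, v)
# and the quit offset of 101 come from the text.

def encode_mark(u, m, d, e, v, quit_test=False):
    """Pack unit, month, day, easy/hard flag and mark into one integer."""
    if quit_test:
        v += 101                     # offset stated in the text
    return ((u * 100 + m) * 100 + d) * 10000 + e * 1000 + v

def decode_mark(code):
    """Recover (u, m, d, e, v, quit_test) from a packed record."""
    v = code % 1000                  # marks run 0-100, so v > 100 means quit
    e = (code // 1000) % 10
    rest = code // 10000
    d = rest % 100
    m = (rest // 100) % 100
    u = rest // 10000
    quit_test = v > 100
    if quit_test:
        v -= 101
    return u, m, d, e, v, quit_test

code = encode_mark(3, 10, 27, 1, 55)
print(decode_mark(code))             # (3, 10, 27, 1, 55, False)
```

Because a mark lies between 0 and 100, adding 101 keeps quit records unambiguous: any stored v above 100 can only have come from a quit.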

The second option provides the names of those students who have not done any tests for the unit; a consistent absentee alerts us to the lazy students or, more seriously, to those with an illness. Details similar to those in Low Marks, but for the entire class, are shown under option 3.
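The per-unit summary described above (shown for the whole class under option 3) can be sketched as follows. The paper does not give the grading rule, only that it uses the mean and the standard deviation, so the band boundaries and the spread adjustment here are illustrative guesses.

```python
# Sketch of the per-unit summary: test count, mean, standard deviation,
# best and worst marks, and a letter grade derived from mean and spread.
import statistics

def unit_summary(marks):
    mean = statistics.mean(marks)
    sd = statistics.pstdev(marks)
    # Hypothetical rule: grade on the mean, nudged down one band when the
    # marks are very erratic (large standard deviation).
    grade = "E"
    for cutoff, letter in [(70, "A"), (60, "B"), (50, "C"), (40, "D")]:
        if mean >= cutoff:
            grade = letter
            break
    if sd > 20 and grade != "E":
        grade = "ABCDE"["ABCDE".index(grade) + 1]
    return {"tests": len(marks), "mean": mean, "sd": sd,
            "max": max(marks), "min": min(marks), "grade": grade}

print(unit_summary([65, 72, 58, 80]))
```

Folding the standard deviation into the grade, as the text describes, distinguishes a steady performer from one whose mean hides wildly uneven test results.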

A list of those students who have sat the test in the last few days is often quite useful. It allows us to see which students sit the tests at times other than the formal tutorial hour. Students have been encouraged to return to the open learning environment of the micro-laboratory at times in addition to the formal tutorial hour, and many have taken this opportunity.

Option 5 simply turns the printer on or off to allow a hard copy of the information to be readily available. This means we can give the students' parent departments more detailed information on the progress of their students.

The change unit option allows us to examine a different unit from within the same program.

On taking the individual student option we are prompted for the name of a student, and if this student is recognized by the 'class list' file then we are presented with another menu giving the following choices:

(1) Tests Already Done
(2) Tests Not Yet Done
(3) View One Test
(4) Input A Mark
(5) Printout File
(6) Return to Main Menu.

Options 1 and 2 respectively provide, for the individual student, details on each unit like those in the Low Marks and Absentee options of the main menu. More detailed information on the mark, the date of the test, whether an easy or hard test was taken and whether the student quit the test, for any chosen unit, is given by choice 3. Then a summary of this information appears, as in the Low Marks option.

The fourth choice allows us to allocate a mark for any given unit; for example, we could include the formal class examinations as a separate unit and store their marks for comparison with the other tests. It has also proved a far-sighted option during a period of evolution in which inaccuracies have crept into the marking system. Such 'teething' problems are disappearing as the project progresses.

The actual response to each question by the student, rather than the mark he or she achieves, is stored in a separate file. Option 5 allows us to view these responses for any unit, and in this way common errors are quickly spotted. More options, particularly on the graphical side, such as bar charts, are in the planning stage.
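Spotting common errors from the stored responses amounts to tallying identical wrong answers. The sketch below assumes a simple record layout of question, answer and correctness; the actual file format is not described in the text.

```python
# Tally identical wrong answers to a question and report the most frequent.
# The (question, answer, correct?) record layout is an assumption.
from collections import Counter

responses = [
    ("d/dx sin(x^2)", "2*x*cos(x^2)", True),
    ("d/dx sin(x^2)", "cos(x^2)", False),      # forgot the chain rule
    ("d/dx sin(x^2)", "cos(x^2)", False),
    ("d/dx sin(x^2)", "2*x*sin(x^2)", False),
]

wrong = Counter(ans for q, ans, ok in responses if not ok)
for answer, count in wrong.most_common():
    print(f"{count} student(s) answered {answer!r}")
```

A cluster of identical wrong answers, as with the missing chain-rule factor here, points directly at a misconception worth addressing in the next tutorial.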

6. Some lessons learnt

The CALM software on differentiation has been tested thoroughly by a group of 60 first-year Mechanical Engineers in the Autumn term of the academic year 1986/7. The educational evaluation of the teaching packages has been presented elsewhere and the interested reader is directed to reference [4] for further details. However, a number of important lessons have been learnt which will enable us to produce more effective software in the future. For example, we intend to experiment further with the worked example section and provide 'hint' screens to nudge students in the right direction through a problem. In this way we hope to build up their confidence to tackle more difficult questions. There is much more to do in the area of mathematical 'games' in which students learn, in a fun way, some aspect of mathematical principle. The computer as a teaching aid is here to stay, and there has been sufficient genuine student interest in our experiment to justify further investigation.

Acknowledgment

The authors are grateful to the Computer Board of the U.K. for their financial support of the CALM Project.

References
[1] THE NELSON REPORT, 1983.
[2] BAJPAI, A. C., FAIRLEY, J. D., HARRISON, M. C., MUSTOE, L. R., WALKER, D., and WHITFIELD, A. H., 1985, Int. J. Math. Educ. Sci. Technol., 16, 407.
[3] HARDING, R. D., 1986, Bull. Inst. Math. Applics., 22, 76.
[4] BEEVERS, C. E., CHERRY, B., CLARK, D. E. R., FOSTER, M., MCGUIRE, G., and RENSHAW, J., 1988, Computers in Education, 12, 43.
