
Improving the Student Learning Experience for SQL using Automatic Marking

Dr. Gordon Russell, Andrew Cumming.

Napier University, Edinburgh, Scotland.

CELDA 2004, Lisbon, Portugal.


Introduction

• SQL tutorials can be highly repetitive for the tutor.
• Tutorials are 50 students to 1 lecturer (plus 2 helpers).
• Tutorials run best when the tutor:
  – gets asked interesting questions
  – gets around everyone in a single tutorial period
  – has time to chat with the students about life
• Tutorials run best for the student when they:
  – can progress easily without it seeming difficult
  – get to speak to the tutor whenever they want to
  – have immediate feedback
  – can work from home
  – can avoid tutorials when they want to


Feedback

• Part of the problem in improving tutorials is understanding "feedback".
• Feedback can mean almost anything to a student. In this talk the feedback of interest is the answers given to some key frequently asked questions, where the answer must be individualised on a per-student basis:
  – Is my SQL right?
  – Am I heading in the right direction?
  – When do I get the coursework assessment?
  – What mark did I get in the assessment, and why?
  – Do I have to keep working on this stuff?
  – I have forgotten my username/password/notes/brain, so how are you going to help me?


Assessment

• Originally the students submitted SQL in answer to a number of assessment questions.
• These were submitted after 9 weeks of study.
• At my peak I could mark 6 of these per hour, and each one had a feedback sheet attached.
• Back then I had 280 students… Including sanity breaks, marking took about 2 weeks.
• Maintaining a consistent marking scheme for so many students was difficult.
• We also franchise this material and moderate other people marking similar work; this is another significant reason for needing a new method.


Main Targets

• Perform automatic marking of student SQL assessments.
• Provide feedback on SQL written as part of tutorials.
• Support incremental assessments scheduled under student control.
• Support some of the issues of distance learning of SQL.
• Manage students online.
• Provide multi-campus support for the franchised module.
• Gather statistics to understand student behaviour.
• Modify student behaviour to improve module performance.


Marking

• Marking the SQL from students can easily become subjective.

• As a starting point, the original marking scheme was edited to remove all things which could not be formally stated as a “marking rule”.

• There are many different ways of writing an SQL statement to answer a specific question, and many of those statements are equally valid.

• The new scheme was divided into two categories:
  – accuracy of the result of executing the specified SQL
  – simplistic quality measures of the SQL


Accuracy

• The accuracy measure is, put simply, how similar the output of executing the student's SQL is to the output of the sample solution I wrote.
• The basic algorithm takes the table produced by the student's SQL statement and compares each cell of that table against the sample solution:
  – if the cell is in both tables, score +1
  – if the cell is not in the sample solution, score −1
  – divide the final score by the number of cells in the bigger of the sample-solution table and the student's table.
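The scoring rule above can be sketched in a few lines. This is a minimal, hedged reading of the slide's rule only; the real marker also handles row and column ordering (covered later), and the function name is illustrative.

```python
from collections import Counter

def accuracy(student_rows, sample_rows):
    """Score +1 for each student cell also in the sample solution, -1 for
    each cell that is not, then divide by the bigger table's cell count."""
    student_cells = Counter(c for row in student_rows for c in row)
    sample_cells = Counter(c for row in sample_rows for c in row)
    score = 0
    for cell, n in student_cells.items():
        matched = min(n, sample_cells.get(cell, 0))
        score += matched        # cell present in both tables: +1 each
        score -= n - matched    # cell absent from the sample solution: -1 each
    biggest = max(sum(student_cells.values()), sum(sample_cells.values()), 1)
    return score / biggest
```

Applied to the worked example on the next slide, the 4 matching cells divided by the 9 cells of the larger table give 4/9.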


Example: Accuracy 4/9 (≈44%)

Student's Attempt:               Correct Answer:

SELECT *                         SELECT ID, Lastname
FROM people                      FROM people
                                 WHERE ID IN (1,7)

ID  Firstname  Lastname          ID  Lastname
 1  Gordon     Russell            1  Russell
 4  Andrew     Cumming            7  Kennedy
 7  Jessie     Kennedy

4 of the 9 cells in the larger table match, giving 4/9.


Tutorial Index


Algorithm Complexity

• It was decided to avoid penalising students who choose a different column order from the sample solution.
• In general, the order of the rows does not affect the quality of the answer.
• Producing a comparison algorithm that is row- and column-order insensitive is an expensive operation if not done carefully.
• The algorithm used employs tree pruning and other optimisations to do this efficiently.
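A naive order-insensitive comparison can be sketched by canonicalising both tables before comparing. This is an assumption-laden simplification: the real system's tree-pruning algorithm is not reproduced here, and this sketch loses the association between cells on the same row, which is acceptable only as an illustration.

```python
def canonical(rows):
    """Canonicalise a table so neither row order nor column order matters:
    transpose to columns, sort the values within each column, then sort
    the columns themselves."""
    if not rows:
        return ()
    cols = zip(*rows)  # transpose: rows -> columns
    return tuple(sorted(tuple(sorted(map(str, col))) for col in cols))

def same_result(table_a, table_b):
    # Two result tables compare equal if their canonical forms match.
    return canonical(table_a) == canonical(table_b)
```

For example, swapping rows or swapping columns in a result leaves the canonical form unchanged.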


Faking the output

• If the question asked "how many employees are 17 years old", a student could count them (say 5) and write:

SELECT 5 FROM DUAL;

• This gives the right answer, and thus perfect accuracy, but it is cheating!
• For this reason the system performs a "hidden database check".
• If the measured accuracy is 100%, the query is executed again on a different dataset using the same database schema. This dataset is specially constructed so that all the queries produce different answers on it.
• A cheat like this one produces the same output on the different dataset, and so fails the check.
• We take 30 points off the accuracy for this.
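The check can be sketched as follows. The `run_query` callable is an illustrative stand-in for executing SQL against a named dataset; the real system's interface is not described on the slide.

```python
def passes_hidden_check(run_query, student_sql, sample_sql):
    """Re-run an apparently perfect answer on a second, hidden dataset.
    A hard-coded answer gives identical output on both datasets and so
    fails the check."""
    if run_query(student_sql, "visible") != run_query(sample_sql, "visible"):
        return True   # accuracy is below 100%, so the check is not applied
    # Perfect on the visible data: must also match on the hidden data.
    return run_query(student_sql, "hidden") == run_query(sample_sql, "hidden")
```

A `SELECT 5 FROM DUAL;` style answer matches on the visible dataset but not on the hidden one, so it fails here and loses the 30 points.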


Quality measures

• In addition to accuracy, points are lost for poor quality.
• The algorithm for measuring this is simplistic, as SQL quality is hard to calculate.
• Some things related to quality are easy to measure:
  – the student's SQL is much longer than the sample solution
  – LIKE is used, but the string comparison has no wildcards
  – a view is created without ever being dropped
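The three easy checks can be sketched mechanically. The length threshold, the 10-point penalty, and the regular expressions are illustrative assumptions, not the real scheme's values.

```python
import re

def quality_penalty(student_sql, sample_sql):
    """Apply the three simplistic quality checks listed above."""
    penalty = 0
    # 1. Student SQL much longer than the sample solution (here: > 2x).
    if len(student_sql) > 2 * len(sample_sql):
        penalty += 10
    # 2. LIKE used with a pattern containing no wildcard (% or _).
    for pattern in re.findall(r"LIKE\s+'([^']*)'", student_sql, re.I):
        if "%" not in pattern and "_" not in pattern:
            penalty += 10
    # 3. CREATE VIEW without a matching DROP VIEW.
    if re.search(r"CREATE\s+VIEW", student_sql, re.I) and \
       not re.search(r"DROP\s+VIEW", student_sql, re.I):
        penalty += 10
    return penalty
```

So `... WHERE name LIKE 'Smith'` (no wildcard, so `=` would do) is penalised, while `... WHERE name LIKE '%s%'` is not.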


Environment

• To use marking for assessments we need to identify who is responsible for which submissions.
• This requires the concept of users.
• In general, users are created either by the users themselves or by an administrator.
• Administering 300 students per semester by hand is not a nice thought…
• User registration by the users themselves can be problematic…


Safeguarding Registrations

• ActiveSQL uses an email registration confirmation system.
• When someone registers they record an email address.
• An email is sent to that address containing a web link.
• If they do not click on that link, their account locks up within 14 days.
• If they forget their system password, they can have a link emailed to that address which allows them to set a new password.


Some observations

• With registration, the student's name appears on the screen.
• Some students see the hidden database as a sneaky trick.
  – In "real life" the hidden database always bites back.
• Hours worked, progress so far, and success rate are all visible to the student at all times. Many students find this self-monitoring useful.
• Students failing to register is a continuing problem.
• A balance must be struck between:
  – managing and monitoring the students effectively
  – encouraging self-reliance


Statistics

• One of the main objectives of this work is to support the investigation of student behaviour.
• This allows us to make changes to the system and then measure the impact on student behaviour.
• We measure:
  – how long a student takes to get 100% on a question
  – which tutorials have been worked on, and the scores
  – how long a student spends logged into the system
  – their overall scores
  – their scores in the related exam


Experiment 1: Time management

• We found that a significant number of students waited until the last possible minute before starting their coursework.
• We instead wanted to encourage students to work at this incrementally over a long period.
• To attempt this we:
  – changed from 1 assessment to 4 smaller assessments
  – imposed a rule that you can only attempt an assessment once you have completed the corresponding tutorial
• In this way completing tutorials had some value (more than just learning).


Typical Students on Target

[Chart: "Percentage of students on target" per week number (weeks 1-7), comparing the 2002/3 and 2003/4 cohorts; y-axis 0-70%.]


Typical Students on Target

• A weekly "progress target" was defined.
  – This is the amount of material a typical student should have completed by each week of the semester.
• The chart shows that more of the 2003/4 students achieved the target.
• This improvement was immediate and lasted throughout the semester.
• In the final week more students were able to recover with a final push before the deadline.


Good Students On Target

[Chart: "Percentage of students ahead of target" per week number (weeks 1-7), comparing the 2002/3 and 2003/4 cohorts; y-axis 0-70%.]


Good Students on Target

• Good-student behaviour has decreased slightly. The likely issues here are:
  – We have crushed their "push ahead" spirit…
  – Having the assessments early means time goes on assessment workload rather than on pushing ahead.
  – They were better at controlling their time than I am.
• Luckily the effect on the good students is small, and the effect on the average student is good enough to call this a positive change overall.


Experiment 2: Reward Effort

• Student feedback identified that students were unhappy at completing a whole tutorial, then its assessment, only to discover they had achieved 0% in that assessment.
• It left them with a "why did I bother" feeling…
• To counter this, we changed the assessments so that:
  – each assessment had two questions
  – the first question was at the difficulty level of the just-completed tutorial
  – the other question was from the tutorial level below it


Time vs. Final Mark - BEFORE

[Scatter plot: time spent on tutorials (0-4000 min) vs coursework final mark (0-100%).]


Time vs. Final Mark - AFTER

[Scatter plot (CO72010 Marks/Time): time in tutorials (0-4000 mins) vs mark achieved (0-100%).]


Result #2

• Some changes in shape, but no significant change to the overall statistics.
• The change was implemented in a way that not all students would have benefited from.
• We will know more when the experiment is repeated next year.
• It just shows that not all changes result in a change!


Side Effects

• Some aspects of the system were deliberate, but had surprising side-effects…
• On each question, the time taken so far on that question was shown to the student.
• This resulted in many statements like "I have been working on this for 30 minutes" as a reason why we should tell them the answer…
• It is possible to write rubbish SQL and get a good accuracy by coincidence, but this always results in a failed hidden database check. Students will say this is unfair: if it has a high accuracy it must be right, and therefore my marking scheme must be wrong.


• In assessments, not all of the accuracy measures (and sometimes none of them) are shown to the student until they "close" an assessment.
• This is often cited as a problem… "How do you expect me to know if my SQL is right unless the system tells me?"
• The hiding of accuracy measures in assessments is something to revisit next year, as it is not clear that it is such a good thing with respect to producing good module mark statistics.
• Are we working towards good statistics, or good teaching?


More observations…

Exam vs CW mark

[Scatter plot: exam mark (0-100) vs coursework mark (0-100); R² = 0.3551.]


Future Work

• Investigate the following questions:
  – Exam vs. coursework mark… is there a correlation?
  – Is rewarding effort important, or just achievement?
  – Is cram learning better than incremental learning?
  – Are statistics more important than actual learning (from the perspective of teachers, students, and management)?
  – Is ActiveSQL only 100 times worse than SQLZoo, or is it higher?
  – Will this talk ever end?