Improving assessment quality using electronic testing

Institute of Informatics - FNS, University "Ss. Cyril and Methodius", Skopje, Macedonia
Software Engineering Education and Reverse Engineering

Goce Armenski, M.Sc
CONTENT
- INTRODUCTION
- MOTIVATION AND GOALS
- BASIC CONCEPTS OF THE SYSTEMS FOR eTESTING
- ARCHITECTURE, CONCEPTS AND FUNCTIONALITY OF eTEST
- APPLICATION OF eTEST
- RESULTS
- CONCLUSION
1. INTRODUCTION
Changes in the way people live, influenced by:
- Globalization
- Increased importance of knowledge
- The information and communication revolution

"Industrial society" → "Information society" → "Knowledge society"

50% of working skills become obsolete in 3-5 years
Changes in education: teacher-centered → student-centered education
• What are the functions of assessment?
  – Check success in achieving learning goals
  – Provide feedback
  – Improve the learning process

Keys reported a 14% improvement in student results when knowledge is assessed once a week instead of once a month.
Pikunas and Mazzota also report improvements of about 10% when tests are delivered every week instead of every 6 weeks.
Changes in education (many more students):
- Lecturing process
- Delivery of materials
- Knowledge assessment
Institute of Informatics (FNS):
- 150+ students every year
- 4 questions (assignments) x 150 students = 600 questions
- 5 min to mark one question
- 3000 min = 50 hours

How long will the marking take?
On-time feedback?
Personalized feedback?
Objectivity?
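The arithmetic above can be sketched as a quick estimate (the numbers come from the slide; the function name is illustrative):

```javascript
// Estimate manual marking effort for one exam session.
// Inputs mirror the slide: 150 students, 4 questions each, 5 min per question.
function markingEffort(students, questionsPerStudent, minutesPerQuestion) {
  const totalQuestions = students * questionsPerStudent; // 600
  const totalMinutes = totalQuestions * minutesPerQuestion; // 3000
  return { totalQuestions, totalMinutes, totalHours: totalMinutes / 60 };
}

const effort = markingEffort(150, 4, 5);
console.log(effort); // { totalQuestions: 600, totalMinutes: 3000, totalHours: 50 }
```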
2. MOTIVATION AND GOAL
Development of a system for assessment of a large number of students (more than 150) every month, which can be used in distance learning as well as in any other form of knowledge or skill assessment.
- Motivation for report creation: statistical analysis of gathered information
- Motivation for test construction: different tests of the same weight for every student, lowering the possibility of memorizing
- Motivation for simple data entry: possibility of bulk data entry
Goal

Focus of the research: to implement a system for computer-based assessment and evaluate the results of its use.

Global goal: identify the influence of this system on teachers and students.

Main goal: is computer-based assessment more effective and objective than traditional assessment?
3. eTest – CONCEPTS AND FUNCTIONALITY
Basic concepts of systems for eTesting:
- Question bank
- Algorithms for test creation
- Systems for data presentation
- Result reports

Examples: QuestionMark, BlackBoard, WebCT, TopClass, EduSystem
Question bank

A database of unique questions, with the characteristics needed for simple selection during test construction; some of the parameters are created dynamically.

- Standards for question bank development
- Exchange of questions at university level
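A minimal sketch of how such a bank supports selection during test construction: each question carries metadata, and the constructor filters on it. The field names here are illustrative assumptions, not the actual eTest schema.

```javascript
// Sketch of a question bank: each question carries metadata used to
// filter candidates during test construction.
// Field names (topic, type, difficulty) are illustrative, not the eTest schema.
const questionBank = [
  { id: 1, topic: "A1", type: "multichoice", difficulty: 0.3 },
  { id: 2, topic: "A1", type: "shortEntry", difficulty: 0.7 },
  { id: 3, topic: "A2", type: "multichoice", difficulty: 0.5 },
  { id: 4, topic: "A2", type: "essay", difficulty: 0.6 },
];

// Select questions matching a topic and a difficulty range.
function selectQuestions(bank, topic, minDiff, maxDiff) {
  return bank.filter(
    (q) => q.topic === topic && q.difficulty >= minDiff && q.difficulty <= maxDiff
  );
}

console.log(selectQuestions(questionBank, "A1", 0.0, 0.5).map((q) => q.id)); // [1]
```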
Question types:
- Fixed-answer questions (objective):
  - multiple choice
  - short entry
  - questions with graphical selection (hotspot)
- Free-text answers:
  - programming code as answer
  - essay answer

Fig. 2 Types of questions for computer-based assessment
Algorithms for test creation

The main difference between them is the degree of adaptation to the characteristics of the person whose knowledge is assessed.

Fig. 3 Algorithms for test creation, by level of adaptation of the test: linear, dynamic linear, testlets, mastery models, adaptive
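A dynamic linear test (the variant eTest uses, per a later slide) draws a fixed number of questions at random per student, so every test has the same length but different content. A minimal sketch, assuming a simple random draw without replacement; the pool and RNG are illustrative:

```javascript
// Sketch of dynamic linear test generation: every student gets a fixed
// number of questions drawn at random from the pool, so tests differ
// between students but have the same length.
function generateTest(pool, testLength, random = Math.random) {
  if (testLength > pool.length) throw new Error("pool too small");
  const copy = pool.slice();
  // Partial Fisher-Yates shuffle: pick testLength distinct random items.
  for (let i = 0; i < testLength; i++) {
    const j = i + Math.floor(random() * (copy.length - i));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy.slice(0, testLength);
}

const pool = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"];
console.log(generateTest(pool, 4)); // e.g. ["Q5", "Q2", "Q6", "Q1"]
```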
Systems for data presentation

Content depends on the characteristics of the monitor and the network.

Marking and reporting

In which way does the system show the results?
- Does it show the correct answers?
- Is the result in points or percentages?
- Negative marking?
eTest technology

- Web-based application
- Active Server Pages (ASP)
- JavaScript
- SQL Server 2000
- NT Server and Win 2000 compatible
- IIS 4.0 or newer

Web-based solutions vs. desktop-based solutions

Client: web browser (Netscape 4.x or Internet Explorer 4.x and above). Server: Win 2000 Server with IIS 4.0 + ASP, SQL Server/Access, JScript, email (SMTP).
eTest architecture

Fig. 4 Three-tier architecture of the eTesting system: a web server (system modules) with a user interface layer, an application logic layer, and a data layer that sends SQL statements to the database and receives data.
eTest concepts:
- Types of users
- Course organisation
- Types of questions
- Test creation algorithm
- System for data presentation
- Marking and reporting

Diagram: the system for electronic testing connects the system administrator (users, courses, study programmes), the course administrator (questions, topics, strategies, results), and the student/tester (tests, answers, results).
Types of users

Fig. 5 Types of users
Course organisation

- learning objects
- tree structure

Fig. 6 Course organisation: a lecture is divided into parts (A, B, C), and each part into sets (A1, A2, A3, A4)
Types of questions

• Multiple-choice questions (choose one of many, choose many of many, yes/no answers);
• Short-entry answer (text or numerical);
• Essay answer.

• Questions can include pictures or graphs in the text or in the offered answers.
Multiple-choice questions

Fig. 7 Choose one of many
Fig. 8 Choose many of many
Short-entry answers

Fig. 9 Short entry
Questions with essay answer

- these answers are not evaluated by the system
- lowering objectivity

Automated essay scoring systems: Project Essay Grade (PEG), Intelligent Essay Assessor (IEA), E-rater, Bayesian Essay Test Scoring System (BETSY).

Fig. 10 Essay answer
Test creation algorithm
- dynamic linear tests (fixed number of questions)

System for data presentation
- adjusted to Web standards (800x600)
- possibility for picture and graph display

Marking and reporting
- results are shown at the end of the test
- negative marking
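Negative marking means a wrong answer subtracts points rather than scoring zero. A minimal scoring sketch, assuming a 25% penalty per wrong answer (the deck does not state eTest's actual penalty):

```javascript
// Sketch of marking with a negative-marking rule: a correct answer earns
// full points, a wrong answer costs a fraction of them, and a skipped
// question scores zero. The 25% penalty is an illustrative choice.
function scoreTest(answers, pointsPerQuestion = 1, penaltyFraction = 0.25) {
  let score = 0;
  for (const a of answers) {
    if (a === "correct") score += pointsPerQuestion;
    else if (a === "wrong") score -= pointsPerQuestion * penaltyFraction;
    // "skipped" adds nothing
  }
  return Math.max(0, score); // never report a negative total
}

console.log(scoreTest(["correct", "correct", "wrong", "skipped"])); // 1.75
```

The penalty discourages blind guessing: on a four-option question, random guesses with a 25% penalty average out to zero expected gain.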
Statistical data analysis

• Identification of content which is not well presented;
• Personalized feedback to students;
• Identification of weak questions which need to be revised before being used again;
• Identifying the individual weaknesses of students.

Important statistical data about questions:
• difficulty – how difficult a question is;
• discrimination – how well the question differentiates good from bad students;
• guessing – how often students answer correctly without knowing the answer;
• different achievements – what the results of different user groups are.
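The first two statistics above have simple classical forms: difficulty is the proportion of students answering correctly, and discrimination compares that proportion between high- and low-scoring students. A sketch of this common approach, which is not necessarily the exact formula eTest computes:

```javascript
// Classical item statistics. `results[i][q]` is 1 if student i answered
// question q correctly, else 0.
function itemDifficulty(results, q) {
  // Proportion of students answering correctly: a low value = a hard question.
  const correct = results.filter((r) => r[q] === 1).length;
  return correct / results.length;
}

function itemDiscrimination(results, q) {
  // Difference in difficulty between the top and bottom halves, ranked by
  // total score: higher = better separation of strong from weak students.
  const total = (r) => r.reduce((s, x) => s + x, 0);
  const sorted = results.slice().sort((a, b) => total(b) - total(a));
  const half = Math.floor(sorted.length / 2);
  const top = sorted.slice(0, half);
  const bottom = sorted.slice(sorted.length - half);
  return itemDifficulty(top, q) - itemDifficulty(bottom, q);
}

const results = [
  [1, 1, 1], // strong student
  [1, 1, 0],
  [1, 0, 0],
  [0, 0, 0], // weak student
];
console.log(itemDifficulty(results, 0)); // 0.75
console.log(itemDiscrimination(results, 0)); // 0.5
```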
4. APPLICATION OF eTest
- Integration in the learning process
- Controlled learning

Fig. 12 Way of passing the learning objects: study A1 → test A1 → study A2 → test A2 → study A3 → test A3 → study A4 → test A4 → study A (A1, A2, A3, A4) → test A (A1, A2, A3, A4)
Successful strategy?
- all questions
- N questions in a row
- N right questions
- 3 right questions in a row
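The last strategy, "3 right questions in a row", can be sketched as a streak check over a student's answer log (function name illustrative):

```javascript
// Sketch of the "N right answers in a row" mastery strategy: the learning
// object counts as passed once the streak of consecutive correct answers
// reaches the threshold (the slide mentions 3 in a row).
function hasMastered(answerLog, streakNeeded = 3) {
  let streak = 0;
  for (const correct of answerLog) {
    streak = correct ? streak + 1 : 0; // a wrong answer resets the streak
    if (streak >= streakNeeded) return true;
  }
  return false;
}

console.log(hasMastered([true, false, true, true, true])); // true
console.log(hasMastered([true, true, false, true, true])); // false
```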
Statistical analysis of user activities
5. RESULTS
The use of technology in education is very dependent on the organization:
- logistics
- social changes
- synchronisation with other systems
- technical infrastructure
Practical implementation

• Institute of Informatics, FNS (2001)
• A.D. Mobimak (2002)
• UNDP (2003-2004)
Data gathering

Institute of Informatics, FNS (January 2001):
- 26 courses
- 12391 questions
- 589 scheduled assessments
- 9861 generated tests

Does eTesting provide more effective and more objective assessment compared to traditional forms, and does it help the learning process?
Cost savings

Costs: technical infrastructure, software development and maintenance, training of administrators, technical personnel.

Savings: decreased use of paper, decreased use of printed material, decreased time for conducting the assessment.

Teacher perspective
- Creating the question bank is time consuming: more than 1300 questions per course
- Time savings after the creation

Student perspective
- Questionnaire with 10 questions
- Questions with Likert-type answers with 5 values
- 236 students were enrolled in the survey
Question 1
The use of the system for electronic testing is: very difficult 0,4%, difficult 3,8%, moderate 43,2%, easy 28,4%, very easy 23,7%, no answer 0,4%
Question 2

Using the system for electronic testing I can express: a shorter range of skills 11,9%, the same range of skills 38,1%, a wider range of skills 49,6%, no answer 0,4%
Question 3

The electronic testing is __________ compared to the traditional one: very difficult 0,4%, difficult 3,4%, the same 30,9%, easy 51,7%, very easy 12,3%, no answer 1,3%
Question 4

I prefer assessment using the system for electronic testing to the traditional one: strongly disagree 2,1%, disagree 2,1%, neutral 11,0%, agree 26,3%, strongly agree 58,5%, no answer 0,0%
Question 5

Marking on the system for electronic testing is objective (the same criteria for all): strongly disagree 0,8%, disagree 4,7%, neutral 7,6%, agree 33,5%, strongly agree 52,1%, no answer 1,3%
Question 6

The use of the module for online learning with frequent knowledge assessment helps with the material: strongly disagree 0,0%, disagree 2,1%, neutral 9,3%, agree 36,9%, strongly agree 51,7%, no answer 0,0%
Question 7

Using the module for online learning with frequent knowledge assessment helps me achieve more knowledge: strongly disagree 0,8%, disagree 3,8%, neutral 22,9%, agree 37,7%, strongly agree 34,4%, no answer 0,4%
Questions 8 and 9

Fast feedback about the results: advantage 96,6%, disadvantage 0,4%, no answer 3,0%
Detailed feedback on the questions where the student made a mistake: advantage 92,8%, disadvantage 2,1%, no answer 5,1%
The opportunity for self-testing: advantage 96,6%, disadvantage 0,0%, no answer 3,4%
The exam is more relaxed and less stressful: advantage 80,5%, disadvantage 8,9%, no answer 10,6%
I use more time for thinking than writing: advantage 89,8%, disadvantage 3,4%, no answer 6,8%
Questions 8 and 9 (cont.)

The system gives opportunity for cheating: yes 13,9%, no 63,6%, no answer 22,5%
The system gives opportunity for guessing: yes 26,7%, no 54,2%, no answer 19,1%
Question 10

I recommend the use of the system for electronic testing for assessments in other courses: strongly disagree 1,3%, disagree 1,7%, neutral 8,9%, agree 27,5%, strongly agree 59,3%, no answer 1,3%
Summary of the results

Question | N | Mean | Std
The use of the system for electronic testing is: | 235 | 3,7149 | 0,8865
The electronic testing is __________ compared to the traditional one | 233 | 3,7296 | 0,7368
I prefer assessment using the system for electronic testing to the traditional one | 236 | 4,3686 | 0,9154
Marking on the system for electronic testing is objective (same for all) | 233 | 4,3305 | 0,8748
The use of the module for online learning with frequent knowledge assessment helps with the material | 236 | 4,3814 | 0,7428
Using the module for online learning with frequent knowledge assessment helps me achieve more knowledge | 235 | 4,0128 | 0,8986
I recommend the use of the system for electronic testing for assessment in other courses | 233 | 4,4378 | 0,8288
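The Mean and Std columns come from the coded Likert responses (1 = strongly disagree … 5 = strongly agree, "no answer" excluded). A sketch of that summary, assuming the sample standard deviation formula (the deck does not state whether the sample or population formula was used), on illustrative data rather than the actual survey responses:

```javascript
// Summarize one Likert item from raw response codes (1-5).
// Uses the sample standard deviation (divide by n - 1); whether the
// deck's Std column used this or the population formula is an assumption.
function likertSummary(responses) {
  const n = responses.length;
  const mean = responses.reduce((s, x) => s + x, 0) / n;
  const sumSq = responses.reduce((s, x) => s + (x - mean) ** 2, 0);
  const std = Math.sqrt(sumSq / (n - 1));
  return { n, mean, std };
}

// Illustrative data, not the actual 236-student survey.
const sample = [5, 4, 4, 5, 3, 5, 4];
console.log(likertSummary(sample));
```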
6. CONCLUSION
System implementation
• Dependent on the institution in which it is implemented

Improvements in the assessment process
• Possibility to assess the knowledge of more than 150 students
• Immediate feedback
• Lowering subjectivity
• Self-testing possibility before the official assessment
• Analysis of the gathered data
• Positive effects on the learning process
• Increased security?

Combining this method with other methods of assessment can give great results.