Course BIONF 2201 (Dental Information Systems #2201), February 20, 2004

Evaluating educational software
Instructor: Heiko Spallek, DMD, PhD

Based on papers/presentations by:
T. Schleyer, DMD, PhD, University of Pittsburgh
L. Johnson, PhD, University of Iowa
D. Rubright, MFA, MA, University of Iowa
H. Spallek, DMD, PhD, University of Pittsburgh
Outline
– A framework for evaluating educational software
– Asking the learners …
– Guidelines for the Design of Educational Software
– ADEA Software Competition
– Heuristic Evaluation (your assignment)
What constitutes “good” information?
What makes a computer-based course “good”?
Domains of quality criteria
– Pedagogical Issues
– Subject Matter
– Language/Grammar
– Surface Features
– Menus
– Questions
– Feedback
– Invisible Functions
– Off-line Materials
– Evaluation
Purpose of study
Designed to:
– verify applicability of quality criteria
– survey the state-of-the-art in online continuing dental education
Results applicable to a broad range of computer-based materials, not just Web-based courses
Design
– Complete survey of Web-based CDE using indices and search engines
– Manual review and coding for 34 criteria
– Summarization of raw data for each criterion
Criteria
– Provider and course listing
– Course description
– Course format
– Course content and interaction
Results - Providers and courses
– Total yield: 157 courses (25 hrs of searching!)
– 32 providers
– Universities currently provide the highest number of courses per institution
Course topics
Topic N % of total
Periodontology 31 19.7
Oral Diagnosis 18 11.5
Pathology 9 5.7
Prosthodontics 8 5.1
Implantology 7 4.5
Basic Science 6 3.8
Dental Materials 6 3.8
Course formats
– Brochure or book format
– Slide show
– Case report
– Newsletter or composite report
Credit hours
– 78 courses offered credit hours
– Price per credit hour: $5 - $25
Credit hours vs length in screens
[Scatter plot: credit hours (0-8) plotted against course length in screens (0-80)]
Media used
– Text: 100%
– Images: 84%
– Video: 2%
– PDF: 7%
Questions
77 (53%) of all courses included questions:
– multiple choice: 85%
– true/false: 34%
– open-ended: 8%
Questions mostly at the end, in a few cases throughout
28% of tests were scored online
Dates
– Date of creation: 11%
– Date of last update: 24%
Course length
Content
– Author not indicated: 71%
– No goals and objectives: 23%
– No references: 85%
Navigation
– Direct indication of progress: 23%
– Indirect indication of progress: 45%
– Progress actively obscured: 32%
– Navigation: approx. 60% easy or very easy to navigate
Interaction
– No e-mail contact possible: 47%
– Author's e-mail listed: 24%
– Other e-mail: 29%
Discussion
Limitations:
– design guidelines are preliminary only
– study used only a subset of design criteria
– some criteria subjective (navigation, length)
– not certain that all online courses in dentistry were found
– password-protected courses not reviewed
Discussion
– Variation of credit hours vs. course length
– Dearth of true multimedia courses
– Testing and feedback use Internet capabilities only marginally
– Low compliance with accepted standards for educational materials
Discussion (cont.)
– Poor use of navigational design
– Interaction obviously not desired in most courses
– Advanced functions of educational software not used (e.g., customization after pretest)
Recommendations
– Disseminate Guidelines widely
– Online CDE should be peer-reviewed
– Develop valid instruments for assessing courses
– Insert TITLE and KEYWORD tags into HTML
– Establish central index of courses
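The TITLE/KEYWORD recommendation is easy to act on mechanically. As a sketch only (the course page below is hypothetical, not from the study), a central index could audit submitted course pages for the two tags with a few lines of standard-library Python:

```python
# Hypothetical checker for the "insert TITLE and KEYWORD tags" recommendation:
# flags course pages missing the tags that search engines and indices rely on.
from html.parser import HTMLParser

class TagAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_keywords = False

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs with lowercased names
        if tag == "title":
            self.has_title = True
        if tag == "meta" and (dict(attrs).get("name") or "").lower() == "keywords":
            self.has_keywords = True

def audit(page_html):
    parser = TagAudit()
    parser.feed(page_html)
    return {"title": parser.has_title, "keywords": parser.has_keywords}

# Hypothetical course page used only to exercise the checker.
page = """<html><head>
<title>Periodontology Update: A Web-Based CDE Course</title>
<meta name="keywords" content="dentistry, continuing education, periodontology">
</head><body>...</body></html>"""
print(audit(page))  # -> {'title': True, 'keywords': True}
```

A page lacking both tags would come back `{'title': False, 'keywords': False}`, which is exactly the condition the recommendation targets.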
Evaluation of Web-based Dental CE-Courses
Heiko Spallek DMD PhD*, Elizabeth Pilcher DMD**, Ji-Young Lee***, Titus Schleyer DMD PhD*

* Center for Dental Informatics, University of Pittsburgh, School of Dental Medicine, Pittsburgh, PA
** Medical University of South Carolina, College of Dental Medicine, Charleston, SC
*** Temple University School of Dentistry, Philadelphia, PA
Study goals
Evaluate the outcomes of online CDE courses through analysis of 6 organizations, focused on:
– how the participants of online CDE can be characterized
– whether the participants' expectations were met by the courses
– how the participants evaluated the content of the courses
– why they enrolled
– how they experienced the online environment
→ develop recommendations for the design of future courses
Study design
– Exploratory study
– Survey of 436 past course participants from 9 online CDE courses from 6 organizations (health care schools and commercial CE providers)
– Courses varied in content, length, type of provider, and tuition
– Inclusion criteria: continuing education courses in dentistry that had granted continuing education credits online for at least a year
Evaluated CDE courses

Course title / Organization | Features | Online since | Former participants | CE credit hours
Nitrous Oxide Conscious Sedation / University of Texas Health Science Center, San Antonio, Dental School | Preview section, quizzes, can download in PDF | Oct. 1998 | 61 | 9
Top 40 Drugs / Medical University of South Carolina, College of Dental Medicine | Quiz, e-mail instructor, outside links | Sept. 1998 | 161 | 2
Dentistry on the Internet / Temple University School of Dentistry | Pre- and post-test, class listserve, quiz, communication with the course instructor | Oct. 1997 | 113 | 3
Submitting an Invention to a Dental Manufacturer / University of Michigan, School of Dentistry | Post-test | June 1997 | 21 | 1
Asthma / Procter & Gamble | Course test | Not avail. | 40 | 2
Introduction to Composite Dentistry / DentalXchange.com | Course test | Nov. 1999 | 10 | 2
Treating the Unscheduled Dental Emergency / DentalXchange.com | Course test | Nov. 1999 | 10 | 3
Tooth Bleaching / DentalXchange.com | Course test | Mar. 1999 | 10 | 2
Tricks of the Trade in Endodontics / DentalXchange.com | Course test | Jan. 1999 | 10 | 2
Study design: survey
"The Tailored Design Method" by Don Dillman
– self-administered, dual-mode (e-mail and postal mail), partially branched survey (concise, limited the number of open-ended questions)
– 3 demographic questions
– 5 computer literacy questions
– 4 specific course material questions
– 6 online environment questions
– 3 course content questions
– 4 marketing questions
→ instrument available at http://di.dental.pitt.edu/cesurvey/
Results/discussion: response rate

Course | Solicitations | Respondents | Rate | Completed with CE certification
Asthma | 40 | 20 | 50.00% | 20
Dentistry on the Internet | 113 | 42 | 37.17% | 12
Nitrous Oxide Conscious Sedation | 61 | 44 | 72.13% | 5
Submitting an Invention to a Dental Manufacturer | 21 | 10 | 47.62% | 10
Top 40 Drugs | 161 | 41 | 25.47% | 6
Tricks of the Trade in Endodontics | 10 | 5 | 50.00% | 5
Treating the Unscheduled Dental Emergency | 10 | 2 | 20.00% | 1
Introduction to Composite Dentistry | 10 | 4 | 40.00% | 3
Tooth Bleaching | 10 | 1 | 10.00% | 1
Total | 436 | 169 | 38.76% | 63
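The rate column above is simple arithmetic (respondents divided by solicitations), which can be double-checked in a few lines; the figures are taken directly from the table:

```python
# Sanity check of the response-rate column: rate = respondents / solicitations.
courses = {
    "Asthma": (40, 20),
    "Dentistry on the Internet": (113, 42),
    "Nitrous Oxide Conscious Sedation": (61, 44),
    "Top 40 Drugs": (161, 41),
}

for name, (solicited, responded) in courses.items():
    print(f"{name}: {responded / solicited:.2%}")

# Overall: 169 respondents out of 436 solicitations.
print(f"Overall: {169 / 436:.2%}")  # -> Overall: 38.76%
```

The computed percentages match the table (e.g., 44/61 = 72.13% for the nitrous oxide course and 169/436 = 38.76% overall).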
Results/discussion: demographics
Responses to the question "On the whole, how sophisticated a computer user do you consider yourself?"

Self-assessment category | Number of responses | Percentage
very unsophisticated | 18 | 14%
unsophisticated | 14 | 11%
neither sophisticated nor unsophisticated | 48 | 37%
sophisticated | 32 | 25%
very sophisticated | 18 | 14%
Total | 130 | 100%
Results/discussion: marketing
Responses to the question "How did you learn about this particular course?"
– 19% Internet search engine
– 15% course provider's homepage
– 15% personal recommendation
– 9% professional journal
– 2% alumni journal
– 10% other sources
Results/discussion: marketing
Participants' perceived respectability of an online CDE course:
Agreement with the statement "I am more likely to tell others about my participation in an online course than my participation in a traditional classroom-based lecture."
– 19% strongly agreed
– 11% somewhat agreed
– 24% neither agreed nor disagreed
– 6% somewhat disagreed
– 1% strongly disagreed
– 8% no opinion
Results/discussion: online environment

Time spent working online for the course?
– "Dentistry on the Internet" => 14.5 hours
– "Submitting an Invention to a Dental Manufacturer" => 1 hour

When?
– 27% accessed the course material during work hours
– 79% after work hours
– 6% specified both

From where?
– 31% home
– 64% their office
– 3% a library
Results/discussion: online environment

What was the single most important reason that attracted you to an online course?
– "convenience": 47%

What was the biggest disadvantage to the online format?
– "lack of human interaction": 13%
– "cannot ask questions": 12%

"The lack of face-to-face contact with a teacher was a stumbling block for your learning":
– agreed: 18%
– disagreed: 65%
– neither agreed nor disagreed: 17%
Results/discussion: meeting initial expectations
Courses ranked equally well in most categories:
– "exploit the convenience of online learning"
– "fit the course into my schedule"
Problem spots:
– cannot "communicate with peers online"
– cannot "interact one-to-one with the instructor"
Authors' experiences suggest participants seldom raise content-related questions; since its inception, the instructor of the course "Top 40 Drugs" has received a total of four content-related questions.
Conclusions
– Evaluated online CDE courses do meet some of the needs and expectations of dental professionals.
– Generally limited number of participants → no ROI*
– Participants mainly originated in the United States
– Recommendations for online course development →

* Carr, Sarah. Is anyone making money on distance education? The Chronicle of Higher Education, 2-16-2001
Recommendations
Online CDE courses need to:
– be current
– cover the subject matter in depth
– be guided by an experienced instructor
– define the average time necessary to complete the entire course
– be marketed among dental professionals
Pedagogical Issues
– Benefits of the computer
– Instructional techniques match the audience
– Instructional techniques match the content
– Assessment strategy
– Customizable content
– Content is reinforced
– Interactions vary

How well does it teach? Are activities appropriate for the audience, objectives and content?
Pedagogical Issues
Benefits of the computer
www.lib.uiowa.edu/commons/skullvr/index.html
Pedagogical Issues
Instructional techniques match audience and content
– D4 (Dx & Tx Planning): DiagnosticBytes
– D1 & D2 (Assessment): Assessment of Geriatric Patients
Pedagogical Issues
Assessment Strategy
Diagnosis of Head and Neck Pain
Pedagogical Issues
Interactions vary
Diagnosis of Head and Neck Pain
Subject Matter
– Information is complete, accurate, & logically organized

Is the content accurate and appropriate for the audience?

www.uiowa.edu/~oprm/AtlasHome.html
Language/Grammar
– Glossary

Assessment of Geriatric Patients

Is language usage appropriate and understandable?
Surface Features
– Media is easily used
– Screen & application enhance learning
– Color is coordinated
– Text is easy to read
– Bookmarks used
– Opportunity for errors minimized

Diagnosis of Head and Neck Pain

Does media play correctly, is text readable, and is the overall look pleasing?
Surface Features -- Menus
– Clear user controls
– Menus have clear labels
– Completed sections indicated
– Esthetic screen displays

research.dentistry.uiowa.edu/summaries/index.html

Will students get lost?
Questions
– Reflect objectives
– Interspersed
– Easy to answer
– Change answers
– Skip questions

Are there questions for the student to answer to gauge if they are learning or making progress in the program?
Feedback
– Indicates correct/incorrect answers
– Provides informative feedback
– Uses media when appropriate

Is feedback given to guide the student and make learning more efficient and effective?
Invisible Functions
– Data continuously stored
– Data collection turned on/off
– Data secure
– Reports generated

Assessment of Geriatric Patients

What is going on behind the scenes to check on student progress?
Off-line Materials
– Equipment requirements
– Troubleshooting information
– Operating instructions
– Curriculum integration suggestions
– Support materials provided

Assessment of Geriatric Patients

How is the program supported and connected to other resources?
Formative Evaluation
47 students
– 15 schools
– 3 countries
Computer support
– Installation problems
– Operation problems
Observation
– Student

DiagnosticBytes

Can I suggest changes to make the program better in the future?
Formative Evaluation
Strengths
+ Easy manipulation of images
+ Easy to enter a diagnosis
+ Highly interactive
+ Element of fun
+ Futuristic setting
+ Patient "spoke" to responses
+ Good graphics
+ Element of surprise
+ Great deal of decision-making

DiagnosticBytes
Formative Evaluation
Weaknesses
– Certain images too small
– Instrument interface clumsy
– Mark, the assistant, became annoying
– Help needs expansion
– Consent is difficult to find
– Patient did not say why a proposed treatment was rejected
– Expand the evaluation
– Evaluation needs to display images during discussion

DiagnosticBytes
Summative Evaluation
Design
                     AAS      BS      total
Control (Didactic)   n = 16   n = 8   24
Simulation           n = 14   n = 20  34
total                30       28      58

Assessment of Geriatric Patients

Is this program effective? What measures are used? Is it fair?
Summative Evaluation
Results
             AAS       BS
Control      NSD       p < .01
Simulation   p < .05   p < .002

Assessment of Geriatric Patients
A vision for the future
[Diagram: multiple Content sources feed a Knowledge base; Decision support, Education, and Analysis are delivered through a User interface]
Developing a Protocol for an Educational Software Competition
Background
Standards Committee for Dental Informatics developing Guidelines for the Design of Educational Software* (ANSI accredited)
133 criteria in 8 categories:
– Pedagogical Issues
– Subject Matter
– Language and Grammar
– Surface Features
– Questions, Answers and Feedback
– Invisible Functions
– Off-line Materials
– Evaluation

* http://www.temple.edu/dentistry/di/edswstd
Background
Educational Computing in Dentistry Competition
– No published/validated protocol for evaluation of educational software
– Has evolved into a rating instrument
– Application of Guidelines assists further development and validation
– Application of Guidelines assists adoption of standards
– Promotes educational software as a promotional activity
Methods
Use Guidelines to develop rating instrument:
– Pedagogy
– Subject Matter
– Technical Aspects
Include most relevant criteria
Judge completes review in 2 hours
Methods – Rating Scale
Four-point scale:
– 3 = agree
– 2 = somewhat agree
– 1 = somewhat disagree
– 0 = disagree
Methods
Pedagogy
– The computer is appropriate for the instructional objective.
– The program offers a variety of interactions.
Methods
Subject Matter
– The goals and objectives of the program are clearly stated.
– The subject matter presented is accurate.
– The subject matter presented matches the knowledge level of the audience.
Methods
Technical Aspects
– The screen displays draw attention to important information.
– Type styles are easy to read.
Methods -- Participants
– Open to all ADEA members
– 3,000 individual members
– Dental institutions (n=55)
– Dental hygiene programs (n=67)
Methods -- Competition
Two categories: CD-ROM, World Wide Web
Awards: 1st -- $1,000; 2nd -- $500; 3rd -- $250; Honorable Mention
Methods – Entry Process
Submission via Web (n = 30), including:
– Required hardware and software
– Objectives
– Audience
– Description
– Formative Evaluation
– Summative Evaluation
Methods -- Review
Recruit qualified judges (n=13):
– 6 dental faculty active in dental informatics
– 6 instructional designers
– 1 dental hygienist/instructional designer
Methods – Review
Calibrate judges
– 2 groups
– One-hour-long conference call
– Review process & instrument
– Gather feedback & make changes to instrument & process
Methods – Review
– Enter/submit rating & open-ended comments on spreadsheet
– Random assignment of programs
– Assignments avoided conflict of interest
– 2 judges / program
– 5 - 7 programs / judge (2 weeks)
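The assignment step described above (random pairing of two judges per program, skipping judges with a conflict of interest and capping each judge's load) can be sketched as follows; the judge names, program names, and conflicts here are hypothetical, and the slides do not specify the exact algorithm used:

```python
# Sketch of random judge assignment with conflict-of-interest avoidance.
import random

def assign(programs, judges, conflicts, per_program=2, max_load=7):
    random.seed(0)                      # reproducible for this sketch
    load = {j: 0 for j in judges}       # programs already given to each judge
    assignment = {}
    for p in programs:
        # A judge is eligible if they have no conflict with this program
        # and have not yet reached their maximum load.
        eligible = [j for j in judges
                    if p not in conflicts.get(j, set()) and load[j] < max_load]
        picked = random.sample(eligible, per_program)
        for j in picked:
            load[j] += 1
        assignment[p] = picked
    return assignment

judges = ["J1", "J2", "J3", "J4"]
programs = ["ProgA", "ProgB", "ProgC"]
conflicts = {"J1": {"ProgA"}}           # e.g., J1 helped develop ProgA
result = assign(programs, judges, conflicts)
assert all(len(v) == 2 for v in result.values())   # 2 judges per program
assert "J1" not in result["ProgA"]                 # conflict respected
```

A production version would also need to balance loads more evenly (the 5-7 programs per judge range above), but the eligibility filter is the essential constraint.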
Analysis
Calculated raw summary scores:
– Pedagogy: 96 points maximum
– Subject Matter: 33 points maximum
– Technical Aspects: 75 points maximum
– TOTAL: 204 points maximum
Adjust for N/A ratings
Average scores
Evaluation scores weighted by a factor of 5
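The scoring arithmetic can be illustrated with a small sketch. Each criterion is rated 0-3 on the four-point scale, with `None` marking an N/A rating; the N/A adjustment shown (rescaling a judge's sum over answered items up to the category maximum) is an assumption, since the slides name the step but not the formula, and the sample ratings are invented:

```python
# Hypothetical sketch of raw summary scoring with an N/A adjustment.
def category_score(ratings, max_points):
    answered = [r for r in ratings if r is not None]   # drop N/A items
    if not answered:
        return 0.0
    raw = sum(answered)
    possible = 3 * len(answered)        # 3 = "agree", top of the 4-point scale
    return raw / possible * max_points  # rescale to the category maximum

# Invented ratings for one program, one judge:
pedagogy = category_score([3, 2, None, 3], max_points=96)
subject = category_score([3, 3, 2], max_points=33)
technical = category_score([2, 2, 3, None], max_points=75)
total = pedagogy + subject + technical  # out of 204
print(round(total, 1))  # -> 173.0
```

The remaining steps on the slide (averaging the two judges' totals and weighting evaluation-related criteria by a factor of 5) would layer on top of these per-category scores.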
Results – Judges’ Time
– Mean = 50 minutes
– Range = 30 minutes to 4 hours
WWW Composite Scores
[Stacked bar chart: total points (0-180) for Web products A-Q, broken down into Pedagogy, Subject Matter, and Technology scores]
CD-ROM Composite Scores
[Stacked bar chart: total points (0-180) for CD-ROM products A-K, broken down into Pedagogy, Subject Matter, and Technology scores]
Formative Evaluation of Review Process
Two conference calls with judges
Instrument feedback:
– Different program types reviewed with same instrument
– 4-point scale constraining
– Program "presentation" may impact rating
– Need to assign weights to criteria, especially evaluation
Formative Evaluation of Review Process
– Time consuming
– Required a team approach: technologist and content expert
Judge calibration:
– Requires additional training
– Currently limited by costs
Discussion
Strengths of process & instrument:
– Dental and instructional design experts involved
– Instrument based on national guidelines
– Calibration of judges (limited)
– Formative evaluation will improve process & instrument
Your assignment

There are a variety of expert review methods to choose from:
– Heuristic evaluation
– Guidelines review
– Consistency inspection
– Cognitive walkthrough
– Formal usability inspection
Expert Review
Method: Heuristic Evaluation = systematic inspection of a user interface design for usability
– most popular usability inspection method
– goal: find usability problems (fix them as part of an iterative design process)
– a small set of evaluators examines the interface
– evaluators judge compliance with recognized usability principles (the "heuristics")
– general heuristics + category-specific heuristics
Expert Review
Ten Usability Heuristics by Jakob Nielsen
– Visibility of system status
– Match between system and the real world
– User control and freedom
– Consistency and standards
– Error prevention
– Recognition rather than recall
– Flexibility and efficiency of use
– Aesthetic and minimalist design
– Help users recognize, diagnose, and recover from errors (bad, good example)
– Help and documentation
Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods, John Wiley & Sons, New York, NY.
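For the assignment, one simple way to record a heuristic evaluation is to log each problem against one of Nielsen's heuristics with a severity rating (Nielsen's 0-4 scale, where 0 = not a problem and 4 = usability catastrophe) and then merge findings across evaluators. This is only an illustrative sketch; the findings below are hypothetical:

```python
# Hypothetical record-keeping for a heuristic evaluation with several evaluators.
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

# (evaluator, heuristic violated, problem description, severity 0-4)
findings = [
    ("eval1", "Visibility of system status", "No progress indicator in course", 3),
    ("eval2", "Visibility of system status", "No progress indicator in course", 2),
    ("eval2", "Help and documentation", "No help for quiz navigation", 1),
]

# Group severity ratings by problem, then average across the evaluators
# who reported it.
merged = {}
for evaluator, heuristic, problem, severity in findings:
    assert heuristic in HEURISTICS
    merged.setdefault((heuristic, problem), []).append(severity)

for (heuristic, problem), severities in sorted(merged.items()):
    mean = sum(severities) / len(severities)
    print(f"[{heuristic}] {problem}: mean severity {mean:.1f}")
```

Averaging severities across evaluators is useful because, as the slide notes, a small set of evaluators examines the interface independently, and no single evaluator finds every problem.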