
Choosing Critical Indicators in Online Learning Evaluation

Mark Hawkes, Dakota State University
Merrill Chandler, University of Illinois
American Evaluation Association Annual Conference, November 8, 2001

Presentation Objective

Discuss online learning evaluation approaches in graduate programs at two universities
Identify criteria/indicators suitable for the evaluation of online learning environments

Distance Learning Literature

[Chart: Characteristics of the distance learning literature]
Evaluation: No (79%) / Yes (21%)
Focus: Training (87%) / Education (13%)
Impact on learning examined: No (58%) / Yes (42%)

[Diagram: Online Learning Architecture, connecting the instructor and students through the Internet to the login interface, content, resources, assessment, communication modes, support services, system resources, and management and organizational information]

[Diagram: Interaction Example, showing the learner and coordinator working with delivery, assessment, learning resources, learner records, catalog information, query preferences, and performance data (Metcalf, Snitzer & Austin, 2001)]

Familiar Online Learning Evaluation Targets . . .

Interface design
Instructional design
Student satisfaction
Technology access
Faculty satisfaction
Economic viability
Departmental capacity
Interdepartmental collaboration

DSU’s Educational Technology Program

Students:
36 credit-hour MS program
80% Education; 20% Business/industry
90% Online; 10% On campus
Female 68%; Male 32%
Project-based curriculum

DSU’s ET Environment

Pervasive technological culture
Consistency between program goals and state/region-wide initiatives
Campus-wide faculty support
Institutional experience in Web-based instruction delivery
Multi-delivery methods
Clients: teachers, teacher developers, trainers, technology coordinators, etc.
Predominantly Web-based delivery

An Evaluation Model . . .

[Diagram: Evaluation model as a continuum from illuminative evaluation (operation of components and subcomponents; observing and detecting functional problems) to integrative evaluation (holistic perspective on the learning experience; focused on performance outcomes), spanning the components Infrastructure/System, Course & Program Design, Work Flow, Interaction, and Impact]

Infrastructure/System

Input/output devices
Network speed and connectivity
Network design/topology
Technical support systems and maintenance

Course and Program Design

Nature of the design: situation based
Role of state and national standards
Sequencing/instructional strategies
Assessment
Motivation: learning vs. performance
Visualization tools and media
User interface
Course management

Work Flow

Use of discussion tools
Software usage
Message redundancy (audio, video, web pages, emails)
Progression: do learners progress through their work tasks in a linear fashion (novice-like) or in a nonlinear, opportunistic fashion (expert-like)? (see the measurement sketch below)
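
The progression item lends itself to a simple log-based indicator. The Python sketch below shows one way such an indicator could be computed; the log format and the linearity metric are illustrative assumptions, not part of either program's actual evaluation instruments.

# Illustrative only: quantifying "linear vs. nonlinear progression"
# from course-tool logs. The data format and metric are assumptions,
# not the presenters' evaluation design.

def progression_linearity(task_sequence):
    """Fraction of steps that move forward by exactly one task.

    task_sequence: ordered list of task/module numbers a learner opened,
    e.g. [1, 2, 3, 2, 4]. A value near 1.0 suggests novice-like linear
    progression; lower values suggest expert-like, opportunistic movement.
    """
    if len(task_sequence) < 2:
        return 1.0
    steps = zip(task_sequence, task_sequence[1:])
    forward = sum(1 for prev, cur in steps if cur == prev + 1)
    return forward / (len(task_sequence) - 1)

# Example: one learner's click trail through five course tasks
print(progression_linearity([1, 2, 3, 2, 4]))  # 0.5 -> mixed pattern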

Interaction

Social and instructional
Must account for all of the following relationships:
[Diagram: interaction relationships among the instructor, learners, content, and technology]

Online Course Interaction

Announcements
Email
Discussion board
Synchronous text chat
Desktop video
File loading
Online assessment
Audio/video clips
Room-based video

Impact

Course performance
Collaborative learning
Retention/attrition (course and program)
Professional relevance and utility
Learner productivity

Evaluation Attributes

Multi-sourced data (students, server log files, etc.; see the data-merging sketch below)
Internal and external
Performance based
Comparison and criterion based
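
As a concrete illustration of the multi-sourced, internal-and-external attribute, the sketch below merges self-reported survey ratings with activity counts mined from server logs. All identifiers, field names, and data shapes are hypothetical; they do not document either program's actual data model.

# A minimal sketch of the "multi-sourced data" idea: merging per-student
# survey ratings with activity counts from server logs. Field names and
# data shapes are hypothetical.

from collections import Counter

survey = {           # student_id -> mean course rating (1-4 scale)
    "s01": 3.4,
    "s02": 2.9,
}

log_lines = [        # simplified server log: (student_id, action)
    ("s01", "discussion_post"),
    ("s01", "page_view"),
    ("s02", "page_view"),
    ("s02", "page_view"),
]

posts = Counter(sid for sid, action in log_lines if action == "discussion_post")
views = Counter(sid for sid, action in log_lines if action == "page_view")

for sid, rating in survey.items():
    # Internal (self-report) and external (behavioral) evidence side by side
    print(sid, rating, posts[sid], views[sid])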

Student Course Ratings

[Chart: Student course ratings, 1 = Strongly Disagree to 4 = Strongly Agree; item means (in ascending order): 3.16, 3.26, 3.27, 3.29, 3.32, 3.32, 3.42, 3.45, 3.45; items: the instructor was accessible, the instructor was responsive, the course was relevant to my degree, communication technologies were effective, course activities were applicable, there was ample opportunity for discussion, the course format was appropriate for the content, I enjoyed the course, confident in conducting design]

Helpfulness of Technology

[Chart: Helpfulness of technology, 1 = Not at all helpful to 4 = Very helpful; item means (in ascending order): 3.13, 3.19, 3.52, 3.58, 3.61, 3.74; items: personal communication with classmates or instructor, informational course emails, electronic chat sessions with instructor, electronic chat sessions with classmates, course readings on website, course texts]

The breadth of this course was:

Scale: 1 = Not nearly enough; 4 = The right amount; 7 = Way too much
Compared to a traditional course: 1 = A much narrower range of material was covered; 4 = About the same range of material was covered; 7 = A much wider range of material was covered
Online course: 4.61; compared to traditional: 4.65 (n = 32)

The depth of this course was:

Scale: 1 = Not nearly enough; 4 = The right amount; 7 = Way too much
Compared to a traditional course: 1 = Material was covered in much less depth; 4 = Material was covered in about the same depth; 7 = Material was covered in much more depth
Online course: 4.48; compared to traditional: 4.42 (n = 32)

The extent of critical thinking required:

Scale: 1 = Not nearly enough; 4 = The right amount; 7 = Way too much
Compared to a traditional course: 1 = Much less; 4 = About the same; 7 = Much more
Online course: 4.61; compared to traditional: 4.94 (n = 32)

The amount of effort put into the course:

Scale: 1 = Much less; 4 = About the same; 7 = Much more
Compared to a traditional course: 1 = Much less; 4 = About the same; 7 = Much more
Online course: 5.65; compared to traditional: 5.26 (n = 32)

U of I’s Curriculum Technology and Education Reform (CTER)

Master of Education (Ed.M.)
For practicing K-12 teachers and administrators
A two-year program
Eight online courses
Project based

CTER . . .

Program is in its fourth year
CTER cohorts 1 & 2 have graduated
The CTER 3 cohort has 26 students
The CTER 4 cohort has 25 students
Female 73%; Male 27%
Many students have technology responsibilities for their schools or districts

CTER’s synchronous and asynchronous technologies

WebBoard conferencing
Streaming media using Real Player
Audio-narrated PowerPoint presentations
Tapped In
CTER Base
iVisit
RogerWilco
Interactive Multimedia Paper

CTER evaluation

Mostly formative
Mixed methods
Course evaluation
Program evaluation
Mini-case studies

Course Evaluation

Instructor and Course Evaluation System (ICES)
Piloting Evaluation Online (EON)
CTER course survey using SurveyIt: instructor, technology use, support

Exemplary student projects

Program Evaluation

Program surveys: application skills, Web browser skills, learner profile
Student interviews
Collection of student artifacts
Mini-case studies

CTER studies identify five dimensions of effective learning:

Relevant and challenging assignments
Providing adequate and timely feedback through teacher-student interaction
Flexibility in teaching and learning
Constructing coordinated learning environments
Constructing rich environments for student-to-student interaction

Indicators of CTER effectiveness

Low dropout rate
Student satisfaction
Student learning transferred into practice

Typical Problems with Online Courses

Facilitating and encouraging collaboration
Time management
Student proficiency with course tools
Ambiguous directions
Timeliness of feedback

Factors Beyond ID Control

Student sophistication with technology tools
System capacity
Learner availability/accessibility
Enthusiastic, responsive instructor
Good learner support
Motivated learners

How to Design an Effective Online Course?

Follow basic ID principles
Build a climate of disclosure and full participation
Institute informal student evaluation and check-in mechanisms
Active and intensive instructor participation
Build in as much interactivity as possible
Create visually interesting screens/pages
Ensure instructions are very clear
Multi-mode interaction is critical

Slides at:

www.homepages.dsu.edu/hawkesm/