DESIGNING AND CONDUCTING FORMATIVE EVALUATIONS Tieka Wilkins

Chapter 10




BACKGROUND

In a formative evaluation, evidence of an instructional program’s worth is gathered while the program is being developed, for use in making decisions about how to revise it.

OBJECTIVES

Describe the purpose of the various stages of formative evaluation of instructor-developed materials.

Describe the instruments used in formative evaluation.

FORMATIVE EVALUATION

The collection of data and information during the development of instruction that can be used to improve the effectiveness of the instruction

To obtain data that can be used to revise the instruction to make it more efficient and effective.

TYPES OF SPECIALISTS

A subject-matter expert (SME) may be able to comment on the accuracy and currency of the instruction.

A learning specialist may be able to critique your instruction in light of what is known about enhancing that particular type of learning.

A learner specialist may be able to provide insights into the appropriateness of the materials for the eventual performance context.

3 PHASES OF FORMATIVE EVALUATION

One-to-One Evaluation

Small-Group Evaluation

Field Trial

ONE-TO-ONE EVALUATION

One learner at a time reviews the instruction with the evaluator. The evaluator observes the learner using the instruction, notes the learner’s comments, and questions the learner during and after the instruction.

SMALL-GROUP EVALUATION

To determine the effectiveness of changes made following the one-to-one evaluation.

FIELD TRIAL

To determine whether the changes and revisions made in the instruction after the small-group stage were effective.

STAGES OF FORMATIVE EVALUATION

Design Review

Expert Review

One-to-One

Small Group

Field Trial

Ongoing Evaluation

DESIGN REVIEW (ASK THESE QUESTIONS)

Does the instructional goal match the problem identified in the needs assessment?

Does the learner & environmental analysis match the audience?

Does the task analysis include all the prerequisite skills?

Are the test items reliable and valid, and do they match the objectives?

EXPERT REVIEW

Is the content accurate & up-to-date?

Does it present a consistent perspective?

Are examples, practice exercises, & feedback realistic & accurate?

Is the pedagogy consistent with current instructional theory?

Is the instruction appropriate to the audience?

ONE-TO-ONE REVIEW

Is the message clear?

What is the impact on learner attitudes, achievement of objectives & goals, and feasibility of training?

SMALL-GROUP REVIEW

Look for the effects caused by the changes made in the one-to-one review

Identify any remaining learning problems

FIELD TRIAL (REVIEW)

Look for the effects of changes made in the small-group evaluation

Can the instruction be used in the context in which it was intended?

ONGOING EVALUATION, VIEWED THROUGH THE LENS OF:

Project Size

Life span of content

Audiences change

One-to-One

Small Group Tryouts

Field Trials

LEARNER EVALUATION (“TESTING”)

Do learners understand the instruction? Do they know what to do during the practice & the tests? Can they interpret graphics in the text? Can they read all the material? How much time does it take?

This chapter helped me understand the differences among the evaluation phases. When trying to gauge a learner’s capability as an instructor, I need to identify the learner’s intellectual level. As a lifelong learner, I know how to adapt the setting to fit each individual or group.


Tieka Wilkins