
Guide to Developing Effective Selected and Constructed Response Assessment Items & Rubrics Greg Sherman, Ph.D. Radford University

Constructing Good Assessment Items

Step One: Identify WHAT to Assess

Teachers are generally interested in assessing the following educational variables of interest:

Expected Terminal Outcome Performance
Unexpected Terminal Outcome Performance
Embedded Practice Performance
Expected Attitudes
Unexpected Attitudes
Instructional Component Perception
Social Interaction Concerns
Instructor Concerns
Learner Concerns

The bold-faced items reflect the most common, and perhaps most important, variables to be measured within teacher-developed assessments. These reflect learning associated with the instructional objectives (including attitudes!). For example, as learners progress through an instructional experience, they should be provided with opportunities to practice the specific skills the instruction is designed to facilitate. Additionally, quizzes at the end of each distinct experience should measure the learning of the intended skills (including attitudes), as should the posttest at the conclusion of a larger instructional unit.


Step 1a. Identify Performance Outcomes

Selecting the best possible types of assessment items depends on WHAT you are assessing. The easiest way to determine the specific type of skill to be measured is to isolate the performances expected of the learner once they have presumably learned what you hope they learn. Such performances are usually stated within performance outcomes. However, identifying performance outcomes often involves pulling them from within stated instructional objectives. Performance outcomes help teachers determine what is worth learning and what types of skills are expected to be learned. Instructional objectives help teachers design effective instructional experiences.

Performance outcome = A description of a particular behavior or performance a learner is expected to possess following an instructional experience.

Instructional objective = Expands upon a performance outcome by specifying the [classroom] conditions under which performances or behaviors will occur during practice and assessment, as well as any criteria learners must meet which indicate that specific knowledge, skills, and attitudes have been learned.

Examples:

Performance Outcome: Students will add double-digit numbers.
Instructional Objective: Given two double-digit numbers written in equation form, the students will add them together.

Performance Outcome: Students will design controlled experiments.
Instructional Objective: Given a problem and hypothesis, the students will design a controlled experiment that includes a control group that is not subject to the independent variable, an appropriate dependent variable, and at least three extraneous variables held constant between all experimental groups.

Performance outcomes and instructional objectives can be identical (when no conditions or standards are needed).


Professional educators are often faced with designing or adapting instructional experiences to facilitate or measure the learning of skills that are presented in statements written as “standards.” Oftentimes, such statements do not make it clear precisely what skills are to be learned, or more than one skill is included in the statement. In cases such as this, you must try to determine the intent of the skills, knowledge, and/or attitudes reflected in the statement and rewrite it as individual instructional objectives so that performance outcomes can be identified. For example, the following is an English Language Arts Common Core State Standard (CCSS) for grades 6-8:

CCSS.ELA-Literacy.RH.6-8.7 Integrate visual information (e.g., in charts, graphs, photographs, videos, or maps) with other information in print and digital texts.

Based on the wording (or lack of wording), it is not completely clear what skill or skills need to be demonstrated by the students. For example, does this skill simply address the ability to use software to develop charts, graphs, or other file types and embed them into electronic documents, or is it more likely that the statement refers to the students’ ability to include visual support (i.e., charts, graphs, etc.) that enhances the overall meaning of a specific body of text? If the latter is the intent, then it should be stated as such. Another English Language Arts CCSS states:

CCSS.ELA-Literacy.RL.8.4 Determine the meaning of words and phrases as they are used in a text, including figurative and connotative meanings; analyze the impact of specific word choices on meaning and tone, including analogies or allusions to other texts.

Obviously, this statement includes AT LEAST two distinct skills. One skill is “Determine the meaning of words and phrases as they are used in a text, including figurative and connotative meanings.” The other skill is “Analyze the impact of specific word choices on meaning and tone, including analogies or allusions to other texts.” This statement requires more clarity in order to reflect a clearly-measurable performance. The phrase “analyze the impact” should be elaborated to make clearer precisely what is expected of the learners (what they do, in fact, need to learn). For example, the outcome might be changed to the following:

“Accurately describe how specific word choices alter the meaning and tone of a sentence or passage.”

This might even be further refined with: “Given alternate forms of sentences or passages with specific words altered between the versions, explain how the different words alter the meaning or tone of the passage.”

After you have clarified your intended outcomes, you can categorize them using one or more of the classification schemes described next.


Step 1b. Classify Performance Outcomes

Although his work is a little dated, the Taxonomy of Learning proposed by Benjamin Bloom (1964), categorized into cognitive, affective, and psychomotor domains, still represents a good, concise description of the diversity of learning. The cognitive domain categories were revised in 2001 by one of Bloom’s associates (Anderson & Krathwohl, 2001).

Cognitive Domain [Revised]

Remembering Retrieving, recognizing, and recalling relevant knowledge from long-term memory.

Understanding Constructing meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining.

Applying

Carrying out or using a procedure through executing or implementing. Using a concept in a new situation or unprompted use of an abstraction. Applying what is learned in the classroom into novel situations in the workplace.

Analyzing

Breaking material into constituent parts, determining how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing. Separating material or concepts into component parts so that its organizational structure may be understood. Distinguishing between facts and inferences.

Evaluating Making judgments about the value of ideas or materials based on criteria and standards through checking and critiquing.

Creating Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning, or producing.


Affective Domain

Receiving phenomena Demonstrating willingness to attend to phenomena.

Responding to phenomena

Actively participating by attending and reacting to a particular phenomenon.

Valuing Attaching worth to a particular object, phenomenon, or behavior.

Organization Organizing values into priorities by contrasting different values, resolving conflicts between them, and creating a unique value system.

Internalizing values (characterization)

Has a value system that controls behavior. Behavior is pervasive, consistent, predictable, and helps define the “world view” (becomes a characteristic of the learner)

David Merrill’s (1994) Component Display Theory classifies learning along two dimensions:

Content [facts, concepts, procedures, and principles]

Performance [remembering, using, generalities]


Robert Gagné (1985) suggested a taxonomy of learning that is very useful in helping to more clearly identify the type of skills represented in performance outcomes. His categories distinguish declarative from procedural knowledge while providing enough detail to capture some of the more subtle differences between types of cognitive skills. It is this classification scheme that will be used to help inform the selection of valid and reliable assessment items for specific skill types.

Category | Description of Performance Outcome | Common Performance Outcome Verbs

Motor Skills
Physical activities requiring movement and coordination of all or part of the body.
Common verbs: Execute, Perform, Swim, Walk, Run, Climb, Drill, Saw

Attitudes
Intrinsically motivated choices people make. If you think about it, some of the most important outcomes in life are really attitudes.
Common verbs: Choose, Decide, Participate

Verbal Information (Declarative Knowledge)
Information the learner must be able to state, including facts, dates, people, names, principles, generalizations, etc.
Common verbs: State, Recite, Tell, Declare, Name, List, Define

Intellectual Skills (Procedural Knowledge): Discriminations
Distinguishing objects, features, or symbols (“back” versus “not back,” for example).
Common verbs: Distinguish, Differentiate

Intellectual Skills: Concepts
Concrete Concepts: Objects (parts of a bindery), classes of objects (books, human resource personnel, losers, etc.), object features (5 inches, red, etc.), and object relations (above, near, etc.) that can be pointed out and identified.
Defined Concepts: Objects, principles, classes, features, and relations that cannot be identified by pointing them out; they must be defined. Examples include “quality,” “energy,” and “satisfaction.”
Common verbs: Identify, Label, Classify instances, Sort, Categorize


Intellectual Skills: Rules
Rules make it possible to do something using symbols (most commonly, the symbols of language and math). Rules include the application of single principles to explain, describe, or predict phenomena or events. Rules make it possible for students to respond to a class of things with a class of performances.
Common verbs: Solve, Show, Demonstrate, Generate, Develop, Create, Determine, Calculate, Predict

Intellectual Skills: Higher-Order Rules (Problem Solving)
Higher-order rules employ more than one rule or principle to solve problems, perform tasks, or explain, describe, and predict phenomena or events. Learners must decide which rules or principles must be utilized to perform tasks or explain, describe, or predict phenomena or events.
Common verbs: Solve, Show, Demonstrate, Generate, Develop, Create, Determine, Calculate, Predict, Defend, Support

Cognitive Strategies
Individual ways in which learners attend, learn, remember, and think. Cognitive strategies govern the way learners deal with their environment.


Using a Classification Key to Determine the Type of Performance Outcome Based on Gagné’s Model

1a. Is the primary focus of the performance outcome specific to the development of muscle skill coordination? .......... MOTOR SKILL
1b. Is the primary focus of the performance outcome more than, or different from, the specific development of muscle skill coordination? .......... GO TO 2
2a. Does the performance outcome represent an intrinsically motivated choice the learners make based on their own values and preferences, presumably in the real world? .......... ATTITUDE
2b. Does the performance outcome represent mental abilities other than making choices based on personal values and preferences, presumably in the real world? .......... GO TO 3
3a. Is the purpose of the performance outcome to state, repeat, or recite specific pieces of information such as facts, dates, people, names, principles, generalizations, etc.? .......... VERBAL INFORMATION
3b. Does the performance outcome reflect a mental skill that requires the learner to go beyond merely stating, repeating, or reciting specific pieces of information? .......... GO TO 4
4a. Does the performance outcome only require learners to distinguish between objects, object features, or symbols (i.e., “East” versus “Not East” or “Top” versus “Bottom”)? .......... IS: DISCRIMINATIONS
4b. Does the performance outcome represent the application of mental abilities beyond simply distinguishing between objects, features, or symbols? .......... GO TO 5


5a. Does the performance outcome elicit the identification or labeling of objects (i.e., “book cover”), classes of objects (i.e., books), object features (5 inches, red, etc.), and/or object relations (above, near, etc.) that can be pointed out and identified? .......... IS*: CONCRETE CONCEPTS
5b. Does the performance outcome require the learners to classify or label objects, principles, classes, features, or relations that must be defined rather than pointed out (i.e., “quality,” “energy,” “satisfaction”)? .......... IS: DEFINED CONCEPTS
5c. Does the performance outcome require the application of cognitive processing skills that go beyond identifying, labeling, and classifying things? .......... GO TO 6
6a. Does the performance outcome require the learners to apply single principles to explain, describe, or predict phenomena or events? .......... IS: RULES
6b. Does the performance outcome require the learners to employ more than one rule or principle to solve problems, perform tasks, or explain, describe, and predict phenomena or events? .......... IS: HIGHER-ORDER RULES
6c. Does the performance outcome represent the application of individual ways in which a learner will attend, learn, remember, think, and decide which rules need to be applied to a specific situation (beyond applying recommended rules)? .......... COGNITIVE STRATEGIES

*Intellectual Skills
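
The key above can also be read as a simple branching procedure. The following Python sketch is an illustration only (it is not part of the original guide), and the question names passed in are shorthand invented here for the key's yes/no questions:

def classify_outcome(answers: dict) -> str:
    # Walk the classification key: each entry in `answers` is True when the
    # corresponding question (1a, 2a, 3a, ...) would be answered "yes".
    if answers.get("muscle_coordination"):              # 1a
        return "Motor Skill"
    if answers.get("intrinsically_motivated_choice"):   # 2a
        return "Attitude"
    if answers.get("state_or_recite_information"):      # 3a
        return "Verbal Information"
    if answers.get("distinguish_only"):                 # 4a
        return "Intellectual Skill: Discrimination"
    if answers.get("identify_by_pointing_out"):         # 5a
        return "Intellectual Skill: Concrete Concept"
    if answers.get("classify_defined_things"):          # 5b
        return "Intellectual Skill: Defined Concept"
    if answers.get("apply_single_principle"):           # 6a
        return "Intellectual Skill: Rule"
    if answers.get("apply_multiple_principles"):        # 6b
        return "Intellectual Skill: Higher-Order Rule (Problem Solving)"
    return "Cognitive Strategy"                         # 6c

# Example: "Given a problem and hypothesis, design a controlled experiment"
print(classify_outcome({"apply_multiple_principles": True}))
# Intellectual Skill: Higher-Order Rule (Problem Solving)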


Constructing Good Assessment Items

Step Two: Decide the BEST WAY to Assess

Most types of formal assessment items can be classified into one of two types: Selected Response and Constructed Response (including projects and performances). Selected response items are those in which the learners choose a possible answer from a set of provided, limited choices. Constructed response items are those in which the learners must create (or construct) an answer; no choices are provided. The following chart categorizes the common assessment types:

Selected Response:
Multiple Choice (single or multiple answers)
Matching
Binary-Choice (i.e., True/False)
Multiple Binary-Choice
Ordering

Constructed Response:
Fill-in-the-Blank (including math problems)
Labeling (unless a list of choices is provided)
Short Answer (including math problems)
Essay
Project
Performance

Generally, the most effective type of assessment item depends on the type of skill measured. To determine the type of skill measured, examine the instructional objective and identify the performance outcomes. Once you have identified the outcome, classify it according to the categories presented in the first column within the chart below. The second column presents a description of the category, while the third column presents common verbs used to describe the outcome type. After you have determined the category, use the information presented in the fourth column to decide the best type of item to develop. In addition, the fifth column on the right presents information about the most ideal types of feedback to provide the learners. More information about feedback follows in Step Four, including a diagram displaying basic information about each type of feedback.


Outcome Category | Common Behaviors | Suggested Type(s) of Assessment Items (Constructed Response / Selected Response)

Motor Skills
Common behaviors: Operate, Perform, Execute
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering

Attitudes
Common behaviors: Choose, Exhibit, Decide, Participate
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering

Verbal Information (Declarative Knowledge)
Common behaviors: State, Name, Define, Recite, List, Tell
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering

Intellectual Skills: Discriminations (Procedural Knowledge)
Common behaviors: Discriminate, Distinguish, Differentiate
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering

Intellectual Skills: Concrete Concepts (Procedural Knowledge)
Common behaviors: Select, Locate, Identify, Label, Classify, Sort, Categorize
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering, Labeling

Intellectual Skills: Defined Concepts (Procedural Knowledge)
Common behaviors: Select, Locate, Identify, Label, Classify, Sort, Categorize
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering

Intellectual Skills: Rules (Procedural Knowledge)
Common behaviors: Solve, Show, Determine, Calculate, Predict
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering


Intellectual Skills: Higher-Order Rules (Problem Solving) and Cognitive Strategies (Procedural Knowledge)
Common behaviors: Solve, Show, Generate, Develop, Discuss, Create, Determine, Calculate, Predict, Defend, Support
Constructed Response: Fill-in-the-Blank, Labeling, Short Answer, Essay, Project, Performance
Selected Response: Multiple Choice, Matching, Binary-Choice, Multiple Binary-Choice, Ordering


Constructing Good Assessment Items

Step Three: Follow the Rules for Developing Well-Written Assessment Items

An assessment item can contribute to the validity of an assessment-based inference when it accurately measures what it is supposed to measure. First and foremost, evidence for the validity of results obtained from individual assessment items is optimal when the skill elicited by the item is the exact same skill indicated within the instructional objective, with the exact same conditions provided. As discussed in the previous steps, the type of skill to be measured should determine the most appropriate type of assessment item to use. Ensuring that the best possible assessment item type is selected can ultimately support the evidence of validity for the item's results. In addition, evidence of validity can be supported when the items are constructed in a fashion that minimizes the probability that the learners will respond correctly to an item if they haven't, in fact, learned the skills being measured. The following Rules for Developing Well-Written Assessment Items provide details about how to develop items that maximize their measurement effectiveness.

All the rules in this section are based on the following premises:

• Evidence of Validity: Ensure items address targeted Skills, Knowledge, and Attitudes/Dispositions (SKA)
• Maximize grading efficiency (time and accuracy)
• Maximize the probability that correct answers reflect SKA performance (minimize correct-answer guesses: Type I Error, or False Positive)
• Maximize the probability that incorrect answers reflect a true lack of SKA performance (minimize Type II Error, or False Negative)
• Encourage real thinking about the questions


Rules for Developing Well-Written Selected Response & Constructed Response Assessment Items

Rule 1. General Guidelines for All Assessment Types

Rule 1.1: Write clear, concise, simple directions free of complex syntax or difficult vocabulary.

Example:

Bad Directions: Newton's Three Laws of Motion defined an entire field of physical science for many years. Name them.
Better Directions: Name Newton's Three Laws of Motion.

Rule 1.2: All assessment items must clearly elicit the performances articulated within the objectives under the same conditions stated.

Example:

Objective: Given the mass of an object (in kilograms) and the acceleration of the object (in m/s/s), the learners will calculate the force in Newtons.

Bad Assessment: Suppose a 1500 gram ball was thrown off the roof of a 2-story (approximately 7 meters high) building. What force will the ball apply to the ground when it hits (in Newtons)?

Better Assessment: Suppose a 1.5 kg ball was thrown against a wall with an acceleration of 2 m/s/s. With what force will the ball hit the wall (in Newtons)?
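
For reference (this worked example is not part of the original guide), the better item can be answered directly from the relationship implied by the objective, force equals mass times acceleration:

$F = m \times a = 1.5\ \mathrm{kg} \times 2\ \mathrm{m/s^2} = 3\ \mathrm{N}$

The bad item, by contrast, supplies a height instead of an acceleration, so it does not present the conditions stated in the objective.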

Rule 1.3: Include reminders to perform certain steps if needed.

Example:

Draw a picture of a typical animal cell and label all the cell parts. Don’t forget to draw very clear lines from your labels to the cell parts (use arrows if needed).


Rule 1.4: Include clear evaluation criteria if the assessment tasks are particularly complex.

Example:

Explain possible mechanisms involved in producing elevated global temperatures as a result of the greenhouse effect. Make certain that your answer references the chemical characteristics of hydrocarbons in the atmosphere, the flow of energy resulting in atmospheric temperature changes, and at least 3 specific natural and 3 specific artificial sources of hydrocarbon emissions contributing to changes in atmospheric concentrations of hydrocarbon gases.

Rule 1.5: Keep all assessment items free of prompts or cues that could be used to determine the correct answer.

Examples of this rule are included in some of the type-specific rules that follow.

Rule 1.6: Keep the text of all assessment items free of bias.

Language should not offend learners on the basis of personal characteristics such as gender, ethnicity, SES, religion, or race, or disadvantage them on other variables affecting communication (i.e., learning disabilities, language issues, developmental levels, general writing abilities, etc.).

Content and/or mode of testing should not exhibit unfair penalization:

• Esoteric content (only understood by sub-group) • Vocabulary (using words like “esoteric”) • Writing skills (biased towards the better writers)

Examples: The following assessment items are biased against Democrats as well as against people who may not write poetry with ease:

[7th Grade Language Arts] Underline all the proper nouns in the following sentence:

The failed policies of President Jimmy Carter typify what is wrong with all Democrats, even today.

[High School Biology]

Write a poem that describes the journey a drop of blood takes as it travels throughout the human body.


Rule 1.7: Ensure there is only one correct answer, unless specifically designed otherwise.

Rule 1.8: Ensure all information presented is clearly understood by the learners. This includes clear images, if used, and understandable text.


Selected Response

Multiple-Choice Vocabulary: This is the stem of the multiple choice item. Which of the following choices represents the correct answer?

a. This is a distractor b. This is a distractor c. This is the correct (or best) choice d. This is a distractor

2. Rules for Multiple Choice Assessment Items

Rule 2.1: Do not use any choices (distractors) that are obviously incorrect.

Example:

How many centimeters are in a meter?
a. .001
b. 1.00
c. 100
d. 3.12

Choice "d. 3.12" is obviously wrong and will be eliminated immediately as a possibility, raising the probability that the learners could answer correctly by chance. Notice also that the use of the word "are" might cue the learner to dismiss "b. 1.00" as a choice (see Rule 2.2).

Rule 2.2: Ensure that words such as a, an, he, she, or plural words are not used to cue learners toward the correct answer or away from distractors

Example:

Pierre de Fermat is best known as a
a. Inventor
b. Scientist
c. Mathematician
d. Teacher

Distractor "a. Inventor" could be eliminated because the reader would most likely dismiss "a inventor" as a possibility.


Rule 2.3: Don’t use “All of the Above.”

Once a learner determines that one of the answer options is wrong, the "All of the Above" option is no longer a legitimate choice. However, "None of the Above" is fine because it is not necessarily tied to any of the possible answers. But avoid using "None of the Above" if the directions indicate selecting the "best possible answer."

Rule 2.4: Don’t use words like “All” or “Never” in the answer options.

These terms almost always signify a distractor is incorrect.

Rule 2.5: Don’t repeat words from the question in the answer options.

Example:

Which of the following represents the best description of Natural Selection?
a. The natural process responsible for the origin of new species and the adaptation of organisms to their environments.
b. The procedures involved in cultivating the best possible type of offspring from a population of adult organisms.
c. The adaptation of organisms to their environment.

In this case, option “a.” includes the word “natural,” which is part of the question. It is the obvious answer.

Rule 2.6: Don’t use negative answer options following a negative question or stem.

Example: Which of the assumptions about Natural Selection is NOT true?

a. Organisms reproduce
b. Traits vary among offspring
c. Offspring do inherit traits from their parents
d. The environment does not limit the size of populations.

In this case, option “d. The environment does not limit…” is the correct answer, but it is confusing because the learner must consider the double negative in order to recognize it as the best answer.


Rule 2.7: Minimize the use of negative statements in general, but if you do use them, draw clear attention to the word NOT.

Which of the following statements is NOT an example of homeostasis?
a. Sweating when the internal temperature rises, resulting in a cooling of the body
b. A variety of mechanisms that contribute to regulating blood volume and composition
c. A variety of mechanisms that regulate ion levels in the blood and interstitial fluid
d. The hormonal mechanisms contributing to male pattern baldness

Rule 2.8: Try to ensure that all distractors are generally the same length.

Which of the following best represents an example of controlling extraneous variables in an experiment?
a. Changing the type and amount of treatment conditions
b. Ensuring that only one type of data is recorded
c. Consistently recording data in labeled charts
d. Identifying all the variables in an experiment that could influence and affect the dependent variable, and ensuring that these variables are held constant between all experimental and control groups.

Guess which is most likely the correct answer?

Rule 2.9: Try to ensure that correct choices are evenly distributed throughout an exam with respect to their position if possible.

25% of the correct choices should be the first choice, 25% of the correct choices should be the second selection, etc.
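
As an illustration only (this sketch is not part of the original guide), a few lines of Python can check how evenly the correct-answer positions are spread across an exam; the answer key shown here is hypothetical:

from collections import Counter

# Hypothetical answer key: the position of the correct choice for each item on an exam.
answer_key = ["a", "c", "b", "d", "c", "a", "b", "d", "a", "c", "d", "b"]

counts = Counter(answer_key)
for position in sorted(counts):
    share = counts[position] / len(answer_key)
    print(f"{position}: {counts[position]} items ({share:.0%})")
# With four choices, each position should hold roughly 25% of the correct answers (Rule 2.9).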

Rule 2.10: If the learners must read through the answer choices in order to understand what the stem really means, then more information needs to be included in the stem. Present a definite, explicit and singular question or problem in the stem. A well-worded stem will result in choices that are presented in minimal words.

Bad: Psychology...
Better: The science of mind and behavior is called ...

Rule 2.11: When possible, state the stem as a direct question rather than as an incomplete statement. The stem should be meaningful by itself and should present a definite problem. A stem that presents a definite problem allows a focus on the learning outcome. A stem that does not present a clear problem, however, may test students' ability to draw inferences from vague descriptions rather than serving as a more direct test of students' achievement of the learning outcome.


Bad: Alloys are ordinarily produced by...
Better: How are alloys ordinarily produced?

Bad: Which of the following statements is true?
Better: What characteristic is relatively constant in mitochondrial genomes across species?

Rule 2.12: Include in the stem any word(s) that might otherwise be repeated in each alternative.

Bad: In national elections in the United States the President is officially
a. chosen by the people.
b. chosen by members of Congress.
c. chosen by the House of Representatives.
d. chosen by the Electoral College.

Better: In national elections in the United States the President is officially chosen by
a. the people
b. members of Congress
c. the House of Representatives
d. the Electoral College

Rule 2.13 Make the alternatives grammatically parallel with each other, and consistent with the stem.

Bad: What would do most to advance the application of atomic discoveries to medicine?

a. Standardized techniques for treatment of patients.
b. Train the average doctor to apply radioactive treatments.
c. Remove the restriction on the use of radioactive substances.
d. Establishing hospitals staffed by highly trained radioactive therapy specialists.

Better: What would do most to advance the application of atomic discoveries to medicine?

a. Development of standardized techniques for treatment of patients.
b. Training of the average doctor in application of radioactive treatments.
c. Removal of restriction on the use of radioactive substances.
d. Addition of trained radioactive therapy specialists to hospital staffs.


Rule 2.14 Make the alternatives mutually exclusive.

Bad: The daily minimum required amount of milk that a 10 year old child should drink is
a. 1-2 glasses.
b. 2-3 glasses.
c. 3-4 glasses.
d. at least 4 glasses.

Better: What is the daily minimum required amount of milk a 10 year old child should drink?
a. 1 glass.
b. 2 glasses.
c. 3 glasses.
d. 4 glasses.

Rule 2.15: When possible, present alternatives in some logical order (e.g., chronological, most to least, alphabetical).

At 7 a.m. two trucks leave a diner and travel north. One truck averages 42 miles per hour and the other truck averages 38 miles per hour. At what time will they be 24 miles apart?

Bad:
a. 6 p.m.
b. 9 a.m.
c. 1 a.m.
d. 1 p.m.
e. 6 a.m.

Better:
a. 1 a.m.
b. 6 a.m.
c. 9 a.m.
d. 1 p.m.
e. 6 p.m.
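
For reference (not part of the original guide), the intended answer can be checked with simple arithmetic: the trucks separate at the difference of their speeds, so

$t = \dfrac{24\ \text{miles}}{(42 - 38)\ \text{mph}} = 6\ \text{hours}, \qquad 7\ \text{a.m.} + 6\ \text{hours} = 1\ \text{p.m.}$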


Rule 2.16: Do not include irrelevant material in the stem.

Rule 2.17: Alternatives should be stated clearly and concisely. Items that are excessively wordy assess students’ reading ability rather than their attainment of the learning objective


Rule 2.18: Alternatives should be homogeneous in content. Alternatives that are heterogeneous in content can provide cues to students about the correct answer.

Rule 2.19: Using multiple choice to measure critical thinking skills (i.e. higher-order rules) often requires the use of multiple choice items in which learners must select more than one choice in order for the answer to be correct, or the learners must answer multiple related but separate multiple choice items in order for the answer to be considered correct.

Example:

1a. If an astronaut on the moon (no atmosphere, 1/6 the gravitational force of Earth) dropped a 2 kilogram hammer and a 3 gram feather at precisely the same time from the same height above the surface of the moon, which of the following would most likely occur?

a. The hammer would fall faster and hit the moon surface first.
b. The feather would fall faster and hit the moon surface first.
c. The hammer and the feather would fall at the same speed and hit the moon's surface at the same time.
d. The hammer and the feather would not fall to the moon's surface.

1b. Which of the following best explains why the answer(s) you selected in the item above (question 1a) is/are correct? Choose ALL that apply.

a. The moon's force of gravity acts the same on the hammer and the feather.
b. The mass of the hammer is greater than the mass of the feather.
c. The moon's force of gravity is greater on the hammer because it has more mass.
d. The moon's force of gravity is stronger on the feather because it is less massive.
e. In the absence of an atmosphere, the lighter feather is easier to move than the hammer, so it accelerates faster.
f. The moon does not produce a force of gravity.


3. Rules for Binary-Choice (i.e. True-False) Assessment Items Rule 3.1: Avoid the use of specific determiners that would permit a test-wise but unprepared examinee to respond correctly. Specific determiners refer to sweeping terms like "all," "always," "none," "never," "impossible," "inevitable," etc. Statements including such terms are likely to be false. On the other hand, statements using qualifying determiners such as "usually," "sometimes," "often," etc., are likely to be true. When statements do require the use of specific determiners, make sure they appear in both true and false items.

Bad Items:
The force of gravity can never change. [Most Likely False]
There are no places on Earth without living things. [Most Likely False]
Even the most complex living organisms may continue evolving. [Most Likely True]

Rule 3.2: Use negatives sparingly.

Bad: The Supreme Court is not composed of nine justices.
Better: The Supreme Court is composed of nine justices.

Rule 3.3: Only address single concepts in true-false items.

Bad: Water will boil at a higher temperature if the atmospheric pressure on its surface is increased and more heat is applied to the container.

Better: Water will boil at a higher temperature if the atmospheric pressure on its surface is increased.
and/or
Water will boil at a higher temperature if more heat is applied to the container.

Rule 3.4: Develop an even mix of true and false correct answers (though false items tend to discriminate slightly better).

Rule 3.5: Ensure binary-choice items are similar in length. Qualifying clauses in longer statements tend to signal that the statement is true. BUT include enough background information and qualifications so that the ability to respond correctly to the item does not depend on some special, uncommon knowledge.


Bad: The second principle of education is that the individual gathers knowledge.
Better: According to John Dewey, the second principle of education is that the individual gathers knowledge.

Rule 3.6: Base true-false items upon statements that are absolutely true or false, without qualifications or exceptions.

Bad: Nearsightedness is hereditary in origin.
Better: Geneticists and eye specialists believe that the predisposition to nearsightedness is hereditary.

Rule 3.7: Express the item statement as simply and as clearly as possible.

Bad: When you see a highway with a marker that reads, "Interstate 80" you know that the construction and upkeep of that road is built and maintained by the state and federal government.
Better: The construction and maintenance of interstate highways is provided by both state and federal governments.

Rule 3.8: Avoid lifting statements directly from the text, lecture or other materials so that recognition alone will not permit a correct answer.

Bad: For every action there is an opposite and equal reaction.
Better: In accordance with Newton's Third Law of Motion, if you were to stand in a canoe and throw a life jacket forward to another canoe, chances are your canoe would jerk backward.

Rule 3.9: Avoid the use of unfamiliar vocabulary.

Bad: According to some politicians, the raison d'etre for capital punishment is retribution.
Better: According to some politicians, the justification for the existence of capital punishment is retribution.


4. Rules for Multiple Binary-Choice Items

In addition to all the rules described for the creation of effective binary-choice items, multiple binary-choice items should also adhere to the following two rules:

Rule 4.1: Separate item clusters vividly from one another.

All the sub-items within a multiple binary-choice item should address a specific set of related content/skills. The assessment instrument should be formatted in a manner that clearly indicates a specific set of content and related skills are being assessed by multiple binary-choice items.

Rule 4.2: Ensure that each sub-item is, in fact, related to the content and/or skill set measured by the multiple binary-choice item.

5. Rules for Ordering Assessment Items

Rule 5.1: Include the basis for ordering in the instructions.

Bad Directions: List the planets of our solar system in order.
Better Directions: List the planets of our solar system in order from the closest to the sun to the furthest (average orbital distance).

6. Rules for Matching Assessment Items

Rule 6.1: Do not use an equal number of prompts and choices in matching assessments. Provide more possible choices than prompts, or construct matches where a term from the choice list can be used more than once.

Example of Good Item:

Match each element with its chemical symbol.
1. Lead
2. Iron
3. Gold

a. Au
b. Fe
c. Hg
d. I
e. K
f. Pb


Rule 6.2: Clearly describe the basis for matching and the number of times responses may be used if needed.

Bad Directions: Match the following.
Better Directions: On the line to the left of each identifying location and characteristic in Column I, write the letter of the country in Column II that is best defined. Each country in Column II may be used more than once.

Rule 6.3: Generally, the items to be matched (the choices) should be brief in nature.

Rule 6.4: Generally, list the choices in a logical order (i.e., alphabetical) and indicate if each choice can be used more than once.

Directions: On the line to the left of each definition in Column I, write the letter of the defense mechanism in Column II that is described. Use each defense mechanism only once.

Column I (definitions):
____ 1. Hunting for reasons to support one's beliefs.
____ 2. Accepting the values and norms of others as one's own even when they are contrary to previously held values.
____ 3. Attributing to others one's own unacceptable impulses, thoughts and desires.
____ 4. Ignoring disagreeable situations, topics, sights.

Column II (Bad):
a. Rationalization
b. Identification
c. Projection
d. Denial of Reality

Column II (Better):
a. Denial of reality
b. Identification
c. Introjections
d. Projection
e. Rationalization
f. Transcendentalism

Rule 6.5: Employ homogeneous lists. The content of the matching items should address the same general concepts.

Bad:

____ Father of modern genetics
____ Part of the plant that carries out photosynthesis
____ Created botanical classification scheme
____ Sexual organ of the plant
____ Plant part that absorbs water and minerals

a. Mendel
b. Roots
c. Linnaeus
d. Flower
e. Leaf


Better:

____ Part of the plant that carries out photosynthesis
____ Sexual organ of the plant
____ Plant part that absorbs water and minerals

a. Embryo
b. Flower
c. Fruit
d. Leaf
e. Root

Rule 6.6: Keep items and choices to be matched on the same page if possible.


Constructed Response

7. Rules for Fill-in-the-Blank Assessment Items

Rule 7.1: Ensure that only one answer could make the resulting statement true.

Example:

Bad: The appearance of Halley's Comet ____________.

Better: The appearance of Halley's Comet occurs every ______ years.

Rule 7.2: Use blanks that are the same size for multiple fill-in-the-blank items.

If the lengths of the lines are different, the learners might interpret the size of the line as a cue for the best possible answer.

Rule 7.3: Try not to place blanks near the beginning of sentences; rather, keep blanks toward the end of statements following the presentation of the more clearly-defined problem.

Bad: ________ is the molecular weight of KClO3.
Better: The molecular weight of KClO3 is ________.
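
As an aside not in the original guide, the expected response can be worked out from standard atomic weights (K ≈ 39.10, Cl ≈ 35.45, O ≈ 16.00):

$M(\mathrm{KClO_3}) \approx 39.10 + 35.45 + 3 \times 16.00 = 122.55\ \mathrm{g/mol}$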

Rule 7.4: Do not omit so many words from the statement that the intended meaning is lost.

Bad: The ___________ were to Egypt as the ____________ were to Persia and as __________ were to the early tribes of Israel.
Better: The Pharaohs were to Egypt as the __________ were to Persia.

Rule 7.5: Avoid grammatical or other clues to the correct response.

Bad: Most of the United States' libraries are organized according to the __________ decimal system.
Better: Which organizational system is used by most of the United States' libraries?


8. Rules for Constructing Short Answer or Essay-Type Items

Rule 8.1: Prepare essay items that elicit the type of behavior you want to measure.

Learning Objective: The student will be able to explain how the normal curve serves as a statistical model.

Not as Good: Describe a normal curve in terms of: symmetry, modality, kurtosis and skewness.
Better: Briefly explain how the normal curve serves as a statistical model for estimation and hypothesis testing.

Rule 8.2: Construct items so that the tasks are explicitly described and the students have a clear idea about the extensiveness of the response desired (including information about the suggested amount of time required to answer the item as well as the overall point values if applicable).

Bad: Discuss the economic factors which led to the stock market crash of 1929.
Better: Identify the three major economic conditions which led to the stock market crash of 1929. Discuss briefly each condition in correct chronological sequence and in one paragraph indicate how the three factors were inter-related. (10 points, 20 minutes)

Rule 8.3: Avoid giving the student a choice among optional items, as this greatly reduces the reliability of the test.

Rule 8.4: Before developing a scoring guide, answer the question yourself in writing.

Essay Scoring Suggestions:

Analytical Scoring:

Each answer is compared to an ideal answer and points are assigned for the inclusion of necessary elements. Grades are based on the number of accumulated points either absolutely (i.e., A=10 or more points, B=6-9 pts., etc.) or relatively (A=top 15% scores, B=next 30% of scores, etc.)

Holistic Scoring:

Each answer is read and assigned a score (e.g., grade, total points) based either on the total quality of the response or on the total quality of the response relative to other student answers.
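
As a minimal sketch (not part of the original guide), the absolute analytical-scoring thresholds from the example above can be expressed as a small Python function; the C and D cut-offs are assumptions added here for completeness:

def letter_grade(points: int) -> str:
    # Absolute analytic scoring: A = 10 or more points, B = 6-9 points (from the example above);
    # the remaining bands are assumed for illustration only.
    if points >= 10:
        return "A"
    if points >= 6:
        return "B"
    if points >= 3:
        return "C"
    return "D"

print(letter_grade(11))  # A
print(letter_grade(7))   # B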

Note: If a scoring rubric is to be developed, consider the rubric development rules presented in the next section.


Rubrics

A rubric, in general, is an instrument used to clearly communicate consistent assessment criteria for complex learning tasks. Rubrics are similar to comprehensive checklists, but the information is arranged in a grid-like fashion to help organize the scoring of individual student performances.

Holistic

Holistic rubrics present and score the overall performance of learners with respect to all specified educational variables of interest. Generally, these types of rubrics do not present specific criteria for different levels of performance related to each individual variable of interest. Holistic rubrics are summative and evaluation-oriented in nature. These rubrics tend to focus on product variables.

Analytic

Analytic rubrics present specific criteria for different levels of performance related to each individual variable of interest. Analytic rubrics are formative and assessment-oriented in nature. These rubrics tend to focus on both process and product variables.

Roles and Benefits of Rubrics
• Communicate the goals and objectives of the learning experience
• Help guide learners through the learning process
• Encourage self-reflection and self-evaluation
• Enable effective and consistent peer evaluation
• Minimize subjectivity in grading
• Make it easier to conduct final evaluations of complex tasks

Rubric Limitations
• Labor-intensive to create
• Won't get it right the first time
• Should go through the tasks to develop the best rubric possible


Analytic Rubric Design and Structure

Step One: Analyze the instructional goal to determine ALL process variables of interest.

Skills & Knowledge
Attitudes
Collaboration
Creativity
Communication
Use of Resources

Step Two: Describe the product to be evaluated.

Step Three: Identify characteristics of the best possible performance for each process variable of interest. Brainstorm these!

Step Four: Identify characteristics of the best possible product.

Step Five: Create rubric rows and columns:

Rubric Areas [Rows]
• 3-7 areas that represent the total performance based on instructional goals (process and product variables)
• Distinctly different variables are separated to enable individual assessment

Rubric Criteria [Analytic Columns]
• All areas contain objective descriptions
• A mix of numerically quantifiable and descriptive words leading to objective assessment
• Each level represents a distinctly different level of performance

Content and Presentation
• Content is clear upon examination of the rubric
• All variables of interest are assessed
• Criteria challenge students to perform at a high level of academic achievement


Chocolate Chip Cookie-Making Example

Instructional Goal: Working in groups of three, students will use any of the ingredients presented within a well-stocked kitchen to create chocolate-chip cookies without using a recipe.

Step One: Identify process variables of interest.

Collaboration with other members of the team
Carrying out roles
Following directions
Cleaning up properly

Step Two: Identify product variables of interest.

Number of chocolate chips
Texture
Color
Taste
Richness (flavor)

Step Three: Identify characteristics of the best possible performance for each process variable of interest.


Step Four: Identify characteristics of best possible product.

Chocolate chip in every bite
Chewy
Golden brown
Home-baked taste
Rich, creamy, high-fat flavor

Step Five: Create rubric following all rules.

Variable of Interest | Poor [1] | Needs Improvement [2] | Good [3] | Delicious [4] | Score

Product Variables

Number of Chips
  Poor [1]: Too few or too many chips
  Needs Improvement [2]: Chocolate in 50% of bites
  Good [3]: Chips in about 75% of bites
  Delicious [4]: Chocolate chip in every bite

Texture
  Poor [1]: Texture resembles a dog biscuit
  Needs Improvement [2]: Texture either crispy/crunchy or 50% uncooked
  Good [3]: Chewy in middle, crisp on edges
  Delicious [4]: Even balance of chewy and crispy

Color
  Poor [1]: Burned
  Needs Improvement [2]: Either dark brown from overcooking or light from undercooking
  Good [3]: Either light from overcooking or light from being 25% raw
  Delicious [4]: Golden brown

Taste
  Poor [1]: Store-bought flavor, preservative aftertaste – stale, hard, chalky
  Needs Improvement [2]: Tasteless
  Good [3]: Quality store-bought taste
  Delicious [4]: Home-baked taste

Richness
  Poor [1]: Nonfat contents
  Needs Improvement [2]: Low-fat contents
  Good [3]: Medium-fat contents
  Delicious [4]: Rich, creamy, high-fat flavor

Process Variables

Cooperation
  Poor [1]: Doesn't carry out assigned tasks, doesn't ask and answer questions, does not negotiate
  Needs Improvement [2]: Doesn't always carry out assigned tasks, doesn't always ask and answer questions, may not always negotiate
  Good [3]: Doesn't always carry out assigned tasks, OR doesn't ask and answer questions, OR may not negotiate
  Delicious [4]: Carries out assigned tasks, asks and answers questions, negotiates

Direction
  Poor [1]: Never follows directions
  Needs Improvement [2]: Follows some directions approximately 50% of the time
  Good [3]: Follows directions most of the time
  Delicious [4]: Follows all directions, all of the time

Clean-Up
  Poor [1]: Nothing washed or put away, no surfaces cleaned
  Needs Improvement [2]: Not everything washed or put away, OR not all surfaces cleaned
  Good [3]: Cleaned up with a reminder; everything washed and put away, all surfaces cleaned
  Delicious [4]: Cleaned up without a reminder; everything washed and put away, all surfaces cleaned

Total:
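
As an illustration only (not part of the original rubric), an analytic rubric like the one above yields a total by summing the level (1-4) assigned to each variable of interest; the scores in this Python sketch are hypothetical:

# Hypothetical scores: one level (1-4) per rubric row.
scores = {
    "Number of Chips": 4,
    "Texture": 3,
    "Color": 4,
    "Taste": 3,
    "Richness": 2,
    "Cooperation": 4,
    "Direction": 3,
    "Clean-Up": 4,
}

total = sum(scores.values())
print(f"Total: {total} out of {4 * len(scores)}")  # Total: 27 out of 32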


References

The specific assessment and feedback design strategies presented in this document are based on the learning and instructional design models articulated in the following texts:

American Educational Research Association, American Psychological Association, National Council on Measurement in Education, & Joint Committee on Standards for Educational and Psychological Testing (U.S.). (2014). Standards for educational and psychological testing.

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's Taxonomy of educational objectives (Complete edition). New York: Longman.

Bashaw, W. (1991). Assessing learner performance. In Briggs, L., Gustafson, K., & Tillman, M. (Eds.), Instructional design: Principles and applications. Englewood Cliffs, NJ: Educational Technology Publications.

Bloom, B., Mesia, B., & Krathwohl, D. (1964). Taxonomy of educational objectives (two vols: The Affective Domain & The Cognitive Domain). New York: David McKay.

Brame, C. (2014). Writing good multiple choice test questions. Retrieved from http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Cunningham, D. J., Duffy, T. M., & Knuth, R. A. (1991). The textbook of the future. In McKnight, C. (Ed.), Hypertext: A psychological perspective. London: Horwood Publishing.

Dick, W., Carey, L., & Carey, J. (2005). The systematic design of instruction (6th ed.). New York: Pearson.

Dick, W., Carey, L., & Carey, J. (2015). The systematic design of instruction. Upper Saddle River, NJ: Pearson.

Duvall, K. Improving your test questions. Retrieved 10-17-2008 from the University of Illinois at Urbana-Champaign Center for Teaching Excellence website: http://www.oir.uiuc.edu/dme/exams/ITQ.html

Frechtling, J., Sharp, L., & Westat, Inc. (Eds.). (1997). User-friendly handbook for mixed method evaluations. National Science Foundation.

Gagné, R. (1985). The conditions of learning (4th ed.). New York: Holt, Rinehart & Winston.

Gagné, R., & Driscoll, M. (1988). Essentials of learning for instruction. Englewood Cliffs, NJ: Prentice Hall.

Herrington, J., & Oliver, R. (1997). Multimedia, magic and the way students respond to a situated learning environment. Australian Journal of Educational Technology, 13(2), 127-143.

Jonassen, D., Peck, K., & Wilson, B. (1999). Learning with technology: A constructivist perspective. Upper Saddle River, NJ: Merrill Publishing.

Merrill, M. D. (1994). Instructional design theory. Englewood Cliffs, NJ: Educational Technology Publications.

Merrill, M. D., & Tennyson, R. (1994). Teaching concepts: An instructional design. Englewood Cliffs, NJ: Educational Technology.

Popham, W. J. (2011). Classroom assessment: What teachers need to know. Boston: Pearson.

Sullivan, H., & Higgins, N. (1983). Teaching for competence. New York: Teachers College Press.