BURRHUS FREDERICK SKINNER


Page 1: Burrhus Frederick Skinner

BURRHUS FREDERICK SKINNER

Page 2: Burrhus Frederick Skinner

MAJOR THEORETICAL CONCEPTS
• Radical Behaviorism
• Respondent and Operant Behavior
• Type S and Type R Conditioning
• Operant Conditioning Principles
• The Skinner Box
• The Cumulative Recording
• Conditioning the Lever-Pressing Response
• Shaping
• Extinction
• Spontaneous Recovery
• Superstitious Behavior
• Discriminative Operant
• Secondary Reinforcement
• Generalized Reinforcers
• Chaining
• Positive and Negative Reinforcers
• Punishment
• Alternatives to Punishment
• Comparison of Skinner and Thorndike
• Schedules of Reinforcement
• Verbal Behavior
• Contingency Contracting
• Skinner’s Attitude Toward Learning Theory
• The Need for a Technology of Behavior

Page 3: Burrhus Frederick Skinner

Radical Behaviorism
• Rejects scientific language and interpretations that refer to mentalistic events.
• Provides a basis for skepticism about mentalism and about various influential approaches to the development of theories of learning and intelligent action.

Page 4: Burrhus Frederick Skinner

Respondent and Operant Behavior
• Respondent behavior: elicited by a known stimulus
  – Dependent on the stimulus that preceded it

Page 5: Burrhus Frederick Skinner

Respondent and Operant Behavior
• Operant behavior: elicited by an unknown stimulus
  – Seems to appear spontaneously
  – “Operant behavior does not occur independently of stimulation; rather, the stimulus causing such behavior is unknown, and it is not important to know its cause.”
  – Controlled by its consequences

Page 6: Burrhus Frederick Skinner

Type S and Type R Conditioning
• Type S conditioning: aka respondent conditioning; identical to classical conditioning
  – So called to emphasize the importance of the stimulus in eliciting the desired response
  – The strength of the conditioning is determined by the magnitude of the conditioned response

Page 7: Burrhus Frederick Skinner

Type S and Type R Conditioning
• Type R conditioning: aka operant conditioning; identical to instrumental conditioning
  – Emphasis on the response
  – The strength of the conditioning is shown by the response rate

Page 8: Burrhus Frederick Skinner

Operant Conditioning Principles
• Two general principles:
  – Any response that is followed by a reinforcing stimulus tends to be repeated.
  – A reinforcing stimulus is anything that increases the rate with which an operant response occurs.
• A reinforcer is anything that increases the probability of a response recurring.

Page 9: Burrhus Frederick Skinner

Operant Conditioning Principles
• Contingent reinforcement: getting the reinforcer is dependent on the organism emitting a certain response.

Page 10: Burrhus Frederick Skinner

Operant Conditioning Principles
• Modifying behavior:
  – Find a reinforcer for the organism.
  – Wait until the desired behavior occurs.
  – Immediately reinforce the organism.
• When this is done, the rate with which the desired response occurs goes up.
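This three-step contingency can be illustrated with a small toy simulation in Python. It is a minimal sketch under an assumed linear learning rule (the function name, probabilities, and increment are illustrative assumptions, not part of the original slides): a simulated organism's probability of emitting the target response rises each time the response is immediately followed by a reinforcer.

```python
import random

# Toy "organism" (assumed learning rule, for illustration only): the
# probability of emitting the target response rises a little each time the
# response is immediately followed by a reinforcer.
def simulate(trials=200, p_response=0.05, boost=0.02, seed=0):
    rng = random.Random(seed)
    for _ in range(trials):
        emitted = rng.random() < p_response            # wait until the behavior occurs
        if emitted:                                    # then reinforce immediately:
            p_response = min(1.0, p_response + boost)  # contingent reinforcement
    return p_response

print(simulate())  # response probability after a history of contingent reinforcement
```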

Page 11: Burrhus Frederick Skinner

Operant Conditioning Principles
• “We are what we have been reinforced for being. What we call personality is nothing more than consistent behavior patterns that summarize our reinforcement history.”

Page 12: Burrhus Frederick Skinner

Operant Conditioning Principles
• The causal processes producing the behavior are instances of selection by consequences, a causal mode exhibited in the analogous processes of operant conditioning (contingencies of reinforcement) and natural selection (contingencies of survival).

Page 13: Burrhus Frederick Skinner

Operant Conditioning Principles
• If one controls reinforcement, one also controls behavior.
  – Behavior is constantly being reinforced, whether or not we are aware of the fact.
  – It is not a question of whether a behavior is going to be controlled; it is a question of who or what is going to control it.

Page 14: Burrhus Frederick Skinner

The Skinner Box
• Direct descendant of Thorndike’s puzzle box
• Arranged so that when the animal depresses the lever, the feeder mechanism is activated and a small pellet of food is released into the food cup.

Page 15: Burrhus Frederick Skinner

The Cumulative Recording
• Time is recorded on the x-axis
  – The rate at which the line ascends indicates the rate of responding
• No response = line parallel to the x-axis
• Response = pen goes up a notch
• Multiple/rapid responses = line rises rapidly

Page 16: Burrhus Frederick Skinner

The Cumulative Recording
• The total number of responses is recorded on the y-axis
  – Measured as the distance between the line of the graph and the x-axis
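The construction of a cumulative record can be sketched in a few lines of Python. This is a minimal illustration with hypothetical response times (not data from Skinner's experiments): each response steps the curve up one notch, and between responses the curve stays flat.

```python
# Hypothetical response times (seconds); not experimental data.
response_times = [2.0, 2.5, 2.7, 6.0, 6.1, 6.2, 6.3, 12.0]

def cumulative_record(times, t_end, dt=0.1):
    """Return (time, cumulative responses) pairs sampled every dt seconds."""
    points, count, idx = [], 0, 0
    t = 0.0
    while t <= t_end:
        while idx < len(times) and times[idx] <= t:
            count += 1          # pen steps up one notch per response
            idx += 1
        points.append((round(t, 1), count))
        t += dt
    return points

# A steeply rising segment (around t = 6) means a high response rate;
# a flat segment means no responding.
for t, n in cumulative_record(response_times, t_end=14.0, dt=2.0):
    print(t, n)
```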

Page 17: Burrhus Frederick Skinner

Conditioning the Lever-Pressing Response

• Deprivation: set of procedures that is related to how an organism performs on a certain task.

Page 18: Burrhus Frederick Skinner

Conditioning the Lever-Pressing Response

• Magazine training
  – The experimenter uses an external hand switch and periodically triggers the feeder mechanism, making sure the animal is not within the vicinity of the food cup.
  – When the feeder mechanism is activated, it produces a clicking sound before delivering a pellet of food into the food cup.
  – Thus, the animal associates the clicking sound with the presence of a food pellet.

Page 19: Burrhus Frederick Skinner

Conditioning the Lever-Pressing Response

• Lever Pressing
  – Every time the animal presses the lever, it fires the food magazine.
  – This produces a click that reinforces the bar press and also signals the animal to go to the food cup, where it is reinforced by food.

Page 20: Burrhus Frederick Skinner

Shaping
• “You’re Hot, You’re Cold”
• Two components:
  – Differential reinforcement: some responses are reinforced while others are not.
  – Successive approximation: only those responses that become increasingly similar to the one the experimenter wants are reinforced.
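Shaping can be sketched as a simple loop. This is a toy hill-climbing model assumed purely for illustration (the function name, target value, and noise level are invented): responses vary around the organism's current behavior, and only responses closer to the experimenter's target than the current behavior are reinforced, so the behavior drifts toward the target.

```python
import random

# Toy model of shaping by successive approximation (assumed, not from the
# slides): only responses closer to the target than the current behavior are
# reinforced (differential reinforcement), and each reinforced response
# becomes the new baseline for the next approximation.
def shape(target=100.0, trials=500, noise=5.0, seed=1):
    rng = random.Random(seed)
    behavior = 0.0                                   # starting operant level
    for _ in range(trials):
        response = behavior + rng.gauss(0.0, noise)  # emitted responses vary
        if abs(target - response) < abs(target - behavior):
            behavior = response                      # reinforced: closer approximations stick
    return behavior

print(round(shape(), 1))  # drifts from 0 toward the target of 100.0
```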

Page 21: Burrhus Frederick Skinner

Extinction
• Removal of the reinforcer from the operant conditioning situation.
• After extinction, the response rate returns to where it was before reinforcement was introduced (the operant level), and the cumulative record again runs parallel to the x-axis.

Page 22: Burrhus Frederick Skinner

Spontaneous Recovery
• After extinction, if the animal is returned to its home cage for a while and then placed back in the experimental box, it will begin to press the lever again for a short period of time without any additional training.

Page 23: Burrhus Frederick Skinner

Superstitious Behavior
• Ritualistic behavior
  – The animal thinks that what it is doing (the reinforced behavior) is causing the pellet of food to appear.

Page 24: Burrhus Frederick Skinner

Superstitious Behavior
• Noncontingent reinforcement: the reinforcer is independent of the animal’s behavior.
  – The feeder mechanism delivers a food pellet randomly, regardless of what the animal is doing.

Page 25: Burrhus Frederick Skinner

Discriminative Operant
• An operant response given to only one set of circumstances.
• The animal receives a food pellet when the light is on (discriminative stimulus, SD) and not when the light is off (SΔ).
  – The animal learns to press the lever only when the light is on.
• SD -> R -> SR, where R is the operant response and SR is the reinforcing stimulus.
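A minimal simulation of discrimination training (the probabilities and learning increments below are assumptions for illustration, not experimental values) shows how reinforcing lever presses only in the presence of the SD separates responding under the two stimulus conditions.

```python
import random

# Toy discrimination training: presses are reinforced only when the light is
# on (the SD); presses in the dark (S-delta) go unreinforced, so the press
# probability under the SD grows while the S-delta probability decays.
def discrimination_training(trials=1000, seed=2):
    rng = random.Random(seed)
    p_press = {"light_on": 0.5, "light_off": 0.5}
    for _ in range(trials):
        state = rng.choice(["light_on", "light_off"])
        pressed = rng.random() < p_press[state]
        if pressed and state == "light_on":           # SD -> R -> SR
            p_press[state] = min(1.0, p_press[state] + 0.01)
        elif pressed:                                  # press under S-delta: no reinforcer
            p_press[state] = max(0.0, p_press[state] - 0.01)
    return p_press

print(discrimination_training())  # high p(press | light on), low p(press | light off)
```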

Page 26: Burrhus Frederick Skinner

Secondary Reinforcement
• Any neutral stimulus paired with a primary reinforcer takes on reinforcing properties of its own (the principle of secondary reinforcement).

Page 27: Burrhus Frederick Skinner

Generalized Reinforcer
• A secondary reinforcer that has been paired with more than one primary reinforcer.
• Its main advantage is that it is not dependent on deprivation to be effective.

Page 28: Burrhus Frederick Skinner

Chaining
• The process by which one response can bring the organism into contact with stimuli that act as an SD for another response, which in turn causes it to experience stimuli that cause a third response, and so on.
• The development of a chained response always proceeds backward from the primary reinforcer.
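As a rough illustration (a hypothetical rat chain invented for this sketch, not taken from the slides), a response chain can be written as a list of SD-to-response links in which each response produces the SD for the next link, with training proceeding backward from the primary reinforcer.

```python
# Hypothetical response chain: each response produces the stimulus that
# serves as the SD for the next link, ending at the primary reinforcer.
chain = [
    ("light on",       "approach lever"),   # light is the SD for approaching
    ("lever in reach", "press lever"),      # sight of the lever is the SD for pressing
    ("click",          "go to food cup"),   # click is the SD for approaching the cup
    ("food pellet",    "eat"),              # primary reinforcer ends the chain
]

# Backward training order: establish the last link first, then add earlier
# links, each reinforced by the (now secondary-reinforcing) stimulus it produces.
for s_d, response in reversed(chain):
    print(f"train: {s_d!r} -> {response!r}")
```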

Page 29: Burrhus Frederick Skinner

Positive and Negative Reinforcers
• Positive reinforcer: something naturally reinforcing to the organism and related to survival.
  – Its addition to the situation by a certain response increases the probability of that response’s recurrence.

Page 30: Burrhus Frederick Skinner

Positive and Negative Reinforcers
• Negative reinforcer: something that is naturally harmful to the organism.
  – Its removal from the situation by a certain response increases the probability of that response’s recurrence.

Page 31: Burrhus Frederick Skinner

Punishment
• Occurs when a response removes something positive from the situation or adds something negative to the situation.
• It does not decrease the probability of a response.
  – It may suppress a response as long as it is applied; it does not weaken a habit.

Page 32: Burrhus Frederick Skinner

Punishment
• Arguments against the use of punishment:
  – It is ineffective in the long run.
  – It causes unfortunate emotional by-products (e.g., fear).
  – It indicates what the organism should not do, not what it should do.
  – It justifies inflicting pain on others.
  – Being in a situation where previously punished behavior could be engaged in without being punished may give the child an excuse to do so.
  – Punishment elicits aggression.
  – It often replaces one undesirable response with another.

Page 33: Burrhus Frederick Skinner

Alternatives to Punishment
• The circumstances causing the undesirable behavior can be changed.
• Let the organism perform the undesired response until it tires of it (let time pass).
• Reinforce a behavior incompatible with the undesirable behavior.
• Ignore the undesirable behavior.

Page 34: Burrhus Frederick Skinner

Comparison of Skinner and Thorndike
• Thorndike: the dependent variable is time to solution; instrumental conditioning.

Page 35: Burrhus Frederick Skinner

Comparison of Skinner and Thorndike
• Skinner: the dependent variable is rate of responding; operant conditioning.

Page 36: Burrhus Frederick Skinner

Schedules of Reinforcement
• PRE/Partial reinforcement effect: an organism that receives reinforcement every time it responds accurately and is then placed on extinction will extinguish faster than an organism that had only a certain percentage of its correct responses reinforced during acquisition.

Page 37: Burrhus Frederick Skinner

Schedules of Reinforcement
• Continuous Reinforcement Schedule (CRF): every correct response during acquisition is reinforced.

Page 38: Burrhus Frederick Skinner

Schedules of Reinforcement
• Fixed Interval Reinforcement Schedule (FI): the animal is reinforced for a response made only after a set interval of time.
  – Fixed-interval scallop: on the cumulative record, responding pauses after each reinforcement and accelerates as the end of the interval approaches.

Page 39: Burrhus Frederick Skinner

Schedules of Reinforcement
• Fixed Ratio Reinforcement Schedule (FR): every nth response that the animal makes is reinforced.
  – The animal must respond a certain number of times before it is reinforced.
  – Post-reinforcement pause: a depression in the rate of responding after each reinforcement.

Page 40: Burrhus Frederick Skinner

Schedules of Reinforcement
• Variable Interval Reinforcement Schedule (VI): the animal is reinforced for responses made at the end of time intervals of variable duration.
  – Rather than a fixed time interval, the animal is reinforced around an average time interval.

Page 41: Burrhus Frederick Skinner

Schedules of Reinforcement
• Variable Ratio Reinforcement Schedule (VR): the animal is reinforced after making a specific average number of responses.
  – This is the same reinforcement schedule that governs gamblers in casinos; the faster one pulls the handle of a slot machine, the more frequently one is reinforced.

Page 42: Burrhus Frederick Skinner

Schedules of Reinforcement
• Concurrent Schedule: used to investigate simple choice behavior; a procedure in which two operant keys are available at the same time, each delivering reinforcement on a different schedule.
• Matching Law: under concurrent schedules, the relative frequency of behavior matches the relative frequency of reinforcement.
  – B1/(B1 + B2) = R1/(R1 + R2)
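A quick numeric illustration of the matching law, using hypothetical reinforcement counts (the numbers are assumptions, not data from the slides):

```python
# If key 1 yields 60 reinforcements per hour and key 2 yields 20, the
# matching law predicts responses distribute in the same 3:1 proportion.
R1, R2 = 60, 20                      # hypothetical reinforcements earned on each key
predicted_share_key1 = R1 / (R1 + R2)
print(predicted_share_key1)          # 0.75 -> about 75% of responses go to key 1
```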

Page 43: Burrhus Frederick Skinner

Schedules of Reinforcement
• Concurrent Chain Reinforcement Schedule: used to investigate complex choice behavior; an animal’s behavior during the initial phase of the experiment determines what schedule of reinforcement it experiences during the second, or terminal, phase.

Page 44: Burrhus Frederick Skinner

Schedules of Reinforcement
• Progressive Ratio Schedule (PR): a lab animal begins with a low ratio schedule, and the ratio of responses to reinforcements is systematically increased during subsequent training sessions.
  – Behavioral Economics: a discipline that uses behavioral techniques to study demand for reinforcers and reinforcer efficacy.

Page 45: Burrhus Frederick Skinner

Verbal Behavior
• Mand (demand): characterized by the unique relationship between the form of the response and the reinforcement characteristically received in a given verbal community.
  – When the demand is met, the utterance (mand) is reinforced, and the next time the need arises, the person is likely to repeat the mand.

Page 46: Burrhus Frederick Skinner

Verbal Behavior
• Tact: the verbal behavior of naming things.
  – Such behavior results in reinforcement when objects or events are named correctly.

Page 47: Burrhus Frederick Skinner

Verbal Behavior
• Echoic behavior: verbal behavior that is reinforced when someone else’s verbal response is repeated verbatim.
  – Often a prerequisite to more complicated verbal behavior.

Page 48: Burrhus Frederick Skinner

Verbal Behavior
• Autoclitic behavior: its main function is to qualify responses, express relations, and provide a grammatical framework for verbal behavior.
  – The term “autoclitic” is intended to suggest behavior that is based upon or depends upon other verbal behavior.

Page 49: Burrhus Frederick Skinner

Contingency Contracting
• Involves making arrangements so that a person gets something wanted when that person acts in a certain way.

Page 50: Burrhus Frederick Skinner

Skinner’s Attitude Toward Learning Theory
• “The empty organism approach”
• Complex theories of learning are time-consuming and wasteful.
• Our main concern should be to discover basic relationships between classes of stimuli and classes of responses.
• His approach to research was to avoid theorizing and to deal only with the manipulation of observable stimuli, noting how their manipulation affected behavior.

Page 51: Burrhus Frederick Skinner

The Need for a Technology of Behavior
• The many problems caused by cultural practices could be solved by strengthening desirable behavior with the principles derived from the experimental analysis of behavior.

Page 52: Burrhus Frederick Skinner

RELATIVITY OF REINFORCEMENT
• David Premack
• William Timberlake

Page 53: Burrhus Frederick Skinner

David Premack
• The way to find out what can be used as a reinforcer is to observe the organism’s behavior while it has the opportunity to engage in any number of activities; the activities that it engages in most often can be used to reinforce the activities it engages in less often. (Premack Principle)
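Applying the Premack Principle amounts to ranking activities by their frequency during free access. Below is a minimal sketch with hypothetical observation data; the activity names, minutes, and helper function are assumptions invented for illustration.

```python
# Hypothetical free-operant baseline observation (minutes spent per activity).
observed_minutes = {
    "playing video games": 90,
    "reading": 25,
    "doing homework": 10,
}

def premack_reinforcers_for(target, observations):
    """Return activities more probable than `target`, i.e. usable as its reinforcers."""
    baseline = observations[target]
    return [a for a, minutes in observations.items() if minutes > baseline]

print(premack_reinforcers_for("doing homework", observed_minutes))
# -> ['playing video games', 'reading']
```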

Page 54: Burrhus Frederick Skinner

William Timberlake
• Disequilibrium hypothesis: any activity can be a reinforcer if a contingency schedule constrains an animal’s access to that activity below its baseline level.

Page 55: Burrhus Frederick Skinner

THE MISBEHAVIOR OF ORGANISMS
• A parody of the title of Skinner’s first major work, The Behavior of Organisms; written by Marian and Keller Breland.
  – Instinctual drift
  – Autoshaping

Page 56: Burrhus Frederick Skinner

SKINNER ON EDUCATION
• Learning proceeds most effectively if:
  – the information to be learned is presented in small steps
  – the learners are given rapid feedback concerning the accuracy of their learning
  – the learners are able to learn at their own pace

Page 57: Burrhus Frederick Skinner

SKINNER ON EDUCATION
• Objectives should be defined behaviorally.
• Motivation was important only in determining a reinforcer for a given student.
• Secondary reinforcers are important.
• Skinner stressed the use of extrinsic reinforcers in education.
• It is also important to move from a 100 percent reinforcement schedule to a partial reinforcement schedule.
• Punishment is to be avoided.

Page 58: Burrhus Frederick Skinner

SKINNER’S LEGACY
• Programmed learning
• PSI
• CBI

Page 59: Burrhus Frederick Skinner

Programmed Learning
• Incorporates the three principles of learning listed above (small steps, rapid feedback, self-pacing).
• Teaching machine: a device invented to present programmed material.

Page 60: Burrhus Frederick Skinner

Programmed Learning
• Features:
  – Small steps
  – Overt responding
  – Immediate feedback
  – Self-pacing

Page 61: Burrhus Frederick Skinner

PSI (Personalized System of Instruction)
• Individualized; involves quick, frequent feedback on student performance.

Page 62: Burrhus Frederick Skinner

PSI (Personalized System of Instruction)
• Four steps:
  – Determine the material to be covered in the course
  – Divide the material
  – Create methods to test mastery
  – Allow self-pacing

Page 63: Burrhus Frederick Skinner

CBI (Computer-Based Instruction)
• The process by which a computer is used to present programmed or other kinds of instructional material.
  – Online education