Principles of Behavior, Sixth Edition Richard W. Malott Western Michigan University PowerPoint by Nikki Hoffmeister
Chapter 18 Interval Schedules
What is a Fixed-Interval Schedule? Fixed-Interval (FI) Schedule of Reinforcement: • A reinforcer is contingent on • the first response, • after a fixed interval of time, • since the last opportunity for reinforcement.
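The three clauses of that definition translate directly into a timer check. The sketch below is illustrative only (the class name, `respond()` method, and injectable clock are my own assumptions, not from the textbook): a reinforcer becomes available once the fixed interval elapses, but only the first response after that point actually produces it.

```python
import time

class FixedInterval:
    """Minimal sketch of an FI schedule (hypothetical names, for illustration).

    A reinforcer becomes available after `interval_s` seconds have elapsed
    since the last opportunity for reinforcement; the FIRST response after
    that point is reinforced. Responses before the interval elapses do nothing.
    """

    def __init__(self, interval_s, clock=time.monotonic):
        self.interval_s = interval_s
        self.clock = clock                    # injectable clock, eases testing
        self.last_opportunity = self.clock()  # start of the first interval

    def respond(self):
        """Return True if this response is reinforced."""
        now = self.clock()
        if now - self.last_opportunity >= self.interval_s:
            self.last_opportunity = now       # the next interval starts now
            return True                       # first response after the interval
        return False                          # interval not yet elapsed
```

Note that time alone never delivers the reinforcer here; the interval only sets up the opportunity, and a response is still required to collect it.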
What type of responding results from an FI schedule? Fixed-Interval Scallop: • A fixed-interval schedule often produces a scallop – • a gradual increase in the rate of responding, • with responding occurring at a high rate • just before reinforcement is available. • No responding occurs for some time after reinforcement.
Joe’s Term Paper • Sid assigned a term paper the first day of class • Joe has 15 weeks to complete the project • The following figure is a cumulative record of Joe’s work under this schedule • Weeks are plotted on the abscissa • Cumulative number of hours he worked are on the ordinate
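A cumulative record is just the running total of responding plotted against time, so it never goes down; flat stretches mean no responding and steep stretches mean rapid responding. A minimal stdlib sketch (the weekly numbers after week 8 are illustrative guesses shaped like Joe's pattern, not data from the text):

```python
from itertools import accumulate

# Hours worked per week: nothing for 7 weeks, 5 hours in week 8 (from the
# text), then hypothetical accelerating values through the final frenzy.
hours_per_week = [0, 0, 0, 0, 0, 0, 0, 5, 8, 12, 15, 18, 22, 26, 40]

# The cumulative record plots this running total on the ordinate
# against weeks on the abscissa.
cumulative_hours = list(accumulate(hours_per_week))
```

The record is flat for seven weeks, then curves upward ever more steeply: exactly the scallop-like shape described next.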
Joe’s Term Paper • Joe spent no time preparing the paper in the first 7 weeks • Finally, in the 8th week, he spent 5 hours preparing • He did more the next week • And even more the next week • He spent the final week in a frenzy of long hours in the library • This is an FI scallop, right? Wrong.
Congress Example • What does the cumulative record of law-passing by the US Congress look like? • A scallop; just like a pigeon pecking a key on an FI schedule. • Members of Congress return from their annual recess (reinforcer). • They pause for a few months, • then they pass the first law and gradually increase their rate of passing laws until just before the next recess.
Is law-passing on an FI schedule? • No • In this analysis, we ask the same questions we did in the term paper example
Other Non-Examples • The TV Schedule
• SD: Calendar and clock say 11:30 PM Saturday
• SΔ: Calendar and clock say 9:30 AM Monday
• Before: You have no opportunity to see your favorite TV show
• Behavior: You turn the TV to channel 8
• After (in the presence of the SD): You have the opportunity to see your favorite TV show
• After (in the presence of the SΔ): You still have no opportunity to see your favorite TV show
Analysis • Problem 1: • You have a calendar and clock; Rudolph doesn’t • If you didn’t, you might respond like Rudolph: responding more and more quickly as time passed • Problem 2: • You have a deadline; Rudolph doesn’t
Other Non-Examples • The Paycheck
• SD: It has been 2 weeks since the last paycheck
• SΔ: It has been 1 week since the last paycheck
• Before: You have no paycheck
• Behavior: You pick up your paycheck
• After (in the presence of the SD): You have a paycheck
• After (in the presence of the SΔ): You still have no paycheck
A Good Example • You’re watching Seinfeld • Commercials come on • You switch to Jerry Springer • But you keep switching back and forth with increasing frequency as the commercial interval wears on • One of your flips is finally reinforced by the return of Seinfeld
Superstition in the Pigeon • Skinner put a pigeon in a Skinner box. • He operated the feeder every 15 seconds, regardless of what the bird was doing. • Just before the first feeder presentation, the pigeon happened to make an abrupt counterclockwise turn. • It did the same thing the next time, right before the feeder came up.
Results • The bird performed a stereotyped pattern of behavior: • rapid and persistent counterclockwise turns.
What is a Fixed-Time Schedule? Fixed-Time Schedule of Reinforcer Delivery: • A reinforcer is delivered • after the passage of a fixed period of time, • independently of the response.
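The contrast with the fixed-interval schedule is that here no response is required at all. A minimal sketch under the same illustrative conventions as before (hypothetical class and method names, not from the textbook): delivery depends only on the clock.

```python
import time

class FixedTime:
    """Minimal sketch of an FT schedule of reinforcer delivery (hypothetical
    names, for illustration). The reinforcer is delivered every `interval_s`
    seconds no matter what the subject does -- there is no response
    requirement, so there is no respond() method at all."""

    def __init__(self, interval_s, clock=time.monotonic):
        self.interval_s = interval_s
        self.clock = clock                         # injectable clock for testing
        self.last_delivery = self.clock()

    def tick(self):
        """Poll the schedule; returns True each time a reinforcer is delivered."""
        now = self.clock()
        if now - self.last_delivery >= self.interval_s:
            self.last_delivery += self.interval_s  # delivery is time-based only
            return True
        return False
```

Because delivery is independent of behavior, whatever the bird happens to be doing just before `tick()` returns True gets accidentally reinforced, which is how the superstitious turning in Skinner's pigeon developed.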
What is Superstitious Behavior? Superstitious Behavior: • Behaving as if the response causes • some specific outcome, • when it really does not.
What is a Variable-Interval Schedule? Variable-Interval (VI) Schedule of Reinforcement: • A reinforcer is contingent on • the first response • after a variable interval of time • since the last opportunity for reinforcement.
VI Schedules • The opportunity for reinforcement comes as a direct function of the passage of time. • Thus, it is a time-dependent schedule. • The lengths of the intervals between opportunities are varied.
VI Schedules • Although the opportunity for reinforcement occurs as a function of time alone, • the subject must make the response • after the interval is over for reinforcement to occur. • Time alone will not bring about the reinforcer.
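The VI rule is the same as the FI rule, except the interval length is redrawn each time, so the subject cannot predict when the next opportunity will arrive. A minimal sketch (hypothetical names and a uniform draw around the mean are my own illustrative choices, not from the textbook):

```python
import random

class VariableInterval:
    """Minimal sketch of a VI schedule (hypothetical names, for illustration).

    Same rule as FI -- the first response after the interval elapses is
    reinforced -- but the interval length varies from one opportunity to the
    next. Time alone sets up the opportunity; a response is still required.
    """

    def __init__(self, mean_s, clock, rng=random.random):
        self.mean_s = mean_s
        self.clock = clock                         # injectable clock for testing
        self.rng = rng                             # injectable randomness
        self.last_opportunity = self.clock()
        self.current_interval = self._draw()

    def _draw(self):
        # Intervals vary around the mean (here: uniform on [0, 2 * mean]).
        return 2 * self.mean_s * self.rng()

    def respond(self):
        """Return True if this response is reinforced."""
        now = self.clock()
        if now - self.last_opportunity >= self.current_interval:
            self.last_opportunity = now
            self.current_interval = self._draw()   # the next interval differs
            return True
        return False
```

Because the next opportunity could come at any moment, steady responding pays off better than pausing, which is why VI schedules produce the moderate, pause-free rate described next.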
What type of responding does a VI schedule produce? Variable-Interval Responding: • Variable-interval schedules produce • a moderate rate of responding, • with almost no post-reinforcement pausing.
Responses can produce reinforcers in 2 ways: • Continuous Reinforcement: • Every response produces a reinforcer. • Intermittent Reinforcement: • Only some responses produce a reinforcer.
Intermittent Reinforcement & Extinction Resistance to Extinction and Intermittent Reinforcement: • Intermittent reinforcement • makes the response • more resistant to extinction • than does continuous reinforcement.
What is Resistance to Extinction? Resistance to Extinction: • The number of responses • or amount of time • before a response extinguishes. • The more an intermittent schedule differs from continuous reinforcement, • the more the behavior resists extinction.
Intermittent Reinforcement & Extinction Why does intermittent reinforcement increase resistance to extinction? • It’s easy for the rats to “tell the difference” between CRF and extinction. • During CRF, all responses produce reinforcers • During extinction, none of them do
Also… • It’s hard for the rats to “tell the difference” between intermittent reinforcement and extinction • During intermittent reinforcement, only an occasional response produces a reinforcer • During extinction, none of them do • The rats quickly discriminate between CRF and extinction • They greatly generalize between intermittent reinforcement and extinction
Intermediate Enrichment • Response Strength: • Response frequency • Resistance to extinction • Behavioral momentum • Skinner rejected “response strength,” claiming it is a reification • When you have 2 different ways of measuring supposedly the same thing, that thing is probably a reification • Those different measures may not agree
Example • A pigeon has 2 keys in the box • One is on a CRF schedule • One is on a VI 1 schedule • According to the resistance-to-extinction measure of response strength: • The response on the VI 1 schedule is stronger, because it is more resistant to extinction • But…the pigeon will most likely continue pecking the CRF key more often than the VI 1 key
On DickMalott.com • Chapter 18 Advanced Enrichment Section • Why Do Limited Holds Work the Way They Do? • Why Does Intermittent Reinforcement Increase Resistance to Extinction?