1. Operant conditioning Learning from our behavior’s consequences
2. Operant conditioning How the consequences of our behavior affect our future behavior
Edward Thorndike started this concept by studying cats in “puzzle boxes”
He graphed how their learning progressed over many trials to reveal the learning curve
3. Reinforcement An event which increases the future probability of the most recent response
First explored by Thorndike
Put another way: if delivering something makes the action or behavior that preceded it more likely to occur, that something is a reinforcer
A reinforcer “stamps in” a response
4. Thorndike labeled this “The Law of Effect”
We are more likely to repeat responses that lead to (what we view as) favorable consequences or outcomes
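As a rough illustration of the Law of Effect, here is a minimal toy simulation in Python; the update rule and numbers are assumptions for the sketch, not Thorndike's or Skinner's actual data. Each time a "lever press" is followed by food, the press response is strengthened and becomes more probable on later trials.

import random

random.seed(0)
press_strength = 1.0          # hypothetical "strength" of the lever-press response
other_strength = 9.0          # everything else the rat might do instead

def p_press():
    # probability of pressing = relative strength of that response
    return press_strength / (press_strength + other_strength)

for trial in range(1, 21):
    pressed = random.random() < p_press()
    if pressed:                # the press is followed by food (a reinforcer)...
        press_strength += 1.0  # ...so that response is "stamped in"
    print(f"trial {trial:2d}: P(press) = {p_press():.2f}")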
5. In operant conditioning, we change behavior by following a desired action/response with reinforcement
And the sooner, the better
We label it “operant” because the subject operates on the environment to obtain reinforcement
6. Contrast with classical conditioning Some call it instrumental conditioning because the subject's behavior is instrumental in producing the outcome
A stark contrast with classical conditioning, where the subject's behavior has no effect on the outcome
Operant – subject acts with muscles
Classical – subject reacts through internal organs
7. B. F. Skinner The "Father" of Operant Conditioning
Zealous advocate
Laid out theoretical framework
Used only the simplest of assumptions
Pioneered precise operational definitions
Invented The Skinner Box
Rats pressed levers; pigeons pecked keys (small disks)
8. Setting the table for desired behaviors
How do we get the rat to pull the lever in the first place?
Shaping – establishing a new behavior by reinforcing successive approximations to it
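A minimal sketch of shaping as successive approximations (all numbers are illustrative assumptions): reinforce any movement that lands closer to the lever than the rat's current typical distance, and let each reinforced approximation become the new typical behavior.

import random

random.seed(1)
typical_distance = 50.0                                     # hypothetical starting distance from the lever

for step in range(100):
    move = max(0.0, random.gauss(typical_distance, 5.0))    # where the rat wanders this time
    if move < typical_distance:                             # a closer-than-usual approximation: deliver food
        # the reinforced, closer position becomes the new typical behavior
        typical_distance = 0.5 * typical_distance + 0.5 * move
    if (step + 1) % 25 == 0:
        print(f"after {step + 1:3d} steps: typical distance = {typical_distance:.1f}")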
9. Shaping II Works with rats, pigeons, children – everyone
Skinner’s lectures
Pigeons dropping bombs?
“Catch them while they’re being good”
Token economies
10. Complex behaviors How do we get an animal to perform a sequence of behaviors?
Chaining – reinforce each desired behavior by giving the animal the chance to engage in a previously reinforced behavior
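A minimal sketch of one common version, backward chaining (the links are made-up examples): the final link is trained first with food, and each earlier link is then reinforced by the chance to perform the already-learned link that follows it.

chain = ["climb the ramp", "cross the bridge", "pull the string", "press the lever"]  # hypothetical chain ending in food

learned = []                                  # links the animal can already perform
for link in reversed(chain):                  # train the last link first, then work backward
    reinforcer = f"the chance to {learned[-1]}" if learned else "food"
    print(f"train '{link}', reinforced by {reinforcer}")
    learned.append(link)

print("finished chain:", " -> ".join(chain), "-> food")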
11. Reinforcement & punishment These two events drive operant conditioning, and as a result, much of our behavior
Reinforcement increases the odds that the behavior which preceded it will recur
Punishment decreases the odds that the behavior which preceded it will recur
12. Reinforcement can also result from removing pain or from helping us avoid it (negative reinforcement)
Going to the dentist
Taking Tylenol/aspirin
Finding an excuse not to ask her out
Settling the big trial
13. Superstitions We think that our good-luck charm prevents us from screwing up
Same with “pre-game” rituals
14. More punishment Punishment can result from removing pleasure
"If you do that one more time, I'll …"
Fouls in a basketball game
Grounded!
Time-outs
15. Does reinforcement simply equal pleasure, and does punishment simply equal pain?
No, reinforcement must increase the frequency of the behavior.
And, punishment must decrease it.
16. Punishment III Very hard to administer successfully
Works best if it is:
1) consistent,
2) immediate,
3) moderate, and
4) paired with available alternative behaviors.
Does torture work? Spanking?
17. Finding the right reinforcer How can we pick reinforcers that will prompt more desirable behaviors?
Harder than you might think – reinforcers vary greatly between individuals.
Remember, Skinner virtually starved his pigeons and rats.
18. Two principles Premack principle – the chance to engage in frequent behaviors can be a reinforcer for less common behaviors
If that fails,
19. Disequilibrium – disrupting someone’s typical routine will make a return to the routine reinforcing
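A minimal sketch of the Premack principle in practice (the activities and minutes are made-up numbers): observe what someone does freely, then offer the most frequent activity as the reinforcer for the least frequent one; if that fails, disequilibrium suggests restricting a routine activity below its baseline so that access to it becomes reinforcing.

# hypothetical baseline: minutes spent on each activity when the person is free to choose
baseline_minutes = {"video games": 120, "reading": 30, "homework": 10}

reinforcer = max(baseline_minutes, key=baseline_minutes.get)   # high-probability behavior
target = min(baseline_minutes, key=baseline_minutes.get)       # low-probability behavior

print(f"Premack rule: do some {target} first, then you may do {reinforcer}.")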