Unit 3 Introduction • E3 a week from today, Thursday, 10/03 • Multiple effects of stimuli • Review schedules of reinforcement and contingent vs. adventitious reinforcement • The power of reinforcement: Fordyce’s analysis and treatment of chronic pain • Applications in Performance Management
Multiple Effects of Stimuli: Purpose • To understand the difference between • SDs (and S∆s) and consequences • SDs and USs/CSs • To understand that one stimulus can have several different effects on the behavior of an individual
Stimuli to Date • Respondent Relations • USs and CSs • Operant Relations • SDs and S∆s • Scs (all consequences) • To date: each stimulus has only 1 effect • USs precede and elicit URs • CSs precede and elicit CRs • SDs precede and evoke operant Rs • S∆s precede and suppress operant Rs • Scs follow and influence future frequency
One Stimulus: Multiple Effects One stimulus can, for example: • Increase the future frequency of the behavior it follows (as an SR or Sr) • Immediately evoke a different operant behavior because the behavior has been reinforced in its presence but not in its absence (as an SD) • Immediately elicit a respondent behavior such as an emotional response (as a US or CS) In behavior analysis, we label the stimulus based on the effect it has on behavior.
An Everyday Analogy One woman can be a mother, a daughter, a sister, an aunt. We call that woman a different thing depending upon the relationship we want to emphasize. So too with stimuli. One stimulus can be an SR or Sr (or SP or Sp), an SD (or S∆), and a US or CS. We call that stimulus a different thing depending upon the relationship with behavior we want to emphasize and analyze.
A Simple Rat Example R (pull chain) → Sr/SD (light on): R (lever press) → SR (food) CS (light on) → CR (salivation) • The light on is: • an Sr for the chain pull response • an SD for the lever press response • a CS for the salivation response It would be INCORRECT to say: • the light as an SD increases the future frequency of pulling the chain • the light as an Sr evokes the lever press: by definition reinforcement does NOT precede and evoke a response • the light as an SD or Sr elicits salivation: only a CS or US can elicit salivation (3 incorrects, reinforcement, feel good example)
Example 1: Dinner is Served You invite a guest for dinner. The guest is sitting in the living room flipping through a magazine. You announce: “Dinner is served!” The “Dinner is served” has the following effects: • The dinner guest immediately walks to the dinner table • The dinner guest feels happy • The dinner guest’s mouth begins to “water” (salivate) • The dinner guest flips through magazines more frequently in the future in similar situations “Dinner is served” should be called: • An SD and only an SD for the first effect • A CS and only a CS for the second effect • A CS and only a CS for the third effect • An Sr and only an Sr for the fourth effect (helps to diagram, on board)
Example 2: Multiple Effects A person is driving down the road, looks into his rear view mirror, and sees the flashing red/blue lights of a police car. The flashing lights have the following effects on behavior: • He immediately says “%#*!” (darn it!) • He immediately breaks out into a cold sweat • He does not look in his mirror as often in the future • He immediately pulls the car over Which of the behaviors would indicate that the flashing red/blue lights were: • a CS? Why? • an Sp (punisher)? Why? • an SD? Why? (VB, probable consequence for the last one; punishment? rogues)
Example 3: Multiple Effects A worker is graphing some of her performance data. Her supervisor says, “Hey, great work! Come into my office right now so we can discuss it.” The worker: • Immediately gets up and walks into the office • Immediately blushes • Immediately feels proud • Graphs her performance more often in the future Which of the behaviors would indicate that the supervisor’s statement was: • an Sr? Why? • an SD? Why? • a CS? Why? (#1 in the future; wrong! – definition of Sr vs. SD?)
Ex 4: MEs, Turn Example Around A track star bends down to tighten her shoe laces and hears the starting gun go off. What behavior on her part would indicate that the sound of the starting gun was: 1. an SD? Why? 2. an Sp? Why? 3. a CS? Why? Difficulty of this type of question/example: No behavior in the question/example for the SD or CS. You have to come up with something reasonable given the situation.
Ex 5: MEs, Turn Example Around A student inserts money into a vending machine and pushes the button for pop. No pop is dispensed. The student hits the side of the machine and the pop is dispensed (drops into the container). What behavior on the part of the student would indicate that the sight of the pop in the dispenser/container was: 1. an SD? Why? 2. a CS? Why? 3. an Sr? Why? (last example, reach for the pop more often in the future??; but next slide - smiling, frowning, jumping up and down)
Operant vs. Respondent Behaviors • When asked to give a behavior that would indicate that a stimulus is a CS, many students make the mistake of saying “smiling,” “frowning,” “jumping up and down with excitement” • This is NOT correct • Smiling, frowning, and jumping up and down are all operant behaviors (as is most, but not all, crying) • Only the actual emotional responses (the activation syndrome) are respondent behaviors • Thus, no credit if you indicate that smiling, frowning, or jumping up and down would indicate that some stimulus is a CS (the mistake is made because people often see these as “emotional behaviors”; but they are operant)
Multiple Effects: For additional examples and a self-instructional program: Go to my web site: alycedickinson.com Click on Teaching, then PSY 4600 Click on Multiple Effects of Stimuli (under Unit 3) Sample exam questions and answers: Pages 25 & 26, study objectives
SO 1: Six Basic Schedules of Reinforcement • Fixed Ratio (FR) • Reinforcement is provided after a specified number of responses, e.g., FR 3 • Variable Ratio (VR) • Reinforcement is provided after an average number of responses, e.g., VR 3 (1, 5, 3: 9/3 = 3) • Fixed Interval (FI) • Reinforcement is provided for the first response that occurs after a fixed period of time has passed, e.g., FI 10” • Variable Interval (VI) • Reinforcement is provided for the first response that occurs after an average period of time has passed, e.g., VI 10” (5”, 15”: 20/2 = 10”) (Note carefully: Interval schedules have both a time and a response requirement) (Meyerson & Michael, gambling, Gaetani articles, review) (Fixed time next)
SO 1: Time Schedules • Fixed Time (FT) • Reinforcement is provided after a specified time period has passed. There is NO response requirement. FT 10” • Variable Time (VT) • Reinforcement is provided after an average time period has passed. Again, there is NO response requirement. VT 10” (5”, 15”: 20/2 = 10”)
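For anyone who finds it easier to see these rules spelled out, here is a minimal Python sketch (not from the course materials) of when each schedule would deliver a reinforcer, given a stream of time-stamped responses from one organism. All function and variable names (deliveries_fixed_ratio, responses, etc.) are illustrative assumptions; the point is the contrast between ratio, interval, and time schedules.

```python
# Minimal sketch, assuming a list of response times (in seconds) for one organism.
# Ratio schedules count responses; interval schedules require the FIRST response
# after the interval elapses; time schedules require NO response at all.

def deliveries_fixed_ratio(response_times, n):
    """FR n: reinforce every nth response (e.g., FR 3 -> the 3rd, 6th, 9th...)."""
    return [t for i, t in enumerate(response_times, start=1) if i % n == 0]

def deliveries_variable_ratio(response_times, requirements):
    """VR: reinforce after a varying number of responses (e.g., 1, 5, 3 averages to VR 3)."""
    deliveries, count, i = [], 0, 0
    for t in response_times:
        count += 1
        if i < len(requirements) and count == requirements[i]:
            deliveries.append(t)
            count, i = 0, i + 1
    return deliveries

def deliveries_fixed_interval(response_times, interval):
    """FI: reinforce the first response after the interval has elapsed
    (both a time and a response requirement)."""
    deliveries, timer_start = [], 0.0
    for t in response_times:
        if t >= timer_start + interval:
            deliveries.append(t)      # first response after the interval
            timer_start = t           # the interval restarts at reinforcement
    return deliveries

def deliveries_fixed_time(session_length, interval):
    """FT: reinforcement is delivered on time alone; NO response requirement."""
    return [k * interval for k in range(1, int(session_length // interval) + 1)]

responses = [2, 4, 9, 11, 14, 22, 27, 33, 41, 52]      # response times in a 60" session
print(deliveries_fixed_ratio(responses, 3))             # FR 3   -> [9, 22, 41]
print(deliveries_variable_ratio(responses, [1, 5, 3]))  # VR 3   -> [2, 22, 41]
print(deliveries_fixed_interval(responses, 10))         # FI 10" -> [11, 22, 33, 52]
print(deliveries_fixed_time(60, 10))                    # FT 10" -> [10, 20, 30, 40, 50, 60]
```

The variable schedules (VR, VI, VT) differ only in that the response or time requirement varies around an average rather than staying fixed.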
SOs 2-6: Reinforcement Schedules and Gambling There are 26 casinos in Michigan and many opportunities for on-line gambling
NFE: Reinforcement Schedules & Gambling • Slot machines, roulette wheels, and other forms of casino gambling are designed so that people lose most of the time • For every $1.00 gambled, a person wins, on average, $.90 • Why would anyone agree to give someone $1 in exchange for $.90?
NFE: Reinforcement Schedules & Gambling • Gamblers often say they gamble for the fun of winning • This doesn’t make a lot of sense because, in the long run, gamblers lose • Are compulsive gamblers morally weak, stupid, depressed, or masochistic? Do they have a “gambling gene”? • Or are there other possibilities?
SOs 2-6: Behavioral/Environmental Reasons • Text presents studies of • Pigeons who “chose” to gamble rather than work, to the point that the researcher terminated the study because he was afraid the pigeons would starve themselves to death • College students who persisted in gambling even after gambling resulted in no wins and a zero return on the money they gambled • What are some of the reasons for this? • Study objectives 2-6 address these reasons/causes
SO3: Schedules of reinforcement • In most casino games, the reinforcement schedule for payoffs resembles a particular type of schedule. What is it? • What type of behavior pattern is typically produced by this type of schedule? (schedules of reinforcement have a lot to do with addiction; answer not on slide)
SO4: What might explain why some people become addicted and others do not? • Again, what type of schedule do most gambling payoffs resemble? Even if the schedule is exactly the same for two individuals, because of the variation in the schedule, they are not going to experience exactly the same sequence of payoffs, number of payoffs, or payoff amounts. (example on your own)
SO6: What factor not related to the schedule can account for excessive gambling? • Lack of reinforcers for other behaviors Example from text: Seniors sometimes become excessive gamblers after their spouse dies. They are now alone, their kids and grandkids may live miles away, and their ability to enjoy physical activities may be diminished due to health – gambling provides very powerful reinforcers when there are few or no reinforcers for other behaviors. (skipping 5 – do that on your own; be sure to state the answer to this study objective behaviorally, not as “excitement,” which is a respondent behavior)
SO 7: Superstitious Behavior Superstitious behavior is behavior that occurs because it has been followed by a reinforcer even though the behavior does not produce the reinforcer; that is, even though the reinforcer is NOT contingent upon the behavior. The behavior occurs and just happens to be followed by a reinforcer. As a result, the behavior increases in frequency in the future. We call the resulting behavior “superstitious” behavior, and the reinforcement is called “adventitious*” reinforcement. *Also, more recently, called noncontingent reinforcement or “coincidental reinforcement.” (athletes, “lucky tie,” Skinner’s pigeons - head bobbing, flapping a wing, turning right or left; see text for some really neat examples: Bobo the clown & kids, high school & college students)
SO 8 & 9: Contingent vs. Adventitious Reinforcement • Contingent Reinforcement: A reinforcer immediately follows the response, is contingent upon that response, and increases the future frequency of that response. (What does contingent mean?) • Adventitious Reinforcement: A reinforcer immediately follows the response, but is NOT contingent upon that response, and increases the future frequency of that response. Which of the following basic schedules of reinforcement represent adventitious reinforcement? Why? FR, VR, FI, VI, FT, VT (if, then relationship; contingent reinforcement is typically what we mean when we just say reinf; the SO asks you to identify the contingent reinforcement schedules as well, and explain why)
SO 10: On your own, fair game for the exam* • State the technical name of the reinforcement schedule used by Skinner in his pigeon study in 254,5 • State the technical name of the reinforcement schedule used by Wagner and Morris in their “Bobo” the clown study in 255,3 *TAs can confirm, but not give out, the answer; we will not do this right before the exam
SO11: Introduction, Fordyce’s behavioral analysis of chronic pain • Fordyce developed a behavioral analysis of chronic pain (chronic back pain), and his analysis and treatment are still considered state of the art. • The intervention targeted operant pain behaviors that had been reinforced while the person was in pain, not the pain itself • When people complain, are inactive, and don’t go to work, the problem may be that those behaviors have been reinforced frequently over a long period of time; it’s not the actual pain that is causing the problem (included to show the power of the basic principles and because it provides an analysis that is surprising; chronic back pain, but relevant to any type of chronic pain, illness, or medical condition; the targets of his intervention were the operant behaviors; he did not try to reduce the pain)
SO11: Cont., but not for the exam • Think about it - when someone is in pain or discomfort, how do you react? What behaviors do you reinforce? • Are you helping or hurting the individual’s chances of getting better when you reinforce operant pain behaviors? • Clearly, you don’t want to ignore reports of pain in case some type of medical treatment is required, and you certainly want to be sympathetic to show you “care.” On the other hand, are you helping “too much” and caring “too much”? • Skinner’s essay: “On the ethics of helping people.”
SO11: Cont., but not for the exam • Reinforcing dependent, complaining behavior is typically not a problem when individuals are ill for a short period of time (cold, flu) • Why? After the person recovers, you are not going to continue to reinforce whining and complaining! • The problem develops when individuals are ill or in pain for long periods of time, e.g., six months or longer (particularly when it is not clear at first how long the pain/illness will persist) • Please do not leave class and say “Dickinson is so cold and hard-hearted! She told us never to show how much we care when someone is sick!”
SO12: State two common reinforcers for pain behaviors and indicate whether each is positive or negative reinforcement Pain-related behaviors (R): e.g., complaints, moans, asks for meds, stops doing something because of pain, holds or touches a painful body part, grimaces Common reinforcers: 1. Sr+: attention and concern from others 2. Sr-: escape/avoid unpleasant activities, jobs, and/or responsibilities
SO13: Analysis of the caregiver’s behavior: Learn the diagram. SD (significant other engaging in pain-related behaviors) : R (caregiver says “Oh, poor baby!”) → Sr (significant other saying “Thanks for caring! I don’t know what I would do without you!”) Not for the exam, but note: The response of the caregiver then serves as a reinforcer for the operant pain-related behavior of the significant other: R of ill person (complaining) → Sr (consequence for ill person: “Oh, poor baby!”) (Why does the caretaker come to reinforce pain behaviors?)
NFE: Fordyce’s intervention, the three essential features, all based on extinction • Extinction of all operant pain-related behaviors • Gradual increase in physical activity • Rest was contingent on completion of physical activity rather than complaints of pain • Gradual decrease and eventual elimination of all drugs (extinction of medication taking) • Masking drugs in a cocktail, reducing dosage • Switching drugs from pain-contingent to time-based (hospital-based program, with patients giving informed consent for procedures and being monitored by a physician; multiple meds: pain relievers, sleeping pills, antidepressants; medication taking was reinforced by pain reduction and social attention)
SO14: Why time-based medication decreases medication taking With pain-contingent medication, the person is in pain when he/she takes the medication; hence taking medication is strongly reinforced by pain reduction and social attention. With time-based medication, if the dosage is correct, the person will not be in much pain when he/she takes the medication; hence taking medication is not as strongly reinforced and may even extinguish. (behaviorally, why would time-based medication reduce, extinguish medication-taking, in comparison to pain-contingent medication? Diagrams, next slide)
SO14: Pain-contingent and time-based medication, diagrams Pain-contingent: MO (pain) : R (take medication) → SR (pain reduction) + Sr+ (social attention) Time-based: MO (no/little pain) : R (take medication) → no/little pain reduction, no/little social attention (Include “MO” even though we haven’t covered that yet; also include SR for pain reduction and Sr+ for social attention)
SO15: Not for exam, but some interesting facts • The 36 patients had been in pain for 4.5 to 30 years, and none had worked in over three years • The inpatient program lasted an average of only 7 weeks! • Patients significantly increased their physical activity • Patients significantly increased their “uptime” (non-reclining time) • Patients were taking much less medication • They were taking fewer meds and cut dosages in half • Before, many were taking multiple meds (narcotic/analgesic, analgesic, sedative/hypnotic) • Patients reported having much less pain • Fascinating, since Fordyce’s program did not target the actual pain, but rather the operant pain-related behaviors • Because pain is self-reported, we really don’t know whether the pain decreased (sometimes physical activity will actually decrease pain)
SO16: Results of a follow-up study by Reinhardt (study was conducted by Roberts at U. of Minn) 14A. What percentage of patients who completed the program were living normal lives (including working)? 77% 14B. What percentage of patients who refused treatment were not working? 83%, which means only 17% were working 14C. How many prescription meds were the successful patients taking? None 14D. How many different prescription meds were the patients who refused treatment taking? An average of over three! (last slide on this – Gaetani next)
Performance Management (OBM) Introduction Photos: Dr. Doug Johnson
Performance Management Articles • Two articles • Gaetani et al., which investigated the effects of commission payment on the performance of two machinists • Sulzer-Azaroff et al., which investigated the effects of goal setting, feedback, and “celebrations” on safety • Feedback and monetary incentives are very common interventions in business and industry • 60% of all OBM studies have used feedback alone or in combination with other variables • 35% of all US companies and 90% of Fortune 1000 companies have some type of individual incentive plan • Behavior-based safety is one of the largest areas of application of performance management (my focus, McGee, Johnson)
SO 17: Effectiveness of Performance Feedback From: Balcazar, Hopkins & Suarez (1985-1986), Journal of Organizational Behavior Management • Reviewed studies published up to that time in four major journals that examined the effectiveness of feedback alone or in combination with other interventions • Looked at the % of articles in which feedback had • Consistent effects: 100% of the employees improved their performance • Mixed effects: some but not all of the employees improved their performance • No effects • Results (SO 17) • Feedback alone: 28% of articles reported consistent effects • Feedback combined with tangible rewards: 90% of articles reported consistent effects (alone, OK; better with tangibles, as Gaetani demonstrates)
SO 18: Gaetani & Schedules of Reinforcement Gaetani et al. refer to a paycheck as an FI schedule. IT IS NOT!!!! Why isn’t it? Assume we are talking about a weekly paycheck. • Too much delay between the behaviors and the consequence (the consequence must occur within 60” after the behavior occurs, or we are dealing with “indirect-acting contingencies” - “rule-governed behavior”) • No single response is required for the receipt of the paycheck (it is time based only) • Not one response, but a collection of very different behaviors, is required
SO 19: Reinforcement procedure described as FR1 (CRF): NOT!! • Yukl, Latham, & Pursell (1976), tree planters, lumber company • 25 cents for each bag of trees planted (FR1 or CRF) • 50 cents for planting an average of 2 bags (VR2) • $1.00 for planting an average of 4 bags (VR4) What is the major problem with calling these schedules FR1, VR2, and VR4? (this small detail is not in the article) There were 1,000 tree seedlings in each bag!! (flipping a coin once or twice)
SO 19 cont: Reinforcement procedure described as FR3: NOT!!! • Published study designed to increase bus ridership on a college campus (Penn State). • Tokens that could be traded for merchandise from local stores served as the “reinforcers” (pop, reduced-price pizza, etc.) • Gave a token to every third person who got on the bus (FR3) What is the major problem with calling this schedule an FR3? In an FR3, reinforcement is provided after every third response of the same organism! Imagine trying to get a rat to press a lever by reinforcing the lever press of every third rat rather than every third lever press of the same rat! (don’t believe everything you read, even in peer-reviewed journals! Students try to name the schedule – there is no name for this schedule – reinforcement is provided to a specific person, not a “captive” population here)
SO 20: Why should you care? Why am I making such a big deal about this? • Basic schedules of reinforcement (FR, VR, FI, VI) generate consistent, proven patterns of responding • When schedules that are called the same thing (but are NOT) do not generate these same patterns of performance, individuals are “surprised” and, worse yet, • Often say that our basic principles of behavior are incorrect because the like-named schedules do not generate the same performance patterns • I don’t want you making that mistake. So…..
SO 20: continued Why is it not surprising that in many applied settings, schedules of reinforcement have not resulted in the same performance patterns as the schedules that are called the same thing in the lab? While the schedules used in applied settings ARE schedules of reinforcement, they are usually NOT the same basic schedules that have been examined in the lab. Also, individuals with verbal behavior often describe the contingencies of the schedule to themselves, and then respond according to those descriptions. (this second one isn’t for the exam - just FYI) (Lowe: verbal individuals will show the same performance patterns if contingencies are arranged to prevent or decrease the likelihood of “descriptions.”)
Gaetani et al. article description • Participants: 2 machinists in a small machine shop • DV: Amount charged to customers per day on invoices • Design: ABAC reversal design • A = Hourly pay (E1: $5.00, E2: $7.00) • B = Hourly pay plus self-recorded feedback • C = Self-recorded feedback + commission pay system • 5% commission for above-standard performance • Penalty for below-standard performance • E1 standard (baseline average): ~$77.00 per day • E2 standard (baseline average): ~$98.00 per day • Length of phases: • A = 30 days, B = 13 days, A = 7 days, C = 40 days (a reduction in wages - a good thing? The owner refused to let the reversal continue)
SO 22: Calculation of Wages • Above-standard performance Assume Employee 1 charged customers an average of $180 one day, and the standard was $80.00. His hourly pay is $5.00 per hour and he works 8 hours a day. How much would he earn that day? • $180 (amount charged) - $80 (st.) = $100 (over st.) • $100 X .05 (5% commission) = $5.00 in commission • $5 X 8 hours = $40 (hourly pay) • $40 (hourly pay) + $5 (commission) = $45.00
SO 22: Calculation of Wages • Below-standard performance Assume Employee 1 charged customers an average of $60 one day, and the standard was $80.00. His hourly pay is still $5.00 per hour and he works 8 hours a day. How much would he earn that day? • $60 (charged)/$80 (standard) = .75 • $5 X 8 hours = $40 (hourly pay) • .75 (of standard) X $40 = $30 Just a “friendly” warning: Students seem to have trouble with this one. (bring calculators; no cell phones, however)
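For those who want to check their arithmetic, here is a minimal Python sketch of the pay rules exactly as described on these two slides (above standard: hourly pay plus a 5% commission on the amount charged over the standard; below standard: hourly pay scaled by the proportion of the standard reached). The function name and default values are illustrative assumptions, not code from the Gaetani et al. article.

```python
def daily_earnings(amount_charged, standard, hourly_rate=5.00, hours=8):
    """Sketch of the commission system as described on the slides.

    Above standard: hourly pay plus a 5% commission on the amount charged
    over the standard. Below standard: hourly pay multiplied by the
    proportion of the standard that was reached (the "penalty").
    """
    hourly_pay = hourly_rate * hours
    if amount_charged >= standard:
        return hourly_pay + 0.05 * (amount_charged - standard)
    return (amount_charged / standard) * hourly_pay

# The two worked examples above:
print(daily_earnings(180, 80))   # above standard -> 45.0  ($40 hourly + $5 commission)
print(daily_earnings(60, 80))    # below standard -> 30.0  (0.75 x $40)
```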
[Graph of daily invoice totals across phases: Baseline (hourly wages, no feedback), Self-recorded feedback, Baseline (BSL), Commission + self-recorded feedback]
SO 23: Results of Gaetani et al. For exam: Specific % increases over baseline • Employee 1 • Baseline: Average = $77.00 per day • Feedback: Average = $152.00 per day (more than doubled) • Commission: Average = $238.00 per day • Average increase over baseline: 210%!!! • Employee 2 • Baseline: Average = $98.00 per day • Feedback: Average = $186.00 per day • Commission: Average = $269.00 per day • Average increase over baseline: 174%!!! • Never once did performance fall below standard during the commission payment system - no penalty. And, at the end, Employee 1 was performing as well as Employee 2. (annoyed specific %s)
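As a quick arithmetic check (a study aid, not from the article), the percentage increases can be computed directly from the rounded daily averages on this slide; they land within about a percentage point of the figures reported above.

```python
def percent_increase(baseline, treatment):
    """Percent increase of a treatment-phase average over its baseline average."""
    return (treatment - baseline) / baseline * 100

# Rounded daily averages from this slide:
print(round(percent_increase(77, 238)))   # Employee 1, commission phase -> 209 (reported as 210%)
print(round(percent_increase(98, 269)))   # Employee 2, commission phase -> 174
```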
Sulzer-Azaroff et al. article Introduction • Behavior-based safety is one of the largest areas of application in OBM. • Two of our Ph.D. graduates have written books in the area (McSween and Agnew) • Sulzer-Azaroff pioneered this field and this article is one of the first published (Who Killed My Daddy?) • Important area! Not only for economic reasons but for humanitarian reasons. We can prevent injuries at work and save lives. • How Sulzer-Azaroff got started in behavior-based safety… Beth Sulzer-Azaroff with husband, Lee (not on slide)
SO 24: Purpose (but not for the exam) • Most companies measure (by law) the number of accidents and the number of lost-time injuries (injuries that result in days that employees miss work) • The purpose of this study was to assess whether an intervention that targeted behaviors, and conditions that resulted from behaviors, could decrease accidents and lost-time injuries • The logic was that if you can increase safe behaviors and conditions that occur before the accident/injury happens, you should be able to decrease accidents/injuries