Schedules of Reinforcement
• By Response:
• Fixed Ratio: Rewarded after a set number of responses (same every time)
• Variable Ratio: Rewarded after a random number of responses (changes between rewards)
• By Time:
• Fixed Interval: Rewarded after a set amount of time (same every time)
• Variable Interval: Rewarded after a random amount of time (changes between rewards)
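The four schedules above are simple rules, so they can be sketched as a short simulation. This is a minimal illustration, not a formal model; the function names and the "random delay between 1 and 2×average−1" choice are my own assumptions.

```python
import random

def fixed_ratio(num_responses, n):
    # Fixed Ratio: reward on every n-th response (e.g. every 10th burger).
    return [(r + 1) % n == 0 for r in range(num_responses)]

def variable_ratio(num_responses, avg_n, rng):
    # Variable Ratio: reward after a random number of responses
    # (here 1 to 2*avg_n - 1, averaging avg_n), like a slot machine.
    rewards, next_rewarded = [], rng.randint(1, 2 * avg_n - 1)
    for r in range(1, num_responses + 1):
        if r == next_rewarded:
            rewards.append(True)
            next_rewarded += rng.randint(1, 2 * avg_n - 1)
        else:
            rewards.append(False)
    return rewards

def fixed_interval(total_time, t):
    # Fixed Interval: reward arrives every t time units (e.g. hourly pay).
    return list(range(t, total_time + 1, t))

def variable_interval(total_time, avg_t, rng):
    # Variable Interval: reward arrives after a random delay
    # (here 1 to 2*avg_t - 1, averaging avg_t), like waiting for a bite.
    times, next_time = [], rng.randint(1, 2 * avg_t - 1)
    while next_time <= total_time:
        times.append(next_time)
        next_time += rng.randint(1, 2 * avg_t - 1)
    return times

rng = random.Random(0)
print(fixed_ratio(10, 5))      # rewards on responses 5 and 10
print(fixed_interval(60, 15))  # rewards at minutes 15, 30, 45, 60
print(sum(variable_ratio(100, 5, rng)))  # roughly 100/5 = 20 rewards
```

Notice that only the "fixed" schedules are predictable; in the "variable" ones the learner never knows which response or moment pays off, which is why they resist extinction so well.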
Fixed Ratio
• Example: Workers at McDonald's are paid $5 for every 10 McDoubles they make.
• You are paid a fixed (meaning pre-arranged) amount for a set ratio, that is, a set number, of hamburgers you make.
Variable Ratio
• Example: playing the slot machines.
• You know that if you keep playing the slots you will eventually win; the question is how many tries it will take (it varies).
• It can change. Some people win after one or a few tries; others win after a hundred. You could win again on the very next try, or it may take another 100 tries.
Fixed Interval
• Example: You are paid $7.25 an hour for flipping burgers at McDonald's.
• You are paid a fixed amount for a set amount of time: you know that for every hour you work, you will earn $7.25.
Variable Interval
• Example: fishing.
• You can cast your line and not get a bite for an hour before finally catching a fish. The second bite may come only 2 minutes later, but the third may take another 30 minutes.