Biological Influences on Learning Chapter 9
General Laws Of Learning • Learning is the main determinant of action • Laws of learning reveal themselves in any study of behavior, even when the behaviors are not ones exhibited in a natural setting. • Past experiences do not interfere with learning involving arbitrary stimuli in the laboratory. • Conducted in humans and non-human animals. • The specific CS and US (Pavlovian) and the specific reinforcer and response (instrumental) are arbitrary. • An extreme environmentalist and nurturist position!
Watson and General Laws • “The behaviorist … recognizes no dividing line between man and brute. The behavior of man, with all of its refinement and complexity, forms only a part of the behaviorist's total scheme of investigation.” • "Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select—doctor, lawyer, artist, merchant-chief, and, yes, even beggarman and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors."
A Possible Rapprochement • Millions of specific laws • One general law • Exaptations: traits that serve one particular function but are later modified to serve another
Flying Dinosaurs and Exaptations • Feathers for thermoregulation • Recycled for flight • So, modes of learning should likewise be conserved and modified for other purposes.
Behavior Systems Approach • Learning acts within innate behavior systems, changing their integration, tuning, instigation, or linkages. • According to Timberlake, an animal possesses instinctive behavior systems such as feeding, mating, social bonding, care of young, and defense.
• Learning improves simple and complex motor tasks through repetition. • Activation of a mode can also be conditioned to cues that signal the receipt of reinforcers. • Conditioning of a specific mode produces a general motivational state that sensitizes all of the perceptual-motor modules in that mode. • Different stimuli can be conditioned to different modes. • Variations in learning can occur across species.
Constraints and Predispositions • Constraints • When learning occurs less rapidly or less completely than expected. • Predispositions • Instances where learning occurs more rapidly or in a different form than expected.
Constraints: Animal Misbehavior • Breland & Breland (1961) initiated the use of instrumental procedures to teach exotic behaviors to animals. • Animals learned the behaviors, but began showing instinctive behaviors. • Instinctive drift • When instrumental behavior deteriorates despite continued reinforcement due to the elicitation of instinctive behaviors.
IQ Zoo • Animal misbehavior may also be a result of Pavlovian conditioning. • For misbehavior to develop, stimuli resembling the natural cues must be consistently paired with the reinforcer and must reinforce naturally occurring species-typical behavior.
Constraints: Schedule-Induced Behavior • Skinner (1948) found that reinforcing pigeons on an FI-15 sec. schedule resulted in stereotyped behavior. • Superstitious behavior • Skinner thought the "ritualistic" stereotyped patterns developed because the behavior occurring at the moment of food delivery was accidentally associated with reinforcement (a toy simulation of this idea appears below). • An alternative view is that the ritualistic behavior fills the "behavioral vacuum" between rewards.
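A minimal sketch can make the adventitious-reinforcement account concrete. Everything below is assumed for illustration (the response names, the 200 deliveries, and the 0.3 strengthening increment are not from the source): whichever arbitrary response happens to coincide with a response-independent food delivery is slightly strengthened, and over many deliveries one "ritual" comes to dominate.

```python
import random

# Toy model of adventitious ("superstitious") reinforcement.
# Assumed parameters: five arbitrary responses, equal starting weights,
# food delivered periodically regardless of behavior, small strengthening bump.
random.seed(1)
responses = ["turn", "peck-corner", "head-toss", "wing-flap", "hop"]
weights = {r: 1.0 for r in responses}

def emit():
    """Emit a response in proportion to its current weight."""
    return random.choices(responses, weights=[weights[r] for r in responses])[0]

for delivery in range(200):            # 200 response-independent food deliveries
    behavior_at_food = emit()          # whatever the bird happened to be doing
    weights[behavior_at_food] += 0.3   # the accidental pairing strengthens that response

print(sorted(weights.items(), key=lambda kv: -kv[1]))
# One arbitrary response typically ends up far stronger than the rest:
# a stereotyped "ritual" emerges without any real response-reinforcer contingency.
```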
• Terminal behavior • The behavior that precedes reinforcement when an animal is reinforced on an interval schedule of reinforcement. • Interim behavior • The behavior following reinforcement when an animal is reinforced on an interval schedule of reinforcement. • Schedule-induced behavior • The high levels of instinctive behavior that occur following reinforcement on an interval schedule.
Schedule-Induced Polydipsia (SIP) • High levels of water consumption following food reinforcement on an interval schedule. • Some aspect of the periodic delivery of food produces the excessive drinking. • Has been observed in rats, pigeons, and nonhuman primates. • Usually occurs immediately following reinforcement and decreases as the time for the next reinforcement nears.
[Figure: water intake in the SIP group versus the control group]
The Nature of Schedule-Induced Behavior • Riley and Wetherington (1989) proposed that schedule-induced instinctive behavior is a product of periodic reinforcement. • The relative insensitivity of schedule-induced drinking to taste-aversion conditioning is compelling evidence of this.
Does Schedule-Induced Behavior Occur in Humans? • Gilbert (1974) suggested that interval schedules could be responsible for some people's excessive alcohol consumption. • Rats on interval schedules show excessive consumption of ethanol and of cocaine solutions.
[Slide cartoon: the "arm-chair quarterback" gets blitzed by sipping Budweiser during each inter-potato-chip interval]
• Schedule-induced polydipsia in animals is a good model for excessive alcohol consumption in humans. • Smoking and eating are other types of schedule-induced behavior seen in humans. • The evidence is weaker in humans, and naturally occurring schedule-induced behavior develops more slowly in humans than in animals.
Schedule-Induced Anorexia • Rats fed once per day at the same time • A periodic 24-hour schedule • Free access to a running wheel (or not)
Predispositions and Taste Aversion • Preparedness? • Avoidance of a flavor that precedes an illness experience (implies selectivity). • Long-delay learning? • The association of a flavor with an illness that occurred even several hours after the flavor was consumed. • Is this a special type of learning?
The Selectivity of Flavor Aversion Learning • Garcia and Koelling (1966) demonstrated this phenomenon in a groundbreaking study with rats. • Seligman (1970): rats possess an evolutionary preparedness for flavor aversion learning. • Rats can also associate an environmental cue with illness. • Taste cues, however, are especially salient.
• Birds acquire visual aversions more readily than taste aversions. • Birds rely more heavily on the visual system to find food. • They search for food during the day. • Rats are nocturnal and rely more heavily on gustatory information to find food.
Long-delay learning? • Concurrent interference • The prevention of learning when a stimulus intervenes between the conditioned and unconditioned stimuli, or when a behavior occurs between the operant response and reinforcement. • Long-delay learning occurs in the absence of concurrent interference. • Interference is observed if other tastes are introduced (see the sketch below).
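The concurrent-interference idea can be expressed as a toy, single-trial model. The decay constant, interference factor, and flavor counts below are assumptions made only for illustration: aversion to the target flavor declines gently over hours-long delays, but each novel taste interpolated during the delay captures a large share of the association.

```python
import math

# Toy illustration of long-delay taste-aversion learning with concurrent interference
# (assumed, illustrative parameters only).
def aversion_strength(delay_hours, interpolated_tastes=0, decay=0.1, interference=0.6):
    """Strength of the aversion to the target flavor after one flavor-illness pairing.

    delay_hours: time between tasting the target flavor and illness onset.
    interpolated_tastes: number of other novel flavors tasted during the delay;
    each is assumed to capture a share of the association.
    """
    base = math.exp(-decay * delay_hours)              # mild decay over hours, not seconds
    return base * (1 - interference) ** interpolated_tastes

print(round(aversion_strength(6), 2))                        # ~0.55: long delay, no other tastes -> strong aversion
print(round(aversion_strength(6, interpolated_tastes=2), 2)) # ~0.09: intervening tastes -> much weaker aversion
```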
Flavor Aversion Learning in Humans • Many children in the early stages of cancer develop aversions to flavors consumed before toxic chemotherapy. • Adult and child cancer patients receiving radiation therapy develop similar aversions. • These aversions cause weight loss in these individuals. • "Mapletoff" ice cream can serve as a surrogate (scapegoat) target for the aversion.
Imprinting • Imprinting • The development of a social attachment to stimuli experienced during a sensitive period of development. • Lorenz (1952) found that infant birds form attachments to the first moving object they encounter. • Birds may imprint on inanimate objects as well as on animals of another species. • Conditions influence the likelihood of imprinting.
Conditions Influence Imprinting • Klopfer (1971) found that ducklings imprinted more readily to a moving object than to a stationary object. • Harlow (1971) studied this phenomenon in nonhuman primates using cloth surrogate mothers. • Ainsworth (1982) has studied the effect of imprinting on human infants. • Imprinting can still occur after the sensitive developmental period when sufficient experience is given. • The sensitive period for attachment differs among species.
Sexual Preference • The sexual preference of many birds is established during the sensitive period. • The imprinted object does not have to be of the same species. • Sexual preference, thus, does not depend upon sexual reinforcement. • It is not modified even after sexual experience with another bird species.
Nature of Imprinting • Moltz (1960, 1963) proposed that Pavlovian and operant conditioning are responsible for social imprinting. • Before the fear system develops, chicks orient to large objects, familiar and unfamiliar (e.g., the mother), with low levels of arousal. • Once the fear system develops, unfamiliar objects elicit high arousal. • Chicks cling to their mother because she is associated with lower levels of arousal (a toy sketch of this account follows).
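Moltz's account can be summarized in a small sketch. The object names and arousal values are assumptions for illustration only: during the sensitive period the first conspicuous object acquires a low-arousal association; once the fear system matures, unfamiliar objects evoke high arousal, so the chick approaches the familiar, low-arousal object.

```python
# Toy sketch of Moltz's conditioning account of imprinting (illustrative values only).
familiar = "mother-object"            # the object encountered during the sensitive period
low_arousal = {"mother-object": 0.2}  # low-arousal association acquired before fear developed

def arousal(obj):
    """After the fear system matures, unfamiliar objects elicit high arousal."""
    return low_arousal.get(obj, 0.9)

def chick_approaches(objects):
    """The chick approaches whichever object is associated with the lowest arousal."""
    return min(objects, key=arousal)

print(chick_approaches(["mother-object", "novel-object"]))  # -> 'mother-object'
```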
• Fear reduction may be associated with "mama" in young human and nonhuman primates. • Harlow (1971) used inanimate surrogate mothers of different forms (both wire and cloth) for nonhuman primate infants. • Primates clung to the cloth surrogate in the presence of a dangerous object, but fled when only the wire surrogate was present.
• The research of Ainsworth and colleagues: • Secure relationship • The strong bond established between an infant and a mother who is sensitive and responsive to her infant. • Anxious relationship • The relationship that develops between a mother and her infant when the mother is indifferent to her infant.
• Instinctive view of imprinting • The view that imprinting is a genetically programmed form of learning. • Kovach and Hess (1963) found that, despite the administration of electric shock, chicks still approached the imprinted object. • Punishment may not inhibit imprinting. • Primate infants clung to abusive "monster mothers" even as they were abused (Harlow, 1971).
Brain Mechanisms of Reward • Electrical brain stimulation • Brain stimulation can be used as reinforcement. • Many species will act to earn brain stimulation. • Medial forebrain bundle, or MFB (James Olds)
Characteristics • The brain's reinforcement system has four characteristics: • Highly reinforcing • Motivates ongoing behavior • Its functioning is stimulated by the presence of reinforcers • Its reinforcing effects are enhanced by deprivation
• Stimulus-bound behavior • Behavior tied to the prevailing environmental conditions and elicited by stimulation of the brain's reinforcement center. • Activation of the MFB reinforces behavior. • Watching erotic films leads to stimulation of the MFB and to sexual activity. • MFB stimulation has also been shown to eliminate pain in cancer patients. • It produces a strong euphoria that lasts several hours.
“Liking” versus “Wanting” • Hypothesis #1: Hedonia. Dopamine (DA) acts as a "pleasure neurotransmitter." • Drugs of abuse increase dopamine activity, particularly during the "high" or euphoric state. • However, not all rewards or pleasurable things involve activation of the reward system. • Hypothesis #2: Incentive salience. "Wanting" ("do it again," or craving) stands out as a more plausible role for dopamine (a toy sketch follows).
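One way to make the two hypotheses concrete is a small sketch loosely in the spirit of incentive-salience accounts; the gain term and the numbers are assumptions for illustration, not a model from the source. "Wanting" is treated as learned reward value scaled by a dopamine-related gain, while "liking" (hedonic impact) is left independent of that gain, so craving can grow without a matching increase in pleasure.

```python
# Toy sketch: "wanting" as learned value scaled by a dopamine-related gain (illustrative only).
def wanting(learned_value, dopamine_gain):
    return learned_value * dopamine_gain

liking = 0.3          # assumed modest hedonic impact ("liking") of the reward
learned_value = 0.8   # assumed strong learned predictive value of the reward's cues

print(wanting(learned_value, dopamine_gain=1.0))  # 0.8  baseline craving
print(wanting(learned_value, dopamine_gain=2.5))  # 2.0  sensitized dopamine system -> intense craving
print(liking)                                     # 0.3  "liking" need not rise with "wanting"
```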
Mesolimbic Reinforcement System • The MFB is only part of the brain's reinforcement center. • Mesolimbic reinforcement system • A central nervous system structure that mediates the influence of reinforcement on behavior. • Tegmentostriatal pathway and nigrostriatal pathway.
• Tegmentostriatal pathway • A neural pathway that begins in the lateral hypothalamus, goes through the MFB and ventral tegmental area, terminates in the nucleus accumbens, and governs the motivational properties of reinforcers. • Ventral tegmental area (VTA) • A structure in the tegmentostriatal reinforcement system that projects to the nucleus accumbens.
• Nucleus accumbens (NA) • A basal forebrain structure that plays a significant role in the influence of reinforcement on behavior. • Nigrostriatal pathway • A neural pathway that begins in the substantia nigra, projects to the neostriatum, and serves to facilitate reinforcement-induced enhancement of memory consolidation.
• One line of evidence of DA influence is the powerful reinforcing properties of amphetamine and cocaine, which increase the level of activity at dopaminergic receptors. • Animals quickly learn behaviors to self-administer amphetamine and cocaine. • Natural reinforcers (e.g., food and water) also trigger dopamine release, as does MFB stimulation.
Opiate Activation of the Tegmentostriatal Pathway • Opiate drugs also stimulate the tegmentostriatal pathway. • Animals learn to self-administer opiate drugs such as heroin and morphine. • Opiate drugs do not act directly on DA receptors. • Activation of both DA receptors and opiate receptors, however, produces activity in the NA.