
Neobehaviorism



    1. Neobehaviorism Chapter 13

    2. Positivism
    Positivism: The belief that science should study only those objects or events that can be experienced directly. That is, all speculation about abstract entities should be avoided. Theorizing introduces error into science by creating expectations. Watson and the Russian physiologists were positivists.
    - Auguste Comte had insisted that valid information about the world could be attained only through radical empiricism; metaphysical speculation was to be avoided at all costs.
    - Although the mind cannot be studied objectively, the products of the mind can be.

    3. Logical Positivism
    Logical Positivism: The philosophy of science according to which theoretical concepts are admissible if they are tied to the observable world through operational definitions. Science is divided into the empirical and the theoretical, combining rationalism and empiricism.
    Observational Terms: Terms that refer to empirical events.
    Theoretical Terms: Terms that are employed to explain empirical observations.
    - Scientists realized that they could not study ONLY observable phenomena. Physicists had theoretical concepts such as gravity and magnetism, neither of which is observable.
    - Logical positivism was a way to study theoretical concepts by tying them to observable phenomena. Logical positivism wedded empiricism with rationalism.

    4. Operationism
    Percy Bridgman, The Logic of Modern Physics (1927): Every abstract concept in physics must be defined in terms of the procedures used to measure the concept.
    Operational Definition: A definition that relates an abstract concept to the procedures used to measure it.
    Operationism: The belief that all abstract scientific concepts should be operationally defined.
    - Operational definitions tie theoretical terms to observable phenomena, leaving no ambiguity about the meaning of a theoretical term.
    - Operational definitions could be used to turn theoretical concepts in psychology such as drive, learning, anxiety, and intelligence into empirical events and remove any metaphysical speculation.

    5. Physicalism
    Because all sciences follow the same principles, make the same assumptions, and attempt to explain empirical observations, why shouldn't they use the same terminology?
    Physicalism: A belief growing out of logical positivism that all sciences should share common assumptions, principles, and methodologies and should model themselves after physics. They should share a vocabulary as well.
    - One outcome of logical positivism was that all sciences were seen as essentially the same.

    6. Neobehaviorism
    Neobehaviorism: Agreed with older forms of behaviorism that overt behavior should be psychology's subject matter, but disagreed that theoretical speculation concerning abstract entities must be avoided. Such speculation was accepted, provided that the theoretical terms employed are operationally defined and lead to testable predictions about overt behavior.
    Behaviorism + logical positivism = neobehaviorism
    - All neobehaviorists believed the following: if theory is used, it must be used in the ways demanded by logical positivism, and all theoretical terms must be operationally defined.
    - Nonhuman animals should be used as research subjects for two reasons: relevant variables are easier to control than they are for humans, and perceptual and learning processes occurring in nonhuman animals differ only in degree from those processes in humans.
    - The learning process is of prime importance because it is the primary mechanism by which organisms adjust to changing environments.

    7. Edward Chace Tolman
    His father was a member of the first graduating class at MIT and a member of its board of trustees. Guess where Tolman went to school! He studied psychology at Harvard with Robert Yerkes and Hugo Münsterberg. Strongly concerned with the role of consciousness in psychology, he felt much better when Watson's Behavior: An Introduction to Comparative Psychology (1914) was used as a text in Yerkes' class. He also studied with the Gestalt psychologist Kurt Koffka.
    - Tolman, Edward Chace (1886-1959): Created a brand of behaviorism that used mental constructs and emphasized purposive behavior. Although Tolman employed many intervening variables, his most important was the cognitive map.
    - His first faculty position was at Northwestern University, where he was a terrible teacher. In 1918 he was dismissed for a lack of teaching success; he had also written an essay expressing the extreme pacifism his Quaker upbringing had engendered in him. He then took a new position at the University of California, Berkeley.
    - Drives Toward War (1942) was a psychoanalytic explanation of the human motives responsible for war: "As an American, a college professor and one brought up in the pacifist tradition, I am intensely biased against war. It is for me stupid, interrupting, unnecessary, and unimaginably horrible. I write this essay within that frame of reference. In short, I am driven to discuss the psychology of war and its possible abolition because I want intensely to get rid of it."
    - By the time the book came out, Tolman's pacifism had been shaken by war atrocities. He served two years in the Office of Strategic Services (1944-1945).
    - Tolman led a group of faculty members who refused to sign a loyalty oath during the McCarthy era, seeing the oath as an infringement of civil liberties and academic freedom. He was suspended and worked at the University of Chicago and Harvard for a while. Eventually he was reinstated; the university admitted his position had been correct and gave him an honorary doctorate.

    8. Edward Chace Tolman: Purposive Behaviorism
    Molecular Behavior: A small segment of behavior, such as a reflex or a habit, that is isolated for study. Not worthy of study for Tolman.
    Purposive Behavior: Behavior that is directed toward some goal and that terminates when the goal is attained (a.k.a. molar behavior). Worthy of study for Tolman. Purpose is in the behavior itself.
    Purposive Behaviorism: The type of behaviorism Tolman pursued, which emphasizes molar rather than molecular behavior.
    - Tolman needed to use a mentalistic term like "purposive" yet remain a behaviorist. He did this by seeing purpose in the behavior itself rather than inferring purpose from the behavior.
    - Examples of purposive behavior: a rat running a maze, a man driving home, a woman doing her washing. The exact muscles, glands, and motor nerves involved are irrelevant; the purpose is what matters.

    9. Edward Chace Tolman: Tolman's Use of Rats
    Using rats as experimental subjects guards against any possibility of the indirect introspection that could occur with human participants. He dedicated his book Purposive Behavior in Animals and Men (1932) to the white rat. Everything about humans except society and language could be understood by studying rats.
    - Tolman never did any animal experiments at Harvard or Northwestern, but decided to teach comparative psychology at Berkeley.

    10. Edward Chace Tolman: The Use of Intervening Variables
    Intervening Variables: Events believed to occur between environmental and behavioral events. Although intervening variables cannot be observed directly, they are thought to be causally related to behavior. Hull's habit strength and Tolman's cognitive map are examples of intervening variables. Environmental experience gives rise to internal (unobservable) events, which in turn cause behavior.
    - Increasingly, Tolman came to believe that cognitive processes not only exist but are determinants of behavior. He became increasingly mentalistic in his approach, yet insisted on remaining a behaviorist: "I, in my future work, intend to go ahead imagining how, if I were a rat, I would behave."
    - He solved this using intervening variables, which intervene between the environment and behavior. All intervening variables were tied to observable behavior (operationally defined).
    - Purely describing what organisms do is not enough; we should try to understand why they behave as they do.

    11. Edward Chace Tolman: The Use of Intervening Variables (Hypotheses and Expectancies)
    Hypothesis: An expectancy that occurs during the early stages of learning.
    Vicarious Trial and Error: The apparent pondering of behavioral choices in a learning situation.
    Expectancy: A hypothesis that has been tentatively confirmed.
    - Tolman exhaustively developed many intervening variables, but his most famous has to do with cognitive maps.
    - How does a rat learn to solve a maze? A rat is placed in a T-maze set up so that turning left is reinforced with food whereas turning right is not. At some point the animal formulates a weak hypothesis that turning one way leads to food while the other way does not.
    - Early in hypothesis formation, animals may pause at the choice point as though pondering the decision. This pondering was called vicarious trial and error because the rat appears to be "contemplating" outcomes.
    - If early hypotheses are confirmed, the animal will develop an expectancy.

    12. Edward Chace Tolman: The Use of Intervening Variables (Hypotheses, Expectancies, Beliefs, and Cognitive Maps)
    Belief: An expectation that experience has consistently confirmed.
    Cognitive Map: The mental representation of the environment.
    - If an expectancy is consistently confirmed, it will become a belief.
    - Through the process of hypotheses being confirmed and becoming expectancies, which if confirmed become beliefs, the rat develops a cognitive map representing the entire environment and the consequences of various actions. This is a very cognitive notion of learning.
    - Hypotheses, expectancies, beliefs, and cognitive maps intervene between experience and behavior. Rather than purely describing behavior, these intervening variables explain the behavior.

    13. Edward Chace Tolman: Position on Reinforcement
    Learning occurs constantly, independent of reinforcement and motivation.
    Confirmation: According to Tolman, the verification of a hypothesis, expectancy, or belief.
    This is S-S psychology, because the rat learns relationships among stimuli (environmental features and reinforcement).
    - Animals learn what leads to what in the environment. Tolman's brand of learning has been called S-S rather than S-R because of what is being learned.

    14. Edward Chace Tolman: Learning versus Performance
    Performance: The translation of learning into behavior.
    Latent Learning: Learning that has occurred but is not translated into behavior.
    Latent Extinction: The finding that animals that passively experience a goal box no longer containing reinforcement extinguish a previously learned response to that goal box significantly faster than animals without such experience.
    - Learning occurs in the absence of reinforcement or motivation. Performance is the combination of previous learning with motivation. Latent learning is shown very nicely on the next slide.
    - In extinction, the animal's expectations change: it comes to expect that following the CS there will be no US.
    - In a demonstration of latent extinction, one group of rats is placed into the empty goal box several times and then given extinction trials, while another group is given normal extinction trials. The behavior of the first group extinguishes much more rapidly.

    15. Edward Chace Tolman: Learning versus Performance (Tolman and Honzik, 1930)
    - Learning remains latent until the organism has a reason to use it.

    16. Clark Leonard Hull
    Trained to be a mining engineer, but polio made that career impossible. Considered becoming a Unitarian minister, but the idea of "attending an endless succession of ladies' teas" dissuaded him. Went to the University of Michigan, received his doctorate from the University of Wisconsin, and accepted a professorship at Yale.
    - Hull, Clark Leonard (1884-1952): Formulated a complex hypothetico-deductive theory in an attempt to explain all learning phenomena.

    17. Clark Leonard Hull: Hypothetico-Deductive Theory
    Hypothetico-Deductive Theory: A set of postulates from which empirical relationships are deduced (predicted). If the empirical relationships are as predicted, the theory gains strength; if not, the theory loses strength and must be revised or abandoned.
    Hull first reviewed the work on learning and summarized it in general statements called postulates. From these postulates he deduced theorems that yielded testable propositions. The theory is self-correcting.
    - Hull borrowed the idea of intervening variables from Tolman. Like Tolman, Hull believed that in order to explain the associations between environmental stimuli and behavior, you must consider internal intervening conditions. Hull thought these intervening factors were primarily physiological.

    18. Clark Leonard Hull: Reinforcement
    Reinforcement: For Hull, drive reduction.
    Drive Reduction: Hull's proposed mechanism of reinforcement. Anything that reduces a drive is reinforcing.
    Habit Strength (sHr): The strength of an association between a stimulus and a response. This strength depends on the number of reinforced pairings between the two.
    - Biological needs create drives, and the diminution of these drives is reinforcing. Drives intervene between environment and behavior.
    - If a response leads to the diminution of a drive in a certain situation, habit strength is said to increase. Habit strength, an intervening variable, is operationally defined by the number of reinforced pairings between an environmental situation and a response.

    19. Clark Leonard Hull: Reaction Potential (sEr)
    Reaction Potential (sEr): For Hull, the probability of a learned response being elicited in a given situation. This probability is a function of the amount of drive and habit strength present.
    - The probability of a learned response being exhibited is a function of habit strength and drive. If either is zero, there is no potential for a reaction to occur.

    20. Clark Leonard Hull: Reaction Potential (sEr)
    His actual formula is WAY more complicated than this: take habit strength, multiply it by some motivational terms, then add or subtract multiple intervening variables that make the potential for behavior increase or decrease. Finally, be sure to admit that we really don't know everything and add an oscillation effect (sOr).
    - To calculate reaction potential properly we need to know many intervening variables, the most important of which is habit strength, sHr.
    - Habit strength is multiplied by drive strength, D, which is measured by the hours of deprivation of a need. It is also multiplied by K, the incentive value of the stimulus (the size of the candy bar), and by V, stimulus-intensity dynamism.
    - From this product the forces acting against reaction potential must be subtracted, including inhibitory strength, sIr, which grows with the number of nonreinforced responses, and reactive inhibition, Ir, the fatigue of the organism.
    - The last variable in his formula is sOr, which accounts for random error.
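
    Hull's full system contains additional terms and thresholds, but the notes above can be gathered into one rough expression. The sketch below is an illustrative simplification only (variable names and example values are chosen here for demonstration, not Hull's notation):

```python
import random

def reaction_potential(habit_strength, drive, incentive_k, dynamism_v,
                       inhibition_sir, reactive_ir, oscillation_sd=0.05):
    """Rough sketch of a Hull-style reaction potential (sEr).

    Excitatory potential is the product of habit strength (sHr), drive (D),
    incentive motivation (K), and stimulus-intensity dynamism (V); inhibition
    (sIr + Ir) is subtracted, and a random oscillation term (sOr) stands in
    for unexplained variability. Not Hull's full postulate set.
    """
    excitatory = habit_strength * drive * incentive_k * dynamism_v
    oscillation = random.gauss(0.0, oscillation_sd)
    return max(0.0, excitatory - (inhibition_sir + reactive_ir) + oscillation)

# If either habit strength or drive is zero, the excitatory product is zero,
# so no response is expected (hypothetical values below).
print(reaction_potential(0.8, 0.9, 0.7, 1.0, 0.10, 0.05))
print(reaction_potential(0.8, 0.0, 0.7, 1.0, 0.10, 0.05))
```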

    21. Edwin Ray Guthrie
    BA and MA from the University of Nebraska; PhD from the University of Pennsylvania; professor at the University of Washington. The Psychology of Learning (1935) was nontechnical, homespun, and full of anecdotes; for Guthrie, any scientific theory should be understandable by undergraduate students.
    - Guthrie, Edwin Ray (1886-1959): Accepted the law of contiguity but not the law of frequency. For him, learning occurs at full strength after just one association between a pattern of stimuli and a response.

    22. Edwin Ray Guthrie: The One Law of Learning
    Law of Contiguity: Guthrie's one law of learning, which states that when a pattern of stimuli is experienced along with a response, the two become associated. In 1959, Guthrie revised the law of contiguity to read, "What is being noticed becomes a signal for what is being done."
    At the time an association is formed between stimulus and response, MANY stimuli are present. An organism cannot possibly form associations with all of them.
    - According to Guthrie, what you do last in a situation is what you will tend to do again: Watson's recency principle.

    23. Edwin Ray Guthrie: One-Trial Learning
    One-Trial Learning: Guthrie's contention that the association between a pattern of stimuli and a response develops at full strength after just one pairing of the two: "a stimulus pattern gains its full associative strength on the occasion of its first pairing with a response."
    Prior to Guthrie, learning theorists had accepted both Aristotle's law of contiguity and his law of frequency. They disagreed about why frequency increases association, but definitely agreed that it did.
    - We have seen one-trial learning before (think 1% chocolate milk), but it was treated as the exception, not the rule.
    - Aristotle: "It is a fact that there are some movements, by a single experience of which persons take the impress of custom more deeply than they do by experiencing others many times; hence upon seeing some things but once we remember them better than others which we may have seen frequently."

    24. Edwin Ray Guthrie: Why Practice Improves Performance
    Movements: Specific responses made to a specific configuration of stimuli. Movements gain their full associative strength after one exposure.
    Acts: Responses made to varying stimulus configurations; an act consists of many movements. Learning an act involves learning a specific response under various conditions.
    Skill: Consists of many acts.
    - The skill of golf requires many acts, such as putting, driving, and playing out of sand traps. Each act requires many movements.

    25. Edwin Ray Guthrie: The Nature of Reinforcement
    Reinforcement does not provide a "satisfying state of affairs"; it preserves the association that preceded it.
    - Watson had hinted at this notion of reinforcement when he noted that a trial in a learning experiment always ends with reinforcement; it is the last thing an organism experiences.
    - When cats learn to escape a puzzle box, each one learns its own unique way of doing so. The stereotyped behavior is repeated when the cats are again placed in the puzzle box: whatever the cat had done last on the previous trial is most likely to be repeated.

    26. Edwin Ray Guthrie: The Nature of Reinforcement
    Reinforcement preserves the association that preceded it.
    - This cat settled on a biting gesture to move the pole.

    27. Edwin Ray Guthrie: Forgetting
    Like learning, forgetting occurs in a single trial, when a new S-R association replaces an old one. All forgetting involves new learning. If a child leaves school after the seventh grade, their memories of seventh grade will remain clear throughout life. How are your memories of seventh grade?

    28. Edwin Ray Guthrie: Breaking Habits
    A habit is an act that has become associated with a large number of stimuli. The more stimuli that elicit the act, the stronger the habit. If you want to change a habit, observe the stimuli that elicit the undesirable act and perform another act in the presence of those stimuli.
    - By performing a new act in the presence of those stimuli, you will "forget" the old act; the new act will be elicited in their presence.

    29. Edwin Ray Guthrie: Punishment
    Punishment is effective not because of pain but because of what it causes the organism to do in the presence of the stimuli. Hitting a dog on the nose with a newspaper may be an effective way to reduce car-chasing behavior; hitting a dog on the rear may be ineffective even if the amount of pain inflicted is equal.

    30. Edwin Ray Guthrie: Drives and Intentions
    Maintaining Stimuli: According to Guthrie, the internal or external stimuli that keep an organism active until a goal is reached (a.k.a. drives or motives). When an organism performs an act that terminates the maintaining stimuli, the act becomes associated with those maintaining stimuli. This explains acts that appear to have intentions.
    - Here we have just another case of the recency principle at work. This association makes behavior APPEAR intentional.

    31. B. F. Skinner: The Troublemaking Youth
    Born in Susquehanna, PA. Punished only once as a youth. Attended Hamilton College in Clinton, NY, where he resented the chapel attendance requirement and was openly rebellious by his senior year. Hated his English composition professor, who was "a great name dropper."
    - Skinner, Burrhus Frederic (1904-1990): A behaviorist who believed that psychology should study the functional relationship between environmental events, such as reinforcement contingencies, and behavior. Skinner's work exemplified positivism.
    - Skinner and his friends printed up posters that read, "Charles Chaplin, the famous cinema comedian, will deliver his lecture 'Moving Pictures as a Career' in the Hamilton College chapel on Friday, October 9," claiming the visit was under the auspices of the disliked professor. They put up posters all over town and called the Utica newspaper. Police roadblocks were needed to control the crowds. The professor wrote an editorial the next day; Skinner said it was the best thing the professor had ever written.
    - At graduation he covered the walls with caricatures of the faculty, and the president of the college had to warn him during intermission to settle down or lose his degree.
    - He graduated from Hamilton College without ever taking a psychology course.

    32. B. F. Skinner: The Author
    He spent a year after college trying his hand at writing: he read the great literary works, built a writer's study in his parents' attic, began to smoke a pipe, and lived in New York's Greenwich Village. CONCLUSION: "I had failed as a writer because I had had nothing important to say, but I could not accept that explanation. It was literature that must be at fault."
    - His early writing was positively reviewed by Robert Frost: "you are worth twice anyone else I have seen in prose this year!"
    - On working in his parents' attic: "The results were disastrous. I frittered away my time. I read aimlessly…listened to the newly invented radio, contributed to the humorous column of a local paper but wrote almost nothing else, and thought about seeing a psychiatrist."

    33. B. F. Skinner: The Academic
    While in Greenwich Village, Skinner had read the works of Pavlov and Watson. He enrolled in the graduate program in psychology at Harvard and finished the PhD in three years, then remained at Harvard as a postdoc for five years. He began teaching at the University of Minnesota and published The Behavior of Organisms (1938), became chair of the psychology department at Indiana University, and finally returned to Harvard, where he remained.
    - "I would rise at six, study until breakfast, go to classes, laboratories, and libraries with no more than fifteen minutes unscheduled during the day, study until exactly nine o'clock at night and go to bed. I saw no movies or plays, seldom went to concerts, had scarcely any dates and read nothing but psychology and physiology."

    34. B. F. Skinner: Skinner's Positivism
    According to Bacon: (1) scientists gather empirical facts, and (2) they infer knowledge (theories) from those facts. Not the other way around! This began the positivist tradition later followed by Comte and Mach. Skinner likewise believed science must rid itself of metaphysical (unobservable) concepts. The scientist determines relationships using functional analysis: if X, then Y, with no speculation as to why.
    - Bacon's main point was that in the formulation of theories, scientists' biases, misconceptions, traditions, and beliefs could manifest themselves and inhibit the search for objective truth.

    35. B. F. Skinner: Functional Analysis of Behavior
    Functional Analysis: Skinner's approach to research, which involves studying the systematic relationship between behavioral and environmental events. Such study focuses on the relationship between reinforcement contingencies and response rate or response probability.
    - The things we call mental events are nothing more than labels for bodily processes: "My position can be stated as follows: What is felt or introspectively observed is not some nonphysical world of consciousness, mind or mental life but the observer's own body."
    - Even if mental events exist, nothing would be gained by studying them, because such study would be subjective. A functional analysis avoids many of the problems associated with studying mental events.
    - Mental events will someday be understood when we figure out which physiological processes are occurring when we think, choose, or will. We use mental terms only out of ignorance of the internal physiological processes to which we are responding.
    - Skinner was a radical behaviorist like Watson; he denied that any mental event could cause behavior.

    36. B. F. Skinner: Operant Behavior
    Respondent Behavior: Behavior that is elicited by a known stimulus (both learned and unlearned reflexes).
    S-R Psychology: The type of psychology insisting that environmental stimuli elicit most, if not all, behavior. The Russian physiologists and Watson were S-R psychologists.
    Instrumental Conditioning: The type of conditioning studied by Thorndike, wherein an organism learns to make a response that is instrumental in producing reinforcement.
    - Respondent behavior is elicited by known stimulation.

    37. B. F. Skinner: Operant Behavior
    Operant Behavior: Behavior that is controlled by its consequences and emitted by an organism rather than elicited by a known stimulus.
    - Skinner, like Thorndike, did not know or care why behavior is controlled by its consequences. Thorndike called such behavior instrumental; Skinner called it operant.
    - Operant behavior is emitted by the organism and controlled by its consequences.
    - Thorndike studied instrumental behavior by measuring time to escape over numerous reinforced trials. Skinner studied operant behavior (in a Skinner box) by allowing animals to respond freely and measuring the effect of reinforcement on response rates.

    38. B. F. Skinner: The Nature of Reinforcement
    If an operant response leads to reinforcement, the rate of that response will increase. Those responses an organism makes that result in reinforcement are most likely to occur when the organism is next in that situation. Operant behavior is controlled by its consequences.
    - Reinforcement can only be identified by its effects on behavior. What is reinforcing to one may not be reinforcing to another, or even to the same organism in a different situation.
    - A reinforcer is ANYTHING that, when made contingent on a response, makes the rate of that response increase.

    39. B. F. Skinner: The Nature of Reinforcement
    - Reinforcement increases the likelihood that the immediately preceding behavior will be repeated.
    - Positive reinforcement: introduce something good. Negative reinforcement: remove something bad.

    40. B. F. Skinner: The Nature of Reinforcement
    - Punishment decreases the likelihood that the immediately preceding behavior will be repeated.
    - Positive punishment: introduce something bad. Negative punishment: remove something good.

    41. B. F. Skinner: The Nature of Reinforcement
    Primary Reinforcer: An innate reinforcer; satisfies a biological need.
    Secondary Reinforcer: A conditioned reinforcer; an event that gains its reinforcing power through its association with a primary reinforcer.
    - Primary reinforcers include food, water, and sex; they function like UCSs.
    - Secondary reinforcers include any kind of token, such as money; they function like CSs.

    42. B. F. Skinner: The Nature of Reinforcement
    Immediate Reinforcer vs. Delayed Reinforcer: Impulsive decision making tends to lead to immediate reinforcers, but at the expense of delayed reinforcers. Self-controlled decision making tends to…

    43. B. F. Skinner: The Nature of Reinforcement
    Continuous Reinforcement: Reinforcing the desired response each time it occurs. Learning occurs rapidly; extinction occurs rapidly.
    Partial Reinforcement: Reinforcing a response only part of the time. Results in slower acquisition but greater resistance to extinction.
    - In reality, a salesman does not sell a car every time, and a fisher does not catch a fish with every cast.

    44. B. F. Skinner: The Nature of Reinforcement
    Fixed Ratio (FR): A schedule that reinforces a response only after a specified number of responses. The faster you respond, the more rewards you get, so it produces a very high rate of responding. Like piecework pay; unions have fought to remove this kind of payment because it exhausts workers.
    - After X responses, the next response will be reinforced.

    45. B. F. Skinner: The Nature of Reinforcement
    Variable Ratio (VR): A schedule that reinforces a response after an unpredictable number of responses. Like gambling or fishing. Very hard to extinguish because of the unpredictability.
    - After some unknown number of responses, the next response is reinforced.

    46. B. F. Skinner: The Nature of Reinforcement
    Fixed Interval (FI): A schedule that reinforces a response only after a specified time has elapsed. Responding occurs more frequently as the anticipated time for reward draws near. The mail comes once per day, but we know not to check the mailbox in the middle of the night.
    - Sometimes when I'm waiting for important mail, I start to check the mailbox to see if the mail has come yet. I know not to check at 6:00 a.m., but as the "normal" time of mail delivery approaches, I become more and more likely to check the box.

    47. B. F. Skinner: The Nature of Reinforcement
    Variable Interval (VI): A schedule that reinforces a response at unpredictable time intervals. Produces slow, steady responding. Like the unpredictable pop quiz that reinforces studying.
    - The first response after some random amount of time is reinforced.

    48. B. F. Skinner: The Nature of Reinforcement
    - On this graph, the y-axis represents the number of pecks made by a pigeon in a Skinner box (or bar presses by a rat); dashes represent reinforcement.
    - Ratio schedules: fixed ratios produce more responding than variable ratios, but both produce more than interval schedules.
    - Interval schedules: FI produces the scallop-shaped curve, characterized by very little responding when the reinforcer is not coming, followed by an increase in responding as the time of reinforcement approaches. VI schedules produce slow but steady responding.
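
    As a rough illustration of how these four schedules differ mechanically, the sketch below (class names and structure invented here for demonstration, not taken from the slides) decides whether a given response earns reinforcement under each rule:

```python
import random

class FixedRatio:
    """FR-n: reinforce every nth response."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """VR-n: reinforce after an unpredictable number of responses averaging n."""
    def __init__(self, n):
        self.n, self.count, self.required = n, 0, random.randint(1, 2 * n - 1)
    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self.count, self.required = 0, random.randint(1, 2 * self.n - 1)
            return True
        return False

class FixedInterval:
    """FI-t: reinforce the first response made after t seconds have elapsed."""
    def __init__(self, t):
        self.t, self.last = t, 0.0
    def respond(self, now):
        if now - self.last >= self.t:
            self.last = now
            return True
        return False

class VariableInterval:
    """VI-t: reinforce the first response after an unpredictable interval averaging t."""
    def __init__(self, t):
        self.t, self.last, self.wait = t, 0.0, random.uniform(0, 2 * t)
    def respond(self, now):
        if now - self.last >= self.wait:
            self.last, self.wait = now, random.uniform(0, 2 * self.t)
            return True
        return False

# Example: on an FR-10 schedule only every 10th response is reinforced.
fr = FixedRatio(10)
print([fr.respond() for _ in range(10)])  # nine False values, then True
```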

    49. B. F. Skinner: The Importance of the Environment
    Watson and the Russian physiologists felt the environment was important because it elicited behavior. Skinner felt the environment was important because it selected behavior: the reinforcement contingencies the environment provides determine which behaviors are strengthened.
    - Skinner followed the Darwinian tradition (functionalists). In any given situation, an organism initially makes many responses. Of those responses, only a few are functional, that is, lead to reinforcement. Effective responses survive and become part of the organism's response repertoire.

    50. B. F. Skinner: The Positive Control of Behavior
    Reinforcement strengthens behavior; punishment does not weaken behavior. Punishment is used because it reinforces the punisher: "Severe punishment unquestionably has an immediate effect in reducing a tendency to act in a given way." Punishment creates fear, aggression, and other negative behaviors. Ignore undesirable behaviors and reinforce desirable ones!
    - "A child who has been severely punished for sex play is not necessarily less inclined to continue; and a man who has been imprisoned for violent assault is not necessarily less inclined toward violence. Punished behavior is likely to reappear after the punitive contingencies are withdrawn."

    51. B. F. Skinner: Application of Skinnerian Principles
    Behavior Therapy: The use of learning principles to treat emotional or behavioral disorders.
    Token Economies: An arrangement within institutions whereby desirable behavior is strengthened using valuable tokens as reinforcers.
    - The single rule of behavior therapy: change the reinforcement contingencies and you change the behavior.
    - Behavior therapy has been used for smoking, alcoholism, drug addiction, mental retardation, juvenile delinquency, speech disorders, shyness, phobias, obesity, and sexual disorders.

    52. B. F. Skinner: Application of Skinnerian Principles
    Shaping: An operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of a desired goal (a.k.a. the method of successive approximations).
    - Skinner taught animals some very complex behaviors. To do this, you initially reward the animal for behaviors that are only roughly like the one you want; then, as time goes on, you reward behavior only as it gets closer and closer to the desired behavior.
    - To teach an elephant to stand on its back legs, I might first reward the elephant whenever it shifts its weight toward the back, then reward it when it sits on its hind feet, then reward it for lifting one foot off the ground, then two feet, then standing. See the sketch below for the same logic in code.
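
    A minimal sketch of this shaping logic, assuming we can score how close each behavior comes to the target on a 0-1 scale (the function names, thresholds, and toy "animal" below are hypothetical illustrations, not Skinner's procedure):

```python
import random

def shape(observe_behavior, deliver_reinforcer, target=1.0, step=0.1, max_trials=10000):
    """Method of successive approximations: start with a lenient criterion and
    raise it each time the current criterion is met. Illustrative sketch only."""
    criterion = step  # at first, reward anything even vaguely in the right direction
    for _ in range(max_trials):
        closeness = observe_behavior()  # 0.0 = nothing like the target, 1.0 = target behavior
        if closeness >= criterion:
            deliver_reinforcer()
            if criterion >= target:
                return True  # the full target behavior has been established
            criterion = min(target, criterion + step)  # demand a closer approximation next time
    return False

# Toy example: each reinforcement nudges the animal's typical behavior toward the target.
state = {"level": 0.0}
trained = shape(
    observe_behavior=lambda: min(1.0, state["level"] + random.uniform(0.0, 0.3)),
    deliver_reinforcer=lambda: state.update(level=min(1.0, state["level"] + 0.1)),
)
print(trained)  # True once successively closer approximations have been reinforced
```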

    53. B. F. Skinner: Application of Skinnerian Principles
    Instinctive Drift: Breland and Breland shaped animals for circuses, TV, and movies. They taught pigs to pick up "tokens" and deposit them in a machine for food. The pigs began rooting the tokens even though doing so led to a longer delay of reinforcement.
