Minds and Machines Summer 2011 Wednesday, 07/20
Qualitative States
• Conscious or experiential states.
• There's something it's like to be in such states.
• For example, there is no special way it feels to be 6 feet tall. A statue can be 6 feet tall without feeling anything. But there is a special way it "feels" to be in (e.g. sharp or dull) pain or to visually experience a red tomato.
• When mental states have a distinctive conscious character like this, we say that they are qualitative states, and we call their distinctive "feel" or conscious character their qualitative character. (Other lingo: "phenomenal states" and "phenomenal character.")
Representational States
• Many mental states are representational: they're about things.
• For example, my belief that NYU is located in New York City is about NYU and about New York City.
• My desire to eat ice-cream is about eating and about ice-cream.
Intentionality
• Aboutness: the power of minds to be about, to represent, or to stand for things, properties, and states of affairs.
• Derives from the Latin word intentio, which in turn derives from the verb intendere, which means being directed towards some goal or thing.
• Not to be confused with "intention" and "intension(ality)".
Brentano's Thesis
• Contemporary discussions of the nature of intentionality were launched by Franz Brentano.
Brentano's Thesis
"Every mental phenomenon is characterized by what the Scholastics of the Middle Ages called the intentional (or mental) inexistence of an object, and what we might call, though not wholly unambiguously, reference to a content, direction toward an object (which is not to be understood here as meaning a thing), or immanent objectivity. Every mental phenomenon includes something as object within itself, although they do not all do so in the same way. In presentation, something is presented, in judgment something is affirmed or denied, in love loved, in hate hated, in desire desired and so on."
"This intentional inexistence is characteristic exclusively of mental phenomena. No physical phenomenon exhibits anything like it. We can, therefore, define mental phenomena by saying that they are those phenomena which contain an object intentionally within themselves."
Characteristics of Intentionality: Intentional Inexistence
• The possible non-existence of the object of an intentional item.
• For example, I cannot want without wanting something, but what I want need not exist for me to want it.
• People have believed in Zeus, and children often believe in Santa Claus. Even though these things don't exist, such beliefs still seem to have an object or to be about something.
Characteristics of Intentionality: Incomplete Representation
• Representational states need not specify every detail of the objects they are about.
• For example, my memory of my high school girlfriend may not represent her as having brown eyes (nor need it represent her as not having brown eyes).
Characteristics of Intentionality: Incomplete Representation
• A representational state may be about an item, without being about any particular item.
• For example, I may desire a coke or an ice-cream cone without desiring some particular coke or ice-cream cone.
Characteristics of Intentionality: Incomplete Representation
• Representational states may represent that the F is a certain way without representing that the G is a certain way, even if the F is the G.
• For example, Lois Lane believes that the super-hero is strong, but she does not believe that the reporter is strong. Yet the super-hero is the reporter!
Characteristics of Intentionality: Incomplete Representation
• This kind of incompleteness (along with intentional inexistence) seems to be distinctive of intentional relations like believing, desiring, and so on.
• Contrast this with other, non-intentional relations like kissing or kicking. If Lois kisses or kicks the super-hero, she thereby kisses or kicks the reporter. Moreover, the super-hero/reporter must exist for her to do this!
• But Lois can believe that the super-hero is strong without thereby believing that the reporter is strong. Moreover, the super-hero/reporter need not exist for Lois to believe this.
Characteristics of Intentionality: Direction of Fit
• Mind-to-World: The mental state functions to represent the world as it is, e.g. beliefs, perceptions.
• World-to-Mind: The mental state functions to represent a non-actual state of affairs to be brought about, e.g. intentions, desires, plans.
Original vs. Derived Intentionality
• Linguistic items such as words and sentences, and perhaps other sorts of representations like pictures, charts, and films, are only derivatively about anything.
• They depend for their intentionality on their being the creations and tools of creatures with minds.
• What makes the particular sequence of ink marks on the next line
Nietzsche is dead.
about Nietzsche is its role as a sentence in a language used by people who have beliefs about Nietzsche, and wish to communicate about Nietzsche.
Original vs. Derived Intentionality
• This suggests that the intrinsic or original intentionality of mental states is a very special feature, the source of all meaning in the world.
• Sentences, pictures, diagrams (and so on) are, in effect, extensions of the minds of those who use them, having no intrinsic meaning but only the meaning they derive from their use.
Mark of the Mental: Intentionality
• Philosophers have tried to find a single feature that all mental states and processes have, and that all non-mental states and processes lack. If we found such a feature, it would be a mark of the mental.
• One proposal is that (original) intentionality, or being representational, is a mark of the mental. But qualitative states are clearly mental states, and it's controversial whether every qualitative state is also a representational state.
Mark of the Mental: Experience
• A second proposal is that being conscious (or being a qualitative state) is a mark of the mental.
• But there do seem to be examples of mental states that aren't conscious, e.g. unconscious motives and beliefs. So this proposal is also controversial.
Mark of the Mental: Self-Knowledge/Self-Awareness
• A third proposal is that (the possibility of) having a special kind of knowledge or awareness of a state is what marks it as mental.
• For example, I can know or be aware of my pains in a way that's very different from my way of knowing or becoming aware of my height.
• Anything that we can know or be aware of in this special way would count as a part of the mind.
• But it's unclear what this special knowledge/awareness consists in, and whether we have such special knowledge/awareness of all our mental states.
Searle's Chinese Room
• We sometimes ascribe intentional states to computers and other artifacts.
• For example, you might say that your chess-playing computer wants to castle king-side or that your computer is thinking about the best move.
• But this is an example of what we called derived intentionality. We're just "reading" intentionality into the computer, just as we may "read" emotion into a doll by describing it as sad.
• According to the functionalist, if we have a computer running a sophisticated enough program, then the computer will have mental states with original intentionality. This is the view Searle wants to refute.
Searle's Chinese Room: Refresher
• Searle inhabits a room that contains a detailed rule book for how to manipulate Chinese symbols.
• He does not know what the symbols mean, but he can distinguish them by their shape.
• If you pass Chinese symbols into the room, he will manipulate them according to the rules, and pass back a different set of Chinese symbols.
• This results in what appears to be an intelligible conversation in Chinese.
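As a minimal sketch of the setup, the rule book can be pictured as a lookup table pairing incoming symbol strings with outgoing ones. The two entries below are an invented toy fragment (a real rule book would have to handle open-ended conversation), but the key point survives: every step matches symbols purely by shape, never by meaning.

# A toy fragment of a "rule book": incoming symbol strings paired with
# outgoing ones. The entries are invented for illustration.
RULE_BOOK = {
    "你好吗": "我很好",
    "你是谁": "我在房间里",
}

def operate_room(incoming: str) -> str:
    """Match the incoming symbols purely by their shapes and pass back
    whatever the book dictates. No step consults what a symbol means."""
    return RULE_BOOK.get(incoming, "")

print(operate_room("你好吗"))  # prints 我很好, with no understanding anywhere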
Searle's Chinese Room
• Searle does not understand Chinese, even though he is manipulating symbols according to the rules in the book.
• So manipulating symbols according to such rules is not enough, by itself, to enable one to understand Chinese.
• Searle concludes:
"Such intentionality as computers appear to have is solely in the minds of those who program them and those who use them, those who send in the input and those who interpret the output. The aim of the Chinese room example was to try to show this by showing that as soon as we put something into the system that really does have intentionality (a man), and we program him with the formal program, you can see that the formal program carries no additional intentionality. It adds nothing, for example, to a man's ability to understand Chinese."
The Systems Reply
• Searle does not himself implement the Chinese room software. He is only part of the machinery.
• The system as a whole (which includes Searle, the book of instructions, Searle's scratch paper, and so on) is what implements the Chinese room software.
• The functionalist is only committed to saying that this system as a whole understands Chinese. It is compatible with this that Searle does not understand Chinese.
The Systems Reply
• Searle responds to the Systems Reply like this:
"My response to the systems theory is quite simple: let the individual internalize all of these elements of the system. He memorizes the rules in the ledger and the data banks of Chinese symbols, and he does all the calculations in his head. The individual then incorporates the entire system. There isn't anything at all to the system that he does not encompass. We can even get rid of the room and suppose he works outdoors. All the same, he understands nothing of the Chinese, and a fortiori neither does the system, because there isn't anything in the system that isn't in him. If he doesn't understand, then there is no way that the system could understand because the system is just a part of him."
The Systems Reply
• But Searle's crucial claim, "he understands nothing of the Chinese, and a fortiori [for an even stronger reason!] neither does the system, because there isn't anything in the system that isn't in him", rests on a dubious form of inference.
• Consider: He doesn't weigh five pounds, and a fortiori neither does his heart, because there isn't anything in his heart that isn't in him. Or: he wasn't designed by the Pentagon, and a fortiori neither was the Chinese room system, because there isn't anything in the system that isn't in him.
The Systems Reply
• Searle focuses on spatial location, the fact that the system runs inside him. But the important relationship is that of implementation.
• Compare: running a computer game on Windows. The game can crash without Windows crashing. Conversely, Windows can run Word without the game's running Word.
• Similarly, the fact that Searle fully incorporates the Chinese room software does not imply that Searle shares all the states of the Chinese room software, nor that the software shares all of his states.
• If the Chinese room software is in a state of understanding what a certain Chinese symbol says, that does not imply that Searle is also in that state.
The Systems Reply
• The fact that the Chinese room software is spatially inside Searle just means that the Chinese room software and the Searle software (i.e. the software that captures Searle's mind) are run on the same hardware (Searle's brain).
• It does not mean that any states of the one are thereby states of the other.
• According to the functionalist, Searle's body houses two distinct intelligent systems: the Chinese room system and the Searle system. The former is implemented by the latter. But that does not imply that all states of the former are states of the latter (and vice versa).
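The game-on-Windows point can be put in a few lines of code. This is only an illustrative sketch with invented names: the host function stands in for the implementing system, and the program string it interprets stands in for the implemented one. The guest ends up in a "crashed" state that the host is never in.

# A minimal sketch of the implementation relation, with invented names.
def run_guest(program: str) -> None:
    try:
        exec(program)                  # the guest runs "inside" the host
    except ZeroDivisionError:
        print("guest crashed")         # a state of the guest...
    print("host still running")        # ...that is not a state of the host

run_guest("x = 1 / 0")
# prints: guest crashed
#         host still running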
Block's Troubles with Functionalism
"Imagine a body externally like a human body, say yours, but internally quite different. The neurons from sensory organs are connected to a bank of lights in a hollow cavity in the head. A set of buttons connects to the motor-output neurons. Inside the cavity resides a group of little men. Each has a very simple task: to implement a 'square' of an adequate machine table that describes you. On one wall is a bulletin board on which is posted a state card, i.e., a card that bears a symbol designating one of the states specified in the machine table. Here is what the little men do: Suppose the posted card has a 'G' on it... Suppose the light representing input I17 goes on. One of the G-men has the following as his sole task: when the card reads 'G' and the I17 light goes on, he presses output button O191 and changes the state card to 'M'... In spite of the low level of intelligence required of each little man, the system as a whole manages to simulate you because the functional organization they have been trained to realize is yours…"
Block's Troubles with Functionalism
"Suppose we convert the government of China to functionalism, and we convince its officials to realize a human mind for an hour. We provide each of the billion people in China…with a specially designed two-way radio that connects them in the appropriate way to other persons and to the artificial body mentioned in the previous example. We replace each of the little men with a citizen of China plus his radio. Instead of a bulletin board we arrange to have letters displayed on a series of satellites placed so that they can be seen from anywhere in China. The system of a billion people communicating with one another plus satellites plays the role of an external 'brain' connected to the artificial body by radio."
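The "machine table" in Block's scenarios is just a finite-state transition table, and the one transition he spells out can be written directly in code. Here is a minimal sketch using the labels from the quote (state 'G', input I17, output O191, next state 'M'); each little man, or each Chinese citizen plus radio, is responsible for exactly one entry.

# (current state, input) -> (output button, next state card).
# Only the one "square" Block spells out is filled in; a table that
# actually described you would cover every state/input pair.
TABLE = {
    ("G", "I17"): ("O191", "M"),
}

def step(posted_state: str, lit_input: str) -> tuple[str, str]:
    """One homunculus's whole job: read the posted state card and the
    lit input light, press the dictated button, post the new card."""
    return TABLE[(posted_state, lit_input)]

print(step("G", "I17"))  # -> ('O191', 'M')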
Block's Troubles with Functionalism
• Block argues that in both cases, the functionalist is committed to saying that the system has mental states.
• But, prima facie, these systems don't seem to have any mental states.
• Even if we granted that the Homunculi-head or the China-brain had intentional states, Block says that it's especially doubtful that they have any qualitative states. (This focus on qualitative states is one main respect in which Block's argument differs from Searle's.)
Putnam's Reply
• Putnam suggests that we emend functionalism, so that it says that creatures with the same functional organization as you will have the same mental states as you if and only if the internal mechanisms which realize their functional organization do not depend on the activities of things which themselves have minds.
• But this emendation seems (i) ad hoc, and (ii) too strong (tiny-creatures example).
Problems with appeals to intuition
• In many cases scientific results have convinced us to accept extremely unintuitive claims, e.g. that the earth is round, that two parallel lines can intersect, even that a cat can be both dead and alive at the same time!
• The fact that attributing mentality to the previous imaginary systems seems unintuitive shows something about our concepts or ways of thinking about mental states.
• But our ordinary concepts may be wrong! It may still be true that these systems have intentional/qualitative states.
Problems with appeals to intuition
• In another article, Block claims that for Searle's argument to work, Searle must show that there isn't sufficient scientific evidence for regarding thinking as formal symbol manipulation. Showing this would establish that we can rely on our intuitions, i.e. that our ordinary concepts are basically OK.
• Since Block believes that there is such evidence, he rejects Searle's conclusion.
• But given that there isn't any scientific evidence for thinking that experiencing is just manipulating symbols, Block still thinks that his own arguments work in the case of qualitative states.