Personhood
What is a person? • Why does it matter? • “Human” rights: do you have to be human to deserve human rights? • Restricted rights? Rights of protection, etc., granted to children, the severely mentally ill, and others who are not granted full human rights • The right to be counted in utilitarian calculations, i.e. to have one’s pleasure and suffering matter morally
What does it take to be a person? 1) Membership in the species Homo sapiens? • People vs. “persons” • Problem cases: brain-dead humans, a talking pig, aliens, apes, computers, robots • Cyborgs (part human, part machine): how much of a human being can be replaced by machinery or artificial parts before personhood is lost?
2) A certain level of intelligence? • How to define it? Merely quantitative, or qualitative, e.g. understanding concepts, having intentionality? • Not necessary? A baby, a mentally disabled person • Not sufficient? An intelligent but not sentient computer, Deep Blue, a “zombie”
3) Being conscious and/or having feelings • Lower animals, e.g. rabbits, chickens • Does consciousness come in degrees? Is a certain degree of consciousness necessary? • How can consciousness be determined? The Problem of Other Minds
4) Moral agency • Kant: “…rational beings are called persons inasmuch as their nature already marks them out as ends in themselves” (1785) • Kant’s requirements of a moral agent: rationality, autonomy, the ability to understand moral judgments and choose to act morally, free will • Problem cases: babies, mentally deficient people, apes, dolphins, computers, robots • Morally responsible vs. morally considerable • Even if only “persons” are moral agents (hence, morally responsible), “non-persons” may be morally considerable
5) Some combination? • Having sufficient intelligence, being a moral agent, being conscious, having free will, (being Homo sapiens)? • What combination would you choose? Are the criteria too strict? • Can fulfilling only some criteria be sufficient, e.g. either being Homo sapiens, or being sufficiently intelligent and conscious?
Artificial Intelligence • Could AI fit all the criteria (except being Homo sapiens)? • Could computers: • Be (truly) intelligent, e.g. understand concepts, have intentionality? • Be conscious? • Have feelings? • Be moral agents? • Have free will (at least to the same degree as we do)? • Is it important to be biological?
The Turing Test • Proposed by Alan Turing in 1950: a machine passes if a human interrogator, conversing with it and with a hidden human only through typed text, cannot reliably tell which is which.
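The imitation-game setup can be sketched in a few lines of Python. This is a toy illustration only: the stub responder functions and the trivial judge are hypothetical stand-ins, not Turing's own formulation.

```python
# Toy sketch of the Turing Test (imitation game). The responders and the
# judge below are hypothetical placeholders for a human player, the machine
# under test, and the interrogator's verdict.

def human_responder(prompt: str) -> str:
    # Stand-in for the hidden human player.
    return "I'd rather talk about the weather."

def machine_responder(prompt: str) -> str:
    # Stand-in for the machine under test (a canned reply).
    return "I'd rather talk about the weather."

def interrogate(responder, questions):
    """The interrogator converses only through text: a list of replies."""
    return [responder(q) for q in questions]

def judge_can_distinguish(replies_a, replies_b) -> bool:
    """A (deliberately crude) judge: can the transcripts be told apart?"""
    return replies_a != replies_b

questions = ["Write me a sonnet.", "What is 2 + 2?"]
human_replies = interrogate(human_responder, questions)
machine_replies = interrogate(machine_responder, questions)

# The machine "passes" when the interrogator cannot distinguish it
# from the human on the basis of the typed conversation alone.
machine_passes = not judge_can_distinguish(human_replies, machine_replies)
print(machine_passes)  # True here, since both give identical canned replies
```

The point of the sketch is structural: only the text channel reaches the judge, so anything about the players' bodies or inner workings is screened off.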
Is the Turing Test sufficient? • The Chinese Room, a thought experiment proposed by John Searle in 1980: a person who understands no Chinese answers Chinese questions by mechanically following a rulebook. Searle argues that such pure symbol manipulation could pass a Turing-style test without any genuine understanding.
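The Chinese Room can be sketched as a lookup table: the "room" maps input symbols to output symbols by rule, with no understanding anywhere in the process. The rulebook entries here are hypothetical placeholders, a toy illustration rather than Searle's full argument.

```python
# Toy sketch of Searle's Chinese Room: answers are produced by pure
# rule-following (a lookup table). The entries are hypothetical examples.

RULEBOOK = {
    "你好吗？": "我很好，谢谢。",   # "How are you?" -> "I'm fine, thanks."
    "你懂中文吗？": "当然懂。",     # "Do you understand Chinese?" -> "Of course."
}

def chinese_room(symbols: str) -> str:
    """Match the incoming squiggles against the rulebook and copy out the
    prescribed reply. Nothing here 'understands' Chinese: it is pure syntax."""
    return RULEBOOK.get(symbols, "请再说一遍。")  # default: "Please say it again."

print(chinese_room("你好吗？"))  # 我很好，谢谢。
```

From the outside the replies may look fluent, which is Searle's point: syntactically correct output is compatible with a total absence of semantics.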
Is the Turing Test necessary? • Aliens, animals, and computers surely don’t have to be exactly like humans (mentally) to be “persons”, to be moral agents, or (at least) to be morally considerable.
Will there come a time when we have to make moral judgments regarding how computers or robots are treated? • Are people just biological computers? If so, could non-biological computers be mentally similar? • If people are something more, e.g. a spirit or special type of substance (e.g. non-physical substance) that has become attached to a biological machine, could a similar spirit or special type of substance become attached to a non-biological machine?
Suggested readings • Asimov, Isaac (1976), The Bicentennial Man (on reserve in the Philosophy Dept. office) • Searle, John R. (1990), “Is the Brain's Mind a Computer Program?”, Scientific American 262, pp. 20-25 (in main library) • Churchland, Paul M., and Patricia Smith Churchland (1990), “Could a Machine Think?”, Scientific American 262, pp. 26-31 (in main library)