Explore the potential impact of GNR technology and the emergence of augmented humans on society and personal identity. Delve into the worries and ethical questions surrounding this technological progress, including the loss of free will and personal identity, societal rifts, and the concept of the Singularity. Contemplate the three key questions: should, can, and will we stop the Age of Augments from coming into being?
Enough Arnold? Cognitive Technology and the Future of Humanity (Minds and Machines)
The Coming of the Augments • “GNR” technology (genetic engineering, nanotechnology, and robotics) may soon reach the point where humans have the ability to fundamentally change the nature of the human species. • TWO (or more) VARIETIES OF PERSONS MIGHT COME TO EXIST: • HUMANS and AUGMENTS
Some Worries • Could lead to a destructive techno arms race • Rifts in society • Horizontal: ‘techno-poor’ vs ‘techno-rich’ • Vertical: generational gap • Today’s technology is outdated tomorrow • … and the pace of ‘progress’ will only accelerate • Our children become a commodity / product • Designer Babies • ‘Playing God’
Further Worries • Loss of Free Will • Parents ‘designed’ us • Loss of Personal Identity • ‘Specs’ are known: Loss of self-exploration, self-motivation, or self-fulfillment • Maybe free will and personal identity are an illusion • Right now I don’t worry about that • But new technology may throw this in our face
The Singularity • Some people believe that the pace of technological change will reach such a rate that we will have to become cyborgs just to make sense of new technology • If we are able to create a being that is smarter than us, imagine what that being could create • Best-known proponent: Ray Kurzweil • In “The Singularity Is Near”, he predicts this will happen around the middle of the 21st century
Three Questions: • 1. Should we stop the “Age of Augments” from coming into being?
Three Questions: • 2. Can we stop the “Age of Augments” from coming into being?
Three Questions: • 3. Will we stop the “Age of Augments” from coming into being?
Three Questions Bound Together • Will implies Can: if we will stop it, then that must mean we can • Can does not imply Ought: if we can stop it, does that mean we should? And if we cannot, does that mean we should not? • Those are Ought-from-Is fallacies! • Maybe doom is inevitable • Subtle variant: if we cannot stop it, should we therefore not try to stop it?
Three Questions Bound Together • S: We should stop the “Age of Augments” from coming into being • C: We can stop the “Age of Augments” from coming into being • W: We will stop the “Age of Augments” from coming into being
Should we … Can we … Will we … Stop the “Age of Augments”?
S C W
T T T
T T F
T F T
T F F
F T T
F T F
F F T
F F F
Which combination is the “correct” one?
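To make the combinatorics explicit, here is a minimal Python sketch (not from the original slides; the variable names and the “ok / ruled out” labels are my own) that enumerates all eight S/C/W assignments and flags the ones excluded by the “Will implies Can” constraint (W → C) from the previous slide:

```python
from itertools import product

# Enumerate every truth-value assignment to:
#   S: we SHOULD stop the "Age of Augments"
#   C: we CAN stop it
#   W: we WILL stop it
for s, c, w in product([True, False], repeat=3):
    # "Will implies Can": an assignment with W true but C false is incoherent,
    # since actually stopping it presupposes being able to stop it.
    coherent = (not w) or c
    row = " ".join("T" if v else "F" for v in (s, c, w))
    print(f"S C W = {row}   {'ok' if coherent else 'ruled out by W -> C'}")
```

On this reading, only the two rows with W = T and C = F drop out; the S column is left entirely open, which is exactly the point of the Ought-from-Is warning: whether we should stop it cannot be read off from whether we can or will.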
Related Questions: • The “Theoretical Enough” Questions: • Is there ever a point beyond which technological progress no longer implies: • Individual/Personal progress • Societal progress • Species progress • …
Related Questions: • The “Species” Questions: • Do we have the RIGHT to ensure that we remain the dominant species on Earth? • Do we have the OBLIGATION to ensure that we remain the dominant species? • If a new and superior species of Augments comes into being, do humans have the RIGHT to remain as a (non-dominant) species?
Related Questions: • The “No Child Left Behind” Question: • Assume a civilization of Augments becomes the dominant civilization on Earth • Should you as a parent be morally OBLIGATED to have your child undergo augmentation (become an augment) to ensure that she/he will be able to compete successfully with her/his peers?