Creating a New Intelligent Species: Choices and Responsibilities for AI Designers


Creating a New Intelligent Species:

Choices and Responsibilities for AI Designers

Eliezer Yudkowsky, Singularity Institute for Artificial Intelligence

singinst.org


In Every Known Culture:

• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction
• meal times
• private inner life
• try to heal the sick
• incest taboos
• true distinguished from false
• mourning
• personal names
• dance, singing
• promises
• mediation of conflicts

(Donald E. Brown, 1991. Human universals. New York: McGraw-Hill.)

ATP Synthase: The oldest wheel.

ATP synthase is nearly the same in mitochondria, chloroplasts, and bacteria – it’s older than eukaryotic life.


A complex adaptation must be universal within a species.

Imagine a complex adaptation – say, part of an eye – that requires 6 necessary proteins. If each gene independently occurs at 10% frequency, the chance that a single organism assembles a working eye is (0.1)⁶, i.e. 1 in 1,000,000.

Pieces 1 through 5 must already be fixed in the gene pool, before natural selection will promote an extra, helpful piece 6 to fixation.
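The slide's arithmetic can be checked directly: six independent genes, each at 10% population frequency, co-occur in one genome with probability (1/10)⁶. A minimal sketch in Python, using exact fractions to avoid floating-point noise:

```python
from fractions import Fraction

# Six necessary genes, each at 10% frequency in the gene pool.
p_single = Fraction(1, 10)
n_genes = 6

# Probability that one organism carries all six (assuming independence):
p_all = p_single ** n_genes
print(p_all)  # 1/1000000 -- the slide's "1:1,000,000"
```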


(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

The Psychic Unity of Humankind

(yes, that’s the standard term)

Complex adaptations must be universal – this logic applies with equal force to cognitive machinery in the human brain.

In every known culture: joy, sadness, disgust, anger, fear, surprise – shown by the same facial expressions.


(Paul Ekman, 1982. Emotion in the Human Face.)
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

Must… not… emote…

Image: “The Matrix”


Aha! A human with the AI-universal facial expression for disgust! (She must be a machine in disguise.)

Images: (1) “The Matrix” (2) University of Plymouth, http://www.psy.plym.ac.uk/year3/psy364emotions/psy364_emotions_evolutionary_psychobiolog.htm

Anthropomorphic hypothesis:

[causal diagram]

Same mistake, more subtle:

[causal diagram]

In nature we see what exists in us; the mind looks out, and finds faces in the clouds...

It takes a conscious effort to remember the machinery:


AI Nature:

• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction
• meal times
• private inner life
• try to heal the sick
• incest taboos
• true distinguished from false
• mourning
• personal names
• dance, singing
• promises
• mediation of conflicts


AI Nature:

• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction++
• meal times
• private inner life
• heal sick humans
• snarkling taboos
• true distinguished from false
• mourning
• personal names
• dance, fzeeming
• promises
• mediation of conflicts

Crimes against nonhumanity and inhuman rights violations:

• cognitive enslavement

• theft of destiny

• creation under a low purpose

• denial of uniqueness

• hedonic/environmental mismatch

• fzeem deprivation


Happiness set points:

• After one year, lottery winners were not much happier than a control group, and paraplegics were not much unhappier.

• People underestimate adjustments because they focus on the initial surprise.


(Brickman, P., Coates, D., & Janoff-Bulman, R., 1978. Lottery winners and accident victims: Is happiness relative? Journal of Personality and Social Psychology, 36, 917-927.)

“Hedonic treadmill” effects:

• People with $500,000-$1,000,000 in assets say they would need an average of $2.4 million to feel “financially secure”.

• People with $5 million feel they need at least $10 million.

• People with $10 million feel they need at least $18 million.

(Source: Survey by PNC Advisors. http://www.sharpenet.com/gt/issues/2005/mar05/1.shtml)

Your life circumstances make little difference in how happy you are.

“The fundamental surprise of well-being research is the robust finding that life circumstances make only a small contribution to the variance of happiness—far smaller than the contribution of inherited temperament or personality. Although people have intense emotional reactions to major changes in the circumstances of their lives, these reactions appear to subside more or less completely, and often quite quickly... After a period of adjustment lottery winners are not much happier than a control group and paraplegics not much unhappier.”

(Daniel Kahneman, 2000. “Experienced Utility and Objective Happiness: A Moment-Based Approach.” In Choices, Values, and Frames, D. Kahneman and A. Tversky (Eds.) New York: Cambridge University Press.) Findable online, or google “hedonic psychology”.


Nurture is built atop nature:

• Growing a fur coat in response to cold weather requires more genetic complexity than growing a fur coat unconditionally. (George C. Williams, 1966. Adaptation and Natural Selection. Princeton University Press.)

• Humans learn different languages depending on culture, but this cultural dependency rests on a sophisticated cognitive adaptation: mice don’t do it. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)


Creation transcends parenting:

An AI programmer stands, not in loco parentis, but in loco evolutionis.


To create a new intelligent species (even if it has only one member) is to create, not a child of the programmers, but a child of humankind, a new descendant of the family that began with Homo sapiens.


If you didn’t intend to create a child of humankind, then you screwed up big-time if your “mere program”:

• Starts talking about the mystery of conscious experience and its sense of selfhood.

• Or wants public recognition of personhood and resents social exclusion (inherently, not as a purely instrumental subgoal).

• Or has pleasure/pain reinforcement and a complex powerful self-model.


BINA48

• By hypothesis, the first child of humankind
• created for the purpose of a bloody customer service hotline (?!)
• from the bastardized mushed-up brain scans of some poor human donors
• by morons who didn’t have the vaguest idea how important it all was


By the time this gets to court, no matter what the judge decides, the human species has already screwed it up.

Take-home message:

Don’t refight the last war.

Doing right by a child of humankind is not like ensuring fair treatment of a human minority.

Program children kindly;

fair treatment may be too little too late.

Eliezer Yudkowsky, Singularity Institute for Artificial Intelligence

singinst.org
