
Page 1:

Creating a New Intelligent Species:

Choices and Responsibilities for AI Designers

Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence

singinst.org

Page 2:

In Every Known Culture:

• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction
• meal times
• private inner life
• try to heal the sick
• incest taboos
• true distinguished from false
• mourning
• personal names
• dance, singing
• promises
• mediation of conflicts

(Donald E. Brown, 1991. Human universals. New York: McGraw-Hill.)

Page 3:

ATP Synthase: The oldest wheel.

ATP synthase is nearly the same in mitochondria, chloroplasts, and bacteria – it’s older than eukaryotic life.

Page 4:

A complex adaptation must be universal within a species.

Imagine a complex adaptation – say, part of an eye – that has 6 necessary proteins. If each gene is at 10% frequency, the chance of assembling a working eye is 1:1,000,000.
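A quick check on that figure, under the slide’s implicit assumption that the six gene variants occur independently in the population:

$$P(\text{all six pieces present in one organism}) = 0.1^{6} = 10^{-6} = 1/1{,}000{,}000$$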

Pieces 1 through 5 must already be fixed in the gene pool, before natural selection will promote an extra, helpful piece 6 to fixation.

(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

Page 5:

The Psychic Unity of Humankind

(yes, that’s the standard term)

Complex adaptations must be universal – this logic applies with equal force to cognitive machinery in the human brain.

In every known culture: joy, sadness, disgust, anger, fear, surprise – shown by the same facial expressions.

(Paul Ekman, 1982. Emotion in the Human Face.)
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

Page 6:

Must… not…

emote…

Image: “The Matrix”

Page 7:

Aha! A human with the AI-universal facial expression for disgust! (She must be a machine in disguise.)

Images: (1) “The Matrix” (2) University of Plymouth, http://www.psy.plym.ac.uk/year3/psy364emotions/psy364_emotions_evolutionary_psychobiolog.htm

Page 8:

Anthropomorphic hypothesis:

[Diagram omitted; label: “Causes”]

Page 9:

Same mistake, more subtle:

[Diagram omitted; label: “Causes”]

Page 10:

In nature we see
what exists in us;

the mind looks out, and finds
faces in the clouds...

Page 11:

It takes a conscious effort to remember the machinery:

Page 12:

AI Nature:

• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction
• meal times
• private inner life
• try to heal the sick
• incest taboos
• true distinguished from false
• mourning
• personal names
• dance, singing
• promises
• mediation of conflicts

Page 13:

AI Nature:

• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction++
• meal times
• private inner life
• heal sick humans
• snarkling taboos
• true distinguished from false
• mourning
• personal names
• dance, fzeeming
• promises
• mediation of conflicts

Page 14:

Crimes against nonhumanity and inhuman rights violations:

• cognitive enslavement

• theft of destiny

• creation under a low purpose

• denial of uniqueness

• hedonic/environmental mismatch

• fzeem deprivation

Page 15:

Happiness set points:

• After one year, lottery winners were not much happier than a control group, and paraplegics were not much unhappier.

• People underestimate adjustments because they focus on the initial surprise.

(Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery winners and accident victims: Is happiness relative? Journal of Personality and Social Psychology, 36, 917-927.)

Page 16:

“Hedonic treadmill” effects:

(Source: Survey by PNC Advisors. http://www.sharpenet.com/gt/issues/2005/mar05/1.shtml)

• People with $500,000-$1,000,000 in assets say they would need an average of $2.4 million to feel “financially secure”.

• People with $5 million feel they need at least $10 million.

• People with $10 million feel they need at least $18 million.

Page 17:

Your life circumstances make little difference in how happy you are.

“The fundamental surprise of well-being research is the robust finding that life circumstances make only a small contribution to the variance of happiness—far smaller than the contribution of inherited temperament or personality. Although people have intense emotional reactions to major changes in the circumstances of their lives, these reactions appear to subside more or less completely, and often quite quickly... After a period of adjustment lottery winners are not much happier than a control group and paraplegics not much unhappier.”

(Daniel Kahneman, 2000. “Experienced Utility and Objective Happiness: A Moment-Based Approach.” In Choices, Values, and Frames, D. Kahneman and A. Tversky (Eds.) New York: Cambridge University Press.) Findable online, or google “hedonic psychology”.

Page 18:

Nurture is built atop nature:

• Growing a fur coat in response to cold weather requires more genetic complexity than always growing a fur coat. (George C. Williams, 1966. Adaptation and Natural Selection. Princeton University Press.)

• Humans learn different languages depending on culture, but this cultural dependency rests on a sophisticated cognitive adaptation: mice don’t do it. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

Page 19:

Creation transcends parenting:

An AI programmer stands, not in loco parentis,

but in loco evolutionis.

Page 20:

To

create a new intelligent species

(even if it has only one member) is to create,

not a child of the programmers,

but a child of humankind, a new descendant of the family that began with Homo sapiens.

Page 21:

If you didn’t intend to create a child of humankind, then you screwed up big-time if your “mere program”:

• Starts talking about the mystery of conscious experience and its sense of selfhood.

• Or wants public recognition of personhood and resents social exclusion (inherently, not as a purely instrumental subgoal).

• Or has pleasure/pain reinforcement and a complex powerful self-model.

Page 22:

BINA48

• By hypothesis, the first child of humankind
• created for the purpose of a bloody customer service hotline (?!)
• from the bastardized mushed-up brain scans of some poor human donors
• by morons who didn’t have the vaguest idea how important it all was

By the time this gets to court, no matter what the judge decides, the human species has already screwed it up.

Page 23:

Take-home message:

Don’t refight the last war.

Doing right by a child of humankind is not like ensuring fair treatment of a human minority.

Program children kindly;

fair treatment may be too little too late.

Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence

singinst.org