Transcript
Page 1: Nick Bostrom, Oxford’s Future of Humanity Institute

Professor Nick Bostrom
Faculty of Philosophy

Director, Future of Humanity Institute
James Martin 21st Century School

Oxford University

The Big Picture

Page 2
Page 3

• “At the very apex of the first stratified societies, dynastic dreams were dreamt and visions of triumph or ruin entertained; but there is no mention in the papyri and cuneiform tablets on which these hopes and fears were recorded that they envisaged, in the slightest degree, changes in the material conditions of the great masses, or for that matter, of the ruling class itself.”

• Robert Heilbroner

Page 4

The future of humanity?

[Chart: technological development (vertical axis) against time (horizontal axis, 2007 marked), with bands for the pre-human condition, the human condition, and a posthuman condition labelled “?”]

Page 5

Extinction

[Chart: technological development vs. time (2007 marked); the trajectory ends in extinction before reaching the posthuman condition]

Page 6

Past catastrophes killing >10 million

Catastrophe Century Deaths (millions)

Smallpox 20C 400

Plague of Justinian 6C 100

Black Death (1347–1350) 14C 75

Second World War 20C 55

Great Leap Forward in China (famine) 20C 40

Spanish flu pandemic (1918-1919) 20C 40

Mongol conquests 13C 40

An Shi Rebellion (756–763) 8C 36

HIV/AIDS 20C–21C 27

Fall of the Ming Dynasty 17C 25

Chinese Famine of 1907 20C 24

Taiping Rebellion (1851–1864) 19C 20

Decimation of the American Indians 15C-19C 20

Stalin (famines and purges) 20C 20

Mideast Slave Trade 7C-19C 19

Atlantic Slave Trade 15C-19C 18

Timur Lenk 14C-15C 17

British India (mainly famine) 19C 17

First World War 20C 15

Page 7

[Chart: risks arranged by scope and intensity]

Intensity: Imperceptible / Endurable / Terminal (Hellish?)
Scope: Personal / Local / Global / Trans-generational (Cosmic?)

Personal: loss of one hair / car is stolen / fatal car crash
Local: congestion from one extra vehicle / recession in a country / genocide
Global: 0.01 degree global warming / destruction of the ozone layer / ageing?
Trans-generational: loss of one species of beetle / drastic loss of biodiversity / human extinction

Global Catastrophic Risks: at least global in scope and at least endurable in intensity
Existential Risks: trans-generational in scope and terminal in intensity

Page 8

Existential risk

• One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.

Page 9

No existential catastrophe

• > 99.9% of all species are extinct

• Toba eruption 75,000 years ago: ~500 reproducing females

• Neanderthals extinct 33,000–24,000 years ago

• Homo floresiensis extinct 12,000 years ago?

• Opinions…

• Existential risks on the horizon…

Page 10

Some opinions on net existential risk

• GCR conference poll: 19% (median answer to “Overall risk of human extinction prior to 2100”)

• Stern Review: 0.1% (extinction risk per year)

• John Leslie (1996): 30% (human extinction by 2496, based partly on the doomsday argument and Leslie’s view about how quantum indeterminacy affects this argument)

• Martin Rees: 50% (end of civilization by 2100; note that this need not entail extinction or existential catastrophe)

• Early Bostrom (2002): ≥ 25% (cumulative existential risk, no time limit)

• Richard Gott: 2.5% (humanity extinct within 5,100 years)

• Richard Posner: “significant” (human extinction this century)
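Gott’s figure comes from his “delta t” argument: if our moment of observation is uniformly random within humanity’s total lifespan, then with 95% confidence the remaining duration lies between 1/39 and 39 times the past duration. A minimal sketch, assuming (as Gott did) roughly 200,000 years of Homo sapiens so far:

```python
def gott_future_bounds(past_years, confidence=0.95):
    # Delta-t argument: if "now" is uniformly distributed over the
    # species' total lifespan, then with the given confidence c the
    # remaining lifespan t_future satisfies
    #   past * (1 - c) / (1 + c) <= t_future <= past * (1 + c) / (1 - c)
    c = confidence
    return past_years * (1 - c) / (1 + c), past_years * (1 + c) / (1 - c)

# Roughly 200,000 years of Homo sapiens so far (Gott's assumption).
low, high = gott_future_bounds(200_000)
print(f"95% interval for humanity's future: {low:,.0f} to {high:,.0f} years")
```

The lower bound works out to about 5,100 years, leaving a 2.5% chance that humanity ends sooner, which is where the table’s Gott entry comes from.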

Page 11

• Natural

• Anthropogenic

Page 12

HUMAN EXTINCTION RISKS?

Nanotech weapons systems

Nuclear holocaust

Simulation shutdown

Genetic engineering / synthetic biology

Non-weapons nanotech accident

Natural pandemic

Emissions-caused global warming

Nonspecific conflict

Self-destroying superintelligent AI

Supervolcanic eruption

Physics experiment

Extraterrestrial intelligence

Non-anthropogenic vacuum decay

Kinetic impact

Back-contamination

Other climate change

Space radiation

Human infertility

Other

Page 13

Stagnation or Plateau

[Chart: technological development vs. time (2007 marked); the trajectory levels off within the human condition, short of the posthuman condition]

Page 14
Page 15
Page 16
Page 17
Page 18

Revolutionary technologies?

• Artificial intelligence and superintelligence
• Machine-phase nanotechnology
• Uploading
• Information technology (virtual reality; wearable computers, etc.)
• Anti-aging medicine
• Nootropics
• Mood enhancers
• Genetic engineering
• Collaborative information filtering
• Institutions: information markets…

Page 19

Recurrent collapse

[Chart: technological development vs. time (2007 marked); the trajectory repeatedly rises toward the posthuman condition and collapses back]

Page 20

The longer term

[Chart: technological development vs. time (2007 marked) over a longer timescale, with pre-human and posthuman condition bands]

Page 21

Posthumanity

[Chart: technological development vs. time (2007 marked); the trajectory rises from the pre-human and human conditions into the posthuman condition]

“Posthuman condition”:
• Population greater than 1 trillion persons
• Life expectancy greater than 500 years
• Large fraction of the population has cognitive capacities more than two standard deviations above the current human maximum
• Near-complete control over the sensory input, for the majority of people for most of the time
• Human psychological suffering becoming a rare occurrence
• Any change of magnitude or profundity comparable to that of one of the above

Pages 22–27

If Earth had formed one year ago…

• Homo sapiens evolved less than 12 minutes ago

• Agriculture began a little over 1 minute ago

• Industrial revolution took place less than 2 seconds ago

• Electronic computer was invented 0.4 seconds ago

• Internet less than 0.1 seconds ago
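The compression above can be checked with a quick calculation. The event ages below are rough assumptions (Earth about 4.5 billion years old, Homo sapiens about 100,000 years, agriculture about 10,000 years, and so on), not figures taken from the slides:

```python
# Scale real ages onto a one-year "Earth calendar".
# All event ages are rough assumptions, not figures from the talk.
EARTH_AGE_YEARS = 4.5e9
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def scaled_seconds(age_years):
    """Seconds ago on the one-year calendar for an event age_years in the past."""
    return age_years / EARTH_AGE_YEARS * SECONDS_PER_YEAR

events = {
    "Homo sapiens (~100,000 yr)": 1e5,
    "Agriculture (~10,000 yr)": 1e4,
    "Industrial revolution (~250 yr)": 250,
    "Electronic computer (~60 yr)": 60,
    "Internet (~15 yr)": 15,
}

for name, age in events.items():
    s = scaled_seconds(age)
    if s >= 60:
        print(f"{name}: {s / 60:.1f} minutes ago")
    else:
        print(f"{name}: {s:.2f} seconds ago")
```

Under these assumptions the calculation reproduces the slide’s figures: just under 12 minutes for Homo sapiens, a little over 1 minute for agriculture, under 2 seconds for the industrial revolution, and about 0.4 and 0.1 seconds for the computer and the internet.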

Page 28

[Photos] Kanzi (Great Ape Trust, Des Moines) and Witten (Institute for Advanced Study, Princeton)

Page 29

The singularity hypothesis

• Ulam, writing in 1958 about a conversation with John von Neumann:
• “One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”

Page 30

The singularity hypothesis

• I. J. Good in 1965:
• “Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make…”

Page 31

Computing power

Page 32

Classical AI

Uploading

Neuromorphic engineering

Genetic algorithms, neural networks, etc.

Pages 33–37

Space of possible modes of being

[Diagram: a large space dotted with question marks, indicating unexplored possible modes of being]

Pages 38–39

Guardian Activate Summit

