Nick Bostrom, director of Oxford's Future of Humanity Institute and co-founder of the World Transhumanist Association, offers a mind-bending existential take on artificial intelligence, the singularity, and life in a post-human world.
Professor Nick Bostrom
Faculty of Philosophy
Director, Future of Humanity Institute
James Martin 21st Century School
Oxford University
The Big Picture
• “At the very apex of the first stratified societies, dynastic dreams were dreamt and visions of triumph or ruin entertained; but there is no mention in the papyri and cuneiform tablets on which these hopes and fears were recorded that they envisaged, in the slightest degree, changes in the material conditions of the great masses, or for that matter, of the ruling class itself.”
• Robert Heilbroner
The future of humanity?
[Chart: technological development over time (the present marked "2007"), rising from the pre-human condition through the human condition toward a posthuman condition, ending in a "?"]
Extinction
[Chart: the same development curve, here falling to extinction before reaching the posthuman condition]
Past catastrophes killing >10 million

Catastrophe | Century | Deaths (millions)
Smallpox | 20C | 400
Plague of Justinian | 6C | 100
Black Death (1347-1350) | 14C | 75
Second World War | 20C | 55
Great Leap Forward in China (famine) | 20C | 40
Spanish flu pandemic (1918-1919) | 20C | 40
Mongol conquests | 13C | 40
An Shi Rebellion (756-763) | 8C | 36
HIV/AIDS | 20C-21C | 27
Fall of the Ming Dynasty | 17C | 25
Chinese famine of 1907 | 20C | 24
Taiping Rebellion (1851-1864) | 19C | 20
Decimation of the American Indians | 15C-19C | 20
Stalin (famines and purges) | 20C | 20
Mideast slave trade | 7C-19C | 19
Atlantic slave trade | 15C-19C | 18
Timur Lenk | 14C-15C | 17
British India (mainly famine) | 19C | 17
First World War | 20C | 15
Scope \ Intensity | Imperceptible | Endurable | Terminal (Hellish?)
Personal | Loss of one hair | Car is stolen | Fatal car crash
Local | Congestion from one extra vehicle | Recession in a country | Genocide
Global | 0.01 degree global warming; loss of one species of beetle | Destruction of the ozone layer; drastic loss of biodiversity | Ageing?
Trans-generational (Cosmic?) | | | Human extinction

[Chart regions: risks that are at least global in scope and terminal in intensity are Global Catastrophic Risks; the trans-generational, terminal corner marks Existential Risks]
Existential risk
• One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.
No existential catastrophe
• > 99.9% of all species are extinct
• Toba eruption 75,000 years ago: ~500 reproducing females
• Neanderthals extinct 33,000-24,000 years ago
• Homo floresiensis extinct 12,000 years ago?
• Opinions…
• Existential risks on the horizon…
Some opinions on net existential risk

Source | Claim | Probability
GCR conference poll | Median answer to “Overall risk of human extinction prior to 2100” | 19%
Stern Review | Extinction risk per year | 0.1%
John Leslie (1996) | Human extinction by 2496 (based partly on the doomsday argument and Leslie’s view about how quantum indeterminacy affects this argument) | 30%
Martin Rees | End of civilization by 2100 (note: this need not entail extinction or existential catastrophe) | 50%
Early Bostrom (2002) | Cumulative existential risk (no time limit) | ≥ 25%
Richard Gott | Probability of humanity extinct within 5,100 years | 0.25%
Richard Posner | Human extinction this century | “significant”
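The 5,100-year horizon attributed to Gott comes from his "delta t" (Copernican) argument. A minimal sketch of that arithmetic, assuming (as an illustrative round figure) that Homo sapiens is about 200,000 years old:

```python
# Gott's "delta t" argument: if we observe humanity at a uniformly random
# point in its total lifetime, then with 95% confidence the future duration
# lies between 1/39 and 39 times the past duration.
# The 200,000-year age of Homo sapiens below is an assumed round figure.

T_PAST = 200_000  # years Homo sapiens has existed so far (assumption)

lower = T_PAST / 39   # 97.5th-percentile earliest end
upper = T_PAST * 39   # 97.5th-percentile latest end

print(f"95% CI for humanity's future duration: "
      f"{lower:,.0f} to {upper:,.0f} years")
```

The lower bound works out to roughly 5,100 years, which is where the table's horizon comes from; the upper bound is about 7.8 million years.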
HUMAN EXTINCTION RISKS?
• Natural
• Anthropogenic
Nanotech weapons systems
Nuclear holocaust
Simulation shutdown
Genetic engineering / synthetic biology
Non-weapons nanotech accident
Natural pandemic
Emissions-caused global warming
Nonspecific conflict
Self-destroying superintelligent AI
Supervolcanic eruption
Physics experiment
Extraterrestrial intelligence
Non-anthropogenic vacuum decay
Kinetic impact
Back-contamination
Other climate change
Space radiation
Human infertility
Other
Stagnation or Plateau
[Chart: the development curve levelling off at the human condition, never reaching the posthuman condition]
Revolutionary technologies?
• Artificial intelligence and superintelligence
• Machine-phase nanotechnology
• Uploading
• Information technology (virtual reality; wearable computers, etc.)
• Anti-aging medicine
• Nootropics
• Mood enhancers
• Genetic engineering
• Collaborative information filtering
• Institutions: information markets…
Recurrent collapse
[Chart: the development curve repeatedly rising and crashing between the pre-human and human conditions]
The longer term
[Chart: development over a much longer timescale, with the human condition a brief transition between the pre-human and posthuman conditions]
Posthumanity
[Chart: the development curve rising from the pre-human condition through the human condition into the posthuman condition]
“Posthuman condition”:
• Population greater than 1 trillion persons
• Life expectancy greater than 500 years
• Large fraction of the population has cognitive capacities more than two standard deviations above the current human maximum
• Near-complete control over the sensory input, for the majority of people for most of the time
• Human psychological suffering becoming a rare occurrence
• Any change of magnitude or profundity comparable to that of one of the above
If Earth had formed one year ago…
• Homo sapiens evolved less than 12 minutes ago
• Agriculture began a little over 1 minute ago
• The Industrial Revolution took place less than 2 seconds ago
• The electronic computer was invented 0.4 seconds ago
• The Internet appeared less than 0.1 seconds ago
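The scaled figures above follow from simple proportion. A minimal sketch of the arithmetic, assuming an Earth age of 4.5 billion years and illustrative round ages for each event (roughly 100,000 years for Homo sapiens, 10,000 for agriculture, 250 for the Industrial Revolution, 60 for the computer, 14 for the Internet; these inputs are assumptions, not the talk's exact figures):

```python
# Map real time spans onto the "Earth formed one year ago" scale.

EARTH_AGE_YEARS = 4.5e9                 # assumed age of the Earth
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # length of the scaled "year"

def scaled_seconds(years_ago: float) -> float:
    """Seconds ago on the one-year scale for an event `years_ago` years old."""
    return years_ago / EARTH_AGE_YEARS * SECONDS_PER_YEAR

events = {
    "Homo sapiens (~100,000 yrs)": 100_000,
    "Agriculture (~10,000 yrs)": 10_000,
    "Industrial Revolution (~250 yrs)": 250,
    "Electronic computer (~60 yrs)": 60,
    "Internet (~14 yrs)": 14,
}

for name, age in events.items():
    s = scaled_seconds(age)
    if s >= 60:
        print(f"{name}: {s / 60:.1f} minutes ago")
    else:
        print(f"{name}: {s:.2f} seconds ago")
```

With these inputs, Homo sapiens lands at about 11.7 minutes, agriculture at about 1.2 minutes, the Industrial Revolution under 2 seconds, the computer at about 0.4 seconds, and the Internet under 0.1 seconds, matching the slide.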
[Photos] Kanzi (Great Ape Trust, Des Moines); Edward Witten (Institute for Advanced Study, Princeton)
The singularity hypothesis
• Stanislaw Ulam, recalling a conversation with John von Neumann (1958):• “One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”
The singularity hypothesis
• I. J. Good in 1965:
• “Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make…”
[Diagram: routes to machine intelligence as computing power grows: classical AI; uploading; neuromorphic engineering; genetic algorithms, neural networks, etc.]
Space of possible modes of being
[Diagram: the space of possible modes of being, progressively filling with "?" marks to suggest how little of it has yet been explored]
Guardian Activate Summit