An existential risk is a risk whose consequences for humanity would be permanent and could never be undone. In his 2002 article, Nick Bostrom defined an existential risk as:
One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.
Bostrom (2002) proposes four classes of existential risk:
- Bangs - Earthly intelligent life is extinguished relatively suddenly, either accidentally or by deliberate destruction
- Crunches - Any potential humanity had to enhance itself indefinitely is forever eliminated, although humanity survives
- Shrieks - Humanity enhances itself, but explores only a narrow portion of its desirable possibilities
- Whimpers - Humanity enhances itself, evolving in a gradual and irrevocable manner to the point that none of our present values are maintained, or enhances itself only infinitesimally
A brief list of candidate existential risks includes: asteroid impacts, supervolcanoes, ecological disasters, extreme global warming, nuclear holocaust, pandemics, engineered bioweapons, strangelets, self-replicating nanomachines, Unfriendly AI, the termination of our program (under the simulation hypothesis), resource depletion that prevents humanity from recovering from an otherwise minor disaster, a social or political movement that halts scientific progress, the gradual loss of our core values, extermination or domination by extraterrestrials, and any number of other threats.
Existential risks present a unique challenge because of their irreversible nature. By definition, we will never experience and survive an existential risk, so we cannot learn from our mistakes. Because we have no past experience of them, existential risks are black swans. And since an existential disaster cannot be recovered from, its cost is not just those who die, but every descendant who would otherwise have been born.
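To make the scale of that last point concrete, here is a minimal back-of-the-envelope sketch. Every figure below is an assumption chosen for illustration (the 10^16 lower bound on potential future lives is a conservative, Bostrom-style placeholder), not an established estimate:

```python
# Illustrative only: all numbers are assumptions for the sake of the example,
# not established estimates.

current_population = 8e9           # assumed present world population
potential_future_lives = 1e16      # assumed lower bound on future lives foreclosed
                                   # by extinction (conservative, Bostrom-style figure)
recoverable_disaster_deaths = 1e8  # assumed toll of a severe but survivable disaster

# Counting cost simply in lives: an existential catastrophe forecloses all
# potential descendants, while a recoverable one does not.
existential_cost = current_population + potential_future_lives
recoverable_cost = recoverable_disaster_deaths

print(f"existential catastrophe: ~{existential_cost:.1e} lives")
print(f"recoverable catastrophe: ~{recoverable_cost:.1e} lives")
print(f"ratio: ~{existential_cost / recoverable_cost:.0e}x")
```

Even under deliberately conservative assumptions, the foreclosed future dominates the total by many orders of magnitude, which is why existential risks are treated as qualitatively different from recoverable catastrophes.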
Blog posts
- Intelligence enhancement as existential risk mitigation by Roko
- Our society lacks good self-preservation mechanisms by Roko
- Disambiguating doom by steven0461
- Existential Risk by lukeprog
Organizations
- Singularity Institute
- The Future of Humanity Institute
- The Oxford Martin Programme on the Impacts of Future Technology
- Global Catastrophic Risk Institute
- Saving Humanity from Homo Sapiens
- Skoll Global Threats Fund (To Safeguard Humanity from Global Threats)
- Foresight Institute
- Defusing the Nuclear Threat
- Leverage Research
- The Lifeboat Foundation
Resources

- existential-risk.org (http://www.existential-risk.org/)
- Global Catastrophic Risks (http://www.global-catastrophic-risks.com/)
See also

- Black swan
- Future
References
- Nick Bostrom (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology 9. http://www.nickbostrom.com/existential/risks.html. (PDF)
- Nick Bostrom, Milan M. Ćirković, eds. (2008). Global Catastrophic Risks. Oxford University Press.
- Milan M. Ćirković (2008). "Observation Selection Effects and Global Catastrophic Risks". Global Catastrophic Risks. Oxford University Press. http://books.google.com/books?id=-Jxc88RuJhgC&lpg=PP1&pg=PA120#v=onepage&q=&f=false.
- Eliezer S. Yudkowsky (2008). "Cognitive Biases Potentially Affecting Judgment of Global Risks". Global Catastrophic Risks. Oxford University Press. http://yudkowsky.net/rational/cognitive-biases. (PDF)
- Richard A. Posner (2004). Catastrophe Risk and Response. Oxford University Press. http://books.google.ca/books?id=SDe59lXSrY8C. (DOC)