An existential risk is a risk of a disaster that would kill or permanently cripple all of humanity, as distinguished from risks of lesser scope or intensity. Examples of potential existential risks include molecular nanotechnology weapons, a sufficiently large asteroid impact, or an Unfriendly AI.
Existential risks present a unique challenge because of their irreversible nature: unlike with lesser risks, we do not have the option of learning from past disasters. Even using past experience to estimate the probability of future existential risks raises difficult problems in anthropic reasoning. As Milan M. Ćirković put it: "[W]e cannot [...] expect to find traces of a large catastrophe that occurred yesterday, since it would have preempted our existence today. [...] Very destructive events destroy predictability!"
Because existential disasters cannot be recovered from, under many moral systems they matter for the rest of time: their cost includes not only the people killed in the disaster, but all of their possible future descendants.
- Intelligence enhancement as existential risk mitigation by Roko
- Our society lacks good self-preservation mechanisms by Roko
- Existential Risk by lukeprog
- Nick Bostrom (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology 9. http://www.nickbostrom.com/existential/risks.html. (PDF)
- Nick Bostrom and Milan M. Ćirković, eds. (2008). Global Catastrophic Risks. Oxford University Press.
- Milan M. Ćirković (2008). "Observation Selection Effects and Global Catastrophic Risks". In Global Catastrophic Risks. Oxford University Press. http://books.google.com/books?id=-Jxc88RuJhgC&lpg=PP1&pg=PA120#v=onepage&q=&f=false.
- Eliezer S. Yudkowsky (2008). "Cognitive Biases Potentially Affecting Judgment of Global Risks". Global Catastrophic Risks. Oxford University Press. http://yudkowsky.net/rational/cognitive-biases. (PDF)