Existential risk

An existential risk is a risk of a disaster that would destroy or permanently cripple all of humanity, as distinguished from risks of lesser scope or intensity. Examples of potential existential risks include molecular nanotechnology weapons, a sufficiently large asteroid impact, or an unFriendly AI.

Existential risks present a unique challenge because of their irreversible nature. Unlike with lesser risks, we do not have the option of learning from past disasters. Even using past experience to estimate the probability of future existential risks raises difficult problems in anthropic reasoning. As Milan M. Ćirković put it, "[W]e cannot [...] expect to find traces of a large catastrophe that occurred yesterday, since it would have preempted our existence today. [...] Very destructive events destroy predictability!"

Because existential disasters cannot be recovered from, under many moral systems their cost extends for the rest of time: it includes not just the people killed in the disaster, but all of their possible future descendants.

Blog posts

See also

References