Existential risk

An existential risk is a risk whose consequences for humanity would be permanent and could never be undone. In his 2002 article, Nick Bostrom defined an existential risk as

One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.

In the same article, Bostrom proposes a fourfold classification of existential risks:

  • Bangs - Earthly intelligent life is extinguished relatively suddenly, either accidentally or by deliberate destruction
  • Crunches - Any potential humanity had to enhance itself indefinitely is forever eliminated, although humanity survives
  • Shrieks - Humanity enhances itself, but explores only a narrow portion of its desirable possibilities
  • Whimpers - Humanity enhances itself, evolving in a gradual and irrevocable manner to the point that none of our present values are maintained, or enhances itself only infinitesimally

Examples of potential existential risks include molecular nanotechnology weapons, climate chaos that triggers social collapse and general nuclear or biological warfare, a perfectly engineered plague, a sufficiently large asteroid impact, an Unfriendly AI, or a Friendly AI that makes a significant error of logic or priority (from humanity's point of view) in how best to preserve life. Most such scenarios allow for the survival of small numbers of humans who escape into space or underground. They nevertheless satisfy most definitions of existential risk, since they would end civilization as we know it: humans have never lived under such circumstances, and might have to become something quite different, emotionally and physically, to survive at all.

Existential risks present a unique challenge because of their irreversible nature. Unlike with lesser risks, we don't have the option of learning from past disasters - no such disaster has ever occurred, or else there would be no "we" at present to analyze it.

Even using past experience to predict the probability of future existential risks raises difficult problems in anthropic reasoning. As Milan M. Ćirković put it: "[W]e cannot [...] expect to find traces of a large catastrophe that occurred yesterday, since it would have preempted our existence today. [...] Very destructive events destroy predictability!"
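One way to see the difficulty is with a minimal Bayesian sketch (an illustration of the anthropic point, not a formula from the sources above). Let E be the hypothesis that a sterilizing catastrophe occurred in our past, and O the observation that we exist today. Since such a catastrophe would have preempted our existence, P(O | E) = 0, and Bayes' theorem gives

\[
P(E \mid O) \;=\; \frac{P(O \mid E)\,P(E)}{P(O)} \;=\; 0
\]

no matter how large the prior P(E) is. Our own existence screens off the evidence: a clean historical record is guaranteed for any surviving observer, so it cannot by itself lower our estimate of the true catastrophe rate.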

Since existential disasters cannot be recovered from, under many moral systems they matter for the rest of time: their cost is not just the lives lost in the disaster itself, but also those of all their possible future descendants.
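A rough back-of-the-envelope illustration of the scale involved (the figures are assumptions chosen for arithmetic convenience, not claims from the sources above): suppose humanity would otherwise persist for another million years at a stable population of ten billion, with generations lasting about a century. Extinction today would then foreclose roughly

\[
10^{10}\ \text{lives/century} \times 10^{4}\ \text{centuries} \;=\; 10^{14}\ \text{potential future lives},
\]

about four orders of magnitude more than the roughly 10^{10} deaths in the disaster itself.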
