Existential risk

An existential risk is a risk that poses permanent, irreversible consequences for humanity. In his 2002 article, Nick Bostrom defined an existential risk as:

One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.

Bostrom (2002) proposes four classes of existential risk:

  • Bangs - Earthly intelligent life is extinguished relatively suddenly, either accidentally or by deliberate destruction
  • Crunches - Humanity's potential to enhance itself indefinitely is forever eliminated, although humanity itself survives
  • Shrieks - Humanity enhances itself, but explores only a narrow portion of its desirable possibilities
  • Whimpers - Humanity enhances itself, evolving in a gradual and irrevocable manner to the point that none of our present values are maintained, or enhances itself only infinitesimally

The following is a brief, non-exhaustive list of existential risks:

  • Asteroid impacts
  • Supervolcanoes
  • Ecological disasters
  • Extreme global warming
  • Nuclear holocaust
  • Pandemics
  • Engineered bioweapons
  • Strangelets
  • Self-replicating nanomachines
  • Unfriendly AI
  • The termination of our program (if we are living in a simulation)
  • Resource depletion preventing humanity from recovering from an otherwise minor disaster
  • A social or political movement preventing scientific progress
  • The gradual loss of our core values
  • Extermination or domination by extraterrestrials

Existential risks present a unique challenge because of their irreversible nature. By definition, we will never experience and survive an existential risk, and so we cannot learn from our mistakes. Because we have no past experience with them, existential risks are black swans. And since an existential disaster cannot be recovered from, its cost is not just the dead, but every descendant who would otherwise have been born.

References

  • Bostrom, Nick (2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology, Vol. 9, No. 1.