Suffering risk

From Lesswrongwiki
Revision as of 15:12, 23 November 2016 by Ignoranceprior (talk | contribs) (See also)

Suffering risks (also known as s-risks) are risks that the far future will contain suffering on an astronomical scale. Under Bostrom's original definition, they can be considered a form of existential risk, but it may be useful to distinguish between risks that threaten to prevent future populations from coming into existence (standard x-risks) and risks that would instantiate a large amount of suffering (s-risks).

Although the Machine Intelligence Research Institute and the Future of Humanity Institute have investigated strategies to prevent s-risks, the only EA organization whose primary focus is s-risk prevention is the Foundational Research Institute.
