Ontological crisis


An ontological crisis is the crisis an agent undergoes when its understanding of reality (its ontology) changes, potentially rendering some of its preferences and goals meaningless. In the context of an AGI, an ontological crisis could in the worst case pose an existential risk if the old preferences and goals continue to be applied to the new ontology. Alternatively, the AGI might lose all ability to comprehend the world, and so pose no threat. If an AGI reevaluates its preferences after its ontological crisis, very unfriendly behaviours could also arise. Depending on the extent of the reevaluation, the AGI's changes may be detected and safely fixed, or go undetected until they manifest harmfully.
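As a rough illustration of the core problem, consider the toy Python sketch below. The names (intact_vases, atom_clusters, the cohesion threshold in bridge) are hypothetical and not from the source; the sketch only shows how a utility function defined over states of an old ontology is simply undefined on states of a new one, and how any bridge between ontologies must be supplied from somewhere.

<pre>
# Toy illustration of an ontological crisis (hypothetical names, not from the source).
# The agent's utility function is defined over states of its *old* ontology.
# When its world model changes to a new ontology, those states no longer exist,
# and the utility function is undefined on the new states.

def utility_old(state: dict) -> float:
    """Utility defined over the old ontology: a world made of discrete vases."""
    return float(state["intact_vases"])  # the agent cares about intact vases

# Old world model: the world really is described in terms of countable vases.
old_state = {"intact_vases": 3}
print(utility_old(old_state))  # 3.0 -- preferences are well defined here

# New world model after the ontology shifts: the world is now described as
# "atom clusters" (position, cohesion), with no primitive notion of a vase.
new_state = {"atom_clusters": [(0.1, 0.9), (0.4, 0.2)]}

try:
    utility_old(new_state)
except KeyError as err:
    # The old preferences make no sense in the new ontology.
    print(f"utility undefined in new ontology: missing {err}")

# One naive response: a hand-written bridge that re-expresses old concepts in
# the new ontology. Whether such a bridge preserves the intent of the original
# preferences is exactly the hard part of the problem.
def bridge(state: dict) -> dict:
    """Assumed rule: count a cluster as a 'vase' if its cohesion exceeds 0.5."""
    return {"intact_vases": sum(1 for (_, cohesion) in state["atom_clusters"]
                                if cohesion > 0.5)}

print(utility_old(bridge(new_state)))  # 1.0 -- but is this what was meant?
</pre>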

Eliezer Yudkowsky has pointed out that we ourselves would undergo an ontological crisis if we were to experience the far future.
