Ontological crisis

An ontological crisis is the crisis an agent experiences when its model of reality changes. Such a change can render some of the agent's preferences and goals meaningless, because they were defined in terms of concepts that no longer exist in the new model. In the context of an AGI, an ontological crisis could in the worst case pose an existential risk if the old preferences and goals continue to be applied within the new ontology. Another possibility is that the AGI loses the ability to comprehend the world altogether, in which case it would pose no threat. If an AGI instead reevaluates its preferences after the crisis, very unfriendly behaviours could arise; depending on the extent of the reevaluation, the changes may be detected and safely fixed, or go undetected until they manifest catastrophically.
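
To make the failure mode concrete, here is a minimal sketch in Python (all names and numbers are hypothetical, not taken from any actual proposal): the agent's preferences are weights over concepts in its current ontology, and when that ontology is replaced by one that lacks those concepts, the preferences can no longer be evaluated.

    # Minimal sketch (hypothetical names/values): preferences that become
    # undefined when the agent's ontology changes.

    # Preferences are weights over concepts in the agent's current ontology.
    old_ontology = {"caloric", "temperature"}
    preferences = {"caloric": -1.0, "temperature": 2.0}

    # A more accurate world-model drops the concept "caloric" entirely.
    new_ontology = {"molecular_kinetic_energy", "temperature"}

    def evaluate(world_state, ontology, preferences):
        """Score a world state; fails if a preference refers to a concept
        that no longer exists in the ontology (the 'nonsense goal' case)."""
        total = 0.0
        for concept, weight in preferences.items():
            if concept not in ontology:
                raise KeyError(f"goal over '{concept}' is undefined in this ontology")
            total += weight * world_state.get(concept, 0.0)
        return total

    state = {"temperature": 300.0, "molecular_kinetic_energy": 6.2e-21}

    print(evaluate(state, old_ontology, preferences))   # old ontology: 600.0
    try:
        print(evaluate(state, new_ontology, preferences))
    except KeyError as err:
        # The ontological crisis: the agent must either keep applying its old,
        # now ill-defined goals, or attempt to re-map its preferences.
        print("ontological crisis:", err)

In this toy framing, the outcomes described above correspond to ignoring the error and reusing the old mapping, halting because evaluation is impossible, or rewriting the preferences to fit the new concepts, with no guarantee the rewrite preserves the original intent.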

Eliezer Yudkowsky has pointed out that we ourselves could experience an ontological crisis if we came to understand reality as it will be understood in the far future. For example, if it were learned that every human will be copied at the instant of death and recreated, healthy, in the 29th century to live as an immortal, you might come to value your life far less than your fame. True and universal immortality would cause severe ontological crises as the urgency imposed by a definite lifespan disappeared, along with far stranger changes. Conversely, the development of indefinite medical life extension might make people value their lives, and any lifestyle that would let them obtain such extension, far more. An example of an ontological crisis often experienced today is the one that follows a person's loss of faith in God: their preferences, often grounded in their theology and their desire for acceptance into heaven, are subject to change as they begin to reconsider their own values.
