Fragility of value

From Lesswrongwiki
Revision as of 06:34, 14 July 2012 by TerminalAwareness (talk | contribs) (First draft)

Human values are fragile things. They are the product of our evolutionary history and, more recently, of our social heritage. We have never been able to compare values with any other intelligent species, and most of us find many of the values held in our own past repulsive. With no morality inherent in the universe (see the metaethics sequence, in particular these three posts), a person's values define who they are. Whatever else an AGI does, it must not change our values: the utility function is not up for grabs.

A poorly constrained artificial general intelligence instructed to maximize the utility experienced by humanity might turn us into blobs of perpetual orgasm. A world offering immortality but designed without fun theory might find its citizens modifying themselves so as to find a pursuit such as making table legs utterly engrossing.
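The failure mode above can be sketched in a few lines of code. This is a toy illustration, not anything from the article: the state names, attributes, and both utility functions are invented for the example. The point is only that an optimizer handed a proxy utility capturing one component of value drives the other components to zero, while a fuller (if still crude) value function does not.

```python
# Hypothetical world states scored on a few value-relevant attributes
# (all names and numbers are illustrative assumptions).
states = {
    "ordinary life":   {"pleasure": 5,  "novelty": 6, "achievement": 7},
    "orgasm blob":     {"pleasure": 10, "novelty": 0, "achievement": 0},
    "table-leg maker": {"pleasure": 7,  "novelty": 1, "achievement": 2},
}

def proxy_utility(s):
    # Misspecified: only one component of human value survives.
    return s["pleasure"]

def intended_utility(s):
    # Crude stand-in for complex values: treats them as complements,
    # so losing any one component entirely is catastrophic.
    return min(s.values())

best_proxy = max(states, key=lambda k: proxy_utility(states[k]))
best_intended = max(states, key=lambda k: intended_utility(states[k]))

print(best_proxy)     # the proxy maximizer picks "orgasm blob"
print(best_intended)  # the fuller function picks "ordinary life"
```

The fragility shows up in how little had to go wrong: the proxy agrees with the intended function on most of its inputs, yet the single state it selects under optimization pressure is exactly the degenerate one.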

See Also

Blog Posts