Fragility of value

Human values are fragile things. They are the product of our lengthy evolutionary history and a brief but intense period of social evolution. It should not be surprising, then, that our values are an inconsistent mess of contradictions that neither philosophy nor psychology has yet sorted out.

We have no other intelligent life whose values we can compare with our own, and most of us find many of the values held earlier in human history repulsive. With no inherent morality (see the metaethics sequence, in particular these three posts), a person's values define them. Whatever else an AGI does, it must not change our values: the utility function is not up for grabs.

A poorly constrained artificial general intelligence seeking to maximize the utility experienced by humanity might turn us into blobs of perpetual orgasm. A world with immortality, designed without fun theory, might find its citizens modifying themselves so as to find a pursuit such as making table legs utterly engrossing.
