Coherent Aggregated Volition


Coherent Aggregated Volition (CAV) is Ben Goertzel's proposed alternative to Eliezer Yudkowsky's Coherent Extrapolated Volition (CEV). Rather than extrapolating what humanity would want under idealized conditions, CAV would combine the goals and beliefs of humanity as they are at the present time. Dropping the "extrapolation" step of CEV makes CAV simpler and, in Goertzel's view, easier to formalize and prototype in the foreseeable future. CAV is not, however, intended as a full answer to the problem of Friendly AI, though Goertzel argues that CEV may not be that answer either.
