Coherent Aggregated Volition

'''Coherent Aggregated Volition''' (CAV) is [[Ben Goertzel]]'s response to [[Eliezer Yudkowsky]]'s [[Coherent Extrapolated Volition]]. CAV would be a combination of the goals and beliefs of humanity at the present time. Without the "extrapolation" aspect of CEV, Coherent Aggregated Volition is simpler and is intended to be easier to formalize and prototype in the foreseeable future. CAV is not, however, intended to answer the question of Friendly AI, although Goertzel argues that CEV may not be the answer either.
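
A rough sense of what "aggregation without extrapolation" could mean is easy to sketch. The following is only an illustrative toy formalization, not a statement of Goertzel's own proposal: suppose each person <math>i</math>'s current volition is summarized as a goal-belief pair <math>(g_i, b_i)</math> in some shared representation. CAV could then be framed as choosing an aggregate pair that stays close to everyone's present pairs while remaining internally coherent:

<math>(g^*, b^*) = \arg\max_{(g,b)} \left[ \alpha \, C(g,b) - \beta \sum_{i=1}^{N} d\big((g,b), (g_i, b_i)\big) \right]</math>

Here <math>C</math> scores the internal consistency of the aggregate, <math>d</math> is a distance on the chosen representation, and <math>\alpha, \beta</math> are weights; all of these symbols are assumptions of the sketch rather than terms Goertzel defines. Roughly speaking, CEV would apply a similar aggregation to ''extrapolated'' volitions rather than to the pairs people hold today.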

==The concept==

==CEV and CAV==

==See also==

==References==