Terminal value

A terminal value is an ultimate goal, an end-in-itself. In an AI with a utility or reward function, the terminal value is the maximization of that function. (In some of Eliezer Yudkowsky's earlier writings, such as Creating Friendly AI, this is called a [[Subgoal stomp|supergoal]].)
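
A minimal sketch of the distinction, in hypothetical Python: the utility function stands in for the terminal value, while candidate actions are valued only instrumentally, for the outcomes they are expected to bring about. The agent, actions, and outcomes are illustrative placeholders, not drawn from any particular system.

<syntaxhighlight lang="python">
from typing import Callable, Dict


def choose_action(utility: Callable[[str], float],
                  outcomes: Dict[str, str]) -> str:
    """Pick the action whose predicted outcome maximizes the terminal utility.

    No action is valued in itself; each is valued only instrumentally,
    via the utility of the outcome it is expected to produce.
    """
    return max(outcomes, key=lambda action: utility(outcomes[action]))


def paperclip_utility(outcome: str) -> float:
    # Toy "paperclip maximizer": its terminal value is the number of paperclips.
    return {"few clips": 1.0, "many clips": 100.0}[outcome]


# "Build factory" is chosen purely as an instrumental step toward that terminal value.
world_model = {"do nothing": "few clips", "build factory": "many clips"}
assert choose_action(paperclip_utility, world_model) == "build factory"
</syntaxhighlight>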

  • Terminal values vs instrumental
  • The terminal values of humanity?
    • A partial list
    • The complexity of human values
  • The paperclip maximizer and AIXI as examples
  • Benevolence as an instrumental and as a terminal value
  • Shifts in human terminal values
    • Kantian
    • Other
  • Subgoal stomp ("goal displacement")
    • A person wants to get rich to better enjoy life, but through total focus on earning becomes a workaholic who pursues money for its own sake.
    • Humans as adaptation executors

Links

Terminal Values and Instrumental Values