Terminal value

A terminal value is an ultimate goal, an end in itself. In an AI with a utility or reward function, the terminal value is the maximization of that function. (In some of Eliezer Yudkowsky's earlier writings, this is called a "supergoal".)

Terminal values stand in contrast to instrumental values, which are means to an end, mere tools for achieving terminal values. For example, a university student who is studying merely to get a job has the terminal value of getting a job; getting good grades is instrumental to that end.

Some values may be called "terminal" merely in relation to an instrumental goal, yet themselves serve instrumentally towards a higher goal. In the previous example, the student may want the job in order to gain social status and money, so that the job is itself instrumental to these latter values. However, in discussions of future AI, the term "terminal value" is generally used for the ultimate goals of a system, which do not serve any higher value.
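
The relationship can be made concrete with a minimal sketch (a hypothetical illustration, not part of the article): the terminal value is encoded directly as a utility function over outcomes, while an instrumental value, such as getting good grades, is worth pursuing only insofar as it raises the expected value of that function. All action names and probabilities below are made up for illustration.

  # Hypothetical sketch: terminal vs. instrumental value.
  # The terminal value is the utility function itself; outcomes are scored directly.
  def terminal_utility(outcome):
      return {"has_job": 1.0, "no_job": 0.0}.get(outcome, 0.0)

  # Toy model of how candidate actions change the probability of each outcome.
  # These transition probabilities are invented for the example.
  ACTION_OUTCOMES = {
      "study_for_grades": {"has_job": 0.8, "no_job": 0.2},
      "skip_studying":    {"has_job": 0.3, "no_job": 0.7},
  }

  def instrumental_value(action):
      # An action's instrumental value is its expected terminal utility:
      # it has no worth of its own in this model, only derivative worth.
      return sum(p * terminal_utility(o) for o, p in ACTION_OUTCOMES[action].items())

  if __name__ == "__main__":
      # "study_for_grades" ranks higher only because it better serves the terminal value.
      for action in ACTION_OUTCOMES:
          print(action, instrumental_value(action))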

  • Terminal values vs instrumental
  • The terminal values of humanity?
    • A partial list
    • The complexity of human values
  • The paperclip maximizer and AIXI as examples
  • Benevolence as an instrumental and as a terminal value
  • Shifts in human terminal values
    • Kantian
    • Other
  • Subgoal stomp ("goal displacement")
    • A person wants to get rich to better enjoy life, and because of a total focus on money, becomes a workaholic focused on money for its own sake.
    • Humans as adaptation executors

Links

Terminal Values and Instrumental Values