Terminal value

A terminal value is an ultimate goal, an end in itself. In an AI with a utility or reward function, the terminal value is the maximization of that function. (In some of Eliezer Yudkowsky's earlier writings, the non-standard term "supergoal" is used.)

Terminal values vs. instrumental values

Terminal values stand in contrast to instrumental values, which are means to an end, mere tools for achieving terminal values. For example, if a university student does not enjoy studying but does so merely to obtain a professional qualification, we can describe his terminal value as getting a job; getting good grades is an instrument to that end.

Some values may be called "terminal" merely in relation to an instrumental goal, yet themselves serve instrumentally towards a higher goal. In the previous example, the student may want the job to gain social status and money; if he could get prestige and money without working, he would, and in that case the job is instrumental to these other values. However, in discussions of future AI, the term "terminal value" is generally reserved for the top of the goal hierarchy: the true ultimate goals of a system, those which do not serve any higher value.
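
To make the distinction concrete, here is a minimal sketch (not from the article) of how it plays out for an agent with an explicit reward function: only the terminal reward is specified directly, and the value of every other state is instrumental, derived entirely from how well that state leads toward the terminally rewarded one. The state names, transitions, and numbers below are illustrative assumptions, not anything defined by the article.

```python
# A toy deterministic chain modeled on the student example: studying leads to
# good grades, good grades lead to a job. Only the job is valued terminally;
# everything else gets its value instrumentally, via value iteration.

STATES = ["slack", "study", "good_grades", "job"]

# Deterministic transitions: each state leads to the next link in the chain;
# "slack" and "job" are absorbing.
NEXT = {"slack": "slack", "study": "good_grades", "good_grades": "job", "job": "job"}

def terminal_reward(state):
    """The only value specified for its own sake: having the job."""
    return 1.0 if state == "job" else 0.0

def instrumental_values(gamma=0.9, iterations=100):
    """Value iteration: a state's value reflects only its usefulness
    for reaching the terminally valued state."""
    V = {s: 0.0 for s in STATES}
    for _ in range(iterations):
        V = {s: terminal_reward(s) + gamma * V[NEXT[s]] for s in STATES}
    return V

if __name__ == "__main__":
    for state, value in sorted(instrumental_values().items(), key=lambda kv: kv[1]):
        print(f"{state:12s} {value:.3f}")
```

Running this, "study" and "good_grades" receive high values only because they lead toward "job", while "slack" stays at zero; redefine terminal_reward and their values shift accordingly, which is the sense in which instrumental values are wholly derivative of terminal ones.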


  • The terminal values of humanity
    • A partial list of these values
    • The complexity of human values
  • The paperclip maximizer and AIXI as examples
  • Benevolence as an instrumental and as a terminal value
  • Shifts in human terminal values
    • Kantian
    • Other
  • Subgoal stomp ("goal displacement") in both senses
    • A person wants to get rich in order to enjoy life, but through a total focus on money becomes a workaholic who pursues money for its own sake.
    • Humans as adaptation executors

Links

Eliezer Yudkowsky, Terminal Values and Instrumental Values