'''Fun theory''' addresses the following problem: If we can arrange to live for a very long time with greatly increased intelligence and physical abilities, how can we continue to have any fun?
  
When we have intellects that can deeply comprehend every movie, novel, and concert ever created in an instant; that intimately know every twist of every forest path; that can bring joy or tranquillity to themselves at will; when we have enhanced ourselves so that we can swim under the Atlantic in a day, navigate a snowboard down Mount Everest as easily as you or I could steer a bicycle down the street, and free-climb a cliff face with no risk to life or limb--then what fun will remain?
  
 
==The risk of boredom==
 
Unless we can answer this question, we might be faced with endless boredom. This could be seen as an argument against key hopes of [[transhumanism]] for the improvement of the human condition, including lifespan extension, human intelligence enhancement, and physical enhancement.
  
Transhumanists work towards a much better human future--a Utopia--but, as George Orwell [http://www.orwell.ru/library/articles/socialists/english/e_fun aptly described it], Utopians of all stripes, whether Socialist, Enlightenment, or Christian, have generally been unable to imagine futures where anyone would actually ''want'' to live.
  
 
<blockquote>
It is a commonplace that the Christian Heaven, as usually portrayed, would attract nobody. Almost all Christian writers dealing with Heaven either say frankly that it is indescribable or conjure up a vague picture of gold, precious stones, and the endless singing of hymns... [W]hat it could not do was to describe a condition in which the ordinary human being actively wanted to be.
</blockquote>
  
 
==Fun Theory and complex values==
 
A key insight of Fun Theory, in its current embryonic form, is that eudaimonia is [[Complexity of value|complicated]]--there are many properties which contribute to a life worth living. To experience a fulfilled life, we humans require many things: aesthetic stimulation, pleasure, love, social interaction, learning, challenge, and much more.
  
It is a common mistake to extract one element of human preferences and seek to maximize it alone. If we simply optimize for pleasure or happiness, we will "wirehead"--stimulate the relevant parts of our brain and experience bliss for eternity, but pursue no other experiences. If almost ''any'' element of our value system is absent, the human future will likely be very unpleasant.
 
The value system of humans today will also in general be true of enhanced humans, to the extent that we do not change it as we self-enhance. We may want to alter our own value system by eliminating values, like bloodlust, which on reflection we wish were absent. But there are many values which we, on reflection, want to keep, and since we humans have no basis for a value system other than our current value system, Fun Theory must seek to maximize the value system that we have, rather than inventing new values.
 
==Boredom==
Among the improvements to the human condition that we want to work towards, intelligence enhancement is one of the most important. Because of this, among the other values that must be satisfied, Fun Theory puts a particular emphasis on the values opposed to boredom: curiosity, learning, and intellectual exploration. The open question is whether the universe will offer, or we ourselves can create, ever more complex and sophisticated opportunities to delight and challenge ever more powerful minds.
  
 
==Relation to Friendly AI==
 
 
An artificial general intelligence which is created to help humanity ([[Friendly AI]]), and which grows to be much more powerful than us, must have as its goal the promotion of the human value system.
 
  
==Blog posts==
 
 
*[http://lesswrong.com/lw/xy/the_fun_theory_sequence/ The Fun Theory Sequence] by [[Eliezer Yudkowsky]] describes some of the many complex considerations that determine ''what sort of happiness'' we most prefer to have - given that many of us would decline to just have an electrode planted in our pleasure centers.
 
  
 
==External links==
 
* George Orwell, [http://www.orwell.ru/library/articles/socialists/english/e_fun Why Socialists Don't Believe in Fun]
* David Pearce, [http://paradise-engineering.com/ Paradise Engineering] and [http://www.hedweb.com/hedab.htm The Hedonistic Imperative] provide a more nuanced alternative to wireheading.
  
 
==See also==
 
*[[The Fun Theory Sequence]]
*[[Complexity of value]]
*[[Metaethics sequence]]
  
 
[[Category:Theses]]
 
