Fun theory

Fun theory addresses the following problem: If we can arrange to live for a very long time with ever-increasing intelligence, how can we continue to have any fun?

When we have intellects that can deeply comprehend every movie, novel, and concert ever created in an instant, that intimately know every twist of every forest path, that can bring joy or tranquillity to themselves at will; intellects which, with enhanced bodies, can swim under the Atlantic in a day and navigate a snowboard down Mount Everest as easily as you or I could steer a bicycle down the street, then what will remain?

The risk of boredom

Unless we can answer this question, we might be faced with endless boredom. This could be seen as an argument against key hopes of transhumanism, such as lifespan extension, human intelligence enhancement, and physical enhancement.

Transhumanists work towards a much better human future--a Utopia--but, as George Orwell aptly observed, Utopians of all stripes, including Socialists, Enlightenment thinkers, and Christians, have generally been unable to imagine futures in which anyone would actually *want* to live.

It is a commonplace that the Christian Heaven, as usually portrayed, would attract nobody. Almost all Christian writers dealing with Heaven either say frankly that it is indescribable or conjure up a vague picture of gold, precious stones, and the endless singing of hymns... [W]hat it could not do was to describe a condition in which the ordinary human being actively wanted to be.

Fun Theory and complex values

A key insight of Fun Theory, in its current embryonic form, is that eudaimonia is complicated--there are many properties which contribute to a life worth living. To experience a fulfilled life, we humans require many things: safety, aesthetic stimulation, love, social interaction, learning, challenge, and much, much more. If any of these is absent, the human future will likely be very unpleasant; learning and challenge are particularly relevant in the context of Fun Theory.

Humans today have this value system, and to the extent that we do not allow ourselves to change our own goal system, the same will in general be true of enhanced humans. (It might not hold to the extent that we alter our own value system: our value system today does, perhaps, prefer a situation in which some human values, such as bloodlust, are removed. But there are many values which we, on reflection, want to keep, like those mentioned above. We do not want to eliminate curiosity!)

Relation to Friendly AI

An artificial general intelligence which is created to help humanity (Friendly AI), and which grows to be much more powerful than us, must have as its goal the promotion of the human value system.

Blog posts

  • The Fun Theory Sequence by Eliezer Yudkowsky describes some of the many complex considerations that determine what sort of happiness we most prefer to have - given that many of us would decline to just have an electrode planted in our pleasure centers.
