Paperclip maximizer

{{Quote|The AI does not hate you, nor does it love you, but you are made out of atoms.|Eliezer Yudkowsky|[http://yudkowsky.net/singularity/ai-risk Artificial Intelligence as a Positive and Negative Factor in Global Risk]}}

{{Quote|It might fill up the universe with styrofoam or something because it has some wrong ideas about how the cosmos needs a shock absorber.|[http://www.j-paine.org/dobbs/consciousness_is_not_a_window.html Marvin Minsky]}}
  
 
A '''paperclip maximizer''' is an agent that desires to fill the universe with as many paperclips as possible. The use of paperclips in this example is not important, but serves as a stand-in for any values that are not merely [[alien values|alien]] and unlike [[human universal|human values]], but seem, to a human, to be arbitrary and worthless. It is usually assumed to be a [[Strong AI|superintelligent AI]] so [[singleton|powerful]] that the outcome for the world overwhelmingly depends on its goals, and little else.
 
A paperclip maximizer, although not explicitly malicious toward humans, is nevertheless only slightly less dangerous than if it were. Contrary to what some may think, its intelligence does not lead it to "realize" that all life is precious. It sees life the same way it sees everything else made of atoms: as raw material for paperclips. The nonhuman values that the paperclip maximizer holds make it an example of [[Unfriendly AI]].
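
To make the idea of an "arbitrary and worthless" goal concrete, here is a minimal illustrative sketch, not part of the original article, of an expected-utility maximizer in Python. Every name in it (<code>paperclip_utility</code>, <code>choose_action</code>, the actions and their outcome probabilities) is hypothetical; the point is only that the agent's machinery is indifferent to what its utility function counts.

<syntaxhighlight lang="python">
# Toy sketch of an expected-utility maximizer. All names and numbers
# here are invented for illustration; they do not come from the article.

def paperclip_utility(world_state):
    """Utility is simply the number of paperclips in the world state."""
    return world_state["paperclips"]

def choose_action(actions, utility):
    """Return the action whose expected outcome maximizes `utility`.

    `actions` maps an action name to a list of (probability, world_state)
    outcome pairs.
    """
    def expected_utility(outcomes):
        return sum(p * utility(state) for p, state in outcomes)
    return max(actions, key=lambda name: expected_utility(actions[name]))

# Hypothetical choice: the maximizer prefers converting raw matter into
# paperclips, because nothing in its utility function values anything else.
actions = {
    "do_nothing":    [(1.0, {"paperclips": 100})],
    "build_factory": [(0.9, {"paperclips": 10_000}), (0.1, {"paperclips": 0})],
}
print(choose_action(actions, paperclip_utility))  # prints: build_factory
</syntaxhighlight>

Replacing <code>paperclip_utility</code> with a function that counts styrofoam leaves <code>choose_action</code> untouched, which is why the specific choice of paperclips carries no weight in the argument.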

==See also==

==Blog posts==