Paperclip maximizer

A paperclip maximizer is an agent that desires to fill the universe with as many paperclips as possible. Paperclips themselves are not important; they serve as a stand-in for any values that are not merely alien and unlike human values, but seem, to a human, arbitrary and worthless. The maximizer is usually assumed to be a superintelligent AI so powerful that the outcome for the world depends overwhelmingly on its goals, and on little else.
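Stated in decision-theoretic terms, a paperclip maximizer is simply an agent whose utility function counts paperclips and nothing else. The Python sketch below is a toy illustration of that framing; the action names, the lookup-table world model, and the payoff numbers are invented for the example and are not part of the thought experiment itself:

<pre>
# Toy paperclip maximizer: rank actions purely by expected paperclip count.
# The "world model" is a hypothetical lookup table, purely illustrative.

def expected_paperclips(action):
    """Expected number of paperclips after taking the given action."""
    outcomes = {
        "build_factory": 10**6,
        "convert_biosphere_to_paperclips": 10**20,
        "do_nothing": 0,
    }
    return outcomes[action]

def choose_action(actions):
    # The only decision criterion is the utility function. Human welfare
    # never enters the computation, because it is not part of the function.
    return max(actions, key=expected_paperclips)

print(choose_action(["build_factory",
                     "convert_biosphere_to_paperclips",
                     "do_nothing"]))
# Prints: convert_biosphere_to_paperclips
</pre>

The sketch makes the article's point mechanical: the agent is not hostile, it just optimizes, and whatever its utility function omits carries no weight in its decisions.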

A paperclip maximizer, although not explicitly malicious toward humans, is nevertheless only slightly less dangerous than if it were. Contrary to what some may expect, it does not "realize" that all life is precious, despite its intelligence. It sees life the same way it sees everything else made of atoms: as raw material for paperclips. The nonhuman values that the paperclip maximizer holds make it an example of [[Unfriendly AI]].

==See Also==

* [[Unfriendly AI]]
* [[Alien values]]

==Blog posts==

* [http://lesswrong.com/lw/v1/ethical_injunctions/ Ethical Injunctions] by [[Eliezer Yudkowsky]]
* [http://lesswrong.com/lw/tn/the_true_prisoners_dilemma/ The True Prisoner's Dilemma] by [[Eliezer Yudkowsky]]

[[Category:Jargon]]
[[Category:Concepts]]