Intelligence explosion

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.

I.J. Good, "Speculations Concerning the First Ultraintelligent Machine" (http://www.aeiveos.com/~bradbury/Authors/Computing/Good-IJ/SCtFUM.html)

An intelligence explosion is a hypothetical strong positive feedback loop in which a recursively self-improving intelligence makes itself smarter at making itself smarter, resulting in a very fast, very dramatic leap in capability. Eliezer Yudkowsky calls the intelligence explosion one of three major Singularity schools (http://yudkowsky.net/singularity/schools).
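
The feedback loop can be made concrete with a toy numerical sketch. The code below is purely illustrative and is not drawn from the article or from Good: it assumes, hypothetically, that a system's per-step improvement grows with the square of its current capability, and shows that under that assumption progress looks slow for a long stretch and then becomes explosive. The function name, parameters, and growth rule are all invented for the illustration.

 # Toy model of a recursive self-improvement feedback loop (illustrative only).
 # Assumption: at each step the system improves itself in proportion to the
 # square of its current capability, so gains compound on themselves.
 def simulate_explosion(intelligence=1.0, rate=0.01, steps=200, ceiling=1e12):
     history = [intelligence]
     for _ in range(steps):
         # The more capable the system, the larger the improvement it can design.
         intelligence += rate * intelligence * intelligence
         history.append(intelligence)
         if intelligence > ceiling:
             break  # growth has left the toy model's useful range
     return history
 
 if __name__ == "__main__":
     levels = simulate_explosion()
     for step, level in enumerate(levels):
         if step % 10 == 0 or step == len(levels) - 1:
             print(f"step {step:3d}: intelligence = {level:.3g}")

Under these assumptions the trajectory crawls for roughly a hundred steps and then shoots upward within a handful of further steps; whether real minds designing minds would behave anything like this simple rule is the speculative premise, not an established result.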

Blog posts

Posts by Eliezer Yudkowsky:

See also