Intelligence explosion

From Lesswrongwiki
 
<blockquote>Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.</blockquote>

:I.J. Good, [http://www.aeiveos.com/~bradbury/Authors/Computing/Good-IJ/SCtFUM.html "Speculations Concerning the First Ultraintelligent Machine"]
  
One of what [[Eliezer Yudkowsky]] calls the [http://yudkowsky.net/singularity/schools three major singularity schools] centers on the idea that minds creating smarter minds could enter a feedback loop strong enough to go "foom" in very little time.
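To see why such a feedback loop could plausibly "explode" rather than merely grow, here is a minimal toy model (an illustrative sketch under assumed dynamics, not a claim made in Good's paper or in the posts below; the symbols I(t), c, p, and I_0 are all inventions of the sketch): suppose a system's capability I(t) improves at a rate that scales as the p-th power of its current capability.

<pre>
% Toy model (illustrative assumption, not from the source): capability I(t)
% improves at a rate that scales as the p-th power of current capability.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  \frac{dI}{dt} = c\,I^{\,p}, \qquad c > 0, \quad I(0) = I_0 > 0.
\]
% p = 1: ordinary exponential growth, I(t) = I_0 e^{c t}.
% p < 1: sub-exponential (polynomial) growth.
% p > 1: the solution blows up at a finite time t*:
\[
  I(t) = \bigl(I_0^{\,1-p} - (p-1)\,c\,t\bigr)^{-1/(p-1)},
  \qquad
  t^{*} = \frac{1}{(p-1)\,c\,I_0^{\,p-1}}.
\]
% So if each capability gain makes further gains more than proportionally
% easier (p > 1), the feedback loop diverges in bounded time ("goes foom");
% if gains get harder as capability rises (p < 1), growth stays sub-exponential.
\end{document}
</pre>

Which of these regimes real recursive self-improvement would fall into is, roughly, the question the hard-takeoff posts below debate.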
  
 
==Blog posts==
 
Posts by Eliezer Yudkowsky:
  
 
*[http://lesswrong.com/lw/w5/cascades_cycles_insight/ Cascades, Cycles, Insight...], [http://lesswrong.com/lw/w6/recursion_magic/ ...Recursion, Magic]
*[http://lesswrong.com/lw/we/recursive_selfimprovement/ Recursive Self-Improvement], [http://lesswrong.com/lw/wf/hard_takeoff/ Hard Takeoff], [http://lesswrong.com/lw/wg/permitted_possibilities_locality/ Permitted Possibilities, & Locality]
  
 
==See also==
 
