Intelligence explosion

<blockquote>Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.</blockquote>

:-- [[I. J. Good]], "Speculations Concerning the First Ultraintelligent Machine" (1965)

One of what [[Eliezer Yudkowsky]] calls the [http://yudkowsky.net/singularity/schools three major singularity schools] centers on the idea that minds creating smarter minds could enter a feedback loop strong enough to go "foom" in very little time.   
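
As a minimal toy model (purely illustrative; the capability variable <math>I</math>, the rate constant <math>k</math>, and the returns exponent <math>a</math> are assumptions of this sketch, not quantities from the sources above), suppose a system's ability to improve itself grows with its current capability:

:<math>\frac{dI}{dt} = k I^{a}, \qquad I(0) = I_0, \qquad k > 0.</math>

With <math>a = 1</math> this gives ordinary exponential growth, <math>I(t) = I_0 e^{kt}</math>. With superlinear returns, <math>a > 1</math>, the solution <math>I(t) = \left( I_0^{1-a} - (a-1) k t \right)^{-1/(a-1)}</math> diverges at the finite time <math>t^{*} = 1/\left( (a-1) k I_0^{a-1} \right)</math>: capability blows up, which is one way of making "go 'foom' in very little time" precise. Whether returns to intelligence are actually superlinear is the substance of the [[Hard takeoff]] debate.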
 
==Blog posts==
 
Posts by [[Eliezer Yudkowsky]]:

*[http://lesswrong.com/lw/w5/cascades_cycles_insight/ Cascades, Cycles, Insight...], [http://lesswrong.com/lw/w6/recursion_magic/ ...Recursion, Magic]
 
*[http://lesswrong.com/lw/we/recursive_selfimprovement/ Recursive Self-Improvement], [http://lesswrong.com/lw/wf/hard_takeoff/ Hard Takeoff]
 
==See also==

*[[Technological singularity]], [[Hard takeoff]]
*[[Artificial general intelligence]]
*[[Lawful intelligence]]
  
 
{{stub}}
 
[[Category:Concepts]]
 
[[Category:Future]]
 