Intelligence explosion

From Lesswrongwiki
Revision as of 15:04, 1 November 2009 by Zack M. Davis

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.

(I. J. Good, "Speculations Concerning the First Ultraintelligent Machine", 1965)

An intelligence explosion is a hypothetical strong positive feedback loop in which a recursively self-improving intelligence becomes better at improving itself with each improvement, resulting in a very fast, very dramatic leap in capability. Eliezer Yudkowsky identifies the intelligence explosion as one of the three major schools of thought about the Singularity.
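The feedback loop above can be sketched with a toy numerical model (not part of the original article; all parameter names and values are illustrative). If each gain in capability also increases the rate at which further gains are made, the trajectory grows faster than ordinary compound growth:

```python
# Toy model of recursive self-improvement (illustrative assumption:
# the per-cycle gain is proportional to capability *squared*, i.e.
# smarter systems are better at making themselves smarter).

def self_improvement_trajectory(initial=1.0, gain=0.1, steps=10):
    """Capability over discrete improvement cycles.

    Each cycle adds gain * capability**2, so the improvement rate
    itself rises as capability rises -- the positive feedback loop.
    """
    capability = initial
    trajectory = [capability]
    for _ in range(steps):
        capability += gain * capability ** 2
        trajectory.append(capability)
    return trajectory

# Ordinary compound (exponential) growth at a fixed 10% rate,
# for comparison: the rate never improves.
exponential = [1.0 * 1.1 ** n for n in range(11)]

explosive = self_improvement_trajectory()

# The self-improving trajectory overtakes fixed-rate growth.
print(explosive[-1] > exponential[-1])  # → True
```

The contrast is the point of the "explosion" metaphor: under fixed-rate growth the doubling time is constant, while under this kind of feedback each doubling arrives sooner than the last.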

Blog posts

Posts by Eliezer Yudkowsky:

See also