Intelligence explosion

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.

I. J. Good, "Speculations Concerning the First Ultraintelligent Machine" (1965)

The intelligence explosion is one of what Eliezer Yudkowsky calls the three major singularity schools. It centers on the idea that minds capable of creating smarter minds could enter a positive feedback loop of recursive self-improvement strong enough to go "foom," producing a dramatic jump in capability in very little time.
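
As a rough illustration of why such a feedback loop could be fast (a toy model, not part of the original article; the intelligence level I, the feedback constant k, and the starting level I_0 are assumed for the sketch), suppose a system's rate of self-improvement grows with its current intelligence. Linear feedback gives exponential growth, while superlinear feedback, here taken proportional to I squared, diverges in finite time:

\[
\frac{dI}{dt} = k\,I \;\Longrightarrow\; I(t) = I_0 e^{kt},
\qquad
\frac{dI}{dt} = k\,I^{2} \;\Longrightarrow\; I(t) = \frac{I_0}{1 - k I_0 t}.
\]

In the second case \(I(t)\) blows up as \(t\) approaches \(1/(k I_0)\): the smarter the system becomes, the faster it becomes smarter, so most of the growth is compressed into the final stretch. This is one simple way to formalize the "foom" intuition.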

See also

Blog posts

Posts by Eliezer Yudkowsky: