Intelligence explosion

From Lesswrongwiki

Revision as of 19:07, 30 August 2009

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.

— I. J. Good

One of the three major singularity schools, as Eliezer Yudkowsky describes them, centers on the idea that minds creating smarter minds could enter a feedback loop strong enough to go "foom" in very little time.
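The feedback loop can be sketched as a toy model (not from the article; the function name and parameters are invented for illustration): suppose each generation of minds improves on itself by an amount that scales with its own ability. If the scaling is superlinear, the threshold for vastly superhuman ability is crossed in far fewer generations than steady exponential growth would suggest.

```python
def generations_to_threshold(i0=1.0, gain=0.1, p=1.5,
                             threshold=1000.0, max_steps=10_000):
    """Toy recursive-improvement model.

    Each step, a mind of ability `i` designs a successor whose
    improvement scales with that ability: i += gain * i**p.
    p == 1 gives ordinary exponential growth; p > 1 gives the
    runaway "foom" dynamic.  Returns the number of generations
    until `threshold` is crossed, or None if it never is.
    """
    i = i0
    for step in range(1, max_steps + 1):
        i += gain * i ** p
        if i >= threshold:
            return step
    return None
```

With these (arbitrary) parameters, the superlinear case (p=1.5) reaches the threshold in noticeably fewer generations than the merely exponential case (p=1.0), which is the intuition behind the "very little time" claim.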

See also

Blog posts

Posts by Eliezer Yudkowsky: