> Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind.

(I. J. Good, "Speculations Concerning the First Ultraintelligent Machine", 1965)
An intelligence explosion is a hypothetical strong positive feedback loop in which a recursively self-improving intelligence makes itself smarter at making itself smarter, producing a very fast and very dramatic leap in capability. Eliezer Yudkowsky identifies the intelligence explosion as one of the three major schools of thought about the Singularity.
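To make the feedback-loop intuition concrete, here is a minimal toy recurrence (an illustrative sketch, not a model proposed by Good or Yudkowsky; the function name, parameter values, and growth exponents are all assumptions). It contrasts sublinear, linear, and superlinear returns on cognitive reinvestment: the trajectory only "explodes" when each unit of capability makes further self-improvement disproportionately easier.

```python
# Toy model of the feedback loop described above: each generation spends one
# design cycle building a successor, and the size of the capability jump it
# can engineer depends on its current capability. Purely illustrative.

def self_improvement_trajectory(initial_capability: float,
                                returns_exponent: float,
                                efficiency: float = 0.1,
                                cycles: int = 20) -> list[float]:
    """Capability after each design cycle under the recurrence
    I_{n+1} = I_n + efficiency * I_n ** returns_exponent.

    returns_exponent < 1: diminishing returns, growth levels off.
    returns_exponent == 1: compounding returns, exponential growth.
    returns_exponent > 1: superlinear returns, explosive growth.
    """
    trajectory = [initial_capability]
    for _ in range(cycles):
        current = trajectory[-1]
        trajectory.append(current + efficiency * current ** returns_exponent)
    return trajectory

if __name__ == "__main__":
    for exponent in (0.5, 1.0, 1.5):
        final = self_improvement_trajectory(1.0, exponent)[-1]
        print(f"returns exponent {exponent}: capability after 20 cycles = {final:.2f}")
```

In the continuous-time version of this recurrence, dI/dt = c * I^k, any exponent k > 1 drives I to infinity in finite time, which is the mathematical sense in which "explosion" is usually meant; exponents at or below 1 give only exponential or slower growth.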
Posts by Eliezer Yudkowsky:
- Cascades, Cycles, Insight...
- ...Recursion, Magic
- Recursive Self-Improvement
- Hard Takeoff
- Permitted Possibilities, & Locality