AI takeoff
Revision as of 00:27, 17 June 2012
AI takeoff refers to a point in the future at which an Artificial General Intelligence (AGI) expands to become a superintelligent AI (SAI). The speed at which an AGI would expand divides takeoff scenarios into “soft” and “hard”.
“Soft takeoff” scenarios are those in which an AGI's progression from standard intelligence to superintelligence occurs on a time scale that allows human interaction. By maintaining control over the AGI's ascent, it should be possible for a Friendly AI to emerge.
A “hard takeoff” is widely considered much more precarious, as it involves an AGI rapidly ascending to superintelligence without human control. This may result in unexpected behavior (i.e. an Unfriendly AI). A hard takeoff can be characterized either by a system attaining vastly greater intelligence or by one acquiring extensive computing resources (e.g. control of the Internet).
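The soft/hard distinction is sometimes framed as a difference in growth regimes: externally driven improvement adds capability at a roughly constant rate, while recursive self-improvement feeds each gain back into the next. A minimal toy sketch of that contrast (a hypothetical illustration, not a model from any of the essays linked below):

```python
# Toy model: how many improvement steps pass before a capability
# threshold is crossed under two growth regimes.

def steps_to_threshold(initial, threshold, step):
    """Count iterations until capability reaches the threshold."""
    capability = initial
    steps = 0
    while capability < threshold:
        capability = step(capability)
        steps += 1
    return steps

# "Soft" regime: capability rises by a fixed external increment each
# step, leaving many steps (i.e. time) for human oversight.
soft = steps_to_threshold(1.0, 1000.0, lambda c: c + 1.0)

# "Hard" regime: each gain feeds back into the next step (recursive
# self-improvement), so the same threshold is crossed far sooner.
hard = steps_to_threshold(1.0, 1000.0, lambda c: c * 2.0)

print(soft, hard)  # → 999 10
```

The point of the sketch is only that the window for human interaction shrinks drastically when improvement compounds; the specific rates and thresholds are arbitrary.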
Blog Posts
- Hard Takeoff by Eliezer Yudkowsky
- The Age of Virtuous Machines by J. Storrs Hall, president of the Foresight Institute
- Hard Takeoff Hypothesis by Ben Goertzel
External Links
- Extensive archive of hard takeoff essays from Accelerating Future
- Can we avoid a hard take off? by Vernor Vinge