Superintelligence

A superintelligence is a being with intelligence far surpassing that of the best human minds, and a focus of the Singularity Institute's research. Nick Bostrom (1997) defined it as:

"An intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom and social skills."

The Singularity Institute is dedicated to ensuring humanity's safety and prosperity by preparing for the creation of an Artificial General Intelligence, which may develop into a superintelligence. A superintelligence with an architecture not based on the human brain is sometimes called a strong superintelligence. Given its intelligence, it would likely be beyond humanity's ability to control. It is important to prepare early by developing friendly artificial intelligence, as an AI arms race could leave little time to ensure safety.

An Artificial General Intelligence would have a number of advantages, such as the ability to copy itself, run on faster hardware, and directly improve its own code, that could help it develop superintelligence.

The development of superintelligence from humans has also been considered; the result is sometimes called a weak superintelligence. It may come in the form of whole brain emulation, in which a human brain is scanned and simulated on a computer. The development of brain-computer interfaces may also lead to superintelligence, as could biological enhancements such as genetic manipulation or the use of nootropics.

Blog Posts

External Links

See Also