The Singularity or Technological Singularity is a term with a number of different meanings, ranging from a period of rapid change to the creation of greater-than-human intelligence.
Three Singularity schools
Eliezer Yudkowsky has observed that the varying perspectives on the Singularity can be broadly split into three "major schools": Accelerating Change (Ray Kurzweil), the Event Horizon (Vernor Vinge), and the Intelligence Explosion (I.J. Good).
The Accelerating Change School observes that, contrary to our intuitive linear expectations about the future, the power of information technology increases exponentially. This includes, but is not restricted to, Moore's law; other examples include Internet bandwidth, gene sequencing, and the spatial resolution of brain scanning. By projecting these growth trends forward, it becomes possible to estimate what we will be able to engineer in the future. Kurzweil dates the Singularity to 2045.
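The gap between linear intuition and exponential reality can be illustrated with a toy extrapolation. The function and the 18-month doubling time below are purely illustrative assumptions, not figures from Kurzweil:

```python
# Toy illustration of exponential extrapolation of a technology metric.
# The 18-month (1.5-year) doubling time is an illustrative assumption,
# not a claim about any specific technology.

def projected_capability(base: float, years: float, doubling_years: float = 1.5) -> float:
    """Project a metric forward assuming a fixed doubling time."""
    return base * 2 ** (years / doubling_years)

# Linear intuition might expect 30 years of progress to multiply capability
# by a modest factor; a fixed 18-month doubling time instead multiplies it
# by 2**20:
print(projected_capability(1.0, 30.0))  # → 1048576.0 (i.e. 2**20)
```

The point of the school is precisely this contrast: over multi-decade horizons, compounding doublings dominate any linear trend.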
The Event Horizon School observes that for the entirety of Earth's history, all technological and social progress has been the product of the human mind. Vernor Vinge argues that technology will soon improve on human intelligence, via brain-computer interfaces, Artificial Intelligence, or both, and that beyond this point technological progress will exceed the comprehension of any present-day human.
The Intelligence Explosion School asserts that a positive feedback loop could arise in which an intelligence makes itself smarter, thereby getting better at making itself smarter still. A strong version of this idea holds that once this positive feedback takes hold, it will produce a dramatic leap in capability very quickly. The explosion need not occur on a purely computational substrate: humans with computer-augmented brains, or with genetically enhanced intelligence, could also set off an Intelligence Explosion. It is this interpretation of the Singularity that Less Wrong broadly focuses on.
Philosopher David Chalmers published a significant analysis of the Singularity, focusing on intelligence explosions, in the Journal of Consciousness Studies. He carefully examines the main premises of, and arguments for, a singularity. According to him, the main argument is:
- 1. There will be AI (before long, absent defeaters).
- 2. If there is AI, there will be AI+ (soon after, absent defeaters).
- 3. If there is AI+, there will be AI++ (soon after, absent defeaters).
- Therefore: 4. There will be AI++ (before too long, absent defeaters).
He then searches for arguments supporting the three premises. Premise 1 seems to be grounded in either the evolutionary argument or the emulation argument for human-level AI. Premise 2 is grounded in the existence and feasibility of an extensibility method for greater-than-human intelligence, and premise 3 in a more general version of premise 2. His analysis of how the singularity could occur defends the likelihood of an intelligence explosion. He also discusses the nature of general intelligence and possible obstacles to a singularity. A good deal of discussion is given to the dangers of an intelligence explosion, and Chalmers concludes that we must negotiate it very carefully by building the correct values into the initial AIs.
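The logical skeleton of the argument is a chain of two implications discharged by modus ponens. A minimal sketch in Lean makes this explicit; the proposition names are illustrative labels, not Chalmers' notation:

```lean
-- Chalmers' argument schema: premises 1–3 entail conclusion 4
-- via two applications of modus ponens.
variable (AI AIPlus AIPlusPlus : Prop)

theorem chalmers_argument
    (p1 : AI)                   -- Premise 1: there will be AI
    (p2 : AI → AIPlus)          -- Premise 2: AI leads to AI+
    (p3 : AIPlus → AIPlusPlus)  -- Premise 3: AI+ leads to AI++
    : AIPlusPlus :=             -- Conclusion 4: there will be AI++
  p3 (p2 p1)
```

The formal validity is trivial; the philosophical work, as Chalmers' analysis shows, lies in defending the premises and the "absent defeaters" qualifications.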
Predicted consequences of a Singularity range from Kurzweil's largely optimistic vision to Bill Joy's existential pessimism, outlined in his essay "Why the future doesn't need us."
- Speculations Concerning the First Ultraintelligent Machine by I.J. Good
- Why the future doesn't need us, Bill Joy's article for Wired magazine.
- The Coming Technological Singularity, essay by Vernor Vinge
- An overview of models of technological singularity by Anders Sandberg
- Singularity TED Talk by Ray Kurzweil (YouTube)
- The Singularity Three Major Schools of Thought Singularity Summit Talk by Eliezer Yudkowsky