The '''Singularity''' or '''Technological Singularity''' refers to a hypothetical future point at which [[Artificial Intelligence]] vastly outperforms the abilities of the human mind. It is difficult to imagine how such minds would behave or how the future might unfold beyond that point. Various commentators have proposed differing dates for when the Singularity will occur and differing accounts of what it would mean for humanity.
  
These predictions are largely based on mathematical projections of [[Moore's law]], which has accurately described the exponential growth of computing power for over 50 years. Such projections allow computer scientists to estimate when computing projects (such as brain emulation) that are beyond the capabilities of today’s computers will become feasible.
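As a rough illustration of how such feasibility dates are estimated, the sketch below extrapolates an exponential trend forward in time. It is only a toy calculation: the starting capacity, the required capacity and the two-year doubling period are assumed placeholder values, not real hardware figures.

<syntaxhighlight lang="python">
# Illustrative sketch: extrapolating a Moore's-law-style exponential trend
# to estimate when a computing requirement might become feasible.
# All numbers are placeholder assumptions, not real hardware data.
import math

current_capacity = 1.0e16    # assumed operations per second available today
required_capacity = 1.0e19   # assumed requirement for some project (e.g. brain emulation)
doubling_time_years = 2.0    # assumed doubling period for the trend

# capacity(t) = current_capacity * 2 ** (t / doubling_time_years)
# Solve capacity(t) >= required_capacity for t.
years_needed = doubling_time_years * math.log2(required_capacity / current_capacity)
print(f"Requirement reached in roughly {years_needed:.1f} years, if the trend holds.")
</syntaxhighlight>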
  
Eliezer Yudkowsky has observed that the varying perspectives on the Singularity can be broadly split into three "major schools": Accelerating Change (Ray Kurzweil), the Event Horizon (Vernor Vinge), and the [[Intelligence Explosion]] (I.J. Good).

The Accelerating Change School observes that, contrary to our intuitive linear expectations about the future, the power of information technology grows exponentially. This includes, but is not restricted to, [[Moore's law]]; other examples include Internet speed, gene sequencing and the spatial resolution of brain scanning. By projecting these growth trends into the future, it becomes possible to estimate what it will be feasible to engineer, and when. Kurzweil specifically dates the Singularity to 2045.
  
The Event Horizon School observes that for the entirety of Earth’s history all technological and social progress has been the product of the human mind. Vernor Vinge argues that technology will soon improve on human intelligence, whether through brain-computer interfaces, Artificial Intelligence, or both, and that beyond this point technological progress will exceed the comprehension of anything a mere human can imagine now.
 
The Intelligence Explosion School asserts that a positive feedback loop could be created in which an intelligence makes itself smarter, thereby getting better at making itself smarter still. A strong version of this idea suggests that once this feedback begins to operate, it will very quickly lead to a dramatic leap in capability. The scenario does not necessarily require an entirely computational substrate: humans with computer-augmented brains, or humans whose cognition has been genetically enhanced, could also set off an Intelligence Explosion. It is this interpretation of the Singularity that Less Wrong broadly focuses on.
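The qualitative difference between externally driven improvement and a self-reinforcing loop can be caricatured with a toy model. The sketch below is purely illustrative: the growth rates and the assumption that each gain is proportional to current capability are placeholders, not predictions about real systems.

<syntaxhighlight lang="python">
# Toy model contrasting fixed, externally driven progress with a positive
# feedback loop in which capability compounds. Numbers are illustrative only.

def external_progress(steps, increment=1.0):
    """Capability improves by a fixed amount each step (no feedback)."""
    capability = 1.0
    for _ in range(steps):
        capability += increment
    return capability

def self_improvement(steps, rate=0.5):
    """Each step's gain is proportional to current capability, so growth compounds."""
    capability = 1.0
    for _ in range(steps):
        capability += rate * capability
    return capability

for n in (5, 10, 20):
    print(f"steps={n}: external={external_progress(n):.1f}, feedback={self_improvement(n):.1f}")
</syntaxhighlight>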
 
The consequences of a Singularity range from Kurzweil’s largely positive predictions to the existential pessimism outlined in Bill Joy’s essay "Why the future doesn’t need us."
  
 
==Blog posts==
 
==References==

==External links==

==See also==