Singularity

The '''Singularity''' or '''Technological Singularity''' refers to a hypothetical future event in which [[Artificial Intelligence]] vastly outperforms the abilities of the human mind. It is difficult for us to imagine how such intelligences would behave or how the future might unfold beyond that point. Commentators have proposed widely varying dates for when the Singularity will occur, and equally varied accounts of its implications for humanity.
  
These predictions are largely based on mathematical projections of [[Moore's law]], which has accurately described the exponential growth of computing power for over 50 years. Such projections allow computer scientists to estimate when computing projects that exceed the capabilities of today's machines (such as whole brain emulation) will become feasible.
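
As a back-of-the-envelope illustration (not drawn from the article's sources), the arithmetic behind such extrapolations is simple: if computing capacity doubles every ''T'' years, it reaches a given target after ''T'' × log<sub>2</sub>(target / current) years. The Python sketch below uses placeholder figures, a present capacity of 10<sup>16</sup> FLOP/s, a brain-emulation requirement of 10<sup>18</sup> FLOP/s, and a two-year doubling time; none of these numbers comes from the article.

<pre>
import math

def years_until_feasible(current_flops, required_flops, doubling_time_years=2.0):
    """Moore's-law-style extrapolation: capacity(t) = current * 2**(t / doubling_time),
    so the target capacity is reached at t = doubling_time * log2(required / current)."""
    if current_flops >= required_flops:
        return 0.0
    return doubling_time_years * math.log2(required_flops / current_flops)

# Placeholder estimates (assumptions, not established figures):
# ~10^16 FLOP/s available today, ~10^18 FLOP/s for whole brain emulation.
print(years_until_feasible(1e16, 1e18))  # about 13.3 years
</pre>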
  
Eliezer Yudkowsky has observed that the varying perspectives on the Singularity can be broadly split into three "major schools": Accelerating Change (Ray Kurzweil), the Event Horizon (Vernor Vinge), and the [[Intelligence Explosion]] (I. J. Good).
  
The consequences of such an event range from Kurzweil's largely positive predictions to Bill Joy's existential pessimism, outlined in his essay "Why the Future Doesn't Need Us."
  
==Blog posts==
  
*[http://yudkowsky.net/singularity/schools Three Major Singularity Schools] by Eliezer Yudkowsky
*[http://lesswrong.com/lw/wf/hard_takeoff/ Hard Takeoff] by Eliezer Yudkowsky
  
 
==References==
 
*[http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html The Coming Technological Singularity] essay by Vernor Vinge
 
  
==External links==
  
*[http://www.youtube.com/watch?v=IfbOyw3CT6A Singularity TED Talk] by Ray Kurzweil (YouTube)
 
*[http://www.youtube.com/watch?v=mDhdt58ySJA The Singularity: Three Major Schools of Thought] Singularity Summit talk by Eliezer Yudkowsky
 
  
==See also==
  
*[[Intelligence explosion]]
*[[Hard takeoff]], [[Soft takeoff]]
 
