
The Singularity or Technological Singularity refers to a hypothetical future event in which Artificial Intelligence vastly outperforms the abilities of the human mind. Because such intelligences would be, by definition, beyond human comprehension, it is difficult for us to imagine how they would behave. Various commentators have proposed dates for when this might occur and have speculated on the implications it would have for humanity.

These predictions often draw on Moore's Law, which has tracked the exponential growth of computing power for over 50 years. Extrapolating this trend allows computer scientists to estimate when certain computing projects (such as whole brain emulation) may become feasible, even if they are beyond the capabilities of today's computers.
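The kind of extrapolation described above can be sketched in a few lines. The doubling period, the current capability, and the target figure below are purely illustrative assumptions, not values taken from the article or from any actual forecast.

```python
import math

def years_until_feasible(current_ops, required_ops, doubling_years=2.0):
    """Estimate years until computing capability reaches required_ops,
    assuming capability doubles every `doubling_years` years
    (a simple Moore's-Law-style projection)."""
    if current_ops >= required_ops:
        return 0.0
    # Number of doublings needed, then scale by the doubling period.
    doublings = math.log2(required_ops / current_ops)
    return doublings * doubling_years

# Hypothetical example: 1e15 ops/s available today, 1e18 ops/s required
# for some target project; both numbers are made up for illustration.
print(round(years_until_feasible(1e15, 1e18), 1))  # → 19.9
```

Such projections are only as good as their assumptions: a constant doubling time is an idealization, and real hardware trends can slow down or shift.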

A number of prominent computer scientists and researchers have speculated that the Singularity may occur in the near term. These include Hans Moravec, Eliezer Yudkowsky, Bill Joy and, most famously, Ray Kurzweil in his book "The Singularity is Near".

Predicted consequences of such an event range from Kurzweil's largely positive vision to Bill Joy's existential pessimism, outlined in his essay "Why the future doesn't need us."
