From Lesswrongwiki
Revision as of 02:18, 6 June 2012 by Daniel Trenor (talk | contribs) (Created page with "The ‘’’Singularity’’’ or ‘’’Technological Singularity’’’ refers to a hypothetical future event where Artificial Intelligence vastly outperforms the abilit...")

The '''Singularity''' or '''Technological Singularity''' refers to a hypothetical future event in which Artificial Intelligence comes to vastly outperform the abilities of the human mind. Because such Intelligences would be, by definition, beyond human comprehension, it is difficult for us to imagine how they would behave. Commentators differ widely on when this event will occur and on what its implications for humanity would be.

These predictions are typically based on extrapolations of Moore's Law, which has accurately described the exponential growth of computing hardware for over 50 years. Such extrapolations allow computer scientists to estimate when certain computing projects (such as Brain Emulation) should become feasible, even though they are beyond the capabilities of today's computers.
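The arithmetic behind such extrapolations is simple: if computing power doubles every fixed interval, the number of doublings needed to close the gap to some target gives the estimated wait. A minimal sketch, where the doubling time, the current capacity, and the operations-per-second figure assumed for brain emulation are all illustrative placeholders, not values from this article:

```python
import math

def years_until_feasible(current_ops, required_ops, doubling_years=2.0):
    """Estimate years until compute reaches a target capacity,
    assuming Moore's-Law-style doubling every `doubling_years` years."""
    if current_ops >= required_ops:
        return 0.0
    doublings = math.log2(required_ops / current_ops)
    return doublings * doubling_years

# Illustrative figures only: suppose today's hardware delivers
# 10^15 ops/s and brain emulation is assumed to require 10^18 ops/s.
print(years_until_feasible(1e15, 1e18))  # about 20 years at a 2-year doubling time
```

The estimate is only as good as its assumptions; a longer doubling time, or a different guess at the required capacity, shifts the projected date substantially.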

Although Alan Turing had envisioned the possibility of such Intelligences as early as 1951, it was another mathematician, John von Neumann, who is the first recorded to have used the term "Singularity" in this context. However, he made no predictions about when the Singularity would arrive, nor did he formally write any papers on the subject.

The concept passed from academia into popular culture via science fiction authors such as Vernor Vinge (also a professor of mathematics), who distributed an essay over the Internet in 1993 containing his prediction that "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended... I'll be surprised if this event occurs before 2005 or after 2030."

A number of prominent computer scientists have also speculated that the Singularity will occur in the near-term future. These include Hans Moravec, Eliezer Yudkowsky, Bill Joy, and, most famously, Ray Kurzweil in his book “The Singularity is Near”.

Predicted consequences of such an event range from Kurzweil’s largely positive outlook to Bill Joy’s existential pessimism, outlined in his essay “Why the future doesn’t need us.”

Blog Post

External Links