{{wikilink}}
You may think the [[future]] will be just like the present. But what if the future gets interrupted by this big and excitingly different thing involving some combination of shiny technology, impressive robot brains, and blasphemous blendings of man and machine?

As you may have noticed, people have let the meaning of the word "singularity" slip away into [http://www.acceleratingfuture.com/michael/blog/2007/07/the-word-singularity-has-lost-all-meaning/ ever greater ambiguity]. But some have tried to create order in this chaos. [[Eliezer Yudkowsky]] offers [http://yudkowsky.net/singularity/schools three major schools]: accelerating change, the event horizon, and the [[intelligence explosion]].

==Blog posts==

*[http://intelligence.org/blog/2007/09/30/three-major-singularity-schools/ Three Major Singularity Schools] by [[Eliezer Yudkowsky]]
*[http://www.acceleratingfuture.com/michael/blog/2007/07/the-word-singularity-has-lost-all-meaning/ The Word “Singularity” Has Lost All Meaning] by [[Michael Anissimov]]

==External links==

*[http://facingthesingularity.com/ Facing the Singularity] by [http://lukeprog.com/ lukeprog]

==See also==

*[[Absurdity heuristic]]
*[[Hard takeoff]]
*[[Intelligence explosion]]
*[[Existential risk]]
*[[Future]]

[[Category:Concepts]]
[[Category:Future]]