Criticisms of the rationalist movement

Criticisms of the rationalist movement and Less Wrong have been raised on various grounds throughout the community's existence.

Cult of Rationality

Less Wrong has been referred to as a cult (or "phyg", the community's rot13 rendering of the word) on numerous occasions,[1][2][3] with Eliezer Yudkowsky as its leader. Eliezer's confidence in his AI safety work, conducted outside mainstream academia, and his self-professed intelligence have made him highly unpopular with his critics.[4]

Neoreaction

The Neoreaction movement[5] is notoriously adjacent to the community. While it has been explicitly repudiated by figures such as Eliezer[6][7] and Scott,[8] critics often associate it with the community regardless.[9][10]

Rationalism

The movement has been criticized for overemphasizing inductive reasoning over empiricism,[11] a criticism that Scott Alexander has rebutted.[12]

Roko's basilisk

The Roko's basilisk thought experiment was notorious in that it depended on preconditions found almost exclusively within the Less Wrong community, which supposedly rendered the reader vulnerable to this 'memetic hazard'. It has accordingly drawn derision from critics who feel that the perceived risk from unfriendly AI is overstated within the community.[13][14]

Transhumanism

Less Wrong's community was partially founded by soliciting users from the transhumanist SL4 mailing list, and Eliezer Yudkowsky is himself a prominent transhumanist.

As such, fringe transhumanist ideas such as cryonics have met with continued scorn from the skeptics based at RationalWiki.[15]

See also

References