RAZ Errata

From Lesswrongwiki
Revision as of 05:14, 26 June 2015 by RobbBB (talk | contribs)

This is a list of errata for the first edition of Rationality: From AI to Zombies, by Eliezer Yudkowsky. You can add any errors you notice here, or email them to errata@intelligence.org.

Thanks go to Viliam Búr, Wayne McDougall, and Gram Stone for pointing out these mistakes.

 

Important errors

1. What Do I Mean By "Rationality"?

The link to Jim Holt's introduction to Newcomb's Problem is broken; the correct link is http://www.slate.com/articles/arts/egghead/2002/02/thinkinginside_the_boxes.html.
 

74. Avoiding Your Belief's Real Weak Points

"My point is that, when it comes to spontaneous self-questioning, one is much more likely to spontaneously self-attack strong points with comforting replies to rehearse, then to spontaneously self-attack the weakest, most vulnerable points." should say "than" in place of "then." This is an important meaning difference, albeit one that's easy to glean from context.
 

Minor errors

Biases: An Introduction

"We’re especially loathe to think" should be "We’re especially loath to think"
 

76. Fake Justification

"Renown is one reasonable criteria" should be "Renown is one reasonable criterion"
 

87. Anchoring and Adjustment

"then they adjust upward or downward from their starting estimate until they reached an answer that “sounded plausible”; and then they stopped adjusting." should be "then they adjust upward or downward from their starting estimate until they reach an answer that “sounds plausible”; and then they stop adjusting." to maintain tense consistency.
 

91. The "Outside the Box" Box

The link to Scott Aaronson's blog is directed at the wrong comment; the correct link is http://www.scottaaronson.com/blog/?p=87#comment-2092.
 

150. The Hidden Complexity of Wishes

"programming an Arithmetic Expert Systems" should be "programming Arithmetic Expert Systems"
 

157. Similarity Clusters

"these things may serve out to single out only humans" should be "these things may serve to single out only humans"
 

Interlude: An Intuitive Explanation of Bayes's Theorem

"a women" should be "a woman"
 

184. Beautiful Probability

"taken as a the limit of a finite process" should be "taken as the limit of a finite process"
 

187. Perpetual Motion Beliefs

"why couldn’t be the case" should be "why that couldn’t be the case"
"sticking in your finger into boiling water" should be "sticking your finger into boiling water"
 

204. Bind Yourself to Reality

"Think like Reality" should be "Think Like Reality" (with "Like" capitalized).
 

214. Hands v. Fingers

"That that’s why we have" should be "That’s why we have".
 

247. Science Isn't Strict Enough

"you could know things without looking them" should be "you could know things without looking at them"