A sequence is a series of posts on Less Wrong on a single topic, written to explore a particular thesis coherently and in full.
Reading the sequences is the most systematic way to approach the Less Wrong archives.
If you prefer books over blog posts, Thinking and Deciding by Jonathan Baron and Good and Real by Gary Drescher have been mentioned as books that overlap significantly with the sequences. (Read more about how the sequences fit in with work done by others.)
- 1 Alternative formats
- 2 Translations
- 3 Core Sequences
- 4 Major Sequences
- 4.1 Mysterious Answers to Mysterious Questions
- 4.2 A Human's Guide to Words
- 4.3 How To Actually Change Your Mind
- 4.4 Reductionism
- 4.5 The Quantum Physics Sequence
- 4.6 The Metaethics Sequence
- 4.7 The Fun Theory Sequence
- 4.8 The Craft and the Community
- 4.9 Highly Advanced Epistemology 101 for Beginners
- 5 Minor Sequences
- 6 The Hanson-Yudkowsky AI-Foom Debate
- 7 Sequences by Others
- 7.1 Positivism, Self Deception, and Neuroscience by Yvain
- 7.2 Priming and Implicit Association by Yvain
- 7.3 Introduction to Game Theory by Yvain
- 7.4 Decision Theory of Newcomblike Problems by AnnaSalamon
- 7.5 Living Luminously by Alicorn
- 7.6 The Science of Winning at Life by lukeprog
- 7.7 Rationality and Philosophy by lukeprog
- 7.8 No-Nonsense Metaethics by lukeprog
- 7.9 What Intelligence Tests Miss by Kaj_Sotala
- 7.10 Why Everyone (Else) Is a Hypocrite by Kaj_Sotala
The Sequences have been converted to eReader compatible formats by several projects.
- Print ready versions by jb55 (GitHub). Has versions in Markdown, PDF, and ePub. ePubs can be converted to nearly any other format with calibre.
- lw2ebook by OneWhoFrogs (GitHub). Includes all sequences in ePub and mobi formats.
- Sequences in EPub and MOBI formats by ciphergoth
- PDF version of most sequences in a single file. Has cross-reference support for internal links (PDF links or footnotes), and page size is appropriate for tablets.
- Print ready versions by Jordan. Contains all posts of each sequence in one HTML file. [Dead link.]
Castify makes selected Less Wrong content available as a podcast for a small fee, recorded by a professional voice actor. Currently they offer:
- The Simple Truth, Eliezer's introductory essay (about 40 minutes, also included in Map and Territory)
- Map and Territory (about 1h 30 min)
- Mysterious Answers to Mysterious Questions (about 2h 30m)
- Ethical Injunctions (about 1h 30m)
- A Human's Guide to Words (about 2h 30m)
-  (about 5h 20m)
- Main article: Translations into other languages
Map and Territory contains some of the most important introductory posts and essays.
The most important technique that Less Wrong can offer you is How To Actually Change Your Mind.
Long sequences that have been completed and organized into a guide.
How to see through the many disguises of answers or beliefs or statements that remove curiosity without alleviating confusion.
A series on the use and abuse of words; why you can't define a word any way you like; how human brains seem to process definitions. First introduces the Mind projection fallacy and the concept of how an algorithm feels from inside, which makes it a basic intro to key elements of the LW zeitgeist.
A mega-sequence scattered over almost all of Less Wrong on the ultra-high-level penultimate technique of rationality: actually updating on evidence.
Organized into eight subsequences.
How to take reality apart into pieces... and live in that universe, where we have always lived, without feeling disappointed about the fact that complicated things are made of simpler things.
A non-mysterious introduction to quantum mechanics, intended to be accessible to anyone who can grok algebra and complex numbers. Cleaning up the old confusion about QM is used to introduce basic issues in rationality (such as the technical version of Occam's Razor), epistemology, reductionism, naturalism, and philosophy of science. Not dispensable reading, even though the exact reasons for the digression are hard to explain in advance of reading.
What words like "right" and "should" mean; how to integrate moral concepts into a naturalistic universe.
The dependencies on this sequence may not be fully organized, and the post list does not have summaries. Yudkowsky considers this one of his less successful attempts at explanation.
A concrete theory of transhuman values. How much fun is there in the universe; will we ever run out of fun; are we having fun yet; could we be having more fun. Part of the complexity of value thesis. Also forms part of the fully general answer to religious theodicy.
The final sequence of Eliezer Yudkowsky's two-year-long string of daily posts to Less Wrong, on improving the art of rationality and building communities of rationalists.
A bottom-up guide to epistemology, and the first sequence of posts Eliezer wrote after a three-year gap. The sequence includes practical applications and puzzling meditations. The series has been written in full, but not all of the posts are online yet.
Smaller collections of posts. Usually parts of major sequences which depend on some-but-not-all of the points introduced.
A collection of introductory posts dealing with the fundamentals of rationality: the difference between the map and the territory, Bayes's Theorem and the nature of evidence, why anyone should care about truth, minds as reflective cognitive engines...
Some notes on the incredibly difficult feat of actually getting your brain to think about something (a key step in actually changing your mind). Whenever someone exhorts you to "think outside the box", they usually, for your convenience, point out exactly where "outside the box" is located. Isn't it funny how nonconformists all dress the same...
Subsequence of How to Actually Change Your Mind.
Some of the various ways that politics damages our sanity - including, of course, making it harder to change our minds on political issues.
Subsequence of How to Actually Change Your Mind.
Affective death spirals are positive feedback loops caused by the halo effect: Positive characteristics perceptually correlate, so the more nice things we say about X, the more additional nice things we're likely to believe about X.
Cultishness is an empirical attractor in human groups: roughly an affective death spiral; plus peer pressure and outcasting behavior; plus (often) defensiveness around something believed to be un-improvable.
Yet another subsequence of How to Actually Change Your Mind.
Ethical injunctions are rules not to do something even when you believe it's the right thing to do. This is to protect you from your own cleverness (especially taking bad black swan bets), and the Corrupted hardware you're running on.
Related to the Metaethics sequence.
If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality.
Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.
Subsequence of Reductionism.
On the putative "possibility" of beings who are just like us in every sense, but not conscious - that is, lacking inner subjective experience.
Subsequence of Reductionism.
Learning the very basic math of evolutionary biology costs relatively little if you understand algebra, but gives you a surprisingly different perspective from what you'll find in strictly nonmathematical texts.
How to do things that are difficult or "impossible".
How Yudkowsky made epic errors of reasoning as a teenage "rationalist" and recovered from them starting at around age 23, the period that he refers to as his Bayesian Enlightenment.
Sequences by Others
Later sequences written by people other than Eliezer Yudkowsky.
Priming may be described as the capability of any random stimulus to commandeer your thinking and judgement for the next several minutes. Scared? Don't be. There are ways to defend yourself against these kinds of intrusions, and even methods to harness them as useful testing mechanisms.
Decisions need to be modeled with some structure in order to be scrutinized and systematically improved; simply "intuiting" the answers to decision problems by ad-hoc methods is not conducive to thorough analysis. For this, we formulate decision theories. This sequence, themed with an analysis of Newcomb's problem, is a consolidated summary and context for the many decision theory discussions found on LessWrong at the time of writing.
Luminosity, as used here, is self-awareness. A luminous mental state is one that you have and know that you have. It could be an emotion, a belief or alief, a disposition, a quale, a memory - anything that might happen or be stored in your brain. What's going on in your head?
This sequence summarizes scientifically-backed advice for "winning" at everyday life: in one's productivity, in one's relationships, in one's emotions, etc. Each post concludes with footnotes and a long list of references from the academic literature.
This sequence explains how intuitions are used in mainstream philosophy and what the science of intuitions suggests about how intuitions should be used in philosophy.
This sequence explains and defends a naturalistic approach to metaethics.
A sequence summarizing the content of Keith Stanovich's book What Intelligence Tests Miss.
A sequence summarizing the content of Robert Kurzban's book Why Everyone (Else) is a Hypocrite: Evolution and the Modular Mind (this sequence hasn't been finished).