A '''sequence''' is a series of posts on Less Wrong on the same topic, written to coherently and fully explore a particular thesis.
  
The original sequences were written by [[Eliezer Yudkowsky]] with the goal of creating a book on rationality. [[MIRI]] has since collated and edited the sequences into ''[[Rationality: From AI to Zombies]]''. If you are new to Less Wrong, this book is the best place to start.

Reading through the sequences is the most systematic way to approach the Less Wrong archives.
 
__TOC__

=Core Sequences=

[[Map and Territory (sequence) | Map and Territory]] contains some of the most important introductory posts and essays.

If you don't read the sequences on [[Mysterious Answers to Mysterious Questions]] and [[Reductionism (sequence) | Reductionism]], nothing else on Less Wrong will make much sense.

The most important method that Less Wrong can offer you is [[How To Actually Change Your Mind]].

==Rationality: From AI to Zombies==
[[File:Rationality-Angled-Cover-Web.jpg|frame|right|''[[Rationality: From AI to Zombies]]'' cover image.]]

''[[Rationality: From AI to Zombies]]'' is an ebook collecting six books' worth of essays on the science and philosophy of human rationality. It's one of the best places to start for people who want to better understand topics that crop up on ''Less Wrong'', such as cognitive bias, the map-territory distinction, meta-ethics, and existential risk.

The ebook can be downloaded on a "pay-what-you-want" basis from [https://intelligence.org/rationality-ai-zombies intelligence.org]. Its six books break down into twenty-six sections:
*Book I: '''''[[Map and Territory]]'''''. An introduction to the Bayesian concept of rational belief.
**A. Predictably Wrong
**B. Fake Beliefs
**C. Noticing Confusion
**D. Mysterious Answers

*Book II: '''''[[How to Actually Change Your Mind]]'''''. A guide to noticing motivated reasoning and overcoming confirmation bias.
**E. Overly Convenient Excuses
**F. Politics and Rationality
**G. Against Rationalization
**H. Against Doublethink
**I. Seeing with Fresh Eyes
**J. Death Spirals
**K. Letting Go

*Book III: '''''[[The Machine in the Ghost]]'''''. Essays on the general topic of minds, goals, and concepts.
**L. The Simple Math of Evolution
**M. Fragile Purposes
**N. A Human's Guide to Words

*Book IV: '''''[[Mere Reality]]'''''. Essays on science and the physical world.
**O. Lawful Truth
**P. Reductionism 101
**Q. Joy in the Merely Real
**R. Physicalism 201
**S. Quantum Physics and Many Worlds
**T. Science and Rationality

*Book V: '''''[[Mere Goodness]]'''''. A discussion of ethics, and of things people value in general.
**U. Fake Preferences
**V. Value Theory
**W. Quantified Humanism

*Book VI: '''''[[Becoming Stronger]]'''''. Essays on self-improvement, group rationality, and rationality groups.
**X. Yudkowsky's Coming of Age
**Y. Challenging the Difficult
**Z. The Craft and the Community
 
=Major Sequences=

Long sequences that have been completed and organized into a guide.

==[[Mysterious Answers to Mysterious Questions]]==

How to see through the many disguises of answers or beliefs or statements that don't answer or say or mean anything. The first (and probably most important) core sequence on Less Wrong.

==[http://lesswrong.com/lw/od/37_ways_that_words_can_be_wrong/ A Human's Guide to Words]==

A series on the use and abuse of words; why you often ''can't'' define a word any way you like; how human brains seem to process definitions. First introduces the Mind Projection Fallacy and the concept of how an algorithm feels from inside, which makes it a basic intro to key elements of the LW zeitgeist.

==[[How To Actually Change Your Mind]]==

A mega-sequence scattered over almost all of Less Wrong on the ultra-high-level penultimate technique of rationality: actually updating on the evidence.

Organized into eight subsequences.

==[[Reductionism (sequence) | Reductionism]]==

The second core sequence of Less Wrong. How to take reality apart into pieces... and live in that universe, where we have always lived, without feeling disappointed about the fact that complicated things are made of simpler things.

==[http://lesswrong.com/lw/r5/the_quantum_physics_sequence/ The Quantum Physics Sequence]==

A ''non-mysterious'' introduction to quantum mechanics, intended to be accessible to anyone who can grok algebra and complex numbers. Cleaning up the old confusion about QM is used to introduce basic issues in rationality (such as the technical version of [[Occam's Razor]]), epistemology, reductionism, naturalism, and philosophy of science. ''Not'' dispensable reading, even though the exact reasons for the digression are hard to explain in advance of reading.

==[[Metaethics_sequence | The Metaethics Sequence]]==

What words like "right" and "should" mean; how to integrate moral concepts into a naturalistic universe.

The dependencies on this sequence may not be fully organized, and the post list does not have summaries. Yudkowsky considers this one of his less successful attempts at explanation.

==[http://lesswrong.com/lw/xy/the_fun_theory_sequence/ The Fun Theory Sequence]==

A concrete theory of transhuman values. How much fun is there in the universe; will we ever run out of fun; are we having fun yet; could we be having more fun? Part of the [[complexity of value]] thesis. Also forms part of the fully general answer to religious theodicy.

==[http://lesswrong.com/lw/cz/the_craft_and_the_community/ The Craft and the Community]==

The final sequence of Eliezer Yudkowsky's two-year-long string of daily posts to Less Wrong, on improving the art of rationality and building communities of rationalists.

=Minor Sequences=
  
 
Smaller collections of posts. Usually parts of major sequences which depend on some-but-not-all of the points introduced.

==[[Map and Territory (sequence) | Map and Territory]]==

A collection of introductory posts dealing with the fundamentals of rationality: the difference between the map and the territory, Bayes's Theorem and the nature of evidence, why anyone should care about truth, minds as reflective cognitive engines...

==[[Seeing with Fresh Eyes]]==

Some notes on the incredibly difficult feat of actually getting your brain to think about something (a key step in [[mindchanging | actually changing your mind]]). Whenever someone exhorts you to "think outside the box", they usually, for your convenience, point out exactly where "outside the box" is located. Isn't it funny how nonconformists all dress the same...

Subsequence of [[How to Actually Change Your Mind]].

==[[Politics is the Mind-Killer]]==

Some of the various ways that politics damages our sanity - including, of course, making it harder to [[mindchanging | change our minds]] on political issues.

Subsequence of [[How to Actually Change Your Mind]].

==[[Death Spirals and the Cult Attractor]]==

Affective death spirals are positive feedback loops caused by the halo effect: Positive characteristics perceptually correlate, so the more nice things we say about X, the more additional nice things we're likely to believe about X.

Cultishness is an empirical attractor in human groups: roughly an affective death spiral; plus peer pressure and outcasting behavior; plus (often) defensiveness around something believed to be un-improvable.

Yet another subsequence of [[How to Actually Change Your Mind]].

==[[Joy in the Merely Real]]==

If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality.

Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.

Subsequence of [[Reductionism (sequence) | Reductionism]].

==[[Zombies (sequence) | Zombies Sequence]]==

On the putative "possibility" of beings who are just like us in every sense, but not conscious - that is, lacking inner subjective experience.

Subsequence of [[Reductionism (sequence) | Reductionism]].

==[[Evolution | The Simple Math of Evolution]]==

Learning the very basic math of evolutionary biology costs relatively little if you understand algebra, but gives you a surprisingly different perspective from what you'll find in strictly nonmathematical texts.

==[[Challenging the Difficult]]==

How to do things that are difficult or "impossible".

==[[Yudkowsky's coming of age | Coming of Age]]==

How Yudkowsky made epic errors of reasoning as a teenage "rationalist" and recovered from them starting at around age 23, the period that he refers to as his Bayesian Enlightenment.
  
 
=Other sequences by Eliezer Yudkowsky=

The following collections of essays come from the '''[[original sequences]]''', an earlier version of much of the material from ''Rationality: From AI to Zombies'':

* [[Ethical Injunctions]]: A discussion of prohibitions you may want to follow even when you've thought of a clever reason to think they don't apply.
* [[Metaethics sequence|The Metaethics Sequence]]: A longer version of "Value Theory", discussing the apparent "arbitrariness" of human morality.
* [[The Fun Theory Sequence]]: A discussion of the complexity of human value, and what the universe might look like if everything were much, much better. Fun theory is the optimistic, far-future-oriented part of value theory, asking: How much fun is there in the universe; will we ever run out of fun; are we having fun yet; could we be having more fun?
* [[The Quantum Physics Sequence]]: A much longer version of "Quantum Physics and Many Worlds", delving more into the implications of physics for our concepts of personal identity and time.

Other collections from the same time period (2006-2009) include:

* '''[[The Hanson-Yudkowsky AI-Foom Debate]]''': A blog conversation between Eliezer Yudkowsky and Robin Hanson on the topic of [[intelligence explosion]] and how concerned we should be about superintelligent AI.
* '''[[Free will (solution)|Free Will]]''': Yudkowsky's answer to a challenge he raises in ''Rationality: From AI to Zombies'' to come up with an explanation for the human ''feeling'' that we have free will.

Yudkowsky has also written a more recent sequence:

* '''[[Highly Advanced Epistemology 101 for Beginners]]'''. These essays include a discussion of truth, formal logic, causality, and metaethics, and are a good way for more ambitious readers to quickly get up to speed.

=Sequences by others=

Sequences of essays by '''[[Yvain|Scott Alexander]]''' include:

* [[Positivism, Self Deception, and Neuroscience (sequence)|Positivism, Self Deception, and Neuroscience]]
* [[Priming and Implicit Association (sequence)|Priming and Implicit Association]]. Priming may be described as the capability of any random stimulus to commandeer your thinking and judgement for the next several minutes. Scared? Don't be. There are ways to defend yourself against these kinds of intrusions, and even methods to harness them into useful testing mechanisms.
* [[The Blue-Minimizing Robot (sequence)|The Blue-Minimizing Robot]]
* [http://lesswrong.com/lw/dbe/introduction_to_game_theory_sequence_guide Introduction to Game Theory]

Sequences by '''[[Lukeprog|Luke Muehlhauser]]''':

* [[The Science of Winning at Life]]. This sequence summarizes scientifically-backed advice for "winning" at everyday life: in one's productivity, in one's relationships, in one's emotions, etc. Each post concludes with footnotes and a long list of references from the academic literature.
* [[Rationality and Philosophy]]. This sequence explains how intuitions are used in mainstream philosophy and what the science of intuitions suggests about how intuitions ''should'' be used in philosophy.
* [[No-Nonsense Metaethics]]. This sequence explains and defends a naturalistic approach to metaethics.

By '''[[AnnaSalamon|Anna Salamon]]''':

* [[Decision Theory of Newcomblike Problems (sequence) | Decision Theory of Newcomblike Problems]]. Decisions need to be modeled with some structure in order to be scrutinized and systematically improved; simply "intuiting" the answers to decision problems by ad-hoc methods is not conducive to thorough analysis. For this, we formulate decision theories. This sequence, themed around an analysis of Newcomb's problem, is a consolidated summary and context for the many decision theory discussions found on Less Wrong at the time of writing.

By '''[[Alicorn]]''':

* [http://lesswrong.com/lw/1xh/living_luminously/ Living Luminously]. [[Luminosity]], as used here, is self-awareness. A luminous mental state is one that you have and know that you have. It could be an [[emotion]], a [[belief]] or [[alief]], a disposition, a [[qualia|quale]], a memory - anything that might happen or be stored in your brain. What's going on in your head?

And by '''[[Kaj Sotala]]''':

* [http://lesswrong.com/tag/whatintelligencetestsmiss What Intelligence Tests Miss]. A sequence summarizing the content of Keith Stanovich's book ''What Intelligence Tests Miss''.
* [http://lesswrong.com/tag/whyeveryonehypocrite Why Everyone (Else) Is a Hypocrite]. An unfinished sequence summarizing the content of Robert Kurzban's book ''Why Everyone (Else) is a Hypocrite: Evolution and the Modular Mind''.

=Other resources=

[http://lesswrong.com/user/Benito/ Benito's Guide] aims to systematically fill the reader in on the most important ideas discussed on Less Wrong (not just in the sequences). It begins with a series of videos, which make for a friendly introduction and are useful if you enjoy talks and interviews.

''Thinking and Deciding'' by Jonathan Baron and ''Good and Real'' by Gary Drescher have been mentioned as books that overlap significantly with the sequences. [http://lesswrong.com/r/all/lw/eik/eliezers_sequences_and_mainstream_academia/ More about how the sequences fit in with work done by others].

==Audio==

[http://castify.co/ Castify] makes some Less Wrong content [http://castify.co/channels available as podcasts] for a small fee (the posts are read by a professional voice actor). Currently they offer:

Promoted posts:
* [http://castify.co/channels/51-less-wrong A podcast of the promoted posts from Less Wrong]

Major sequences:
* [http://castify.co/channels/1-less-wrong-mysterious-answers-to-mysterious-questions Mysterious Answers to Mysterious Questions] (about 2h 30m)
* [http://castify.co/channels/16-less-wrong-a-human-s-guide-to-words A Human's Guide to Words] (3h 40m)
* [http://castify.co/channels/46-how-to-actually-change-your-mind How to Actually Change Your Mind] (8h 20m)
* [http://castify.co/channels/43-reductionism Reductionism] (5h 20m)
* [http://castify.co/channels/50-metaethics Metaethics] (7h 30m)

Minor sequences:
* [http://castify.co/channels/4-less-wrong-map-and-territory Map and Territory] (1h 30m)
* [http://castify.co/channels/2-less-wrong-ethical-injunctions Ethical Injunctions] (1h 30m)

Essay:
* [http://castify.co/channels/3-less-wrong-the-simple-truth The Simple Truth], Eliezer's introductory essay (40 minutes; also included in [http://castify.co/channels/4-less-wrong-map-and-territory Map and Territory])

==Translations==

{{main|Translations into other languages}}

* [http://rationalite.wordpress.com French]
* [http://xrazionalita.wordpress.com Italian]
* [http://xracionalidad.wordpress.com Spanish]
* [http://lesswrong.ru Russian]
* [https://sites.google.com/site/makananuntukpikiran/sequences Bahasa Indonesia]
* [http://bur.sk/sk/lesswrong Slovak]

[[Category:Meta]]
[[Category:Sequences]]
[[Category:Introductory articles]]
 
