Cryonics is the practice of preserving people in liquid nitrogen after they stop breathing, in the hope that more advanced future technology will be able to repair the brain or extract the person; the rationale is that the brain does not vanish in a puff of smoke at the moment of death. (The "future technology" in question is generally specified as molecular nanotechnology, since cooling to liquid nitrogen temperatures, even in the presence of cryoprotectant, may denature some proteins.)
Cryonics-associated issues commonly raised on Less Wrong
- Advanced reductionism/physicalism (because of the issues associated with identifying a person with continuity of brain information).
- Whether an extended healthy lifespan is worthwhile (relates to Fun Theory, religious rationalizations for 70-year lifespans, "sour grapes" rationalizations for why death is actually a good thing).
- The "shut up and multiply" aspect of spending roughly $300/year (Eliezer Yudkowsky quotes his costs as Cryonics Institute membership at $125/year plus term life insurance at $180/year) for a probability (its size widely disputed) of obtaining many more years of lifespan. For this reason, cryonics advocates regard *not* signing up as an extreme case of failure at rationality - a low-hanging fruit by which millions of deaths per year could be prevented at low cost.
- Cognitive biases contributing to emotional prejudice in favor of cryonics (optimistic bias, motivated cognition).
- The multiply chained nature of the probabilities involved in cryonics, and whether the final expected utility is worth the cost.
- [We Agree: Get Froze] by Robin Hanson. "My co-blogger Eliezer and I may disagree on AI fooms, but we agree on something quite contrarian and, we think, huge: More likely than not, most folks who die today didn't have to die! ... It seems far more people read this blog daily than have ever signed up for cryonics. While it is hard to justify most medical procedures using standard health economics calculations, such calculations say that at today's prices cryonics seems a good deal even if you think there's only a 5% chance it'll work."
- [You Only Live Twice] by Eliezer Yudkowsky. "My co-blogger Robin and I may disagree on how fast an AI can improve itself, but we agree on an issue that seems much simpler to us than that: At the point where the current legal and medical system gives up on a patient, they aren't really dead."
- [Quantum Mechanics and Personal Identity] by Eliezer Yudkowsky. A shortened index into the [Quantum Physics Sequence], covering only the prerequisite knowledge needed to understand the statement that "science can rule out a notion of personal identity that depends on your being composed of the same atoms - because modern physics has taken the concept of 'same atom' and thrown it out the window. There *are* no little billiard balls with individual identities. It's experimentally ruled out." The key post in this sequence is [Timeless Identity] (http://www.overcomingbias.com/2008/06/timeless-identi.html), in which, "Having used physics to completely trash all naive theories of identity, we reassemble a conception of persons and experiences from what is left" - but this finale may make little sense without the prior discussion.
- [Break Cryonics Down] by Robin Hanson - tries to identify some of the chained probabilities involved in signing up for cryonics.
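The "shut up and multiply" and chained-probability points above can be made concrete with a back-of-the-envelope expected-value calculation. The ~$300/year cost and the 5% success probability come from the text above; everything else (years of paying dues, extra years of life gained, the dollar value of a life-year, and the particular factorization of the 5% into sub-probabilities) is an illustrative assumption, not a claim from any of the cited posts:

```python
# Back-of-the-envelope "shut up and multiply" sketch for cryonics.
# $300/year and the 5% success probability are taken from the text;
# all other numbers are illustrative assumptions.

annual_cost = 300        # $/year (CI membership + term life insurance)
years_paying = 50        # assumed number of years paying dues
p_success = 0.05         # Hanson's conservative 5% chance it works

# The chained nature of the probability: the 5% can be viewed as a
# product of several conditions holding at once, e.g. good preservation,
# the organization surviving, revival technology being developed, and
# society choosing to revive you. One possible (assumed) breakdown:
p_chained = 0.50 * 0.40 * 0.50 * 0.50    # = 0.05

extra_years = 1000       # assumed extra lifespan if revival works
value_per_year = 50_000  # assumed $ value of one healthy life-year

total_cost = annual_cost * years_paying
expected_value = p_success * extra_years * value_per_year

print(f"total cost:     ${total_cost:,}")        # $15,000
print(f"expected value: ${expected_value:,.0f}")  # $2,500,000
```

Even under these made-up numbers, the point of the multiplication is visible: a modest, certain cost is weighed against a small probability of a very large payoff, and the disagreement is almost entirely about the probabilities being multiplied, not about the arithmetic.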