# Do the math, then burn the math and go with your gut

## Latest revision as of 22:42, 24 May 2019

"**Do the math, then burn the math and go with your gut**"^{[1]} is a procedure for decision-making that has been described by Eliezer Yudkowsky. The basic procedure is to go through the process of assigning numbers and probabilities that are relevant to some decision ("do the math") and then to throw away this calculation and instead make the final decision with one's gut feelings ("burn the math and go with your gut"). The purpose of the first step is to force oneself to think through all the details of the decision and to spot inconsistencies.

## History

In July 2008, Eliezer Yudkowsky wrote the blog post "When (Not) To Use Probabilities", which discusses the situations under which it is a bad idea to verbally assign probabilities. Specifically, the post claims that while theoretical arguments in favor of using probabilities (such as Dutch book and coherence arguments) always apply, humans have evolved algorithms for reasoning under uncertainty that don't involve verbally assigning probabilities (such as using "gut feelings"), which in practice often perform better than actually assigning probabilities. In other words, the post argues in favor of using humans' non-verbal/built-in forms of reasoning under uncertainty even if this makes humans incoherent/subject to Dutch books, because forcing humans to articulate probabilities would actually lead to worse outcomes. The post also contains the quote "there *are* benefits from trying to translate your gut feelings of uncertainty into verbal probabilities. It may help you spot problems like the conjunction fallacy. It may help you spot internal inconsistencies – though it may not show you any way to remedy them."^{[2]}

In October 2011, LessWrong user bentarm gave an outline of the procedure in a comment in the context of the Amanda Knox case. The steps were: "(1) write down a list of all of the relevant facts on either side of the argument. (2) assign numerical weights to each of the facts, according to how much they point you in one direction or another. (3) burn the piece of paper on which you wrote down the facts, and go with your gut." This description was endorsed by Yudkowsky in a follow-up comment. bentarm's comment claims that Yudkowsky described the procedure during summer of 2011.^{[3]}
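bentarm's three-step outline can be sketched as a short program. This is a purely illustrative sketch (the facts and weights are hypothetical, not from the source); the point is that the arithmetic is done explicitly and then deliberately discarded:

```python
# Hypothetical sketch of the three-step procedure; all facts and
# weights below are illustrative, not taken from the Knox discussion.

# Step 1: write down the relevant facts on either side.
# Step 2: assign each a numerical weight (positive points toward
# option A, negative toward option B).
facts = {
    "strong fact supporting A": +2.0,
    "fact supporting B":        -1.5,
    "weak fact supporting A":   +0.5,
}

# Doing the math forces every consideration to be named and weighed.
total = sum(facts.values())
print(f"weighted total: {total:+.1f}")

# Step 3: "burn the piece of paper" -- the total is discarded rather
# than obeyed; the final decision is made by gut feeling, informed by
# having worked through the list above.
facts.clear()
```

The sum itself is never the decision rule; per the procedure, its role is to surface inconsistencies before the gut makes the call.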

In December 2016, Anna Salamon described the procedure parenthetically at the end of a blog post. Salamon described the procedure as follows: "Eliezer once described what I take to be a similar ritual for avoiding bucket errors, as follows: When deciding which apartment to rent (he said), one should first do out the math, and estimate the number of dollars each would cost, the number of minutes of commute time times the rate at which one values one's time, and so on. But at the end of the day, if the math says the wrong thing, one should do the right thing anyway."^{[4]}
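The "do out the math" step in the apartment example amounts to simple arithmetic. A minimal sketch, with all dollar figures, commute times, and the value-of-time rate chosen purely for illustration:

```python
# Illustrative arithmetic for the apartment example; every number
# here is a hypothetical assumption, not from the source.
VALUE_PER_MINUTE = 0.50    # assumed dollar value of one minute of one's time
TRIPS_PER_MONTH = 40       # assumed one-way commutes per month

apartments = {
    "close but pricey": {"rent": 1500, "commute_min": 15},
    "far but cheap":    {"rent": 1200, "commute_min": 45},
}

for name, a in apartments.items():
    # Convert commute time into dollars so both costs are comparable.
    commute_cost = a["commute_min"] * VALUE_PER_MINUTE * TRIPS_PER_MONTH
    total = a["rent"] + commute_cost
    print(f"{name}: ${a['rent']} rent + ${commute_cost:.0f} commute = ${total:.0f}/month")
```

With these made-up numbers the math favors the closer apartment, but on Yudkowsky's account the calculation is only a forcing function for thinking the decision through; if the math says the wrong thing, one does the right thing anyway.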

## See also

- CFAR Exercise Prize – Andrew Critch's Bayes game, described on this page, gives another technique for dealing with uncertainty in real-life situations

## References

- ↑ Qiaochu Yuan. "Qiaochu_Yuan comments on A Sketch of Good Communication". March 31, 2018. *LessWrong*.
- ↑ Eliezer Yudkowsky. "When (Not) To Use Probabilities". July 23, 2008. *LessWrong*.
- ↑ bentarm. "bentarm comments on Amanda Knox: post mortem". October 21, 2011. *LessWrong*.
- ↑ Anna Salamon. "'Flinching away from truth' is often about *protecting* the epistemology". December 20, 2016. *LessWrong*.

## External links

- A Facebook post by Julia Galef from May 2018 inquiring about this procedure
- "Why we can’t take expected value estimates literally (even when they’re unbiased)" (August 2011) by GiveWell co-founder Holden Karnofsky makes a similar point: "It’s my view that my brain instinctively processes huge amounts of information, coming from many different reference classes, and arrives at a prior; if I attempt to formalize my prior, counting only what I can name and justify, I can worsen the accuracy a lot relative to going with my gut. Of course there is a problem here: going with one’s gut can be an excuse for going with what one wants to believe, and a lot of what enters into my gut belief could be irrelevant to proper Bayesian analysis. There is an appeal to formulas, which is that they seem to be susceptible to outsiders’ checking them for fairness and consistency."
- "The Optimizer’s Curse & Wrong-Way Reductions" by Christian Smith discusses similar issues
- Verbal overshadowing page on Wikipedia