# Counterfactual mugging

## Latest revision as of 19:18, 23 July 2018

**Counterfactual mugging** is a thought experiment for testing and differentiating decision theories, stated as follows:

Omega appears and says that it has just tossed a fair coin, and given that the coin came up tails, it decided to ask you to give it $100. Whatever you do in this situation, nothing else will happen differently in reality as a result. Naturally you don't want to give up your $100. But Omega also tells you that if the coin came up heads instead of tails, it'd give you $10000, but only if you'd agree to give it $100 if the coin came up tails. Do you give Omega $100?

Depending on how the problem is phrased, intuition calls for different answers. For example, Eliezer Yudkowsky has argued that framing the problem so that Omega is a regular aspect of the environment, which routinely asks such questions, makes most people answer 'Yes'.

Formal decision theories also diverge. Under Causal Decision Theory, you can only affect outcomes you are causally linked to; since the coin has already landed tails, paying can only lose you $100, so the answer should be 'No'. Evidential Decision Theory counts any evidential connection, but once you have observed tails, paying provides no evidence that you will receive the $10,000, so the answer is again 'No'. Timeless Decision Theory's answer seems undefined; however, Yudkowsky has argued that if the problem is presented repeatedly, one should answer 'Yes' on the grounds that this raises one's probability of gaining $10,000 in later rounds. This also seems to be Causal Decision Theory's prescription in the iterated case. Updateless decision theory[1] prescribes giving the $100, on the grounds that your decision determines what happens in both the 'heads branch' and the 'tails branch' of the universe.
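The divergence between the updateless and updateful evaluations can be made concrete with a small expected-value calculation. This is a minimal sketch, not part of any of the cited formalisms: the function names are illustrative, and it simply uses the fair coin and the $100/$10,000 payoffs from the thought experiment.

```python
# Expected value of the policy "pay $100 on tails" under two perspectives:
# - updateless: average over both coin branches, before conditioning on tails
# - updateful: condition on the observed tails, as CDT/EDT evaluate the act

P_HEADS = 0.5  # fair coin

def updateless_value(pays_on_tails: bool) -> float:
    """Evaluate the policy averaged over both branches of the coin flip."""
    heads_payoff = 10_000 if pays_on_tails else 0  # Omega rewards committed payers
    tails_payoff = -100 if pays_on_tails else 0
    return P_HEADS * heads_payoff + (1 - P_HEADS) * tails_payoff

def updateful_value(pays_on_tails: bool) -> float:
    """Evaluate the act after conditioning on the coin having landed tails."""
    return -100.0 if pays_on_tails else 0.0

print(updateless_value(True), updateless_value(False))  # 4950.0 0.0
print(updateful_value(True), updateful_value(False))    # -100.0 0.0
```

Before the flip (or ignoring the update), the paying policy is worth $4,950 in expectation versus $0 for refusing, which is why an updateless evaluation pays; after conditioning on tails, paying looks like a pure $100 loss, which is why an updateful evaluation refuses.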

## Blog posts

- Counterfactual Mugging by Vladimir Nesov
- Timeless Decision Theory: Problems I Can't Solve by Eliezer Yudkowsky
- Towards a New Decision Theory by Wei Dai
- The sin of updating when you can change whether you exist by Benya Fallenstein

## External links

- Counterfactual Blackmail (of oneself) by Paul F. Christiano
- Thoughts on Updatelessness by Caspar Oesterheld