Counterfactual mugging

From Lesswrongwiki
 
Counterfactual mugging is a thought experiment for testing and differentiating decision theories, stated as follows:

[[Omega]] appears and says that it has just tossed a fair coin, and given that the coin came up tails, it decided to ask you to give it $100. Whatever you do in this situation, nothing else will happen differently in reality as a result. Naturally you don't want to give up your $100. But Omega also tells you that if the coin came up heads instead of tails, it'd give you $10000, but only if you'd agree to give it $100 if the coin came up tails.
 
Depending on how the problem is phrased, intuition calls for different answers. For example, [[Eliezer Yudkowsky]] has argued that framing the problem so that Omega is a regular aspect of the environment, which routinely asks such questions, makes most people answer 'Yes'.
  
Formal decision theories also diverge. In [[Causal Decision Theory]], you can only affect outcomes you are causally linked to, so the answer should be 'No'. In [[Evidential Decision Theory]], any kind of evidential connection is taken into account, but since the coin has already come up tails, paying provides no favorable evidence, so the answer should also be 'No'. The [[Timeless Decision Theory]] answer seems undefined; however, Yudkowsky has argued that if the problem is presented repeatedly, one should answer 'Yes' on the basis of increasing one's probability of gaining $10000 in the next round. This seems to be Causal Decision Theory's prescription as well. [[Updateless decision theory]][http://lesswrong.com/lw/15m/towards_a_new_decision_theory/] prescribes giving the $100, on the basis that your decision can influence both the 'heads branch' and the 'tails branch' of the universe.
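The arithmetic behind the updateless answer can be made concrete. The sketch below (function name and structure are illustrative, not from any source) assumes a fair coin, a perfectly reliable Omega, and the $10000/$100 payoffs from the problem statement, and computes the expected value of each policy as evaluated before the coin toss:

```python
def expected_value(pays_when_tails: bool,
                   heads_prize: float = 10_000,
                   tails_cost: float = 100,
                   p_heads: float = 0.5) -> float:
    """Expected payoff of committing to a policy before the coin is tossed."""
    # Omega pays the prize on heads only if you would have paid on tails.
    heads_payoff = heads_prize if pays_when_tails else 0
    tails_payoff = -tails_cost if pays_when_tails else 0
    return p_heads * heads_payoff + (1 - p_heads) * tails_payoff

print(expected_value(pays_when_tails=True))   # 0.5*10000 + 0.5*(-100) = 4950.0
print(expected_value(pays_when_tails=False))  # 0.0
```

This is why paying looks attractive ex ante (4950 > 0) even though, once the coin has already come up tails, paying simply loses $100 ex post — which is the tension the thought experiment is designed to expose.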
  
 
==Blog posts==
 
 
 
*[http://lesswrong.com/lw/3l/counterfactual_mugging/ Counterfactual Mugging] by [[Vladimir Nesov]]
 
 
*[http://lesswrong.com/lw/135/timeless_decision_theory_problems_i_cant_solve/ Timeless Decision Theory: Problems I Can't Solve] by [[Eliezer Yudkowsky]]
 
  
 
==External links==
 
 
 
*[http://ordinaryideas.wordpress.com/2011/12/31/counterfactual-blackmail-of-oneself/ Counterfactual Blackmail (of oneself)] by [http://lesswrong.com/user/paulfchristiano Paul F. Christiano]
 
  
 
==See also==
 
 
 
*[[Decision theory]]
 
 
*[[Newcomb's problem]]
 

Revision as of 05:31, 16 October 2012
