Pascal's mugging

'''Pascal's mugging''' refers to a [[thought experiment]] in decision theory, a finite analogue of [[Wikipedia:Pascal's wager|Pascal's wager]]. The situation is dramatized by a mugger:
  
 
{{Quote|
Now suppose someone comes to me and says, "Give me five dollars, or I'll use my magic powers from outside the Matrix to run a Turing machine that simulates and kills 3^^^^3 people."
|[http://lesswrong.com/lw/kd/pascals_mugging_tiny_probabilities_of_vast/ Pascal's Mugging: Tiny Probabilities of Vast Utilities]}}
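The "3^^^^3" in the quote is Knuth's up-arrow notation: one arrow is exponentiation, and each extra arrow iterates the operator below it. A minimal sketch of the recursion (actually evaluating 3^^^^3 is hopeless; the small cases below only hint at how fast it grows):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow operator a ^^...^ b with n arrows.
    One arrow is plain exponentiation; each additional arrow
    iterates the operator with one fewer arrow."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^(3^3) = 7625597484987
print(up_arrow(2, 4, 2))  # 2^^^^2 = 4, but 3^^^^3 is far beyond computation
```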
If an agent's utilities over outcomes can potentially grow much faster than the probability of those outcomes diminishes, then it will be dominated by tiny probabilities of hugely important outcomes. As pointed out by Peter de Blanc, its expected utilities may not even converge. The prior over computable universes in [[Solomonoff induction]] seems to have this problem in particular - more generally, if prior probability goes as simplicity of physical law, then small increases in complexity can correspond to enormous increases in the size of even a finite universe.

Intuitively, one is not inclined to acquiesce to the mugger's demands - or even pay all that much attention one way or another - but what kind of prior does this imply?
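The convergence problem can be illustrated with a toy simplicity prior. The prior 2^-n and both utility functions below are illustrative assumptions, not Solomonoff's actual measure: when utility grows faster than the prior shrinks, the partial expected-utility sums grow without bound, whereas a slowly growing utility yields a convergent sum.

```python
def partial_expected_utility(prior, utility, n_terms):
    """Sum of prior(n) * utility(n) over the first n_terms hypotheses."""
    return sum(prior(n) * utility(n) for n in range(1, n_terms + 1))

# Toy simplicity prior: probability halves with each extra unit of complexity.
prior = lambda n: 2.0 ** -n

# Utility outrunning the prior: each term is (2^-n) * 4^n = 2^n,
# so the partial sums diverge as more hypotheses are included.
print(partial_expected_utility(prior, lambda n: 4.0 ** n, 30))  # astronomically large

# Utility growing slowly: each term is n * 2^-n, and the sum converges to 2.
print(partial_expected_utility(prior, lambda n: n, 30))  # close to 2.0
```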
  
 
==Blog posts==

==See also==

==References==
*{{Cite journal
|title=Convergence of Expected Utilities with Algorithmic Probability Distributions
|author=Peter de Blanc
|year=2007
|url=http://arxiv.org/abs/0712.4318
}}
  
 
*{{Cite journal
|title=Pascal's Mugging
|author=Nick Bostrom
|year=2009
|journal=Analysis
|volume=69
|issue=3
|pages=443-445
}}
Revision as of 02:56, 4 November 2009
