From Lesswrongwiki
Revision as of 09:17, 29 September 2009 by Eliezer Yudkowsky (talk | contribs) (rewrote article to distinguish "prior probability" and introduce controversy surrounding priors)

A Bayesian uses Bayes's Theorem to update beliefs based on the evidence. This requires that, even in advance of seeing the evidence, you have beliefs about what the evidence means: how likely you are to see the evidence if various hypotheses are true, and how likely those hypotheses were in advance of seeing the evidence. To calculate a posterior probability using Bayes's Theorem, you need a prior probability and a likelihood distribution.
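The update above can be sketched in a few lines. This is a minimal illustration with made-up numbers, not anything from the article itself: a hypothetical hypothesis H, a prior P(H), and likelihoods for the evidence E under H and under not-H.

```python
# Bayes's Theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H) * P(H) + P(E|~H) * P(~H).

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Return P(H|E) given a prior for H and the likelihoods of E."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# With even prior odds and evidence twice as likely under H,
# the posterior is 2/3:
print(posterior(0.5, 0.8, 0.4))  # 0.666...
```

Note that the same likelihoods give a different posterior under a different prior, which is the point of the examples that follow.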

Suppose you had a barrel containing some number of red and white balls. If you start with the belief that each ball was independently assigned red color (vs. white color) with some fixed probability between 0 and 1, and you start out ignorant of this fixed probability (the parameter could be anywhere between 0 and 1), then each red ball you see makes it more likely that the next ball will be red. (By Laplace's Rule of Succession.)

On the other hand, if you start out with the prior belief that the barrel contains exactly 10 red balls and 10 white balls, then each red ball you see makes it less likely that the next ball will be red (because there are fewer red balls remaining).
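The fixed-contents prior can be sketched the same way. Drawing without replacement from a barrel known to hold exactly 10 red and 10 white balls, each red ball seen lowers the probability that the next one is red:

```python
# Anti-inductive prior: the barrel is known to contain exactly
# 10 red and 10 white balls, drawn without replacement.

def p_next_red(reds_drawn, whites_drawn, reds=10, whites=10):
    remaining = (reds - reds_drawn) + (whites - whites_drawn)
    return (reds - reds_drawn) / remaining

# Each red ball observed lowers the estimate for the next draw:
print(p_next_red(0, 0))  # 10/20 = 0.5
print(p_next_red(1, 0))  # 9/19, about 0.474
print(p_next_red(2, 0))  # 8/18, about 0.444
```

The same observations (red balls) push the two priors' predictions in opposite directions, which is exactly the contrast the next paragraph draws.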

Thus our prior can affect how we interpret the evidence. The first prior is an inductive prior; things that happened before are predicted to happen again with greater probability. The second prior is anti-inductive; the more red balls we see, the fewer we expect to see in the future.

In both cases, you started out believing something about the barrel - presumably because someone else told you, or because you saw it with your own eyes. But their words, or even your own eyesight, constituted evidence, and you must have had prior beliefs about probabilities and likelihoods in order to interpret that evidence. So it seems that an ideal Bayesian would need some sort of inductive prior at the very moment they were born; and where an ideal Bayesian would get this prior has occasionally been a matter of considerable controversy in the philosophy of probability.

The phrase "prior probability" (as opposed to "priors") usually refers to a point estimate already based on considerable evidence - for example, when we estimate the fraction of women who start out with breast cancer at age 40, in advance of performing any mammographies.
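As a sketch of how such a prior probability is used, here is a standard diagnostic calculation. All three figures below are illustrative assumptions for the example, not real medical statistics:

```python
# Illustrative numbers only (assumptions, not actual medical data):
# suppose 1% of women at age 40 have breast cancer (the prior
# probability), the test detects 80% of cancers, and it gives a
# false positive 9.6% of the time.

prior = 0.01
p_pos_given_cancer = 0.80
p_pos_given_healthy = 0.096

# P(positive) by the law of total probability:
p_pos = p_pos_given_cancer * prior + p_pos_given_healthy * (1 - prior)

# Bayes's Theorem: P(cancer | positive)
p_cancer_given_pos = p_pos_given_cancer * prior / p_pos
print(round(p_cancer_given_pos, 3))  # 0.078
```

Even a fairly accurate test yields a low posterior here, because the prior probability is small - which is why the point estimate matters.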
