# Priors


## Revision as of 03:11, 21 October 2012

In the context of Bayes's Theorem, **priors** refer generically to the beliefs an agent holds regarding a fact, hypothesis, or consequence before being presented with evidence. More technically, in order for an agent to calculate a posterior probability using Bayes's Theorem, a prior probability and a likelihood distribution are needed.
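As a minimal sketch of how these pieces fit together, the update can be written as a short function. The numbers are hypothetical, chosen only for illustration:

```python
def posterior(prior_h, lik_e_given_h, lik_e_given_not_h):
    """Bayes's Theorem: P(H|E) = P(E|H) * P(H) / P(E),
    where P(E) is expanded over H and not-H."""
    p_e = lik_e_given_h * prior_h + lik_e_given_not_h * (1 - prior_h)
    return lik_e_given_h * prior_h / p_e

# A prior of 0.5, with evidence twice as likely under H as under not-H,
# yields a posterior of 2/3.
print(posterior(0.5, 0.8, 0.4))  # 0.6666...
```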

## Examples

Suppose you had a barrel containing some number of red and white balls. You start with the belief that each ball was independently assigned red color (vs. white color) at some fixed probability. Furthermore, you start out ignorant of this fixed probability (the parameter could be anywhere between 0 and 1). Each red ball you see then makes it *more* likely that the next ball will be red (following a Laplacian Rule of Succession).
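Under a uniform prior over the unknown parameter, the Rule of Succession gives the predicted probability of the next red ball as (r + 1) / (n + 2), which can be sketched as:

```python
def laplace_next_red(reds_seen, total_seen):
    """Laplace's Rule of Succession: P(next ball is red) = (r + 1) / (n + 2),
    assuming a uniform prior over the unknown red-probability parameter."""
    return (reds_seen + 1) / (total_seen + 2)

# Each observed red ball raises the predicted probability of the next red:
print(laplace_next_red(0, 0))  # 0.5
print(laplace_next_red(3, 3))  # 0.8
```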

On the other hand, if you start out with the prior belief that the barrel contains exactly 10 red balls and 10 white balls, then each red ball you see makes it *less* likely that the next ball will be red (because there are fewer red balls remaining).
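The fixed-composition prior can be sketched the same way, using the 10-and-10 numbers from the example above; here each red ball seen *lowers* the prediction:

```python
def next_red_fixed_urn(reds_seen, whites_seen, reds_total=10, whites_total=10):
    """With a prior of exactly 10 red and 10 white balls, the probability
    that the next draw is red is just the fraction of red balls remaining."""
    reds_left = reds_total - reds_seen
    balls_left = (reds_total + whites_total) - (reds_seen + whites_seen)
    return reds_left / balls_left

print(next_red_fixed_urn(0, 0))  # 0.5
print(next_red_fixed_urn(3, 0))  # 7/17, about 0.41
```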

Thus our prior can affect how we interpret the evidence. The first prior is an inductive prior - things that happened before are predicted to happen again with greater probability. The second prior is anti-inductive - the more red balls we see, the fewer we expect to see in the future.

In both cases, you started out believing something about the barrel - presumably because someone else told you, or because you saw it with your own eyes. But their words, or even your own eyesight, were themselves evidence, and you must have had prior beliefs about probabilities and likelihoods in order to interpret that evidence. So it seems that an ideal Bayesian would need some sort of inductive prior at the very moment they were born. Where an ideal Bayesian would get this prior has occasionally been a matter of considerable controversy in the philosophy of probability.

As a real-life example, consider two leaders from different political parties. Each has his own beliefs about social organization and the roles of people and government in society. These differences can be attributed to a wide range of factors, from genetic variability to the influence of education on their personalities, and they condition the politics and laws each wants to implement. However, neither can show that his beliefs are better than those of the other, unless he can show that his priors were generated by sources which track reality better^{[1]}.

## Updating prior probabilities

It's important to notice that priors represent a commitment to a certain belief. That is, as Cyan puts it, you can't *shift* your prior. What happens is that, after being presented with the evidence, you update your prior probability, and the result is a posterior probability.

Note, however, that it can make sense to talk informally about updating priors when dealing with a sequence of inferences. In such cases, the posterior probability of one inference becomes the prior for the next, so it is often convenient to describe the process that way.
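This chaining can be sketched with illustrative likelihoods of my own choosing: updating one observation at a time, feeding each posterior back in as the next prior, matches a single batch update when the observations are conditionally independent given the hypothesis.

```python
def update(prior, lik_h, lik_not_h):
    """One Bayesian update: returns the posterior P(H | evidence)."""
    return lik_h * prior / (lik_h * prior + lik_not_h * (1 - prior))

# A sequence of inferences: each step's posterior serves as the next prior.
p = 0.5
for lik_h, lik_not_h in [(0.8, 0.4), (0.9, 0.3)]:
    p = update(p, lik_h, lik_not_h)

# A single batch update on the combined evidence (likelihoods multiply
# under conditional independence) gives the same answer.
batch = update(0.5, 0.8 * 0.9, 0.4 * 0.3)
print(abs(p - batch) < 1e-12)  # True
```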

## References

- ↑ Robin Hanson (2006). "Uncommon Priors Require Origin Disputes". Theory and Decision 61 (4): 319–328. http://hanson.gmu.edu/prior.pdf

## Blog posts

- Priors as Mathematical Objects
- "Inductive Bias"
- Probability is Subjectively Objective
- Bead Jar Guesses by Alicorn - Applied scenario about forming priors.