# Causal Decision Theory

## Latest revision as of 05:12, 28 August 2016

**Causal Decision Theory** (CDT) is a branch of decision theory which advises an agent to take the action that maximizes the causal influence on the probability of desired outcomes ^{[1]}. Like any branch of decision theory, it prescribes taking the action that maximizes utility, i.e. the action whose utility equals or exceeds the utility of every other option. The utility of each action is measured by its expected utility: the probability-weighted sum of the utilities of each of its possible results. How actions influence these probabilities is where the branches differ. Contrary to Evidential Decision Theory (EDT), CDT focuses on the causal relations between one's actions and their outcomes, rather than on which actions provide evidence for desired outcomes. According to CDT, a rational agent should track the causal relations linking its actions to the desired outcome and take the action that best enhances the chances of that outcome.
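The "probability-weighted sum" can be sketched in a few lines of Python. The actions and numbers below are hypothetical, purely for illustration:

```python
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action's possible results."""
    return sum(p * u for p, u in outcomes)

# Two hypothetical actions; the agent picks the one with higher expected utility.
actions = {
    "A": [(0.8, 10), (0.2, -5)],   # EU = 0.8*10 + 0.2*(-5) = 7
    "B": [(0.5, 12), (0.5, 0)],    # EU = 0.5*12 + 0.5*0   = 6
}
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # A
```

CDT and EDT share this maximization step; they differ only in which probabilities they plug in for each action.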

One standard example where EDT and CDT diverge is the Smoking lesion: “Smoking is strongly correlated with lung cancer, but in the world of the Smoker's Lesion this correlation is understood to be the result of a common cause: a genetic lesion that tends to cause both smoking and cancer. Once we fix the presence or absence of the lesion, there is no additional correlation between smoking and cancer. Suppose you prefer smoking without cancer to not smoking without cancer, and prefer smoking with cancer to not smoking with cancer. Should you smoke?” CDT recommends smoking, since there is no causal connection between smoking and cancer: both are caused by the lesion, but have no direct causal connection with each other. EDT, on the other hand, recommends against smoking, since smoking is evidence of having the lesion and should therefore be avoided.
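The divergence can be made concrete with a small numerical model. All probabilities and utilities below are assumed for illustration; the point is only the structure: EDT updates its belief about the lesion on the action taken, while CDT keeps the prior because the action cannot cause the lesion.

```python
# Hypothetical parameters for the Smoking Lesion:
P_LESION = 0.1
P_SMOKE = {True: 0.8, False: 0.2}    # P(smoke | lesion): the lesion inclines one to smoke
P_CANCER = {True: 0.9, False: 0.01}  # P(cancer | lesion): cancer depends only on the lesion
U_SMOKE, U_CANCER = 1, -100          # smoking is mildly enjoyed; cancer is very bad

def utility(smoke, lesion):
    return (U_SMOKE if smoke else 0) + U_CANCER * P_CANCER[lesion]

def edt_eu(smoke):
    # EDT conditions on the action: choosing to smoke is evidence about the lesion.
    p_act = lambda l: P_SMOKE[l] if smoke else 1 - P_SMOKE[l]
    joint = {l: (P_LESION if l else 1 - P_LESION) * p_act(l) for l in (True, False)}
    z = sum(joint.values())
    return sum(joint[l] / z * utility(smoke, l) for l in (True, False))

def cdt_eu(smoke):
    # CDT keeps the prior over the lesion: smoking cannot cause it.
    return sum((P_LESION if l else 1 - P_LESION) * utility(smoke, l) for l in (True, False))

print(cdt_eu(True) > cdt_eu(False))  # True: CDT says smoke
print(edt_eu(True) > edt_eu(False))  # False: EDT says don't smoke
```

With these numbers CDT assigns smoking an expected utility of -8.9 versus -9.9 for abstaining, while EDT assigns roughly -27.4 versus -3.4, reproducing the two recommendations described above.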

The core aspect of CDT is mathematically represented by the fact that it uses probabilities of conditionals in place of conditional probabilities ^{[2]}. The conditional probability of B given A, written P(B|A), is the Bayesian probability that B happens given that we know A happened; it is the quantity used by EDT. The probability of a conditional, written P(A > B), is the probability that the conditional 'A implies B' is itself true, i.e. the probability of the counterfactual 'if A were the case, B would be the case'. Since counterfactual analysis is the key tool for speaking about causality, probabilities of conditionals are said to mirror causal relations. In most cases these two probabilities track each other, and CDT and EDT give the same answers. However, particular problems have arisen where their prescriptions for rational action diverge, such as the Smoking lesion problem, where CDT seems to give the more reasonable prescription, and Newcomb's problem, where CDT seems unreasonable. David Lewis proved ^{[3]} that it is impossible for probabilities of conditionals to always track conditional probabilities. Hence evidential relations are not the same as causal relations, and CDT and EDT will always diverge in some cases.
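The gap between the two quantities can be illustrated with the common-cause structure above. One standard way to formalize the causal reading of P(A > B) is as an interventional probability, computed by cutting the arrow from the common cause into the action; the numbers below are assumed for illustration:

```python
from itertools import product

# Toy common-cause model: lesion -> smoke, lesion -> cancer (hypothetical numbers).
P_L = 0.1
P_S = {True: 0.8, False: 0.2}    # P(smoke | lesion)
P_C = {True: 0.9, False: 0.01}   # P(cancer | lesion)

def joint(l, s, c):
    pl = P_L if l else 1 - P_L
    ps = P_S[l] if s else 1 - P_S[l]
    pc = P_C[l] if c else 1 - P_C[l]
    return pl * ps * pc

# Conditional probability P(cancer | smoke): condition on *observing* smoking.
num = sum(joint(l, True, True) for l in (True, False))
den = sum(joint(l, True, c) for l, c in product((True, False), repeat=2))
p_conditional = num / den

# Interventional probability P(cancer | do(smoke)): sever the lesion -> smoke arrow,
# standing in for the probability of the counterfactual "if I smoked, I would get cancer".
p_do = sum((P_L if l else 1 - P_L) * P_C[l] for l in (True, False))

print(p_conditional > p_do)  # True: observing smoking raises the probability of cancer,
                             # but making yourself smoke does not
```

Here P(cancer | smoke) is about 0.28 while P(cancer | do(smoke)) is 0.099, the unconditional cancer rate: the evidential and causal readings come apart exactly as in the Smoking lesion.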

## References

- ↑ http://plato.stanford.edu/entries/decision-causal/
- ↑ Lewis, David (1981). "Causal Decision Theory". Australasian Journal of Philosophy 59: 5–30.
- ↑ Lewis, David (1976). "Probabilities of conditionals and conditional probabilities". The Philosophical Review 85 (3): 297–315.