Moral uncertainty

Moral uncertainty (or normative uncertainty) is uncertainty about what you should do — not merely uncertainty about external facts, like what consequences will follow a given course of action, but uncertainty about the moral implications of those facts. You might know you're in a position to save three strangers at the cost of your own life, and still not know what to do, because you're not sure whether to apply utilitarian or egoist morality.
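
To make the example concrete, here is a minimal Python sketch; the numeric utilities assigned to each theory are illustrative assumptions, not canonical statements of utilitarianism or egoism. The point is that even with all the external facts fixed, the two theories disagree about what to do.

    # Two moral theories evaluate the same choice: sacrifice your life
    # to save three strangers, or abstain. All numbers are illustrative.
    utilitarian = {"sacrifice": 3, "abstain": 1}  # counts all lives saved, impartially
    egoist = {"sacrifice": 0, "abstain": 1}       # counts only your own survival

    for name, theory in [("utilitarian", utilitarian), ("egoist", egoist)]:
        verdict = max(theory, key=theory.get)
        print(f"{name} recommends: {verdict}")
    # utilitarian recommends: sacrifice
    # egoist recommends: abstain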

Expected utility is well-established as the right way of dealing with ordinary uncertainty. However, it is not straightforward to apply expected utility to moral uncertainty. Many moral theories don't assign utilities at all. Those that do face the problem that multiplying all of a theory's utilities by a positive constant leaves its preferred outcomes and actions unchanged, so the theory itself fixes no scale. This raises the question of what constant to pick in each case: in other words, how to calibrate the value of a util across theories.
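
A short Python sketch, with assumed credences and utilities, shows why this scaling freedom matters: rescaling one theory's utilities by a positive constant leaves that theory's own ranking intact, yet flips what a naive expected-value calculation recommends.

    # Naive intertheoretic expectation: weight each theory's utilities
    # by your credence in that theory. All numbers are illustrative.
    credences = {"utilitarian": 0.25, "egoist": 0.75}
    utilities = {
        "utilitarian": {"sacrifice": 3.0, "abstain": 1.0},
        "egoist": {"sacrifice": 0.0, "abstain": 1.0},
    }

    def recommend(utilities):
        def ev(action):
            return sum(credences[t] * utilities[t][action] for t in credences)
        return max(("sacrifice", "abstain"), key=ev)

    print(recommend(utilities))  # abstain

    # Multiply the utilitarian scale by 10. Utilitarianism still ranks
    # sacrifice above abstain, but the aggregate recommendation flips,
    # so the naive rule is ill-defined until a calibration is chosen.
    utilities["utilitarian"] = {a: 10 * u for a, u in utilities["utilitarian"].items()}
    print(recommend(utilities))  # sacrifice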

Another approach is to follow only the most probable theory. This has its own problems. For example, what if the most probable theory points only weakly one way, while other theories point strongly the other way?
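
A toy Python example, with all numbers assumed for illustration, exhibits exactly this failure mode: the most probable theory barely prefers A, a less probable theory overwhelmingly prefers B, and the two decision rules disagree.

    # "Follow the most probable theory": act on your favorite theory alone.
    credences = {"theory_1": 0.6, "theory_2": 0.4}
    utilities = {
        "theory_1": {"A": 1.01, "B": 1.00},  # barely prefers A
        "theory_2": {"A": 0.0, "B": 100.0},  # strongly prefers B
    }

    favorite = max(credences, key=credences.get)
    print(max(utilities[favorite], key=utilities[favorite].get))  # A

    # A stake-sensitive rule (here, the naive expectation from the sketch
    # above, taking each theory's scale at face value despite the
    # calibration problem) picks B instead.
    ev = {a: sum(credences[t] * utilities[t][a] for t in credences)
          for a in ("A", "B")}
    print(max(ev, key=ev.get))  # B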

Nick Bostrom and Toby Ord have proposed a parliamentary model. In this model, each theory sends a number of delegates to a parliament in proportion to its probability. The theories then bargain for support as if the probability of each action were proportional to its votes. However, the actual output is always the action with the most votes. Bostrom and Ord's proposal lets probable theories determine most actions, but still gives less probable theories influence on issues they consider unusually important.
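Bostrom and Ord describe the model informally rather than as a precise algorithm, but a rough Python sketch, with assumed seat counts, preferences, and a single hand-coded vote trade standing in for the bargaining step, shows how a low-probability theory can win the one issue it considers crucial.

    # Delegates in proportion to credence: 100 seats across three theories.
    delegates = {"T1": 45, "T2": 35, "T3": 20}

    # Each theory's preferred option on each issue. T3 is improbable but
    # considers issue2 unusually important (assumed to matter ~50x more
    # to it than issue1 does).
    prefs = {
        "T1": {"issue1": "X", "issue2": "P"},
        "T2": {"issue1": "Y", "issue2": "P"},
        "T3": {"issue1": "Y", "issue2": "Q"},
    }

    def outcome(issue, vote_of):
        """Plurality: the option with the most delegate votes wins."""
        totals = {}
        for theory, seats in delegates.items():
            option = vote_of[theory][issue]
            totals[option] = totals.get(option, 0) + seats
        return max(totals, key=totals.get)

    # Sincere voting: T3 loses issue2, the issue it cares most about.
    sincere = {t: dict(p) for t, p in prefs.items()}
    print({i: outcome(i, sincere) for i in ("issue1", "issue2")})
    # {'issue1': 'Y', 'issue2': 'P'}

    # One bargain: T3 trades its issue1 votes to T1 for T1's issue2
    # votes, since issue2 matters far more to T3 than issue1 does.
    traded = {t: dict(p) for t, p in prefs.items()}
    traded["T3"]["issue1"] = "X"
    traded["T1"]["issue2"] = "Q"
    print({i: outcome(i, traded) for i in ("issue1", "issue2")})
    # {'issue1': 'X', 'issue2': 'Q'}; T3 wins the issue it finds crucial.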
