Suppose that a disease, or a monster, or a war, or something, is killing people. And suppose you only have enough resources to implement one of the following two options:
- Save 400 lives, with certainty.
- Save 500 lives, with 90% probability; save no lives, with 10% probability.
Most people choose option 1. Which, I think, is foolish, because if you multiply 500 lives by a 90% probability, you get an expected value of 450 lives, which exceeds the certain 400 lives of option 1. (Lives saved don't diminish in marginal utility, so this is an appropriate calculation.)
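The expected-value arithmetic above can be written out directly. A minimal sketch; the variable names are mine, the numbers are from the text:

```python
# Expected number of lives saved under each option.
# Option 1: save 400 lives with certainty.
# Option 2: save 500 lives with probability 0.9, save none with probability 0.1.
ev_option_1 = 400 * 1.0
ev_option_2 = 500 * 0.9 + 0 * 0.1

print(ev_option_1)  # 400.0
print(ev_option_2)  # 450.0
```

The gamble comes out 50 lives ahead in expectation, despite the 10% chance of saving no one.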
"What!" you cry, incensed. "How can you gamble with human lives? How can you think about numbers when so much is at stake? What if that 10% probability strikes, and everyone dies? So much for your damned logic! You're following your rationality off a cliff!"
Ah, but here's the interesting thing. If you present the options this way:
- 100 people die, with certainty.
- 90% chance no one dies; 10% chance 500 people die.
Then a majority choose option 2. Even though it's the same gamble. You see, just as a certainty of saving 400 lives seems to feel so much more comfortable than an unsure gain, so too, a certain loss feels worse than an uncertain one.
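That the two presentations describe one and the same gamble can be checked mechanically. A minimal sketch, with one labeled assumption: "save 400 of 500" in the first framing means exactly 100 people die.

```python
# Each option rewritten as a probability distribution over the number of
# deaths among the same 500 people.
def expected_deaths(gamble):
    return sum(deaths * p for deaths, p in gamble.items())

saved_framing = {
    "option 1": {100: 1.0},           # save 400 of 500 -> 100 die, certainly
    "option 2": {0: 0.9, 500: 0.1},   # save all 500, or save no one
}
death_framing = {
    "option 1": {100: 1.0},           # 100 die, with certainty
    "option 2": {0: 0.9, 500: 0.1},   # 90% no one dies; 10% all 500 die
}

print(saved_framing == death_framing)              # True: identical gambles
print(expected_deaths(death_framing["option 1"]))  # 100.0
print(expected_deaths(death_framing["option 2"]))  # 50.0
```

The distributions are literally equal; only the wording changed, yet the wording flips the majority preference.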
You can grandstand on the second description too: "How can you condemn 100 people to certain death when there's such a good chance you can save them? We'll all share the risk! Even if it was only a 75% chance of saving everyone, it would still be worth it---so long as there's a chance---everyone makes it, or no one does!"
You know what? This isn't about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain's feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn't even a feather in the scales, when a life is at stake. Just shut up and multiply.
A googol is 10^100---a 1 followed by one hundred zeroes. A googolplex is an even more incomprehensibly large number---it's 10^googol, a 1 followed by a googol zeroes. Now pick some trivial inconvenience, like a hiccup, and some decidedly untrivial misfortune, like getting slowly torn limb from limb by sadistic mutant sharks. If we're forced into a choice between either preventing a googolplex people's hiccups, or preventing a single person's shark attack, which choice should we make? If you assign any negative value to hiccups, then, on pain of decision-theoretic incoherence, there must be some number of hiccups that would add up to rival the negative value of a shark attack. For any particular finite evil, there must be some number of hiccups that would be even worse.
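The coherence argument above is just linear additivity of disutility. A minimal sketch, with illustrative made-up units (the particular numbers are assumptions, not measured quantities):

```python
# Illustrative disutility units: one hiccup = 1 unit, one shark attack = 10^15
# units. The point is only that for ANY positive finite values, some finite
# number of hiccups strictly exceeds the shark attack in summed disutility.
HICCUP = 1
SHARK_ATTACK = 10**15

def hiccups_to_outweigh(hiccup, shark):
    """Smallest integer n with n * hiccup > shark, for positive integers."""
    return shark // hiccup + 1

n = hiccups_to_outweigh(HICCUP, SHARK_ATTACK)
print(n)  # 1000000000000001
# A googolplex (10**(10**100)) dwarfs any such threshold, so a googolplex of
# hiccups comes out worse than one shark attack under these assumptions.
```

Making the shark attack a billion times worse only multiplies the threshold by a billion; it never pushes it to infinity, which is the decision-theoretic point.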
Moral dilemmas like these aren't conceptual blood sports for keeping analytic philosophers entertained at dinner parties. They're distilled versions of the kinds of situations we actually find ourselves in every day. Should I spend $50 on a console game, or give it all to charity? Should I organize a $700,000 fundraiser to pay for a single bone marrow transplant, or should I use that same money on mosquito nets and prevent the malaria deaths of some 200 children?
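The mosquito-net comparison rests on an implied cost per death averted. A minimal sketch making that figure explicit; the ~$3,500 number is back-computed from the text's $700,000 and 200 children, not a quote from any charity evaluator:

```python
# One $700,000 intervention vs. many cheap ones, in deaths averted.
transplant_cost = 700_000                # one bone marrow transplant
cost_per_malaria_death_averted = 3_500   # assumed: 700,000 / 200

deaths_averted_by_nets = transplant_cost // cost_per_malaria_death_averted
print(deaths_averted_by_nets)  # 200
```

The same dollars save one life one way and some 200 lives the other, which is the tradeoff the next paragraph says people refuse to look at.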
Yet there are many who avert their gaze from the real world's abundance of unpleasant moral tradeoffs--many, too, who take pride in looking away. Research shows that people distinguish "sacred values," like human lives, from "unsacred values," like money. When you try to trade off a sacred value against an unsacred value, subjects express great indignation. (Sometimes they want to punish the person who made the suggestion.)
My favorite anecdote along these lines comes from a team of researchers who evaluated the effectiveness of a certain project, calculating the cost per life saved, and recommended to the government that the project be implemented because it was cost-effective. The governmental agency rejected the report because, they said, you couldn't put a dollar value on human life. After rejecting the report, the agency decided not to implement the measure.
Trading off a sacred value against an unsacred value feels really awful. To merely multiply utilities would be too cold-blooded--it would be following rationality off a cliff...
But altruism isn't the warm fuzzy feeling you get from being altruistic. If you're doing it for the spiritual benefit, that is nothing but selfishness. The primary thing is to help others, whatever the means. So shut up and multiply!
And if it seems to you that there is a fierceness to this maximization, like the bare sword of the law, or the burning of the Sun--if it seems to you that at the center of this rationality there is a small cold flame--
Well, the other way might feel better inside you. But it wouldn't work.
And I say also this to you: That if you set aside your regret for all the spiritual satisfaction you could be having--if you wholeheartedly pursue the Way, without thinking that you are being cheated--if you give yourself over to rationality without holding back, you will find that rationality gives to you in return.
But that part only works if you don't go around saying to yourself, "It would feel better inside me if only I could be less rational." Should you be sad that you have the opportunity to actually help people? You cannot attain your full potential if you regard your gift as a burden.