Wagering Calamity: Positive Utility
The objection can be introduced with the following thought experiment. Say you had to choose between:
Option 1: A 99% chance of causing the death of everyone and a 1% chance of bringing about nine trillion maximally happy people.
Option 2: A guaranteed outcome of creating nine billion maximally happy people.
According to the principles of classical utilitarianism, the expected utility of Option 1 (0.01 × 9 trillion = 90 billion maximally happy people in expectation) exceeds that of Option 2 (9 billion). Yet Option 1 carries a glaring 99% chance of a disastrous outcome.[1]
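To make the arithmetic explicit, here is a minimal sketch of the expected-utility comparison, assuming (as the calculation above implicitly does) that extinction counts as zero utility and each maximally happy person contributes one unit:

```python
# Minimal sketch of the expected-utility comparison from the thought experiment.
# Assumptions: extinction counts as 0 utility; each maximally happy person
# contributes 1 unit of utility.

option_1 = 0.99 * 0 + 0.01 * 9e12  # 99% extinction, 1% nine trillion happy people
option_2 = 1.00 * 9e9              # guaranteed nine billion happy people

print(f"Option 1: {option_1:,.0f} expected utils")  # 90,000,000,000
print(f"Option 2: {option_2:,.0f} expected utils")  # 9,000,000,000
print("Expected value favors Option 1:", option_1 > option_2)  # True
```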
Negative Utility
The Wagering Calamity Objection becomes even more alarming when we frame it in terms of negative utility. Say you face the following decision:
Option 1: A 99% chance that everyone on Earth is tortured for all of time (-100 utils per person) and a 1% chance that a septillion happy people are created (+100 utils per person) for all of time.
Option 2: A 100% chance that everyone on Earth becomes maximally happy for all of time (+100 utils per person).
Assuming the population in both scenarios remains stable over time (or grows similarly), Expected Value Theory (and classical utilitarianism by extension) says we should choose Option 1, even though it carries a 99% chance of an s-risk, over a guaranteed everlasting utopia for everyone.
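The same arithmetic again, now with an assumed Earth population of eight billion, comparing utils per unit of time (totals over "all of time" would be infinite in both options, so the per-period comparison is what the fixed-population assumption buys us):

```python
# Sketch of the negative-utility version, comparing expected utils per unit of
# time. Assumed figures: Earth population ~8 billion; a septillion = 1e24.

population = 8e9
septillion = 1e24

option_1 = 0.99 * (-100 * population) + 0.01 * (100 * septillion)
option_2 = 1.00 * (100 * population)

print(f"Option 1: {option_1:.3e} expected utils per period")  # ~ +1.000e+24
print(f"Option 2: {option_2:.3e} expected utils per period")  # 8.000e+11
print("Expected value favors Option 1:", option_1 > option_2)  # True
```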
We can make the objection even more pernicious by combining it with the repugnant conclusion: give Option 1's 1% branch an enormous number of people whose lives are barely worth living (+1 util per person) but who still amount to more utils in aggregate, as the sketch below quantifies.
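Continuing the sketch above, the barely-worth-living population only needs to be large enough that the 1% branch outweighs both the expected torture and the guaranteed utopia (again assuming eight billion people on Earth):

```python
# How many barely-worth-living people (+1 util each) must the 1% branch create
# for Option 1 to beat Option 2 on expected value? Solve for N in:
#   0.99 * (-100 * population) + 0.01 * N > 100 * population

population = 8e9
n_required = (100 * population + 0.99 * 100 * population) / 0.01
print(f"{n_required:.2e} people")  # ~1.59e+14 -- about 160 trillion suffice
```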
Morality and the Role of Risk
Classical utilitarian calculations seem to disregard our innate moral preferences, which lean towards risk aversion. Given the scenarios presented, I think most people would choose Option 2 in both cases, valuing the certainty of a positive outcome over an immensely rewarding but perilously risky alternative.
Even if we soften the scenarios to 50/50 odds, or even favorable odds, I think most people's moral instinct would still be to avoid the option that allows for calamity. One toy way to formalize that instinct is sketched below.
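One simple risk-averse decision rule applies a concave function to each outcome's total value before taking the expectation, which dampens the pull of astronomically good but unlikely outcomes. The square-root transform below is my own illustrative choice, not a claim about how this should actually be modeled:

```python
import math

# Toy risk-averse decision rule: take the expectation of a concave transform
# (here sqrt, an arbitrary illustrative choice) of each outcome's total value.
# Numbers are from the first thought experiment; outcomes must be non-negative.

def risk_averse_value(lottery):
    """Expected sqrt-transformed value of a list of (probability, outcome) pairs."""
    return sum(p * math.sqrt(outcome) for p, outcome in lottery)

option_1 = [(0.99, 0), (0.01, 9e12)]  # 99% extinction, 1% nine trillion happy people
option_2 = [(1.00, 9e9)]              # guaranteed nine billion happy people

print(risk_averse_value(option_1))  # ~30,000
print(risk_averse_value(option_2))  # ~94,868 -- this rule now prefers Option 2
```

Under plain expected value the ranking is reversed; the concavity is what encodes the reluctance to gamble away a sure good outcome.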
The Wagering Calamity Objection compels us to think beyond mere arithmetic. It asks us to consider the moral weight of risk and the ethical implications of near-certain negative outcomes.
Afterword
This objection seems related to Pascal's mugging (and infinite ethics), but it isn't the same thing. I tried looking for it online but couldn't find it, and when I asked the 'askphilosophy' subreddit, they couldn't find it either. Please let me know if this objection already exists.
Let’s assume in this and the next scenario the population remains fixed for the sake of simplicity