Will EA renounce utilitarianism as a guiding principle by the end of 2030?
32
Ṁ1997
2031
16%
chance

For this to resolve positively, the general trend in EA has to shift away from utilitarianism and towards some other moral philosophy. (It can still be consequentialist.) Not every EA must renounce utilitarianism; it's fine if there's still a lot of discussion about it. But it can't be the main "thing" that EA is known for anymore.


Heavily annoyed by the wording of this market. Utilitarianism is not a guiding principle of EA; see CEA's guiding principles, the core EA principles, and MacAskill's PDF.

De re, something like 60-70% EA might define themselves as utilitarian, but de dicto that's not even a soft requirement.

Marketing EA as resting on utilitarianism makes people unacquainted with it reject it for bad reasons.

bought Ṁ10 NO

Impossible. The whole schtick is predicated on "measuring" well-being.

Does rule utilitarianism count?

predicts NO

@CodeSolder It's a type of utilitarianism, so yes. However if people start advocating for applying it so broadly/selectively that it just becomes deontology, I won't count that.

@IsaacKing one could argue all deontology is rule utilitarianism by natural selection

But we agree an outcome equivalent to "we decided the utilitarian thing is to use deontology because it in practice leads to the highest average utility" for a large part of the decision space does count?

@IsaacKing in other words do we count the prevailing belief being that utilitarianism is theoretically/morally correct/True but not actually feasible/worth implementation on the scale of individual decisions?

predicts NO

@CodeSolder I guess Kant's categorical imperative is sort of a form of rule utilitarianism (with the "rule" part taken to such an extreme that it's no longer really utilitarianism) now that you mention it. But I don't think all forms of deontology would fit that. In principle, deontology could prescribe a rule that always has negative consequences when followed.

How are you defining utilitarianism? Does it refer specifically to hedonic utilitarianism? Normally, I use the term synonymously with consequentialism, but obviously that's not how you're using it, given the description.

predicts NO

@JosephNoonan They're not at all the same thing. Consequentialism just means that the morality of an action is determined by its consequences. It doesn't dictate what consequences people care about. A completely self-centered person who assigns 0 value to anyone else's well-being could still be consequentialist.

Utilitarianism is the principle that we should maximize overall utility or average utility, or minimize disutility, across all beings.

@IsaacKing Ok, that distinction makes sense. So ethical egoism, for example, would count as consequentialism but not utilitarianism, but things like preference utilitarianism and other forms that count more things than just conscious experience as good are still forms of utilitarianism.

predicts NO

@JosephNoonan Yeah, the main debates within utilitarianism are what exactly counts as "utility", and how to aggregate individual utility into a single number to compare world states.

@IsaacKing
"Utilitarianism is the principle that we should maximize overall utility or average utility, or minimize disutility, across all beings."

But this definition doesn't include, for example, rule utilitarianism.
(And in fact, I don't think rule utilitarianism should really count as a form of utilitarianism; it isn't even consequentialist.)

Also, is it utilitarian if it includes only humans as the beings in consideration?

If yes, then it's hard to see why egoism wouldn't also count; we just have to restrict the group even further, using the same logic. (I think it should be no, but some think it is still utilitarian if yes.)

I don't think there is any simple definition which would include everything with "utilitarianism" in its name and reject the rest (it would be list-like: A or B or C).