this post was submitted on 09 Sep 2025
132 points (97.1% liked)

PhilosophyMemes

top 16 comments
[–] SmoothOperator@lemmy.world 29 points 1 week ago

Bentham might say that utilitarianism isn't about comparing more or less arbitrary values of utility across different actions or outcomes, but about forcing ourselves to ask whether there is any utility to the outcome at all.

Is it better to donate money to cancer research, or to give the money to a beggar in the street? Entirely unclear; it's essentially impossible to calculate the relative utility of these actions until you agree on some measure of utility. That's fine; that's not really what utilitarianism is for.

Is it moral for the state to execute people for their homosexuality, as the UK did in Bentham's time? Maybe according to religious morals, or traditional morals, or duty ethics. Not according to utilitarianism. Absolutely nobody benefits from this, and the suffering is immense.

Utilitarianism, when applied correctly, forces us to critically investigate every action that causes suffering and ask: can this actually be justified?

[–] showmeyourkizinti@startrek.website 17 points 1 week ago (2 children)

I hate to break it to you but all values are made up.

[–] Nemo@slrpnk.net 10 points 1 week ago (1 children)

look at this user, they don't even believe in universal moral truth

What is universal? What is moral, and what is truth? Oh man, maybe I did smoke too much weed in the '60s?

[–] lugal@lemmy.dbzer0.com 3 points 1 week ago (1 children)

The thing is that utilitarians have this pseudo-arithmetic concept that looks objective when it isn't. Other schools of thought are more openly subjective, and therefore more honest.

[–] SmoothOperator@lemmy.world 7 points 1 week ago* (last edited 1 week ago) (2 children)

Do you have an example of this pseudo-arithmetic? You mean like the trolley problem, that saving five people is better than ~~saving~~ not murdering one?

[–] Nemo@slrpnk.net 7 points 1 week ago (1 children)

ahem

"...that murdering one person is better than letting five die"

FTFY

[–] SmoothOperator@lemmy.world 4 points 1 week ago

True, fixed it

[–] lugal@lemmy.dbzer0.com 0 points 1 week ago (1 children)

The trolley is a good example. Or "You run into a burning house and can either save a dog or a human who is in a coma." Like, don't even pretend you have a metric for situations that specific. There is also longtermism, which I'm sure not all utilitarians subscribe to, basically saying that there will be so many people in the future that it's more important to invest in the technology of my stakeholders than to help our contemporaries. And it doesn't matter that I'm rich, because I will have more offspring than the poor, so it's a net positive. As if you could foresee all the consequences. What you can in fact foresee is the consequence of treating people as your equals.

[–] SmoothOperator@lemmy.world 7 points 1 week ago* (last edited 1 week ago)

Three good examples - I'd say that

  • the trolley problem is a reasonable application of utilitarianism, not depending on any other metric than "it is good to stop a person from dying". The main problem with applying it in practice is not the arithmetic (which is sound), but that you are almost never guaranteed that killing the one person will actually save the others.
  • the comatose man vs. the dog in a burning building is a good example of a case where utilitarianism can't give you an answer, but can give you a way of investigating the problem by discussing what utility means in this situation.
  • longtermism is like the reverse of utilitarianism to me. Utilitarianism asks you to set aside the abstract and justify ethics based on actual consequences. Longtermism asks you to ignore the consequences of the present in favor of some made-up abstract future utility. It's the opposite of utilitarianism.

Ultimately, utilitarianism isn't about calculating which situation gives more utility, but about critically investigating whether your actions actually make the world a better place.

[–] Justas@sh.itjust.works 7 points 1 week ago (1 children)

We will assign this value based on supply and demand and call it "price".

[–] Deceptichum@quokk.au 8 points 1 week ago* (last edited 1 week ago) (1 children)

Ooh, you know what would be really funny? If we assigned people's ability to live or die based on this.

[–] Justas@sh.itjust.works 1 points 1 week ago (1 children)

I don't know, it kinda sounds like that other arbitrary system. What was it called, crappy-talism?

*Kappa Talisman.

Exactly! It's always subjective. You might even say, "It's not turtles, but subjectivity, all the way down."

[–] whimsy@lemmy.zip 4 points 1 week ago

Aah yes, effective altruism