this post was submitted on 03 Aug 2025
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
AI saved my pet's life. You won't convince me it's 100% all bad and there's no "right" way to use it.
The way it is trained isn't intellectual theft, imo.
It only becomes intellectual theft if it is used to generate something that then competes with and takes away profits from the original creators.
Thus the intellectual theft only kicks in at generation time, but the onus is still on the AI owners for not preventing it.
However, if I use AI to generate anything that doesn't "compete" with anyone, then "intellectual theft" doesn't matter.
For example, I used it to assist with diagnosing a serious issue my pet was having 2 months ago, one that was stumping even our vet, and it got the answer right. Our vet was surprised when we asked them to check a very esoteric possibility (which they dubiously checked, and then were shocked to find something there).
They asked us how on earth we managed to guess to check that place of all things, how we could possibly have known. As a result we caught the issue very early, when it was easy to treat, and saved our pet's life.
It was a gallbladder infection, and her symptoms had like 20 other more likely causes individually.
But when I punched all her symptoms into GPT, every time, it asserted it was likely the gallbladder. It had found some papers on other animals and mammals describing how gallbladder infections can, in rare cases, cause that specific combination of symptoms, and it encouraged us to check it out.
If you think "intellectual theft" still applies here, despite it being used to save an animal's life, then you are the asshole. No one "lost" profit or business to this, no one's intellectual property was infringed, and consuming the same amount of power it takes to cook 1 pizza in my oven to save my pet's life is a pretty damn good trade, in my opinion.
So, yes. I think I used AI ethically there. Fight me.
Regular search could have also surfaced that information
Not at a tremendously lower power cost, anyway. My laptop draws 35W.
5 minutes of GPT is genuinely less power consumption than several hours of my laptop being actively used to do the searching manually. Laptops burn non-trivial amounts of power when in use; anyone who has held a laptop on their lap can attest that they aren't exactly running cold.
Hell, even a whole day of using your mobile phone is non-trivial in power consumption; they also draw 8~10W or so.
Using GPT for dumb shit is arguably unethical, but only in the sense that baking cookies in the oven is. You gonna go and start yelling at people for making cookies? Cooking up one batch of cookies burns WAAAY more energy than fucking around with GPT. And yet I don't see people going around bashing people for using their ovens to cook things as a hobby.
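A rough back-of-envelope sketch of that comparison, using the 35W laptop figure from above and assumed values for the oven and the per-query LLM energy (published estimates of the latter vary widely, so treat these as illustrative only):

```python
# Back-of-envelope energy comparison (illustrative numbers only).
# Per-query LLM energy is an assumed rough estimate, not a measured value.

LAPTOP_WATTS = 35          # stated laptop draw
SEARCH_HOURS = 3           # "several hours" of manual searching
OVEN_WATTS = 2400          # assumed typical electric oven draw
BAKE_HOURS = 0.75          # one batch of cookies, including preheat (assumed)
WH_PER_GPT_QUERY = 3       # assumed upper-end estimate per query
QUERIES = 20               # a generous diagnostic session

laptop_wh = LAPTOP_WATTS * SEARCH_HOURS   # 105 Wh
oven_wh = OVEN_WATTS * BAKE_HOURS         # 1800 Wh
gpt_wh = WH_PER_GPT_QUERY * QUERIES       # 60 Wh

print(f"Manual search on laptop: {laptop_wh:.0f} Wh")
print(f"One batch of cookies:    {oven_wh:.0f} Wh")
print(f"{QUERIES} GPT queries:          {gpt_wh:.0f} Wh")
```

With these assumptions the diagnostic session comes in below the laptop search and far below the oven, which is the whole point of the comparison.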
There's no good argument against what I did, by all metrics it genuinely was the ethical choice.
Querying the LLM is not where the dangerous energy costs have ever been. It's the cost of training the model in the first place.
The training costs effectively enter a "divide by infinity" argument given enough time.
While they continue to train models at this time, eventually you hit a point where a given model can be used in perpetuity.
Costs to train go down, whereas the usability of that model stretches on to effectively infinity.
So you hit a point where you have a one time energy cost to make the model, and an infinite timescale to use it on.
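Here's a small sketch of that amortization argument with made-up numbers (the training and per-query figures are assumptions, not real measurements): the per-query cost is the one-time training energy divided by total queries served, plus the inference energy.

```python
# Amortization sketch: assumed numbers, not real figures.
# per-query energy = (one-time training energy / total queries served) + inference energy

TRAINING_WH = 5e10        # assumed one-time training cost (~50 GWh)
INFERENCE_WH = 3          # assumed energy per individual query, in Wh

for total_queries in (1e6, 1e9, 1e12, 1e15):
    amortized = TRAINING_WH / total_queries + INFERENCE_WH
    print(f"{total_queries:.0e} queries -> {amortized:,.2f} Wh per query")

# As total queries grow without bound, the training term tends to zero
# and the per-query cost approaches the inference cost alone.
```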
Costs to train are going up exponentially. In a few years corps are going to want a return on the investment and they're going to squeeze consumers.
I can't wait for this to happen. Any day now, surely.