this post was submitted on 03 Aug 2025
253 points (88.0% liked)

Fuck AI

 

Source (Bluesky)

[–] theunknownmuncher@lemmy.world 7 points 13 hours ago* (last edited 13 hours ago) (15 children)

the fact that it is theft

There are LLMs trained using fully open datasets that do not contain proprietary material... (CommonCorpus dataset, OLMo)

the fact that it is environmentally harmful

There are LLMs trained with minimal power (typically the same ones as above, since these projects can't afford as many resources), and local LLMs use significantly less power than a toaster or microwave...

the fact that it cuts back on critical, active thought

This is a use-case problem. LLMs aren't suitable for critical thinking or decision-making tasks, so if it's cutting back on your "critical, active thought" you're just using it wrong anyway...

The OOP genuinely doesn't know what they're talking about and is just reacting to sensationalized rage bait on the internet lmao

[–] csh83669@programming.dev 15 points 12 hours ago (7 children)

Saying it uses less power than a toaster isn't saying much. Yes, it uses less power than a thing that literally turns electricity into pure heat… but that's sort of a requirement for toast. That's still a LOT of electricity. And it's not required. People don't need to burn down a rainforest to summarize a meeting. Just use your earballs.

[–] theunknownmuncher@lemmy.world 0 points 12 hours ago* (last edited 5 hours ago) (5 children)

Saying it uses less power than a toaster isn't saying much

Yeah, but we're talking a fraction of 1%. A toaster uses 800-1500 watts for minutes; a local LLM uses <300 watts for seconds. I toast something almost every day. I'd need to prompt a local LLM literally hundreds of times per day for AI to have a higher environmental impact than my breakfast, considering the toasting alone. I make maybe a dozen prompts per week on average.
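The toaster-vs-LLM math can be sketched quickly. The wattages below come from the comment; the durations (3 minutes of toasting, 5 seconds per prompt) are my own rough assumptions, so treat the ratio as illustrative:

```python
# Back-of-envelope per-use energy comparison (illustrative numbers).
TOASTER_WATTS = 1200   # midpoint of the 800-1500 W range above
TOAST_SECONDS = 180    # assumed ~3 minutes per toasting
LLM_WATTS = 300        # upper bound claimed for a local LLM
PROMPT_SECONDS = 5     # assumed generation time per prompt

toast_wh = TOASTER_WATTS * TOAST_SECONDS / 3600   # watt-hours per toast
prompt_wh = LLM_WATTS * PROMPT_SECONDS / 3600     # watt-hours per prompt

print(f"toast:  {toast_wh:.2f} Wh")               # → 60.00 Wh
print(f"prompt: {prompt_wh:.2f} Wh")              # → 0.42 Wh
print(f"prompts per toast: {toast_wh / prompt_wh:.0f}")  # → 144
```

With these assumptions one toasting costs roughly as much energy as ~144 local prompts; longer generations shrink that ratio, but it stays in the same ballpark.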

That’s still a LOT of electricity.

That's exactly my point, thanks. All kinds of appliances use far more power than local AI. We run them without thinking twice, and there's no anti-toaster movement on the internet claiming there's no ethical toast, no exceptions, and that you're an asshole for making toast. If a toaster using a ton of electricity is acceptable, while a local LLM uses less than 1% of that per use, then there's no argument to be made against local LLMs on the basis of electricity use.

Your argument just doesn't hold up and could be applied to literally anything that isn't "required". Toast isn't required, you just want it. People could just stop playing video games to save more electricity, video games aren't required. People could stop using social media to save more electricity, TikTok and YouTube's servers aren't required.

People don’t need to burn down a rainforest to summarize a meeting.

Strawman

[–] wizardbeard@lemmy.dbzer0.com 2 points 3 hours ago (1 children)

I won't call your point a strawman, but you're ignoring the parts of LLMs that actually have high resource costs in order to push a narrative that doesn't reflect the full picture. These discussions need to include the initial cost of gathering the dataset and, most importantly, of training the model.

Sure, post-training energy costs aren't worth worrying about, but I don't think people who are aware of how LLMs work were worried about that part.

It's also ignoring the absurd fucking AI datacenters that are being built with more methane turbines than they were approved for, and without any of the legally required pollution capture technology on the stacks. At least one of these datacenters is already measurably causing illness in the surrounding area.

These aren't abstract environmental damages from energy use that could potentially come from green power sources, and they aren't "fraction of a toast" energy costs caused only by people running queries either.

[–] theunknownmuncher@lemmy.world 1 points 2 hours ago* (last edited 2 hours ago)

Nope, I'm not ignoring them, but the post is specifically about exceptions. The OOP claims there are no exceptions and there is no ethical generative AI, which is false. Your comment only applies to the massive LLMs hosted by massive corporations, which are admittedly the majority.

The CommonCorpus dataset is less than 8 TB, so it fits on a single hard drive, not a data center, and contains 2 trillion tokens, which is a similar number of tokens to what small local LLMs are typically trained on (OLMo 2 7B and 13B were trained on 5 trillion tokens).
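As a quick sanity check, those two figures are consistent with each other: at the quoted sizes, the dataset works out to about 4 bytes per token, which is roughly typical for tokenized English text (~4 characters per token):

```python
# Sanity check on the dataset figures quoted above (sizes are approximate).
dataset_bytes = 8 * 10**12    # CommonCorpus: <8 TB
dataset_tokens = 2 * 10**12   # ~2 trillion tokens

bytes_per_token = dataset_bytes / dataset_tokens
print(f"{bytes_per_token:.1f} bytes per token")  # → 4.0 bytes per token
```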

Training these local LLMs doesn't require a massive data center, and while the energy cost of training isn't trivial, it's nothing like GPT-4's and it's a one-time cost anyway.

So, the OOP is wrong, there is ethical generative AI, trained only on data available in the public domain, and without a high environmental impact.
