this post was submitted on 23 Apr 2025
617 points (92.9% liked)

196

5442 readers

Be sure to follow the rule before you head out.


Rule: You must post before you leave.



Other rules

Posting rules:
NSFW: NSFW content is permitted but it must be tagged and have content warnings. Anything that doesn't adhere to this will be removed. Content warnings should be added like: [penis], [explicit description of sex]. Non-sexualized breasts of any gender are not considered inappropriate and therefore do not need to be blurred/tagged.

Also, when sharing art (comics etc.) please credit the creators.

If you have any questions, feel free to contact us on our matrix channel or email.


founded 2 years ago
[–] JohnDClay@sh.itjust.works 50 points 3 months ago (4 children)

What country? Sri Lanka? This isn't a useful comparison as is; I'll see if I can dig up actual numbers.

[–] Darrell_Winfield@lemmy.world 20 points 3 months ago (1 children)

Following for the results of your work here so I can use it in the future.

[–] JohnDClay@sh.itjust.works 70 points 3 months ago (3 children)

From this 2023 paper, it looks like if all Nvidia AI servers ran 24/7, you'd get an energy consumption of about 5.7–8.9 TWh per year. Nvidia servers make up 95% of the AI market (according to the paper), so that'd be pretty close to what all AI servers consume.

The paper also estimates that about 20% of the crypto-mining GPUs no longer mining Ethereum were converted to AI, which contributes another 16.1 TWh per year.

This doesn't include some AI, but it should be the majority.

Between those two sources, that gives 23.4 TWh per year, or about 0.08 exajoules per year per this converter. That's 22% of Sri Lanka's energy consumption (the lowest country on the list).

So AI in a year uses as much energy as Sri Lanka uses in 3 months. At least in 2023. I'll see if I can find a more recent study.
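The unit conversion above can be sanity-checked in a few lines. This is a rough sketch: the TWh figures are the ones quoted from the paper, and the Sri Lanka total is not sourced anywhere in the thread, so here it's simply backed out from the claimed 22%.

```python
# Sanity check of the numbers above. The Sri Lanka figure is backed out
# from the quoted 22%, so treat it as an implied assumption, not a source.
nvidia_twh = (5.7 + 8.9) / 2       # midpoint of the Nvidia-server estimate, TWh/yr
repurposed_gpu_twh = 16.1          # ex-Ethereum mining GPUs now on AI, TWh/yr
total_twh = nvidia_twh + repurposed_gpu_twh

total_ej = total_twh * 3.6e15 / 1e18   # 1 TWh = 3.6e15 J; 1 EJ = 1e18 J
sri_lanka_ej = total_ej / 0.22         # what the 22% claim implies

print(round(total_twh, 1), round(total_ej, 3), round(sri_lanka_ej, 2))
# -> 23.4 0.084 0.38
```

So the quoted 23.4 TWh and ~0.08 EJ are internally consistent, and they imply Sri Lanka uses roughly 0.38 EJ per year.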

[–] Blaster_M@lemmy.world 23 points 3 months ago (1 children)

So that assumes AI requests use 100 percent of the hardware 100 percent of the time.

[–] JohnDClay@sh.itjust.works 14 points 3 months ago* (last edited 3 months ago)

Yes, but those servers are pretty AI-specific, so that's a decent assumption. It looks like Nvidia is drastically ramping up production of these servers, so current electricity use might be about 10x. I'm working on it.

[–] taiyang@lemmy.world 15 points 3 months ago

This is the kind of comment I love on Lemmy.

[–] GiveOver@feddit.uk 1 points 3 months ago (1 children)

There are plenty of countries missing from that rankings list, and I bet those are the ones using less energy. Especially considering microstates like the Vatican, the statement could be technically correct.

[–] JohnDClay@sh.itjust.works 1 points 3 months ago

I can't find any info on Vatican City's energy use, but possibly. You could go even further and compare to not-widely-recognized countries like Sealand, where you have the energy consumption of a residential house or two. But that would be wildly misleading.

[–] Sanctus@lemmy.world 25 points 3 months ago (3 children)

Miyazaki's sadness was enough for me. He is right. This is humans losing faith in humans. Trust the machine, not yourself.

[–] taiyang@lemmy.world 18 points 3 months ago

Also, AI is still worse than a human at things like essay writing. How do I know? 'Cause I just finished grading midterms!

[–] Grimy@lemmy.world 8 points 3 months ago (1 children)

His popular AI quote is from 2016 and is missing a lot of context. What he was commenting on isn't anything like the current generative AI wave. That being said, he doesn't seem to have publicly revisited it, so it might still represent his views.

[–] Stovetop@lemmy.world 8 points 3 months ago* (last edited 3 months ago) (1 children)

Agreed. Based on ongoing circumstances and the general response from other high-profile animators in the industry, I am inclined to think that Miyazaki and others at Ghibli are still against AI art. But I also do feel that the quote from 2016 is being reused without the essential context.

Miyazaki opened his response by talking about a friend of his who suffers from a physical disability, which is entirely irrelevant to the topic of generative AI. In context, it was directed at a reinforcement-learning AI model that some artists implemented to try to animate human-like models in unorthodox and unnatural ways, with the proposed utility of using it for zombies or similar. Their suggestion was that these unnatural learned movements are meant to be seen as disturbing and monstrous.

The "insult to life itself" remark was with regards to how they seemed to be making a mockery of disability and, with his friend in mind, was not something he could approve of.

[–] Sanctus@lemmy.world 2 points 3 months ago (1 children)

I don't really see how that doesn't relate. So it's not a reinforcement learning model designed to make animations; cool, the result is still the same. The "humanity losing faith in itself" quote really can't be applied only to that one specific model made to produce terrifying animations. It clearly applies to handing all this human-made work over to machines that don't understand why we make what we make. The machine, and subsequently the people who created it, were accused by Miyazaki of not knowing suffering, of having no idea about the thing they were trying to emulate. This is what struck his core: the lack of empathy or connection to the subject. The root of all our connections and bonds comes from shared experience and empathy. He was speaking on the abandonment of these principles, and AI is the epitome of it all.

[–] polyploy@lemmy.dbzer0.com 2 points 3 months ago

Thank you, way too many people here who seem to completely misunderstand the nature of Miyazaki's resentment towards AI.

He was not simply put off by the appearance of the animations, but rather repulsed by the entire process and the idea that machines could ever replicate the creativity of humanity. This is a man who had one of his animators work more than a year on a 4-second shot, refusing to use CGI in any capacity to speed that process up. The notion that he would have anything but contempt for AI is laughable.

[–] infinitesunrise@slrpnk.net 4 points 3 months ago (1 children)

That stuff Miyazaki said was before generative AI existed. He was commenting on procedural animation being used poorly in a 3D simulation. It's fair to apply his sentiment to AI, but he himself was not talking about AI.

[–] Kolrami@lemmy.world 1 points 3 months ago (1 children)
[–] infinitesunrise@slrpnk.net 1 points 3 months ago

Oh yeah, the presentation he was commenting on did suck, and while what he said to those guys was harsh it was entirely justified.

[–] s_s@lemm.ee 19 points 3 months ago

Somebody said the Apple ads for AI look like they're describing the biggest pieces of shit you work with or know.

[–] chicken@lemmy.dbzer0.com 13 points 3 months ago* (last edited 3 months ago) (5 children)

I found a blogpost that cites a Business Insider article that implies this claim as formulated is way off:

Reported energy use implies that ChatGPT consumes about as much energy as 20,000 American homes. An average US coal plant generates enough energy for 80,000 American homes every day. This means that even if OpenAI decided to power every one of its billion ChatGPT queries per day entirely on coal, all those queries together would only need one quarter of a single coal plant. ChatGPT is not the reason new coal plants are being opened to power AI data centers.
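The arithmetic in that quote is easy to verify. Both home counts are taken at face value from the quoted article, not independent measurements:

```python
# Back-of-envelope check of the quoted Business Insider figures.
chatgpt_homes = 20_000      # homes' worth of energy ChatGPT reportedly uses
coal_plant_homes = 80_000   # homes one average US coal plant can power

fraction_of_plant = chatgpt_homes / coal_plant_homes
print(fraction_of_plant)    # -> 0.25, i.e. one quarter of a single coal plant
```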

It goes on to argue that while it's true that AI related electricity use is booming, it's not because of LLM chatbots:

AI energy use is going to be a massive problem over the next 5 years. Projections say that by 2030 US data centers could use 9% of the country’s energy (they currently use 4%, mostly due to the internet rather than AI). Globally, data centers might rise from using 1% of the global energy grid to 21% of the grid by 2030. ...

97% of the total energy used by AI as of late 2024 is not being used by ChatGPT or similar apps, it’s being used for other services. What are those services? The actual data on which services are using how much energy is fuzzy, but the activities using the most energy are roughly in this order:

* Recommender Systems - Content recommendation engines and personalization models used by streaming platforms, e-commerce sites, social media feeds, and online advertising networks.

* Enterprise Analytics & Predictive AI - AI used in business and enterprise settings for data analytics, forecasting, and decision support.

* Search & Ad Targeting - The machine learning algorithms behind web search engines and online advertising networks.

* Computer vision - AI tasks involving image and video analysis – often referred to as computer vision. It includes models for image classification, object detection, facial recognition, video content analysis, medical image diagnostics, and content moderation (automatically flagging inappropriate images/videos). Examples are the face recognition algorithms used in photo tagging and surveillance, the object detection in self-driving car systems (though inference for autonomous vehicles largely runs on-board, not in data centers, the training of those models is data-center based), and the vision models that power services like Google Lens or Amazon’s image-based product search.

* Voice and Audio AI - AI systems that process spoken language or audio signals. The most prominent examples are voice assistants and speech recognition systems – such as Amazon’s Alexa, Google Assistant, Apple’s Siri, and voice-to-text dictation services.
[–] uienia@lemmy.world 7 points 3 months ago (1 children)

You conveniently seem to have left this part from your first linked article out of your argument:

De Vries estimated in the paper that by 2027, the entire AI sector will consume between 85 to 134 terawatt-hours (a billion times a kilowatt-hour) annually.

"You're talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027," de Vries told The Verge. "I think that's a pretty significant number."

[–] chicken@lemmy.dbzer0.com 6 points 3 months ago* (last edited 3 months ago)

No, I think that gets conveyed in the second half. The argument isn't that AI as a whole isn't using a lot of electricity; it's that this electricity use is being misattributed to LLM chatbots, which are only a very small part of it.

[–] AlboTheGuy@feddit.nl 7 points 3 months ago

There's a misconception regarding the "consumption" of water, and also a bit of a bias towards AI data centers, whereas most of the water used actually goes to energy production (via coal, fuel, or even hydroelectric), which is a factor to be considered when calculating actual water use and consumption.

Regarding energy production and water "consumption", I read some papers, and as far as I could understand the numbers fluctuate wildly: 5–40% of the water that runs through the system ends up being consumed via evaporation (so it goes from potentially drinkable/usable-for-agriculture water to mostly water that ends up in the sea).

What I'm trying to say is that, yes, we should be very aware of the water we consume in our big data centers, but we should also put a great focus on the water used by the energy that fuels the data center itself. Much of the discourse ends up being "haha use water for email silly" when it should be a catalyst for a more informed approach to water consumption.

Basically, I fear that the AI industry can make use of our ignorance and appease us with some "net zero" BS, completely ignoring where most of the water is consumed and how.

And yes, there are solutions to avoid using fresh water for energy production: solar/wind, using sea water, using polluted water, more sophisticated systems that "consume" as little water as possible. These methods have drawbacks that our governments and industry refuse to face; they would rather consume and abuse our resources. I really want people to focus on that.

[–] Cypher@lemmy.world 5 points 3 months ago (2 children)

I would be interested in seeing the power consumption required to generate art with an AI vs. an artist; on an individual basis it might not stack up the way people want.

[–] kibiz0r@midwest.social 13 points 3 months ago (1 children)

Capitalist dystopia got us comparing ingested calories per unit of art

[–] infinitesunrise@slrpnk.net 2 points 3 months ago

Fuuuck, comment of the day right here IMO. This hit me.

[–] JohnDClay@sh.itjust.works 7 points 3 months ago* (last edited 3 months ago)

Maybe about 33% less electricity than human digital art? I don't feel like calculating this myself.

https://www.reddit.com/r/aiwars/comments/11v5ovu/comment/jcsj7uy/

[–] zexyqag@lemmy.world 5 points 3 months ago (1 children)

How does it actually consume the water?

[–] Nikelui@lemmy.world 9 points 3 months ago (10 children)
[–] zexyqag@lemmy.world 4 points 3 months ago (2 children)

How does it consume water? I thought it would be a closed loop?

[–] WorldsDumbestMan@lemmy.today 5 points 3 months ago

I have local AI for this reason. All it does is toast my balls a bit and waste tens of watts of electricity.

[–] Flocklesscrow@lemm.ee 4 points 3 months ago

I want cover letters to be shot in the street at noon

[–] pimento64@sopuli.xyz 1 points 3 months ago (1 children)

Also, I demand that everyone who calls it AI instead of procedural generation gets tased on the butthole.

[–] 474D@lemmy.world 1 points 3 months ago

I mean, it's not about the convenience of writing bullshit emails and generating fun pictures; that can be done locally easily. It's about these "AI" companies being shit.

[–] Ragdoll_X@sh.itjust.works 1 points 3 months ago* (last edited 3 months ago) (2 children)

You're not gonna save the world by not using ChatGPT, just like you won't save all those slaves in Zambia by not buying from Apple, and just like you didn't destroy Twitter by joining Bluesky.

To have real effect requires systemic change, so if you want to actually make a difference you can do things like canvassing, running for local office positions and school boards, educating friends and family about politics, or try killing a few politicians and tech CEOs. You know, basic stuff.

Also I asked Gemini's Deep Research to research this for me because why not UwU

Executive Summary

Estimates for the energy consumed by ChatGPT during its training and inference phases vary considerably across different studies, reflecting the complexity of the models and the proprietary nature of the data. Training a model like GPT-3 is estimated to require around 1.3 GWh of electricity^1^, while more advanced models such as GPT-4 may consume significantly more, with estimates ranging from 1.75 GWh to over 62 GWh.^2^ Models comparable to GPT-4o are estimated to consume between 43.2 GWh and 54 GWh during training.^3^ These figures represent substantial energy demands, with the training of GPT-4 potentially exceeding the annual electricity consumption of very small nations multiple times over.

The energy used during ChatGPT inference, the process of generating responses to user queries, also presents a wide range of estimates, from 0.3 watt-hours to 2.9 watt-hours per query.^4^ This translates to an estimated annual energy consumption for inference ranging from approximately 0.23 TWh to 1.06 TWh, a level of demand comparable to the entire annual electricity consumption of smaller countries like Barbados. The lack of official data from OpenAI and the diverse methodologies employed by researchers contribute to the variability in these estimates, highlighting the challenges in precisely quantifying the energy footprint of these advanced AI systems.^4^

  1. https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/

  2. https://www.baeldung.com/cs/chatgpt-large-language-models-power-consumption

  3. https://www.bestbrokers.com/forex-brokers/ais-power-demand-calculating-chatgpts-electricity-consumption-for-handling-over-78-billion-user-queries-every-year/

  4. https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
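The high end of that inference range checks out arithmetically. This sketch assumes the ~1 billion queries per day figure quoted earlier in the thread; the summary's 0.23 TWh low end presumably uses different inputs, so only the high end is expected to match:

```python
# Reproduce the inference range from the summary above.
# queries_per_day is an assumption taken from elsewhere in the thread.
wh_per_query_low, wh_per_query_high = 0.3, 2.9   # cited per-query estimates
queries_per_day = 1e9

wh_to_twh = 1e-12
low_twh = wh_per_query_low * queries_per_day * 365 * wh_to_twh
high_twh = wh_per_query_high * queries_per_day * 365 * wh_to_twh
print(round(low_twh, 2), round(high_twh, 2))   # -> 0.11 1.06
```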
