[–] UnderpantsWeevil@lemmy.world 2 points 4 hours ago (1 children)

Every company uses storage, and every growing company needs more.

You're comparing mountains to molehills. That's before you consider improvements in storage and compression relative to demands for space, or the degree to which our storage capacity "needs" are predicated on the voracious appetite of AI models and their unwanted output. Or, for that matter, the inefficient distribution of data and the proliferation of spam data that predates the AI boom.

very few companies are training generative AIs

Most US Growth Now Rides on AI—And Economists Suspect a Bubble

The expansion in demand is entirely being driven by the expansion in AI capacity.

[–] FlowVoid@lemmy.world 1 points 3 hours ago* (last edited 2 hours ago) (1 children)

That article doesn't say what you imply it does. Companies may be using ChatGPT to grow, but that doesn't mean they are training AIs.

And the distinction is critical to energy usage. Training a new AI uses a lot of energy. Querying an existing AI uses far less.
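To give a rough sense of scale, here's a back-of-envelope sketch. The figures are widely cited public estimates, not measurements from this thread: roughly 1,300 MWh for a GPT-3-class training run and on the order of a few watt-hours per chatbot query (newer estimates are lower still). Treat both numbers as loudly hedged assumptions.

```python
# Back-of-envelope comparison of training vs. per-query energy.
# All figures are rough public estimates / assumptions, not measurements:
#   - GPT-3-scale training run: ~1,300 MWh (commonly cited estimate)
#   - single chatbot query:     ~3 Wh (often-cited estimate; newer ones are lower)

TRAINING_ENERGY_WH = 1_300 * 1_000_000  # 1,300 MWh expressed in watt-hours
QUERY_ENERGY_WH = 3                     # one query, in watt-hours

queries_per_training_run = TRAINING_ENERGY_WH / QUERY_ENERGY_WH
print(f"One training run ≈ {queries_per_training_run:,.0f} queries' worth of energy")
# ≈ 433,333,333 queries — several orders of magnitude apart, which is the
# distinction being drawn here: training is a huge one-time cost, each query is small.
```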

[–] UnderpantsWeevil@lemmy.world 1 points 1 hour ago* (last edited 1 hour ago) (1 children)

Companies may be using ChatGPT to grow, but that doesn’t mean they are training AIs.

It's the MAG7 that's driving growth, and they're all fixated on training AI in some capacity.

Training a new AI uses a lot of energy. Querying an existing AI uses far less.

It costs $5 for each 10s video generation, based on Azure's published rates for the first Sora model.

That's presumably a lot of energy.

[–] FlowVoid@lemmy.world 1 points 49 minutes ago

The MAG7 operate large and growing cloud services, so their datacenter costs would grow even without any AI training.

And charging $5 for a video query does not mean the query uses $5 of energy. The query is priced to recoup training costs that were already incurred.
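To make that concrete, a hedged sketch with assumed numbers (not figures from either comment or from Azure's docs): if the entire $5 went to electricity at a typical industrial rate of around $0.08/kWh, it would buy on the order of 60 kWh, far more than a GPU server would plausibly draw over the minutes a short clip takes to render. So most of that price has to be covering training, hardware amortization, and margin rather than the energy of the query itself.

```python
# Hedged back-of-envelope: how much electricity could $5 actually buy,
# versus a plausible draw for one short video-generation job?
# Assumed numbers (illustrative only, not from the thread or Azure pricing):
ELECTRICITY_PRICE_PER_KWH = 0.08   # assumed industrial electricity rate, $/kWh
GPU_NODE_POWER_KW = 10.0           # assumed 8-GPU server under full load, ~10 kW
JOB_DURATION_HOURS = 5 / 60        # assume the 10 s clip takes ~5 minutes to render

kwh_if_price_were_all_energy = 5.00 / ELECTRICITY_PRICE_PER_KWH
kwh_actually_drawn = GPU_NODE_POWER_KW * JOB_DURATION_HOURS

print(f"$5 of electricity ≈ {kwh_if_price_were_all_energy:.1f} kWh")
print(f"Assumed job draw  ≈ {kwh_actually_drawn:.2f} kWh")
# ≈ 62.5 kWh vs ≈ 0.83 kWh: even with generous assumptions, energy is a small
# slice of the $5 — the rest recoups training, hardware, and margin.
```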