[–] astanix@lemmy.world 6 points 21 hours ago (3 children)

Is it like crypto, where CPUs were good, then GPUs, then FPGAs, then ASICs? Or is this different?

[–] wewbull@feddit.uk 10 points 17 hours ago (1 children)

I think it's different. The fundamental operation of all these models is multiplying big matrices of numbers together, and GPUs are already optimised for exactly that. Crypto was the opposite: the algorithm was bent to fit the GPU rather than being a natural fit for it.
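A minimal sketch of that core operation (shapes here are made up purely for illustration):

```python
import numpy as np

# A model's layers are dominated by dense matmuls like this one.
# Illustrative shapes: 512 tokens, 4096-dim hidden state.
activations = np.random.rand(512, 4096).astype(np.float32)
weights = np.random.rand(4096, 4096).astype(np.float32)

# A single matmul: 512 * 4096 * 4096 ~= 8.6 billion multiply-adds.
# GPU tensor cores exist precisely to churn through these.
output = activations @ weights
```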

With FPGAs you take roughly a 10x hit in clock speed, but you can implement precisely the algorithm you want. ASICs then give you the clock speed back.

GPUs are already ASICs that implement the ideal operation for ML/AI, so FPGAs would be a step backwards.

[–] astanix@lemmy.world 3 points 12 hours ago

Thank you for the explanation!

[–] brucethemoose@lemmy.world 3 points 14 hours ago* (last edited 14 hours ago)

If BitNet or some other technical innovation pans out? Straight to ASICs, yeah.

Future smartphones will probably be pretty good at running them.
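For context: BitNet-style models constrain weights to three values, so the big matmuls degenerate into additions and subtractions, which is a perfect fit for a small custom chip. A rough sketch of the idea (shapes are illustrative, not from any real model):

```python
import numpy as np

# BitNet-style ternary weights: every weight is -1, 0, or +1.
weights = np.random.choice([-1, 0, 1], size=(4096, 4096)).astype(np.int8)
activations = np.random.rand(512, 4096).astype(np.float32)

# With ternary weights, each output element is just sums and differences
# of activations: no multiplier circuits needed, which is why a dedicated
# ASIC could be so small and power-efficient.
output = activations @ weights  # NumPy still multiplies; the hardware wouldn't
```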

[–] cley_faye@lemmy.world 2 points 19 hours ago (1 children)

It's probably different. The crypto bubble never actually produced much in the way of useful work.

Now, I'm saying this with a HUGE grain of salt, but there are decent applications for LLMs (let's not call them AI). Unfortunately, those uses are not really the focus of any of the businesses pouring tons of money into their "AI" offerings.

I kinda hope we'll get better LLM hardware so we can run them privately, using ethically sourced models, because some of this stuff is really neat. But that's not the push they're going for right now. Fortunately, we can already sort of do that, although the provenance of many publicly available models is currently… not that great.
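That local option already exists in rough form; a sketch of what it can look like with llama-cpp-python (the model path is a placeholder, not a real file):

```python
# Private, local inference (pip install llama-cpp-python).
from llama_cpp import Llama

# Load a quantized model entirely on your own hardware; the path is a
# placeholder for whatever GGUF model file you've downloaded.
llm = Llama(model_path="./models/example-7b-q4.gguf", n_ctx=2048)

# The prompt never leaves your machine.
result = llm("Explain what an FPGA is in one sentence.", max_tokens=64)
print(result["choices"][0]["text"])
```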

[–] KumaSudosa@feddit.dk 6 points 16 hours ago

LLMs are absolutely amazing for a lot of things. I use them at work all the time to check code blocks or remember syntax. But they are NOT, and should NOT be, your main source of general information, and we collectively have to realise how problematic and energy-hungry they are.