this post was submitted on 15 Sep 2025
487 points (87.3% liked)
Technology
A phone can do a lot. Much, much more than an ENIAC-era supercomputer (I think you'd have to get pretty close to the end of the last century to find a supercomputer more powerful than a modern smartphone).
What a phone can't do is run an LLM. Even powerful gaming PCs struggle with that: they can only run the less powerful models, and queries that would feel instant on service-based LLMs take minutes, or at least tens of seconds, on a single consumer GPU. Phones certainly can't handle that, but that doesn't mean they "can't do anything".
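As a back-of-the-envelope sketch of why local generation is so much slower: single-batch LLM decoding is roughly memory-bandwidth bound, since every generated token reads all the weights once. The bandwidth figures below are illustrative assumptions, not measurements of any particular device:

```python
def tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    # Rough upper bound: each token requires streaming the full weights
    # through memory once, so bandwidth / model size caps the token rate.
    return bandwidth_gb_s / model_size_gb

# Hypothetical numbers: a consumer GPU at ~500 GB/s vs a phone at ~50 GB/s,
# both running a 4 GB quantized model:
print(tokens_per_second(500, 4))  # 125.0 tokens/s ceiling
print(tokens_per_second(50, 4))   # 12.5 tokens/s ceiling
```

Real throughput lands well below these ceilings once compute, KV-cache reads, and software overhead are counted, which is why long queries stretch into tens of seconds.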
I've run small models (a few GB in size) on my Steam Deck. It gives reasonably fast responses (faster than a person would type).
I know they're far from state of the art, but they do work, and the Steam Deck isn't going to be using much power.
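For a rough sense of why "a few GB" is the size class that fits on a handheld, you can estimate the weight footprint from parameter count and quantization level (a sketch with my own illustrative numbers, not figures from the comment):

```python
def weight_footprint_gb(params_billions: float, bits_per_weight: float) -> float:
    """Memory for the weights alone; KV cache and runtime overhead come on top."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

print(weight_footprint_gb(7, 4))   # 3.5  -> a 4-bit 7B model is ~3.5 GB
print(weight_footprint_gb(7, 16))  # 14.0 -> the same model in fp16 barely fits in 16 GB
```

So a 4-bit-quantized 7B model sits comfortably inside a 16 GB device like the Steam Deck, while unquantized weights would crowd out everything else.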