Yeah, LLMs kinda-sorta-almost work for nearly anything, but their failures have a uniform distribution in terms of seriousness - an LLM is just as likely to give an answer that will kill people if acted upon as it is to make a minor mistake in an answer.
Statistical text generators don't have logical consistency checks or contextual awareness the way people do, and that makes LLMs unsuitable for just about any application where there are error modes that could be costly or dangerous. Even barely trained people can work in those settings, because some things are obviously dangerous or wrong to even the dumbest of humans, so they won't just do them; plus, humans tend to put much more effort and attention into avoiding the worst kinds of mistakes than the lighter kind.
Of course, one has to actually be capable of logically analyzing things to figure out this core inherent weakness in how LLMs work when it comes to using them in most domains - it's not directly visible, it's a matter of process - and that's not something talkie-talkie grifters are good at, since they're used to dealing with people, who can be pushed around and subtly manipulated, unlike Mathematics and Logic.
'LLMs specifically won't work.'
'No, see, LLMs won't work.'
Okay.
I'm not disagreeing; rather, I'm expanding on your point.