this post was submitted on 26 Aug 2025
Tech
you are viewing a single comment's thread
I don't get it. Is AI bad in every possible way, and it never works and always lies, singlehandedly destroying the planet while nobody uses it... and yet jobs are being lost?
I find AI to be extremely knowledgeable about everything, except anything I am knowledgeable about. Then it's like 80% wrong. Maybe 50% wrong. But it's significant.
So, the C-suite sees it churning out some basic code - not realising that code is 80% wrong - and thinks they don't need as many junior devs. Hell, might as well get rid of some mid-level devs as well, cause AI will make the other mid-level devs more efficient.
And when there aren't as many jobs for junior devs, there aren't as many people eligible for mid devs or senior devs.
I know it seems like the whole "Immigrants are lazy and leech off benefits. Immigrants are taking all our jobs" kinda thing.
But actually it's that LLMs are very good at predicting what the next word might be, not should be.
So it seems correct to people that don't actually know. While people that do know can see it's wrong (but maybe not in all the ways it's wrong), and have to spend as much time fixing it as they would have if they had just fucking written it themselves in the first place.
Besides which, by the time an AI prompt is suitably created to get the LLM to generate its approximation of the solution for a problem.... Most of the work is done, the programmer has constrained the problem. The coding part is trivial in comparison.
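To illustrate the "might be, not should be" point, here's a toy bigram model. It just picks the statistically most common next word from its tiny made-up "training" text, with zero notion of whether the continuation is actually true - the corpus and words here are invented for the example, not how any real LLM is trained:

```python
from collections import Counter, defaultdict

# Toy "training" corpus: the model learns frequencies, not facts.
corpus = "the cat sat on the mat the cat ate the fish the dog sat".split()

# Count how often each word follows each other word (bigram counts).
next_words = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev][nxt] += 1

def predict(word):
    # Return the single most likely next word: what *might* come next,
    # with no check of whether it *should*.
    return next_words[word].most_common(1)[0][0]

print(predict("the"))  # "cat" - the most frequent follower, not the "correct" one
```

Real models are vastly bigger and cleverer than this, but the objective is the same shape: rank likely continuations.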
I think most people don't understand what programmers do. They don't know why you need all these people to build an app. They think it's just coding.
This matches my experience exactly. The problem is that the C-suite generally isn't expert in anything, and doesn't even realize it. They're going to keep thinking AI is amazing forever and not understand that's where the crash came from.
Programming isn't about syntax or language.
LLMs can't do problem solving.
Once a problem has been solved, the syntax and language is easy.
But reasoning about the problem is the hard part.
Like the classic case of "how many 'r's in 'strawberry'", LLMs would state 2 occurrences.
Just check Google's AI Mode.
The strawberry problem was found and reported on, and has been specifically solved.
Prompted: how many 'r's in the word 'strawberry'
Prompted: how many 'c's in the word 'occurrence'
So, the specific case has been solved. But not the problem.
In fact, I could slightly alter my prompt and get either 2 or 3 as the answer.
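For contrast, the counting itself is trivial for ordinary deterministic code, which is kind of the point - a few lines never get it wrong, and the answer doesn't change with the phrasing of the question:

```python
# Deterministic letter counting: no tokenization, no guessing.
def count_letter(word, letter):
    return word.lower().count(letter.lower())

print(count_letter("strawberry", "r"))   # 3
print(count_letter("occurrence", "c"))   # 3
```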
That's actually well said. I use it a lot and really love it, but I've found this is a forbidden opinion on the fediverse 😆 Usually I get at least insulted immediately, if not banned, for saying that. I was in a company that tried to develop some AI apps and kind of failed, but I learned a lot about how to use AI, what can be done, and what is not sensible to do with it.
When this whole thing began I thought a lot about finding a job away from tech, but I slowly realized this AI is not replacing humans any time soon, so I remained in tech, though, for good or bad, not in AI.
That's basically the "AI is replacing jobs. AI can't replace jobs".
C-suite don't get it. It's a hugely accessible framework that anyone can use. But only trained people can use the results. But c-suite trust the results because software has been so predictable (so trustworthy) in the past.
C-suite replace employees with AI. AI can't actually do the job that it pretends it can do. Everyone suffers, and the people selling the shovels profit the most from the gold rush.
It lies on its resume and in its interviews, but in ways that are hard to detect.
I bet there was a similar sentiment when automation replaced blue collar jobs.
And yet, all those automations still require tool and die manufacturing and maintenance. Buy a tool & die from wherever, purpose-built for your process, and a year down the line you need the supplier to maintain the actual die - the actuators and machine can be maintained by anyone, but the "business logic" is what produces a good, high-quality part. Process changes? Updated design? Changing supplier to a slightly different material? Back to the supplier for a new die.
But so many jobs were made "redundant" by cheap tooling and automation, and now it's (nearly) impossible to actually manufacture something at scale in America.
Except LLMs produce the next most likely step, to the most likely dimensions, based on the prompt and on the popularity of similar/previous processes.
Fine for art and subjective medium, not for manufacturing and not for engineering.
I guess you could write automated tests which define the behaviour you want.
Probably better to write the behaviour you want and get AI to generate automated tests....
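A minimal sketch of that workflow, using a hypothetical `slugify` function as the behaviour you want: the human writes the test that pins down correct behaviour, and any implementation - hand-written or generated - only counts as done when that spec passes:

```python
import re

# Human-written spec: the test defines the behaviour we actually want.
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  spaces  ") == "spaces"

# A candidate implementation (this one is hand-written for the example).
def slugify(text):
    # Lowercase, collapse runs of non-alphanumerics into single hyphens,
    # and trim hyphens from the ends.
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

test_slugify()
print("spec passed")
```

The asymmetry is the author's point: the spec encodes the human judgement, so letting the AI write the tests hands over the one part that actually constrains the problem.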