this post was submitted on 03 Aug 2025
26 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


There's a very long history of extremely effective labor-saving tools in software.

Writing in C rather than Assembly, especially for more than 1 platform.

Standard libraries. Unix itself. More recently, developing games in Unity or Unreal instead of rolling your own engine.
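
To make the C-versus-assembly point concrete, here's a minimal sketch: one trivial function in portable C, with a rough hand-written x86-64 version in a comment below it (an illustrative approximation, not real compiler output).

```c
#include <stdio.h>

/* Portable: this compiles anywhere a C compiler exists. */
int dot3(const int a[3], const int b[3]) {
    int sum = 0;
    for (int i = 0; i < 3; i++)
        sum += a[i] * b[i];
    return sum;
}

/*
 * A rough hand-written x86-64 (SysV) equivalent, sketched as a
 * comment. Illustrative approximation only, not real compiler
 * output -- and you'd rewrite it from scratch for ARM, RISC-V, etc.:
 *
 *   dot3:
 *       mov  eax, [rdi]       ; a[0]
 *       imul eax, [rsi]       ; a[0] * b[0]
 *       mov  ecx, [rdi+4]     ; a[1]
 *       imul ecx, [rsi+4]     ; a[1] * b[1]
 *       add  eax, ecx
 *       mov  ecx, [rdi+8]     ; a[2]
 *       imul ecx, [rsi+8]     ; a[2] * b[2]
 *       add  eax, ecx
 *       ret
 */

int main(void) {
    int a[3] = {1, 2, 3}, b[3] = {4, 5, 6};
    printf("dot3 = %d\n", dot3(a, b)); /* prints dot3 = 32 */
    return 0;
}
```

The C version is a few lines and runs anywhere there's a compiler; the assembly has to be redone per architecture, which is exactly the labor being saved.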

And what happened when any of these tools came on the scene was a mad gold rush to develop products that weren't feasible before. Not layoffs, not "we don't need to hire junior developers any more".

Rank-and-file vibe coders seem to perceive Claude Code (for some reason, mostly just Claude Code) as something akin to the advantage of using C rather than Assembly. They are legit excited to code new things they couldn't code before.

Boiling the rivers to give them an occasional morale boost with "You are absolutely right!" is completely fucked up and I dread the day I'll have to deal with AI-contaminated codebases, but apart from that, they have something positive going for them, at least in this brief moment. They seem to be sincerely enthusiastic. I almost don't want to shit on their parade.

The AI-enthusiast bigwigs, on the other hand, are firing people, closing projects, and talking about not hiring juniors any more, and they've got the media to report on it as "AI layoffs". They just gleefully go on about how being 30% more productive means they can fire a bunch of people.

The standard answer is that they hate having employees. But they always hated having employees. And there were always labor-saving technologies.

So I have a thesis here, or a synthesis perhaps.

The bigwigs who tout AI (while acknowledging that it needs humans for now) don't see AI as ultimately useful in the way a C compiler was useful. Even if it's useful in some context, they still don't believe it can be. They see it as more powerfully useless. Each new version is meant to be a bit more like AM or (clearly AM-inspired, but more familiar) GLaDOS, one that will get rid of all the employees once and for all.

top 8 comments
[–] traecer@techhub.social 5 points 2 hours ago

@diz I love this comment in the thread: "AI is good at creating work that looks correct to someone that doesn’t understand the work."

[–] WoodScientist@lemmy.world 34 points 14 hours ago (1 children)

AI is good at creating work that looks correct to someone that doesn't understand the work.

[–] Tar_alcaran@sh.itjust.works 14 points 6 hours ago (1 children)

Which is exactly why nobody uses AI for their work, because what they do is complex and nuanced and they can see the AI is full of shit.

But your work is easy, and AI produces stuff just like it, because I'm not smart enough to tell the difference.

[–] derpgon@programming.dev 2 points 4 hours ago

As a senior PHP developer, I can say that it is absolutely useless for more than writing boilerplate unit tests.

Best case scenario, you HAVE TO review the code. Remember, you are submitting the changes, not the AI.

[–] corbin@awful.systems 11 points 14 hours ago

Well, is A* useful? But that's not a fair example, and I can actually tell a story that is more specific to your setup. So, let's go back to the 60s and the birth of UNIX.

You're right that we don't want assembly. We want the one true high-level language to end all discussions and let us get back to work: Fortran (1956). It was arguably IBM's best offering at the time; who wants to write COBOL or order the special keyboard for APL? So the folks who would write UNIX plotted to implement Fortran. But no, that was just too hard, because the Fortran compiler needed to be written in assembly too. So instead they ported Tmg (WP, Esolangs) (1963), a compiler-compiler that could implement languages from an abstract specification. However, when they tried to write Fortran in Tmg for UNIX, they ran out of memory! They tried implementing another language, BCPL (1967), but it was also too big. So they simplified BCPL to B (1969) which evolved to C by 1973 or so. C is a hack because Fortran was too big and Tmg was too elegant.

I suppose that I have two points. First, there is precisely one tech leader who knows this story intimately, Eric Schmidt, because he was one of the original authors of lex in 1975, although he's quite the bastard and shouldn't be trusted or relied upon. Second, ChatGPT should be considered as a popular hack rather than a quality product, by analogy to C and Fortran.

[–] fodor@lemmy.zip 8 points 14 hours ago

"It is difficult to get a man to understand something, when his salary depends upon his not understanding it!" -Upton Sinclair

[–] 4am@lemmy.zip -1 points 14 hours ago* (last edited 14 hours ago) (1 children)

In the hands of an experienced coder, AI being used as an autocomplete, as a test suite for easy obvious bugs, etc., is a time saver.

The big push for AI, though, is to sow doubt about physical evidence. “This document was faked by creating it with AI”, “This video is doctored by AI and is a deepfake”, etc.

Now they think they have a machine that they can blame for evidence of their crimes. It was never about new tools for vibe coding.

[–] Seminar2250@awful.systems 6 points 6 hours ago* (last edited 6 hours ago)

you can save even more time by not doing the work at all

the output is more consistent than what an LLM shits out, too

Edit: serious note, even though you probably aren't worth anyone's time: you may be conflating the technology's actual use cases (as an accountability sink and to spread misinformation) with the intentions of its creators. and the real reason higher-ups are pushing this is because they're pliant dipshits that would eat dogfood if the bowl was labelled "FOMO". also they hate paying employees