this post was submitted on 08 Apr 2026

Programmer Humor

[–] spectrums_coherence@piefed.social 65 points 21 hours ago* (last edited 15 hours ago) (2 children)

LLMs are very good at programming when there are strong guardrails around them. For example, exploit testing is a great use case because getting a shell is getting a shell.

They kind of act as a smarter version of the infinite monkeys, able to try and iterate much more efficiently than a human does.

On the other hand, in tasks that require creativity, architecture, or projects without guardrails, they tend to do a terrible job, often yielding solutions that are more convoluted than they need to be, or just plain incorrect.

I find it is yet another replacement for "pure labor", where the least intelligent part of programming, i.e. writing the code, is automated away. While I will still write code from scratch when I am trying to learn, I will likely be able to automate some code writing if I know exactly how to implement it in my head and I also have access to plenty of testing to guarantee correctness.

[–] Serinus@lemmy.world 39 points 20 hours ago (2 children)

People have trouble with the middle ground. AI is useful in coding. It's not a full replacement. That should be fine, except you've got the AI techbros and CEOs on one end thinking it will replace all labor, and you've got the backlash to that on the other end that wants to constantly talk about how useless it is.

[–] brianpeiris@lemmy.ca 4 points 13 hours ago* (last edited 12 hours ago) (1 children)

I suspect the problem is that there are many developers nowadays who don't care about code quality, actual engineering, and maintenance. So the people who are complaining are right to be concerned that there is going to be a ton of slop code produced by AI-bro developers, and the developers who actually care will be left to deal with the aftermath. I'd be very happy if lead developers are prepared to try things with AI, and importantly to throw the output away if it doesn't meet coding standards. Instead I think even lead developers and CTOs are chasing "productivity" metrics, which just translates to a ton of sloppy code.

[–] Serinus@lemmy.world 1 point 12 hours ago

Yeah, I don't plan to leave in two years, so I'm motivated to not say "oh fuck" when I have to maintain the thing I built later.

Plus, you know, I don't want people to groan when they have to work on my code.

[–] sunbeam60@feddit.uk 9 points 18 hours ago (1 children)

I’d buy you a beer for that summary. That is SPOT ON.

[–] HeyThisIsntTheYMCA@lemmy.world 5 points 17 hours ago* (last edited 17 hours ago)

the times i trust LLMs: when i am using them to look up stuff i have already learned, but can't remember and just need to refresh my memory. there's no point memorizing shit i can look up and am not going to use regularly, and i'm the effective guardrail against the LLMs being wrong when i'm using them this way.

the times i don't trust the LLMs: all the other times. if i can't effectively verify the information myself, why am i going to an unreliable source?

having to explain that nuance over and over gets tiring, so it's just shorter and easier to say the llm is an unreliable source. which it is. when i'm not being lazy, my own output doesn't need testing (it still gets at least 2 reviews, but the last time those reviews caught anything was years ago). the llm's output always needs testing.

[–] RamenJunkie@midwest.social 6 points 15 hours ago* (last edited 15 hours ago)

They are also great for programming one-off personal projects that, frankly, don't have the usage scale that needs rigorous security oversight. Especially since, if you did it yourself, you probably weren't sanitizing the inputs (etc.) anyway. You were slapping down some Python code and moving on.

Like, I don't care if my script to convert Wordpress exports to Markdown files crashes if you feed it a JPEG. I am the only one using it, for this data manipulation task.
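That kind of one-off script might look something like the sketch below (the function names and structure here are my own illustration, not the actual script): parse the WXR XML export with the standard library, dump each post to a Markdown file, and do zero input validation, so feeding it a JPEG just crashes with a ParseError. Which is fine.

```python
# Hypothetical one-off converter: WordPress WXR export -> Markdown files.
# Deliberately no input validation; non-XML input (e.g. a JPEG) just crashes.
import re
import xml.etree.ElementTree as ET
from pathlib import Path

# WXR stores post bodies under the RSS content namespace.
NS = {"content": "http://purl.org/rss/1.0/modules/content/"}

def slugify(title):
    # Good enough for a personal script; filename collisions are my problem.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def convert(export_path, out_dir):
    tree = ET.parse(export_path)  # raises ParseError on anything that isn't XML
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for item in tree.getroot().iter("item"):
        title = item.findtext("title") or "untitled"
        body = item.findtext("content:encoded", default="", namespaces=NS)
        (out / f"{slugify(title)}.md").write_text(f"# {title}\n\n{body}\n")
```

For a script run by one person on one known export, "crash loudly on bad input" is arguably the right error handling: the traceback tells you everything you need.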