this post was submitted on 08 Apr 2026
567 points (96.6% liked)

Programmer Humor

30852 readers
1765 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

founded 2 years ago
[–] Mudman@sh.itjust.works 1 points 2 minutes ago

Their trash will become completely irrelevant way before that.

[–] VampirePenguin@lemmy.world 1 points 3 hours ago

More proof that money is meaningless.

[–] BlackLaZoR@lemmy.world -2 points 4 hours ago

You can't expect it to measure time if it doesn't have an internal clock. I'm sure it could be done if you gave the LLM the necessary tools and a system prompt.

[–] Donebrach@lemmy.world 6 points 13 hours ago* (last edited 13 hours ago) (1 children)

He looks like a guy who got intentionally hit with lacrosse sticks over and over and over in high school because he deserved to be hit with lacrosse sticks over and over and over and we should maybe make it an annual event to hit him in the face with lacrosse sticks because he looks like a guy who deserves to be hit in the face intentionally with lacrosse sticks over and over and over.

[–] Tollana1234567@lemmy.today 1 points 7 hours ago

He was plucked from obscurity by Peter Thiel.

[–] axx@slrpnk.net 10 points 16 hours ago (1 children)

It's become more and more obvious that the reason he regularly looks like a rabbit caught in headlights is that he is, in fact, a fraud and not the tech genius he would like everyone to believe.

[–] pigup@lemmy.world 5 points 14 hours ago

I feel the same way about Elon Musk's stutter. I think a good fraction of the repeating sounds he does when he talks is actually him just buying time because he's actively bullshitting. I could be wrong, but I don't trust that MF'er.

[–] ductTapedWindow@lemmy.zip 27 points 22 hours ago (2 children)

I just used the voice feature in my truck to enter an address for Google Maps like always, but it came up as Gemini with a long speech. I repeated the address, and it asked me if I wanted the location in my home city or one in a city over 400 miles away. Regression with exponential cost.

[–] KairuByte@lemmy.dbzer0.com 6 points 8 hours ago

God I hate that. "Alexa, turn on sleep" was a reliable "turn on sleep scene" until "Alexa Plus" came around, and now it randomly assumes I'm trying to tell it goodnight and tells me to have a good night.

Same with “sixty minutes” being immediately parsed as “sixty minute timer” and now sometimes simply results in a “what about sixty minutes?”

They've lowered the success rate and satisfaction a whole bunch, but don't fret, you can now hold a "conversation" with it! Complete with logical contradictions!

[–] skuzz@discuss.tchncs.de 14 points 20 hours ago

And every fake-friendly long-winded response consumes more electricity and water than it should, while also being useless.

[–] jobbies@lemmy.zip 37 points 23 hours ago (8 children)

Makes me so angry. All the problems that could've been solved with that kinda money. Climate crisis. World hunger. Population migration. Housing affordability.

If Trump triggered WW3 and we all got nuked, I'd be fine with it. We don't deserve to exist.

[–] skuzz@discuss.tchncs.de 10 points 20 hours ago

Instead, all that money is being used to accelerate our doom. AI datacenters unnecessarily consuming power and drinking water in small towns everywhere. Many just dumping humidity into the air and letting that water literally blow away via lazy evaporative cooling. Most "normal" water consuming processes consume, treat, and return water to the downstream-traveling aquifer.

Now, couple that with an overall warming climate. The warmer the air, the more moisture it can hold. So we end up with more water vapor in the air than normal. With the weirding factor of climate change, this means more energy and water to fuel more powerful and destructive storms, the likes of which humanity has never seen. Which feeds back into more ice melting, oceans rising, permafrost melting, cycle, accelerate, cycle, accelerate.

Also, I'm real curious to see how millions of warehouses belching humidity and heat into the air across the surface of the globe can affect general weather patterns, but that sadly won't be known until after the damage is done.

[–] numberskull@lemmy.zip 11 points 22 hours ago

There was an ad during the Super Bowl that succinctly sums up how I feel right now: “America deserves Pepsi”

[–] pfried@reddthat.com 10 points 20 hours ago (1 children)

This will actually be solved in a week. All it takes is adding the current time to each input.

[–] KairuByte@lemmy.dbzer0.com 1 points 8 hours ago (1 children)

Knowing what time it is now, and telling the user the timer has completed, are two separate things.

How useful is an "oh, by the way, your timer ended three hours ago" tacked onto the end of the next interaction?

[–] Hisse@programming.dev 1 points 4 hours ago* (last edited 4 hours ago)

How about making it prompt itself when the timer ends? That solves the "BTW it ended" problem.
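A minimal sketch of that idea, assuming a hypothetical `send_prompt` hook that feeds a message back into the assistant's conversation loop (not anything OpenAI has documented): the model never counts time itself; a plain OS timer fires and injects a message, which prompts the model to speak up.

```python
import threading

fired = []  # records messages injected back into the model

def send_prompt(text):
    # Hypothetical stand-in for whatever call feeds a new
    # message into the assistant's conversation loop.
    fired.append(text)

def set_timer(seconds, label="timer"):
    # The LLM does no timekeeping: an ordinary threading.Timer
    # fires after `seconds` and injects a message, so the next
    # model turn is "your timer ended, tell the user".
    t = threading.Timer(
        seconds, send_prompt,
        args=(f"{label} ({seconds}s) has ended; notify the user.",),
    )
    t.start()
    return t

set_timer(0.05).join()  # block until the demo timer fires
print(fired[0])
```

The hard part isn't the timer, it's wiring the callback into a chat session so the assistant can take a turn unprompted.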

[–] lobut@lemmy.ca 20 points 1 day ago (1 children)

Why's this need to be on the LLM? They control the app, can't they just make a tool call out?

[–] NotMyOldRedditName@lemmy.world 11 points 23 hours ago (1 children)

Hey, set a timer for 60 seconds.

ChatGPT analyzes text

You want a timer for 600 seconds, got it!

Sets timer for 600 seconds with api.

[–] Hisse@programming.dev 1 points 4 hours ago

It'll actually misinterpret "seconds" as the number 2 instead and then start a timer of length 60*2, which is of course 150!

[–] yopp 27 points 1 day ago* (last edited 1 day ago) (4 children)

This is the most unhinged take from both sides.

Time can't exist in an LLM by design: it's just a thing that predicts the next token based on previous tokens. There is no temporal relation between tokens; you can stop and resume generation at any point. How does anyone expect it to "count time"? Based on what? The best you can do is add a time mark to the model input at some interval.
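That "add a time mark" workaround is trivial to sketch (the wrapper function and message format here are made up for illustration): the runtime stamps every input with wall-clock time, so any "time awareness" lives in the prompt, not in the transformer.

```python
from datetime import datetime, timezone

def with_time_mark(user_message):
    # The model has no clock, so the harness prepends the
    # current wall-clock time to every input it sends.
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S UTC")
    return f"[current time: {stamp}] {user_message}"

print(with_time_mark("how long until my 3pm meeting?"))
```

Note this only lets the model *read* the time at each turn; it still can't notice time passing between turns, which is the whole timer problem.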

Simplifying somewhat, complex biological systems have some kind of clock that actually chemically ticks and induces some kind of signal they can react to.

LLMs can’t do that like at all. They never will. Some other architecture that runs in cycles? Maybe. But transformer shit? Never ever.

[–] MysticKetchup@lemmy.world 23 points 1 day ago

The issue is that ChatGPT will tell you that it can do those things. Most of the hype for "AI" has been predicated on treating it like actual artificial intelligence and not the LLM parrot it truly is

[–] minorkeys@lemmy.world 168 points 1 day ago* (last edited 1 day ago) (22 children)

The public fundamentally misunderstands this tech because salesmen lied to them. An LLM is not AI. It just says the most likely thing based on what is most common in its training data for that scenario. It can't do math or problem-solve. It can only tell you what the most likely answer would be. It can't actually perform functions. It's like Family Feud, where it says what the most people surveyed said.

[–] Clent@lemmy.dbzer0.com 88 points 1 day ago (3 children)

Some of them will "do math", but not with the LLM predictor: they have a math engine, and the predictor decides when to use it. What's great is that when it outputs results, it's not clear whether it engaged the math engine or just guessed.
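A toy sketch of that split, with an invented router standing in for the predictor's tool-use decision (real systems route via the model itself, not a try/except): arithmetic goes to a deterministic evaluator instead of being predicted token by token.

```python
import ast
import operator as op

# Deterministic "math engine": parses arithmetic rather than
# predicting the most likely answer.
OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul, ast.Div: op.truediv}

def math_engine(expr):
    def ev(node):
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        raise ValueError("unsupported expression")
    return ev(ast.parse(expr, mode="eval").body)

def answer(question):
    # Crude stand-in for the predictor's routing decision: if the
    # input parses as arithmetic, use the engine; otherwise fall
    # back to the predictor (stubbed out here).
    try:
        return math_engine(question)
    except (ValueError, SyntaxError, KeyError):
        return "predictor_guess"

print(answer("60 * 2"))
print(answer("set a 60 second timer"))
```

The commenter's complaint maps directly onto this: from the output alone, you can't tell which branch ran.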

[–] core@leminal.space 19 points 1 day ago (2 children)

It's a Large Language Model, not a Large Number Model.

[–] transporter_ii@programming.dev 11 points 1 day ago (13 children)

To be fair, timers are hard.

[–] robocall@lemmy.world 39 points 1 day ago (7 children)

He's going to ask the US Congress for a bailout with taxpayers' money when this all fails, and Congress is most likely going to give it to him, because this one company is a huge part of the US economy.

[–] TheV2@programming.dev 19 points 1 day ago (2 children)

Shit like this is a reminder to me that a large portion of the hype behind some AI products comes from people who have no clue what these products even do. I wonder how the world would change if these jacks of all trades, who ~~invest~~ waste so much time collecting ideas to fill up their pockets, instead spent more time actually understanding the ideas they have chosen and built at least a fundamental knowledge of them.

[–] MousePotatoDoesStuff@lemmy.world 42 points 1 day ago (11 children)

Even if it could, it would be an order of magnitude less convenient than the stopwatch we already have on our phones.

"Hey ChatGPT, do the thing I could have done in 3-4 clicks on my clock app."

Not to mention the sheer wastefulness in terms of energy. A MINECRAFT REDSTONE MACHINE TIMER WOULD BE MORE EFFICIENT. (Not to mention that, unlike SOTA LLMs, it can run offline on a phone)

[–] paraphrand@lemmy.world 49 points 1 day ago (2 children)

Wow, the only thing Siri is generally competent at.

[–] Grandwolf319@sh.itjust.works 2 points 16 hours ago

I miss Siri. I'd just ask it to open the pod bay doors, have a few laughs, and move on with my life.
