this post was submitted on 25 Jan 2026
453 points (97.7% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

[–] ZILtoid1991@lemmy.world 8 points 1 day ago

Five Nights at Altman's

[–] kamen@lemmy.world 3 points 1 day ago

If software was your kid.

Credit: Scribbly G

[–] jwt@programming.dev 5 points 1 day ago

Reminds me of that "have you ever had a dream" kid.

[–] DylanMc6@lemmy.dbzer0.com 2 points 1 day ago

The AI touched that lava lamp

[–] Darohan@lemmy.zip 77 points 3 days ago
[–] ChaoticNeutralCzech@feddit.org 20 points 2 days ago* (last edited 2 days ago)

Nah, too cold. It stopped moving, so the computer can't generate any more random numbers to pick from the LLM's weighted suggestions. Relatedly, LLMs have a sampling setting for this called "temperature": too cold and the output is repetitive, unimaginative and copies the input too closely (like sentences written by taking the first autocomplete suggestion every time); too hot and it's chaos: 98% nonsense, 1% repeats of the input, 1% something useful.
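(A minimal sketch of what that knob does; the logits and the four candidate tokens here are made up for illustration:)

```python
import numpy as np

def sample_next_token(logits, temperature):
    """Divide the logits by the temperature, softmax, then sample."""
    scaled = np.array(logits) / temperature
    probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return np.random.choice(len(probs), p=probs)

# Hypothetical logits for four candidate tokens; token 0 ("or") is the favourite.
logits = [3.0, 1.5, 1.0, 0.5]

for t in (0.1, 1.0, 5.0):
    picks = [sample_next_token(logits, t) for _ in range(1000)]
    print(f"T={t}: {np.bincount(picks, minlength=4) / 1000}")
```

(Near T=0 it picks token 0 almost every time, i.e. repetitive; at high T the picks are nearly uniform, i.e. chaos.)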

[–] ideonek@piefed.social 34 points 3 days ago (6 children)
[–] FishFace@piefed.social 103 points 3 days ago (3 children)

LLMs work by picking the next word* as the most likely candidate given their training and the context. Sometimes the model gets into a situation where its view of the "context" doesn't change when the word is picked, so the next word is just the same one again. Then the same thing happens, and around we go. There are fail-safe mechanisms that try to prevent this, but they don't work perfectly.

*Token
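(A toy illustration of that failure mode, with greedy decoding, i.e. always take the top pick. The lookup table is a hypothetical stand-in; a real LLM computes this from its weights:)

```python
def most_likely_next(tokens):
    """Stand-in for a real model: return the most likely next token.
    This table is made up purely to show the failure mode."""
    table = {"to": "be", "be": "or", "not": "to", "or": "or"}
    return table.get(tokens[-1], "or")

tokens = ["to", "be"]
for _ in range(8):           # greedy decoding: always append the top candidate
    tokens.append(most_likely_next(tokens))

print(" ".join(tokens))      # to be or or or or or or or or
```

(Once the top continuation of "or" is "or" again, the effective context stops changing and the loop never exits. Sampling, repetition penalties and similar tricks usually break the cycle, but not always.)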

[–] bunchberry@lemmy.world 1 points 1 day ago (1 children)

This happened to me a lot when I tried to run big models with small context windows. It would effectively run out of room, so each new token wouldn't actually be added to the context, and it would get stuck in an infinite loop repeating the previous token. It's possible there was a memory issue on Google's end.

[–] FishFace@piefed.social 1 points 1 day ago (1 children)

There's something wrong if it's not discarding old context to make room for new.
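(A minimal sketch of that kind of sliding-window discard; the window size and tokens are arbitrary:)

```python
def truncate_context(tokens, max_window):
    """Sliding window: keep only the most recent max_window tokens."""
    return tokens[-max_window:]

context = ["the", "cat", "sat", "on"]
for new_token in ["the", "mat", "."]:
    context = truncate_context(context + [new_token], max_window=4)

print(context)  # ['on', 'the', 'mat', '.'] and the oldest tokens are gone
```

(Real implementations can be smarter about it, e.g. keeping the start of the prompt and evicting from the middle, but the idea is the same: old tokens go, new ones still fit.)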

[–] bunchberry@lemmy.world 1 points 15 hours ago (1 children)

At least llama.cpp doesn't seem to do that by default. If it overruns the context window it just blorps.

[–] FishFace@piefed.social 1 points 8 hours ago

I think there are parameters for that, from googling.

[–] ideonek@piefed.social 20 points 3 days ago (22 children)

That was the answer I was looking for. So it's similar to the "seahorse" emoji case, but this time, at some point, it just glitched so that the most likely next word for the sentence is "or", and after adding that "or" the next one is also "or", and after adding that one it's also "or", and after the 11th one... it may just as well commit, since that's the same context as with 10.

Thanks!

[–] MonkderVierte@lemmy.zip 5 points 3 days ago

I got it once in a "while it is not" / "while it is" loop.

[–] ch00f@lemmy.world 55 points 3 days ago (1 children)

Gemini evolved into a seal.

[–] kamenlady@lemmy.world 11 points 3 days ago

or simply, or

[–] ech@lemmy.ca 25 points 3 days ago (1 children)

It's like the text predictor on your phone. If you just keep hitting the next suggested word, you'll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.

[–] vaultdweller013@sh.itjust.works 3 points 2 days ago (1 children)

Example of my phone doing this.

I just want you are the only reason that you can't just forget that I don't have a way that I have a lot to the word you are not even going on the phone and you can call it the other way to the other one I know you are going out to talk about the time you are not even in a good place for the rest they'll have a little bit more mechanically and the rest is.

You can see it looping pretty damned quick, with me just hitting the first suggestion after the initial "I".

[–] MrScottyTay@sh.itjust.works 3 points 2 days ago

I think I will be in the office tomorrow so I can do it now and then I can do it now and then I can do it for you and your dad and dad and dad and dad and dad and dad and dad and dad and dad and dad

That was mine haha

[–] Arghblarg@lemmy.ca 29 points 3 days ago

The LLM showed its true nature: a probabilistic bullshit generator that got caught in a strange attractor of some sort within its own matrix of lies.

[–] palordrolap@fedia.io 18 points 3 days ago (1 children)

Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an "A, B and/or C" structure tend to sound more punchy, knowledgeable and authoritative.

Yes, I did do that on purpose.

[–] Cevilia@lemmy.blahaj.zone 11 points 2 days ago (1 children)

Not only that, but also "not only, but also" constructions, which sound more emphatic, conclusive, and relatable.

[–] luciferofastora@feddit.org 2 points 2 days ago

I used to think learning stylistic devices like this was just an idle fancy, a tool designed solely for analysing poems, one of those things you're certain you'll never need but have to learn in school anyway.

What a fool I've been.

[–] kogasa@programming.dev 16 points 3 days ago

Turned into a sea lion

[–] RVGamer06@sh.itjust.works 7 points 2 days ago

Oh crap, is that Freddy Fazbear? (Polish meme: "O cholera, czy to Freddy Fazbear?")

[–] squirrel@piefed.kobel.fyi 9 points 3 days ago
[–] lividweasel@lemmy.world 7 points 2 days ago (1 children)
[–] rockerface@lemmy.cafe 5 points 2 days ago (1 children)

Platinum, even. Star Platinum.

[–] MotoAsh@piefed.social 2 points 2 days ago

I don't see no 'a's between those 'or's for the full "ora ora ora ora" effect.
