this post was submitted on 25 Jan 2026
452 points (97.7% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.


founded 2 years ago
top 49 comments
[–] ZILtoid1991@lemmy.world 7 points 3 hours ago

Five Nights at Altman's

[–] DylanMc6@lemmy.dbzer0.com 2 points 3 hours ago

The AI touched that lava lamp

[–] jwt@programming.dev 5 points 5 hours ago

Reminds me of that "have you ever had a dream" kid.

[–] ChaoticNeutralCzech@feddit.org 19 points 1 day ago* (last edited 1 day ago)

Nah, too cold. It stopped moving, so the computer can't generate any more random numbers to pick from the LLM's weighted suggestions. Relatedly, some LLMs have a setting called "temperature": too cold and the output is repetitive, unimaginative, and copies the input too closely (like sentences written by always taking the first autocomplete suggestion); too hot and it's chaos: 98% nonsense, 1% repeats of the input, 1% something useful.
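A minimal sketch of that setting (usually called "temperature") in plain Python, assuming the standard scheme of dividing the logits by the temperature before softmax: low values make sampling near-deterministic, high values push it toward uniform. The logit values here are invented for illustration.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Sample an index from logits after temperature scaling.

    Low temperature sharpens the distribution (repetitive output);
    high temperature flattens it toward uniform (chaotic output).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# "Cold" sampling almost always picks the top logit; "hot" spreads out.
cold = [sample_with_temperature([2.0, 1.0, 0.0], 0.01, random.Random(i)) for i in range(50)]
hot = [sample_with_temperature([2.0, 1.0, 0.0], 100.0, random.Random(i)) for i in range(300)]
```

Passing an explicitly seeded `random.Random` keeps the demo reproducible.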

[–] Darohan@lemmy.zip 77 points 1 day ago
[–] Kyrgizion@lemmy.world 49 points 2 days ago

Attack of the logic gates.

[–] RVGamer06@sh.itjust.works 7 points 1 day ago

Oh crap, is that Freddy Fazbear?

[–] ideonek@piefed.social 34 points 2 days ago (6 children)
[–] FishFace@piefed.social 101 points 2 days ago (3 children)

LLMs work by picking the next word* as the most likely candidate given the model's training and the context. Sometimes the model gets into a situation where its view of the "context" doesn't change when a word is picked, so the next word is just the same one. Then the same thing happens again, and around we go. There are fail-safe mechanisms that try to prevent this, but they don't work perfectly.

*Token
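A toy illustration of that failure mode, with a made-up bigram table standing in for the model (all tokens and scores here are invented): greedy decoding gets stuck once "or" predicts itself as the most likely continuation, and a simple repetition penalty, one flavour of the fail-safes mentioned above, breaks the loop.

```python
# Toy next-token table: for each current token, candidate next tokens with scores.
BIGRAM = {
    "this": {"or": 0.9, "and": 0.1},
    "or":   {"or": 0.5, "that": 0.45, "nothing": 0.05},  # "or" likes itself
    "that": {"or": 0.3, ".": 0.7},
}

def generate(start, steps, repetition_penalty=1.0):
    """Greedily pick the highest-scoring next token at each step."""
    out = [start]
    for _ in range(steps):
        cands = BIGRAM.get(out[-1], {".": 1.0})
        # Divide the score of any token we've already emitted by the penalty.
        scored = {t: s / (repetition_penalty if t in out else 1.0)
                  for t, s in cands.items()}
        out.append(max(scored, key=scored.get))
    return out
```

With no penalty, `generate("this", 4)` loops on "or" forever; with `repetition_penalty=2.0` the loop is broken after the first "or".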

[–] bunchberry@lemmy.world 1 points 9 hours ago (1 children)

This happened to me a lot when I tried to run big models with small context windows. It would effectively run out of memory: each new token wouldn't actually be added to the context, so the model would just get stuck in an infinite loop repeating the previous token. It's possible there was a memory issue on Google's end.

[–] FishFace@piefed.social 1 points 2 hours ago

There is something wrong if it's not discarding old context to make room for the new.
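A minimal sketch of that discarding behaviour, assuming the simplest possible scheme (drop the oldest tokens when the window fills up); the window size and token list here are made up for the demo:

```python
from collections import deque

MAX_CONTEXT = 4  # tiny window for the demo; real models hold thousands of tokens

# deque with maxlen drops the oldest entries automatically as new ones arrive
context = deque(maxlen=MAX_CONTEXT)

for token in ["the", "model", "keeps", "only", "recent", "tokens"]:
    context.append(token)

# After the loop, only the last MAX_CONTEXT tokens remain in the window.
```

If an implementation instead just stops appending once the window is full, you get exactly the frozen-context loop described above: the model keeps seeing the same context and keeps emitting the same token.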

[–] ideonek@piefed.social 20 points 2 days ago (1 children)

That was the answer I was looking for. So it's similar to the "seahorse emoji" case, but this time at some point he just glitched: the most likely next word for the sentence is "or", and after adding the "or" it's also "or", and after adding the next one it's also "or", and after the 11th one... you may just as well commit, since that's the same context as with 10.

Thanks!

[–] atomicbocks@sh.itjust.works -3 points 1 day ago (1 children)

He?

This is not a person and does not have a gender.

[–] ideonek@piefed.social 37 points 1 day ago* (last edited 1 day ago) (1 children)

Chill, dude. It's a grammatical/translation error, not an ideological declaration. It's an especially common mistake if your native language has grammatical gender. Everything has a "gender" in mine: "spoon" is a "she", for example, but I'm not proposing to anyone soon. Not all hills are worth nitpicking on.

[–] MonkderVierte@lemmy.zip 5 points 1 day ago

I've got it once in a "while it is not" "while it is" loop.

[–] ch00f@lemmy.world 55 points 2 days ago (1 children)

Gemini evolved into a seal.

[–] kamenlady@lemmy.world 12 points 2 days ago

or simply, or

[–] ech@lemmy.ca 24 points 1 day ago (1 children)

It's like the text predictor on your phone. If you just keep hitting the next suggested word, you'll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.

[–] vaultdweller013@sh.itjust.works 2 points 1 day ago (1 children)

Example of my phone doing this.

I just want you are the only reason that you can't just forget that I don't have a way that I have a lot to the word you are not even going on the phone and you can call it the other way to the other one I know you are going out to talk about the time you are not even in a good place for the rest they'll have a little bit more mechanically and the rest is.

You can see it looping pretty damned quick with me just hitting the first suggestion after the initial I.

[–] MrScottyTay@sh.itjust.works 2 points 1 day ago

I think I will be in the office tomorrow so I can do it now and then I can do it now and then I can do it for you and your dad and dad and dad and dad and dad and dad and dad and dad and dad and dad

That was mine haha

[–] Arghblarg@lemmy.ca 28 points 2 days ago

The LLM showed its true nature: a probabilistic bullshit generator, caught in a strange attractor of some sort within its own matrix of lies.

[–] palordrolap@fedia.io 16 points 1 day ago (1 children)

Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an "A, B and/or C" structure tend to sound more punchy, knowledgeable and authoritative.

Yes, I did do that on purpose.

[–] Cevilia@lemmy.blahaj.zone 10 points 1 day ago (1 children)

Not only that, but also "not only, but also" constructions, which sound more emphatic, conclusive, and relatable.

[–] luciferofastora@feddit.org 2 points 1 day ago

I used to think learning stylistic devices like this was just an idle fancy, a tool simply designed to analyse poems, one of the many things you're most certain you'll never need but have to learn in school.

What a fool I've been.

[–] kogasa@programming.dev 16 points 2 days ago

Turned into a sea lion

[–] squirrel@piefed.kobel.fyi 9 points 1 day ago
[–] lividweasel@lemmy.world 6 points 1 day ago (1 children)
[–] rockerface@lemmy.cafe 5 points 1 day ago (1 children)

Platinum, even. Star Platinum.

[–] MotoAsh@piefed.social 2 points 1 day ago

I don't see no 'a's between those 'or's for the full "ora ora ora ora" effect.