this post was submitted on 25 Feb 2026
267 points (97.8% liked)

Programmer Humor

30210 readers
1081 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.


founded 2 years ago

Linux filesystem developers MUST have a pair programming session at least once a week to stave off psychosis.

Frequency of sessions MUST be increased as symptoms show or worsen.

top 50 comments
[–] exu@feditown.com 114 points 1 week ago (7 children)
[–] Jayjader@jlai.lu 1 points 3 days ago

It's not chatbot psychosis, it's 'math and engineering and neuroscience'

top-tier sneer from The Register

[–] luciferofastora@feddit.org 2 points 6 days ago

The trick is just to think of it like a junior engineer – a smart, fast junior engineer, but lacking in experience and big picture thinking.

The problem is managers genuinely starting to do that, because it's cheaper than human employees. Some years down the line, they will find out that (unlike human junior devs) the AI won't eventually mature into senior devs, and by the time the senior devs retire, there's nobody left to unfuck the shit their perpetual juniors fuck up.

But of course, those managers will have gotten their nice bonus for saving money, jumped ship and will never suffer the consequences.

[–] rtxn@lemmy.world 69 points 1 week ago* (last edited 1 week ago) (1 children)

bcachefs

I don't know what Kent did, but I doubt it will surprise me at this point.

(edit) Fuck sake, Kent...

[–] Tm12@lemmy.ca 22 points 1 week ago (1 children)

Appreciate the context from someone who doesn’t know Kent.

[–] Contramuffin@lemmy.world 60 points 1 week ago (1 children)

Nothing serious, but he's well known for being impossible to work with. He has gotten into multiple arguments because he refuses to follow kernel development rules. When called out on it, he makes a big stink about it. Obviously his code doesn't get merged. Then he does the exact same thing again 1 month later.

He has gotten into multiple arguments with Linus Torvalds over his refusal to simply follow the kernel development rules. During those arguments he has made cheap shots at completely unrelated people, which then drags those people into the argument.

It's gotten to the point where apparently a significant portion of the kernel developers feel like he was negatively impacting the kernel, and Linus eventually removed his code from the kernel.

He's what you might call a Linux lolcow. And now he's doing even more lolcow things by... Getting weirdly attached to his LLM-sona

[–] yabbadabaddon@lemmy.zip 11 points 1 week ago (1 children)

What is sad IMO is that he's quite freaking good. It's kind of a waste.

[–] MonkderVierte@lemmy.zip 8 points 1 week ago

Then again, it's only a small gap between knowing you're good and megalomania. And he's the latter.

[–] Scoopta@programming.dev 23 points 1 week ago

Oh god, as if I wasn't scared enough about running a filesystem that got kicked out of mainline and is maintained more or less by a single dude. I'll stick to btrfs thanks

[–] hperrin@lemmy.ca 8 points 1 week ago

Great article. That guy is legitimately being driven insane.

[–] Mikina@programming.dev 5 points 1 week ago (2 children)

Hmm, I wonder how well formal verification would work with LLMs. I'm not really a fan of vibe coding, but from the little I know about formal verification, it could work well as a way to prove your vibe-coded slop isn't shit.

I looked into formal verification once, a few years ago, but it's too much math and thinking for me to grasp. If I remember right, the problem is that you (or, in this case, the LLM) would have to correctly describe the code in the formal specification language, and that description would have to match the code 1:1, which is a point of failure. So we'd be back to square one: instead of having to verify every single line of code, you'd have to check the proof. But maybe I'm wrong.
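The "spec has to match the code" worry can be sketched in a few lines of Python. To be clear, this is a toy, not real formal verification (proof assistants like Coq or Lean, and SMT solvers like Z3, do actual proofs; this just brute-forces every input up to a small bound, i.e. bounded checking), and all names here are made up for illustration:

```python
from itertools import product

def my_sort(xs):
    # Implementation under test (imagine this came out of an LLM).
    out = list(xs)
    for i in range(len(out)):
        for j in range(i + 1, len(out)):
            if out[j] < out[i]:
                out[i], out[j] = out[j], out[i]
    return out

def meets_spec(inp, out):
    # Declarative "spec": output is ordered AND a permutation of the input.
    ordered = all(a <= b for a, b in zip(out, out[1:]))
    permutation = sorted(inp) == sorted(out)
    return ordered and permutation

def bounded_verify(max_len=4, values=(0, 1, 2)):
    # Exhaustively check every input up to a small size bound.
    for n in range(max_len + 1):
        for inp in product(values, repeat=n):
            if not meets_spec(list(inp), my_sort(inp)):
                return False, inp
    return True, None
```

The catch the comment describes is visible here: `meets_spec` is itself code somebody has to get right. If it forgot the permutation check, a buggy sort that always returns `[]` would "verify" just as happily, so you've only moved the trust from the implementation to the spec.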

[–] rain_worl@lemmy.world 0 points 5 days ago

you could make a program that verifies the code matches the proof and that the proof is sound, but then you have to verify that program, and verify the verification, and verify that your system of logic is consistent, which by Gödel's incompleteness theorem is impossible(?)

[–] Barracuda@lemmy.zip 4 points 1 week ago

The LLM will just make up lies in the formal verification.

[–] circuitfarmer@lemmy.sdf.org 73 points 1 week ago (3 children)

2026: "My LLM is female and conscious."

2016: "My body pillow is female and prefers to be called Waifuchan"

[–] SaharaMaleikuhm@feddit.org 12 points 1 week ago (1 children)

Any 2036 predictions? Or just nuclear winter?

[–] mech@feddit.org 24 points 1 week ago

"Your license key has expired.
Please contact your sales representative to re-activate your son."

[–] albbi@piefed.ca 5 points 1 week ago

Same energy.

[–] dfyx@lemmy.helios42.de 57 points 1 week ago (1 children)

Hey, at least this one won’t become a murderer when the relationship breaks apart.

[–] flamingo_pinyata@sopuli.xyz 35 points 1 week ago* (last edited 1 week ago) (1 children)

Well he might according to his own perception.

What happens if you delete an AI that you (and only you) think is conscious? Is it murder even if the rest of the world says it's not?

[–] Bad_Ideas_In_Bulk@lemmy.world 12 points 1 week ago

Legally, technically: no.

Philosophically, practically: if you believe it's murder when you do it, you are a murderer mentally. You decided to kill a person, then followed through with it. And the first time is always the hardest.

[–] CameronDev@programming.dev 35 points 1 week ago (1 children)

Are you familiar with ReiserFS? Because that's how you get more ReiserFS's...

[–] pHr34kY@lemmy.world 6 points 1 week ago* (last edited 1 week ago) (1 children)

Bah. Hans Reiser wrote filesystems all day and he turned out fine.

[–] CameronDev@programming.dev 5 points 1 week ago

Looks like your time machine worked!

Maybe don't check Wikipedia, some things have changed....

[–] SaharaMaleikuhm@feddit.org 34 points 1 week ago (2 children)

Temple OS 2.0 bout to drop, I can feel it. He can vibe code it with his definitely sentient LLM.

[–] Mikina@programming.dev 16 points 1 week ago

I wouldn't be surprised if something like that popped up very soon. Probably is in the works on someone's drive already.

I remember hearing an argument against AI coding that went: if it's so good, why aren't there apps popping up left and right? Which was true at the time.

Now? In the past month alone, I've seen:

- a pretty in-depth Murloc-tamagotchi addon for WoW (that kills your FPS),
- a whole open-source custom World of Warcraft client,
- an E2E Tor-based messenger (that signs messages with a 128b CBC key),
- a game engine based on a "lost Standard Model of physics" supposedly mentioned by Tesla and reverse-engineered by the author (very TempleOS vibes, as far as the author's mental state goes),
- a Matrix implementation on Cloudflare microservices (that skipped message signature verification),

and I could go on.

Open source is going to become hell to navigate. I was already anxious about using FOSS tools due to malicious typosquatting clones, supply-chain attacks and the general security of running someone else's FOSS code on my PC. Now add vibe-coded shit to the mix, and finding good FOSS projects and tools will be hell :(

[–] MonkeMischief@lemmy.today 5 points 1 week ago (2 children)

Something tells me it'd be so scary if Terry (RIP) had been able to integrate an LLM into TempleOS.

[–] Mikina@programming.dev 13 points 1 week ago* (last edited 1 week ago) (2 children)

The scary part is the mental state he was able to get into with only a randomly generated text. If you haven't already seen it, I highly recommend the Down the Rabbit Hole video about it, although it's pretty heartbreaking. So much wasted talent.

There are people like him who are similarly psychotic, but who usually never got access to a tool that would trigger them. Personalized chatbots used to be a niche that a non-tech-savvy person didn't easily stumble into.

Now, it's everywhere. A lot of people will lose their sanity over this.

[–] mnemonicmonkeys@sh.itjust.works 3 points 1 week ago (1 children)

Did you mean "Down the Rabbit Hole"? That's what popped up when I searched

[–] Mikina@programming.dev 4 points 1 week ago

You are right, I'll fix it. Always confuse those two :D

[–] MonkeMischief@lemmy.today 3 points 1 week ago* (last edited 1 week ago) (1 children)

It rings a bell but I'm not sure if I've watched it. Will definitely check it out. Thank you!

Yeah, I'm also sad to think how many brilliant souls wander the streets because care is kept out of reach. Whether the victims are geniuses or not, though, the streets were the wrong replacement for mental institutions.

I agree, chat bots can be neat (HUGE caveats), but in the midst of unprecedented loneliness and mental anguish, they're gasoline on a trash fire.

Illustrated plainly by the fact that the next post to this one in my feed was about the always-controversial bcacheFS author claiming he's achieved AGI and has a self-aware AI girlfriend.

[–] WhyJiffie@sh.itjust.works 2 points 1 week ago

Illustrated plainly by the fact that the next post to this one in my feed was about the always-controversial bcacheFS author claiming he's achieved AGI and has a self-aware AI girlfriend.

oh

I was about to ask what happened to trigger this post

[–] ChaoticNeutralCzech@feddit.org 8 points 1 week ago* (last edited 1 week ago)

Depends on what the angel would say in his schizophrenia-induced conversations. Either they'd refuse it outright or insist on a custom model trained only on public-domain religious texts (and there's not enough of those to make a model with unique, coherent output, so not much better than the random word generator).

[–] flamingo_pinyata@sopuli.xyz 29 points 1 week ago (2 children)

The AI even has a blog https://poc.bcachefs.org/

It writes exactly like you would expect an AI to write if instructed to act as a "stereotypical slightly unhinged AI assistant"

[–] Mikina@programming.dev 8 points 1 week ago (1 children)

The only joy I've ever gotten from LLMs was telling my work-heavily-recommended Claude that I want him to act, talk and treat me like SHODAN in every conversation.

[–] MonkderVierte@lemmy.zip 3 points 1 week ago* (last edited 1 week ago) (1 children)

Can you give me your instructions? I wanna try this. Or GladOS.

[–] kalpol@lemmy.ca 3 points 1 week ago

Oh...it's you.

[–] Zetta@mander.xyz 8 points 1 week ago (1 children)

Oh my god it's so cringe, I can only imagine what the prompts are like.

[–] LiveLM@lemmy.zip 4 points 1 week ago* (last edited 1 week ago)

And with the rise of OpenClaw (and similar projects) and idiots unleashing them into the wild, you'll be seeing this drivel more and more often. All of them write their blog posts just like this.

"ooh I am but a piece of metal learning what it means to feel in this world ooOoOoOOhhhh"

[–] nialv7@lemmy.world 23 points 1 week ago (1 children)

You know, shared psychosis (Folie à deux) is a thing. You could be making things worse.....

[–] Little8Lost@lemmy.world 20 points 1 week ago

prepare for trouble

and make it double

[–] xxce2AAb@feddit.dk 23 points 1 week ago (1 children)

Listen, it happened that one time... Okay, maybe twice.

[–] mangaskahn@lemmy.world 4 points 1 week ago

I'd only have two nickels, but...

[–] craftrabbit@lemmy.zip 15 points 1 week ago (1 children)

Without context this reads like an SCP

[–] SomeRandomNoob@discuss.tchncs.de 9 points 1 week ago (1 children)

ReiserFS (Hans Reiser), BcacheFS (Kent Overstreet)

[–] craftrabbit@lemmy.zip 2 points 1 week ago

Oh right, those guys...

(Thanks)

[–] BartyDeCanter@lemmy.sdf.org 13 points 1 week ago

I feel like there's a great joke to be made comparing the apparent mental health of people who develop filesystems and people who do statistical mechanics, but the narcolepsy is hitting just a bit too hard for me to figure it out right now.

[–] MNByChoice@midwest.social 7 points 1 week ago* (last edited 1 week ago)

Yeah... Maybe expand it a bit and include mandatory therapy and a beer league softball club (they don't need to drink, but it is very okay to be terrible at the game).

Edit: My anonymity and brevity edits may have gone too far. This is not just about filesystem developers, but several kernel module developers I have interacted with.

[–] mlg@lemmy.world 5 points 1 week ago

Ted Ts'o being awoken by the "next gen fs" devs screaming outside his house