this post was submitted on 13 Jan 2026
105 points (99.1% liked)

TechTakes

2362 readers
89 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago
all 17 comments
[–] BigMuffN69@awful.systems 6 points 1 day ago (1 children)

In b4 METR drops the next shoddy study and the promptfondlers go wild

[–] dgerard@awful.systems 4 points 1 day ago

the most robust study in AI coding! cos they built it from toothpicks instead of toilet paper

[–] CinnasVerses@awful.systems 28 points 2 days ago (1 children)

There is an old principle in software development not to make the GUI too pretty until the back end works, because managers and customers will think it's ready when they can click around buttons with nice shading and animations. I think slopware is like that. People see the demo that appears to work and don't see what maintaining it and integrating it with other systems is like.

Hell, that's the whole thing with these LLM-based business/product structures, isn't it? The models are very good at creating something that looks right, leading to people being absolutely blindsided when they fail to actually do the thing that boosters and salesmen pretended they were doing.

Given that these are statistical models that function probabilistically, it seems like the obvious attitude to take would be to assume it's a question of when they fail and do something wrong, rather than if. But accounting for that inevitability undermines most if not all of the actual economic value of these things, because it turns out it takes just about as much time, effort, and skill to monitor and check them as it would to just do the damn work yourself. But as soon as you start giving these things permissions to operate independently you are setting up a time bomb and putting duct tape over the timer. You will get fucked eventually.
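The "when, not if" point falls out of basic probability: even a small per-action failure rate compounds toward near-certainty as an agent runs unsupervised. A minimal sketch (the 1% per-action error rate is a purely illustrative assumption, not a measured benchmark):

```python
# Illustrative only: assume each autonomous action has an independent
# 1% chance of going wrong (hypothetical rate, not a real measurement).
p_fail = 0.01

for n_actions in (10, 100, 1000):
    # P(at least one failure in n independent actions) = 1 - (1 - p)^n
    p_any_failure = 1 - (1 - p_fail) ** n_actions
    print(f"{n_actions:>5} actions -> {p_any_failure:.1%} chance of at least one failure")
```

With these assumed numbers, ten actions are fairly safe, a hundred make a failure more likely than not, and a thousand make it essentially guaranteed, which is the time-bomb dynamic described above.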

[–] rafoix@lemmy.zip 18 points 2 days ago (1 children)

When the bosses say “good” they don’t mean that it is better in any way. What they mean is that it is “good” enough to replace a human worker.

[–] ozymandias@sh.itjust.works 25 points 2 days ago (1 children)

it turns out that people who can’t code make terrible judges of how good ai is at coding

[–] rafoix@lemmy.zip 23 points 2 days ago (2 children)

The entire AI boom is management cumming all over their own faces at the thought of firing every human worker.

[–] mi@fedia.io 11 points 2 days ago

No different from offshoring devs. It saves money short-term, the cost savings look great, managers get promotions, and they move on from the garbage code base that was created

[–] ozymandias@sh.itjust.works 8 points 2 days ago

don’t need to kink shame but yeah, pretty much… that and rich idiots who listen to excited grifters talking about how it’s the future… so fomo.
now in the future “ai free” is going to be a huge feature

[–] resipsaloquitur@lemmy.world 17 points 2 days ago (1 children)
[–] Tar_alcaran@sh.itjust.works 3 points 1 day ago

And now that you trust me, bonus me!

[–] mi@fedia.io 12 points 2 days ago (2 children)

We can't estimate how much effort it will take to fix or maintain it. That's the problem

[–] HaraldvonBlauzahn@feddit.org 6 points 1 day ago* (last edited 1 day ago)

> We can't estimate how much effort it will take to fix or maintain it. That's the problem

That's the expected result of selling stuff that doesn't exist or does not really work.

[–] zqwzzle@lemmy.ca 4 points 2 days ago (2 children)

What you’re saying is there will be lucrative billable hours to fix it when this all shakes out? I guess short term pain…

[–] dgerard@awful.systems 7 points 1 day ago

In a few years. From the companies that haven't died. During Great Depression 2.

You're not wrong, but I don't like that much uncertainty in my life.