this post was submitted on 05 Apr 2026
20 points (91.7% liked)

Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] YourNetworkIsHaunted@awful.systems 8 points 1 day ago (3 children)

Found an interesting take on YouTube, of all places. Her argument can be summarized (with high compression losses) as "AI companies and technologies are bad for basically all the reasons that non-cultist critics say, but trying to shame and argue people out of using them entirely is less effective than treating them as a normal tool with limitations and teaching people how to limit the harm." She makes the analogy to drug policy.

I think she makes a very compelling argument, and I'm still digesting it a bit because I definitely had the knee-jerk rejection as an insider shill, but especially towards the end as she talks about how the AI industry targets low-literacy users as ideal customers (because the more you know about it the less you're likely to actually use them) I found myself agreeing more than not. I do wish she had addressed the dangers of cognitive offloading more, since being mindful of which tasks you're letting the computer do for you is pretty significant part of minimizing those harms, especially for students and some professionals who face a strong incentive to just coast by on slop if they can get away with it.

[–] Evinceo@awful.systems 7 points 17 hours ago (1 children)

I feel like there's a difference between alcohol and drugs, things people can make in their back yard, and AI, which requires a first-world country's entire economy to be oriented towards it to function... a difference in what we should be required to accept.

I don't buy the general argument about shame either. We teach children to shit in toilets and not on sidewalks. I see rampant AI use as just another form of disgusting public indecency, and the faster we bring shame in to remedy it the better.

I don't disagree about the massive costs necessarily associated with this industry. Even the smaller and lighter models she mentions only exist because of the massive fuckers. At the same time, I think those arguments are for the realm of public policy more than individual choice to use chatbots or not. We've talked at length here over the last year or so about how the economics of the bubble are driven largely by a broken B2B SaaS pipeline that separates purchasing decisions from actually having to use the products, and by an investment capital sector desperately trying to recapture the glory days of the pre-2008 omnibubble by throwing obscene amounts of money at anything with the right narrative regardless of the numbers. I feel like that keeps happening regardless of how many individual users fall for the hype and make it part of their normal workflows.

I feel like the analogy to the drug trade is still pretty relevant given the violence and predation that the black market pretty much inevitably attracts and sustains. Like, maybe you know a guy who has his own grow op or whatever, but cocaine and heroin money is going through the cartels at some point in the chain, and they're going to use some portion of it for bullets that end up in some journalist's kids or something. The downstream harms are massive even if the drug industry could theoretically avoid them in ways the AI industry can't, but any given individual user's contribution to them is incredibly minor, and given the addictive and self-destructive nature of the product it's both more humane and more effective to treat them as a victim of a broken world that (falsely) offered this as a step up. While I don't think we should allow slop to infest every forum any more than addicts should be allowed to shoot up on every corner, I think that if shaming makes people less likely to acknowledge that they're going down a dead-end road and reach out to their communities and support networks for help addressing the root of what drove them to these maladaptive antisolutions in the first place, then shaming is making things worse, not better.

Also as the father of a small child I can unfortunately say from recent personal experience that shaming, be it public or private, is far less effective as a means of motivating behavioral change than we want it to be, even for things as basic as not shitting on the goddamn lawn.

[–] V0ldek@awful.systems 15 points 22 hours ago

I think that's 100% correct and also it's year 3 of this nonsense and I cannot be fucked. My response to genAI in any context now is to scream and start doing jumping jacks.

Imagine the drug policy context, but then also half of your colleagues are doing meth, every day, every time you see them; people say shit like "everyone does meth, those that say they don't are lying"; and meth is a trillion-dollar industry that has been telling you "meth is the future" for years. You'd be much less inclined to argue calmly against meth and much more inclined to start screaming and jumping.

[–] jaschop@awful.systems 5 points 1 day ago* (last edited 1 day ago)

Sounds kind of like the Baldur Bjarnason strategy but for your coworkers instead of your boss.

I can see the value of someone with a critical understanding diving into the technology, so they can talk others down from the ledge.

But you also need the social pressure to maintain some slop-free spaces. Not everyone can be asked to accommodate recovering slopaholics.