this post was submitted on 12 Apr 2026
21 points (95.7% liked)

TechTakes

2539 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] scruiser@awful.systems 10 points 1 day ago* (last edited 1 day ago) (12 children)

Eliezer joins the trend of condemning "political" violence with confidence from the far end of the Dunning-Kruger curve: https://www.lesswrong.com/posts/5CfBDiQNg9upfipWk/only-law-can-prevent-extinction

I've already mocked this attitude down thread and in the previous weekly thread, so I'll try to keep my mockery to a few highlights...

He's admitting "nuke the data centers" is in fact violence!

It would be beneath my dignity as a childhood reader of Heinlein and Orwell to pretend that this is not an invocation of force.

But then drawing a special case around it.

But it's the sort of force that's meant to be predictable, predicted, avoidable, and avoided. And that is a true large difference between lawful and unlawful force.

I don't think Eliezer has checked the news if he thinks the US government carries out violence in predictable, fair, or avoidable ways! Venezuela! (It wasn't fair before Trump, or avoidable if you didn't want to bend over for the interests of US capital, but it is blatantly obvious under Trump.) The entire lead-up to Iran consisted of ripping up Obama's attempts at treaties and trying to obtain regime change through surprise assassination! Also, if the Stop AI doomers used some clever cryptography scheme to make their policy of property destruction (and assassination) sufficiently predictable and avoidable, would that count as "Lawful" in Eliezer's book? ~~If he kept up with the DnD/Pathfinder source material, he would know Achaekek's assassins are actually Lawful Evil.~~

The ASI problem is not like this. If you shut down 5% of AI research today, humanity does not experience 5% fewer casualties. We end up 100% dead after slightly more time.

His practical argument against non-state-sanctioned violence is that we need a total ban (and thus the authority of the state behind it), because otherwise someone with 8 GPUs in a basement could invent strong AGI and doom us all. This is a dumb argument, because even most AI doomers acknowledge you need a lot of computational power to make the AGI God. And they think slowing down AGI (whether through violence or other means) might buy time for another, more permanent sort of solution (like the "solve alignment" project Eliezer originally promised them). Lots of lesswrong posts regularly speculate on how to slow down the AI race and how to make use of the time they have; this isn't even outside the normal window of lesswrong discourse!

Statistics show that civil movements with nonviolent doctrines are more successful at attaining their stated goals

Sources cited: 0

One of the comments also pisses me off:

Which reminds me about another point: I suspect that "bomb data centers" meme causal story was not somebody lying, but somebody recalling by memory without a thought that such serious allegation maybe is worthy to actually look up it and not rely on unreliable memory.

"Drone strike the data centers even if starts nuclear war" is the exact argument Eliezer made and that we mocked. It is the rationalists that have tried to soften it by eliding over the exact details.

[–] blakestacey@awful.systems 6 points 15 hours ago (1 children)

It would be beneath my dignity as a childhood reader of Heinlein and Orwell

Life is too short to be that pompous

[–] Architeuthis@awful.systems 5 points 14 hours ago (1 children)

Reading Heinlein as a kid isn't even especially notable, but it's Yud so he definitely means the polyamory advocacy stuff specifically.

[–] blakestacey@awful.systems 3 points 13 hours ago

And it's not like Orwell wrote a book about talking animals that is required reading in schools across the land.
