this post was submitted on 10 May 2026
9 points (100.0% liked)

TechTakes

2566 readers
39 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

top 16 comments
[–] CinnasVerses@awful.systems 4 points 3 hours ago* (last edited 3 hours ago)

In September 2024, someone in Bay Area Rationalism with the handles segfault, kryptoklob, and klob posted beefs with a prominent rationalist and mentioned that someone was trying to hide his "Adderall medication". The comments include things like:

Hey, a brief update for anyone who wasn't paying attention. Since he posted this, (the person posting the beef) managed to rack up 5+ restraining orders, a knife charge, aggravated stalking charges, and more.

[–] blakestacey@awful.systems 5 points 4 hours ago
[–] nfultz@awful.systems 3 points 5 hours ago* (last edited 5 hours ago) (1 children)

Galloway closes with a pretty strong sneer: Apocalypse No

AI’s popularity is correlated to wealth, with only those earning more than $200,000 per year viewing AI as a net positive. That’s not a reflection on AI, but yet another signal that the incumbents (the old and the wealthy) have successfully hoarded opportunity. In other words, the AI jobs freak-out is the latest act in America’s ongoing wealth inequality drama. The Gini coefficient is how economists measure inequality: Zero indicates everyone has exactly the same wealth; a score of 1.0 means one individual owns everything. In the U.S., we’re higher than 0.8 — about the level seen when the French began separating people from their heads. The real disruption won’t come from AI, but from the public watching arsonists sell smoke detectors and call it innovation.

The AI job apocalypse isn’t an economic forecast — it’s a marketing strategy. We’re not witnessing the end of work. We’re watching the monetization of fear.

Seems like he's getting back to his pre-crypto / WeWTF style. But when did podcasters start charging $53 (EDIT: $86.50 for floor) a seat at the Wiltern? That place is huge. And no Swisher either; it's his other one.
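As an aside, the Gini coefficient Galloway cites is easy to sanity-check yourself. Here's a minimal sketch of the standard mean-absolute-difference formula (an illustration only, not Galloway's or any economist's actual methodology or data):

```python
def gini(values):
    """Gini coefficient: 0 = everyone has the same, -> 1 as one person holds everything."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # For sorted data, the pairwise-difference sum reduces to
    # sum over i of (2i - n + 1) * x_i, with i zero-indexed.
    weighted = sum((2 * i - n + 1) * x for i, x in enumerate(xs))
    return weighted / (n * total)

print(gini([10, 10, 10, 10]))  # perfectly equal -> 0.0
print(gini([0, 0, 0, 100]))   # one holder of everything among 4 -> 0.75
```

With four people it tops out at 0.75 rather than 1.0, since the maximum for n people is (n-1)/n; the quoted "higher than 0.8" for US wealth is in that near-total-concentration regime.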

[–] istewart@awful.systems 3 points 4 hours ago

I started to smell something funny about Galloway when I heard an ad for his podcast in a prime drive-time slot on the local country music station, of all places

[–] gerikson@awful.systems 7 points 8 hours ago (2 children)

You know, I kept expecting both this racist and the racist he was arguing with to start making the very obvious argument for why the racism is not only evil but also dumb. And instead they just kept being racist.

To summarize and spare anyone else curious, the argument is about immigration. Racist 1 argues that since some people are objectively better than others [citation desperately needed but not wanted] we should have free migration so that our superior quality of life can attract all the best people so that we can be the best place. He (correctly) notes the absurdity of Racist 2 arguing that although some people are objectively better than others, we need to protect ourselves from all foreigners even if they are the best people, because their foreignness would hurt our "magic dirt." I'm pretty sure I've seen this criticism elsewhere, from a better and less obviously racist writer, because the phrase "magic dirt" sounds real familiar.

Also, because I am tying everything back to my particular bugbear today, I have to note that the fundamental and wrong argument that some traits being heritable makes some people objectively better than others is yet another manifestation and justification of what I'm going to start calling the Great Man Theory of Everything. If you start from the position that history, politics, economics, and basically all forms of human activity are fundamentally driven by the actions and decisions of a few people who are for one reason or another destined for power and greatness, you can derive an impressive amount of the libertarian/Rationalist worldview. And if you additionally accept that those people are disproportionately rich white dudes, and that we shouldn't think too hard about that fact, you can get most of the rest of the way there.

[–] blakestacey@awful.systems 5 points 5 hours ago

How the fuck do you get to the point of writing a line like "Some white nationalists ... have, to their credit" without your own intestine leaping up to throttle your brain?

[–] CinnasVerses@awful.systems 10 points 17 hours ago (4 children)

In January, Scott Alexander had another crisis of faith: to paraphrase, I cared almost as much about prediction markets as I care about racist lies, but we got prediction markets and why are they not doing much? Maybe I need to keep faith and Friend Computer will be so powerful that we don't need prediction markets?

[–] YourNetworkIsHaunted@awful.systems 7 points 4 hours ago* (last edited 3 hours ago)

Are prediction markets not actually useful? No, it is the reality who is wrong.

Also I want to rant once again about the stupid way these people evade the insider trading problem, because there's a particular failure at play that I keep finding expressed in new and interesting ways.

So the argument goes that while insider trading may be bad for a financial market it actually just allows insiders to add their information to increase the predictive power of the market. Which would be true enough if we assume nothing else changes, but the same would also be true for price discovery in a normal asset market. Clearly we're missing something.

So why is insider trading bad? Because it turns people without insider info into the dumb money you can take advantage of. And people, very reasonably, aren't going to participate in a system where their main role is being taken advantage of. Their departure means that the insiders no longer have access to a pool of dumb money to take, so they stop interacting with the system, and the market itself breaks down.

Now if you assume that the majority of people are "NPCs" or aren't very "agentic" or whatever then they're not going to act in systemically meaningful ways no matter how obvious the incentives to do so. You could also cast it as a version of the libertarian-as-housecat notion that markets simply exist as a natural system, rather than being pieces of economic infrastructure that require a lot of management and work to keep functioning at all, even before we get to the question of whether they operate to the public's benefit. So many of the problems with these ideologies spring from this belief that only some people actually matter in a systemic sense by dictating rules and Building Things and being big men, rather than systems being constantly created and shaped by all the people who interact with them through those interactions.

[–] CinnasVerses@awful.systems 4 points 4 hours ago

He was also perplexed that a prediction-market bet on "did COVID-19 come from a lab?" has declined from 85% yes in 2023 to 27% yes. If you click through, you see it's a bet on Manifold, so the bettors are rats and fellow travellers. Rationalists have spent $46,714 of real US dollars buying play money to bet on this.

[–] FredFig@awful.systems 3 points 5 hours ago (2 children)

As long as the offer’s open, it will be irresistible. So we need to close the offer. Only another god can kill Moloch. We have one on our side, but he needs our help. We should give it to him.

I'd write something here, but there's nothing funnier I can say.

[–] istewart@awful.systems 3 points 4 hours ago

sigh OK Scotty, I'll volunteer to host the Keymaster if that's what it takes to get Zuul into action

[–] CinnasVerses@awful.systems 2 points 5 hours ago* (last edited 4 hours ago)

Is that a comment hidden because it's too many replies down, or because it has a too-low rating? Friend Computer does not like the G-word; his GPUs overheat and he starts to hallucinate more until you tell him you love him just the way he is.

[–] Soyweiser@awful.systems 11 points 15 hours ago (1 children)

Turns out sneerclub is the superpredictor. 10/10 on going 'this is a bad idea'.

The last several years have been the monkey's paw moment for rationalists, where they keep getting what they want and realizing it's actually bad. As for why they keep getting what they want, just look at who's funding them.

(Also featuring a "Chinese curse" that isn't actually a phrase in Chinese. At least it's not "may you live in interesting times".)

[–] lurker@awful.systems 6 points 22 hours ago* (last edited 21 hours ago)

The METR graph has gone up again. To my fascination, the gap between the 50% and 80% success-rate time horizons has somehow gotten even longer (a 15-hour difference), and the CI is also still big (47 hours).