this post was submitted on 10 May 2026

TechTakes

Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

Architeuthis@awful.systems 12 points 1 day ago

In other Scott Siskind news, he just posted an entirely unnecessary amount of words to aggressively push back against the adage that "all exponentials sooner or later turn into sigmoids", as if it were by itself a load-bearing claim of the side arguing against the direct imminence of the machine god.

It's just a bunch of arguing by analogy ("helping you build intuition") and you-can't-really-knows, all while implying AI 2027 was very science much rigorous. But it also feels kind of desperate: why are you bothering with this overperformative setting-the-record-straight thing? Have you been feeling inadequate as an AI-curious stats fondler of note lately?

scruiser@awful.systems 4 points 15 hours ago (last edited 15 hours ago)

he just posted an entirely unnecessary amount of words

Taking a quick look at it... it's actually short by Scott's standards, but still overly long, given that the only point he makes is claiming Lindy's Law is applicable to predicting AI progress in the absence of other information. Edit: glancing at it again... it's not that short; I kind of skimmed until I got to Scott's actual point the first time through. You can't blame me for not reading it.

you-can’t-really-knows

Yeah, he straw-mans AI critics/skeptics as making an argument from ignorance, then tries to argue against that strawman using Lindy's Law (which assumes ignorance and a Pareto distribution). He completely ignores that AI critics are actually making detailed arguments: about LLM companies consuming all the good and novel training data, hitting the limits on what compute costs they can afford, running into the long lead times for building datacenters, etc. Which is pretty ironic, given that his AI 2027 nominally claims to account for all that stuff (in actuality it basically all rests on METR's task horizons, and distorts even that already questionable dataset).

lurker@awful.systems 5 points 17 hours ago

The idea that “the exponential curve goes up forever” has always struck me as silly, an idea rooted in capitalism (“no bro, you don’t get it, we’re gonna get infinite money forever”). Limited resources exist, and people are already very fed up with the ludicrous amounts of water and electricity data centres consume. Making bigger models that need to run for longer is also probably going to take an exponential amount of resources (and also make people hate you more).