this post was submitted on 15 Jun 2025
14 points (100.0% liked)

TechTakes

2027 readers
215 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] rook@awful.systems 24 points 2 weeks ago (3 children)

I might be the only person here who thinks that the upcoming quantum bubble has the potential to deliver useful things (but boring useful things, and so harder to build hype on), but stuff like this particularly irritates me:

https://quantumai.google/

Quantum fucking ai? Motherfucker,

  • You don’t have ai, you have a chatbot
  • You don’t have a quantum computer, you have a tech demo for a single chip
  • Even if you had both of those things, you wouldn’t have “quantum ai”
  • If you have a very specialist and probably wallet-vaporisingly expensive quantum computer, why the hell would anyone want to glue an idiot chatbot to it, instead of putting it in the hands of competent experts who could actually do useful stuff with it?

Best case scenario here is that this is how one department of Google get money out of the other bits of Google, because the internal bean counters cannot control their fiscal sphincters when someone says “ai” to them.

[–] V0ldek@awful.systems 17 points 2 weeks ago* (last edited 2 weeks ago)

Quantum computing reality vs quantum computing in pop culture and marketing follows precisely the same line as quantum physics reality vs popular quantum physics.

  • Reality: Mostly boring multiplication of matrices, big engineering challenges, extremely interesting stuff if you're a nerd that loves the frontiers of human knowledge
  • Cranks: Literally magic, AntMan Quantummania was a documentary, give us all money
[–] BlueMonday1984@awful.systems 10 points 2 weeks ago

Best case scenario here is that this is how one department of Google get money out of the other bits of Google, because the internal bean counters cannot control their fiscal sphincters when someone says “ai” to them.

That's my hope as well - every dollar spent on the technological dead-end of quantum is a dollar not spent on the planet-killing Torment Nexus of AI.

[–] BlueMonday1984@awful.systems 17 points 1 week ago (2 children)

New article from Axios: Publishers facing existential threat from AI, Cloudflare CEO says

Baldur Bjarnason has given his commentary:

Honestly, if search engine traffic is over, it might be time for blogs and blog software to begin to deny all robots by default

Anyways, personal sidenote/prediction: I suspect the Internet Archive's gonna have a much harder time archiving blogs/websites going forward.

Up until this point, the Archive enjoyed easy access to large swathes of the 'Net - site owners had no real incentive to block new crawlers by default, but the prospect of getting onto search results gave them a strong incentive to actively welcome search engine robots, safe in the knowledge that they'd respect robots.txt and keep their server load to a minimum.

Thanks to the AI bubble and the AI crawlers it's unleashed upon the 'Net, that has changed significantly.

Now, allowing crawlers by default risks AI scraper bots descending upon your website and stealing everything that isn't nailed down, overloading your servers and attacking FOSS work in the process. And you can forget about reining them in with robots.txt - they'll just ignore it and steal anyways, they'll lie about who they are, they'll spam new scrapers when you block the old ones, they'll threaten to exclude you from search results, they'll try every dirty trick they can because these fucks feel entitled to steal your work and fundamentally do not respect you as a person.

Add in the fact that the main upside of allowing crawlers (turning up in search results) has been completely undermined by those very same AI corps, as "AI summaries" (like Google's) steal your traffic through stealing your work, and blocking all robots by default becomes the rational decision to make.
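For what it's worth, "blocking all robots by default" is a one-file change. A minimal robots.txt along those lines (the crawler name below is an example - ia_archiver is commonly cited as the Internet Archive's token, but check your own logs for the bots you actually want to allow), and of course it only governs crawlers well-behaved enough to honor it at all:

```text
# Deny every crawler by default
User-agent: *
Disallow: /

# Carve out exceptions for crawlers you still trust.
# An empty Disallow means "allow everything" for that agent,
# and a more specific User-agent group overrides the * group.
User-agent: ia_archiver
Disallow:
```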

This all kinda goes without saying, but this change in Internet culture all but guarantees the Archive gets caught in the crossfire, crippling its efforts to preserve the web as site owners and bloggers alike treat any and all scrapers as guilty (of AI fuckery) until proven innocent, and the web becomes less open as a whole as people protect themselves from the AI robber barons.

On a wider front, I expect this will cripple any future attempts at making new search engines, too. In addition to AI making it piss-easy to spam search systems with SEO slop, any new start-ups in web search will struggle with quality websites blocking their crawlers by default, whilst slop and garbage will actively welcome their crawlers, leading to your search results inevitably being dogshit and nobody wanting to use your search engine.

[–] HedyL@awful.systems 9 points 1 week ago (2 children)

FWIW, due to recent developments, I've found myself increasingly turning to non-search engine sources for reliable web links, such as Wikipedia source lists, blog posts, podcast notes or even Reddit. This almost feels like a return to the early days of the internet, just in reverse and - sadly - with little hope for improvement in the future.

[–] blakestacey@awful.systems 16 points 1 week ago

In other news, I got an "Is your website AI ready" e-mail from my website host. I think I'm in the market for a new website host.

[–] gerikson@awful.systems 14 points 2 weeks ago (4 children)
[–] BurgersMcSlopshot@awful.systems 20 points 2 weeks ago* (last edited 2 weeks ago)

"we set out to make the torment nexus, but all we accomplished is making the stupid faucet and now we can't turn it off and it's flooding the house." - Every AI company, probably.

[–] Soyweiser@awful.systems 17 points 2 weeks ago* (last edited 1 week ago) (1 children)

Pre GPT data is going to be like the steel they fish up from before there were nuclear tests.

E: https://arstechnica.com/ai/2025/06/why-one-man-is-archiving-human-made-content-from-before-the-ai-explosion/ ow look, my obvious prediction was obvious.

[–] YourNetworkIsHaunted@awful.systems 19 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Alright OpenAI, listen up. I've got a whole 250GB hard drive from 2007 full of the Star Wars/Transformers crossover stories I wrote at the time. I promise you it's AI-free and won't be available to train competing models. Bidding starts at seven billion dollars. I'll wait while you call the VCs.

[–] Soyweiser@awful.systems 16 points 2 weeks ago

Do you want shadowrunners to break into your house to steal your discs? Because this is how you get shadowrunners.

[–] swlabr@awful.systems 11 points 2 weeks ago

dark forest internet here we go!!!

[–] scruiser@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

So us sneerclubbers correctly dismissed AI 2027 as bad scifi with a forecasting model basically amounting to "line goes up", but if you end up in any discussions with people that want more detail, titotal did a really detailed breakdown of why their model is bad, even given their assumptions and trying to model "line goes up": https://www.lesswrong.com/posts/PAYfmG2aRbdb74mEp/a-deep-critique-of-ai-2027-s-bad-timeline-models

tl;dr: the AI 2027 model, regardless of inputs and current state, has task time horizons basically going to infinity at some near-future date because they set it up weird. Also, the authors make a lot of other questionable choices and have a lot of other red flags in their modeling. And the task-time-horizon fit pictured on their fancy graphical interactive webpage is unrelated to the model they actually used, and is missing some earlier data points that make it look worse.

[–] aio@awful.systems 11 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

If the growth is superexponential, we make it so that each successive doubling takes 10% less time.

(From AI 2027, as quoted by titotal.)

This is an incredibly silly sentence and is certainly enough to determine the output of the entire model on its own. It necessarily implies that the predicted value becomes infinite in a finite amount of time, disregarding almost all other features of how it is calculated.

To elaborate, suppose we take as our "base model" any function f which has the property that lim_{t → ∞} f(t) = ∞. Now I define the concept of "super-f" function by saying that each subsequent block of "virtual time" as seen by f, takes 10% less "real time" than the last. This will give us a function like g(t) = f(-log(1 - t)), obtained by inverting the exponential rate of convergence of a geometric series. Then g has a vertical asymptote to infinity regardless of what the function f is, simply because we have compressed an infinite amount of "virtual time" into a finite amount of "real time".

[–] froztbyte@awful.systems 13 points 2 weeks ago (2 children)

preprint, but this looks like it’ll be making a splash soon

[–] Soyweiser@awful.systems 12 points 2 weeks ago (4 children)

We now have the first confirmed, openly Dark Enlightenment terrorist. (The coverage links him directly to NRx, but DE is a bit broader than just NRx, and his other references seem to be more garden-variety neo-nazi (not that this kind of categorizing really matters).)

[–] gerikson@awful.systems 11 points 2 weeks ago (1 children)

They have a badge now, JFC

[–] flaviat@awful.systems 9 points 2 weeks ago (1 children)

I misinterpreted this reply as the guy in the post being hired as a police officer. Thank god.

[–] JFranek@awful.systems 9 points 2 weeks ago (5 children)

his other references seem to be more garden variety neo-nazi type

Also apparently pro-LGBT neo-nazis, which I refuse to believe are not a parody. See this cursed screenshot:

[–] antifuchs@awful.systems 12 points 1 week ago (3 children)

Unilever are looking for an Ice Cream Head of Artificial Intelligence.

I think I have found a new favorite way to refer to true believers.

[–] antifuchs@awful.systems 12 points 2 weeks ago (4 children)

AllTrails doing their part in the war on genAI by disappearing the people who would trust genAI: https://www.nationalobserver.com/2025/06/17/news/alltrails-ai-tool-search-rescue-members

[–] rook@awful.systems 11 points 2 weeks ago (1 children)

New lucidity post: https://ludic.mataroa.blog/blog/contra-ptaceks-terrible-article-on-ai/

The author is entertaining, and if you’ve not read them before their past stuff is worth a look.

[–] e8d79@discuss.tchncs.de 11 points 2 weeks ago* (last edited 2 weeks ago) (1 children)
[–] irelephant@lemmy.dbzer0.com 12 points 2 weeks ago* (last edited 2 weeks ago)

Irrelevant. Please stay on topic and refrain from personal attacks.

I think if someone writes a long rant about how Germany wasn't at fault for WW2 in the CoC for one of their projects, it's kinda relevant.

[–] saucerwizard@awful.systems 11 points 2 weeks ago

OT: boss makes a dollar, I make a dime, that's why I listen to audiobooks on company time.

(Holy shit I should have got airpods a long time ago. But seriously, the jobs going great.)

[–] o7___o7@awful.systems 11 points 2 weeks ago

There should be a weekly periodical called Business Idiot.

[–] gerikson@awful.systems 10 points 1 week ago (3 children)

That hatchet job from Trace is continuing to have some legs, I see. Also a reread of it points out some unintentional comedy:

This is the sort of coordination that requires no conspiracy, no backroom dealing—though, as in any group, I’m sure some discussions go on...

Getting referenced in a thread on a different site talking about editing an article about themselves explicitly to make it sound more respectable and decent to be a member of their technofascist singularity cult diaspora. I'm sorry that your blogs aren't considered reliable sources in their own right and that the "heterodox" thinkers and researchers you extend so much grace to are, in fact, cranks.

[–] aio@awful.systems 9 points 1 week ago* (last edited 1 week ago) (2 children)

And sure enough, just within the last day the user "Hand of Lixue" has rewritten large portions of the article to read more favorably to the rationalists.

[–] YourNetworkIsHaunted@awful.systems 11 points 1 week ago (5 children)

User was created earlier today as well. Two earlier updates from a non-account-holder may be from the same individual. Did a brief dig through the edit logs, but I'm not very practiced in Wikipedia auditing like this so I likely missed things. Their first couple changes were supposedly justified by trying to maintain a neutral POV. By far the larger one was a "culling of excessive references" which includes removing basically all quotes from Cade Metz' work on Scott S and trimming various others to exclude the bit that says "the AI thing is a bit weird" or "now they mostly tell billionaires it's okay to be rich".
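That kind of dig through the edit logs can also be scripted rather than done by hand. A minimal sketch using the standard MediaWiki Action API (the article title below is a placeholder, not the actual page in question):

```python
# Sketch: build a MediaWiki Action API query for an article's recent
# revision history, so edits like the "culling of excessive references"
# described above can be audited: who edited, when, with what summary.
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def revision_query_url(title, limit=20):
    """URL returning the last `limit` revisions of `title`, including
    editor name, timestamp, edit summary, and page size (so large
    removals stand out when you diff consecutive sizes)."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment|size",
        "rvlimit": limit,
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = revision_query_url("Example")
# Fetch with e.g. urllib.request.urlopen(url), then compare the "size"
# field between consecutive revisions to spot big culls.
```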

[–] BlueMonday1984@awful.systems 10 points 2 weeks ago (4 children)

I do think Ed is overly critical of the impact that AI hype has had on the job market, not because the tools are actually good enough to replace people but because the business idiots who impact hiring believe they are. I think Brian Merchant had a piece not long ago talking about how mass layoffs may not be happening, but there's a definite slowdown in hiring, particularly for the kind of junior roles that we would expect to see impacted.

I think this actually strengthens his overall argument, though, because the business idiots making those decisions are responding to the thoughtless coverage that so many journalists have given to the hype cycle, just as so many of the people who lost it all on FTX believed their credulous coverage of crypto. If we're going to have a dedicated professional/managerial class separate from the people who actually do things, then the work of journalists like this becomes one of their only connectors to the real world, just as it's the only connection that people with real jobs have to the arcane details of finance or the deep magic that makes the tech we all rely on function.

By abdicating their responsibility to actually inform people in favor of uncritically repeating the claims of people trying to sell them something, they're actively contributing to all of it, and the harms are even farther-reaching than Ed writes here.

[–] swlabr@awful.systems 9 points 2 weeks ago

Doing some reading about the SAG-AFTRA video game voice acting strike. Anyone have details about "Ethovox", the AI company that SAG has apparently partnered with?

[–] sailor_sega_saturn@awful.systems 9 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

More network state nonsense is afoot: https://frontiervalley.com/ and https://www.thenerdreich.com/startup-seeks-trump-ai-emergency-for-california-tech-city/

They (who?) have publicly drafted an executive order because they want to take over the Alameda Naval Air Station (a Superfund site).

Edit: Per the twitter account the weirdo behind this is James Ingallinera.

[–] swlabr@awful.systems 8 points 2 weeks ago (1 children)

network state

Great, a new stupid thing to know about. How likely is it that a bunch of people that believe they are citizens of an online state will become yet another player in the Stochastic Terrorism as a Service industry?

[–] sailor_sega_saturn@awful.systems 11 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

I'm using the term a bit loosely to mean "libertarian citadel except with techies". Though I think the phrase is technically supposed to mean a nation that starts out as an online community.

Anyway for some reason these weirdos all have this idea that if it wasn't for all those pesky regulations and people, they could usher in a glorious new sci-fi and/or cryptocurrency society. Like look at this example: this B-list CEO in the apartment rental business thinks he'll be the ruler of a fiefdom that brings about AGI, quantum computing, a nuclear energy revolution, Blade Runner-style flying cars, and sci-fi materials. It's delusional, or at best grift.

The canonical example of network state is Balaji Srinivasan's Network School. He owns(?) a building in Forest City, Malaysia (or as he calls it: an island in an undisclosed location off the coast of Singapore). But in a broad sense it's useful to consider everything from Sidewalk Labs to California Forever to the M.S. Satoshi as thematically in the same sort of ballpark.

[–] pikesley@mastodon.me.uk 10 points 2 weeks ago

@sailor_sega_saturn @swlabr

Bears. This always leads to some of them being eaten by bears.

[–] swlabr@awful.systems 9 points 2 weeks ago (2 children)

it's funny to me that these futurists/thought leaders/tech geniuses utterly fail to build their cult compounds, whereas a goofy-ass cult like Scientology has blown past them completely


Easy Money author (and former TV star) Ben McKenzie's new cryptoskeptic documentary is struggling to find a distributor. Admittedly, the linked article is more a review of the film than a look at the distributor angle. Still, it looks like it's telling the true story in a way that will hopefully connect with people, and it would be a real shame if it didn't find an audience.

[–] fullsquare@awful.systems 9 points 1 week ago (1 children)