this post was submitted on 26 Dec 2025
327 points (99.1% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
[–] damnthefilibuster@lemmy.world 39 points 2 days ago (3 children)

They could just… sell us some…

[–] pulsewidth@lemmy.world 12 points 2 days ago (2 children)

What would we do with them?

AI GPUs are not the same as consumer GPUs - they're not PCIe x16 cards, and most don't even come in a standard expansion-card format... They're generally supplied as modules already mounted on a baseboard inside a blade-format prefab server build.
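
To make the form-factor point concrete, here's a minimal sketch that just lists what your GPUs report themselves as via nvidia-smi (this assumes an NVIDIA driver install with nvidia-smi on PATH). Datacenter parts often include the module format in the product name, while consumer cards show up as plain GeForce models.

```python
# Sketch: list GPU product names via nvidia-smi (assumes an NVIDIA driver is
# installed and nvidia-smi is on PATH). Datacenter parts often report their
# module format in the name (e.g. "A100-SXM4-80GB" vs "A100-PCIE-40GB").
import subprocess

output = subprocess.run(
    ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout

for name in output.strip().splitlines():
    print(name)
```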

[–] picnic@lemmy.world 1 point 20 hours ago

I have had GRID GPUs. I'd love to have a few accelerators.

[–] damnthefilibuster@lemmy.world 8 points 2 days ago

I wouldn’t mind having a local LLM or GenAI
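
In case "local LLM" sounds abstract, below is a minimal sketch of what that looks like with the Hugging Face transformers pipeline, runnable on an ordinary consumer GPU or even a CPU. The tiny gpt2 model is just a placeholder so the snippet works anywhere; in practice you'd point it at whatever open-weight model fits your VRAM.

```python
# Minimal local text-generation sketch (assumes `pip install transformers torch`).
# gpt2 is only a small stand-in model; swap in any open-weight model you have
# downloaded locally. Add device=0 to run on the first CUDA GPU instead of CPU.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

result = generator("Datacenter GPUs are", max_new_tokens=40)
print(result[0]["generated_text"])
```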

[–] GreenKnight23@lemmy.world 21 points 2 days ago (2 children)

I would only accept them if they were 75% off MSRP or more.

[–] jj4211@lemmy.world 3 points 1 day ago (1 children)

Sure thing, that datacenter GPU is now like 3000 dollars....

[–] very_well_lost@lemmy.world 1 point 16 hours ago

Probably closer to 6k. Datacenter GPUs start at around 25 grand a pop.
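
For the arithmetic behind those figures (both numbers are the commenters' ballparks, not quoted list prices): 75% off a roughly $25,000 datacenter GPU lands around $6,250, which is why "closer to 6k" fits the 75%-off condition upthread better than $3,000 does.

```python
# Back-of-the-envelope check of the thread's numbers (ballpark figures only).
msrp = 25_000    # claimed starting price of a datacenter GPU, in USD
discount = 0.75  # the "75% off MSRP or more" condition from upthread

print(f"${msrp * (1 - discount):,.0f}")  # -> $6,250, i.e. "closer to 6k"
```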