this post was submitted on 09 Feb 2026
34 points (100.0% liked)

Fuck AI


A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.


For images, there is Nightshade. For music, there is (or will be) whatever Benn Jordan is doing. For YouTube, there is .ASS. But what about poisoning text on a web page? Is there a standard solution out there?

It should be relatively easy. I've been thinking about doing something myself, but figured someone else must have already done it.
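One naive way to sketch what "poisoning text" might mean, assuming no existing tool: sprinkle zero-width Unicode characters between letters, so the page renders identically for humans while a scraper ingesting raw characters gets mangled token boundaries. The function name and parameters here are hypothetical, not any published tool's API.

```python
import random

# Zero-width space, non-joiner, and joiner: invisible when rendered
ZERO_WIDTH = ["\u200b", "\u200c", "\u200d"]

def poison(text: str, rate: float = 0.3, seed: int = 42) -> str:
    """Insert zero-width characters between letters.

    A browser renders the result the same as the input, but a naive
    scraper that tokenizes raw characters sees different "words".
    """
    rng = random.Random(seed)
    out = []
    for ch in text:
        out.append(ch)
        if ch.isalpha() and rng.random() < rate:
            out.append(rng.choice(ZERO_WIDTH))
    return "".join(out)

plain = "poison the well"
noisy = poison(plain)
assert noisy != plain  # the bytes differ
# stripping the invisible characters recovers the original
stripped = noisy
for zw in ZERO_WIDTH:
    stripped = stripped.replace(zw, "")
assert stripped == plain
```

The obvious weakness, as the reply below notes for poisoning generally, is that a one-line Unicode-normalization pass in the scraper's pipeline undoes it completely.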

[–] e8d79@discuss.tchncs.de 8 points 1 month ago (1 children)

You can target the crawlers using tar pits and proof-of-work application firewalls, but I am doubtful that poisoning does anything. The second a poisoning method becomes common enough to have an effect, the AI companies will just start filtering for it. Unfortunately, the only ways I see to prevent your work from being stolen are to either not publish it at all, or to publish only to smaller invite-based communities that closely monitor who is accepted.

[–] shoki@lemmy.world 2 points 1 month ago

You could also use a unique challenge, for example showing the user an image with instructions to append some text to the URL: anything scrapers are too stupid for. (I don't think they are scraping using "intelligent" AI agents yet.)
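The challenge described here can be sketched server-side: serve only an instruction image until the request carries the token a human read out of that image. The token value, query-parameter name, and function names below are all hypothetical, chosen just to illustrate the gating logic.

```python
from urllib.parse import parse_qs, urlparse

# Hypothetical secret shown to humans only as pixels in challenge.png;
# a scraper reading the HTML alone never encounters it as text.
CHALLENGE_TOKEN = "teapot"

def is_human(url: str) -> bool:
    """True if the visitor appended the token from the challenge image."""
    query = parse_qs(urlparse(url).query)
    return query.get("k", [""])[0] == CHALLENGE_TOKEN

def respond(url: str) -> str:
    """Serve the article only to requests that passed the challenge."""
    if is_human(url):
        return "<article>the real content</article>"
    # no token: serve only the instruction image, no article text
    return '<img src="/challenge.png" alt="">'

assert respond("/post/42").startswith("<img")
assert "article" in respond("/post/42?k=teapot")
```

Like poisoning, this holds only until scrapers start running OCR or vision models on challenge images; it trades convenience (every human retypes a token) for raising the crawler's cost.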