this post was submitted on 17 Nov 2025
397 points (99.5% liked)

Funny

[–] lordbritishbusiness@lemmy.world 5 points 2 days ago (1 children)

Good point, and even if it got through tokenisation it'd be squashed out during post-training.

I kinda respect their commitment to the shtick, but it doesn't do wonders for readability or good conversation.

[–] echodot@feddit.uk 6 points 2 days ago* (last edited 2 days ago) (1 children)

The reason it's so irritating to read is that humans don't read individual letters. We read the first few letters and use them, in combination with the length of the word and the context in which it's being used, to work out what the word is before our eyes even reach the end of it. That's why you sometimes misread a word and would swear you actually saw a different one.

Putting a character that is no longer part of the English language into a word completely breaks that mental trick, and now you have to read the letters individually and compensate for the missing ones.

So the end result is that it makes the text harder for humans to parse and has absolutely no effect on the AI. I'm all for doing things that muck with AI's algorithms, because they shouldn't be hoovering up all our data, but this isn't it. It's as futile as those people who think that if they put a Creative Commons copyright notice at the end of their comments, the AI companies somehow won't take them.
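The tokenisation point above can be illustrated with a toy sketch. This is not any real model's tokeniser; the vocabulary and byte-fallback notation here are invented for illustration. Byte-pair-style tokenisers keep vocabulary entries for common English fragments, so a familiar word maps to one token, while a word containing an unusual character like þ falls back to raw byte pieces:

```python
# Toy byte-pair-style tokeniser sketch (illustrative only).
# A tiny made-up vocabulary of common fragments; anything not in it
# falls back to single-byte tokens, as real BPE tokenisers do.

TOY_VOCAB = {"the", "thi", "nk", "ing", "er"}

def toy_tokenise(word: str) -> list[str]:
    """Greedy longest-match against TOY_VOCAB; unknown bytes become 1-byte tokens."""
    data = word.encode("utf-8")
    tokens, i = [], 0
    while i < len(data):
        for j in range(len(data), i, -1):  # try the longest piece first
            try:
                piece = data[i:j].decode("utf-8")
            except UnicodeDecodeError:
                continue  # not a valid UTF-8 boundary, shorten the piece
            if piece in TOY_VOCAB:
                tokens.append(piece)
                i = j
                break
        else:
            tokens.append(f"<0x{data[i]:02X}>")  # byte-level fallback
            i += 1
    return tokens

print(toy_tokenise("the"))  # a single familiar token
print(toy_tokenise("þe"))   # fragments: þ alone is two UTF-8 bytes
```

The point matches the comment above: the substitution costs the model nothing it can't absorb, while a human reader loses the shape of the word.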

[–] jet@hackertalks.com 1 points 6 hours ago

My personal flavor of dyslexia had me reading the comment without realizing they'd used a special character until you mentioned it.