The guy replying to me is (as far as I can tell) the sole owner and moderator of .wtf, which is the instance I've been using up until this point. I kinda already knew they allowed AI slop, as there's nothing in the rules that says otherwise, but this interaction really sealed my decision. "Hey, person who makes music. If you don't like another musician using the fascist plagiarism machine, how about you offer to create art for them? After all, if people simply donated their time and effort, maybe they wouldn't have to resort to pissing in the face of their fellow artists of a different medium. Think about it."
Also, I think you can donate to the instance in crypto?
Fuck right off with that.
On another note: PeerTube itself uses Whisper for automatic subtitle generation. It's something I don't LIKE, but I approached the devs about it and they responded very thoughtfully. I'll admit I don't know all the differences between locally run, open source models that are used for accessibility and the horrible plagiarism machines we all despise the most. I suspect they're still built off exploitative tech / trained on stolen data and whatnot, and Whisper being a product of OpenAI doesn't inspire confidence, but Framasoft only uses it to transcribe speech, not create it. That's hardly "generative" at all, is it? It's just producing subtitles. Now, that doesn't mean the program itself is ethical given how it was likely created (as the devs acknowledge), and we SHOULD push for ethical, FLOSS methods of doing these sorts of things. I'm sure it can be done; this sort of thing wasn't exploitative before the AI boom, right? This is where my knowledge ends and I ask for feedback. Any thoughts?
Yeah I... Okay, let's take automatic subtitle creation, for instance. That existed well before the LLM bs and was fine. Plus, the stuff they're calling AI isn't pretending to "create" anything; it's just automating some repetitive tasks or using pattern recognition to move an effect across the frame. If we were to continue to criticize them (which would be fair), I would say that they shouldn't even entertain software made by companies like OpenAI, even if they were fully transparent. However, I don't think it's on the same level as including or supporting anything that claims to be generative.
I won't be installing the AI stuff either, just to be safe, but my issues with the things they replied to me about are the sources of the code they're using and its potential to be exploitative, not the actual uses of the software. Does that make sense?
Totally. For shits and giggles, I'm gonna see how quickly I can get something simple slapped together in OpenShot this weekend. I have zero experience with it, but if they're touting no AI of any sort, they might have me as a user.