riskable

joined 2 years ago
[–] riskable@programming.dev -4 points 5 months ago* (last edited 5 months ago) (5 children)

I can't take anyone seriously that says it's "trained on stolen images."

Stolen, you say? Well, I guess we're going to have to force those AI companies to put those images back! Otherwise, nobody will be able to see them!

...because that's what "stolen" means. And no, I'm not being pedantic. It's a really fucking important distinction.

The correct term is "copied," but that doesn't sound quite as severe. Also, if we want to get really specific, the images are still on the Internet. Right now. Because that's what ImageNet (and similar datasets) is: a database of URLs pointing to images that people have put up for free, for anyone on the Internet who wants them.
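To make that concrete, here's a minimal sketch of what a "database of URLs" boils down to: a plain index of links plus a script that fetches whatever is still online. The file name and column names below are made up for illustration and aren't the actual ImageNet release format.

```python
# Rough sketch of a URL-list style image dataset.
# The index file name and columns (image_id, url) are illustrative only.
import csv
import os
import urllib.request

def download_listed_images(index_path: str, out_dir: str) -> None:
    """Fetch every image referenced by URL in a simple CSV index."""
    os.makedirs(out_dir, exist_ok=True)
    with open(index_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                data = urllib.request.urlopen(row["url"], timeout=10).read()
            except OSError:
                continue  # dead links are common in URL-only datasets
            with open(os.path.join(out_dir, row["image_id"] + ".jpg"), "wb") as img:
                img.write(data)

# Example: download_listed_images("image_index.csv", "images/")
```

The dataset itself never contains the images; it just points at wherever people posted them.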

Did you ever upload an image anywhere publicly, for anyone to see? Chances are someone annotated it and included it in some AI training database. If it's on the Internet, it will be copied and used without your consent or knowledge. That's the lesson we learned back in the 90s. If you think that's not OK, go get hired by the MPAA/RIAA and try to drag the world back to the days when you had to pay $10 for a ringtone and pay again if you got a new phone (because, to the big media companies, copying is stealing!).

Now that that's clear, let's talk about the ethics of training an AI on such data: there are none. It's an N/A situation! Why? Because until the AI models are actually used for some purpose, they're just data on a computer somewhere.

What about legally? Judges in multiple countries have already ruled that training AI this way is fair use. There's no copyright violation going on, because copyright covers the distribution of copyrighted works, not what you do with them internally (like training an AI model).

So let's talk about the real problems with AI generators so people can take you seriously:

  • Humans using AI models to generate fake nudes of people without their consent.
  • Humans using AI models to copy works that are still under copyright.
  • Humans using AI models to generate shit-quality stuff for the most minimal effort possible, saying it's good enough, then not hiring an artist to do the same thing.

The first one seems impossible to solve (to me). If someone generates a fake nude and never distributes it... Do we really care? It's like a tree falling in the forest with no one around. If they (or someone else) distribute it though, that's a form of abuse. The act of generating the image was a decision made by a human—not AI. The AI model is just doing what it was told to do.

The second is—again—something a human has to willingly do. If you try hard enough, you can make an AI image model get pretty close to a copyrighted image... But it's not something that is likely to occur by accident. Meaning, the human writing the prompt is the one actively seeking to violate someone's copyright. Then again, it's not really a copyright violation unless they distribute the image.

The third one seems likely to solve itself over time as more and more idiots are exposed for just "throwing it at the AI" and publishing the result without checking or fixing it. Like Coca-Cola's idiotic mistake last Christmas.

[–] riskable@programming.dev 12 points 5 months ago (1 children)

I hate Microsoft and Excel but that date thing is exactly the kind of stuff that AI would be great at.

Just not the kind of AI Microsoft probably plans to put in Excel 🤷

[–] riskable@programming.dev 9 points 5 months ago (2 children)

But think of all those children who lost their lives because they saw people having sex on the Internet!

The ones that survived are scarred for life! Missing limbs, eyes blinded, forever unable to work. They'll be begging on the streets!

Right?

[–] riskable@programming.dev 14 points 5 months ago (2 children)

So let me get this straight: Nebraska has been leading the fight... in increasing air pollution.

This guy is proud of this‽

...but I get it: Where does it end? All these pollution controls‽ If we take this all the way to its logical conclusion, trucks will all be electric—emitting zero pollution—and then what will Nebraska do?

Wait: I still don't get it.

[–] riskable@programming.dev 3 points 5 months ago

Well it's certainly not sad news!

[–] riskable@programming.dev -1 points 5 months ago* (last edited 5 months ago) (1 children)

They fear it like they fear immigrants taking people's jobs. It's not a thing.

AI can be dangerous for things like misinformation and scams. What it's not actually doing is taking away people's jobs, unless your work was just as unreliable and prone to hallucination as the AI's.

AI can save a lot of time in a lot of ways but it's not:

  • Replacing artists
  • Replacing writers
  • Replacing programmers

A few idiotic companies have tried to do that but every single one realized their mistake pretty fast.

Maybe some day AI will be able to reliably produce images without weird artifacts.

Maybe some day AI will be able to write a whole chapter of a book without hallucinating (or just plain being able to connect the plot).

Maybe some day AI will be able to write software that is maintainable and doesn't build technical debt faster than a rocket.

Until then, people will still be needed to check, fix, and refine the output of AI to ensure it's usable.

[–] riskable@programming.dev -1 points 5 months ago

Forget AI: Doctors might become dependent on endoscopes for performing colonoscopies! Get your head in the game, doc!

Doctors might also become dependent on stethoscopes for listening to people breathing or listening to their hearts beat!

They might even become dependent on tongue depressors to view the back of patients' throats!

Doctors might become dependent on any technology! What is the world coming to‽

[–] riskable@programming.dev 8 points 5 months ago (1 children)

It's all fun and games until you hear how they really plan to get this idea cookin'.

[–] riskable@programming.dev 8 points 5 months ago

I guess it's just too late for those kids that viewed porn. What are they going to do with all those dead bodies?

[–] riskable@programming.dev 6 points 5 months ago (1 children)

This assumes his deportation figures are accurate. I doubt they're deporting 750 people/day.

In order to deport lots and lots of people you actually need other undocumented folks to tattle on the actual bad guys. Except when you deport everyone (including US citizens), you don't get those tattlers anymore. Instead, you get neighborhoods and sometimes entire cities' worth of people who will not help ICE in the slightest.

[–] riskable@programming.dev 10 points 5 months ago

A 3rd-party universal cable != circuit boards/connectors designed for very specific hardware (used internally)
