riskable

joined 2 years ago
[–] riskable@programming.dev 5 points 5 months ago

> The group of seven experts called for an end to the atrocities and expressed concern that the enforced disappearances will discourage Palestinians from accessing food distribution points, increasing the risk of starvation.

Is there any doubt at this point that this is Israel's plan? As in, yes: That's the idea. This is how genocide happens. Israel wants the Palestinians gone. Dead. Done. Over with.

If there are no Palestinians in Gaza, they can absorb it into Israel, which was the entire point of imprisoning them there in the first place.

They're following the American playbook on this one. Not Hitler's. The Israeli government is betting that 100 years after the (successful) genocide, no one will care what they did. Because by then, it'll long since be too late.

The thing is: In 100 years no one will care because some other conflict will have replaced this one. My guess: It'll be Israelis vs. Israelis in a great big civil war that will go on seemingly forever. It'll go on so long, in fact, that the peoples of the future will stop thinking of the two sides as one people and will start referring to them with terms like "Palestinians" and "Jews".

[–] riskable@programming.dev 7 points 5 months ago (5 children)

Training an AI is orthogonal to copyright since the process of training doesn't involve distribution.

You can train an AI with whatever TF you want without anyone's consent. That's perfectly legal fair use. It's no different than if you copy a song from your PC to your phone.

Copyright really only comes into play when someone uses an AI to distribute a derivative of someone's copyrighted work. Even then, it's really only the end user who's capable of doing that, by uploading the output of the AI somewhere.

[–] riskable@programming.dev 19 points 5 months ago

Republicans: "Google keeps blocking our emails!" Google: "Yep. Stop sending spam!"

[–] riskable@programming.dev 6 points 5 months ago

Xerox is a bad copy of themselves from decades prior.

[–] riskable@programming.dev 7 points 5 months ago

Try this and the result may shock you!

Doctors hate it!

[–] riskable@programming.dev 8 points 5 months ago (3 children)

> I'm going to assume the standard was poorly understood because I can't imagine a multi-billion dollar company hires idiots to set standards.

Ahahahahahahaha! Oh man, you got a good laugh out of me this morning 🤣

[–] riskable@programming.dev 8 points 5 months ago (2 children)

When you work on the same thing for 8 hours a day for years and then suddenly management decides that they need "detailed time tracking."

They just gave you a new job without additional compensation. New responsibilities, no new title, no raise, etc.

Then—months later—they realize that everyone's regularly spending at least half an hour just to figure out how they're spending their time. Some bean counter adds up how much that costs in real money and then—out of nowhere—management decides they don't need detailed time tracking anymore.

[–] riskable@programming.dev 1 points 5 months ago

No. He's giving tax breaks to the rich because he's rich.

There's no more logic to it than that.

[–] riskable@programming.dev 2 points 5 months ago

For images, it's not even data collection, because all the images used for these AI image generation tools are already out on the internet, free for anyone to download right now. That's how they're obtained: a huge database of (highly categorized) image URLs (e.g. ImageNet) is crawled and downloaded.

That's not even remotely the same thing as "data collection." Data collection is when a company vacuums up everything it can from your private shit, not that photo of an interesting building you uploaded to Flickr over a decade ago.
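
To make that concrete, here's a minimal sketch of what that kind of crawl looks like in Python, assuming you already have a plain-text list of image URLs; urls.txt and the images/ folder are just placeholder names:

```python
# Minimal sketch: downloading publicly posted images from a list of URLs,
# the same basic pattern used to assemble datasets like ImageNet.
import os

import requests

os.makedirs("images", exist_ok=True)

# One image URL per line (placeholder filename).
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for i, url in enumerate(urls):
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        continue  # skip dead links; public URL lists are full of them
    with open(os.path.join("images", f"{i:06d}.jpg"), "wb") as out:
        out.write(resp.content)
```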

[–] riskable@programming.dev 56 points 5 months ago (27 children)

This is sad, actually, because this very technology is absolutely fantastic at identifying things in images. That's how image generation works behind the scenes!

[Image: an esp32-cam identifying a cat, a bike, and a car in an image]

ChatGPT screwed this up so badly because it's programmed to generate images instead of using reference images and then identifying the relevant parts. Which is something a tiny little microcontroller board can do.
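
Not that you'd run this exact thing on an esp32, but here's a rough sketch of the "identify the relevant parts" approach using an off-the-shelf pretrained detector (torchvision's Faster R-CNN; photo.jpg is a placeholder filename):

```python
# Rough sketch: object detection with a pretrained Faster R-CNN from torchvision.
# It reports what's in the image and how confident it is, instead of generating
# a new image.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_V2_Weights,
    fasterrcnn_resnet50_fpn_v2,
)

weights = FasterRCNN_ResNet50_FPN_V2_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn_v2(weights=weights)
model.eval()

img = read_image("photo.jpg")        # uint8 tensor, shape (C, H, W)
batch = [weights.transforms()(img)]  # the preset transform handles scaling

with torch.no_grad():
    preds = model(batch)[0]

categories = weights.meta["categories"]
for label, score in zip(preds["labels"], preds["scores"]):
    if score > 0.8:  # keep only confident detections
        print(f"{categories[int(label)]}: {score:.2f}")
```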

If they just paid to license a data set of medical images... Oh wait! They already did that!

Sigh

[–] riskable@programming.dev 4 points 5 months ago

A feeder most fowl!
