Okay, what is your definition of AI then, if nothing burned onto silicon can count?
If LLMs aren't AI, then probably nothing up to this point counts either.
It's not, but it felt like further than that when I was 7 years old.
Ahhhh yeah, that would do it. :/
See, the funny thing though is that in this specific situation, the workers were legally there. They had gone through the proper channels. They had work permits. This has nothing to do with the law and everything to do with racism, xenophobia, and a power-tripping ICE.
So you're not wrong, but now even the people who are doing it right are getting punished.
It's a very specialized program intended to get a computer to do something that computers are generally very, very bad at - write sensible language about a wide variety of topics. Trying to then get that one specialized program to turn around and do things that computers are good at, and expecting it to do them well, is very silly.
The train station is unmanned and largely un-maintained. It's just a dirt platform. The only thing the train has to do that it wasn't already doing anyways is stop and start again, which consumes fewer resources than a separate EV driving all the way to the next nearest stop.
Overpopulation combined with the inefficient resource consumption of modern society. If our resource usage per person reduced at the same rate that population increased, it wouldn't be a big deal.
Also that graph is ridiculous. If there were one fewer child born per person alive, there would be zero children being born in most developed countries (it takes two people to have a child, so that would mean two fewer children per couple). Of COURSE that would result in a drastically reduced carbon footprint, because we'd die out.
Is there at least a stop within walking distance? I had to hoof it two blocks all through elementary school to catch the bus. It was through low-traffic residential streets, but still.
There is significantly higher pressure to conform to societal norms there, including misogynistic views on a woman's role and a more stratified social hierarchy - but there's also a belief that government exists to support society instead of existing to support moneyed interests.
No. Artificial Intelligence has to be imitating intelligent behavior - such as the ghosts imitating how, ostensibly, a ghost trapped in a maze and hungry for yellow circular flesh would behave, and how CS1.6 bots imitate the behavior of intelligent players. They artificially reproduce intelligent behavior.
Which means LLMs are very much AI. They are not, however, AGI.
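To make the ghost example concrete: here's a minimal sketch (not the actual Pac-Man source, just an illustration of the idea) of how a ghost can imitate intelligent pursuit with one simple greedy rule - at each step, move along whichever axis closes more of the gap to the player.

```python
# Hypothetical illustration: a "ghost" imitating intelligent pursuit
# on a grid with a single greedy rule, no per-situation coding.

def ghost_step(ghost, player):
    """Return the ghost's next (x, y) position, greedily chasing the player."""
    gx, gy = ghost
    px, py = player
    dx, dy = px - gx, py - gy
    if abs(dx) >= abs(dy):
        # Close the larger (or equal) horizontal gap first.
        return (gx + (1 if dx > 0 else -1 if dx < 0 else 0), gy)
    # Otherwise close the vertical gap.
    return (gx, gy + (1 if dy > 0 else -1))
```

A ghost at (0, 0) chasing a player at (3, 1) steps to (1, 0) - it "looks" intelligent to the player, but it's just a rule artificially reproducing intelligent behavior, which is the whole point.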
It would be fine if it had more ways to differentiate elements from each other - darkening around the edges of windows, buttons that actually look raised so they aren't identical to a text box, scroll bars that aren't SO FUCKING TINY that it's clear MS is embarrassed that they exist in the first place, etc. etc.
.....what?
In order for that to be true, the entire dataset would need to be contained within the LLM. Which it is not. If it were, a model wouldn't have to undergo training.
You seem to be mistaking 'intelligence' for 'human-like intelligence'. This is not how AI is defined. AI can be dumber than a gnat, but if it's capable of making decisions based on stimulus without each stimulus-response pair being directly coded into it, then it's AI. It's the difference between what is ACTUALLY called AI, and what a sci-fi show or novel means when it talks about AI.