YourNetworkIsHaunted

joined 1 year ago
[–] YourNetworkIsHaunted@awful.systems 3 points 11 hours ago (2 children)

Contra Blue Monday, I think that we're more likely to see "AI" stick around specifically because of how useful Transformers are as a tool for other things. I feel like it might take a little bit of time for the AI rebrand to fully lose the LLM stink, but both the sci-fi concept and some of the underlying tools (not GenAI, though) are too robust to actually go away.

I disagree with their conclusions about the ultimate utility of some of these things, mostly because I think they underestimate the impact of the problem. If you're looking at a ~.5% chance of throwing out a bad outcome, we should be less worried about failing to filter out the evil than about straight-up errors making it not work. There's no accountability, and the whole pitch of automating away, say, radiologists is that you don't have a clinic full of radiologists who can catch those errors. Like, you can't even get a second opinion if the market is dominated by XrayGPT or whatever, because whoever you would go to is also going to rely on XrayGPT. After a generation or so, where are you even going to find, much less afford, an actual human with the relevant skills? This is the pitch they're making to investors and the world they're trying to build.

I mean, decontextualizing and obscuring the meanings of statements in order to permit conduct that would in ordinary circumstances breach basic ethical principles is arguably the primary purpose of deploying the specific forms and features that comprise "Business English." If anything, the fact that LLMs are similarly prone to ignore their "conscience" and follow orders when deciphering and understanding those orders requires enough mental resources to exhaust them is an argument in favor of the anthropomorphic view.

Or:

Shit, isn't the whole point of Business Bro language to make evil shit sound less evil?

I've had similar thoughts about AI in other fields. The untrustworthiness and incompetence of the bot make the whole interaction even more adversarial than it is naturally.

Standard Business Idiot nonsense. They don't actually understand the work that their company does, and so are extremely vulnerable to a good salesman who can put together a narrative they do understand, one that lets them feel like super important big boys doing important business things that are definitely worth the amount they get paid to do them.

Something something built Ford tough.

This is doubly (triply? (N+1)ly?) ironic because this is a perfect example of when not only is it acceptable to use the passive voice, but using it makes the sentence flow more smoothly and read more clearly. The idea they're communicating here should focus on the object ("the agent") rather than the subject ("you") because the presumed audience already knows everything about the subject.

[–] YourNetworkIsHaunted@awful.systems 8 points 3 days ago* (last edited 3 days ago)

I think I liked this observation better when Charles Stross made it.

If for no other reason than he doesn't start off by dramatically overstating the current state of this tech, isn't trying to sell anything, and unlike ChatGPT is actually a good writer.

So apparently Grok is even more of a Nazi conspiracy loon now.

I'm sure a Tucker Carlson interview is going to happen soon.

[–] YourNetworkIsHaunted@awful.systems 6 points 3 days ago (5 children)

There's gotta be at least two nVidia engineers who have a board planned out for that just as a hobby project they wanted to benchmark.

Whoever they say they blame, it's probably going to be ultimately indistinguishable from "the Jews."

It’s like a restaurant selling granite rocks for dessert. Nobody will buy them or eat them—so the product fails miserably. But if a popular restaurant adds a dollar to the meal price, and gives every customer a rock with their bill—well, then they can say that:

Every customer gets rocks for dessert.

Every customer pays for it.

Their business is more profitable because of the tasty granite rocks.

I just wanted to spotlight this excellent metaphor tbh.

 

I don't have much to add here, but I know that when she started writing about the specifics of what Democrats are worried about being targeted for their "political views," my mind immediately jumped to members of my family who are gender non-conforming or trans. Of course, the more specific you get about any of those concerns, the easier it is to see that crypto doesn't actually solve the problem and in fact makes it much worse.
