diz

joined 2 years ago
[–] diz@awful.systems 17 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

The other interesting thing is that if you try it a bunch of times, sometimes it uses the calculator and sometimes it does not. It, however, always claims that it used the calculator, unless it didn't and you tell it that the answer is wrong.

I think something very fishy is going on, along the lines of them having done empirical research and found that fucking up the numbers and lying about it makes people more likely to believe that gemini is sentient. It is a lot weirder (and a lot more dangerous, if someone used it to calculate things) than "it doesn't have a calculator" or "poor LLMs cant do math". It gets a lot of digits correct somehow.

Frankly this is ridiculous. They have a calculator integrated in the google search. That they don't have one in their AIs feels deliberate, particularly given that there's a plenty of LLMs that actually run calculator almost all of the time.

edit: lying that it used a calculator is rather strange, too. Humans don't say "code interpreter" or "direct calculator" when asked to multiply two numbers. What the fuck is a "direct calculator"? Why is it talking about "code interpreter" and "direct calculator" conditionally on there being digits (I never saw it say that it used a "code interpreter" when the problem wasn't mathematical), rather than conditional on there being a [run tool] token outputted earlier?

The whole thing is utterly ridiculous. Clearly for it to say that it used a "code interpreter" and a "direct calculator" (what ever that is), it had to be fine tuned to say that. Consequently to a bunch of numbers, rather than consequently to a [run tool] thing it uses to run a tool.

edit: basically, congratulations Google, you have halfway convinced me that an "artificial lying sack of shit" is possible after all. I don't believe that tortured phrases like "code interpreter" and a "direct calculator" actually came from the internet.

These assurances - coming from an "AI" - seem like they would make the person asking the question be less likely to double check the answer (and perhaps less likely to click the downvote button), In my book this would qualify them as a lie, even if I consider LLM to not be any more sentient than a sack of shit.

[–] diz@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (8 children)

Try asking my question to Google gemini a bunch of times, sometimes it gets it right, sometimes it doesn't. Seems to be about 50/50 but I quickly ran out of free access.

And google is planning to replace their search (which includes a working calculator) with this stuff. So it is absolutely the case that there's a plan to replace one of the world's most popular calculators, if not the most popular, with it.

[–] diz@awful.systems 12 points 2 weeks ago

That's why I say "sack of shit" and not say "bastard".

[–] diz@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (11 children)

The funny thing is, even though I wouldn't expect it to be, it is still a lot more arithmetically sound than what ever is it that is going on with it claiming to use a code interpreter and a calculator to double check the result.

It is OK (7 out of 12 correct digits) at being a calculator and it is awesome at being a lying sack of shit.

[–] diz@awful.systems 10 points 2 weeks ago

Incels then: Zuckerberg creates a hot-or-not clone with stolen student data, gets away with it, becomes a billionaire.

Incels now: chatgpt, what's her BMI.

[–] diz@awful.systems 7 points 2 weeks ago

I think I figured it out.

He fed his post to AI and asked it to list the fictional universes he’d want to live in, and that’s how he got Dune. Precisely the information he needed, just as his post describes.

[–] diz@awful.systems 10 points 2 weeks ago* (last edited 2 weeks ago)

I am also presuming this is about purely non-fiction technical books

He has Dune on his list of worlds to live in, though...

edit: I know. he fed his post to AI and asked it to list the fictional universes he'd want to live in, and that's how he got Dune. Precisely the information he needed.

[–] diz@awful.systems 8 points 2 weeks ago* (last edited 2 weeks ago)

Naturally, that system broke down (via capitalists grabbing the expensive fusion power plants for their own purposes)

This is kind of what I have to give to Niven. The guy is a libertarian, but he would follow his story all the way into such results. And his series where organs are being harvested for minor crimes? It completely flew over my head that he was trying to criticize taxes, and not, say, republican tough-on-crime, mass incarceration, and for profit prisons. Because he followed the logic of the story and it aligned naturally with its real life counterpart, the for profit prison system, even if he wanted to make some sort of completely insane anti tax argument where taxing rich people is like harvesting organs or something.

On the other hand, much better regarded Heinlein, also a libertarian, would write up a moon base that exports organic carbon and where you have to pay for oxygen to convert to CO2. Just because he wanted to make a story inside of which "having to pay for air to breathe" works fine.

[–] diz@awful.systems 13 points 2 weeks ago (5 children)

Maybe he didn't read Dune he just had AI summarize it.

[–] diz@awful.systems 3 points 2 weeks ago* (last edited 2 weeks ago)

Yolo charging mode on a phone, disable the battery overheating sensor and the current limiter.

I suspect that they added yolo mode because without it this thing is too useless.

[–] diz@awful.systems 4 points 2 weeks ago* (last edited 2 weeks ago)

There is an implicit claim in the red button that it was worth including.

It is like Google’s AI overviews. There can not be a sufficient disclaimer because the overview being on the top of Google search implies a level of usefulness which it does not meet, not even in the “evil plan to make more money briefly” way.

Edit: my analogy to AI disclaimers is using “this device uses nuclei known to the state of California to…” in place of “drop and run”.

[–] diz@awful.systems 5 points 2 weeks ago

Jesus Christ on a stick, thats some trice cursed shit.

Maybe susceptibility runs in families, culturally. Religion does, for one thing.

view more: ‹ prev next ›