this post was submitted on 05 Jun 2025
643 points (97.5% liked)

People Twitter

7252 readers
1010 users here now

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics.
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

founded 2 years ago
top 50 comments
[–] calcopiritus@lemmy.world 19 points 1 day ago (1 children)

My gf knows nothing about math and has to learn a bit. I'm a software engineer, so of course I offered to teach her.

I gave her some exercises to do, because she prefers to work through them alone without help. After 30 minutes she's like, "can you help me decipher what chatgpt told me?"

Of course, what chatgpt told her was utter garbage. Bruh, I'm right here and I'm teaching you; just ask me for help, not a word predictor.

I'm actually impressed by how badly chatgpt answered; this was low high-school-level maths/physics. The bot is advertised as if it were going to leave me without a job in the next year.

[–] Soleos@lemmy.world 4 points 1 day ago

Your GF secretly works for openAI. You're not teaching her, she's getting you to train chatGPT :O

[–] dukatos@lemm.ee 11 points 1 day ago (3 children)

Can AI open a jar? Can AI kill a spider?

[–] buddascrayon@lemmy.world 6 points 1 day ago

I would have asked, "And what exactly does ChatGPT think it is?" Because there's a pretty good chance that it made up a definition that is 100% bullshit.

[–] solsangraal@lemmy.zip 128 points 2 days ago (43 children)

It only takes getting a made-up bullshit answer from chatgpt a couple of times to learn your lesson and just skip asking chatgpt anything altogether.

[–] QueenHawlSera@sh.itjust.works 41 points 2 days ago (3 children)

I stopped using it when I asked it who I was: it said I was a prolific author, then proceeded to name various books I absolutely did not write.

[–] miss_demeanour@lemmy.dbzer0.com 23 points 2 days ago

I just read "The Autobiography of QueenHawlSera"!
Have I been duped?

[–] ColeSloth@discuss.tchncs.de 19 points 2 days ago (2 children)

But chatgpt always gives such great answers on topics I know nothing at all about!

[–] papalonian@lemmy.world 13 points 2 days ago (3 children)

I was using it to blow through an online math course I'd ultimately decided I didn't need but didn't want to drop. One step of a problem I had it solve involved finding the square root of something; it spat out a number that was kind of close, but functionally unusable. I told it it had made a mistake three times, and it gave a different number each time.

When I finally gave it the right answer and asked, "are you running a calculation or just making up a number?", it said that if I logged in, it would use real-time calculations. Logged in on a different device, asked the same question, and it again made up a number, but when I pointed it out, it corrected itself on the first try. Very janky.

[–] stratoscaster@lemmy.world 11 points 2 days ago (1 children)

ChatGPT doesn't actually do calculations. It can generate code that will calculate the answer, or provide a formula, but ChatGPT itself cannot do math.
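That's the distinction: a language model predicts plausible-looking tokens, while a few lines of ordinary code compute the value deterministically. A minimal sketch of the kind of code it might hand you instead (the numbers here are just illustrative):

```python
import math

# An LLM predicts a plausible-looking number; code computes the actual one.
x = 2025
root = math.sqrt(x)  # deterministic: always 45.0
print(f"sqrt({x}) = {root}")

# Verify the result the way a calculator would, instead of trusting a guess.
assert abs(root * root - x) < 1e-9
```

Running generated code like this (or just pasting the formula into a real calculator) sidesteps the made-up-number problem entirely.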

[–] HollowNaught@lemmy.world 2 points 1 day ago

I feel like a lot of people in this community underestimate the average person's willingness to trust an AI. Over the past few months, whenever I've seen a coworker search something up, I've never seen them click through to a website for the answer. They always take what the AI summary tells them at face value.

Which is very scary

[–] SuperSaiyanSwag@lemmy.zip 1 points 1 day ago

My girlfriend gave me a mini heart attack when she told me that my favorite band had broken up. Turns out it was chatgpt making shit up; it came up with a random name for the final album, too.

[–] PumpkinEscobar@lemmy.world 69 points 2 days ago

First they came for my mansplaining and I said nothing...

[–] ICastFist@programming.dev 18 points 1 day ago (1 children)

Shorting: cutting your jeans into shorts

[–] q181c@sopuli.xyz 14 points 1 day ago (1 children)

That's jorting. You have bejorted yourself.

[–] AnUnusualRelic@lemmy.world 4 points 1 day ago (1 children)

Shorting is when you plug your fork into an electrical outlet.

[–] frostysauce@lemmy.world 1 points 1 day ago

Instructions unclear, dick stuck in jorts.

[–] Blackmist@feddit.uk 58 points 2 days ago (7 children)

Shorting: Like investing but you can lose more than you put in.

[–] kameecoding@lemmy.world 27 points 2 days ago

Investing: Theoretically infinite gain, limited loss

Shorting: Limited gain, theoretically infinite loss
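The asymmetry is easy to see in a toy payoff calculation (the prices and function names below are made up for illustration):

```python
def long_pnl(buy_price: float, current_price: float, shares: int = 1) -> float:
    """P&L from buying shares: the loss is capped at the amount invested."""
    return (current_price - buy_price) * shares

def short_pnl(sell_price: float, current_price: float, shares: int = 1) -> float:
    """P&L from shorting: the gain is capped (the price can only fall to 0),
    but the loss grows without bound as the price rises."""
    return (sell_price - current_price) * shares

# Long: worst case, the stock goes to 0 and you lose what you paid.
print(long_pnl(100, 0))      # -100

# Short: best case, the stock goes to 0 and you pocket the sale price...
print(short_pnl(100, 0))     # 100

# ...but if it moons, the loss keeps growing past your initial stake.
print(short_pnl(100, 1000))  # -900
```

A long position bottoms out at losing 100% of your money; a short position has no such floor, which is why brokers require margin for it.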

[–] Grandwolf319@sh.itjust.works 23 points 2 days ago

Shorting: investing but you take a loan to do it and you bet on things failing.

[–] Lemminary@lemmy.world 9 points 1 day ago

You're wasting computation time by saying please to your boyfriend. Altman said it, and he has a boyfriend.

[–] RidderSport@feddit.org 6 points 1 day ago

I had to explain this to my girlfriend; I still have a job.

[–] BigDanishGuy@sh.itjust.works 8 points 1 day ago

That's when you turn it around and ask her how to deal with some social situation. Oh nvm, chatgpt gave me a great idea.

[–] mad_lentil@lemmy.ca 27 points 2 days ago (4 children)

If ai can start pirating old movies then it's curtains for me brotherzz 😞

[–] andrewth09@lemmy.world 10 points 2 days ago

"Hey Alexa, add Rick and Morty to Sonarr and download the first season for me"

[–] WorldsDumbestMan@lemmy.today 2 points 1 day ago

We are all redundant now. How does it feel living what I lived through my entire life?

[–] darthelmet@lemmy.world 14 points 2 days ago (4 children)

People do know they could have just googled things and looked at the top results, right? Even with all the enshittification it's gone through, Google will still usually yield something useful for any topic that isn't super niche.

If I search "shorting", the top three results are articles from Wikipedia, Investopedia (don't know enough about it to know if it's reliable), and Charles Schwab, a source with a conflict of interest since they probably sell that service, though they'll probably at least explain why you'd want to buy it from them. As a bonus, the fourth result is an ELI5 Reddit thread, which, while probably not the most reliable source of info, is about on the same level as randomly asking your SO about a topic they're not an expert in.
