this post was submitted on 08 Jun 2025
104 points (92.6% liked)

[–] QuarterSwede@lemmy.world 0 points 11 hours ago (1 children)

You aren’t wrong, but in this case nothing needs to be proven by a 3rd party, since anyone recently working in programming knows how LLMs work. It’s factual.

[–] pennomi@lemmy.world 4 points 10 hours ago (1 children)

LLMs are famously NOT understood, even by the scientists creating them. We’re still learning how they process information.

Moreover, we most definitely don’t know how human intelligence works, or how close or far we are from replicating it. I suspect we’ll be really disappointed by the human mind once we figure out the fundamentals of intelligence.

[–] QuarterSwede@lemmy.world -1 points 9 hours ago

They most definitely are understood. The basics of what they’re doing don’t change. Garbage in, garbage out.