this post was submitted on 08 Mar 2025
16 points (83.3% liked)

top 6 comments
[–] [email protected] 5 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

I frankly think Anthropic and OpenAI would struggle to make a hallucination-free AI too. I don’t understand why Apple thinks it will be able to fix hallucinations.

[–] [email protected] 10 points 3 weeks ago (1 children)

I don’t even know if it’s theoretically possible to make a hallucination-free LLM. Hallucination is kind of its basic operating principle.

[–] [email protected] 2 points 3 weeks ago (1 children)

People are misled by the name. It’s not making stuff up; it’s just less accurate.

[–] [email protected] 1 points 3 weeks ago (1 children)

Less accurate as in misleading and outright false.

[–] [email protected] 2 points 3 weeks ago

It always predicts the next word based on its tokenisation, its training data, and its context handling. So accuracy is all there is.
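The point above can be sketched in a few lines. This is a toy illustration, not any real model's code: the vocabulary and the logit scores are invented for the example. An LLM scores every token in its vocabulary and emits one of them; there is no separate "true/false" check, only more or less probable continuations.

```python
import math

# Hypothetical vocabulary and raw scores (logits) a model might assign
# as the next token after "The capital of France is ...".
vocab = ["Paris", "London", "Berlin", "purple"]
logits = [4.2, 2.1, 1.9, -3.0]  # invented numbers for illustration

def softmax(xs):
    # Convert raw scores into a probability distribution.
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
next_token = vocab[probs.index(max(probs))]

# The model always emits *some* token. "Accuracy" just means how often
# the highest-probability token happens to match reality; a confident
# wrong answer (a "hallucination") comes out of the exact same step.
print(next_token)
```

Sampling instead of taking the argmax changes which token comes out, but not the underlying mechanism the comment describes.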

[–] [email protected] 5 points 3 weeks ago

It's a travesty. The whole LLM "AI" push is a fraud. There's nothing approaching actual intelligence. It's simply statistical word strings.