this post was submitted on 07 Mar 2026
85 points (98.9% liked)

Announcements this week were mostly business as usual, but Apple isn't immune.

[–] NotMyOldRedditName@lemmy.world 1 points 1 week ago* (last edited 1 week ago)

512GB of RAM would let you run very good AI models; it was able to run a very large DeepSeek model locally, for example. I believe they cost around $10k? Running a full DeepSeek or a similarly sized model on anything else out there would cost a lot more.
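To see why 512GB matters, here's a rough back-of-envelope sketch of how much memory just the weights of a DeepSeek-sized model need at different quantization levels. The ~671B parameter count is an assumption for illustration (it's the commonly cited size for DeepSeek-R1/V3), not something from the comment:

```python
# Hedged estimate: memory to hold model weights only (ignores KV cache,
# activations, and runtime overhead, which add more on top).
def weight_memory_gb(params_billions: float, bits_per_param: float) -> float:
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal GB

for bits in (16, 8, 4):
    # Assumed ~671B parameters for a full DeepSeek-class model.
    print(f"{bits}-bit: {weight_memory_gb(671, bits):.1f} GB")
```

At 16-bit the weights alone blow past 512GB, which is why people run these models in 4-bit (or similar) quantizations on this class of hardware.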

The problem is that they can run inference on these models, but they aren't great for training them.

Apple keeps improving the memory bandwidth though, and future generations might be good at training as well, assuming their Metal software stack (which isn't as good as CUDA) keeps improving.
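Bandwidth matters because single-stream decoding on big models is mostly memory-bound: a crude ceiling on tokens/sec is bandwidth divided by the bytes of weights read per token. The numbers below (819 GB/s, which Apple quotes for the M3 Ultra, and ~37B active parameters for DeepSeek's mixture-of-experts) are assumptions for illustration, not measurements:

```python
# Hedged sketch: upper bound on decode speed for a bandwidth-bound model.
# Real throughput will be lower (KV cache reads, overhead, imperfect overlap).
def tokens_per_sec_upper_bound(bandwidth_gbps: float,
                               active_params_billions: float,
                               bits_per_param: float) -> float:
    # Bytes of weights that must be streamed from memory per generated token.
    bytes_per_token = active_params_billions * 1e9 * bits_per_param / 8
    return bandwidth_gbps * 1e9 / bytes_per_token

# Assumed: 819 GB/s bandwidth, ~37B active params, 4-bit weights.
print(tokens_per_sec_upper_bound(819, 37, 4))
```

That works out to a ceiling of roughly 40-ish tokens/sec, which is why these machines are usable for inference on MoE models despite lacking GPU-class compute; training is a different story because it's compute- and interconnect-bound.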

Edit: $10k before the RAM price apocalypse, anyway.