this post was submitted on 07 Mar 2026
85 points (98.9% liked)

Hardware


Announcements this week were mostly business as usual, but Apple isn't immune.

[–] cerebralhawks@lemmy.dbzer0.com 9 points 1 week ago (4 children)

It had half a terabyte of RAM, and they're not saying what it cost. It could also be that demand for that model was so low, they stopped listing it. If you needed that configuration, you could probably special-order it.

No consumer needs more than 16/32GB of RAM on a Mac. Some gamers are pushing 64GB, but I haven't heard of anything but fringe benefits from doing so. The people who need 512GB of RAM are doing high-end video production, think Pixar-type applications: making extremely detailed models and rendering them in 4K for a movie. That kind of thing.

[–] pearcake@sh.itjust.works 17 points 1 week ago* (last edited 1 week ago) (2 children)

Macs have unified RAM and a special MLX framework for AI purposes tailored to their M chips, which makes it possible to run full-sized ChatGPT/Gemini/DeepSeek-level LLMs locally without serious limitations. You can also join 4 Macs using the lightning port and make a cluster for LLM/image/video-generation models. It's real bang-for-buck stuff compared to other platforms for personal use. So they disappeared because they mostly sold out, even after the price increase.
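The "fits in unified memory" claim is easy to sanity-check with back-of-envelope arithmetic: for local inference, the weights dominate the footprint, so you can estimate whether a model fits from its parameter count and quantization width. A minimal sketch (illustrative numbers, not Apple or DeepSeek specs):

```python
def model_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory needed to hold the model weights alone
    (ignores KV cache and activation overhead)."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1e9

# A ~70B model at 8-bit quantization: ~70 GB of weights
print(model_memory_gb(70, 8))   # 70.0

# A ~671B model at 4-bit quantization: ~335 GB of weights,
# which fits in a 512GB unified-memory machine with headroom
print(model_memory_gb(671, 4))  # 335.5
```

The same arithmetic shows why clustering helps: four 512GB machines give roughly 2TB of aggregate memory, enough for even lightly quantized frontier-scale models, with the interconnect bandwidth as the new bottleneck.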

[–] cerebralhawks@lemmy.dbzer0.com 8 points 1 week ago (2 children)

Macs don't have Lightning ports. I don't think they ever did. Mac keyboards and mice used to.

...I think you mean Thunderbolt, which looks like USB-C and carries something like 40Gbps bandwidth.

[–] Ankkuli@sh.itjust.works 1 points 3 days ago

Thunderbolt doesn’t just look like USB-C, it is USB-C.

[–] pearcake@sh.itjust.works 3 points 1 week ago

Yeah, Thunderbolt, I always forget which is which

[–] artyom@piefed.social 6 points 1 week ago

Such a great deal on a slop-generation machine!

[–] Whitebrow@lemmy.world 3 points 1 week ago* (last edited 1 week ago)

It was $4,000 for the RAM alone (the system itself was extra)

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 3 points 1 week ago (1 children)

The Mac Studio isn’t aimed at normal consumers. It’s a Mac Pro without the expansion slots.

[–] cerebralhawks@lemmy.dbzer0.com 2 points 1 week ago (2 children)

A case could be made for the base Studio for budding photographers, videographers, artists, and the like.

There's a WIDE gulf between the base model with 16 or 32GB of RAM, and one with 512GB RAM.

Sure, but the Mac Pro has always been like that. There was a wide gulf between the 32GB base Mac Pro and the 1.5TB version too. They've always covered everything from high-end amateur to high-end pro.

[–] 9point6@lemmy.world 1 points 1 week ago (1 children)

I think that's the point, though: they're catering to the kinds of producer/professional that you list AND people who might want to do something that needs all that memory (there are plenty of scientific and AI workloads that could easily chew through it).

Apple is also in a situation where, as a business, they want to grow OS market share, but without the PC platform's advantage that anyone can build a machine to meet their exact needs. Even if they were hypothetically incredibly successful at that goal with only consumer-targeted hardware available, they'd be leaving market share on the table by not providing hardware for the more niche use cases.

Now, they could have offered loads of options, but it's Apple, so they do their usual thing and have basically this one configurable model to serve all the non-average consumers out there who want a workstation.

[–] cerebralhawks@lemmy.dbzer0.com 1 points 1 week ago (1 children)

I think anyone who needs 512GB of RAM will be able to get it, regardless of who they cater to.

Like me and (I assume) you (no disrespect intended), we see the configurations they offer most people. If we go to the car dealership or the computer shop asking for something unusual, they'll say "well, you can get that customised somewhere else." But if, say, Tom Cruise is buying a car, wherever he goes, he'll get the options he wants. Or a better example: maybe Tim Cook wants a green Mac Studio, or a red one, or one with a Pride flag on it; they're gonna make it for him. People with the money and means can always get what they want. They're not gonna say no, if that makes any sense. Apple is just not advertising it. If someone says 256GB of RAM isn't enough and they need that 512GB, I'm quite sure Apple isn't gonna say no.

[–] rabidhamster@lemmy.dbzer0.com 1 points 1 week ago

Yeah, but it was the only way to get half a terabyte of *VRAM* for under $10k. That was its niche.

[–] NotMyOldRedditName@lemmy.world 1 points 1 week ago* (last edited 1 week ago)

512GB of RAM lets you run very good AI models; it could self-host a very large DeepSeek, for example. I believe they cost around $10k? Anything else out there that can run a full DeepSeek or similarly sized model would cost a lot more.

The problem is they can run inference on models, but they aren't great for training them.

Apple keeps improving the memory bandwidth, though, and future generations might be good at training as well, assuming their Metal software (which isn't yet as good as CUDA) keeps improving.

Edit: $10k before the RAM price apocalypse, anyway.