this post was submitted on 17 Sep 2025
25 points (100.0% liked)

Hardware

689 readers

A community for news and discussion about the hardware side of technology. Questions and support posts are also welcome, so long as they are relevant to hardware and interesting technologies therein.


Rules

1. English only: Titles and associated content have to be in English.
2. Use original link: The post URL should be the original link to the article (even if paywalled), with archived copies left in the body. This helps avoid duplicate posts when cross-posting.
3. Respectful communication: All communication has to be respectful of differing opinions, viewpoints, and experiences.
4. Inclusivity: Everyone is welcome here regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.
5. No ad hominem attacks: Personal attacks of any kind are expressly forbidden. If you can't argue your position without attacking a person's character, you have already lost the argument.
6. No off-topic tangents: Stay on topic. Keep it relevant.

founded 1 year ago
MODERATORS

Enough power to feed four GPUs and more...

brucethemoose@lemmy.world | 1 point | 3 months ago (last edited 3 months ago)

4x or more 3090s is a pretty popular homelab setup, heh. I've seen janky ones with like 7 mixed 3090s/3060s. This actually makes some sense.
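Since the post is about feeding four GPUs, a back-of-the-envelope power budget helps show why these rigs need serious PSUs. A rough sketch, assuming the published board-power specs (~350 W for an RTX 3090, ~170 W for an RTX 3060), a hypothetical system overhead figure, and a headroom multiplier for transient spikes:

```python
# Back-of-the-envelope PSU budget for a multi-GPU homelab box.
# TDP figures are the published board-power specs; overhead and
# headroom are illustrative assumptions, not measured values.
TDP_W = {"RTX 3090": 350, "RTX 3060": 170}

def psu_budget_w(gpus, system_overhead_w=250, headroom=1.2):
    """Total draw for the listed GPUs plus CPU/board/drive overhead,
    scaled by a safety factor to absorb transient power spikes."""
    gpu_draw = sum(TDP_W[g] for g in gpus)
    return (gpu_draw + system_overhead_w) * headroom

# A 4x 3090 rig: (4 * 350 + 250) * 1.2 = 1980 W of PSU capacity.
print(psu_budget_w(["RTX 3090"] * 4))
```

Under these assumptions a quad-3090 box wants roughly 2 kW of supply, which is why multi-GPU builds often run dual PSUs or power-limit the cards.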

It's absolutely bonkers that AMD/Intel won't just sell 'normal' folks tons of VRAM on GPUs instead, but, well... they don't want money I guess. And of course Nvidia's not going to do it.