this post was submitted on 23 Apr 2025
57 points (100.0% liked)

Hardware


All things related to technology hardware, with a focus on computing hardware.


[–] [email protected] 10 points 3 days ago* (last edited 3 days ago) (2 children)

TBH, why would anybody even upgrade from the 30 series or an equivalent AMD card? At this point performance gains have almost plateaued.

[–] [email protected] 5 points 3 days ago

I see zero reason to upgrade from my 3080, and I have a 1440p primary monitor (it only goes up to 75 Hz, but that's fine with me).

[–] [email protected] 0 points 3 days ago

It's the performance gains per watt and per dollar that have plateaued. Unless you have a really cool distributed rendering setup spanning multiple 3090s, total absolute performance has still increased very significantly. I'm saying 3090s because that's Ampere's effective consumer top of the line. There are clearly plenty of upgrade paths from a 3060, even if the one you find optimal isn't a 5060.

Sometimes people just say things, man.
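To make the per-watt / per-dollar vs. absolute-performance distinction in the comment above concrete, here is a minimal sketch. The perf, watts, and price figures in it are purely hypothetical placeholders for illustration, not real benchmark results or prices.

```python
# Sketch of the efficiency-vs-absolute-performance comparison discussed above.
# All numbers are hypothetical placeholders, not measured benchmarks or prices.

def efficiency(perf: float, watts: float, price: float) -> tuple[float, float]:
    """Return (performance per watt, performance per dollar)."""
    return perf / watts, perf / price

# Hypothetical "last-gen flagship" vs. "new flagship" figures, illustration only.
old_card = {"perf": 100.0, "watts": 350.0, "price": 1500.0}
new_card = {"perf": 160.0, "watts": 575.0, "price": 2400.0}

old_pw, old_pd = efficiency(**old_card)
new_pw, new_pd = efficiency(**new_card)

# With numbers shaped like these, absolute performance rises a lot while
# perf/watt and perf/dollar stay roughly flat -- the shape of the argument above.
print(f"absolute performance: {new_card['perf'] / old_card['perf']:.2f}x")
print(f"perf per watt:        {new_pw / old_pw:.2f}x")
print(f"perf per dollar:      {new_pd / old_pd:.2f}x")
```

With placeholder inputs like these, the script prints a large absolute uplift but roughly 1.0x on both efficiency ratios, which is the distinction the comment is drawing between "gains have plateaued" and "top-end performance has stalled."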