I remember when high-end GPUs were around €500.
I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol
I ran VR on one of those. Not well, but well enough.
Not surprised. Many of these high-end GPUs are bought not for gaming but for Bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.
My GPU, an RTX 2060, is getting a little long in the tooth, and I'll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. Probably need to replace my PSU while I'm at it.
My kid got the 2060, I bought an RX 6400; I don't need the hairy arms anymore.
Then again I have become old and grumpy, playing old games.
Hell, I'm still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2, what more do I need?
Don't think I'll be moving on from my 7900XTX for a long while. Quite pleased with it.
All I want is more VRAM; it can already play all the games I want.
But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn't that awesome???
Bullshitted pixels per second seem to be the new currency.
It may look smooth in videos, but 30 fps interpolated up to 120 fps will still feel like a 30 fps game.
Modern TVs do the same shit, and it both looks and feels like ass. And not good ass.
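A rough back-of-the-envelope sketch of why that is (hypothetical numbers, and assuming the interpolator has to buffer one real frame so it has two frames to blend between, as TV-style motion interpolation generally does): the game only samples your input once per real frame, so smoothing the displayed output can't shorten the input-to-photon path.

```python
# Sketch, not a benchmark: why 30 fps interpolated to 120 fps still feels like 30 fps.
# Assumption: the interpolator holds back one real frame so it can blend between two.

def frame_time_ms(fps: float) -> float:
    """Time between frames, in milliseconds."""
    return 1000.0 / fps

real_fps = 30              # frames the game engine actually renders
displayed_fps = 120        # frames shown after interpolation
buffered_real_frames = 1   # assumption: one real frame held back for blending

real_frame_ms = frame_time_ms(real_fps)          # ~33.3 ms
display_frame_ms = frame_time_ms(displayed_fps)  # ~8.3 ms

# Input is still only sampled when a real frame is produced, so responsiveness
# is governed by the real frame time, not the displayed one.
input_sampling_interval_ms = real_frame_ms

# The buffered frame adds roughly one real frame of extra input-to-photon delay.
added_latency_ms = buffered_real_frames * real_frame_ms

print(f"Real frame time:       {real_frame_ms:.1f} ms (input sampled at this rate)")
print(f"Displayed frame time:  {display_frame_ms:.1f} ms (looks smooth in videos)")
print(f"Extra buffering delay: {added_latency_ms:.1f} ms (feels worse, not better)")
```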
I don't mean to embarrass you, but you were also supposed to say "AI!"
Points with a finger and laughs
Look at that loser not using AI
Uhhh, I went from a Radeon 1090 (or whatever they're called, it's an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It's normal to not buy a GPU every year.
Ain't nobody got time (money) for that!
Oh totes. NVIDIA continuing to lie even more blatantly to their faces, driver updates bricking cards, missing GPU ROPs cutting performance, even more burn problems with a connector they knew continued to be problematic and lied about, they and their retail partners releasing very limited inventory and then serving internal scalping while also being increasingly hostile to the rest of their customers, ray-tracing performance improvements they have to exclusively push in certain games and that need the newest, most expensive hardware to actually show any benefit, false MSRP pricing, and no recourse for long-time loyal customers except a lottery in the US while the rest of the regions get screwed. Totes just that it's "too expensive", because when have gamers ever splurged on their hobby?
I'm sitting on a 3060 Ti and waiting for the 40-series prices to drop further. Ain't no universe where I would pay full price for the newest gens. I don't need to render anything for work with my PC, so a 2-3 year old GPU will do just fine.
Exact same here, and I'm not upgrading any time soon. MHWilds runs like ass no matter the card, and I'm back to playing Hoi4 in the meantime.
The progress is just not there.
I got an RX 6800 XT for €400 in May 2023, when it was already almost a three-year-old card. Fast forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and slightly better RT performance, which I couldn't care less about.
I have a 3080 and am surviving lol. Never had an issue.
Pretty wise, that's the generation before the 12VHPWR connectors started burning up.
Afaik the 2080 was the last FE with a regular PCIe power connector.
Still running a 1080; between Nvidia and Windows 11, I think I'll stay where I am.
I have a 3080 also. It's only just starting to show its age with some of these new UE5 games. A couple of weeks ago I discovered dlssg-to-fsr3, and honestly I'll take the little bit of latency for some smoother gameplay.
I'm still using my GTX 1070. There just aren't enough new high-spec games that I'm interested in to justify paying the outrageous prices that NVIDIA is demanding and that AMD follows too closely behind on. Even if there were enough games, I'd refuse to upgrade on principle; I will not reward price gouging. There are so many older/lower-spec games that I haven't played yet, and that run perfectly, that I don't really care. So many games, in fact, that I couldn't get through all of them in my lifetime.
Lezgooo 1070 crew reporting in (☞゚ヮ゚)☞
It's just because I'm not impressed; the raster performance bump for 1440p was just not worth the price jump at all. On top of that they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that, man. I'm waiting until AMD gets a little better with ray tracing and then switching to team red.
I stopped maintaining an AAA-capable rig in 2016. I've been playing indies since and haven't felt left out whatsoever.
Don't worry, you haven't missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles...
The majority, sure, but there are some gems.
Baldur's Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, ... for example.
You can always wait for a couple of years before playing them, but saying they didn't miss anything is a gross understatement.
It's funny, because often they aren't prettier. Well-optimized and well-made games from 5 or even 10 years ago often look on par with, or better than, the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and the disk size is still 10x what it was. They are just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.
Indies are great. I can play AAA titles but don't really ever. It seems like indies are where the folks with the most creativity are focusing their energy anyway.
Why not just buy a cheaper one? The x060 or x070 series is usually fine in price and runs everything at high enough settings. Flagships are for maxed-out everything at 4K+ resolutions. And in those cases, everything else is larger and more expensive as well; the monitor needs to be 4K, you need a huge-ass PSU and a large case to fit the PSU and card in, and even the power draw and energy... costs just start growing exponentially.
For me, with a 2080 already, I would have to spend much more than what my GPU was worth to get any significant upgrade. It's just not worth it.
I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it'll last me at least 10 years.
But it was expensive.
I'm still surviving on my RX 580 4GB. Limping along these days, but no way I can justify the price of a new GPU.
Unfortunately, gamers aren't the real target audience for new GPUs; it's AI bros. Even if nobody buys a 4090/5090 for gaming, they're always out of stock, since LLM enthusiasts and small companies use them for AI.
Ex-fucking-actly!
Hahaha, gamers are skipping. Yeah, they do. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know big tech went all-in on AI, disregarding whether the average Joe wants it or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.
I am tired of being treated like a fool. No more money for them.