As long as AMD and Intel continue their open source drivers, I'm fine with it.
I'd rather pay for Chinese GPUs than for cloud gaming. The article, by focusing solely on Nvidia, almost acts as though nothing else exists, especially in China, where Moore Threads GPUs are starting to reach reasonable gaming performance. I don't think Europe can produce something as long as it stays neoliberal, but some weird stuff could happen with RISC-V.
Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update
It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?
Alas, for whatever reason they aren’t even picking existing alternatives to Nvidia, so Nvidia suddenly becoming unavailable would be a huge event.
It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?
Because the 2022 AMD 7900 XTX has better raw performance than the 9000 series, and I'm pissed off they didn't try to one-up themselves in the high-end market. I'm not buying a new Nvidia card, but I'm not buying a 9000 series either, because it feels like I'm paying for a sub-par GPU compared to what they're capable of.
Well, same for me TBH. I’d buy a huge Battlemage card before you could blink, but the current cards aren’t really faster than an ancient 3090.
But most people I see online want something more affordable, right in the 9000 series or Arc range, not a 384-bit GPU.
AMD didn't make a 5090 equivalent so I won't buy their mid-tier card
Is there a name for this thinking?
How is it a sub-par GPU, given that it targets a specific segment (looking at its price point, die area, memory and power envelope) with its configuration?
You're upset that they didn't aim for a halo GPU, and I can understand that, but how does that completely rule out a mid-to-high-end offering from them?
The 9000 series is reminiscent of Navi 10 versus the Vega 10 GPUs (the Vega 56, Vega 64, even the Radeon VII): equivalent performance for less power and hardware.
For accelerated rendering etc., CUDA is the standard, and because of it, Nvidia. It's like that for a lot of other niche areas. Accelerated encoding? Nvidia, and CUDA again. Yes, AMD and Intel can generally do all that these days as well, but that's a much more recent thing. If all you want to do is game, sure, that's not a big issue.
But if you want to do anything more than gaming, say video editing and rendering timelines, Nvidia has been the standard, with a lot of inertia. To give an example of how badly AMD missed the boat: I have been using accelerated rendering on my GT 750 in Blender for a decade now. That card came out in early 2014. The first AMD card capable of that came out one month before the end of 2020. Nearly a seven-year difference! I'm looking at a recent Intel Arc or Battlemage card, or a 6xxx-series AMD, ATM, because I run BSD/Linux and Nvidia has traditionally been a necessary PITA. But with both of them now supporting my workflows, Nvidia is much less necessary. Others will eventually leave too, as long as AMD and Intel don't fuck it up for themselves.
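For anyone curious what flipping Blender onto a non-Nvidia backend actually looks like, here's a rough sketch of enabling whichever Cycles GPU backend is available, run from Blender's own Python console. It assumes a recent Blender build; the exact attribute names can shift between versions.

```python
# Sketch: enable GPU rendering in Blender's Cycles from the Python console.
# Assumes a recent Blender (3.x/4.x); attribute names may differ by version.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences

# Try the vendor backends in a rough order of preference: OptiX/CUDA (Nvidia),
# HIP (AMD), oneAPI (Intel Arc/Battlemage).
for backend in ("OPTIX", "CUDA", "HIP", "ONEAPI"):
    try:
        prefs.compute_device_type = backend
        break
    except TypeError:
        continue  # backend not available in this Blender build

prefs.get_devices()            # refresh the detected device list
for dev in prefs.devices:
    dev.use = True             # enable every device Cycles found

bpy.context.scene.cycles.device = "GPU"
print("Cycles backend:", prefs.compute_device_type)
```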
Yeah, I mean, you're preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state.
That being said, much of what you describe is just software obstinacy. AMD (for example) has had hardware encoding since early 2012, with the 7970. Intel Quick Sync has long been standard on laptops. It's just a few stupid proprietary programs that never bothered to support them.
CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender’s engines. But there’s no reason (say) Plex shouldn’t support AMD, or older editing programs that use OpenGL anyway.
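As a concrete illustration of how little vendor lock-in the encoding side actually requires, here's a hedged sketch that probes an ffmpeg build for whichever hardware H.264 encoder it exposes (NVENC, Quick Sync, VAAPI or AMF) instead of assuming Nvidia. The encoder names are standard ffmpeg ones, but whether each actually works still depends on drivers and how that particular ffmpeg was compiled.

```python
# Sketch: pick whichever hardware H.264 encoder this ffmpeg build exposes,
# instead of hard-coding NVENC. The encoder names are standard ffmpeg ones,
# but availability still depends on drivers and how ffmpeg was compiled.
import shutil
import subprocess

CANDIDATES = ["h264_nvenc", "h264_qsv", "h264_vaapi", "h264_amf"]  # Nvidia, Intel, VAAPI, AMD

def pick_hw_encoder():
    if shutil.which("ffmpeg") is None:
        return None
    listed = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True,
    ).stdout
    for encoder in CANDIDATES:
        if encoder in listed:
            return encoder  # listed by ffmpeg, though it may still fail at runtime
    return None

if __name__ == "__main__":
    encoder = pick_hw_encoder() or "libx264"  # fall back to software encoding
    print("Would encode with:", encoder)
    # e.g. ffmpeg -i input.mkv -c:v <encoder> output.mp4
    # (VAAPI additionally needs -vaapi_device and a hwupload filter)
```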
Yes, the software will get there long before many people's hearts and minds. As you said, it already is there in many cases. The inertia Nvidia got by being early is why they're so dominant now. But I think Nvidia's focus on crypto, and now data center AI, is set to hurt them long term. Only time will tell, and they're technically swimming in it ATM. But I'm getting out now.
CUDA is actually pretty cool, especially in the early days when there was nothing like it, and Intel's and AMD's attempts at alternatives have been as mixed as their corporate dysfunction.
And Nvidia has long had a focus on other spaces, like VR, AR, dataset generation, robotics, “virtual worlds” and such. If every single LLM thing disappeared overnight in a puff of smoke, they'd be fine; a lot of their efforts would transition to other spaces.
Not that I’m an apologist for them being total jerks, but I don’t want to act like CUDA isn’t useful, either.
Uh, then AMD wins the PC GPU wars, thanks to Nvidia's unexpected resignation, and Intel becomes the new AMD in that market segment.
And some Chinese companies emerge as new PC GPU manufacturers, though exactly what market strategy they would specialize in or pursue is hard to predict.
Anybody who either just wants a local-compute gaming PC, or doesn't have the best internet access / has data caps... goes with AMD/Intel; 'casuals' go with renting their remote game rendering.
The economic/cultural dynamics of pc gaming begin to resemble buying a new/used car vs leasing one, both get more financialized in their own ways.
... Why does there need to be a whole article about this?
- Nvidia abandons x86 desktop gamers
- The only hardware that gamers own are ARM handhelds
- Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
- AI bubble pops
- Nvidia tries to regain x86 desktop gamers
- Gamers are almost entirely on ARM
- Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
Nvidia does not care about the ISA of the CPU at all. They don't make it after all. Also not clear how they would kill x86. If they leave the consumer GPU market they cede it to AMD and Intel.
Nvidia does not care about the ISA of the CPU at all.
That’s kinda my point. They’re stuck communicating over PCI-E instead of being a first-class co-processor over AMBA.
Nvidia has drivers for ARM. They're not in as good a shape as the x86 ones are, but I don't think it's that big of a roadblock.
Sure, but do you need a discrete video card if you’re gaming on an ARM SoC? And we’ve seen from the struggles of x86 iGPUs that graphics APIs pretty much have to choose whether they’re going to optimize for dedicated VRAM or shared memory, cuz it has inescapable implications for how you structure a game engine. ARM APIs will probably continue optimizing for shared memory, so PCI-E GPUs will always be second-class citizens.
You need a discrete video card on ARM exactly as much as you do on x86.
The GPU is independent of the CPU architecture, so if you don't like x86 iGPU performance expect to not like ARM iGPU performance either.
Of course you can dump a full desktop dGPU into the same package as your CPU, like Apple did with the M4 Max, but again this can be done on x86 as well. None of that is dependent on the ISA.
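To make the shared-versus-dedicated-memory point above concrete, here's a purely hypothetical sketch of the fork a renderer has to take depending on which memory model it targets; none of these names come from a real graphics API.

```python
# Hypothetical sketch of the engine-level fork discussed above: with dedicated
# VRAM you stage and copy; with unified/shared memory you can write in place.
# None of these names come from a real graphics API.
from dataclasses import dataclass

@dataclass
class GpuInfo:
    unified_memory: bool   # iGPU / Apple-style SoC vs. a discrete card over PCIe

class Renderer:
    def __init__(self, gpu: GpuInfo):
        self.gpu = gpu

    def upload_mesh(self, vertices: bytes) -> str:
        if self.gpu.unified_memory:
            # Shared memory: the GPU reads the same physical pages the CPU wrote,
            # so the "upload" is basically a pointer hand-off plus a cache flush.
            return f"mapped {len(vertices)} bytes in place"
        # Dedicated VRAM: write into a host-visible staging buffer, then record
        # a copy across the PCIe bus into device-local memory.
        return f"staged + copied {len(vertices)} bytes over PCIe"

if __name__ == "__main__":
    print(Renderer(GpuInfo(unified_memory=True)).upload_mesh(b"\x00" * 1024))
    print(Renderer(GpuInfo(unified_memory=False)).upload_mesh(b"\x00" * 1024))
```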
Yes, even for applications other than gaming. There are legitimate mad lads out there running Steam games and LLMs with discrete video cards on Raspberry Pis. Not to mention there are non-SoC ARM machines, and SoC Intel machines.
Sometimes getting the integrated graphics on any of these SoCs working is a much harder prospect than getting a discrete card working.
With China working hard to catch up on chip production, it's only a matter of time before we start seeing attractively priced Chinese-made GPUs on the market. No idea how long it will take, though.
What makes you think Chinese firms won't also jump on the AI bandwagon?
Someone with an actual CS/engineering background, feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best thing for the job when the world went crazy; they were never designed specifically for these workloads.
If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.
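As a rough illustration of why inference hardware chases memory bandwidth rather than graphics features, here's a back-of-the-envelope calculation using the common rule of thumb that each generated token has to stream the model weights once; the numbers are illustrative, not benchmarks.

```python
# Back-of-the-envelope: LLM decoding is mostly memory-bandwidth-bound, which is
# why purpose-built inference chips chase bandwidth and on-chip SRAM rather than
# graphics features. Rule of thumb: each generated token streams the weights
# once, so tokens/s <= bandwidth / model size. Numbers are illustrative.
def max_tokens_per_second(params_billion, bytes_per_param, bandwidth_gb_per_s):
    model_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_per_s * 1e9 / model_bytes

# A 70B-parameter model at ~1 byte/param on ~1 TB/s of memory bandwidth
# (ballpark for a high-end gaming GPU) tops out around 14 tokens/s:
print(round(max_tokens_per_second(70, 1.0, 1000), 1))
```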
You've got a good point. I wouldn't be surprised if Nvidia was working on a dedicated platform for AI to cover this exact issue. Then again, I would be equally unsurprised if they just didn't care and didn't mind gutting the home gaming market for short-term profit.
All I have to say is: thank God and good riddance. May your stock price collapse.
Seriously. Why would I care that a billion dollar corporation who exploited the market to maximize their revenue is leaving for a fad market?
"Bye bitch."
PC gaming itself will hardly change, because AMD cards work just fucking fine. They've only ever been a little bit behind on the high end. They've routinely been the better value for money, and offered a much lower low end. If they don't have to keep chasing the incomparable advantages Nvidia pulls out of their ass, maybe they can finally get serious about heterogeneous compute.
Or hey, maybe Nvidia ditching us would mean AMD finds the testicular fortitude to clone CUDA already, so we can end this farce of proprietary computation for your own god-damn code. Making any PC component single-vendor should've seen Nvidia chopped in half, long before this stupid bubble.
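For what it's worth, AMD's ROCm builds of PyTorch already go partway there by reusing the torch.cuda namespace, so at the framework level the same script runs on either vendor's card. A small sketch, assuming a PyTorch install built for whichever GPU is present:

```python
# The same script runs on an Nvidia (CUDA) or AMD (ROCm) build of PyTorch:
# ROCm builds expose the GPU through the torch.cuda namespace, and set
# torch.version.hip instead of torch.version.cuda.
import torch

if torch.cuda.is_available():
    backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
    device = torch.device("cuda")
else:
    backend, device = "CPU", torch.device("cpu")

x = torch.randn(1024, 1024, device=device)
y = x @ x  # same matmul, whichever vendor's silicon is underneath
print(backend, y.shape)
```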
Meanwhile:
Cloud gaming isn't real.
Any time after 1977, the idea that consumers would buy half a computer and phone in to a mainframe was a joke. The up-front savings were negligible and the difference in capabilities did not matter. All you missed out on was your dungeon-crawlers being multiplayer, and mainframe operators kept trying to delete those programs anyway. Once home internet became commonplace, even that difference vanished.
As desktop prices rose and video encoding sped up, people kept selling the idea you'll buy a dumb screen and pay to play games somewhere else. You could even use your phone! Well... nowadays your phone can run Unreal 5. And a PS5 costs as much as my dirt-cheap eMachines from the AOL era, before inflation. That console will do raytracing, except games don't use it much, because it doesn't actually look better than how hard we've cheated with rasterization. So what the fuck is a datacenter going to offer, with 50ms of lag and compression artifacts? Who expects it's going to be cheaper, as we all juggle five subscriptions for streaming video?
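Here's a rough latency-budget sketch of that comparison; every number is an illustrative guess rather than a measurement, but it shows how the cloud path stacks up against a 16.7 ms frame.

```python
# Rough latency budget: cloud streaming vs. local rendering. Every number is an
# illustrative guess, not a measurement; the point is how the pieces stack up
# against a 16.7 ms frame at 60 fps.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

local = {"input + render + display": 2 * FRAME_MS}  # a couple of frames end to end
cloud = {
    "input uplink": 10,     # client -> datacenter
    "render": FRAME_MS,     # one server-side frame
    "encode": 5,            # hardware video encode
    "downlink": 10,         # datacenter -> client
    "decode + display": 8,  # client-side decode and scanout
}

print(f"local: ~{sum(local.values()):.0f} ms")
print(f"cloud: ~{sum(cloud.values()):.0f} ms")
```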
AFAIK, the H100 and up (Nvidia's best-selling data center GPUs) can technically game, but they're missing so many ROPs that they're really bad at it. There will be no repurposing all those AI datacenters as cloud gaming farms.
Games and gaming have fully become like Hollywood and Silicon Valley, and I expect zero good things from them at this point. As with movies and music, most of the good stuff will now come from individuals and smaller enterprises. The fact is, today's GPUs have enough power to do extraordinary things. Hardware moves so fast these days that no one squeezes performance out of anything the way they used to have to. And not every game needs photorealistic, ray-traced graphics, so these GPUs will be fine for many gamers as long as they remain supported through drivers.
They are in cahoots with the RAM cartels to push gaming onto their cloud services so that competitors like AMD don't just pick them up. Trying to make everything into a service is just a side benefit, although I'm sure they realize 16-bit SNES games are still fun and that people will just be driven to less powerful entertainment platforms.
Good riddance; may the bubble burst and all that IP become available from a licenser that charges low license fees, or even for free!