this post was submitted on 26 Mar 2025

PC Gaming

[–] HK65@sopuli.xyz 15 points 1 year ago (1 children)

"enabled with FSR 4 technology"

I'm pretty sure we'll have a separate corpo-English by 2100 that is not intelligible by normal people.

[–] Obi@sopuli.xyz 2 points 1 year ago (1 children)

The only reason I opened the article was to find out what FSR meant. They never actually spell it out; you can tell from context that it's AI upscaling, but I guess they just assume you know the acronym...

[–] jonesy@aussie.zone 3 points 1 year ago

It stands for FidelityFX Super Resolution.

[–] Suppoze@beehaw.org 12 points 1 year ago (1 children)

You know what would gather even more interest? Games not running like shit at native resolution.

[–] ZeroHora@lemmy.ml 4 points 1 year ago (1 children)
[–] Viri4thus@feddit.org 5 points 1 year ago (1 children)

I find it amusing that the company promoting brute-force calculation of ray trajectories rather than optimised code (a competition defeat device) calls native rendering "brute force". Meanwhile, some of the best games of the past decade run on potato-powered chips.

[–] ZeroHora@lemmy.ml 3 points 1 year ago (1 children)

I can't get over this bullshit. Nvidia could become the best company in the world, with the best products at the best prices, but I'll never forgive this level of bullshit. Fucking brute-force rendering.

[–] Viri4thus@feddit.org 3 points 1 year ago

Absolutely. Jensen is so rich that even if he wanted to spend his fortune, he couldn't within a typical human lifespan.

All that success came because in the late noughties and early 10s NVIDIA (at least in the EU) were giving away free GPUs to universities and awarding grants on the condition that researchers would use CUDA. Same with developers: they had two engineering teams in Eastern Europe that served as outsourcing for code to cheapen game development, as a way to promote NVIDIA's software "optimisations". Most TWIMTBP games of that era, Bryan Rizzo's time, have some sort of competing-HW defeat device. They were so successful that their modern GPUs, Blackwell, can barely run some of their old games...

[–] the_q@lemm.ee 10 points 1 year ago (2 children)

Nvidia creates a problem, then creates the solution and charges a premium for it. The industry smells money and starts including said problem in games. AMD gets left behind and tries to play catch-up, offering open-source implementations of certain technologies to provide a solution as well. Gamers still buy Nvidia.

[–] FreeBooteR69@lemmy.ca 6 points 1 year ago

Neither of these two is our friend, though AMD is much nicer to the open-source world. I tend to buy AMD because at least the hardware I've bought has good value and tremendous Linux support.

[–] Dasus@lemmy.world 1 points 1 year ago

I probably will, yeah.

Or I was going to. I would've got a 5070 Ti, but I didn't have any luck with stock when it came out, then drank most of the money, so I thought I'd give it a bit of time.

I'm gonna wait a few months to see how this turns out after the 5060 Ti comes out and whatnot.

[–] Tywele@lemmy.dbzer0.com 9 points 1 year ago (1 children)

Good. FSR is finally able to compete with DLSS.

[–] moody@lemmings.world -1 points 1 year ago (1 children)

Is it? I haven't used an Nvidia GPU since the GTX series, but my understanding was that DLSS was very effective. Meanwhile, the artifacting on FSR bothers the crap out of me.

[–] ShinkanTrain@lemmy.ml 4 points 1 year ago* (last edited 1 year ago) (1 children)

Yes. FSR4 is the first version that uses dedicated hardware, like DLSS does. Consensus seems to be that it's on the same level as DLSS 3 (the CNN model) but heavier to run, which is pretty great for a first attempt.

[–] moody@lemmings.world 1 points 1 year ago

I see. It's unfortunate that it requires dedicated hardware, but I guess it makes sense when the main competitor already has that.

[–] LaMouette@jlai.lu 4 points 1 year ago (1 children)

Lol a fully dedicated tech for things we absolutely won't notice.

[–] ShinkanTrain@lemmy.ml 5 points 1 year ago (1 children)

FSR4 is absolutely noticeable. I can't tell the difference between native 4K and 1440p scaled to 4K with FSR4. That's a giant performance boost.
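To put a rough number on that boost: rendering at 1440p and upscaling to 4K means the GPU shades far fewer pixels per frame. This is a back-of-the-envelope sketch in pixel counts only; it ignores the upscaler's own runtime cost, which varies by implementation.

```python
# Pixel-count math for upscaled rendering: how many pixels per frame
# does the GPU shade at the internal (render) resolution versus native?

def pixels(width: int, height: int) -> int:
    """Total pixels per frame at a given resolution."""
    return width * height

native_4k = pixels(3840, 2160)      # 8,294,400 pixels
render_1440p = pixels(2560, 1440)   # 3,686,400 pixels

ratio = native_4k / render_1440p
print(f"Native 4K shades {ratio:.2f}x the pixels of a 1440p render")
# → Native 4K shades 2.25x the pixels of a 1440p render
```

So the shading workload drops by more than half, which is why "looks like native" upscaling from 1440p is such a big win, as long as the upscale pass itself stays cheap.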

[–] ObtuseDoorFrame@lemm.ee 1 points 1 year ago

I've even been having trouble telling the difference between Super Resolution 4 and native. Driver level upscaling this good is a game changer, I might not even have to deal with optiscaler.

I'd just like to see games optimized better so FSR/DLSS isn't needed.

[–] leshy@r.nf 2 points 1 year ago* (last edited 1 year ago)

"AMD is fairly very aware of this"

PCGamer needs to edit this stuff.