this post was submitted on 19 Nov 2025
128 points (99.2% liked)

PC Gaming

For PC gaming news and discussion. PCGamingWiki

top 22 comments
[–] headset@lemmy.world 5 points 19 hours ago

Oh look! Another "shortage"

[–] MalReynolds@piefed.social 59 points 1 day ago* (last edited 1 day ago) (1 children)

Didn't OpenAI just sign a deal with Samsung and SK Hynix to consume 40% of the world's DRAM wafer supply for the foreseeable future?

This is a natural consequence of sucking up all the oxygen.

Fucking AI bubble...

[–] Grimy@lemmy.world 14 points 1 day ago

If the bubble pops though, it means the market is going to get flooded.

[–] rafoix@lemmy.zip 40 points 1 day ago

Oh good, another shortage. Just what we all need right now.

[–] Washedupcynic@lemmy.ca 13 points 1 day ago (1 children)

I had a knee-jerk reaction to the headline, and I want to clarify for the TL;DR people. They are talking about stopping manufacturing lower- and mid-range graphics cards. They aren't going to roll out an update that would brick cards that have already been sold to customers (that was my knee-jerk assumption, because manufacturers have done shit like that before and I assume the worst).

[–] IncogCyberSpaceUser@piefed.social 3 points 1 day ago (1 children)

When did manufacturers remotely destroy people's GPUs?

[–] Washedupcynic@lemmy.ca 3 points 1 day ago

I should have clarified better. To date, there are no instances I know of where GPU manufacturers have bricked people's GPUs. When I said manufacturers, I meant it more broadly, in terms of gadget/tech manufacturing.

Sony removed the Other OS feature from the PS3 after units were sold, which killed online functionality for those who wanted to keep it, since the only way to keep that feature was to never update the console or take it online. There have been instances of manufacturers bricking devices, in this case a robot vacuum, when the user discovered it was reporting location data about his house back to the company; the user stopped that data collection, and the company remotely bricked the device. Spotify bricked a piece of hardware, Car Thing, after people had already paid for it, with a limited one-month refund window from when it was discontinued. Belkin is bricking some of its smart home products on January 31, 2026. In 2019, Sonos launched a trade-up scheme that offered existing owners 30% off the cost of a new speaker, but owners had to activate "Recycle Mode" on their existing Sonos speaker, making it permanently unusable, even if there was nothing wrong with it.

My brain went "kill = brick", then I thought about all the shit I've seen where tech manufacturers have fucked over their customers in the past, and just assumed the worst from the start before reading the article.

[–] humanspiral@lemmy.ca 11 points 1 day ago (2 children)

This is extremely serious for the economic bubble.

Orders for datacenter AI chips exceed supply, and more high-end/other memory per TSMC wafer is a further nightmare. This likely means higher prices per token for datacenter buyers, higher prices for users/model renters, and much slower demand growth and AI progress. It also means long delays for datacenters, and a bigger black market (China, above-MSRP diversions from contracted deliveries).

I'm not sure if it affects phone/LPDDR soldered memory, but TSMC is going to charge more for phone chips too. This could cause the whole consumer/business computing market to collapse. A return of older-generation designs on underused process nodes will give little reason to upgrade, while still overcharging. This could be an opening for China to export competing products that were not viable at low/reasonable RAM/TSMC prices and availability: even if China has difficulty achieving the best yields, it's still profitable to invest and expand aggressively, which discourages the US/western colonies from investing.

This race to give the US Skynet, for stronger political control/social credit/surveillance of Americans, can inflate a bubble in everything else and accelerate financial collapse, all while making the goal impossible to achieve and forcing China to become stronger and more resilient, with a greater share of global computing supply.

[–] frezik@lemmy.blahaj.zone 9 points 1 day ago (2 children)

It affects all DRAM across the board. The fundamentals of DRAM haven't changed in decades, and everything comes from three companies.

Good thing Microsoft forced people to throw away a bunch of perfectly functional PCs. This was the perfect time for everyone to have to buy new ones.

[–] muusemuuse@sh.itjust.works 2 points 1 day ago

They didn't force anyone to throw away PCs. There are alternative options. People chose to be angry instead of doing anything about it.

It’s the shit people bitch about Apple for doing, but it’s somehow adorable when Microsoft does it.

[–] Washedupcynic@lemmy.ca 1 points 1 day ago

Linux has entered the chat. I installed Linux on my older machine that's ~7 years old. My new machine has Windows 11 and I fucking hate it.

[–] jaykrown@lemmy.world 4 points 1 day ago (1 children)

Absolutely. Something you missed, though: at the same time, AI models are becoming more efficient and cheaper to run, so these data centers are going to be a massive waste of resources in a year.

[–] humanspiral@lemmy.ca 1 points 19 hours ago* (last edited 18 hours ago)

If TSMC only makes datacenter chips from now on, then "we" are shut out from the huge privacy (and fine-tuning/specialization) gains offered by small, efficient, cheap-to-run models (and from playing games on new hardware). US datacenters will serve the US empire/political establishment, both with the government as the main LLM customer and for data collection/Palantir ontology/social credit scores on every American.

I suspect that better datacenter chips won't actually reduce their cost, due to supply limitations, but even for small efficient models, personal hardware has a long payback period compared to per-token "rental" cloud charges. It is unlikely that all of the datacenter chip buyers will have enough non-government customers to use them all, so it's either a bailout, or bankruptcy followed by megatech buying the datacenters for cheap, followed by a bailout in the form of government revenue for big tech's global/citizen-control applications.
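A rough back-of-envelope sketch of that payback-period point (every number here is a made-up assumption, purely to illustrate the shape of the comparison, not actual pricing):

```python
# Hypothetical numbers only: why buying local hardware can take years
# to break even versus renting tokens from a cloud provider.

hardware_cost = 2000.0          # assumed price of a local GPU box (USD)
cloud_price_per_mtok = 0.50     # assumed cloud price per million tokens (USD)
tokens_per_day = 2_000_000      # assumed personal daily usage

daily_cloud_cost = tokens_per_day / 1_000_000 * cloud_price_per_mtok
payback_days = hardware_cost / daily_cloud_cost

print(f"Cloud cost per day: ${daily_cloud_cost:.2f}")
print(f"Days for local hardware to break even: {payback_days:,.0f}")
# With these assumptions: $1.00/day, ~2,000 days (over 5 years) to break even.
```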

Eventually, even the government will have more AI resources than it needs at the planned expansion pace, and then the consumer/business computing/GPU market comes back. It could be as soon as 2026 that a collective understanding of the absurdity takes hold.

[–] Coldgoron@lemmy.zip 5 points 1 day ago

Just need my 3060ti to last 8 more years

[–] luciole@beehaw.org 6 points 1 day ago (2 children)

Big opportunity coming for emerging players in the GPU market.

[–] DdCno1@beehaw.org 15 points 1 day ago (1 children)

What emerging players? You can't just whip up a competitive GPU in a jiffy, even if you have Intel money.

Also, unless they are from a different planet that has its own independent supply chain, they'd have to deal with the very same memory shortage and the very same foundries that are booked out for years.

[–] humanspiral@lemmy.ca 5 points 1 day ago

Emerging Chinese players who are shut out from TSMC/Taiwan/ROK chips and memory sources.

[–] frezik@lemmy.blahaj.zone 5 points 1 day ago (1 children)

DRAM shortages affect everything. There's no wiggling out of that through alternative GPUs.

[–] luciole@beehaw.org 2 points 1 day ago (2 children)

If the big two completely abandon the low-to-mid market, Intel, Lisuan, Moore Threads or whoever might put the little DRAM they manage to grab into that orphaned market. A large proportion of gamers aren't into buying $2000 GPUs. Those companies might not succeed right away, but they've been cooking for a while already, so I wonder.

[–] tempest@lemmy.ca 2 points 20 hours ago

The price after the RAM increase would bump those cards upmarket and out of reach for the people in that segment.

[–] frezik@lemmy.blahaj.zone 2 points 22 hours ago (1 children)

To sell at a loss, or at least at very low profit? Low-end GPUs tend to have tight margins to begin with. Why put limited DRAM in there when there are products that need it and can actually be sold for a profit?

I guess they can be a loss leader. It's not a sustainable business model, though, and this DRAM shortage is projected to last a while.

[–] luciole@beehaw.org 2 points 21 hours ago

I agree. In retrospect, when I said "big opportunity" I was pushing it. More of a (narrow) potential opening to try for a modest market share. I guess I'm just hoping affordable GPUs remain a thing.