this post was submitted on 19 Nov 2025
119 points (99.2% liked)

PC Gaming

12713 readers
429 users here now

For PC gaming news and discussion. PCGamingWiki


founded 2 years ago
top 17 comments
[–] MalReynolds@piefed.social 50 points 21 hours ago* (last edited 21 hours ago) (1 children)

Didn't OpenAI just sign a deal with Samsung and SK Hynix to consume 40% of the world's DRAM wafer supply for the foreseeable future?

This is a natural consequence of sucking up all the oxygen.

Fucking AI bubble...

[–] Grimy@lemmy.world 11 points 20 hours ago

If the bubble pops though, it means the market is going to get flooded.

[–] Coldgoron@lemmy.zip 5 points 15 hours ago

Just need my 3060ti to last 8 more years

[–] rafoix@lemmy.zip 38 points 23 hours ago

Oh good, another shortage. Just what we all need right now.

[–] Washedupcynic@lemmy.ca 10 points 18 hours ago (1 children)

I had a knee-jerk reaction to the headline, and I want to clarify for the TL;DR people. They are talking about stopping manufacturing lower- and mid-range graphics cards. They aren't going to roll out an update that would brick cards that have already been sold to customers (that was my knee-jerk assumption, because manufacturers have done shit like that before and I assume the worst).

[–] IncogCyberSpaceUser@piefed.social 3 points 11 hours ago (1 children)

When did manufacturers remotely destroy people's GPUs?

[–] Washedupcynic@lemmy.ca 1 points 10 hours ago

I should have clarified better. To date, there are no instances I know of where GPU manufacturers have bricked people's GPUs. When I said manufacturers, I meant it more broadly in terms of gadget/tech manufacturing.

Sony removed the Other OS function after units were sold, which killed online functionality for those who wanted to keep it, since the only way to keep that function was to never update the console or take it online. There have been instances of manufacturers bricking devices, in this case a robot vacuum, when the user discovered it was reporting location data about his house back to the company: the user stopped that data collection, and the company remotely bricked the device. Spotify bricked a piece of hardware, Car Thing, after people had already paid for it, with a limited one-month refund window from when it was discontinued. Belkin is bricking some of its smart home products on January 31, 2026. In 2019, Sonos launched a trade-up scheme that offered existing owners 30% off the cost of a new speaker, but owners had to activate "Recycle Mode" on their existing Sonos speaker, making it permanently unusable, even if there was nothing wrong with it.

My brain went kill = brick, then thought about all of the shit I've seen where tech manufacturers fucked over their customers in the past, and I just assumed the worst from the start before reading the article.

[–] humanspiral@lemmy.ca 10 points 20 hours ago (2 children)

This is extremely serious for the economic bubble.

Orders for datacenter AI chips exceed supply, and more high-end/other memory per TSMC wafer is a further nightmare. This likely means higher prices per token for datacenter buyers, higher prices for users/model renters, and much slower demand growth and AI progress. It also means long delays for datacenters, and a bigger black market (China, higher-than-MSRP diversions from contracted deliveries).

I'm not sure if it affects phone/LPDDR soldered memory, but TSMC is going to charge more for phone chips too. This could cause the whole consumer/business computing market to collapse. A return of older-generation designs on underused process nodes will give little reason to upgrade, while still overcharging. This could be an opening for Chinese exports of competing products that were not viable at low/reasonable RAM/TSMC prices and availability: even if China has difficulty achieving the best yields, it's still profitable to invest/expand aggressively, which discourages the US/western colonies from investing.

This race to give the US Skynet, for stronger political control/social credit/surveillance of Americans, can inflate a bubble in everything else and accelerate financial collapse, all while making the goal impossible to achieve and forcing China to become stronger and more resilient, with a greater share of global computing supply.

[–] jaykrown@lemmy.world 4 points 11 hours ago

Absolutely. Something you missed, though: at the same time, AI models are becoming more efficient and cheaper to run, so these data centers are going to be a massive waste of resources in a year.

[–] frezik@lemmy.blahaj.zone 7 points 17 hours ago (2 children)

It affects all DRAM across the board. The fundamentals of DRAM haven't changed in decades, and everything comes from three companies.

Good thing Microsoft forced people to throw away a bunch of perfectly functional PCs. This was the perfect time for everyone to have to buy new ones.

[–] Washedupcynic@lemmy.ca 1 points 9 hours ago

Linux has entered the chat. I installed Linux on my older machine, which is ~7 years old. My new machine has Windows 11 and I fucking hate it.

[–] muusemuuse@sh.itjust.works 2 points 11 hours ago

They didn’t force them to throw away PCs. There are alternative options. People chose to be angry instead of doing anything about it.

It’s the shit people bitch about Apple for doing, but it’s somehow adorable when Microsoft does it.

[–] luciole@beehaw.org 6 points 23 hours ago (2 children)

Big opportunity coming for emerging players in the GPU market.

[–] frezik@lemmy.blahaj.zone 4 points 16 hours ago (1 children)

DRAM shortages affect everything. There's no wiggling out of that through alternative GPUs.

[–] luciole@beehaw.org 1 points 9 hours ago

If the big two completely abandon the low/mid market, Intel, Lisuan, Moore Threads, or whoever might put the little DRAM they manage to grab toward that orphaned market. A large proportion of gamers aren't into buying $2000 GPUs. Those companies might not succeed right away, but they've been cooking for a while already, so I wonder.

[–] DdCno1@beehaw.org 14 points 22 hours ago (1 children)

What emerging players? You can't just whip up a competitive GPU in a jiffy, even if you have Intel money.

Also, unless they are from a different planet that has its own independent supply chain, they'd have to deal with the very same memory shortage and the very same foundries that are booked out for years.

[–] humanspiral@lemmy.ca 5 points 20 hours ago

Chinese emerging players who are shut out from TSMC/Taiwan/ROK chips and memory sources.