this post was submitted on 07 Feb 2026
395 points (99.5% liked)

Technology

top 50 comments
[–] M0oP0o@mander.xyz 10 points 3 hours ago

What could a GPU cost? $5000?

[–] percent@infosec.pub 22 points 8 hours ago (2 children)

Maybe some Chinese manufacturer will find a way to fill the gap in the market

[–] pupbiru@aussie.zone 5 points 1 hour ago

i think the latest is that china has managed to create a GPU that’s ~7 years behind. i’m not sure whether that means “a GPU equivalent to one from 7 years ago” or “they’re 7 years of work behind, which should take less time to close since the path is already known”

AFAIK they’ll have to figure out EUV or some other method of lithography at that scale. they’re trying really hard at it, but it’s one heck of a difficult thing to do, which is why only a handful of fabs, TSMC chief among them, currently have it working at scale

[–] CovfefeKills@lemmy.world 5 points 7 hours ago

Here's hoping

[–] Paranoidfactoid@lemmy.world 7 points 9 hours ago

If you want to do work with the GPU, you're still buying NVIDIA, particularly for 3D animation, video/film editing, and creative tools. Even FOSS tools like GIMP and Krita prefer NVIDIA for GPU-accelerated functions.

[–] UltraBlack@lemmy.world 14 points 16 hours ago* (last edited 16 hours ago) (2 children)

We're running straight into a future where consumers' only option for computing is a cloud solution like MS 365

[–] SocialMediaRefugee@lemmy.world 4 points 3 hours ago (1 children)

Pushing constantly towards a subscription economy.

[–] M0oP0o@mander.xyz 1 points 3 hours ago

That "economy" is already falling apart. Subscriptions are down, services on "the cloud" are becoming less reliable, piracy is way up again, and major nations and companies are moving to alternatives.

Hell, DDR3 is making a comeback. All that is needed is one manufacturer to start making 15 year old tech again and bam, the house of cards falls.

[–] WorldsDumbestMan@lemmy.today 13 points 15 hours ago (1 children)

The only future is one where billionaires aren't in it.

[–] UnderpantsWeevil@lemmy.world 7 points 9 hours ago (1 children)

Brother, we're up to trillionaires now and they don't seem like they're going anywhere.

[–] WorldsDumbestMan@lemmy.today 1 points 2 hours ago

Didn't like 1% of them die in accidents recently? That sub accident, that guy whose penis surgery went wrong.

[–] A_Random_Idiot@lemmy.world 19 points 19 hours ago (1 children)

i really hope nvidia collapses when the AI bubble pops. They've done more harm than good for consumers for too long.

[–] hamsterkill@lemmy.sdf.org 8 points 10 hours ago (1 children)

It won't collapse. It'll lose a huge chunk of its stock price, but it has other business to fall back on, and its chips will still likely be used in whatever the next tech trend is - probably neural network AI or something.

[–] jj4211@lemmy.world 2 points 10 hours ago (2 children)

I'm not sure. They have other businesses, but I'm not sure those businesses can sustain the obligations Nvidia has committed to in this round. They're juggling more money than their pre-AI boom market cap by a wide margin, so if the bubble pops, it's unclear how big a bag Nvidia will be left holding and whether the rest of the business can survive it. They might go bankrupt and eventually come out of it to continue business as usual, with the financial obligations wiped away.

Also, they've somewhat tarnished their reputation by going all in on datacenter equipment and, seemingly, abandoning the consumer market to free up capacity for the datacenters. So if AMD ever had an opportunity to cash in, this might be it... except they also dream of being a big datacenter player, and weaker demand may leave them with leftover capacity.

[–] hamsterkill@lemmy.sdf.org 1 points 1 hour ago* (last edited 1 hour ago)

juggling more money than their pre-AI boom market cap by a wide margin

I'm not sure what you mean by this. Nvidia carries a vanishingly small amount of debt for its size. It has way more liquidity than debt.

[–] TheOakTree@lemmy.zip 3 points 3 hours ago

Never underestimate AMD's ability to miss good opportunities.

[–] boaratio@lemmy.world 6 points 16 hours ago

I know Radeons don't really have the performance crown, but as a lifelong Nvidia GPU and Linux user: the PITA drivers just aren't a problem once you switch to an AMD Radeon card.

[–] horse@feddit.org 16 points 22 hours ago (1 children)

As someone not looking to spend a ton of money on new hardware any time soon: good. The longer it takes to release faster hardware, the longer current hardware stays viable. Games aren't going to get more fun by slightly improving graphics anyway. The tech we have now is good enough.

[–] ExLisper@lemmy.curiana.net 9 points 21 hours ago (3 children)

People don't just use computers for gaming. If this continues, people will struggle to do any meaningful work on their personal computers, which is definitely not good. And I'm not talking about browsing Facebook but about coding, doing research, editing videos, and other useful shit.

[–] wonderingwanderer@sopuli.xyz 2 points 7 hours ago

Scientific modeling and simulations

[–] UnderpantsWeevil@lemmy.world 1 points 9 hours ago

If this continues, people will struggle to do any meaningful work on their personal computers

Excel users devastated.

[–] SoleInvictus@lemmy.blahaj.zone 10 points 18 hours ago (6 children)

But wait! They can pay for remote computing time for a fraction of the cost! Each month. Forever.

I fully expect personal computers to be phased out in favor of a remote-access, subscription model. The AI bubble popping would leave these big data centers with massive computational power available for use, plus it's the easiest way to track literally everything you do on your system.

[–] wonderingwanderer@sopuli.xyz 2 points 6 hours ago

Hopefully the AI bubble popping means they have to close data centers and liquidate hardware. Dirt-cheap aftermarket servers would be good for the fediverse.

[–] UnderpantsWeevil@lemmy.world 1 points 9 hours ago

I fully expect personal computers to be phased out in favor of a remote-access, subscription model

I wouldn't hold my breath.

[–] ExLisper@lemmy.curiana.net 4 points 13 hours ago* (last edited 13 hours ago) (1 children)

easiest way to track literally everything you do on your system.

And ban undesired activities. "We see you're building an app to track ICE agents. That's illegal. Your account has been banned and all your data removed."

[–] SoleInvictus@lemmy.blahaj.zone 3 points 12 hours ago* (last edited 12 hours ago)

"Remain in your cube - The Freedom Force is en route to administer freedom reeducation. Please be sure to provide proof of medical insurance prior to forced compliance."

[–] obbeel@lemmy.eco.br 3 points 16 hours ago (1 children)

Remote computing is very expensive. It's just the gated LLMs (the ones owned by companies) that are cheap for the final consumer. Training even a 2B LLM on rented compute will cost you thousands of dollars.

[–] wonderingwanderer@sopuli.xyz 2 points 6 hours ago

2B is nothing; even 7B is tiny. Commercial API-based LLMs are more like 130-200 billion parameters.

I mean, yeah, training a 7B LLM from scratch on consumer-grade hardware could take weeks or months and run up an enormous electric bill. With a decent GPU and enough VRAM you could probably shorten that to days or weeks, and you might want to power it with solar panels.

But I haven't calculated what it would take on rented compute.
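
For rough scale, here's a minimal back-of-envelope sketch in Python, assuming the common ~6 × params × tokens FLOPs rule of thumb for a training run; the GPU throughput, utilization, and hourly rental price below are illustrative assumptions, not quoted figures:

```python
# Back-of-envelope cost of an LLM training run on rented GPUs,
# using the common ~6 * params * tokens FLOPs rule of thumb.
# All constants (throughput, utilization, price) are illustrative
# assumptions, not real quotes.

def training_cost_usd(params: float, tokens: float,
                      peak_flops: float = 1e15,    # ~1 PFLOP/s, H100-class peak (assumed)
                      utilization: float = 0.35,   # realistic fraction of peak
                      usd_per_gpu_hour: float = 3.0) -> float:
    total_flops = 6 * params * tokens
    gpu_hours = total_flops / (peak_flops * utilization) / 3600
    return gpu_hours * usd_per_gpu_hour

# 2B params on ~40B tokens (a Chinchilla-style ~20x token ratio):
print(f"2B model: ~${training_cost_usd(2e9, 40e9):,.0f}")    # ~$1,143
# 7B params on ~140B tokens:
print(f"7B model: ~${training_cost_usd(7e9, 140e9):,.0f}")   # ~$14,000
```

Under those assumptions a 2B model lands in the low thousands of dollars, in line with the comment above, and the cost grows roughly quadratically if you scale parameters and tokens together.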

[–] anon_8675309@lemmy.world 6 points 21 hours ago

They’re AI only now.
