What could a GPU cost? $5000?
Maybe some Chinese manufacturer will find a way to fill the gap in the market
I think the latest is that China has managed to create a GPU that's ~7 years behind. I'm not sure whether that means "a GPU from 7 years ago" or "it will take them 7 years to catch up, acknowledging that since there's a known path it will take less time than it took the first movers."
AFAIK they'll have to figure out EUV or some other method of lithography at that scale. They're trying really hard at it, but it's one heck of a difficult thing to do, which is why only ASML currently builds working EUV machines and only a handful of fabs like TSMC run them at scale.
Here's hoping
If you want to do work with the GPU you're still buying NVIDIA. Particularly 3D animation, video/film editing, and creative tools. Even FOSS tools like GIMP and Krita prefer NVIDIA for GPU accelerated functions.
We're running straight into a future where consumers' only option for computers is a cloud solution like MS 365.
Pushing constantly towards a subscription economy.
That "economy" is already falling apart. Subscriptions are down, services on "the cloud" are becoming less reliable, piracy is way up again, and major nations and companies are moving to alternatives.
Hell, DDR3 is making a comeback. All that is needed is one manufacturer to start making 15 year old tech again and bam, the house of cards falls.
The only future is one where billionaires aren't in it.
Brother, we're up to trillionaires now and they don't seem like they're going anywhere.
Didn't like 1% of them die from accidents recently? That sub accident, that guy whose penis surgery went wrong.
I really hope Nvidia collapses when the AI bubble pops. They've been more harm than good for consumers for too long.
It won't collapse. It'll lose a huge chunk of its stock price, but it both has other business to fall back on and its chips will still likely be used in whatever the next tech trend is - probably neural network AI or something.
I am not sure. They have other businesses, but I'm not sure those businesses can sustain the obligations Nvidia has committed to in this round. They are juggling more money than their pre-AI-boom market cap by a wide margin, so if the bubble pops, it's unclear how big a bag Nvidia will be left holding and whether the rest of the business can survive it. I guess they might go bankrupt and eventually come out of it to continue business as usual, with the financial obligations wiped away.
Also, they have somewhat tarnished their reputation by going all in on datacenter equipment, seemingly abandoning the consumer market to free up capacity for the datacenters. So if AMD ever had an opportunity to cash in, this might be it... except they also dream of being a big datacenter player, and weaker demand may leave them with leftover capacity.
juggling more money than their pre-AI boom market cap by a wide margin
I'm not sure what you mean by this. Nvidia carries a vanishingly small amount of debt for its size. It has way more liquidity than debt.
Never underestimate AMD's ability to miss good opportunities.
I know Radeons don't really have the performance crown, but as a lifelong Nvidia GPU and Linux user: the PITA drivers stop being a problem once you switch to an AMD Radeon card.
As someone not looking to spend a ton of money on new hardware any time soon: good. The longer it takes to release faster hardware, the longer current hardware stays viable. Games aren't going to get more fun by slightly improving graphics anyway. The tech we have now is good enough.
People don't just use computers for gaming. If this continues, people will struggle to do any meaningful work on their personal computers, which is definitely not good. And I'm not talking about browsing Facebook, but about coding, doing research, editing videos, and other useful shit.
Scientific modeling and simulations
If this continues people will struggle to do any meaningful work on their personal computes
Excel users devastated.
But wait! They can pay for remote computing time for a fraction of the cost! Each month. Forever.
I fully expect personal computers to be phased out in favor of a remote-access, subscription model. The AI bubble popping would leave these big data centers with massive computational power available for use, plus it's the easiest way to track literally everything you do on your system.
Hopefully the AI bubble popping means they have to close data centers and liquidate hardware. Dirt-cheap aftermarket servers would be good for the fediverse.
I fully expect personal computers to be phased out in favor of a remote-access, subscription model
I wouldn't hold my breath.
easiest way to track literally everything you do on your system.
And ban undesired activities. "We see you're building an app to track ICE agents. That's illegal. Your account has been banned and all your data removed."
"Remain in your cube - The Freedom Force is en route to administer freedom reeducation. Please be sure to provide proof of medical insurance prior to forced compliance."
Remote computing is very expensive. It's just the gated (company-owned) LLMs that are cheap for the final consumer. Training a 2B LLM on remote compute will cost thousands of dollars if you try it.
2B is nothing, even 7B is tiny. Commercial API-based LLMs are like 130-200 billion parameters.
I mean yeah, training a 7B LLM from scratch on consumer-grade hardware could take weeks or months and run up an enormous electric bill. With a decent GPU and enough VRAM you could probably shorten that to days or weeks, and you might want to power it with solar panels.
But I haven't calculated what it would take to do on rented compute.
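A rough sketch of that calculation, using the common "6 × params × tokens" FLOPs approximation. All the numbers here are illustrative assumptions (datacenter-class GPU throughput, utilization, and rental price), not actual vendor quotes:

```python
# Back-of-envelope estimate of rented-compute training cost.
# Assumptions (not quotes): ~312 TFLOP/s peak GPU throughput,
# ~40% realized utilization, ~$2 per GPU-hour rental price.

def training_cost_usd(params, tokens,
                      gpu_tflops=312.0,      # assumed peak throughput
                      utilization=0.4,       # assumed realized fraction
                      usd_per_gpu_hour=2.0): # assumed rental rate
    flops = 6 * params * tokens                    # standard training-FLOPs rule of thumb
    effective = gpu_tflops * 1e12 * utilization    # FLOP/s actually achieved
    gpu_hours = flops / effective / 3600
    return gpu_hours, gpu_hours * usd_per_gpu_hour

# 2B-parameter model on ~20 tokens per parameter (Chinchilla-style ratio)
hours, cost = training_cost_usd(params=2e9, tokens=40e9)
print(f"{hours:,.0f} GPU-hours, roughly ${cost:,.0f}")
```

Under those assumptions a 2B model comes out around a thousand GPU-hours, i.e. low thousands of dollars, which lines up with the "thousands of dollars" figure above. Bump the parameters and tokens up to commercial scale and the cost explodes accordingly.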
They’re AI only now.