this post was submitted on 13 Mar 2025
22 points (95.8% liked)

Technology

[–] yogthos@lemmy.ml 3 points 11 months ago (2 children)

It appears you've missed the point here, which is that it turns out you can use older GPUs in creative ways to get a lot more out of them than people realized. Having the latest chips isn't the bottleneck people thought it was.

[–] Euphoma@lemmy.ml 1 points 11 months ago (1 children)

This article doesn't talk about older GPUs though? It's talking about using the V80 FPGA from AMD, which was released in 2024 and costs $10k. Unless I'm misunderstanding something about the article? I do think it's a good breakthrough to be able to use an FPGA like this, though.

[–] yogthos@lemmy.ml 1 points 11 months ago

You're right, the chip they leveraged isn't actually that old. The key part is that we're now seeing a lot of optimizations happening on the software side that allow existing chips to be used more efficiently.
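For example (and this is just my own toy illustration, not the technique from the article): post-training weight quantization is one of those software-side optimizations. You trade a little numerical precision for a 4x smaller memory footprint, which is often what lets a model fit on a chip it otherwise wouldn't:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization: float32 -> int8 plus a per-tensor scale."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(1024, 1024).astype(np.float32)  # 4 MiB of float32 weights
q, s = quantize_int8(w)                              # 1 MiB as int8
print("memory ratio:", q.nbytes / w.nbytes)          # 0.25
print("max abs error:", np.abs(w - dequantize(q, s)).max())
```

Real frameworks use calibration data and per-channel scales, but the principle is the same: the hardware didn't change, the software just uses it better.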

[–] utopiah@lemmy.ml 0 points 11 months ago* (last edited 11 months ago) (1 children)

"turns out you can use older GPUs in creative ways to get a lot more out of them than people realized"

If that's the point, then that's the entire GPU-for-mining-then-ML revolution, driven mostly by CUDA, and that already happened around 2010. So it's even older than that; that'd be 15 years ago.

What I was highlighting, anyway, is that it's hard to trust an article where simple facts are wrong.

[–] yogthos@lemmy.ml -1 points 11 months ago (1 children)

Imagine not being able to understand that new software optimization techniques are continuously being discovered. 🤦

[–] utopiah@lemmy.ml -1 points 11 months ago

Well, I honestly tried (cf. the comment history). You're addressing neither my remark about the facts in the article nor the bigger picture. Waste of time, blocked.