this post was submitted on 07 Oct 2025
58 points (96.8% liked)
Technology
It's strange that the concept of efficiency seems to have been abandoned. Is consumption of vast computing resources no longer seen as an indication of a design flaw?
Cost efficiency is all there is now
I have to doubt the cost efficiency too.
Do you reckon OpenAI could get cheaper power or GPUs? Or something else? Could Nvidia lower its production costs for these?
I'm talking about the software side of things. Generative "AI" seems to be a "brute force" approach to artificial intelligence: just throwing hardware at the problem instead of finding a smarter design. Given the limitations of GenAI, it feels crazy to keep going this way, like a sunk-cost fallacy. These are just my thoughts though, not a real scientific analysis.
Recent advancements using "dynamic sparsity" or "selective activation" approaches increase efficiency beyond brute force. That's how China began to compete without anywhere near the same number of GPUs or the same power budget.
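For anyone curious what "selective activation" looks like in practice, here's a minimal sketch of mixture-of-experts-style routing in Python/NumPy. The sizes, random weights, and function names are illustrative assumptions, not any particular lab's implementation; the point is just that each token only touches a few "expert" sub-networks, so compute per token stays small even as total parameter count grows.

```python
# Minimal sketch of "selective activation" (mixture-of-experts style routing).
# Toy sizes and random weights; real models learn the router and experts.
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 16, 8, 2            # assumed toy dimensions
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router = rng.normal(size=(d_model, n_experts))  # scores experts per token

def moe_forward(x):
    """Route a single token vector to only top_k experts instead of all of them."""
    logits = x @ router                          # one score per expert
    chosen = np.argsort(logits)[-top_k:]         # indices of the top_k experts
    weights = np.exp(logits[chosen])
    weights /= weights.sum()                     # softmax over chosen experts only
    # Only top_k of the n_experts matrices are used for this token:
    # compute scales with top_k, while total parameters scale with n_experts.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

token = rng.normal(size=d_model)
out = moe_forward(token)
print(out.shape)  # (16,) -- same output size, roughly top_k/n_experts of the expert FLOPs
```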
Read elsewhere that OpenAI are trying to buy themselves far enough ahead to outlast the bubble popping.
Seems that everything else is being sacrificed to that end.
Will be hilarious if they are one of the first to go!