this post was submitted on 07 Oct 2025
58 points (96.8% liked)

Technology


These numbers don't make any sense to me, as the hed is about buying lots of chips, and the body is about power use. No matter how you slice it, $8.76/kWh is a terrible fucking investment ... if that's chip-inclusive, that's another story.

Still, the audacity of saying "we're going to invest $1 trillion" is Dr. Evil-level humour.

OpenAI is signing about $1 trillion (€940 billion) in deals this year for computing power to keep its artificial intelligence dreams humming.

On Monday the outfit inked a deal with AMD, following earlier tie-ups with Nvidia, Oracle, and CoreWeave, as Sam Altman's company scrambles to secure enough silicon to keep ChatGPT online and the hype machine alive.

The latest commitments would give OpenAI access to more than 20 gigawatts of computing capacity over the next decade, roughly the output of 20 nuclear reactors. At about $50 billion per gigawatt, according to OpenAI’s estimates, the total tab hits that $1 trillion figure.

Analysts are not convinced this financial engineering makes any sense. DA Davidson analyst Gil Luria said: “OpenAI is in no position to make any of these commitments,” adding that it could lose about $10 billion this year.
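The headline arithmetic in the excerpt is internally consistent; a quick sanity check using only the article's own figures (OpenAI's estimates, not independently verified):

```python
# Figures as reported in the article (OpenAI's own estimates).
cost_per_gw_usd = 50e9   # ~$50 billion per gigawatt
capacity_gw = 20         # >20 GW over the next decade

total_usd = cost_per_gw_usd * capacity_gw
print(f"${total_usd / 1e12:.0f} trillion")  # → $1 trillion
```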

top 15 comments
[–] chromodynamic@piefed.social 26 points 1 month ago (2 children)

It's strange that the concept of efficiency seems to have been abandoned. Is the consumption of vast computing resources no longer seen as an indication of a design flaw?

[–] ryannathans@aussie.zone 3 points 1 month ago (1 children)

Cost efficiency is all there is now

[–] chromodynamic@piefed.social 8 points 1 month ago (1 children)

I have to doubt the cost efficiency too.

[–] ryannathans@aussie.zone 1 points 1 month ago* (last edited 1 month ago) (1 children)

Do you reckon OpenAI could get cheaper power or GPUs? Or something else? Could Nvidia get lower production costs for these?

[–] chromodynamic@piefed.social 3 points 1 month ago (1 children)

I'm talking about the software side of things. Generative "AI" seems to be a "brute force" approach to artificial intelligence - just throwing hardware at the problem instead of finding a better approach. Given the limitations of GenAI, it just feels crazy to keep going this way. Like a sunk-cost fallacy. These are just my thoughts though, not a real scientific analysis.

[–] ryannathans@aussie.zone 1 points 1 month ago

Recent advancements using "dynamic sparsity" or "selective activation" approaches increase efficiency beyond "brute force". This is how China began to compete without anywhere near as many GPUs or as much power.
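For readers unfamiliar with the terms: "dynamic sparsity" and "selective activation" generally mean routing each input through only a small subset of a model's parameters, mixture-of-experts style, so compute scales with the active subset rather than the full model. A toy sketch (illustrative only, not any specific model's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_forward(x, experts, gate_w, k=2):
    """Route input x through only the top-k experts (selective activation).

    A dense model would run every expert; here we compute gate scores,
    keep the k largest, and skip the rest entirely, so compute scales
    with k rather than with the total number of experts.
    """
    scores = x @ gate_w                       # one score per expert
    top = np.argsort(scores)[-k:]             # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                  # softmax over the active subset
    return sum(w * experts[i](x) for i, w in zip(top, weights))

# Toy setup: 8 tiny "experts", only 2 of which run per input.
d = 4
experts = [lambda x, W=rng.normal(size=(d, d)): x @ W for _ in range(8)]
gate_w = rng.normal(size=(d, 8))
y = sparse_forward(rng.normal(size=d), experts, gate_w, k=2)
print(y.shape)  # (4,)
```

Here 6 of the 8 expert matrices are never touched for this input; that skipped work is where the efficiency gain comes from.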

[–] tangentism@beehaw.org 1 points 1 month ago

Read elsewhere that OpenAI are trying to buy themselves far enough ahead to outlast the bubble popping.

Seems that everything else is being sacrificed to that end.

Will be hilarious if they are one of the first to go!

[–] ryper@lemmy.ca 18 points 1 month ago

These numbers don’t make any sense to me, as the hed is about buying lots of chips, and the body is about power use. No matter how you slice it, $8.76/kWh is a terrible fucking investment … if that’s chip-inclusive, that’s another story.

Data center scale is usually given in terms of power consumption, not computing power. The trillion dollars is meant to buy enough hardware to suck up 20GW of power, and probably none of the money will go towards power generation.
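To put numbers on why the per-kWh framing breaks down: even if (counterfactually) the whole $1 trillion were treated as buying electricity at a continuous 20 GW draw, the implied price per kWh depends entirely on the amortization window you pick:

```python
total_usd = 1e12        # headline commitment
power_kw = 20e6         # 20 GW, expressed in kilowatts
hours_per_year = 8760

for years in (1, 10):
    energy_kwh = power_kw * hours_per_year * years
    print(f"{years:>2} yr: ${total_usd / energy_kwh:.2f}/kWh")
# →  1 yr: $5.71/kWh
# → 10 yr: $0.57/kWh
```

Either way, the money is buying hardware that *draws* 20 GW, not the energy itself, so a single $/kWh figure isn't a meaningful measure of the deal.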

[–] lichtmetzger@discuss.tchncs.de 18 points 1 month ago (2 children)

Ed Zitron's gonna have a field day with this. OpenAI's motto seems to be scaling "to infinity and beyond". But what can you expect from a techbro CEO who takes Dyson spheres seriously?

[–] floofloof@lemmy.ca 14 points 1 month ago* (last edited 1 month ago) (2 children)

Apparently they were never meant to be taken seriously. Dyson's article was satirical. Techbros, unfortunately, don't pick up on this.

Dyson spheres are a joke - Angela Collier

[–] MaggiWuerze@feddit.org 4 points 1 month ago

It's the torment nexus all over again.

[–] TehPers@beehaw.org 2 points 1 month ago

That anyone would even imagine a Dyson Sphere being remotely possible to build is beyond me.

Even supposing you managed to build one somehow, the maintenance cost would scale with the size (and therefore be astronomical, in the literal sense).

[–] Powderhorn@beehaw.org 7 points 1 month ago

That's a Roomba that just rolls around, right?

[–] Hirom@beehaw.org 2 points 1 month ago

Meanwhile, Nvidia has promised to pump $100 billion into OpenAI over the next decade, a move that will conveniently help OpenAI pay for Nvidia’s own chips.

OpenAI's and Nvidia's futures are getting tied together even more than they already were.

[–] jarfil@beehaw.org 2 points 1 month ago

$8.76/kWh is a terrible fucking investment

Not kWh; it says watts, not watt-hours.

Still a silly way to refer to computing power, though.