this post was submitted on 14 May 2026
792 points (98.8% liked)
Technology
Only a couple hundred million dollars a year per data center. Building a nuclear power plant costs several billion and isn't free to operate either, so it's not pure capex; there's still opex involved.
Prices rise for everyone, so it becomes the community's problem as much as the data center's. The US in particular has three grids, so in practice the "community" is either the western US, the eastern US, or Texas.
Profitable over a decade or more, maybe, but the data center isn't guaranteed to be in operation that long. You know those ~30-40k USD "graphics" cards they use? A single AI data center likely has tens of thousands of them, often even around 100k. They're usually used for about 3 years, often less: in that timeframe they become obsolete, unable to compete with newer products in either raw performance or efficiency. That's up to 3 billion dollars of GPUs every 3 years or less, per data center. It would only take a small economic downturn, or people seriously realizing that this bubble has to pop eventually, and they'd have to stop running these data centers.
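To make the scale concrete, here's a back-of-the-envelope check of that GPU figure. All numbers are rough assumptions pulled from the comment above (midpoint card price, 100k cards, 3-year lifespan), not vendor or operator data:

```python
# Rough sketch of the GPU replacement cost per data center.
# All figures are assumptions from the comment, not real vendor data.
gpu_price_usd = 35_000         # assumed midpoint of the ~30-40k USD range
gpus_per_datacenter = 100_000  # "often even around 100k"
lifespan_years = 3             # "used for about 3 years usually"

total_cost = gpu_price_usd * gpus_per_datacenter
annual_burn = total_cost / lifespan_years

print(f"GPU fleet cost: ${total_cost / 1e9:.1f}B")       # $3.5B per refresh cycle
print(f"Amortized per year: ${annual_burn / 1e9:.2f}B")  # over $1B/year
```

So even before electricity, a single large AI data center is burning on the order of a billion dollars a year just replacing hardware.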
NPPs also usually take many years to complete; it took the Finns nearly two decades to get Olkiluoto 3 running. Data centers need to be ready within a few years, because in 5 years the AI craze could be over and they'll no longer be needed.
AI companies ain't gonna do shit for electricity generation if they're not forced to.
In my country, joining the grid or upgrading your circuit breaker carries a one-time, amperage-based fee (assuming you're close to the substation; otherwise it gets more expensive). I propose that companies looking to consume huge amounts of electricity should also pay a mandatory generation-capacity fee. It could be paid out to a nearby municipal power company that uses it to build more power plants, or to some level of local government that could then sponsor building a power plant or ten.
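The proposed fee could work like the existing amperage-based connection fee, just scaled to generation capacity. A minimal sketch, where the rate per megawatt and the example demand are purely illustrative assumptions, not any real tariff:

```python
# Hypothetical generation-capacity fee, analogous to an amperage-based
# connection fee. The rate is an illustrative assumption, not a real tariff.
def capacity_fee(peak_demand_mw: float, rate_usd_per_mw: float = 1_000_000) -> float:
    """One-time fee proportional to the grid capacity a large consumer
    requests, earmarked for building new generation nearby."""
    return peak_demand_mw * rate_usd_per_mw

# A hypothetical 300 MW data center would pre-fund $300M of new generation.
fee = capacity_fee(300)
print(f"${fee / 1e6:.0f}M")
```

The point is simply that the fee scales with the load the consumer imposes on the grid, so the buildout cost lands on the data center instead of the community.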
Edit: Whoever downvoted me must think that data center operators are going to do anything out of the good of their hearts lol
People on Lemmy downvote you just for disagreeing ALL the time, even when you make (as you just did) an informed and thoughtful reply. It's honestly just as bad as Reddit with the downvote shit.
The lemms are peculiar like that
Now I'm just wondering who downvoted you for a literal nothingburger comment
That is public info btw -> https://lemvotes.org/