it seems a bit disingenuous to call these “data centers in space” or “supercomputers”.
30 terabytes of storage across 12 satellites? So 2.5 TB each, and 744 TOPS (which is like a modern mid-range graphics card for a PC; the RX 9070 XT does 1,557 TOPS, for reference). That just sounds like they’re launching a powerful PC into orbit. Sure, that’s a lot of power for a satellite: for comparison, the Curiosity rover uses the same kind of CPU as a 2000-era iMac G3. But it’s not a data center.
The idea of doing more of the processing on the satellite rather than on the ground is interesting and neat, but representing these as anything more than that is… weird.
due to cosmic radiation, computers in space run critical work in triplicate and vote on the result… so everything is times 3…
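A minimal sketch of that triple-modular-redundancy idea, just to show the voting step; this is a toy illustration, not anyone’s actual flight software:

```python
# Toy triple modular redundancy (TMR): run the same computation three
# times and majority-vote, so a single radiation-induced fault in one
# copy is outvoted by the other two. Illustrative only.
from collections import Counter

def tmr(compute, *args):
    """Run `compute` three times and return the majority result."""
    results = [compute(*args) for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: all three copies disagree")
    return value

# -> 144; with real hardware faults, two good copies outvote one bad one.
print(tmr(lambda x: x * x, 12))
```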
but yes, it’s a lie.
also, the definition of supercomputer is a bit muddy. my phone is a supercomputer by older (now obsolete) standards.
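For a rough sense of scale, here’s a ballpark comparison against a mid-80s flagship; both figures are approximate, order-of-magnitude only:

```python
# Ballpark comparison: 1980s flagship supercomputer vs. a recent phone
# GPU. Figures are approximate, not benchmarks.
cray_2_gflops = 1.9        # Cray-2 (1985) peak, fastest machine of its day
phone_gpu_gflops = 1000.0  # ~1 TFLOPS FP32, a typical recent phone GPU

print(f"phone is ~{phone_gpu_gflops / cray_2_gflops:.0f}x a Cray-2")  # ~526x
```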
So the "journalist" Wes Davis is a liar and the Verge is a slop factory run by idiots.
ok
Judging by the fact these are launching on Long March 2Ds, they’re probably not going beyond LEO, so they don’t need proper deep-space hardening like the RAD750 or the like.
It’s probably closer to off-the-shelf parts, like what’s used on the ISS.
12 of 2800 planned have been launched.
I have a server at home built from old parts and some refurbished drives with nearly as much storage as the currently launched satellites. 2800 satellites like this would come out to around 230 of my servers, or ~7 PB.
A single 2U server with 12 drives, each with 24 TB of storage, can hold 288 TB. It would take ~24 of those to get to 7 PB, which is a lot of servers, but not so many that someone with quite a lot of savings couldn't afford it.
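That back-of-the-envelope math, spelled out (satellite numbers from the article, server sizes as above):

```python
# Numbers from the thread: 30 TB across the first 12 satellites,
# 2,800 satellites planned, 288 TB per fully loaded 2U server.
tb_per_satellite = 30 / 12              # 2.5 TB each
fleet_tb = 2800 * tb_per_satellite      # planned constellation total
server_2u_tb = 12 * 24                  # 288 TB per 2U box

print(fleet_tb)                  # 7000.0 TB = 7 PB
print(fleet_tb / 30)             # ~233 home servers at ~30 TB each
print(fleet_tb / server_2u_tb)   # ~24.3 2U servers to match the fleet
```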
Also, the servers on the ground can be cooled by, idk, air if needed. Or water. Or I guess liquid nitrogen if you want. Point is, there's an atmosphere for the heat to dissipate into, unlike space.
They've certainly had to come up with some way to effectively radiate the heat into space. The article doesn't mention it, though. I presume it's one of the main reasons for networking so many machines together?
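In vacuum, the only way to dump waste heat is to radiate it, so radiator area is set by the Stefan-Boltzmann law. A rough sizing sketch; the power and temperature figures here are my assumptions, not anything from the article:

```python
# Radiator sizing via the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
# Simplified: ignores absorbed sunlight and Earth IR. Numbers are
# illustrative assumptions, not specs from the article.
SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9      # a decent radiator coating
PANEL_TEMP_K = 300.0  # radiator surface at ~27 C

def radiator_area_m2(waste_heat_w: float) -> float:
    """Panel area needed to reject waste_heat_w to deep space."""
    return waste_heat_w / (EMISSIVITY * SIGMA * PANEL_TEMP_K ** 4)

print(f"{radiator_area_m2(1000):.1f} m^2 per kW")  # ~2.4 m^2 per kW of load
```

So every kilowatt of compute needs a couple of square meters of radiator, which would make spreading the load across many small satellites look a lot more tractable than one big orbital server room.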
That’s still not very much compared to most data centers. 7,000 terabytes is a lot of storage for one person, but it barely even registers against a modern data center.
Also, 2800 desktops networked together isn’t really a supercomputer or a data center.
such a network is interesting as a scientific tool for gathering and processing data, certainly, but it’s not a data center and not a supercomputer.
But being accurate with the headline makes it less clickbaity. 😏 Honestly, this article is scant on details.
Data centers don't usually have an "X-ray polarization detector for picking up brief cosmic phenomena." Like you said, it seems more like a scientific tool than an actual "data center."
Imagine the latency on a data center in space. Uplink/downlink every time your server gets an inference request? Lol.
I could see it being fine for longer-running asynchronous requests, but only if the cost/benefit made any sense at all, and if the servers had any resources worth talking about.
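For scale: the light-travel time itself is tiny at LEO altitudes; the altitude below is my assumed ballpark, not a figure from the article:

```python
# Propagation delay to LEO. The altitude is an assumed ballpark figure.
C = 299_792_458      # speed of light in vacuum, m/s
ALTITUDE_M = 550e3   # assumed ~550 km LEO orbit

rtt_ms = 2 * ALTITUDE_M / C * 1000
print(f"best-case added RTT: {rtt_ms:.1f} ms")  # ~3.7 ms when directly overhead
```

So the raw round trip is only milliseconds; the real cost is availability, since without inter-satellite relays a given satellite is only in view of any particular ground station for minutes at a time, and a request could sit queued waiting for the next pass.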