
Take something like an 8xH200 server (https://docs.nvidia.com/dgx/dgxh100-user-guide/introduction-...): that's 10.2 kW.

Let’s say you need 50 m^2 of solar panels to run it, plus a ton of surface area to dissipate the heat. I’d love to be proven wrong, but space data centers just seem like large 2D impact targets.
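
Back of the envelope, treating the solar constant and an end-to-end panel efficiency of ~25% as assumptions:

  # rough sunlit-only array sizing for a 10.2 kW load
  solar_constant = 1361    # W/m^2 in LEO
  efficiency = 0.25        # assumed: cells + packing + power conversion
  load_w = 10_200
  print(round(load_w / (solar_constant * efficiency)))   # ~30 m^2, so ~50 m^2 with margin/degradation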


Yeah, you need 50m^2 of solar panels and 50m^2 of radiators. I don't see why one is that much more difficult than the other.

You need 50 sqm of solar panels just for a tiny 8RU server. You also forgot any overhead for networking, control, etc., but let's ignore those. Next, at a 400 km orbit you spend about 40% of the time in shade, so you need an insulated battery that can deliver roughly 5 kWh. That adds 100-200 kg to a server weighing 130 kg on its own. Then you need to dissipate all that heat, and yes, 50 sqm of radiators should deal with a 10 kW device. You also need to recharge the batteries for the next shade period, so the solar array grows to about 100 sqm. And you need to power and cool the cooling infrastructure itself (pumps, power converters), which wasn't included in the power budget initially.
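
Quick sanity check on those numbers (orbit period, eclipse fraction, radiator temperature and emissivity are all assumed values):

  sigma = 5.67e-8                          # Stefan-Boltzmann constant
  load_kw = 10.2
  period_min, shade_frac = 92.7, 0.37      # ~400 km orbit, assumed eclipse fraction
  battery_kwh = load_kw * period_min * shade_frac / 60
  flux = 2 * 0.9 * sigma * 300**4          # ideal two-sided panel at 300 K, e=0.9, no sun/Earth loading
  radiator_m2 = load_kw * 1000 / flux
  print(round(battery_kwh, 1), round(radiator_m2))   # ~5.8 kWh, ~12 m^2 in the ideal case

Real deployable radiators do far worse than the ideal figure once you add sun and Earth IR loading and realistic coolant temperatures (ISS-class panels manage something on the order of 100-150 W per sqm of panel), which is roughly where 50-75 sqm comes from.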

So now we arrive at a revised solution: a puny 8RU server at 130 kg requires about 100 sqm (roughly 1000 kg) of solar panels, 50-75 sqm of heat radiators at 1000-1500 kg, 100-200 kg of batteries, and then the housing for all that stuff plus station-keeping engines and propellant, motors to rotate the panels, pumps, etc. I'd guess at least another 500 kg, maybe a bit less.

So now we have a 3-ton satellite, which costs around 10 million dollars to launch at an optimistic $3000/kg on F9. And that's not counting the cost of manufacturing the satellite or the cost of the server itself.
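
Adding it up with the figures above, and assuming $3000/kg:

  masses_kg = {"server": 130, "solar": 1000, "radiators": 1250,
               "batteries": 150, "bus, propulsion, pumps": 500}
  total = sum(masses_kg.values())
  print(total, f"${total * 3000 / 1e6:.1f}M to launch")   # ~3030 kg, ~$9.1M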

I think the proposal is quite absurd with modern tech and costs.


Don't forget to budget power for the coolant heaters that keep the loops from freezing in the shade.

Especially if the radiators are something you can just roll out, like sheets of aluminum foil, which is very light and very cheap.

Only over a short distance. To effectively radiate a significant amount of heat, you need to actually deliver the heat to the distant parts of the radiator first. That normally requires active pumping, which needs extra energy. So now you need to unfold solar panels + aluminium + pipes (+ maybe extra pumps).
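
To put a number on "short distance", assuming a 0.1 mm foil, a 1 m wide strip, and a 50 K temperature drop allowed along it:

  k_al = 237                          # W/(m*K), aluminium
  width, thickness, dT = 1.0, 1e-4, 50
  for length_m in (0.05, 0.5, 1.0):
      q = k_al * width * thickness * dT / length_m
      print(length_m, round(q, 1))    # ~23.7 W at 5 cm, ~2.4 W at 0.5 m, ~1.2 W at 1 m

That's watts, not kilowatts, once you're more than a few centimetres from the heat source, which is why real radiators use pumped loops or heat pipes to spread the heat.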

Orbital assembly of a fluid piping system in space is a pretty colossal problem too (and miles of pipes and connections are a massive single point of failure for the system). Dispersing the GPUs might be more practical, but it's not exactly optimal for high-performance computation...

It’s a fun problem to think about, but even if all the problems were solved, we would have rapidly depreciating hardware in orbit that’s impossible to service or upgrade.

>large 2d impact targets

I bet you a million dollars cash that you would not be able to reach them.


The other benefit is that a remaster is a new copyrighted work. If you have a 40-year-old album and you can make the old copies break down, you’ve effectively given your heirs a longer copyright window.


You can add an intermediate SCA stage that exports the uv dependencies as a requirements.txt (uv export can produce that format).



Where I can sometimes get burnt is busybox.

I more often get burnt going from zsh to bash than by that, however.


If they win, there’s a forever revenue stream to extract and they keep their TM law sharp.


There's no revenue stream. JavaScript is the colloquial term used to refer to ECMAScript; ain't nobody paying Oracle if they started trying to enforce it.


The non-troublesome use case is clicking the Starlink button and talking to their support.

Having bought a Subaru, I really tried to work out where in the process the consent happens. In my case, I think it was the account-establishment step that the dealer did.


How is that supposed to work for second hand vehicles?


> that the dealer did

is that even legal?

can one go to court for "absence of consent"?


“You opted in and consented to sharing of information by looking at the car.”


In this case, how is consent revoked?


It’s the author.


Understanding how tokens get passed around. The pattern in Gitlab seems to be much more explicit.

Protected branches and associated secrets. Much cleaner construct on gitlab.

GitHub Actions de facto seems to be tracing YAML to compiled JavaScript to (hopefully the right) source to shell commands.

Gitlab seems to be yaml to shell commands.

Nested projects. A nice middle ground between a monorepo and per-repo access control management.

API. I may be out of date on it, but I recall the GitLab APIs as pretty sensible. The GitHub APIs for administration have a very odd REST/GraphQL split.
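
Rough illustration of that split; the endpoints are the public ones, but the group/org names and tokens are placeholders:

  import requests

  # GitLab: one REST surface covers most admin tasks
  gl = requests.get("https://gitlab.com/api/v4/groups/<group>/projects",
                    headers={"PRIVATE-TOKEN": "<token>"})

  # GitHub: some admin data via REST...
  gh = requests.get("https://api.github.com/orgs/<org>/repos",
                    headers={"Authorization": "Bearer <token>"})

  # ...and other fields only (or more conveniently) via GraphQL
  gql = requests.post("https://api.github.com/graphql",
                      headers={"Authorization": "Bearer <token>"},
                      json={"query": '{ organization(login: "<org>") '
                                     '{ repositories(first: 5) { nodes { name } } } }'})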


I have an LG of about that vintage and it’s starting to black out when playing 4K content. All components upstream of it have been swapped out and are up to date on firmware. Restarting works, sometimes for a whole day, sometimes for a minute.

My other TV, of about the same vintage, is starting to get stuck pixels in the corner.

Modern failure modes aren’t nearly as graceful.

