Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.
- Don't duplicate the full text of your blog or GitHub here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues in the community? Report them using the report flag.
Questions? DM the mods!
I have one of those Minisforum AMD HX 370 machines (the X1 AI Pro). It's very powerful, well-made hardware. I use the mini PC as a work computer for 3D and dev on openSUSE, and as a lightweight, low-power gaming machine (e.g. long-haul X-Plane 12 flights during the night).
Everything is well made and beautifully built.
As for this NAS version, if money is no object I wouldn't hesitate: 10 Gb/s networking, tons of RAM, the AMD HX 370. It's certainly overkill for a NAS; it's more tailored to being a very beefy Docker server and/or virtualization station while doubling as a multimedia NAS.
I built my own Synology replacement with second-hand ITX parts in a Jonsbo N3 case, but if I hadn't, or just had plenty of cash to spare, I would definitely go for a server like this one (my use case is NAS + Docker + virtualization + eventual game server, all in one).
As a side note, the "AI" part is just marketing for now: those chips aren't yet supported for local LLMs on Linux (Windows only at the moment). They need ROCm support for the RDNA 3.5 iGPU, plus integration of the new AMD NPU into the local frameworks (llama.cpp etc.).
https://github.com/amd/gaia
It will come for sure, it’s just not ready yet.
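In the meantime, llama.cpp's Vulkan backend is a common workaround for AMD iGPUs that ROCm doesn't yet support, since it runs on the regular Mesa driver instead. A rough build sketch (I haven't tested this on the HX 370 specifically, and the model path is a placeholder):

```shell
# Workaround sketch (untested on the HX 370): llama.cpp's Vulkan backend
# uses Mesa's RADV driver, so it doesn't depend on ROCm iGPU support.
# Assumes git, cmake, a compiler, and the Vulkan SDK/headers are installed.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON          # enable the Vulkan backend
cmake --build build --config Release -j

# -ngl 99 offloads all layers to the iGPU; the model path is a placeholder.
./build/bin/llama-cli -m ./models/some-model.gguf -ngl 99 -p "Hello"
```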
Thanks for that; I almost pulled the trigger so I could have an all-in-one solution for running Immich with AI, Perplexica, and other self-hosted AI things. Currently I just serve the processing power from my MacBook and desktop PC.