this post was submitted on 14 Apr 2026
6 points (56.0% liked)

Selfhosted


So... with all this OpenClaw stuff, I was wondering: what's the FOSS status of something like that to run locally? Can I get my own locally run agent that I can ask to perform simple tasks (go and find this, download that, summarize an article)? I'm just kinda curious about all of this.

Thanks!

[–] hendrik@palaver.p3x.de 1 points 2 days ago* (last edited 2 days ago)

I think you need some agent software, or an MCP server for your existing software. It depends a bit on what you're doing: just chatting and asking questions that need to be googled, vibe coding, or querying the documents on your computer. As I said, there's OpenClaw, which can do pretty much everything, including wreck your computer. I'm also aware of OpenCode, AutoGPT, Aider, Tabby, CrewAI, ...
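For context, most of these agent tools share the same basic loop: the model emits a structured "tool call", the host program executes it, and the result gets fed back into the conversation. A minimal sketch of the dispatch step in plain Python (the tool name and JSON shape here are made up for illustration, not any specific framework's API):

```python
import json

def summarize(text: str) -> str:
    """Toy 'summarizer': just return the first sentence."""
    return text.split(".")[0] + "."

# Registry of tools the agent is allowed to call.
TOOLS = {"summarize": summarize}

def dispatch(tool_call_json: str) -> str:
    """Run a tool call the model emitted as JSON, e.g.
    {"tool": "summarize", "args": {"text": "..."}}"""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["tool"]]  # KeyError here = model asked for an unknown tool
    return fn(**call["args"])

# A real agent loop would append this result to the chat history
# and ask the model for its next step.
print(dispatch('{"tool": "summarize", "args": {"text": "First. Second."}}'))
# → First.
```

The frameworks listed above mostly differ in how much of this loop (and safety around it) they handle for you.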

The Ollama project has some software linked on its page: https://github.com/ollama/ollama?tab=readme-ov-file#chat-interfaces
The entries are sorted by use case and by whether they're desktop software or a web interface. Maybe that's a good starting point.

What you'd usually do is install it and connect it to your model / inference software via that software's OpenAI-compatible API endpoint. But it frequently ends up being a chore. If you use a paid service (ChatGPT), they'll contract with Google to do the search for you, YouTube, etc. Once you do it yourself, you're going to need all sorts of developer accounts and API tokens to automatically access Google's search API, and you might get blocked from YouTube if you host your software on a VPS in a datacenter. That's kinda how the internet is these days: the big companies like Google and their competitors require access tokens or there won't be any search results. At least that was my experience.
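As a concrete example of that "OpenAI-compatible endpoint" wiring: Ollama serves one at http://localhost:11434/v1 by default, and most front-ends just need that base URL plus an API key (which local servers typically ignore, but clients expect). A sketch using only the standard library; the model name is a placeholder, swap in whatever you've actually pulled:

```python
import json
import urllib.request

# Assumed local endpoint: Ollama's default OpenAI-compatible URL.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt: str, model: str = "llama3.2") -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions POST request (not yet sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Dummy key: local servers usually don't check it.
            "Authorization": "Bearer ollama",
        },
    )

req = build_chat_request("Summarize this article: ...")
# To actually send it (needs a running server):
#   with urllib.request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

The nice part is that this same request shape works against any backend that speaks the OpenAI API (llama.cpp's server, LM Studio, vLLM, ...), so front-ends only need the base URL changed.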