this post was submitted on 12 Mar 2025
49 points (81.8% liked)

Selfhosted

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Wondering about services to test on either a 16 GB RAM "AI-capable" arm64 board or on a laptop with a modern RTX GPU. Only looking for open-source options, but curious to hear what people say. Cheers!

[–] [email protected] 1 points 2 weeks ago (1 children)

Well, they're fully closed source except for the open-source project they're a wrapper around. The open-source part is llama.cpp.

[–] [email protected] 1 points 2 weeks ago (1 children)

Fair enough, but it's damn handy and simple to use. And I don't know how to do speculative decoding with ollama, which massively speeds up the models for me.
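For reference, llama.cpp itself supports speculative decoding via a small draft model. A rough sketch of launching its server that way (model filenames are placeholders, and the exact flags vary between llama.cpp versions, so check `llama-server --help`):

```shell
# Speculative decoding with llama.cpp's llama-server: a small "draft" model
# proposes tokens that the larger main model then verifies in parallel.
# Model filenames below are placeholders; the draft model should come from
# the same family/tokenizer as the main model.
llama-server \
  -m Qwen2.5-32B-Instruct-Q4_K_M.gguf \
  -md Qwen2.5-0.5B-Instruct-Q4_K_M.gguf \
  --draft-max 16 --draft-min 4 \
  -ngl 99 --port 8080
# -md / --model-draft selects the draft model; --draft-max / --draft-min
# bound how many tokens get drafted per step (flag names may differ by version).
```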

[–] [email protected] 1 points 2 weeks ago

Their software is pretty nice. That's what I'd recommend to someone who doesn't want to tinker. It's just a shame they don't want to open source their software, so we have to reinvent the wheel 10 times. If you're willing to tinker a bit, koboldcpp + Open WebUI/LibreChat is a pretty nice combo.
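For anyone wanting to try that combo: koboldcpp exposes an OpenAI-compatible API, which Open WebUI can point at. A rough sketch, assuming default ports and a local GGUF file (the model filename is a placeholder; check each project's docs for current options):

```shell
# Start koboldcpp serving a local GGUF model; it exposes an
# OpenAI-compatible API under /v1 (default port 5001).
# The model filename is a placeholder.
python koboldcpp.py --model model.Q4_K_M.gguf --port 5001 &

# Run Open WebUI and point it at koboldcpp's OpenAI-compatible endpoint.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:5001/v1 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

Then browse to http://localhost:3000 and the koboldcpp model should show up in Open WebUI's model picker.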