If you can't host it offline, it's not privacy-respecting.
this post was submitted on 16 Dec 2025
Privacy
I understand, thx.
Offline Ollama instance + Alpaca. Then use a model that can also use the terminal or look things up on the web, etc.
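For example, once Ollama is running locally you can query it over its loopback HTTP API, so prompts never leave your machine. A minimal sketch, assuming the default port 11434 and that a model such as `llama3` has already been pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; requests go to localhost only.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for a local Ollama instance."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

# Example usage (requires Ollama running and the model pulled):
# req = build_request("llama3", "Summarize this text: ...")
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Because everything stays on localhost, you can verify with a firewall or packet capture that no traffic leaves the machine, which is exactly the property the cloud services can't demonstrate.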
Just because it's an offline instance doesn't mean it's private.
You can't. The cloud is a black box, and LLMs are not end-to-end encrypted. VPN audits can also be faked. Download your own LLM if you want, but you'll suffer unless you have a powerful machine.
Thx. That's more or less what I was thinking. But I won't download my own local AI either, as I really don't wish to use one. I was just curious whether there was any way to verify this kind of claim.
a 'privacy respecting' AI called Euria (which they also claim to be green-ish)
