I am so sorry for your loss of Zeus. 😢 He was a very handsome boy!
Double upvote!
Rules 1, 2 and 6
Interesting! I've seen the sidebar but hadn't thought it offered much advantage. I'll take another look. Model/provider changes are a breeze; I just assumed Claude would do the same, but maybe they want to make it harder to leave their models? Workspaces? Sounds interesting, gotta check it out.
I just discovered how easy it is to view/switch sessions in opencode.
I use opencode, have seen Claude but never used it. What are some examples of things opencode does that Claude doesn't?
When you say "in Lemmy communities", do you mean "on the internet"?
I think it's great that you're spreading the word about the fediverse and helping get people off of big data.
You might consider using a youtube alternative for your channel, perhaps a PeerTube instance.
Not for long.
❤️
You used the past tense. I'm very sorry for your loss. 😢
I want to upvote this more than once.
The option is called "Do Not Sell or Share My Personal Information", which suggests that enabling it tells them not to sell or share your information. But the parenthetical adds "slide left to opt out of sale/share", which suggests the opposite: that disabling the option is what stops them from selling or sharing your information.
In my experience, running Ollama locally works great. I do have a beefy GPU, but even on affordable consumer-grade GPUs you can get good results with smaller models.
So it technically works to run an AI agent locally, but my experience has been that coding agents don't work well. I haven't tried using general AI agents.
I think the amount of VRAM affordable/available to consumers is nowhere near enough to support a context length that's necessary for a coding agent to remain coherent. There are tools like Get Shit Done which are supposed to help with this, but I didn't have much luck.
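To put rough numbers on the VRAM-vs-context problem, here's a back-of-the-envelope KV-cache estimate. The model dimensions are assumptions (roughly an 8B Llama-style model with grouped-query attention), not measurements from any particular setup:

```python
# Why long contexts blow past consumer VRAM: the KV cache grows linearly
# with context length, on top of the model weights themselves.
# Rough formula: 2 (K and V) * layers * kv_heads * head_dim * context * bytes/value.
# Defaults below are illustrative, roughly 8B-Llama-shaped; adjust for your model.

def kv_cache_bytes(n_layers=32, n_kv_heads=8, head_dim=128,
                   context_len=8192, bytes_per_value=2):  # fp16 = 2 bytes
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_value

# 8k context: ~1 GiB of KV cache on top of the weights
print(kv_cache_bytes(context_len=8192) / 2**30)    # 1.0

# 128k context (what a long agent session wants): ~16 GiB just for the cache
print(kv_cache_bytes(context_len=131072) / 2**30)  # 16.0
```

So before you even count the weights, a 128k context costs on the order of 16 GiB at fp16 with these assumed dimensions. Quantized KV caches shrink that, but the scaling stays linear in context length, which is why agent sessions degrade or truncate on consumer cards.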
So I'm using OpenCode via OpenRouter to use LLMs in the cloud. Sad that I can't get local-only to work well enough to use for coding agents, but this arrangement works for me (for now).
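For anyone curious, the cloud setup is mostly just config. I don't have the exact schema in front of me, so treat this as a sketch from memory (the model ID is an example; check opencode's docs for the real field names and model list):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openrouter/anthropic/claude-sonnet-4"
}
```

I believe you then provide your OpenRouter key via opencode's auth flow (or an environment variable), and requests get routed to the cloud model instead of anything local.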