this post was submitted on 01 Feb 2025
Political Memes
KoboldCpp, Hugging Face, grab a model that fits your VRAM in GGUF format. I think it's two clicks after it's downloaded.
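Picking "a model that fits your VRAM" comes down to simple arithmetic. A rough sketch (the bits-per-weight and overhead figures are my own ballpark assumptions, not from this thread; real usage varies with quant type and context length):

```python
# Rough rule of thumb for whether a quantized GGUF model fits in VRAM.
# Assumptions: a mid-range 4-bit quant averages ~4.5 bits per weight,
# and we pad ~15% for KV cache and runtime overhead at modest context sizes.

def gguf_vram_gib(params_billions: float,
                  bits_per_weight: float = 4.5,
                  overhead: float = 1.15) -> float:
    """Estimated GiB of VRAM to fully offload a quantized model."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8 * overhead
    return total_bytes / 2**30

for size in (7, 14, 70):
    print(f"{size}B @ ~4.5 bits/weight: ~{gguf_vram_gib(size):.1f} GiB")
```

By this estimate a 7B quant lands around 4 GiB and a 14B around 8 GiB, which is why those are the sizes people test on consumer GPUs, while the full-size cloud models are far out of reach.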
It's the same model, your PC just sucks lmfao
Buddy, I've been running and testing 7B and 14B compared to the cloud DeepSeek. Any sources, any evidence to back up what you're saying? Or just removed and complaining?
You can do it in LM Studio in like 5 clicks, I'm currently using it.
I mean, obviously you need to run a lower-parameter model locally; that's not a fault of the model, it's just not having the same computational power.