this post was submitted on 10 Feb 2025
PC Gaming
Where are you getting this information from? Most models under 16B parameters will run just fine with less than 24 GB of VRAM. This GitHub discussion thread for open-webui (a frontend for Ollama) has a decent reference for VRAM requirements.
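As a rough illustration of why sub-16B models fit in under 24 GB (my own back-of-envelope numbers, not figures from the linked thread): VRAM is roughly parameter count times bytes per parameter, which quantization shrinks, plus some overhead for context and KV cache. The 1.2 overhead factor below is an assumption for the sketch.

```python
def estimate_vram_gb(params_billions: float,
                     bits_per_param: int = 4,
                     overhead: float = 1.2) -> float:
    """Rough VRAM needed to load a model, in GB.

    params_billions: model size in billions of parameters.
    bits_per_param:  4 for a typical 4-bit quant, 16 for fp16.
    overhead:        illustrative multiplier for KV cache/context.
    """
    bytes_total = params_billions * 1e9 * (bits_per_param / 8)
    return bytes_total * overhead / 1e9

# A 16B model at a 4-bit quant fits comfortably under 24 GB;
# the same model at fp16 would not.
print(round(estimate_vram_gb(16, bits_per_param=4), 1))   # ~9.6 GB
print(round(estimate_vram_gb(16, bits_per_param=16), 1))  # ~38.4 GB
```

Real requirements vary with context length, architecture, and runtime, so treat this as a first approximation only.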
I should have been more specific: I mean home models that actually compete with paid ones in both accuracy and speed. Please don't be one of those people who exaggerate and pretend it works just as well with much less. It simply doesn't.