CUDA is definitely fully supported on Linux. All the big AI products these days (ChatGPT etc.) run on Linux clusters in datacenters. You'd need to install the CUDA compute packages (toolkit/runtime) in addition to the graphics driver, though.

But(!) 2 GB of VRAM is pretty limiting. Most decent large language models need way more than that, so they won't fit on your graphics card. Maybe try one of the tiniest models you can find; the model download shouldn't be much bigger than about 1.5 GB, otherwise it'll end up running on the CPU anyway. There might be a guide for LM Studio, though I'm not sure whether it bundles all the requirements like CUDA.
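If you want to sanity-check the setup before downloading anything, here's a minimal sketch (assuming you have PyTorch with CUDA support installed; the exact tooling doesn't matter) that confirms CUDA is visible and shows how much VRAM is actually free:

```python
# Minimal sketch: check that the CUDA driver/toolkit stack is visible
# and how much VRAM is available. Assumes PyTorch was installed with
# CUDA support (e.g. `pip install torch`); swap in whatever tool you prefer.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    free_bytes, total_bytes = torch.cuda.mem_get_info(0)
    print(f"GPU: {props.name}")
    print(f"Total VRAM: {total_bytes / 1024**3:.1f} GiB")
    print(f"Free VRAM:  {free_bytes / 1024**3:.1f} GiB")
else:
    print("CUDA not available - check the driver and CUDA packages.")
```

If that prints your card with roughly 2 GiB total, the driver/CUDA side is fine and the only remaining constraint is picking a model small enough to fit.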