this post was submitted on 13 Dec 2025

Artificial Intelligence


Hello there

I am starting to discover and understand local LLMs (as in, running on my own computer).

LM Studio makes it easy for beginners.

There is something I can't make work right: my laptop has an Nvidia T600 graphics card with 2 GB of VRAM. The official Nvidia drivers work well with the OS and graphics applications (Linux Mint), but LM Studio can't use the card.

LLM models run fine, but only on the CPU.

I read somewhere that CUDA is not fully implemented on Linux. Is that the limiting factor? Has anyone managed to offload some work to an Nvidia T600 on a laptop?
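(For reference, a quick way to check whether the NVIDIA driver stack is visible to user programs at all. This is just a sketch: it only checks that the driver tool and the CUDA driver library are on the search paths, not that LM Studio actually uses them.)

```python
import shutil
import ctypes.util

# Is the driver's command-line tool installed and on PATH?
print("nvidia-smi:", shutil.which("nvidia-smi") or "not found")

# Is the CUDA driver library (libcuda.so) visible to the dynamic linker?
# A CUDA compute backend needs this; it ships with the driver, not the toolkit.
print("libcuda:", ctypes.util.find_library("cuda") or "not found")
```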

Thanks

hendrik@palaver.p3x.de (1 week ago, last edited)

CUDA definitely is fully implemented on Linux; after all, the big AI products these days (ChatGPT etc.) run on Linux clusters in datacenters. You'd need to install the CUDA compute packages in addition to the graphics drivers.

But(!) 2 GB of VRAM is pretty limiting. Most decent large language models use far more than that, so they won't fit on your graphics card. Try one of the tiniest models you can find: the model download file shouldn't be much bigger than 1.5 GB or so, otherwise it'll be loaded on the CPU anyway. Maybe there's a guide for LM Studio; I'm not sure whether it bundles all the requirements like CUDA.
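(The 2 GB limit can be sanity-checked with back-of-the-envelope math. This is a rough sketch with an assumed overhead figure, not exact numbers for any particular model.)

```python
# Back-of-the-envelope VRAM estimate for a quantized model.
# overhead_gb is an assumed allowance for KV cache and CUDA buffers.

def model_vram_gb(n_params_billion, bits_per_weight, overhead_gb=0.4):
    """Approximate VRAM in GB: weight storage plus a rough overhead."""
    # 1e9 params * (bits/8) bytes per param / 1e9 bytes per GB
    weight_gb = n_params_billion * bits_per_weight / 8
    return weight_gb + overhead_gb

# ~1B parameters at 4-bit quantization: roughly 0.9 GB, fits in 2 GB
print(f"1B @ 4-bit: ~{model_vram_gb(1.0, 4):.1f} GB")
# 7B parameters at 4-bit: roughly 3.9 GB, far over 2 GB
print(f"7B @ 4-bit: ~{model_vram_gb(7.0, 4):.1f} GB")
```

This is why a ~1.5 GB download is about the ceiling for a 2 GB card: the weights plus runtime buffers have to fit together.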