Motherboard sales are now collapsing amid unprecedented shortages fueled by AI
(www.tomshardware.com)
Yup, for local AI you want memory the GPU can access. AMD Strix Point and Mac devices are popular options. A CPU can run LLMs, but very slowly. I've got 32 GB of RAM and 8 GB of VRAM, and it's borderline useless for models that don't fit in the VRAM.
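For anyone wondering why 8 GB of VRAM runs out so fast: a rough back-of-envelope sketch (assuming quantized weight size dominates, and ignoring KV cache and runtime overhead, which add more on top):

```python
def model_weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate on-disk/in-memory size of quantized model weights in GB.

    Rough estimate only: ignores KV cache, activations, and runtime overhead.
    """
    return params_billions * bits_per_weight / 8  # 1e9 params * bits / 8 bits-per-byte / 1e9 bytes-per-GB

def fits_in_vram(params_billions: float, bits_per_weight: float, vram_gb: float) -> bool:
    # Assumption: leave ~1.5 GB headroom for KV cache and the runtime itself
    return model_weight_gb(params_billions, bits_per_weight) + 1.5 <= vram_gb

# A 7B model at ~4.5 bits/weight (typical 4-bit quant) is about 3.9 GB of weights,
# so it fits in 8 GB; a 13B model at the same quant (~7.3 GB) does not leave room.
print(model_weight_gb(7, 4.5))
print(fits_in_vram(7, 4.5, 8.0))
print(fits_in_vram(13, 4.5, 8.0))
```

Once layers spill out of VRAM they run on the CPU, which is why throughput falls off a cliff for models that don't fit.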