this post was submitted on 26 Dec 2025
310 points (99.1% liked)
Linux
I successfully ran Llama locally with llama.cpp and an old AMD GPU. I'm not sure why you think there's no other option.
Llama.cpp now supports Vulkan, so it doesn't matter what card you're using.
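For anyone who wants to try it, the build is roughly this (flag and binary names from my memory of the llama.cpp README, so double-check against the current docs; the model path is a placeholder):

```bash
# Clone llama.cpp and build it with the Vulkan backend
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# Run a model, offloading all layers to the GPU
# (-m: model file, -ngl: number of layers to put on the GPU)
./build/bin/llama-cli -m ./models/model.gguf -ngl 99 -p "Hello"
```

As long as your card has a working Vulkan driver (Mesa RADV on AMD, for example), that's all it takes.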
I mean... my 6700 XT doesn't have official ROCm support, but the ROCm driver works perfectly fine with it. The difference is that AMD hasn't put the effort into testing ROCm on their consumer cards, so they can't claim support for them.
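For anyone else on an "unsupported" RDNA2 card, the usual workaround is overriding the GFX target so ROCm treats it like a supported gfx1030 part. A rough sketch of what that looks like with llama.cpp's HIP backend (the override value is the one commonly reported for the 6700 XT, which is gfx1031; verify the flag names against the current README):

```bash
# Build llama.cpp with the ROCm/HIP backend
cmake -B build -DGGML_HIP=ON
cmake --build build --config Release

# The 6700 XT is gfx1031, which ROCm doesn't officially list;
# pretending to be gfx1030 (10.3.0) is the widely used workaround
HSA_OVERRIDE_GFX_VERSION=10.3.0 ./build/bin/llama-cli -m ./models/model.gguf -ngl 99
```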