exu, 5 months ago: llama.cpp supports OpenCL as well and, in my limited experience, performs better than ROCm. That should work on basically any GPU.
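For anyone wanting to try this: a rough sketch of how llama.cpp's OpenCL path has typically been enabled at build time (via the CLBlast backend) and used at runtime. Exact flag names have changed across llama.cpp versions, and the model path below is a placeholder, so check the repo's current build docs before copying this.

```shell
# Clone and build llama.cpp with the OpenCL (CLBlast) backend enabled.
# -DLLAMA_CLBLAST=ON is the historical CMake switch; newer versions may
# name the option differently.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DLLAMA_CLBLAST=ON
cmake --build build --config Release

# Run inference, offloading layers to the GPU with -ngl.
# model.gguf is a placeholder path to a quantized model file.
./build/bin/main -m model.gguf -ngl 32 -p "Hello"
```

Because OpenCL targets a generic device interface rather than a vendor stack, this route avoids a ROCm install entirely, which is why it tends to "just work" on a wider range of GPUs.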