I’m trying to get Ollama to run on my FL16 with the dGPU, but it doesn’t detect the GPU. Is there a reason why that wouldn’t work? I have ROCm installed.
Works perfectly on Arch. Just install the ollama-rocm package, start the service, and enjoy talking to various LLMs. It’s fun using nvtop to watch the GPU usage shoot to 100% for a few seconds while it answers your question.
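For reference, the whole setup is roughly this (the model name is just an example, pull whichever one you like):

```bash
# Install Ollama with ROCm support from the Arch repos
sudo pacman -S ollama-rocm

# Start the service now and on every boot
sudo systemctl enable --now ollama

# Pull and chat with a model; run nvtop in another terminal
# to watch the GPU load spike while it generates
ollama run llama3
```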
Thanks, that worked. But I wanted to use the Docker image that bundles open-webui and Ollama, and Docker refuses to use the GPU no matter what I try. It always fails with: `docker: Error response from daemon: could not select device driver "" with capabilities: [[gpu]]`.
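From what I can tell, that error shows up when the container is launched with `--gpus`, which makes Docker look for the NVIDIA container runtime; with ROCm the GPU is supposed to be passed through as device nodes instead. Something like this is the shape of it, though the image tag, volumes, and port mapping here are guesses based on the open-webui docs, and whether that bundle’s Ollama build includes ROCm support is another assumption:

```bash
# ROCm containers get the AMD GPU via device nodes, not --gpus
# (image tag, volumes, and ports are assumptions from the open-webui README)
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  -p 3000:8080 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:ollama
```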