CachyOS (Arch) Ollama / Docker iGPU recognition

Hi

I have everything up to date, from ROCm and Vulkan to the distro itself.

First I tried to get Ollama running as a Docker container with GPU passthrough. No luck; the iGPU wasn't found.
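
For reference, this is roughly what I ran - a minimal sketch of the usual ROCm device passthrough for Ollama's Docker image (assumes the host exposes `/dev/kfd` and `/dev/dri` and your user can access them):

```bash
# ROCm passthrough for Ollama's rocm image: the container needs the
# kernel fusion driver (/dev/kfd) and the DRM render nodes (/dev/dri).
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```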

Second, I tried installing Ollama natively via the CLI install command. That worked, but only with CPU support; the iGPU is reported as unsupported.
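
To confirm it was CPU-only, I checked what the server detected at startup (assuming the install script set up the usual `ollama` systemd unit):

```bash
# The server logs report which compute backends and GPUs were found.
journalctl -u ollama --no-pager | grep -iE 'gpu|rocm|amd'
# With a model loaded, "ollama ps" shows whether it runs on CPU or GPU.
ollama ps
```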

Did anyone get it running?


Which Linux distro are you using?

CachyOS

Which release version?

Latest

Which kernel are you using?

Latest

Which BIOS version are you using?

Latest

Which Framework Desktop model are you using? (AMD Ryzen™ AI Max 300 Series)

395

I did not get Ollama to work with the iGPU on openSUSE Tumbleweed either. llama.cpp with Vulkan, on the other hand, worked just fine. ROCm gave me trouble on openSUSE too, but worked on Fedora 42 with the ready-made container. I could not see much of an advantage in using ROCm for the LLMs I am interested in, though, so I just stuck with Vulkan.

Don't use Ollama - they don't support AMD properly. Just use llama.cpp directly - there are prebuilt Vulkan binaries, or you can compile it yourself. Plus it will save you a lot of headaches down the road.
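
If you compile it yourself, something like this should do it - a sketch, assuming the Vulkan SDK and drivers are installed (on Arch-based distros like CachyOS that's roughly `vulkan-headers`, `vulkan-icd-loader`, `shaderc`, `glslang`):

```bash
# Build llama.cpp with the Vulkan backend enabled.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Quick check that the iGPU is picked up:
# -ngl 99 offloads all layers to the GPU; the model path is a placeholder.
./build/bin/llama-server -m /path/to/model.gguf -ngl 99
```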

Ollama looks attractive for beginners, but there are so many hidden gotchas (like the 4096-token context by default) that will just lead to frustration later. I know, I've been there.
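
If you do stay on Ollama for a while, that context default can at least be raised per model with a Modelfile - a sketch, where the model name is just a placeholder:

```bash
# num_ctx is Ollama's context-window parameter; 16384 is an example value.
cat > Modelfile <<'EOF'
FROM llama3.1
PARAMETER num_ctx 16384
EOF
ollama create llama3.1-16k -f Modelfile
ollama run llama3.1-16k
```

With llama.cpp you set it explicitly instead, e.g. `llama-server -c 16384`, so nothing truncates your prompts silently.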


Thank you for the heads-up.

I will try it over the weekend 🙂