Does the Desktop have ROCm support

Does the Framework Desktop support ROCm?

1 Like

Based on recent videos I’ve watched, Strix Halo support is in release-candidate mode in ROCm. https://youtu.be/7-E0a6sGWgs?si=OpfqxBR9-cDlogJs&t=103

If you are interested in AI use, I would look at the benchmarks here: AMD Ryzen AI MAX+ 395 "Strix Halo" — Llama.cpp Backend Performance Comparison

So you can use ROCm with Strix Halo, but it is a little buggy right now. The Vulkan backend seems to be stable, though slower.
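If you want to compare the two backends yourself, llama.cpp is usually built once per backend. A rough sketch (the CMake flag names are assumptions based on recent llama.cpp versions; check its build docs before copying):

```shell
# Sketch only: configure llama.cpp for the ROCm (HIP) and Vulkan backends.
# Flag names are assumptions from recent llama.cpp; verify against its build docs.
ROCM_FLAGS="-DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1151"   # gfx1151 = Strix Halo
VULKAN_FLAGS="-DGGML_VULKAN=ON"
echo "cmake -B build-rocm $ROCM_FLAGS && cmake --build build-rocm"
echo "cmake -B build-vulkan $VULKAN_FLAGS && cmake --build build-vulkan"
```

Building into two separate build directories lets you benchmark the same model on both backends without reconfiguring.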

1 Like

I dunno, it just seems weird to me, then, that they called it the AI Max but didn’t give it ROCm support out of the box.

What do you mean “didn’t give it ROCm support out of the box”?

Ah, I misread the link; it looks like it is supported/listed, just only in the Windows section.

6.4.1+ has support, and 6.4.3 (the current version) has decent performance. https://rocm.docs.amd.com/ There are some GPU hangs that are in the process of being resolved.
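A quick way to sanity-check whether an installed ROCm meets that 6.4.1 minimum is a `sort -V` comparison (the `/opt/rocm/.info/version` path is an assumption; adjust for your install):

```shell
# Compare an installed ROCm version against the 6.4.1 minimum using sort -V.
need="6.4.1"
have="6.4.3"   # in practice: have=$(cat /opt/rocm/.info/version) -- path is an assumption
lowest=$(printf '%s\n%s\n' "$need" "$have" | sort -V | head -n1)
if [ "$lowest" = "$need" ]; then
  echo "ROCm $have meets the $need minimum"
else
  echo "ROCm $have is too old"
fi
```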

If you’re looking for the latest, I’d recommend using ROCm/TheRock builds. Either install the Python packages in an env, or (probably easier) use the tarballs and symlink them into /opt/rocm or somewhere standard: TheRock/RELEASES.md at main · ROCm/TheRock · GitHub
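A hypothetical sketch of the tarball route (the directory and version names below are placeholders, not real asset names; take the real ones from TheRock’s RELEASES.md):

```shell
# Hypothetical sketch: unpack a TheRock tarball and point /opt/rocm at it.
# The directory names below are placeholders, not real release assets.
PREFIX="$HOME/therock-demo"
mkdir -p "$PREFIX/therock-rocm-7.0"                # stands in for: tar -xf <tarball> -C ...
ln -sfn "$PREFIX/therock-rocm-7.0" "$PREFIX/rocm"  # in practice: sudo ln -sfn ... /opt/rocm
readlink "$PREFIX/rocm"
```

The symlink approach makes upgrades easy: unpack the new tarball next to the old one and re-point the link.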

5 Likes

If you are a beginner like me, you may find those container-based solutions helpful. If you are on Fedora 42, just follow the explanations on his GitHub and ROCm support should work on your Framework Desktop, at least for the text-based models. I am currently trying out his container for image/video creation. I did have issues on openSUSE, probably something related to its container support.
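For reference, ROCm containers on Linux generally need the GPU device nodes passed through. A hedged sketch of the podman invocation (the image name is a placeholder; the toolbox repo documents the real one):

```shell
# Sketch: the device passthrough a ROCm container typically needs on Linux.
# /dev/kfd (compute) and /dev/dri (display/render) are the standard ROCm device nodes.
# The image name is a placeholder -- use the one from the toolbox repo's README.
IMAGE="docker.io/example/strix-halo-toolbox:latest"
CMD="podman run --rm -it --device /dev/kfd --device /dev/dri $IMAGE"
echo "$CMD"
```

If a container can’t see the GPU, a missing `/dev/kfd` passthrough (or group permissions on it) is the usual first thing to check.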

5 Likes

Another good resource: https://strixhalo-homelab.d7.wtf/ along with their Discord.

2 Likes

Yes.
(first let me note that I am not an expert, the following is just what I have found)

On the Windows side…
… you can use LM Studio. In Settings > Runtime, in the left-hand panel, select the option for “ROCm llama.cpp (Windows)”. After that downloads, make sure to also switch the Engine to “ROCm llama.cpp (Windows)” over in the right-hand panel - it will probably be set to Vulkan by default.

For image or video generation, you can try “Amuse”. I only started messing with it yesterday, but it seems to be a Stable Diffusion UI front-end supported by AMD, with a model browser that calls out models compatible with the AMD Strix Halo (Max+ 395). I think it’s using Windows DirectML, not ROCm, but it still seems worth checking out.

NOTE:
Initially, LM Studio did not recognize “ROCm llama.cpp (Windows)” as compatible with my Framework Desktop. It turns out the software and driver package Framework supplies is a bit out of date. I probably could have just updated the GPU driver and been done with it, but I went full belt-and-suspenders: a complete DDU uninstall, then reinstalled the GPU driver (and Adrenalin software) directly from AMD’s site, then fully uninstalled and reinstalled LM Studio. At that point LM Studio recognized “ROCm llama.cpp (Windows)” as a compatible runtime.

On Linux…
… ROCm is finally getting there with gfx1151 support. But I’m going to assume you meant on Windows and leave it at that. There are really good responses in this very thread for Linux users: the link to TheRock on GitHub, plus the link to the “kyuz0/amd-strix-halo-toolboxes” GitHub repo, are where you want to start - see the post by @Jiral for links.
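Until a ROCm build with real gfx1151 kernels lands, some users work around missing support by overriding the reported GPU target with ROCm’s `HSA_OVERRIDE_GFX_VERSION` environment variable. This is an unofficial, unsupported workaround, and the specific override value below is an assumption (it masquerades as gfx1100/RDNA3); prefer a ROCm version with native gfx1151 support when you can:

```shell
# Unofficial workaround sketch: make ROCm treat the GPU as a supported target.
# The value is an assumption (gfx1100 = 11.0.0); results vary, and a ROCm
# release with native gfx1151 support is always the better option.
export HSA_OVERRIDE_GFX_VERSION=11.0.0
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
```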

4 Likes

ROCm 7.0.1

1 Like

ROCm 6.4.4 adds formal support for both Windows and Linux for Strix Halo.

https://www.amd.com/en/blogs/2025/the-road-to-rocm-on-radeon-for-windows-and-linux.html

3 Likes

I am unable to replicate your success. How did you do your DDU?

For the record, I did what you WISHED you did, and it worked. I installed the latest AMD software (25.9.1) and then REBOOTED. LM Studio then let me install the missing runtime. Thank you for your post.

1 Like

I tried again, this time installing 25.10.2. It worked after rebooting. Thank you!