AI workloads finally supported on the Framework 13 + Ryzen AI 9 HX 370

Hello there!

The “AI” in “Ryzen AI 9 HX 370” (finally!) becomes a reality with the ROCm Core SDK 7.10.0 technology preview release! It includes “new support for the following AMD Instinct GPUs and Ryzen AI APUs: […] Ryzen AI 9 HX 370”.

Has anyone tried it?

Full release notes


I see no mention of NPU (a.k.a. Hybrid) support. GPU/APU offload has been achievable with older versions of ROCm on the Strix Halo parts for some time, so this doesn’t really bring much to the table. The missing piece of the puzzle is integration of the XDNA/NPU hybrid offload path, which currently relies on a non-public pre-release of the Ryzen AI stack to work with generally accepted best-practice model techniques. There are some specialized proof-of-concept attempts at Hybrid NPU+APU, but it’s not usable in the same sense that APU offload is today, i.e. with tools like ramalama, pulling GGUF models straight from Hugging Face.

The main project AMD is targeting is Lemonade, a fancy wrapper around several other tools that seems to be their internal target inference platform (outside of vLLM, which is aimed at server-class hardware). Hybrid NPU+APU is currently possible with Lemonade only on Windows; in my testing, Lemonade on Linux offers nothing beyond the DIY approach of llama.cpp + ROCm containers for running inference.
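For anyone curious what that DIY llama.cpp + ROCm path looks like in practice, here is a rough sketch. The build flags match recent llama.cpp; the `gfx1150` target for the HX 370 (Strix Point) iGPU and the model filename are assumptions on my part, so check your own GPU target before copying this:

```shell
# Build llama.cpp with HIP/ROCm support (requires the ROCm toolchain installed).
# AMDGPU_TARGETS=gfx1150 is an assumption for the Strix Point iGPU.
cmake -S llama.cpp -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1150
cmake --build build -j

# Serve a local GGUF model, offloading all layers to the iGPU.
# model.gguf is a placeholder for whatever you pulled from Hugging Face.
./build/bin/llama-server -m model.gguf -ngl 99
```

ramalama wraps essentially the same thing in a ROCm container, so `ramalama run <model>` is the lower-effort route if you don’t want to build anything yourself.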


Is Ubuntu 25.10 compatible with that ROCm version? Or do the FW13 AI 300 laptops support 24.04 LTS or 22.04 LTS?