Ryzen AI with AMD IPU on Framework laptops?

Do the AMD Ryzen APUs on the Laptop 13 support AMD’s “Ryzen AI”?

This software [1] for accelerating neural networks requires both:

  1. an AMD Inference Processing Unit (IPU), which is contained in some (but not all) Ryzen 7000 mobile processors (HS and U series) [2]. In some publications it is called “XDNA”.
  2. enablement by the OEM. Apparently some PCs with Ryzen 7000 APUs on the market do NOT support the AMD IPU, simply because the manufacturer has not enabled it. [3]

[1] https://ryzenai.docs.amd.com/en/latest/index.html
[2] https://www.amd.com/en/products/ryzen-ai
(click to open the footnotes at the bottom of the page)
[3] AMD IPU Device not detected on 7940hs · Issue #5 · amd/RyzenAI-cloud-to-client-demo · GitHub

Is there a clear statement from Framework Computer Inc. on whether “Ryzen AI” and the AMD IPU are supported on their laptops? I am currently looking at the Laptop 13, but the same question applies to the Laptop 16.

Best regards, Bernhard

7 Likes

Welcome to the forum!

AFAIK, so far there hasn’t been any official communication on whether the AMD IPU is present in the specific SKUs shipping in Framework laptops, or whether it is enabled. Your best bet for an answer would be to create a ticket and reach out to Framework Support.

Physically Slaying, Mentally Decaying
Kai

Wow, I ordered an AMD model hoping to work with this new Ryzen AI tech, and I’m in batch 1. This has to be my workstation for work, so I would be really disappointed if the feature were not implemented…

Is this enablement a matter of hardware design, or can it be addressed by a firmware update?

According to the official Ryzen AI Software documentation, it appears to be a matter of installing the software.

The Ryzen AI Software Platform supports AMD Ryzen 7040U, 7040HS series mobile processors with Windows 11 OS.
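
If that is the case, then after following AMD’s install steps one way to sanity-check the software side might be to ask ONNX Runtime which execution providers it sees; the Ryzen AI stack exposes the IPU through the Vitis AI execution provider. This is only a minimal sketch, assuming the Ryzen AI Software and its ONNX Runtime package are installed per the docs:

```python
# Minimal sanity check (assumption: the Ryzen AI Software and its ONNX Runtime
# build with the Vitis AI execution provider have been installed per AMD's docs).
import onnxruntime as ort

providers = ort.get_available_providers()
print("Available execution providers:", providers)

if "VitisAIExecutionProvider" in providers:
    print("Vitis AI execution provider found - the IPU path should be usable.")
else:
    print("No Vitis AI provider - inference would fall back to CPU/GPU.")
```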

As mentioned before, though, I would highly recommend opening a support ticket and seeing if you can get confirmation through official channels; since no end user has the machine in hand yet, anything said by us is simply speculation.

1 Like

If the term “SKU” refers to AMD’s processors, this detail can be found in AMD’s documentation (see link [2] in the original post above):

Ryzen™ AI technology is compatible with all AMD Ryzen 7040 series processors except the Ryzen 5 7540U and Ryzen 3 7440U. OEM enablement is required. Please check with your system manufacturer for feature availability prior to purchase. GD-220.

I would assume this list may be extended with processor models released in the future.
What struck me was the relevance of OEM support, i.e. just having the right processor is not enough. The OP’s link [3] mentions a few affected devices.

OK, thanks. I opened a ticket this morning, referring to this thread. I’d be pleased if they respond right here in the forum. Otherwise I’ll let you know of any personal response to me.

Quick update: Framework’s support kindly responded to my ticket, essentially saying:

We understand that you want to know if the AMD Ryzen APUs on the Laptop 13 support AMD’s IPU and “Ryzen AI”. We will be sharing more information before we start shipping Framework 13 AMD Ryzen 7040 Series. We suggest you register in our blog post to get newsletters and product updates.

So I will follow the blog posts from now on.

To early reviewers of Ryzen machines: you are welcome to contribute your own findings.

2 Likes

Two reports of the IPU showing up in Device Manager here: [RESPONDED] AMD Batch 1 Guild - #784 by joeshmoo
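
For anyone who wants to check their own unit, one unofficial way is to query the PnP device list from the OS, e.g. from Python on Windows. Rough sketch only; the exact device name (“AMD IPU Device”) is an assumption based on the reports above:

```python
# Rough sketch for Windows: list PnP devices whose name mentions "IPU" or "NPU".
# Assumption: the device shows up with a friendly name like "AMD IPU Device".
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-PnpDevice | Where-Object { $_.FriendlyName -match 'IPU|NPU' } "
     "| Select-Object Status, FriendlyName | Format-Table -AutoSize"],
    capture_output=True, text=True,
)
print(result.stdout or "No IPU/NPU device found (or not running on Windows).")
```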

3 Likes

If you’re interested in Ryzen AI support, especially on Linux, this article may be of interest. Make your voice heard.

5 Likes

There is some progress:

4 Likes

Here is a tutorial that should get it working: GitHub - AMDResearch/Riallto: The Riallto Open Source Project from AMD

1 Like

Riallto appears to be Windows only :frowning:

I’m trying via the Discord for https://lmstudio.ai/ (which is supported by AMD), but am not getting very far so far: it only works with GPU offload turned off.

Feel free to install and chip in?
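
For anyone reproducing this outside the LM Studio GUI, the same knob is exposed by llama-cpp-python, the Python bindings for the llama.cpp engine LM Studio wraps. A small sketch, with the model path as a placeholder:

```python
# Sketch using llama-cpp-python (bindings for the same llama.cpp engine that
# LM Studio wraps). n_gpu_layers=0 corresponds to "GPU offload turned off";
# the model path below is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="path/to/model.gguf", n_gpu_layers=0)  # CPU only
out = llm("Q: What is an IPU? A:", max_tokens=64)
print(out["choices"][0]["text"])
```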

lmstudio is a wrapper around llama.cpp, which supports AMD GPUs through ROCm. ROCm only supports GPUs, not NPUs, so llama.cpp does not support the NPU either, and thus neither does lmstudio.

Although there is this: GitHub - huggingface/optimum-amd: AMD related optimizations for transformer models