Any FW16 owner with a GPU via Oculink?

Personally, I don’t think moving from a unified CPU config with 8 Zen 4 cores to a hybrid config with 4 Zen 5 and 8 Zen 5c cores should be considered “tiny”. That is a very big change, with obvious pros and cons.
Pros include dramatically better energy efficiency, especially at low TDPs. The 28W HX 370 in the Zenbook S 16 punches well above its class, surpassing the 45W (with liquid metal) 7840HS in the Framework 16 in many CPU-heavy workloads, according to a review from Phoronix.
Cons: between the Zen 5 cluster and the Zen 5c cluster there is a VERY high core-to-core latency (~150 ns), according to MLID and Geekerwan. Geekerwan also found that this latency affects many games that need more than 4 cores and drags gaming performance down quite a bit. And the iGPU will be bottlenecked even with LPDDR5X-7500 if the game is really memory-bandwidth intensive (looking at you, PUBG and ZZZ).
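(For a rough sense of the bandwidth ceiling involved, here is the back-of-the-envelope math as a small sketch; the 128-bit bus width is my assumption about the typical configuration, not a figure from the reviews above.)

```python
# Back-of-the-envelope: peak theoretical memory bandwidth available to the iGPU.
# ASSUMPTION: a 128-bit memory bus, the typical width for these APUs.
transfer_rate_mt_s = 7500        # LPDDR5X-7500: 7500 mega-transfers per second
bus_width_bytes = 128 // 8       # 128-bit bus = 16 bytes per transfer

bandwidth_gb_s = transfer_rate_mt_s * bus_width_bytes / 1000
print(f"Peak theoretical bandwidth: {bandwidth_gb_s:.0f} GB/s")  # ~120 GB/s, shared with the CPU
```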
It’s up to you if you still want to call these changes “a tiny update”.

Let us know if/how it works, very interested :slight_smile:

1 Like

As Mark points out, there have been substantial improvements/changes.
For the particular use case I’m after, the improvement in power efficiency and a better iGPU (hopefully with LPCAMM2, but that might be too much for Framework as of now) would make the price a less bitter bite, so to speak.
Basically, what I’m looking for from Framework for our tiny company is machines that are powerful (within reason) enough to substitute for a desktop, yet efficient enough on battery to work a full day/1.X days when going around.
The fact that Framework was eager to announce they would be skipping the 8040 series but hasn’t released any kind of statement regarding Strix makes me believe they have something in the oven, but of course, without official confirmation that’s just castles in the sky.
Every company has its own rules and procedures, and that’s OK. Although Framework is on a good track, they still have to build more of a reputation to reach a wider customer segment, and the last thing they need is undelivered promises. Still, I can’t see how flagging a (very) likely development, even under a big question mark, so that customers can plan accordingly could ever be harmful.

And going back to the topic, the reason for my interest in eGPU compatibility through Oculink is basically the possibility of using the laptop as a desktop replacement.

Did you receive it and have a chance to try it out? :pleading_face:

Nope, parcel still in Germany lol, hope it won’t be too long, it should be this week ^^

1 Like

Some news, guys!

It’s alive!

Currently using the DEG1 Oculink x4 dock (PCIe 4.0) from Minisforum with a 4070 Ti on the Framework 16!

I ran some quick benchmarks and here are some results:

  • 3DMark [Time Spy Extreme] → Score: 9197 / Graphics: 10613 / CPU: 5239
  • PassMark PerformanceTest [3D only] → Score: 32221 (average 4070 Ti score: 31772)
    (sorry I was unable to upload my screenshots)

I had to run the error 43 fixer script, which took me a second, and then Nvidia GeForce Experience was able to detect the card and install the needed drivers; after a restart everything was running perfectly.
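(If anyone wants a quick, read-only way to check whether the card is currently flagged with error 43, before or after running the fixer, something like the Python sketch below should do it on Windows, assuming PowerShell is available; it's only a diagnostic, not the fix itself.)

```python
# Read-only check: list display adapters and flag any reporting Code 43
# ("Windows has stopped this device"). Assumes Windows + PowerShell.
import json
import subprocess

ps = (
    "Get-CimInstance Win32_PnPEntity -Filter \"PNPClass='Display'\" | "
    "Select-Object Name, ConfigManagerErrorCode | ConvertTo-Json"
)
out = subprocess.run(
    ["powershell", "-NoProfile", "-Command", ps],
    capture_output=True, text=True, check=True,
).stdout

devices = json.loads(out)
if isinstance(devices, dict):   # a single adapter comes back as one object, not a list
    devices = [devices]

for dev in devices:
    code = dev["ConfigManagerErrorCode"]
    state = "Code 43 (stopped by Windows)" if code == 43 else ("OK" if code == 0 else f"error code {code}")
    print(f"{dev['Name']}: {state}")
```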

I haven’t tested any games yet, but I’ll do it soon.

6 Likes

Hey @Pecorjunior thank you so much for even making a video!

Did you plug it into the M.2 2230 slot or the M.2 2280 one?

2 Likes

Hello @Procurement,

I used the 2280 slot, as it’s above the 2230 one, and I removed the input deck plate.

If I had used the 2230 slot, both slots would have been taken up by the Oculink M.2 adapter, which means my OS would have had to be installed somewhere else (a USB stick or a storage module).

1 Like

@Pecorjunior Yeah guessed so :laughing:

Thank you so much for the test once more. You should head over to eGPU.io now and post the build, and maybe link it here so that it gets some visibility haha. I’m sure others will be glad to know about this setup as well :slight_smile:

1 Like

Just to add some impressions from yesterday evening: I was able to run Dead by Daylight at the highest settings, 4K, 120 Hz, without any freeze or issue at all.

I played for about 2 hours; the GPU averaged around 65% usage and 160 W of power consumption.

That said, DbD isn’t the most demanding game and the GPU wasn’t running at full load,
so I couldn’t really tell whether the x4 Oculink bandwidth reduces performance much or not.
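(For reference, a rough sketch of the theoretical link bandwidth, assuming the DEG1 gives a full PCIe 4.0 x4 link and ignoring all protocol overhead except the line encoding.)

```python
# Rough sketch: theoretical PCIe bandwidth of the Oculink x4 link vs. the x16 slot
# a 4070 Ti would normally sit in (PCIe 4.0, 16 GT/s per lane, 128b/130b encoding).
GT_PER_LANE = 16          # PCIe 4.0 transfer rate per lane, in GT/s
ENCODING = 128 / 130      # 128b/130b line-encoding efficiency

def bandwidth_gb_s(lanes: int) -> float:
    return lanes * GT_PER_LANE * ENCODING / 8   # GT/s -> GB/s, per direction

print(f"Oculink x4: {bandwidth_gb_s(4):.1f} GB/s")    # ~7.9 GB/s each way
print(f"Full x16  : {bandwidth_gb_s(16):.1f} GB/s")   # ~31.5 GB/s each way
```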

Nonetheless, 4K 120 Hz with no freezes is outstanding from my POV! (I had been using the 780M since April.)

Yep, you’re right, I should do that!

I’ll try to take the time to do it this week.

(edit: well, I’ll do it later, I don’t have the time this week lol)

Congratulations, nice setup. Do you know if this would improve SteamVR performance for the Meta Quest 3 when the video feed is streamed over Wi-Fi?

I can try some SteamVR performance tests with the Oculus Quest 2 this afternoon if that interests you.

In theory it should improve performance, as SteamVR would use the eGPU (in my case the 4070 Ti) instead of the integrated graphics.

For the Wi-Fi part, it’s mostly a matter of bandwidth and latency overall.

(The video output from the eGPU presumably makes its way to the VR headset via the SteamVR software, which sends it over Wi-Fi.)

Anyway, it would be cool to confirm the performance uplift in such a configuration.

1 Like

That would be amazing. Thank you. I’ve not heard of anyone doing or testing this, so you’re trailblazing for us all.

I wasn’t 100% sure whether the video feed would go back to the laptop or whether it had to be taken from the GPU straight to a monitor, if you see what I mean. But seeing as it’s essentially a PCIe connection, it should behave like an internal GPU in that regard, right?

1 Like

At least I think it should, yes; plus, the FL16 has a MUX that should be able to disable the iGPU to avoid that kind of issue.

Though I don’t know if the MUX works for the M.2 slots or only for the FL16’s dedicated PCIe lanes. (I don’t know a lot about MUXes, so if someone has more info, feel free to correct me.)

In the end, the only way to be sure of those theories is to test them!

1 Like

The pinout for the expansion bay connector shows that there’s a DisplayPort connection from the dGPU board back to the laptop for the purpose of driving the internal display. That’s what the MUX on the motherboard is switching: it’s a pretty dumb KVM-type device which just switches the internal display between two sources of display signal, either the iGPU or the dGPU.

The OcuLink eGPU scenario doesn’t have any provision to send the DisplayPort signal back down the cable, so the MUX will not really help there. In fact, it might even lead to some black-screen issues if the AMD software decides to switch the internal display to the dGPU when there is no signal from it.

I wonder if @Josh_Cook has thought of that? It could be a cool idea to include a (mini-)DisplayPort input on the OcuLink expansion card so that the eGPU can drive the internal display, the way PC Thunderbolt controller cards sometimes have DisplayPort inputs.

3 Likes

I was able to reinstall my VR setup before going to sleep! So here is a screenshot of Half-Life: Alyx running over Air Link paired with SteamVR, with FPS, % usage, etc…

In the SteamVR console we can see this:

Basically, it says that the fill rate needed to run HL: Alyx at 72 Hz at 2080x2096 on the Oculus is 570 MP/s, but the 4070 Ti averages 3282 MP/s, so it automatically upscales the render target to 2544x2544. (Even though it could technically upscale further, since the card is 5.75x faster than what the game initially demands.)
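(To make those numbers concrete, here’s the arithmetic as a small sketch; the exact rule SteamVR uses to cap its auto-resolution isn’t shown in the console output, so the last line just compares ratios rather than reproducing its algorithm.)

```python
# Sketch of the headroom arithmetic from the SteamVR console output above.
required_mp_s = 570      # fill rate SteamVR reports as needed for HL: Alyx at 72 Hz, 2080x2096
measured_mp_s = 3282     # fill rate it measured for the 4070 Ti

print(f"Raw headroom: {measured_mp_s / required_mp_s:.2f}x")        # ~5.76x

base_pixels = 2080 * 2096        # reported default render-target size
chosen_pixels = 2544 * 2544      # render target SteamVR actually picked
print(f"Auto-resolution applied: {chosen_pixels / base_pixels:.2f}x pixels")  # ~1.48x, well below 5.76x
```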

The Oculus Quest 2 is locked at 72 Hz, so I can’t change that value… But I could try to upscale even more on a next run to see where it starts to throttle lol.

2 Likes

So it works, that’s great news. Thank you so much for testing this.

I have ordered most of the parts I need; I can’t wait to try this out myself. I don’t have the graphics expansion module in my FW, so when I try SteamVR it completely fails. Even the SteamVR Home app is so pixelated I can’t read any of the text.

1 Like

I totally understand, the iGPU sucks for demanding games lol.

If I can give you one piece of advice: the only quirk of this setup is the software side for Nvidia cards. Error 43 may show up (or not, but it will still be an issue); search for it on eGPU.io and RUN the fixer program, and then GeForce Experience or any driver install will work.

Yep, I have a full-sized DisplayPort input on my current design.

3 Likes