Any FW16 owner with a GPU via Oculink?

Some news, guys!

It's alive!

Currently using the DEG1 OCuLink 4x (PCIe 4.0) from Minisforum with a 4070 Ti on the Framework 16!

I ran some quick benchmarks and here are some results:

  • 3DMark [Time Spy Extreme] → Score: 9197 / Graphics: 10613 / CPU: 5239
  • PassMark PerformanceTest [3D only] → Score: 32221 (Average 4070 Ti score: 31772)
    (sorry, I was unable to upload my screenshots)

I had to run the error 43 fixer script, which took about a second, and then Nvidia GeForce Experience was able to detect and install the needed drivers. After a restart, everything runs perfectly.
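In case it helps anyone checking their own setup, here's a quick sketch (my own, not the fixer script itself, assuming Windows and the standard Nvidia tools) to confirm the card no longer reports Code 43 and that the driver actually sees it:

```python
# Rough sketch: verify the eGPU no longer shows Code 43 and that nvidia-smi sees it.
import subprocess

# List devices that Device Manager reports as being in an error state (43 = driver problem).
ps_cmd = (
    "Get-CimInstance Win32_PnPEntity | "
    "Where-Object { $_.ConfigManagerErrorCode -ne 0 } | "
    "Select-Object Name, ConfigManagerErrorCode | Format-Table -AutoSize"
)
print(subprocess.run(["powershell", "-Command", ps_cmd],
                     capture_output=True, text=True).stdout)

# If the driver installed correctly, nvidia-smi should list the 4070 Ti here.
print(subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True).stdout)
```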

I haven’t tested any games yet, but I’ll do it soon.

6 Likes

Hey @Pecorjunior thank you so much for even making a video!

Did you plug it into the M.2 2230 slot or the M.2 2280 one?

2 Likes

Hello @Procurement,

I used the 2280 slot, as it's above the 2230, and removed the input deck plate.

If I had used the 2230, both slots would have been taken up by the OCuLink M.2 adapter, which means my OS would have to be installed somewhere else (USB stick or storage expansion module).

1 Like

@Pecorjunior Yeah guessed so :laughing:

Thank you so much for the test once more. You should now go post the build on egpu.io and maybe link it here so that it gets some visibility haha. I'm sure others will be glad to know about this setup as well :slight_smile:

1 Like

Just to add some impressions from yesterday evening: I was able to run Dead by Daylight at the highest settings, 4K, 120 Hz, without any freeze or issue at all.

I played for about 2 hours; the GPU was at 65% usage and 160 W power draw on average.

That said, DbD isn't the most demanding game and the GPU wasn't at full load,
so I wasn't able to tell whether the x4 OCuLink bandwidth reduces performance much or not.

Nonetheless, 4K 120 Hz without freezes is outstanding from my POV! (I had been using the 780M since April.)
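For anyone wanting to check what the x4 link is actually doing while playing, nvidia-smi can report the current PCIe generation/width along with load and power. A small sketch using the standard nvidia-smi query fields:

```python
# Poll the current PCIe link and GPU load while a game runs, to see whether
# the OCuLink connection really negotiates Gen4 x4 and how hard the card works.
import subprocess, time

QUERY = "pcie.link.gen.current,pcie.link.width.current,utilization.gpu,power.draw"

for _ in range(10):  # sample once per second for ~10 seconds
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(out)  # e.g. "4, 4, 65 %, 160.00 W"
    time.sleep(1)
```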

Yep you’re right, I should do that !

I’ll try to take the time to do it this week.

(edit: well, I'll do it later, I don't have the time this week lol)

Congratulations, nice setup. Do you know if this would improve SteamVR performance for the Meta Quest 3, when the video feed is streamed over Wi-Fi?

I can try some SteamVR performance tests with the Oculus Quest 2 this afternoon if you're interested.

In theory it should improve performance, as SteamVR would use the eGPU (in my case the 4070 Ti) instead of the integrated graphics.

For the Wi-Fi part, it's just a matter of bandwidth and latency overall.

(The video output from the eGPU presumably goes to the VR headset thanks to the SteamVR software, which sends it over Wi-Fi.)

Anyway, it would be cool to confirm the performance uplift in such a configuration.
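If anyone needs to make sure an app actually lands on the eGPU rather than the 780M, one option (assuming Windows 10/11; the executable path below is just an example, adjust it to your own install) is the per-app GPU preference that the Settings > Display > Graphics page writes to the registry. A sketch:

```python
# Sketch: force an app onto the "high performance" GPU (the eGPU) on Windows.
# This writes the same per-app preference as Settings > Display > Graphics.
import winreg

# Example path - point this at whatever executable you want to pin to the eGPU.
app_exe = r"C:\Program Files (x86)\Steam\steamapps\common\SteamVR\bin\win64\vrserver.exe"

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
# GpuPreference=2 means "high performance"; 1 means "power saving" (the iGPU).
winreg.SetValueEx(key, app_exe, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)
```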

1 Like

That would be amazing. Thank you. I’ve not heard of anyone doing or testing this, so you’re trailblazing for us all.

I wasn't 100% sure whether the video feed would go back to the laptop or if it had to be taken from the GPU straight to a monitor, if you see what I mean? But seeing as it's essentially a PCIe connection, it should behave like an internal GPU in that regard, right?

1 Like

At least I think it should, yes. Plus, the FL16 has a MUX that should be able to disable the iGPU to avoid that kind of issue.

Though I don't know if the MUX works for the M.2 slot or only for the dedicated PCIe lanes of the FL16. (I don't know a lot about MUXes, so if someone has more info, feel free to correct me.)

In the end, the only way to be sure about those theories is to test them!

1 Like

The pinout for the expansion bay connector shows that there's a DisplayPort connection from the dGPU board back to the laptop for the purpose of driving the internal display. That's what the MUX on the motherboard is switching - it's a pretty dumb KVM-type device which just switches between two sources of display signal for the internal display: either iGPU or dGPU.

The OCuLink eGPU scenario doesn't have any provision to send the DisplayPort signal back down the cable, so the MUX will not really help there. In fact, it might even lead to some black-screen issues if the AMD software decides to switch the internal display to the dGPU, since there would be no signal.

I wonder if @Josh_Cook thought of that? It could be a cool idea to include a (mini-)DisplayPort input on the OCuLink expansion card so that the eGPU can drive the internal display, like PC Thunderbolt controller cards sometimes have DisplayPort inputs.

3 Likes

I was able to reinstall my VR setup before going to sleep! So here's a screenshot of Half-Life: Alyx running over Air Link paired with SteamVR, with FPS and % usage etc…

In the SteamVR console we can see this:

Basically, it says that the score needed to run HL: Alyx at 72 Hz at 2080x2096 on the Oculus is 570 MP/s, but the 4070 Ti averages 3282 MP/s, so it automatically upscales to 2544x2544. (Even though it could technically upscale more, as it is 5.75x faster than what the game initially demands.)
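Just to spell out the math on those numbers (my own back-of-the-envelope calculation on the values from the console):

```python
# Back-of-the-envelope math on the numbers SteamVR reported above.
needed = 570      # MP/s required for 72 Hz at 2080x2096
measured = 3282   # MP/s the 4070 Ti averaged over OCuLink

print(f"headroom: {measured / needed:.2f}x")  # ~5.76x, the "5.75x faster" mentioned above

base_pixels = 2080 * 2096
scaled_pixels = 2544 * 2544
print(f"pixel increase actually applied: {scaled_pixels / base_pixels:.2f}x")  # ~1.48x
```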

The Oculus Quest 2 is locked at 72 Hz, so I can't change that value… But I could try to upscale even more on a next attempt to see where it throttles lol.

2 Likes

So it works, that’s great news. Thank you so much for testing this.

I have ordered most of the parts I need; I can't wait to try this out myself. I don't have the graphics expansion module in my FW, so when I try SteamVR it completely fails. Even the SteamVR home app is so pixelated I can't read any of the text.

1 Like

I totally understand, the iGPU sucks for demanding games lol.

If I can give you one piece of advice: the only particularity of this setup is the software side for Nvidia cards. Error 43 may show up (or even if it doesn't, it will still be an issue); search for the fix on eGPU.io and RUN the program, then GeForce Experience or any drivers will work.

Yep, I have a full-sized DisplayPort input on my current design.

3 Likes

Thank you! I ordered my FW16 at the very same moment, right after reading your message! I haven't replied until now because… I've been too excited playing with it! Hahaha
I thought I would upload a photo of my laptop with an external disk running Windows To Go and a desktop GPU via OCuLink, just to show what it can look like for those interested! The GPU isn't mine, so I just tried it out :slight_smile:

3 Likes

So cool to hear! Have you been able to test some games or software with this setup?
What GPU are you planning to use later?

I just graduated and have some free time for the next month, so I should be able to do some testing and benchmarks. Feel free to answer the poll below (or the community poll on my YouTube channel) if you want videos about this particular setup.

What video would you like about the FL16 x OCuLink GPU combo?

  • Games benchmarks (long format w/ multiple games)
  • Synthetic benchmarks (e.g. 3DMark…)
  • Setup installation / prerequisites demo video
0 voters

PS: if you don't want videos at all and just want to keep this thread active, that's fine too.

3 Likes

Hi,

Here is my OCuLink-to-USB4 setup with both the Framework 13 & 16.

It was a bit hard to get working, but now it's working perfectly.

I'm even considering staying with USB4, because on my RTX 2080 I don't lose much performance with 32 Gbps.

It cost me $99 + $50 + $20 + my already-owned GPU + power supply.

4 Likes

I looked at your setup, so it's USB4 encapsulating PCIe 3.0 x4 from OCuLink (TB3 equivalent)?

And do you use this kind of setup because of compatibility issues with TB3 devices/eGPU enclosures, or for anything else?

PS: I'm mainly using OCuLink to have 64 Gbps, which allows low latency and no stuttering in games (even if we all love better "performance" / FPS overall).
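PS 2: for reference, here's the rough math behind the two link speeds we're comparing (assuming the usual 128b/130b PCIe encoding and ignoring protocol overhead):

```python
# Rough raw-bandwidth comparison of the two links discussed in this thread.
def pcie_bandwidth_gbps(gt_per_s: float, lanes: int) -> float:
    # 128b/130b encoding means ~98.5% of the raw transfer rate is usable.
    return gt_per_s * (128 / 130) * lanes

print(f"PCIe 3.0 x4 (roughly what USB4/TB3 tunnelling gives): {pcie_bandwidth_gbps(8, 4):.1f} Gbps")   # ~31.5
print(f"PCIe 4.0 x4 (OCuLink on the FW16):                    {pcie_bandwidth_gbps(16, 4):.1f} Gbps")  # ~63.0
```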