OcuLink eGPU works with the Dual M.2 expansion bay module

Here’s the one picture I grabbed of the bay before installing it. I added a bit of tape after taking this photo.

I received the four-hole OcuLink adapter from Amazon and printed the IO cover from Kyle_Tuck’s post (57). To secure them, I printed the plugs from Morkale’s post (48) and pressed them all the way in from the top with needle-nose pliers.

2 Likes

I recommend taking the cooling pad out from under the board, since it adds extra bulge. Pressing the board all the way down against the dual module leaves space for the cable bend to fit into the expansion slot.



It’s nearly the same height as the M.2 SSD.

4 Likes

That did the trick! Fits like a glove now. Thanks, Morkale!

1 Like

Can report the Minisforum DEG-1 works wonders. I was able to play VR games on Linux an hour after switching to it. I experienced some problems with one flatscreen game I tested: going fullscreen would crash the game, Steam, and the DE, but KDE recovered just fine, so overall nothing too terrible.
VR games, after performance-mode tweaks, run perfectly fine on an RX 6950 XT, just as they should.

The Minisforum DEG-1 auto-starts the PSU and GPU with the laptop and shuts them both down with it, so if you ALWAYS want to use it, there’s no need to switch anything on or off. Pretty neat.

Also, since my setup has some basic support for PCIe hotplug (at the PCIe driver level at least, confirmed with sudo dmidecode --type 9), I tried shutting off the GPU after unplugging all screens, and Linux handled it just fine. Your mileage may vary; I’m on a bleeding-edge distro, and I know this would cause crashes a few years back. Note the card was still listed in lspci even while off, so I wouldn’t call it proper support: turning it back on did not magically make it work again, though maybe it can be made to. Not too big a deal; I just care about having a way to quickly disconnect it when I leave the house without going through a lengthy shutdown process.
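
For anyone who wants to try the same, here is a rough sketch of what I mean by the checks and the soft-removal, assuming your eGPU enumerated at 0000:01:00.0 (check lspci and adjust the address):

# See whether the slot advertises hot-plug support (slot info lives in DMI type 9)
sudo dmidecode --type 9 | grep -i hot

# Soft-remove the eGPU from the PCI tree before cutting its power
# (unplug all screens from the card first)
echo 1 | sudo tee /sys/bus/pci/devices/0000:01:00.0/remove

# After powering the eGPU back on, ask the kernel to re-enumerate
echo 1 | sudo tee /sys/bus/pci/rescan

In theory the rescan should pick the card back up, though as said above, powering it back on did not magically work for me.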

Haven’t tested actual PCIe bandwidth on Windows yet, as my experience there was… hilariously bad. On a fresh install barely a week old with existing AMD drivers, it still took ages for Windows Update to install something that worked; then AMD Software didn’t work at all anymore due to the hardware changes, and during reinstallation (and, admittedly, installation of Windows Mixed Reality at the same time) I got a bluescreen that bricked my install. It won’t boot, auto-repair won’t work, all gone to shit. Thanks, Windows.

5 Likes



Lads, the game is on. It is now ordered for printing.

7 Likes

How are you going to drive the DP?

It will be a test board. I am trying to make it DP-in for a “mux” test; it’s not necessarily for the final design if it doesn’t work.

1 Like

Looks like you’re trying to do two OcuLink 4i ports; any hope/desire of doing OcuLink 8i at some point?

These two OcuLink 4i ports can merge into an 8i with the proper cable, and those cables are easy to find.

2 Likes

Would you be able to detect the need for bifurcation automatically, or would this be a switch/jumper in the final version?

That will only be known after the test.

1 Like

It looks amazing. Just to confirm, does it have any sort of signal amplifier? I believe the only OcuLink 8i eGPU out there is pretty basic, so I am wondering if we may need a lot of signal-integrity help with all the DIY stuff we are doing. I think the DEG-1’s signal amplification already came in very handy for the OcuLink 4i connection.

I also have a second question: will the final version necessarily stick out of the notebook at the end?

1 Like

Everything needs to be tested. If the signal from the source is already very bad, an amplifier will only amplify a bad signal. If the signal is just a little bit bad, then we can think about adding a retimer etc. to compensate. But the proof of the pudding is in the eating, so we shall see.

4 Likes

I’m just stopping by to say that’s a beautiful photo. I don’t know why you put that much effort in, but it’s not gone unnoticed.

4 Likes

Another little update. Everything has been working super well with this setup. Gaming is smooth as butter and, as I mentioned earlier, it feels like a desktop. FAR less of the usual laptop-gaming jankiness/stuttering/struggle, and the system fans are so much quieter now that they only have to contend with the CPU.

  • Battlefield 2042 plays great and maintains a consistent 115-120 frames at 4K with mixed settings and balanced DLSS.
  • Helldivers 2 is much more bottlenecked by the CPU but still maintains a very comfy 60-70 frames at 4K with high settings and balanced DLSS.
  • Overwatch and other less demanding games are a cakewalk.
  • Cyberpunk is gorgeous and plays great (50-60 frames) at 4K WITH ray tracing and balanced DLSS. It pulls a steady 250 W at 95% utilization on the 5070 Ti.
  • Premiere Pro works great and had no trouble editing a multicam 6K project.

I’m sure I have a little honeymoon bias, but I think this OcuLink setup really unlocks the potential of this machine and has kind of flipped the script on how I feel about my FW16. What felt like a machine with a lot of compromises before now feels like a versatile gem that offers something basically no other laptop can. I never really liked the balance and feel of the computer with the dedicated GPU installed. With the shell/M.2 bay it feels like a legit thin(ish) and light(ish) machine that’s well balanced and portable.

A thin-and-light laptop with 10 TB of SSD storage and 96 GB of RAM, AND it can easily hook up to a GPU that’s more powerful than the new 5090 mobiles that are rolling out. And that’s before you even consider the Framework repairability and modularity. :smiley:

What a time to be alive, boiz.

PS: my thermal pad kit just shipped today so that will hopefully make this rig behave even more splendidly.

8 Likes

This is what I’m seeing with my 1050 Ti on OcuLink: x4 lanes at PCIe 3.0, so I’m at least getting PCIe 3 speed out of my link. I don’t have a better graphics card to test with yet.
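
For anyone else wanting to check theirs, here’s a quick sketch; swap in your card’s bus address from the first command (LnkCap is what the link is capable of, LnkSta is what was actually negotiated):

lspci | grep -i vga
sudo lspci -s 01:00.0 -vv | grep -E 'LnkCap|LnkSta'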

1 Like

Hey all, here’s another testimonial.

My setup:

  • Dual M.2 expansion bay module
  • AliExpress SFF-8612 ribbon cable
  • 3D-printed backplate and pins from @Morkale
  • Minisforum DEG-1
  • NVIDIA RTX 3090
  • Fedora 41

The card is correctly recognized and I can get video output:
lspci | grep NVIDIA

01:00.0 VGA compatible controller: NVIDIA Corporation GA102 [GeForce RTX 3090] (rev a1)
01:00.1 Audio device: NVIDIA Corporation GA102 High Definition Audio Controller (rev a1)

Checking the speed, I get between 2.5 GT/s and 16 GT/s depending on the load, and all four lanes of the M.2 connection are detected:
sudo lspci -s 01:00 -vv | grep LnkSta

LnkSta:	Speed 2.5GT/s (downgraded), Width x4 (downgraded)
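
The 2.5 GT/s reading at idle appears to be just link power management kicking in. To confirm the link trains back up, one can watch LnkSta while the GPU is busy; any GPU load works in place of glxgears here:

glxgears -fullscreen &
sudo watch -n 1 'lspci -s 01:00.0 -vv | grep LnkSta'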

Here are some pictures of the assembly process. In my case I cut the M.2 adapter board down to 2242 size, about halfway, to make room for the cable bend; I think it is cleaner than screwing it in without the spacer.

I was a bit afraid of bending the ribbon cable too much, and it took me a couple of careful attempts (it does indeed need to bend quite a lot), but overall the process was quite smooth. I also added a piece of tape to keep it in place while sliding the expansion bay back in.


I am not sure how to proceed right now with benchmarks and testing that everything works OK. I welcome feedback and benchmark requests from fellow Fedora users!

EDIT: For those who have already set this up, you can monitor the PCIe link with:

  • nvidia-smi pci -cErrCnt to clear any previously accumulated errors
  • nvidia-smi pci -gErrCnt to monitor for new errors
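
A rough way to keep an eye on this during a session, reusing those same flags (assuming they behave the same on your driver version): clear the counters, start your game or benchmark, then poll for new errors:

nvidia-smi pci -cErrCnt
watch -n 10 nvidia-smi pci -gErrCnt
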
7 Likes

This is exactly what I plan to do too. I also plan to cut it, since it simply leaves more space for the bend, as you said. Just waiting for the final 3D-printed parts to arrive…

Just wanted to say that I could have written every single letter of your post myself; I could not agree more :). In my case it is my good old RTX 3090, which even today simply destroys quite modern triple-A titles in this setup. Examples are Far Cry 6 maxed out with HD textures at 5K resolution at 45 fps, or Cyberpunk pretty much maxed out with path tracing, likewise without issues. I think my Framework now has the performance of an RTX 4090 laptop, but without the cost. It just works. And yes, the GPU gets quite hot when using 16+ GB of VRAM, but it works fine ;).

It is exactly as you mentioned: better for the laptop, which only has to cool the CPU; it feels like a desktop and just works.

2 Likes

I found out that Blender has a benchmark that works on Fedora out of the box, so here are the results.

Perhaps more experienced users can help me understand whether the error counts are good or bad?
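
One way to put the counts in context is to watch utilization, PCIe replay errors, and PCIe throughput side by side while the benchmark runs; nvidia-smi’s device-monitor mode should show all three (u = utilization, e = ECC/PCIe replay errors, t = PCIe Rx/Tx throughput):

nvidia-smi dmon -s uet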


2 Likes