Possibility of an external GPU through the PCIe x8 expansion slot

Curious if there are any plans for an external GPU enclosure that connects through the Expansion Bay’s PCIe x8 slot? My thought is that I don’t need a dedicated graphics card when on the go, but it would be nice to have one at home for gaming. Using the Expansion Bay slot to connect a full desktop card externally would give better performance than a USB4 enclosure or a mobile graphics card. I saw a bit of talk in a thread named “External GPU Expansion via PCIE”, but I am hoping for an external enclosure designed to connect via an Expansion Bay module, with a dedicated power supply for the enclosure. Otherwise, I would need to rig up my own enclosure, like people have done with the M.2 slots.


As far as I know, there aren’t any plans to officially support desktop GPUs via the expansion bay system; however, as said in the thread you mentioned, it is 100% possible to use some sort of adapter to do so, but you would likely need to build your own enclosure to power and cool the GPU.

Of course, this is assuming someone has already built said adapter.


If it has USB4, you could just use an eGPU dock, right?

Yes, but you still get higher bandwidth and lower latency using the eight PCIe lanes directly instead of going through USB4. Worst case, yeah, I would grab a normal USB4 eGPU enclosure or potentially try to set up my own adapter.


Regarding possibly making our own setups, does anyone know if there is something premade that will go from the 148-pin connector in the Framework 16 expansion bay to PCIe x8 or x16? I’m not sure if this connector is unique to Framework or if it is a standard of some kind.

I for one would find such a docking station very attractive.

You could have an efficient GPU in the laptop expansion board, plus a connector to bring out the PCIe x8 link.

Then you could have the latest-gen desktop GPU available at your desk when playing games etc., but still have “enough” GPU power on the move. I’d definitely prefer something like this, as it would suit my way of working.

This is all about open architecture, so I’d be willing to investigate creating such a product, if there is sufficient interest.


Hotplug is the main factor in question here; if they implement that, it would be pretty neat.

PCIe 4.0 x8 would bottleneck a lot less than the PCIe 3.0 x4 (plus protocol overhead) you get from Thunderbolt.
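For a rough sense of the gap, here is a back-of-the-envelope comparison. The per-lane link rates and 128b/130b encoding are the standard PCIe figures; the 32 Gbit/s Thunderbolt 4 PCIe-tunneling ceiling is the commonly cited number:

```python
# Back-of-the-envelope link bandwidth comparison.
# Per-lane raw rates: PCIe 3.0 = 8 GT/s, PCIe 4.0 = 16 GT/s,
# both using 128b/130b encoding (~1.5% overhead).

RAW_GT_PER_LANE = {"pcie3": 8.0, "pcie4": 16.0}

def usable_gbps(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in Gbit/s after encoding overhead."""
    return RAW_GT_PER_LANE[gen] * lanes * (128 / 130)

tb4_tunnel_cap = 32.0  # Gbit/s: Thunderbolt 4's PCIe tunneling ceiling

print(f"PCIe 3.0 x4 direct: {usable_gbps('pcie3', 4):5.1f} Gbit/s")
print(f"TB4 PCIe tunnel   : {tb4_tunnel_cap:5.1f} Gbit/s (before protocol overhead)")
print(f"PCIe 4.0 x8 direct: {usable_gbps('pcie4', 8):5.1f} Gbit/s")
```

So a Gen 4 x8 link has roughly four times the bandwidth of what a TB4-tunneled Gen 3 x4 link can deliver, even before counting Thunderbolt’s own packetization overhead.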


Yes, even with PCIe x8 it would be a vast improvement over TB4 x2 or x3, if they even implement those.

Hotplug in PCIe can get tricky, but I suppose it’s doable, since the ASUS Z13 and X13 supposedly have it for their external PCIe x4 connector. I gather that with GPUs the challenge is mapping the memory correctly.

What I would like to see is an expansion board with some midrange GPU, say a 4050, and a proper electrically hot-pluggable PCIe 4.0 x8 + USB4 + power connector.

But I also wonder: will there be CPUs with integrated graphics as well? If so, there could be a special expansion board just for docking, without much electronics in it.


With AMD’s APUs, that should be enough for the gaming and work I do on the go, and being able to hook up a high-end desktop GPU when I get home would eliminate the need for me to build a gaming computer. I’m not tech-savvy enough to know offhand whether that would cause a bottleneck, but I’d love to find out, and I hope they go through with it.


The AMD 7xxx APUs have surprisingly good integrated GPU performance.

Using the expansion bay for an extra battery and a PCIe/OCuLink connection to attach an eGPU at home seems ideal if you do not need the GPU on the go.

As discussed many times around the forum, the problem with connecting through Thunderbolt/USB4 is that the bandwidth is not enough to feed the eGPU, which leads to lower fps and, even worse in most games, stutter during quick movement that makes playing really annoying.

There are numerous YouTube videos demonstrating why Thunderbolt/USB4 is not a good match for an eGPU. You will see the effects I mention.

Until Thunderbolt 5 (and a probable USB equivalent) arrives with its rumored doubled bandwidth, PCIe/OCuLink is the only good way to connect an eGPU.


I am typing this reply on a system with Thunderbolt 4. I am running one 4K screen and two 1080p screens, plus the laptop’s own 4K display, using a 3080. Right now it is working perfectly on Linux with no stutter or issues. I can play graphically intense games really well with no major problems; for example, I get around 40 to 60 fps in The Witcher 3 at 4K on the ultra preset.

An eGPU is for sure worth it. My laptop has nearly the full power of a desktop; compared to the dGPU in the laptop, it still performs way better, and I don’t need to plug in multiple different things. One cable can provide power and all the connectivity you need.

I even have ethernet and usb devices connected.

I am using Manjaro.

Just two examples; you can find many more. The stutter is game dependent: some games do not suffer much from the added latency, others suffer a lot, but it happens.

Right now, Thunderbolt is not great for eGPU gaming.


Well, if it is game dependent, you should mention the games, because I have had zero issues.

eGPUs do not suck. They are awesome.

So what I’m seeing is… don’t buy a 4090 to run outside of a desktop? eGPUs have limits and bottlenecks, which will absolutely hinder top-of-the-line cards.

That being said, the lower you go, the less of an issue bottlenecks become. As a 1080p gamer, triple-A games at medium settings are more than enough for me, and I’ve never had a problem with that on my external GTX 1650. I’m arguably running the worst eGPU configuration as well: lowest-spec CPU (11th-gen i5) and single-channel RAM (1x16 GB).

Yes, you don’t get desktop performance. But you also don’t get a desktop, which is nice. I’m a student; I didn’t want two computers, but I wanted something I could take to class and then do rendering and light gaming on at home. Gaming laptops were too big/heavy for my use case (I lugged a 15” media laptop around for two years; never again).

I’d change this to: “Thunderbolt is not great for a no-compromises setup.” Which is kind of a truism, eGPUs are by design a compromise and people expect that to be the case (I was pleasantly surprised by my laptop’s computing power with an eGPU because of those expectations).


I agree, eGPUs do not suck, they are great.

eGPU through Thunderbolt 4 is not great. I suspect that if you tried your eGPU with more bandwidth and less latency, you would notice the difference.

Edit: @Be_Far Even with slower GPUs, the latency will affect you badly in games that are sensitive to it.

In any case, if you guys are happy, that’s great, but people should be aware of the limitations eGPUs suffer through Thunderbolt 4 and that those limitations do not exist with PCIe/OCuLink.

Could you give an example of a game with a latency issue? I play Titanfall 2 (a movement shooter) on my eGPU and haven’t had a perceivable problem. My offset in well-optimized rhythm games is also pretty low and hovers around what I would expect with a mechanical keyboard and mouse over USB 2.

I’ve posted two videos showing examples already. Here is another one comparing desktop, OCuLink, and Thunderbolt:

Those aren’t examples of latency; they’re examples of frames per second. Latency is harder to show on video: it’s “when my mouse moves, it takes a perceivable amount of time to register on screen.” This can happen when complex graphics lead to longer frame times, but it’s not directly correlated with frame rate. You can still have latency at a stable 240 fps, but I’m saying I’ve never experienced that in two years of Thunderbolt eGPU gaming.

I asked for an example of a game so I can try to run it and perceive the latency for myself (formerly competitive with rhythm games; visual latency is very noticeable to me).

I’m guessing you have skimmed the video but not really watched it.

Notice how he talks not only about the difference in average fps but also about 1% lows. Notice how in several games the Thunderbolt connection degrades the 1% lows by up to 50% versus OCuLink, for example Batman: Arkham at 1080p. That will show up as stutter.
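For anyone unfamiliar with the metric, “1% lows” just means the average fps over the slowest 1% of frames. A quick sketch with made-up frame times (not numbers from the video) shows why a single hitch tanks the 1% lows while barely moving the average:

```python
# Sketch: how "1% lows" are derived from per-frame times, and why a large
# gap between average fps and 1% lows shows up as visible stutter.
# The frame times below are illustration data, not a benchmark.

def average_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def one_percent_low_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)       # at least one frame
    slowest = worst[:n]
    return 1000.0 / (sum(slowest) / len(slowest))

# 99 smooth frames at 10 ms plus one 60 ms hitch
frames = [10.0] * 99 + [60.0]
print(f"avg fps: {average_fps(frames):.0f}")        # 95
print(f"1% low : {one_percent_low_fps(frames):.0f}")  # 17
```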

Stuttering isn’t latency either. Yes, unstable frame rates (a large gap between average and 1% lows) will show up as stutter. This actually further divorces unstable frame rates from being caused by cable-related latency, since cable-related latency adds a constant to frame time and doesn’t vary.
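To make that “constant plus” point concrete, here is a toy model (the 1 ms link delay is an illustrative assumption, not a measured Thunderbolt figure): a fixed per-frame transport delay lowers fps slightly and adds latency, but the frame-to-frame spread that reads as stutter is unchanged.

```python
# Toy model: a constant per-frame link delay shifts every frame time
# equally, so input latency rises but frame pacing does not get worse.

LINK_DELAY_MS = 1.0  # assumed fixed transport cost, for illustration only

internal = [10.0, 10.0, 12.0, 10.0]             # hypothetical frame times (ms)
via_link = [t + LINK_DELAY_MS for t in internal]

def avg_fps(times):
    return 1000.0 / (sum(times) / len(times))

def spread_ms(times):
    """Max-minus-min frame time: a crude proxy for visible stutter."""
    return max(times) - min(times)

print(f"fps   : {avg_fps(internal):.1f} -> {avg_fps(via_link):.1f}")
print(f"spread: {spread_ms(internal):.1f} ms -> {spread_ms(via_link):.1f} ms (identical)")
```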

Again, I’d like to know what the latency point is. Your videos show that for a no-compromises build (something using top-of-the-line graphics), an eGPU is unsuitable. This has been and will be the case until a better protocol is practical (OCuLink gets close, so I’m optimistic!).

To sidetrack the conversation a bit, I’d like to talk about bottlenecking instead of latency. RAM and CPU are the anecdotal bottlenecks for an eGPU, along with the fixed latency of the cable. The videos shown don’t provide data on what the bottleneck is, other than the fact that it’s absolutely not the graphics card. I would have loved to see CPU usage and memory specs + usage at the point where the fps was recorded. If those were both low (and the graphics card is unquestionably low), then the problem is actually the fixed cost of the cable + the protocol.

Similarly, if lowering the settings increases the fps or decreases the stutter, you know it’s not a fixed cost like the protocol. The videos mostly tested at the highest settings, and thus we’re still left with a lack of data. For example, my setup’s CPU usage hovers around 30%, RAM is pretty unused, and my eGPU tops out in both VRAM and compute, and lowering graphics settings typically raises my fps and decreases stutter, so I know my hard limits are GPU-bound (makes sense; it’s a 1650 Super).
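That elimination process can be summarized as a toy classifier. The 90% thresholds and the helper name are my own arbitrary assumptions, not from any benchmark, but they capture the logic: if nothing in the machine is saturated and fps is still low, suspect the link itself.

```python
# Toy classifier for the diagnosis described above: given utilization
# figures sampled while a game runs, guess the limiting factor.
# Thresholds are arbitrary illustrative assumptions, not calibrated values.

def likely_bottleneck(cpu_pct: float, gpu_pct: float, ram_pct: float) -> str:
    if gpu_pct >= 90:
        return "gpu"
    if cpu_pct >= 90:
        return "cpu"
    if ram_pct >= 90:
        return "ram"
    # Nothing saturated but fps still low: suspect the cable + protocol.
    return "link/protocol"

# The setup described above: CPU ~30%, RAM mostly unused, GPU maxed out.
print(likely_bottleneck(cpu_pct=30, gpu_pct=100, ram_pct=20))  # gpu
```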

TL;DR: diagnosing what is wrong with TB4 eGPUs is more complex than “the eGPU gets this many frames but the desktop gets this many frames.” The first video does a good job explaining a few of the concerns.