eGPU Discussion - Request a Benchmark

I wanted to set up a thread to encourage discussion of using an eGPU on Linux with the FW 16. To start it off, here’s my initial experience so far.

Specs:

  • FW 16 w/ Ryzen 7
  • Razer Core X
  • Arch Linux w/ KDE Plasma 6
  • NVIDIA proprietary drivers

I’ve made no eGPU-specific configuration changes to the device as of yet, so current behavior should reflect default values (aside from the obvious nvidia-drm.modeset=1 kernel parameter).
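For anyone setting this up fresh: on Arch the NVIDIA KMS flag can be set either on the kernel command line or as a modprobe option. A minimal sketch, assuming GRUB and the standard Arch config paths (adjust for your bootloader):

```shell
# Option 1: kernel command line.
# In /etc/default/grub, append to the existing line, then regenerate grub.cfg
# with `grub-mkconfig -o /boot/grub/grub.cfg`:
GRUB_CMDLINE_LINUX_DEFAULT="... nvidia-drm.modeset=1"

# Option 2: modprobe option (put this in /etc/modprobe.d/nvidia-drm.conf;
# rebuild the initramfs if the module is loaded early):
options nvidia-drm modeset=1
```

Either way, you can confirm it took effect after reboot with `cat /sys/module/nvidia_drm/parameters/modeset` (should print `Y`).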

On hand I have the following GPUs available for testing: RTX 3070 Ti, RTX 3080.
GPUs available but with issues: RTX 3090 (the Razer Core X doesn’t provide enough 8-pin connections; I’d have to find a way around that), RTX 3080 Ti (this card might be fried).

So far I have tested Baldur’s Gate 3, Last Epoch, and Halo Infinite. Halo Infinite and Last Epoch gave me quite good performance, with high frame rates on the internal monitor (e.g. Last Epoch was hitting 165 fps), and the CPU seemed to be holding steady around 4.7 GHz under a multicore load, which is insane.

Unfortunately, something about Baldur’s Gate 3 absolutely does not like the eGPU: regardless of the graphics settings, it will not budge from 19-20 fps, and it does not appear to be a bandwidth limitation. I do see a single core at 100% in BG3, so it’s possible it’s bottlenecked by that. I tested both DX11 and Vulkan through Proton; the Vulkan version was very bad, with all sorts of artifacts and weird glitches, whereas DX11 through Proton works great (aside from the fps issues).

I had to test on the internal display because, for some reason, Plasma will crash if I use an external monitor plugged into the eGPU. Switching to another TTY and back into my graphical TTY will bring Plasma back, but it seems to be capped at 30 fps regardless of what the refresh rate is set to. It seems to be a similar/same issue as posted here, but I haven’t had the chance to dig through that post in detail yet.

I’m still in the early phases of messing with an eGPU, but with a fresh set of eyes I’ll document my experience here. So far I’m working under the assumption that any issues I run into are a result of either Linux configuration/capabilities, limitations of Proton, or potentially the FW BIOS (specifically the 30 fps cap on external monitors through the eGPU).


If anyone wants me to test a particular game or piece of software on the FW 16 with the eGPU, I’d be happy to run a simple benchmark, assuming I own the title (I have a large library). If you want specific graphics settings, let me know; by default I’ll use Medium at 2560x1600 with motion blur off.
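If anyone wants to run comparable numbers on their own setup, MangoHud is the usual tool for this kind of percentile-style logging on Linux. A sketch of a logging config, assuming MangoHud is installed — verify the exact option names against the MangoHud docs for your version:

```shell
# ~/.config/MangoHud/MangoHud.conf (sketch)
fps
gpu_stats
cpu_stats
frametime
output_folder=/tmp/mangohud-logs   # CSV logs land here
log_duration=60                    # capture 60 s per run
toggle_logging=Shift_L+F2          # start/stop logging in-game
```

In Steam, set the game’s launch options to `mangohud %command%` and hit the logging keybind during a repeatable scene; the resulting CSV gives you average fps plus the 0.1%/1% lows.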


Update: I don’t encounter the 30 fps/crashing issue if I connect through the iGPU via DisplayPort on the Framework itself. Curiously, I had a difficult time getting DisplayPort to connect at all, even without the eGPU, but plugging and unplugging in various combinations of ports 1, 2, and 4 eventually got me connected.


I’m interested in eGPUs too, and I’m planning to buy one to play VR with ray tracing in Cyberpunk. Somebody claimed the USB4 connection means it won’t work, but I think the biggest load is the RT itself, and that doesn’t go over the USB link; the VR headset would be connected directly to the eGPU. So if you could test whether turning on RT decreases the fps compared to no RT, that might show whether the USB4 connection can hold up well enough to run VR on a 4090-level card.
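For rough context on the bandwidth question: a Thunderbolt 3/USB4 enclosure tunnels PCIe at roughly Gen 3 x4 rates, versus the x16 slot a desktop card normally sits in. A quick back-of-the-envelope using theoretical link rates only (real tunnel throughput is lower still, since the PCIe tunnel doesn’t get the full 40 Gbps link):

```python
# Theoretical PCIe Gen 3 throughput: 8 GT/s per lane, 128b/130b line encoding.
def pcie3_gbytes_per_s(lanes: int) -> float:
    gt_per_s = 8e9          # transfers (bits) per second, per lane
    efficiency = 128 / 130  # 128b/130b encoding overhead
    return gt_per_s * efficiency * lanes / 8 / 1e9  # bits/s -> GB/s

x4 = pcie3_gbytes_per_s(4)    # roughly what an eGPU tunnel can approach
x16 = pcie3_gbytes_per_s(16)  # a desktop Gen 3 x16 slot
print(f"x4: {x4:.2f} GB/s, x16: {x16:.2f} GB/s, ratio: {x16 / x4:.0f}x")
```

So the enclosure link is about a quarter of a Gen 3 x16 slot on paper. How much that hurts depends heavily on the game; RT itself runs on the card, which is consistent with the idea that the link matters less for it.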

Could you check the GPU usage with nvtop in Baldur’s Gate? These issues sound a lot like a modesetting misconfiguration, but I’m not too sure!

Yeah, utilization in Baldur’s Gate sits around 20-30% with the eGPU clocked at less than 1 GHz… definitely not running as intended. The iGPU is not in use.


Just curious, do you have an enclosure or a stand of some sort you’re using for your testing? I have never broached eGPU use before, but I do have an extra GPU from my desktop replacement. Do you have any links to what you’re using? Are you connecting it via USB-C?

I’m using a Razer Core X, which connects over Thunderbolt 3 via USB-C.

My research suggested that it has the highest compatibility rate/fewest issues.
I mostly bought it out of curiosity as well, to see where the tech currently stands in practice, knowing I would be getting my FW 16.


I ran a quick benchmark on the 3070 Ti for you: Medium settings (no motion blur) at 2560x1600.

No RTX

| 0.1% Min FPS | 1% Min FPS | 97th Percentile FPS | Average FPS | GPU Load (%) | CPU Load (%) |
| --- | --- | --- | --- | --- | --- |
| 24.5 | 25.4 | 80.2 | 45 | 97.7 | 59.1 |

RTX ON (medium + reflections)

| 0.1% Min FPS | 1% Min FPS | 97th Percentile FPS | Average FPS | GPU Load (%) | CPU Load (%) |
| --- | --- | --- | --- | --- | --- |
| 22 | 23.4 | 73.5 | 37.9 | 97.6 | 47.6 |

My impression is that a bigger GPU would yield higher frame rates. Unfortunately, the Core X can’t supply enough power to plug in the 3090 :frowning:

To me, this looks like the GPU is working as hard as it can and there’s no USB bottleneck at all. Thanks for testing it.

No worries about the power supply; I plan to use a different enclosure that can take any PSU, in an open-air setup.
