Has anyone actually got usable performance with an eGPU on Linux on the Framework? I’m really struggling with the setup I’ve got: the desktop grinds along at about 10 FPS and games are unplayable at 1 FPS (considerably worse than the iGPU).
The post you linked to mentions later on that it’s related to how the kernel reports the running OS to the firmware. So the problem would seem to be that if the BIOS sees a non-Windows OS, it hamstrings a number of features (including eGPU support).
Since it’s enabled on Linux but not Windows, is there a way to enable it on Windows? I’m looking to get an Intel Arc graphics card for my eGPU, and I need a way to enable reBAR since I’ll be gaming on Windows. Since the BIOS menus don’t have a way to enable it, is there another way to interface with it and turn reBAR on?
The question isn’t which card would be best; the question is whether Arc will work. I know I’ll be leaving performance on the table (initially) by buying into the Arc ecosystem. “Voting with your wallet” comes into play here: Nvidia is notoriously anti-consumer, a third player in the graphics card space will provide much-needed competition, and I’m willing to take the performance hit on principle.
I would also like to know whether ReBAR can somehow be enabled in Windows 11. I already have an Arc GPU, and the Arc software says Resizable BAR is enabled and turned on, but both the Device Manager properties and GPU-Z state that it is disabled… I’m not sure which to trust, but there is definitely a drop in performance with an Arc GPU in an eGPU compared to the 3060 I previously had in there, and it’s very frustrating.
ReBAR is a BIOS feature that requires a DXE driver to service the calls. As long as “Above 4G Decoding” is activated at the BIOS level, it could work; it’s just that Windows, unlike Linux, doesn’t have the right software within the operating system to use it.
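For what it’s worth, on Linux you can check whether the firmware actually exposed the Resizable BAR capability by looking at `lspci -vv` output for the GPU. As a rough illustration (the exact wording of `lspci` output varies between pciutils versions and devices, and the sample text below is made up, not captured from a Framework), here is a small Python sketch that scans such output for the capability and the reported BAR sizes:

```python
import re

# Illustrative `lspci -vv` excerpt for a hypothetical Arc eGPU.
# Real output differs between lspci versions and devices.
LSPCI_OUTPUT = """\
03:00.0 VGA compatible controller: Intel Corporation DG2 [Arc A770]
	Region 0: Memory at 6000000000 (64-bit, prefetchable) [size=16G]
	Capabilities: [420 v1] Physical Resizable BAR
		BAR 2: current size: 16GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB 16GB
"""

def rebar_status(lspci_text: str) -> dict:
    """Report whether the Resizable BAR capability appears in lspci
    output, and the current/supported sizes listed for each BAR."""
    has_cap = "Resizable BAR" in lspci_text
    bars = {}
    for m in re.finditer(
        r"BAR (\d+): current size: (\S+), supported: (.+)", lspci_text
    ):
        bars[int(m.group(1))] = {
            "current": m.group(2),
            "supported": m.group(3).split(),
        }
    return {"capability": has_cap, "bars": bars}

status = rebar_status(LSPCI_OUTPUT)
print(status["capability"])          # True: capability line is present
print(status["bars"][2]["current"])  # 16GB
```

If the “Resizable BAR” capability never shows up even with the eGPU attached, that points at the firmware not advertising it, rather than at the GPU driver.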
@Jieren_Zheng, I don’t understand the part where you say “Windows doesn’t have the right software within the operating system”, when other people running Windows 11 can enable ReBAR without issue… just not on our Frameworks.
I submitted a support ticket and got the exact response I was expecting: “…For now, this is not available. While we cannot promise or speculate on the possibility of including this in future iterations of our product, we appreciate your suggestion and will pass this along to the appropriate department.”
The Arc tool says that ReBAR is available and turned on, but everything else says it is not, and since I am getting about 50% of the performance I should with the A770, I am inclined to believe that it is in fact NOT enabled.
This sucks, and I won’t be able to sell my old GPU now (because some games just aren’t playable at 17 FPS). Maybe I’ll put the A770 in a glass case for now, like an art piece?
I was just pointing out that ReBAR has been on by default in Linux for a long time, whereas Windows needs the GPU driver to enable it rather than handling it at the kernel level.
The next part requires BIOS firmware support from the mainboard and processor for both the “Above 4G Decoding” and “Resizable Bar” features (and of course disabling CSM).
I have been working with some folks on GitHub to enable ReBAR on Intel 4th Gen (Z97) mainboards, after seeing success on Intel 3rd Gen mainboards, since Above 4G Decoding has been implemented on Intel platforms since PCIe Gen 3. We had to modify the BIOS to add in the ReBAR DXE driver and get the BIOS to assign the GPU memory to the 64-bit memory space rather than the 32-bit space.
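To make the 32-bit vs. 64-bit assignment concrete: on Linux, the “Region … Memory at …” lines from `lspci` show the physical address each BAR was assigned, and anything at or above the 4 GiB boundary is in the 64-bit space. A minimal sketch of that check, using made-up sample lines (not real Framework output):

```python
import re

FOUR_GIB = 1 << 32  # 4 GiB boundary of the 32-bit address space

def bar_above_4g(region_line: str) -> bool:
    """Return True if the BAR address in an lspci 'Region' line sits
    at or above 4 GiB, i.e. the BIOS placed it in 64-bit space."""
    m = re.search(r"Memory at ([0-9a-fA-F]+)", region_line)
    if m is None:
        raise ValueError("no memory address found in line")
    return int(m.group(1), 16) >= FOUR_GIB

# A 32-bit assignment (below 4 GiB) vs. a 64-bit one (above it):
low = "Region 0: Memory at d0000000 (32-bit, non-prefetchable) [size=16M]"
high = "Region 2: Memory at 6000000000 (64-bit, prefetchable) [size=16G]"
print(bar_above_4g(low))   # False
print(bar_above_4g(high))  # True
```

A large resized BAR (e.g. 16 GB) can’t fit below 4 GiB at all, which is why Above 4G Decoding is a prerequisite for ReBAR.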
In short, at least for Framework, the feature must first be enabled at the mainboard firmware level, and it most likely isn’t.
If anyone is looking for FPS numbers for different games, see below.
Setup is a 12th Gen i7-1260P with an IETS G500, 32GB of RAM, and an old Razer Core X (not the V2 or Chroma) with an AMD 6700 XT OC and, separately, an RTX 2080 Ti. The results were almost identical, within 2-4 FPS (2080 Ti > 6700 XT).
The IETS lowers the CPU temperature by 7-10°C at full speed, even after an hour of gaming, but beware of the noise.
All games at 2560×1440, high-ish graphics settings, without ray tracing, on an external display.
FPS values are averaged. The lower number shows the “dip” when entering or looking at a wider landscape or a larger battle. I don’t record 1% lows.
Warzone 2.0 (BR): 58-87 FPS
Baldur’s Gate 3: 49-66 FPS
Detroit: Become Human: 38-55 FPS
RDR 2: 47-62 FPS
Cyberpunk 2077: 34-65 FPS
Halo Infinite: 61-90 FPS
Feel free to ask for different games. If I own them, I’ll test them and edit the original post with the values.
I have a question, and hopefully someone with more eGPU experience can answer. I just got a Razer Core X and popped my 3070 Ti in it, with two 1080p external monitors attached to the eGPU. For reference, I have the 11th Gen i5 Framework with Windows 11 and 32GB of RAM. I have noticed that while idle and plugged into the eGPU, my CPU temps rise to 50-60°C and the fan ramps up a bit; idle temps while not attached are in the high 30s/low 40s. My question is: is this normal? If you need more info, I’d be glad to share.
@Chris_Engle I see the same at idle; it’s just from running the Thunderbolt protocols, which are fairly CPU intensive. The manufacturer’s max is 100°C, so you’re well below any danger. When rendering/gaming, my CPU temps on my 11th Gen i5 sit in the 90s for prolonged periods, so 50-60°C isn’t a huge deal for the CPU.