My portable eGPU setup (ADT-Link R43SG-TB3) [FULL BENCHMARKS AT EGPU.IO]

(also posted on eGPU.io)

eGPUs! In case you didn’t know: yes, they DO work with the Framework, even though the laptop isn’t officially Thunderbolt-certified yet; the only thing holding that up is Intel’s slow certification process.

Browsing the enclosure recommendations at eGPU.io will tell you that the Razer Core X is supposedly the best option. The problem is that it’s the size of an mITX PC case and weighs a staggering 7 pounds. How am I supposed to take it on the go when it weighs more than my desktop??

The Razer Core X sucks for portability in every way, and it’s also $400 new. I decided to give the ADT-Link R43SG-TB3 a try instead, which is currently listed as the 5th best option for only $150 on AliExpress. (You will also need to buy a PSU for it; the recommended one is a Dell DA-2 D220P-01 that goes for about 20 bucks on eBay.)

And as it turns out- it all works great!

  • Framework Laptop i5 Model (16 GB RAM, 512 GB NVMe storage, Win 10 Pro)
  • ADT-Link R43SG-TB3 eGPU Adapter
  • Zotac GTX 1060 6 GB Mini
  • Dell DA-2 D220P-01 PSU

It works plug-and-play. Just connect the eGPU and approve it in the Thunderbolt menu, then go to NVIDIA’s website to download the latest drivers.

I can’t tell you how good it is for gaming since I just needed a video editing rig to take on the go for my job. Even using just the internal display, the eGPU makes scrubbing through footage in Premiere Pro a lot smoother.

CUDA-Z Benchmarks:

Unigine Valley 1.0 Benchmarks:
Here are benchmarks comparing the performance of the GTX 1060 in a normal desktop, in the eGPU driving the internal screen, and in the eGPU driving an external screen:

The iGPU:
Funny thing you may have noticed: the Iris Xe iGPU on the Framework actually jumps in and helps the eGPU with some of the work! An unintended but great way to make up for the bandwidth lost over the USB-C connection, I think. (I don’t know how to disable it in Unigine Valley, but modern applications can usually use both the iGPU and dGPU, so I’ll include it.)

I was going to omit these LuxMark 3.1 scores altogether since they seem way too good to be true, but I figured I’d include them here anyway. THESE LUXMARK SCORES DO NOT REFLECT ACTUAL PERFORMANCE THE WAY THE UNIGINE VALLEY ONES DO. I have no idea why these numbers are as good as they are. (It could be that OpenCL itself is the bottleneck and not the hardware.)

Need even more performance?
You can get the most performance out of this setup by using an external monitor and disabling the Intel Xe iGPU in your Device Manager. Don’t forget to re-enable it before disconnecting the GPU though, or something bad might happen. I don’t want to find out what that could be myself.


Here are some benchmarks to go with it:


How to transport it:
I bought anti-static bags to transport everything in a backpack when I fly to other parts of the country. It’s not perfect but it’s way better than the stupid giant Razer Core.

Unavoidable drawbacks:
  • It’s an adapter, not an enclosure, so your GPU is exposed and naked (not ideal for households with kids or pets)
  • For use with an external screen (for maximum performance), you will need to disable the Intel Xe iGPU in the Windows 10 Device Manager or your external monitor will stutter like crazy (and you will also need to remember to re-enable it before disconnecting your eGPU)
  • Since it’s a GPU, it’s a bandwidth hog. You can only use one more high-speed USB-C device (like storage) at the same time, and only if you connect it to the other side of the laptop; each side gets its own 40 Gbps link, so you can have two high-speed USB-C devices total
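To put that bandwidth point in perspective, here’s a rough back-of-the-envelope comparison. The ~22 Gbps figure for usable PCIe traffic over Thunderbolt 3 is the commonly cited community estimate, not something I measured on this setup:

```python
# Rough comparison of Thunderbolt 3 eGPU bandwidth vs. a desktop PCIe x16 slot.
# The 22 Gbps usable-PCIe figure is an approximate community estimate.

TB3_PCIE_GBPS = 22                  # usable PCIe tunnel after protocol overhead (approx.)
PCIE3_LANE_GBPS = 8 * (128 / 130)   # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding

egpu_gbs = TB3_PCIE_GBPS / 8                 # GB/s available to the eGPU
desktop_x16_gbs = 16 * PCIE3_LANE_GBPS / 8   # GB/s for a desktop x16 Gen3 slot

print(f"eGPU over TB3:    ~{egpu_gbs:.2f} GB/s")
print(f"Desktop x16 Gen3: ~{desktop_x16_gbs:.2f} GB/s")
print(f"eGPU gets ~{100 * egpu_gbs / desktop_x16_gbs:.0f}% of a desktop slot's bandwidth")
```

So the eGPU sees somewhere around a sixth of the bandwidth a desktop x16 slot provides, which is why the external-display and iGPU tricks above matter so much.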

Avoidable drawbacks:

  • The Dell charger is gigantic and heavy and you’d be better off buying a smaller Flex ATX or SFX power supply instead (although this Dell brick is dirt cheap and works too)
  • I am an idiot and dropped the securing thumbscrew inside the GPU, and had a scare thinking I’d shorted something. BE CAREFUL with this adapter! Don’t be stupid like me

Overall a great build- eGPUs are a no-brainer to pair with the modular/repairable Framework, especially this actually portable one!


@gs1, this is an amazing write-up. Thanks for being our guinea pig!

I have a full size GTX 1080. Do you have an idea of how the adapter would handle such a lengthy card? I’m concerned it would just tip down toward the far corner of the card and lose stability.


I think it should handle just fine! Here’s a picture of someone using the same adapter with a 1080Ti from this thread:

EDIT: I just noticed there are two of those Dell power bricks in this image. You might want to check the thread I linked where this image is from; I didn’t factor in the higher power draw of heavier cards like the 1080… (although do get a Flex ATX/SFX PSU, not the big heavy Dell)

The Dell adapter actually looks more compact than both Flex ATX and SFX. I don’t care about the weight.

The TDP of the GTX 1080 is 180W. I figure no more than a few extra watts for the fancy RGB lights and fans on my model. The Dell has an output of 220W.

I wonder how much power the ADT-Link adapter itself requires and how it manages passthrough charging to the laptop (if it does).

That adapter is the product of the wonderful PCB reverse-engineering industry in Shenzhen.
It is basically Intel’s “Thunderbolt 3 to PCIe x4” evaluation board, but with the x4 slot swapped for an x16 slot (the controller and wiring are basically the same), so it does not handle power delivery.

I was looking at the AKiTiO Node earlier because it has room for an ATX brick and a full-size card, but again it is somewhat expensive.
The reality is that the Thunderbolt controllers themselves cost that much on their own (I have no idea why; rumor has it they are 28 nm, but still), $125 and above even before the pandemic. This is why the ADT-Link is priced at around $150 and the others are about $100 more

So now that I have seen the brawn of eGPUs, maybe when I have enough $$$ I will get one. Or not. Minimal graphics on the iGPU is okay for me.


That’s fine for me. I can’t imagine the board itself pulling more than a few watts, so I should be in the clear with a GPU that consumes ~200W with a 220W PSU.

Looks nice, but I’m looking for the most compact, cheapest solution I can find. I have my GTX 1080 from 2016 and with the insane cost of GPUs now, plan on holding onto it until it dies, so I’m aiming towards minimal.


I double-checked the dimensions for you:

Dell DA-2 - 4.0 in x 7.75 in x 1.75 in (10.2 cm x 19.7 cm x 4.4 cm)
Flex ATX - 3.21 in x 5.91 in x 1.59 in (8.2 cm x 15.0 cm x 4.0 cm)
SFX - 3.94 in x 4.92 in x 2.5 in (10.0 cm x 12.5 cm x 6.4 cm)

The Flex ATX is much smaller than the Dell and the SFX is marginally smaller. You were right about the SFX PSU then but a Flex ATX one is definitely more portable!
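Multiplying those dimensions out makes the comparison concrete; this is just the inch figures above fed into a few lines of Python:

```python
# Volumes of the three PSU form factors, using the inch dimensions listed above.
dims = {
    "Dell DA-2": (4.00, 7.75, 1.75),
    "Flex ATX":  (3.21, 5.91, 1.59),
    "SFX":       (3.94, 4.92, 2.50),
}

for name, (w, l, h) in dims.items():
    print(f"{name}: {w * l * h:.1f} cubic inches")
```

That works out to roughly 54 in³ for the Dell, 30 in³ for Flex ATX, and 48 in³ for SFX, so the Flex ATX really is the compact pick.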

I would be careful with the PSU. Even though a normal 1080 only consumes 180W on average, the 10 ms peaks can go over 300W. In comparison, a 1060 with a 120W TDP has 10 ms peaks of around 150W. With desktop PSUs this can cause the overcurrent protection to trip very quickly, making the PSU shut down
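That headroom check can be sketched in a few lines, using the TDP and rough transient figures from this comment (the peak numbers are approximate community reports, not datasheet values):

```python
# Quick PSU headroom check against short transient peaks, not just average TDP.
PSU_RATED_W = 220  # Dell DA-2 D220P-01 rated output

cards = {
    # name: (average TDP in watts, reported ~10 ms transient peak in watts)
    "GTX 1060": (120, 150),
    "GTX 1080": (180, 300),
}

for name, (tdp, peak) in cards.items():
    verdict = "within" if peak <= PSU_RATED_W else "EXCEEDS"
    print(f"{name}: avg {tdp} W, peak ~{peak} W -> {verdict} the {PSU_RATED_W} W rating")
```

By average TDP alone the 1080 looks fine on the Dell brick, but its transient peaks blow past the 220 W rating, which is exactly when overcurrent protection can trip.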


Thank you! Looks like I’d need a higher rated supply then.

You could probably find some kind of project box to mount everything in.

I would design and 3D print an enclosure for it all if I knew how!

How much could you really do with this, given that you would be very CPU-bottlenecked?

Generally you are looking at about a 2% loss when playing on an external display, and no more than 15% (not confirmed) when playing on the laptop display.
Again, it depends on many things. The main bottleneck is the bandwidth, so you might be able to do, say, ray tracing at 1080p 60 fps but not hold a minimum of 1440p 50 fps


Hey! Just writing to let everyone know that I updated my post, specifically with

  • new findings on a workaround for using this with an external screen without stuttering! yay!
  • information on being able to use another high-speed USB-C device simultaneously with the eGPU, since I just recently learned that each side of the Framework is capable of Thunderbolt at 40 Gbps, for a total of two devices rather than just one for the whole machine

I see where you are coming from, and thanks for the write up. I think for me the eGPU is for when I want to treat the Framework like a desktop instead of a laptop. I get how if you needed to bring that with all over the place, getting something as small as possible would be best, but I just would be worried about all the exposed non-supported elements getting unnecessary wear and tear.

For me, I think I would rather lug around a 7 pound enclosure (I mean 7 lbs isn’t really that heavy) and be more confident that the movement and transportation isn’t going to break anything inside.

That said, I’m sure there are things you could do to help reinforce the areas that give me cause for concern. Some kind of honeycomb enclosure that would allow you to anchor the card from the rear as well, to prevent unnecessary stress on the PCIe connector, etc.

Is that something you are worried about at all, or am I blowing things out of proportion here?

I can definitely see the concern about friction damage here, and these components in anti-static bags are never anything I would put in a bag that I’m not carrying with me at all times, like airport check-in luggage. I still pack my things in my backpack to minimize the GPU moving and rubbing on the exposed backplate side, and after many close calls where I thought I had damaged the very small components attached to the backplate, things turned out fine.

In other words, I have no way to confirm if carrying a GPU by itself in only an anti-static ziploc bag inside a backpack puts it at risk of damage, but I feel comfortable taking my chances from my personal experiences.

I would LOVE to 3D print a custom enclosure for the entire thing, but I have no idea how. I’ve always wanted to 3D print my own Velkase Velka 7 instead of paying $300 for that ITX case, actually!

The way it’s done on these platforms without discrete graphics is that all display output is passed to the iGPU, which then sends it to the display.
What probably happened before you disabled the iGPU is that the frame got rendered on the eGPU, sent back (because everything goes through the iGPU for display), then sent back out to the eGPU for your external display.

Laptops with discrete graphics, by contrast, are wired one of two ways:
Without a MUX (basically a switch), the dGPU is wired through the iGPU to the displays, which means the iGPU bottleneck is always there.
With a MUX, the display (and external outputs) can be connected to either GPU, which allows full performance, but a MUX is fairly expensive (~$100).

So if you connect an eGPU to a computer with an iGPU, it might work out of the box, or need just some motherboard tweaking.

If no graphics device is available, the computer falls back to CPU-computed graphics (it’s a thing?!)
Yes, it’s a thing. This is where the “4x faster graphics” claims in earlier iGPU advertisements come from: they compare against a similar-TDP CPU doing the rendering.
I tried CPU graphics on my old Lenovo Xiaoxin Air 13 Pro (it’s such a piece of crap, I don’t care if I destroy it). You can even play games like World of Tanks, but you know it’s not good when you see your CPU utilization hit 100%.

This is cool! I had no idea this was possible after buying a Ryzen 1700 for an old PC build and learning that it requires a dGPU to run. I guess Intel chips behave differently.


That’s cool! Thanks for sharing this info


For those running Linux as their operating system of choice: I use the Fedora flavor and did a write-up on my successful eGPU experience here
