FW 16: No Thunderbolt (4) Support?

After reading through all the documentation I can find, it appears there is no official TB(4) support? I don’t believe the specs even say 40G vs 20G anywhere.

This is the closest thing I have determined… USB4 speeds on the rear ports (2x).

There is plenty of hypothesizing and conjecture in prior threads.

I guess it is possible we have all TB4 features available on the two USB4 ports, but without testing and certification, we would have to hear from the Framework team on what the capability/feature delta is between TB4 and USB4 (hopefully none).

As a user who primarily works docked (into a TB4 dock), this is important to me as a consumer. Thanks in advance for any official answer or detail.

Further details and documentation are also welcome, such as USB-C output wattages, the exact limitations of the lower expansion card slots, etc.

6 Likes

And if I’ve missed these answers being shared in an official capacity, please link them. TIA.

The Thunderbolt name is owned by Intel; certification and approval to use the name have to go through them. Full-featured USB4 provides the same capabilities, just without Intel’s stamp.

And note that while Framework staff do sometimes answer questions here, this is primarily a community forum. You can contact support if you want an official response.

There is a blog post planned on the connectors; I suspect exact details will be given then.

Framework | Framework Laptop 16 Deep Dive - Battery and Speakers

7 Likes

As @MJ1 wrote, Thunderbolt is a trademark of Intel (and Apple). Reviewing the AMD Ryzen 7 7840HS spec sheet, this CPU supports 2x USB4 ports, which means you’d have 2x 40 Gb/s ports out of the six ports on the FW16. Note that 40 Gb/s with USB4 is only guaranteed when using a 1 m or shorter cable; with a 1.5 m or 2 m cable, speeds drop to 20 Gb/s. Thunderbolt 4, on the other hand, supports 40 Gb/s over cables up to 2 m.
Otherwise, USB4 is “similar” to Thunderbolt, but there are other differences. See this article, which goes over them in detail. For most users, the big one is that Thunderbolt 4 can drive 2x 4K monitors from a single port, while USB4 is only required to drive one.

https://www.tomsguide.com/features/thunderbolt-4-vs-usb4-whats-the-difference
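
If you want a rough feel for why 40 vs 20 Gb/s matters for the dual-monitor case, here’s some back-of-the-envelope math. A sketch only: it assumes uncompressed 8-bit RGB with ~10% blanking overhead, and it ignores tunneling overhead and DSC (which changes the picture entirely).

```python
# Back-of-the-envelope: do two uncompressed 4K60 streams fit in a USB4 link?
# Assumes 8-bit RGB (24 bpp) and ~10% blanking overhead (reduced blanking);
# ignores protocol/tunneling overhead, so real headroom is smaller.

def stream_gbps(width: int, height: int, hz: int,
                bpp: int = 24, blanking: float = 1.10) -> float:
    """Approximate raw bandwidth for one video stream, in Gb/s."""
    return width * height * hz * bpp * blanking / 1e9

one = stream_gbps(3840, 2160, 60)           # ~13 Gb/s per 4K60 stream
print(f"one 4K60 stream : {one:.1f} Gb/s")
print(f"two 4K60 streams: {2 * one:.1f} Gb/s")
print("fits in 40 Gb/s :", 2 * one < 40)    # True  (1 m / active cable)
print("fits in 20 Gb/s :", 2 * one < 20)    # False (2 m passive cable)
```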

5 Likes

And as far as I know, the AMD USB4 implementation, if fully enabled, also allows full TB3 compatibility, including eGPU support, on the Framework AMD mainboards.

That TB4 is not supported is not a shortcoming of the Framework laptop itself; it is a question of what the AMD mainboard is able to support.

Actually, no AMD-based notebook supports TB4, and only a few support USB4 with its full specs (DP, Power Delivery, 40 Gb/s, and eGPU compatibility).

And it is the responsibility of the vendor to implement it.
For example, even the really expensive Dell Alienware m16 notebooks with the newest AMD chipsets only support USB 3.2.
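
If you are on Linux, you can at least check what a connected device actually negotiated, rather than guessing from marketing names. A minimal sketch, assuming a recent kernel; the `generation` and `rx_speed`/`tx_speed` sysfs attributes are missing on older kernels, hence the fallback:

```python
#!/usr/bin/env python3
"""List Thunderbolt/USB4 devices and the link speed they negotiated,
using the sysfs attributes the Linux `thunderbolt` driver exposes."""
from pathlib import Path

def attr(dev: Path, name: str) -> str:
    """Read one sysfs attribute, or return '?' if this kernel lacks it."""
    try:
        return (dev / name).read_text().strip()
    except OSError:
        return "?"

base = Path("/sys/bus/thunderbolt/devices")
if not base.is_dir():
    raise SystemExit("no Thunderbolt/USB4 bus registered on this kernel")

for dev in sorted(base.iterdir()):
    if not (dev / "device_name").exists():
        continue  # skip domain/retimer entries; only routers have device_name
    print(dev.name,
          attr(dev, "vendor_name"), attr(dev, "device_name"),
          "| gen:", attr(dev, "generation"),   # 3 = TB3, 4 = USB4
          "| rx:", attr(dev, "rx_speed"),
          "| tx:", attr(dev, "tx_speed"))
```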

4 Likes

@Uli_Feltes If I understand you correctly, the Framework 16 with the Ryzen 7 7840HS supports a Thunderbolt 3 eGPU. Am I misunderstanding?

We are awaiting confirmation from FW, as some of the capabilities of TB3 under the USB4 spec are optional.

Here is a table for general reference.

2 Likes

While it’s always nice to have official confirmation from FW, I’ll go out on a limb here (lol) and use the AMD specifications for the Ryzen 7 7840HS to pretty much confirm that the USB4 implementation in all AMD-based FW laptops going forward will be full USB4 at 40 Gb/s. This means nearly all TB4 features are included. I say nearly because some docks may have a feature or two that won’t work the same over a pure USB4 40 Gb/s link. For example, dock-based NICs and the ability to drive dual displays through the dock are two potential hiccups you should be aware of when dock shopping.

Here’s the full spec sheet for the Ryzen 7 7840HS: https://www.amd.com/en/product/13041

2 Likes

That’s the main reason for wanting an Intel CPU/chipset with full Thunderbolt 4 support! So unfortunate, and such a deal breaker, that Framework decided to go AMD-only on the 16″ model ;-(

The poll here says they made the right choice. The community voted AMD over Intel, and by a large margin: 80%.

Framework does not have the resources of larger companies. They can’t provide every option all at once. Most people understand that more options will come in time, and that just impatiently complaining won’t help.

7 Likes

I don’t think it was so much a decision to go AMD only. I think it’s more a matter of limited resources and having to start somewhere. With the 13" they started with Intel and after a couple of years (while many said no AMD was a dealbreaker), they are finally releasing an AMD option. With the 16" they started with AMD. Hopefully an Intel option will be available in the future.

6 Likes

Almost 99% likely to be unrelated, but pasting here due to the similarity to the chart above.

3 Likes

Unfortunately, the community, like many, may be focused on the wrong things: performance over reliability. Intel offers a reliability that AMD has tried to match for over 20 years.

1 Like

You make a good point. There’s no perfect solution. It’s a matter of tradeoffs and for the 16" it appears AMD rolled out the red carpet for FW and FW chose to take them up on it. Hopefully the AMD based FW16 is a winner.

1 Like

That’s a hot take of a sweeping generalisation! We’re not in the Super Socket 7 days anymore. Have you seen the number of datacentres using Epyc?

6 Likes

I have to admit that since using my “AMD Ryzen Embedded V2748 with Radeon Graphics” server with ECC RAM, I haven’t had a single hiccup.
Details on my server: Build blog: Server IMB-V2000P setup

Intel has spent too much time resting on its “stability” capital. AMD has caught up a lot, even more so in the technology used (a 7 nm CPU here, Zen 2), which also reduces power consumption drastically.

7 Likes

The FW13 debuted with Intel. AMD people had to wait to get AMD.

The FW16 will debut with AMD. Intel people will have to wait to get Intel. Intel people probably won’t have to wait as long as AMD people though.

And Framework is a small company, so they don’t have the resources to get both out at the same time. I’m still surprised they were able to offer AMD on the 13; I thought for sure it’d be Intel-only for a long time (just like System76).

I’m pretty happy with the FW16 being AMD first. I get the efficiency, and hopefully the battery life too. I’m also surprised AMD came first this time. I think it was a good choice for Framework (a small company) to work with AMD (which really needs some wins over Intel and Nvidia).

I’m not a fangirl by any means. I always buy what works best for me, and in the days of the K6 and Athlon, I was AMD all the way (young and strapped for cash, so a budget builder). When the Core 2 series came out, I went Intel and stayed there. It’s only when Zen/Ryzen came out that AMD became the better choice for me again.

At this juncture, I am actually torn between the Intel 13th-gen series and the Ryzen 7000 series. Both seem to have pros and cons. Plus, price is less of an issue as a working adult.

People who are happy with AMD are probably just like me: not having the latest Thunderbolt protocol is not a showstopper. For me, having even TB3-level USB4 functionality is more than good enough. I am prioritizing efficiency/battery life and Linux compatibility (AMD GPU). I’m also not completely sold on the big/little architecture for x86 just yet, mostly because I’m on Linux and I’m not sure how well the scheduler handles that kind of arch.

If I were a Windows 11 user, then I would wholeheartedly go with an Intel CPU and Nvidia GPU.

Whatever the reason 80% of people are happy with the AMD CPU/GPU combo, I guess I’m part of that.

PS: This whole thing feels the same as when I was looking to buy my first System76 laptop. So many people wanted AMD, but none was offered until a year after I purchased my Intel version.

2 Likes

Please stop the debate about Intel vs. AMD. That could go on and on and on and it’s not on topic.

We’ll start deleting posts if it goes on much more.

4 Likes

I’m glad to see USB and Thunderbolt evolving. Maybe I’m getting older, but these protocols seem to be updated much sooner than I remember. I’d also like to see Oculink become a mainstream port for laptops and handheld gaming devices.

So that makes me wonder: why does the AMD platform have fewer high-end/new connectivity options? Is it because TB requires certification with Intel? If so, could OCuLink be the answer?

I really hope to see open protocols become standard and mainstream rather than proprietary ones, even if they aren’t the best. E.g., FSR vs DLSS, FreeSync vs G-Sync, XMP vs EXPO, etc.

Well, Intel got a bit of a head start there; they invented it together with Apple, and it still took three generations for it to become at least somewhat common outside Apple products. It’s still missing on low- to mid-range Intel machines, even though most of the previously expensive parts now come for free with the CPU (you still need the redrivers, which aren’t cheap, and the certification, but the barrier is a lot lower these days).

There are a lot of things people use Thunderbolt for, and OCuLink only really covers one of them (and even that with both drawbacks and advantages).

Most of what people use Thunderbolt for can actually be covered with just a simple USB3 + DP Type-C port (a dock with a bunch of USB ports, networking, a bunch of screens via MST, and of course power). But the issue there is that with a Thunderbolt port you know what you are getting, while with a random USB-C port it’s a bit of a luck of the draw, depending on how the manufacturer felt that day.
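
On Linux you can at least peek at what a given Type-C port advertises, instead of relying on the manufacturer’s mood. A rough sketch, assuming the kernel’s typec class is populated (the DisplayPort alternate mode is registered under VESA’s SVID 0xff01):

```python
# Peek at what each USB Type-C port advertises via the Linux typec class.
from pathlib import Path

base = Path("/sys/class/typec")
if not base.is_dir():
    raise SystemExit("no typec class on this kernel")

for port in sorted(base.glob("port[0-9]*")):
    if "-" in port.name:
        continue  # skip "port0-partner" / "port0-cable" entries
    print(port.name)
    # A port's alternate modes appear as children named "port0.0", "port0.1", ...
    for alt in sorted(port.glob(f"{port.name}.*")):
        svid = (alt / "svid").read_text().strip()
        tag = "  <- DisplayPort alt mode" if svid.lower().endswith("ff01") else ""
        print(f"  svid {svid}{tag}")
```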

For the eGPU use case, where OCuLink shines, it definitely provides somewhere between more and a whole lot more bandwidth than TB4/USB4 (depending on whether you get 4x or 8x). It does still have the drawback of not being hot-pluggable, which TB/USB4 is; that may be a dealbreaker for some. PCIe hotplug is possible, but it needs to be specifically implemented at every level.
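
Some rough numbers to put that in perspective. A sketch: it assumes PCIe 4.0 signaling on the OCuLink side and the commonly cited ~32 Gb/s ceiling for PCIe tunneled inside a 40 Gb/s TB4/USB4 link; real eGPU throughput is lower still due to protocol overhead.

```python
# Usable PCIe bandwidth: direct OCuLink lanes vs a TB4/USB4 PCIe tunnel.

GTPS_PCIE4 = 16.0   # PCIe 4.0: giga-transfers per second, per lane
ENC = 128 / 130     # 128b/130b line-coding efficiency

def pcie_gbps(lanes: int) -> float:
    """Effective PCIe 4.0 bandwidth for a given lane count, in Gb/s."""
    return GTPS_PCIE4 * ENC * lanes

print(f"OCuLink x4 (PCIe 4.0): {pcie_gbps(4):6.1f} Gb/s")   # ~63 Gb/s
print(f"OCuLink x8 (PCIe 4.0): {pcie_gbps(8):6.1f} Gb/s")   # ~126 Gb/s
print("TB4/USB4 PCIe tunnel :   ~32.0 Gb/s (of the 40 Gb/s link)")
```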

Agree with you on most of those, except the XMP/EXPO thing: those are just a set of “known, probably working” settings, which are manufacturer-specific by nature (the Intel and AMD memory controllers behave very differently). A better solution there would be to convince JEDEC to make spicier standard profiles XD.

1 Like