My hope is that future, more efficient CPU and GPU options will become available. This would allow performance upgrades without increasing the thermal load.
The cooling on the FW16 is great, but without BIOS-level fan control, it can't fully live up to its potential. I've managed with some third-party solutions, but it's just not what it could be. When I do a render, I take off the keyboard and trackpad and put the laptop on a stand to get the most cooling I can.
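For anyone wondering what those third-party solutions can look like, below is a minimal sketch of a userspace fan curve. It assumes Linux, root access, and the Framework build of ectool; the fanduty and autofanctrl subcommands are the standard ChromeOS EC ones, and the thermal zone path and curve values are placeholders you'd tune for your own machine.

```python
#!/usr/bin/env python3
import subprocess
import time

# (temperature in deg C, fan duty in %) pairs -- illustrative values only
CURVE = [(50, 20), (65, 40), (75, 70), (85, 100)]

def cpu_temp_c() -> float:
    # sysfs reports millidegrees; the right thermal zone may differ per machine
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0

def duty_for(temp: float) -> int:
    duty = 0  # below the first threshold, let the EC-idle duty apply
    for threshold, percent in CURVE:
        if temp >= threshold:
            duty = percent
    return duty

try:
    while True:
        subprocess.run(["ectool", "fanduty", str(duty_for(cpu_temp_c()))],
                       check=True)
        time.sleep(5)
finally:
    # hand fan control back to the EC when the script exits
    subprocess.run(["ectool", "autofanctrl"], check=False)
```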
In the end, the Framework 16 is mostly a mid-to-high-performance laptop. It will never be a truly high-performance laptop, since it lacks adequate cooling for high-performance components, but that doesn't mean Framework shouldn't work on providing an upgrade path for it.
While it does get quite hot, the Framework 16 is still a great laptop in my opinion, and the promise of future upgradeability only makes it better.
The Framework 16 will likely always be a mid-to-high-level (or even just mid-level) performance laptop. More efficient CPU and GPU options will come out, but CPU and GPU manufacturers will still keep pushing to the thermal limits for the extra performance.
Even though Nvidia's 40 Series was significantly more efficient than the 30 Series, they still had the 4090 draw even more power and produce more heat to reach top-level performance.
More efficient processors will deliver more performance for less heat, but companies will keep pushing their high-power processors to the limit because it makes them look good. This is why the Framework will likely always remain a mid-to-high-performance laptop.
But this is also one of the advantages of the Framework ecosystem: it lets you get those generational improvements without purchasing a new device every time. You only need to upgrade the parts you want (like a new dGPU module or a new mobo), not the entire system.
The 7800M has also been released, without AMD really telling the world:
https://www.amd.com/en/products/graphics/laptops/radeon/7000-series/amd-radeon-rx-7800m.html
tl;dr
Navi 32, 60 CUs with 3840 stream processors, 2.15 GHz boost clock, 12 GB of GDDR6 on a 192-bit bus, 180 W.
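A quick sanity check on those numbers, assuming RDNA 3's usual 64 stream processors per CU and one 16 Gb (2 GB) GDDR6 module per 32-bit channel; the 18 Gbps memory speed is my assumption, not something from AMD's page:

```python
cus = 60
bus_bits = 192
gbps = 18  # assumed per-pin GDDR6 data rate

print(cus * 64)              # 3840 stream processors
print(bus_bits // 32 * 2)    # 12 GB of VRAM (6 channels x 2 GB)
print(bus_bits // 8 * gbps)  # 432 GB/s peak memory bandwidth
```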
But as long as there's no charger capable of supplying the laptop with 240 W, I don't believe we will see any of these for our FW16.
If you're gaming on a laptop, playing something so demanding that you got the latest and greatest dGPU for it, you've probably set yourself up somewhere comfortable. You brought your gaming mouse, maybe even your gaming keyboard.
Would it not be possible to have a separate power connection for any new, power-hungry dGPU they offer for the FW16?
Just have the FW16 CPU/system run from one power brick and the dGPU from another; tada, no more waiting for bigger, better power bricks.
Just wondering if that could be a consideration?
A dGPU manufacturer could easily add a barrel jack, but at some point an eGPU is just the better option.
I think they've got to do the things they said they would do for their new hardware platform first, most notably the 240 W USB-C power supply. Once they've got that dialed in, they can cook up whatever they have thermal runway for. Since they can't spread themselves too thin, maybe they'll go for something more performant but not top end; you don't want to spend tons of R&D on a low-volume, very expensive option. The 7700S was probably a good move to make sure their platform was successful at volume: no design surprises with heat or complications, and it fits the current 180 W power adapter limitation.
Because the slot for the graphics card is not limited by height and depth (only width), you could fit bigger fans and cooling fins to handle the higher thermals of more powerful GPUs. You could even have a dedicated USB-C port for power, allowing the laptop and the GPU to intake 180 watts each. (Wilder designs could even include fluid inlet and outlet ports for portable water cooling.)
Would this be big, bulky, and a little goofy looking? Sure. But I know a lot of people who would go for something like that if it meant they could do renders in the field, especially if they could slim down their computer with a quick card swap. The lack of "gaming" marketing for the FW16 means that researchers and scientists can more easily justify it on grants. This segment of the market is full of people who just need to push pixels and don't care too much about bulk, heat, noise, or aesthetics.
Computer go burrrrr.
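To put rough numbers on that dual-power-input idea (the 100 W CPU figure is just an illustrative placeholder):

```python
pd_epr_limit = 48 * 5    # 240 W, the USB-PD 3.1 EPR max for one port
two_bricks = 2 * 180     # 360 W combined from two 180 W adapters
system_draw = 100 + 180  # placeholder CPU package + a 180 W-class dGPU
print(pd_epr_limit, two_bricks, two_bricks - system_draw)  # 240 360 80
```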
The issue is that it's much easier and cheaper to get devices that are far more powerful than the Framework 16, with a much slimmer profile to boot.
Just take a look at laptops like the Lenovo Legion series. Both the 7/7i and the 9i have top-of-the-line Intel or AMD HX CPUs while also packing RTX 4090s. The Framework 16 couldn't beat any of them in either CPU- or GPU-related tasks.
The profiles of the laptops are around the same, with the Lenovos being even slimmer than the Framework 16. Even without the GPU, the Framework 16 would likely only just match the profile of the Lenovo Legions.
You could easily get the same performance as the Framework 16 in a much slimmer chassis these days, like the ASUS Zephyrus G series.
The market is there for high-performance portable computing, but the Framework 16's modularity is its downfall, since it comes at the cost of space and price.
For more details on the comparison between the two laptops, you can check this link: Lenovo Legion Pro 7i Gen 9 (16IRX9H, 2024) vs Framework Laptop 16: which is better?
The 7700S just isn't a compelling mobile GPU in terms of performance: worse performance and a worse feature set than Nvidia's offerings. 16 GB would have been a decent tradeoff, but there are expenses associated with putting 16 GB on a 128-bit bus beyond just the extra RAM chips. AMD's strategy of keeping RDNA and CDNA separate isn't helping either, since official ROCm support is limited to the 7900 GRE/XT/XTX in the consumer space while CUDA works on basically every Nvidia card.
They are launching the 12 GB 7800m, but I suspect those are just harvested 7900m dies. Costs aside, one would be better off with the full 7900m dGPU.
FW should embrace what it could be and offer a dGPU chassis that supports 120 mm × 15 mm and/or 120 mm × 25 mm fans. The extra thickness would raise the rear of the laptop and improve cooling. It would add bulk (obviously), but the FW16 is not a small laptop anyway. What it would gain is an undisputed advantage: extremely low noise under load compared to basically every other laptop.
I think the 7700S is fine overall on performance, but 8 GB of GPU memory is just on par with low-end GPUs. My previous laptop had 12 GB of VRAM on a 6700M, the previous iteration of the 7700S.
I understand AMD doesn't want to cannibalize their own market, but needing a high-end GPU just for the VRAM is ridiculous.
High-end GPUs are a bad idea on laptops: noisy fans, bulky chargers, high power consumption. They create unnecessary heat in your computer and will push your CPU/GPU/disk against thermal limits, causing bottlenecks (and random restarts).
Also, in most games the GPU is not going to be your major bottleneck; the CPU is.
I agree that the RX 7700S is a decent mid-level laptop GPU, but the VRAM and other capabilities are slightly lacking. AMD's ROCm is so much worse than CUDA, and even Intel is putting more work into oneAPI than AMD puts into ROCm. They aren't even trying to get consumer cards working with ROCm.
Personally, I think the best GPU for the Framework 16 would be the Arc A770M. It has 16 GB of VRAM, as much as an RTX 4090 Mobile. Its performance definitely isn't as good as the RTX 4090 Mobile's, but it has twice the VRAM of the 7700S. It also has a max TDP of 120 W while the Framework can provide 100 W, so it would be running near its max performance.
Sadly Framework’s gone with AMD Advantage this time, and it’s likely going to be some time before they have Intel chips in the Framework 16. Nvidia GPUs would be nice, but they would cost a fortune.
The 7700S is maybe 5-10% slower than a desktop RX 6600, so the 7x40 CPUs are not going to bottleneck it. The CPU's biggest gaming weakness is its low L3 cache, but otherwise it should keep pace with non-X3D CPUs in the 5600X-5800X and 7600X-7700X ranges (again, the low cache will be a factor for frame pacing/1% lows).
16 GB would allow the 7700S to age better and run games with higher-quality textures and less stuttering, regardless of raw GPU horsepower. That would have been a clear win over the competition. 16 GB would also be nice for basic LLM/AI-style tasks and development; that would have been another win (even with unofficial ROCm support). As it stands now, raster, RT, frame generation, and DLSS/FSR upscaling are all better on comparable Nvidia offerings, with the same 8 GB limitations.
AMD's mobile GPU lineup is limited: they essentially had four GPUs within ±10% of the 7700S and then (later) a big leap up to the 7900m. The 7700S was probably the best choice (higher CU count, lower clocks, and thus better performance per watt), so I don't blame FW for choosing it. And there are AMD advantages, especially on Linux.
Heat management in laptops is a thing, as are weight, size, and battery life. However, consider that the majority of us slap a case on our slim cell phones the minute we buy them. I believe some of us would be willing to add some thickness to our laptops to improve cooling and noise; many of us already use raised stands or laptop cooling pads. Obviously FW doesn't have large market share, and the subset of FW16 owners with the dGPU is smaller still, but I think offering a replacement dGPU chassis that allows larger fans for better cooling and lower noise would be a great upgrade and offer a benefit that can't be found elsewhere.
Regarding the Arc A770M, the reason it has 16 GB of VRAM is that it runs on a 256-bit bus, just like the 4090m and 7900m. The upcoming 7800m runs on a 192-bit bus and hence has 12 GB. GPUs like the 7700S and 4070m have 8 GB because they are on a 128-bit bus. 128-bit cards can have 16 GB (see the 7600 XT and 4060 Ti 16 GB), but that requires memory modules on both sides of the PCB, which is easier to do on a desktop GPU with a backplate and no real size constraints. I imagine that's a bigger issue in a laptop.
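That bus-width-to-capacity relationship is easy to put in code form. A sketch, assuming today's densest common GDDR6 module (16 Gb, i.e. 2 GB) per 32-bit channel:

```python
def vram_gb(bus_bits: int, clamshell: bool = False) -> int:
    # one 2 GB module per 32-bit channel; clamshell doubles that by
    # placing modules on both sides of the PCB
    return (bus_bits // 32) * 2 * (2 if clamshell else 1)

for bus in (128, 192, 256):
    print(bus, vram_gb(bus), vram_gb(bus, clamshell=True))
# 128 -> 8 GB (16 GB clamshell: 7600 XT, 4060 Ti 16 GB)
# 192 -> 12 GB (7800m)
# 256 -> 16 GB (A770M, 4090m, 7900m)
```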
Different dies.
The 7900m uses the Navi 31 die, which is a 304 mm² die with 96 CUs (only 72 active in the 7900m) and is also used in the desktop 7900 XTX, 7900 XT, and 7900 GRE. The 7900m also has 4 MCDs (dies containing memory controllers and cache).
The 7800m uses the Navi 32 die, which is a 196 mm² die with 60 CUs and is also used in the desktop 7700 XT and 7800 XT. The 7800m also has 3 MCDs.
The 7700S (and the rest of the RDNA 3 mobile lineup) uses the Navi 33 die, which is a 204 mm² die with 32 CUs and is also used in the desktop 7600 and 7600 XT. Navi 33 has its memory controllers and cache integrated into the main die, so there aren't any MCDs.
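Tying that back to the bus-width point above: each RDNA 3 MCD carries a 64-bit GDDR6 interface, so the MCD count fixes the bus width, which in turn fixes the usual VRAM capacity:

```python
for name, mcds in [("7900m (Navi 31)", 4), ("7800m (Navi 32)", 3)]:
    bus = mcds * 64  # one 64-bit GDDR6 interface per MCD
    print(f"{name}: {bus}-bit bus -> {bus // 32 * 2} GB")
# Navi 33 (7700S) integrates its 128-bit interface on-die -> 8 GB
```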