A rant about the Framework 16's lack of gaming capability

While I am overall impressed by my Framework 16, I bought it as an all-in-one machine, including for some gaming. With respect to gaming, the laptop has been an overall disappointment. I understand that the 180W USB-PD power supply was the largest available at the time, but the broken EC code prevents 240W power supplies from being usable, which means you’re stuck draining the battery, and the same EC bug throttles the dGPU to about 22W when the battery hits 95%. To exacerbate the issue, a recent Windows 11 update changed the balanced power profile, causing battery drain while gaming and making the official workaround obsolete.

Before anyone mentions the community EC fixes: I’m stuck on Windows for a variety of reasons and can’t run them. It also shouldn’t be up to the end user to fix this problem, and it shouldn’t take six months to roll out a fix to the flagship product.

I’m really hoping that Framework announces a better dGPU at Computex, but with the RDNA4 laptop GPU rumors hinting that most parts will only have 8GB of VRAM, I’m concerned that even if they fix the EC code in the next BIOS update, we’re still going to be stuck with 8GB of VRAM. It’ll still be a $2500 laptop that only meets the minimum system requirements for any relatively new game, and whatever dGPU Framework chooses will be destined for the landfill in the not-too-distant future, at least for gaming. It wouldn’t surprise me if the Framework Desktop actually out-performs a Framework 16 with an RDNA4 GPU, simply because the user can allot however much RAM to the GPU as they choose.

So please, actually fix the EC code in the BIOS, announce an official 240W power supply, and announce a dGPU that isn’t already obsolete.

5 Likes

I do hope something about the FW16 is announced at Computex. Updates about the FW16 seem to have stagnated; there’s not even any news on the desperately awaited stable firmware update.

4 Likes

My biggest complaint about the FW16 itself is really the weak dGPU. I can’t wait to get a more powerful one. I have been fortunate that most of the games I play don’t seem to pull as much power, so I don’t often have the battery drain issues. I haven’t bought the 240W charger that some have, so I don’t have that issue either, although I am a bit worried about the USB charging issues. I’m literally about to board a flight and will see if I’ll have enough charge using an Anker Prime power bank. I’ve heard of people having issues charging at certain wattages.

I think the last presentation Framework gave really got to us FW16 owners. They went and released two new products and didn’t give us anything for the FW16. No updates on any of the prior issues, no new additions or hardware. We didn’t even get the cool bezels the FW13 got. I suppose there’s nothing stopping us from purchasing the translucent USB-C expansion cards, but those weren’t “for” the FW16.

We really need SOMETHING for the FW16. The sooner the better.

7 Likes

That SOMETHING is a new stable BIOS.

7 Likes

I wonder if we’re using the same laptop…

The latest Nvidia GPU that fits into the same “up to 100W TDP” bracket as the 7700S is the RTX 5070 Mobile (see the end of the table here), and that one is only 10-15% more performant than the 7700S.

So you already get a pretty good GPU in terms of performance per watt. And I can attest to that: my FW16 runs Satisfactory at 4K60 no problem, and playing Baldur’s Gate 3 split-screen on the laptop’s screen is a similarly fine experience. I would consider both of those use cases quite demanding performance-wise.

Now, if you want a GPU that produces 150-200W of heat, AMD had chips that did that in the same generation as the 7700S (see the table at the end of this post), so if Framework wanted to, they could have released higher-tier modules as well; the GPU connector is capable of delivering up to 200W to the GPU alone when the laptop is plugged in.

But those GPUs would require quite a bit more cooling hardware, which would probably transform the laptop into a completely different, much less mobile form factor. This is only my opinion, but for heavy gaming you’re much better off getting an eGPU. You won’t have the problems with the charger or the cooling system sounding awful, and you’ll still have an upgradeable (and much more resellable) GPU. Meanwhile, for many games, the 7700S is very competent.

1 Like

I don’t disagree that the 7700S can perform well at its maximum ~90W TDP (or that’s the max I’ve seen), but it won’t stay there. Mine ends up at about 60W after power management decides how much I get. I can maintain a stable 60fps in Indiana Jones and the Great Circle on low settings, but only for about 15 minutes. Then the battery reaches 95%, and the 7700S is limited to ~22W until the battery recovers. So: terrible performance for about 10 minutes, then back to the ~60W battery-saver mode.

This also touches on one of my primary concerns. In Indiana Jones and the Great Circle, I’m already hitting the limits of the 7700S’s 8GB of VRAM with the GPU load at only ~80%. A 9070S with 8GB of VRAM may not deliver any performance improvement in this game, and as more games adopt ray-tracing features, this is only going to get worse. I’d much rather have a down-clocked 9070M XT with 12GB of VRAM, as that will have more longevity.

5 Likes

The website / benchmark you link explicitly calls out that they’re comparing the 7700S at 100W TGP vs the 5070 at 50W TGP.

A 13% performance advantage at half the power limit is actually massive in mobile.

Notebookcheck has the performance difference closer to 30% once you let the 5070 stretch its legs.

Not saying Framework could or should use Nvidia; I’ve heard Nvidia doesn’t like to provide dies for use in modular form factors. But that comparison is much less flattering for the 7700S than you’re making it out to be.

Edit: it’s also not true that the 5070 is the top SKU that fits in 100W; Nvidia offers the 5080M from 80-150W and the 5090M from 95-150W TGP. Obviously, performance goes down for examples of the same SKU as you reduce power.

4 Likes

That is crazy. Maybe they will have a new BIOS once Computex is over, so they won’t have to worry about a new product launch for a while.

If you order everything bare minimum (no RAM, no SSD, DIY, no Windows, only USB ports, no RGB keyboard), it’s a $2K laptop.

But yeah, I am very disappointed in the Framework cooling solution, and the cracks around the AMD platform .. stuff just doesn’t work. The SD card reader just disconnected from a USB port reset a little while ago as well.

Whose fault that is, we really don’t know. There’s a very long list of things to fix on the Framework GitHub issue tracker, and it doesn’t seem to be getting any shorter.

Though for all the complaints I have about the subpar cooling and other things, I have to say the gaming performance .. is OK. For me, on the integrated GPU: Team Fortress 2 (recently got into that) and Bloons run fine. Derail Valley is a bit of a struggle, and so is Shapez 2. But it’s workable .. sort of. Barely. Not at 1600p, definitely not. 1080p.
Keep in mind the discrete GPU is theoretically 3x as fast, though.

USB-C PD rant

I think the whole 240W-over-Type-C thing is just dumb. I am surprised that the industry hasn’t come up with a better, universal power delivery system.

Before you say “but USB-C does 240W”, no. USB-C was designed for .. what, data? 60W? 20V?

You’re telling me that pumping 5A at 48V through those teeny tiny pins, which are so small you’re literally not allowed to touch them (and which survive maybe a few thousand cycles and stand up to no abuse), is going to reliably deliver 240W?

Nonono.

I don’t care if it is a spec. The ATX 12-pin high-power 12V-2x6 connector (whatever they call it) is a spec too, yet GPUs, cables and PSUs are melting left, right and center. Sure, it’s what, 1%, but that’s a bloody 1%. PCIe had a similar failure rate, but only at the start.

They should just invent, like, a “universal barrel jack”. Something like Dell’s, HP’s or Razer’s. Lenovo’s square one is also quite nice.
Gigantic contacts for minimum resistance and maximum current, a robust (and modular) connector to ensure reliability (Dell’s small barrel is less nice, but still does 135W), and a simple (albeit manufacturer-dependent) communication scheme for charger power so the laptop doesn’t overdraw.
Yeah, Lenovo makes 230W chargers for their mobile workstations. No problem. 20V .. 11.5A.
Or an enlarged Type-C, similar to what Apple used on the really old Mac Minis, or the recent GPMI.

I agree with this. I haven’t turned my desktop (RTX 3090) on in months and have just been using my Framework. If anything, I’ve been limiting the dGPU’s power draw. I’ve found that in some games, setting the CPU+GPU power draw limit to a combined 50W loses like 5-10 frames, but the laptop goes from max fan speed to nearly silent, and the game is still just as playable.
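For anyone wanting to try something similar, here’s a minimal sketch of how the CPU/APU side of a limit like that can be set with the open-source ryzenadj tool. The 30W value and the use of ryzenadj are just my illustration (not necessarily what was used above), and the 7700S’s own limit is handled separately by the GPU driver:

```
# Minimal sketch using the open-source ryzenadj tool (values are in mW).
# Caps the APU package at 30W sustained and boost power; pair this with a
# dGPU power cap from the GPU driver to hit a combined target.
sudo ryzenadj --stapm-limit=30000 --slow-limit=30000 --fast-limit=30000
```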

1 Like

There’s a reason the 7840U is actually surprisingly good: it uses less than 1/3 of the power while only losing about 20% of the performance, so perf per watt really doesn’t stack up too well at the top of the 7840’s power range. Since I am on the integrated GPU, in games like Derail Valley that are both CPU and GPU intensive, the CPU and iGPU fight for power, resulting in stuttery physics. I cannot replicate this even on, like, 2nd-gen Intel hardware, which is beyond ancient.

Though I would probably just halve the power (maybe 30W CPU and 100W GPU) rather than go down to a combined 50W, but we have the freedom of choice. You will probably lose basically nothing.

It remains to be seen what die the 7700S uses. If they used a cut-down die and boosted it higher, that’s bad. That’s a lot of money ($400); I want big silicon downclocked for efficiency.

Yeah, I probably would’ve gotten a U-series when I built my FW16 if they had offered it at the time.

I mean, you get the same silicon, so it’s not like you are losing out on anything. If anything, the HS offers greater possibilities, especially under combined CPU+GPU workloads (like playing Derail Valley).
Which, funnily enough, doesn’t run well on the 7840HS, because the CPU and GPU share the TDP and have to fight for it. This results in stuttery physics, where the train (and everything else physics-related) visibly speeds up and slows down in a hitching manner.

That would be unfortunate. I have 64GB of RAM, and the dGPU can only use its 8GB. That is barely enough to drive my two 4K displays :frowning:

As I work with AI, I want a more “unified memory” experience for the dGPU. I do not care about the performance loss due to PCIe data throughput, as it would still be faster than the CPU regardless.

Unified memory ain’t happening; you need the CPU and GPU on the same chip, or at least on the same board, for that.

Hi,

I am the author of the “community EC fixes”.
I have open sourced the fixes and told Framework where they are, so hopefully they will consider using them and everyone can benefit.
The community fix has been tested on the FW16 AMD 7840HS/7940HS and the FW13 AMD 7840U/7640U.
It has not been tested on any other FW models.
Although I do not play any games myself, my fixes did fix the issue for another user with a 240W PSU playing a game.
A partial / temporary fix (without applying the EC firmware fixes) can be achieved by running “ectool chargecontrol idle”. I think there are Windows and Linux versions of the tool.
It stops the laptop from trying to charge the battery while gaming; charging while gaming apparently causes stutters in the game.
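For reference, the workaround looks like this from an elevated (or root) shell, assuming your ectool build supports the chargecontrol subcommand:

```
# Stop the EC from charging the battery (the partial workaround above)
ectool chargecontrol idle

# Restore normal charging behaviour when done gaming
ectool chargecontrol normal
```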

1 Like

Agreed. My use case has also changed since I originally placed my order and the HS series is a better fit for that now anyways. I left a lot of details out of my earlier post for the sake of brevity.

1 Like

Could you please point to the ectool Windows repo/binaries? Can’t find it…

You could try the ectool from here:

It is a Windows version that GitHub automatically builds for me.
I don’t have Windows installed on my FW16, so please let me know if it works on your FW16 under Windows.
Its filename is “ectool-windows-amd64-ectool-win-test.zip”
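Assuming the zip contains an ectool.exe (and any driver it needs is already in place), a quick smoke test from an elevated command prompt might look like:

```
ectool.exe version
ectool.exe chargecontrol idle
```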

@James3 You’ve successfully made an exe file, but it does not seem to be able to do anything. Every command I’ve tried returns “ioctl errno 6, EC result 255 ()”. The only part of it that runs on Windows seems to be “help”.

Ideally, you want a 1st party, Framework-compiled-and-signed ectool. Been waiting for 4 years now.

3 Likes