Hi, I was wondering what the total power draw of the Framework 16 would be. I'm currently looking into investing in the graphics module, but I've seen many reviews and people saying that 180W isn't enough for graphically intensive tasks (e.g. gaming). I'd like to wait for a 240W power adapter from Framework (I know the Delta power adapter exists, but I don't want to buy that just yet since its cable isn't detachable), but they have no plans to release one since consumer 240W USB-C PD chargers don't exist yet. I want to see the power draw on the device to see if I can cap the GPU power using MSI Afterburner.
i use a mix of chargers for my FW16 w/ GPU. there's a 65W charger i use on the couch which is more than powerful enough to charge the laptop while writing or watching videos. it won't hold up when gaming, but as long as i'm not gaming it'll keep the laptop going.
I also use a thunderbolt dock with 95W of charging. that'll handle most CPU-bound workloads like number crunching and minecraft (even heavily modded, so long as i don't do shader packs) and keep the battery charged to the 80% limit i have set.
but that one's not strong enough for GPU-intensive games. if i game on something that stresses the GPU, the laptop starts discharging the battery to make up the difference; in the games i play, i lose about a percent of battery charge every two to three minutes like that.
and i use the 180W charger that came with the laptop for heavy gaming sessions. plugging it in alongside my thunderbolt dock has the laptop seamlessly flip to charging from the more powerful charger, and it will not only keep up with the games i play but also slowly charge the battery back up to the 80% setpoint.
The only time i've managed to make the laptop draw so much power that the battery was discharging with the 180W charger plugged in was when i was running Prime95 to stress the CPU to the max and the fuzzy donut of death that is FurMark at the same time. that would slowly drain the battery, but even that unrealistically intense and constant workload took over an hour for me before it hit 20%.
a 240W charger will definitely have enough juice to sustain maximum warp indefinitely, but honestly, the first-party 180W charger will get you where you need to go for all realistic workloads.
edit: corrected typo. the stock charger is 180W, not 150W as originally written.
People report dGPU performance being worse with the 240W charger because of bad boosting behavior, so I'd wait on getting any 240W charger until it's properly supported in firmware.
As for consumption: for doing work with occasional CPU spikes (running a JetBrains IDE plus some other stuff), it's fine to stay on a sub-100W charger. When playing Satisfactory at 4K with the dGPU on the stock 180W power adapter, the laptop discharges slowly, going from 100% to 30% over 3+ hours of play. Satisfactory isn't one of the most graphically demanding games, though.
I currently run the 180W adapter with the 7840HS + 7700S for gaming.
Overall it's solid if you don't care about pushing past native res and mid-to-high settings in most games.
FF7 Rebirth, Indiana Jones and the Great Circle, and South of Midnight are the newest games I’ve played.
I do not see battery drain if I keep Windows power management on Balanced and use mid-to-high settings, targeting the 16's native display res and a solid 60-90fps.
When pushing settings higher and wanting to avoid an FPS fall-off, I have to set Windows power management to Performance and set the AMD software to Performance as well.
This does drain the battery; I got about 5 hours before it was down to 15%.
Battery drain has also been an issue when jumping into new games in their first couple of weeks, before performance and bug-fix patches get pushed.
I basically have the same configuration so far. I'm currently using a 100W brick with a 10ft 100W cable and it runs well. I was looking into upgrading anyway because the iGPU isn't powerful enough to play the games I want to play.
I see. I still kind of want to wait for the 240W since I'd be guaranteed to run any GPU-intensive games or programs and wouldn't need to worry about battery drain. I can definitely push it to its limit if I want to, but I'll have to wait until either Framework releases the charger and firmware, or I get the Delta charger, which is probably unlikely.
Given that it requires constant 100% utilization to even hit the power limits and start discharging the battery, I personally would not worry about power issues on the 180W charger.
it is highly unrealistic for both the CPU and GPU to be tasked to 100% while gaming. there's always going to be a bottleneck that prevents one or the other from hitting a perfect 100%, and that will keep the power budget low enough that the 180W charger can keep up with any real-world workload.
But if you want to wait to be sure, that’s valid too. Plus if you wait there might be a next gen announcement for the FW16 that’s bound to be enticing.
I'll give it some time for Framework to release any new-spec GPUs and motherboards. I heard AMD is skipping a gen for laptop CPUs and GPUs, so the wait will probably be longer than usual. But hopefully the GPU will get cheaper by then (and hopefully the 240W gets released too).
My F16 uses around 30W overall in normal use, as I keep the dGPU in S3Cold as much as possible, the CPU capped to 3.6GHz (no CPU boost), and the external monitor on the laptop's USB-C.
When gaming, it draws over 180W in some games, especially under VFIO (no SmartShift).
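If you want to reproduce that kind of CPU cap on Linux, here's a minimal sketch using the standard cpufreq sysfs interface. It assumes your driver exposes these knobs (paths can vary) and needs to run as root:

```python
#!/usr/bin/env python3
# Minimal sketch: cap CPU frequency and disable boost via the Linux
# cpufreq sysfs interface. Run as root; frequencies are in kHz.
from pathlib import Path

CAP_KHZ = "3600000"  # 3.6 GHz

# Disable turbo/boost if the driver exposes the global knob.
boost = Path("/sys/devices/system/cpu/cpufreq/boost")
if boost.exists():
    boost.write_text("0")

# Cap the max scaling frequency on every cpufreq policy.
for policy in Path("/sys/devices/system/cpu/cpufreq").glob("policy*"):
    (policy / "scaling_max_freq").write_text(CAP_KHZ)
```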
How do you set it to this mode? I only see it go into d3cold, and only if I haven't used the rear port. Once I do, I have to reboot to get it back into d3cold. How are you manually setting it to some "S3cold" mode I've never heard of?
i'm 99% certain that s3cold is a typo for d3cold, given the letters are right next to each other on the keyboard, and i don't know of any S power states for PCIe devices, only for full systems.
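as a sanity check, the kernel exposes the device's actual D-state through sysfs, so you can see for yourself which state the card is in. a rough python sketch (the PCI address below is just a placeholder, look yours up with lspci):

```python
#!/usr/bin/env python3
# Sketch: print the runtime D-state of a PCI device on Linux.
# The address below is a placeholder -- substitute your dGPU's (see lspci).
from pathlib import Path

DGPU = "0000:03:00.0"  # placeholder PCI address

state = Path(f"/sys/bus/pci/devices/{DGPU}/power_state").read_text().strip()
print(f"dGPU power state: {state}")  # e.g. "D0" when active, "D3cold" when off
```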
If you're not gaming (or don't care about the small performance penalty), use a side USB-C port for the external monitor instead, so it runs off the iGPU rather than the dGPU.
Use Wayland instead of X. The full explanation is long, but basically Wayland does a better job of dealing with GPUs that have no monitor attached.
Also, if you use Chrome-based browsers, disable GPU acceleration; otherwise they will randomly grab your dGPU and never release it until you close the browser.
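If you want to see what's pinning the dGPU awake, you can scan for processes holding its render node open. A rough sketch, assuming the dGPU's render node is /dev/dri/renderD129 (check /dev/dri/by-path to confirm yours); it won't catch every kind of handle, but it catches the browser case:

```python
#!/usr/bin/env python3
# Sketch: list processes with the dGPU's render node open, i.e. whatever
# is keeping the card from suspending. renderD129 is an assumption --
# check /dev/dri/by-path for the node that maps to your dGPU.
import os
from pathlib import Path

RENDER_NODE = "/dev/dri/renderD129"

for proc in Path("/proc").iterdir():
    if not proc.name.isdigit():
        continue  # skip non-process entries in /proc
    try:
        for fd in (proc / "fd").iterdir():
            if os.readlink(fd) == RENDER_NODE:
                print(proc.name, (proc / "comm").read_text().strip())
                break
    except (PermissionError, FileNotFoundError):
        continue  # process vanished or we lack permission; skip it
```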