So I thought there were some discrepancies in your data (both internally and with what I thought I knew).
First, your 60W charger runs show lower wattages than on battery without the GPU; AFAIK, the laptop should be able to draw from the battery in addition to the charger to cover peaks in power draw.
Second, you measured a power draw (I assume wall power) of 22W with the GPU disabled and 12W with it enabled, with otherwise the same parameters (first two tests). That seems wrong.
So I ran my own quick-and-dirty stress test on Linux with `stress -c 16`, an artificial all-core stress test with sustained load.
All data is from `sensors`, which reports PPT/socket power via amdgpu-pci-c100, battery power draw via BAT1-acpi-0, and temperatures (among others) via k10temp-pci-00c3 (those temps were by far the highest).
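In case anyone wants to reproduce this, here is roughly how the numbers can be collected (a minimal sketch; it assumes lm_sensors and stress are installed, and chip names like amdgpu-pci-c100 will likely differ on other machines):

```bash
# Start the artificial all-core load for 5 minutes in the background
stress -c 16 -t 300 &

# Poll the relevant sensor chips once per second while the test runs
while kill -0 $! 2>/dev/null; do
    sensors amdgpu-pci-c100 BAT1-acpi-0 k10temp-pci-00c3
    sleep 1
done
```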
| Power Source | Power Profile | Condition | Socket Power | Battery Draw | Temps |
|---|---|---|---|---|---|
| Battery (65%) | Power Save | stress -c 16, Display 40% | 20W | 32.8W | 67°C |
| Battery (70%) | Balanced | stress -c 16, Display 40% | 36W | 55W | 96°C |
| Battery (75%) | Performance | stress -c 16, Display 40% | 38W | 59W | 100°C |
| 65W Charger + Bat (70%) | ANY (no effect) | stress -c 16, Display 40% | 30W | 9.3W | 86°C |
| 65W Charger + Bat (70%) | ANY (no effect) | C++ compilation, 14 threads, Display 40% | 30W | 5.8W | 82°C, peak 95°C as it moved to fewer cores |
| 65W Charger + Bat (75%) | Power Save | idle w/ top and browser open, Display 40% | 5.2W, average 3.2W | 45W (charging?) | 42°C |
| 65W Charger + Bat (75%) | Balanced | idle w/ top and browser open, Display 40% | 7W, average 3.2W | 45W (charging?) | 42°C |
| 65W Charger + Bat (75%) | Performance | idle w/ top and browser open, Display 40% | 15W, average 3.2-4W | 43-44W (charging?) | 42°C |
| Battery (75%) | Power Save | idle w/ top and browser open, Display 40% | ~4W, average 3.2W | 9.85W | 39°C |
| Battery (75%) | Power Save | idle w/ top and browser open, Display 0% | ~4W, average 3.2W | 7.1W | 38°C |
| Battery (75%) | Power Save | idle w/ only sensors, Display 0% | ~4W, average 3.0W | 6.44W (or 6.9W w/ minor activity) | 37°C |
So yes, there definitely seems to be something weird going on: at least on Linux, the power profiles seem to have no effect on peak power draw, only on base power draw. And since peak power draw in any AC mode is lower than on battery with Balanced or Performance, there certainly seems to be a problem.
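If someone wants to rule out the profile switching itself, this is how I'd verify it from the command line (assuming power-profiles-daemon is what KDE's battery applet drives here, which I believe is the default on Manjaro KDE):

```bash
# List the available profiles and show the currently active one
powerprofilesctl list

# Switch the profile, then watch whether the socket power actually changes
powerprofilesctl set performance
watch -n 1 sensors amdgpu-pci-c100
```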
I measured the C++ compilation times of my project (again 14 threads only), both on battery and wall power (65W):
- Compilation on battery: 48W peak, 42-40W sustained, 1:42
- Compilation on wall power: 36W peak, 30W sustained, 1:46
So I can confirm that the real-world performance is slightly lower on wall power, with BOTH runs on the Performance power profile. Something seems to be wrong.
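For reference, the timing can be done along these lines (a sketch; `make -j14` is a stand-in for whatever your actual build command is):

```bash
# Log the socket power (PPT) once per second in the background during the build
( while true; do sensors amdgpu-pci-c100 | grep -i ppt; sleep 1; done ) &
LOGGER=$!

# Time the 14-thread compilation; substitute your own build command
time make -j14

kill $LOGGER
```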
And in practice, the peak momentary socket power I observed on battery was:

- 48W during the stress test on battery, Performance
- 60W during a custom multicore workload with some iGPU use on battery, Balanced
- 63W during a custom multicore workload with some iGPU use on battery, Performance
For some more context on the workload: it doesn't necessarily load all cores, and it has a sustained socket power of 35-40W with a battery draw of up to 55W (display at 0%) in the battery Performance test.
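Catching these momentary peaks is fiddly with plain `sensors` output, so I'd poll quickly and keep the maximum, roughly like this (a sketch; it assumes the socket power line is labelled "PPT", which may vary by kernel version):

```bash
# Sample the PPT reading ~10 times per second, report whenever the peak rises
max=0
while true; do
    w=$(sensors amdgpu-pci-c100 | awk '/PPT/ {print $2; exit}')
    if (( $(echo "$w > $max" | bc -l) )); then
        max=$w
        echo "new peak: ${max}W"
    fi
    sleep 0.1
done
```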
I'm running Manjaro KDE, and the FW16 is configured with a 7840HS, no dGPU, 64GB of 5600MHz RAM (hence a slightly higher-than-normal power draw), and a 2TB SSD.