Dell has an i7-1165G7 and 16GB of LPDDR4X RAM
FW has a Ryzen 7 7840U and the Crucial 2x16GB DDR5 kit
Brightness set to 30% on both (they both have the same peak brightness, so levels should be similar)
Both are on battery. The FW has a larger 61Wh battery, the Dell a 50Wh one.
The difference is quite striking: the Dell is predicted to outlast the AMD by a good margin, despite being at 47% charge vs the AMD's 74%.
Notice the comparatively crazy high temps on the AMD (over 50°C vs below 40°C).
OS is the same, same level of updates. The Dell is even running Vorta + Teams in the background (I forgot about them at the time of testing), which the FW isn't.
Also, on Intel I used to be able to confirm it was using HW-accelerated decoding with intel_gpu_top… what's the equivalent for AMD?
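For context, this is the check I do on the Intel machine, plus the AMD-side tools I've seen suggested but haven't verified myself:

```shell
# Run while video is playing; the "Video" engine row climbing above 0%
# is what confirms hardware decode on Intel.
sudo intel_gpu_top

# Candidates for AMD (unverified on my end, assumptions):
# radeontop shows overall GPU activity; amdgpu_top reportedly breaks out
# the VCN (video encode/decode) engine, which is what I'd want here.
```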
The situation on AMD needs to improve, or all the noise about the better process node and efficiency is gonna stay a rumor in my book.