I am running Debian testing on a FW 13-AMD 7840U with the stock 6.10.9-amd64 kernel, using PPD 0.23 from Debian testing. I was getting pretty decent battery life from it (between 5 and 10 Watts under a modest load). I got excited by the arrival of the 2.8K display and installed it on this machine, and now power consumption is consistently above 12 Watts, sometimes as high as 18, under the same loads as before. Is this to be expected? It makes my battery last somewhere around 4 hours, which I find too low. I am thinking of putting the original display back, partly because I don’t see enough of a difference between the two displays to sacrifice that much battery life.
Have you turned on VRR? This should help.
Didn’t think we had VRR on Linux. I’ve been running 60 Hz for battery life.
Hank, it’s pushing more pixels at double the refresh rate. I don’t think the big jump in discharge is the panel itself so much as what it takes to drive it. Can you check what discharge rate you get at 60 Hz and something close to the original resolution?
Sorry for the dumb question: How do I turn on VRR? I am using GNOME/Wayland.
Thanks. Hank
I discovered I had an open Firefox tab that was eating a lot of CPU cycles. Closing it brought me down a watt or so, and turning the refresh rate down to 60 Hz brought the discharge rate down another 2 watts. I’m back to close to 6 Watts, which I can live with. I’ll also try dropping the resolution at some point. Thanks for the suggestion.
It’s an experimental option in GNOME. Look it up. In KDE it’s front and center in the display settings by default.
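If it helps, the GNOME toggle can also be flipped from a terminal. A sketch, assuming a recent GNOME on Wayland (the key lives under mutter’s experimental features and may change name in future releases):

```shell
# Enable GNOME's experimental VRR support (Wayland sessions only).
gsettings set org.gnome.mutter experimental-features "['variable-refresh-rate']"
# Confirm it stuck; the refresh-rate option should then show up in
# Settings > Displays after logging out and back in.
gsettings get org.gnome.mutter experimental-features
```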
I was thinking just to get a baseline of the difference between new and old panel.
Thanks for the tip, I found it and am trying it now: Variable refresh rate from 30 to 120 Hz.
I’m trying Mario’s suggestion for VRR. I’ll try yours next if that doesn’t help. Setting it to 60 Hz definitely made a difference, tho, even without lowering the resolution.
Can you comment on how much of a difference it is? Debating getting this panel!
I can get the power usage down to around 6 W by setting VRR with a 60 Hz max, while keeping the higher resolution. I tried dropping the resolution and didn’t see much difference. I’m still not convinced I have all the power settings right for this machine, though - power usage occasionally gets down to 5 W but is mostly higher, generally 6 to 7 W, and sometimes spikes up to 14 W if I’m spinning up a VM.
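For anyone comparing numbers: the discharge figures above can be read straight from sysfs without extra tools. A sketch, assuming the battery exposes `power_now` (some report `current_now`/`voltage_now` instead) and shows up as BAT1 - it may be BAT0 on your machine:

```shell
# power_now under /sys/class/power_supply reports microwatts. Real usage:
#   awk '{printf "%.1f W\n", $1 / 1e6}' /sys/class/power_supply/BAT1/power_now
# Same conversion shown on a sample reading of 6,500,000 uW:
echo 6500000 | awk '{printf "%.1f W\n", $1 / 1e6}'
```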
Wow. I will probably upgrade then! Looks like Sway also supports VRR. Thank you for the update.
I was getting ~10 W idle with all my usual apps open at a 120 Hz refresh; enabling VRR in GNOME brought it down to ~6 W.
I would say that the display itself isn’t what’s drawing the extra power; it’s the iGPU.
I’m running Fedora 40 with the latest kernel.
So have we been able to confirm that Gnome VRR actually does save battery life? From what I can tell it’s unclear whether it’s designed to downclock the refresh rate at idle (or how aggressively it does so), or if it’s only designed to reduce tearing in fullscreen games.
Howdy everyone!
I’d be happy to clarify to the extent that I can about this. VRR does downclock the refresh rate of the display at idle. Gnome and KDE both have their ways of implementing this but it effectively comes down to allowing the compositor that draws your desktop to alter its frame rate based on screen activity.
To my understanding, the GNOME VRR setting is equivalent to the “Always” setting for VRR in KDE. Under this setting, VRR is forced for the entire display, not just for full-screen applications. I’ve seen some independent testing showing that under VRR the FW13 with the 2.8K display gets better battery life than at 120 Hz without VRR, but worse than at a fixed 60 Hz. Your mileage may vary, but generally speaking, yes, enabling VRR should improve battery life somewhat.
Please let me know if there’s anything I can better clarify.
That makes sense, thanks so much! Btw, do you know if there’s a way to show the current debug FPS? No worries if not
Hi Kenneth.
Sorry to say I don’t know of any simple way to track the active refresh rate of the display in VRR mode. On a desktop monitor you’ll typically have that built into the OSD, but laptop panels tend to be put under the control of the OS, and on that side of things you’d need to use something like xrandr in a script that constantly prints active display information and parses it down to just the “Active Rate” or a similarly named parameter. Outside of that it can be quite difficult to tell.
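To illustrate the parsing half of that, a minimal sketch: xrandr flags the currently active mode with an asterisk, so a script can grep that out. The sample line below stands in for real `xrandr` output (and note xrandr only sees X11/XWayland, so it may not reflect the instantaneous VRR rate):

```shell
# xrandr marks the currently active mode with '*'; pull out just that rate.
# A captured sample line is used here in place of live `xrandr` output.
sample='eDP-1 connected primary 2880x1920+0+0  2880x1920 120.00*+  60.00'
echo "$sample" | grep -o '[0-9.]*\*' | tr -d '*'
```

In practice you’d run something like that in a loop (e.g. under `watch -n1`) to see whether the reported rate moves at idle.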