You can instead slightly turn down the brightness to get a slightly dimmer but non-deep-fried image. To get those 0.5 W you need to run the very aggressive modes, which are very noticeable; with the bearable ones you get into the barely measurable 0.2-0.3 W region.
Here are the numbers we measured for it on real content using measuring equipment (not software).
This is what was presented at the Display Next hackfest this year.
Good to know. The question is, is half a watt really worth the significantly decreased display quality? And that’s at 50% brightness; I usually never go that high unless in direct sunlight. And ABM 3, the mode showing significant reductions, produces a much worse image even in these small comparison pictures.
My guess would be that better adaptation of the display refresh rate to the content could be better suited to save energy without as many drawbacks, especially with the new partial-refresh technology of eDP 1.5 (?). But I have no idea how far the display in use can even go. It most likely won’t be able to scale down to 10, 5 or even 1 Hz like smartphone displays can.
Yeah; it’s not a perfect solution. I don’t disagree there. But the other thing is how much are you REALLY using power saver?
The idea behind power saver is to eke out more time from the last 10-20 percent of battery. There are definite compromises made to the experience.
EPP is tuned accordingly. The next version of PPD will also turn off CPB (the kernel interface for that is new in 6.11) and adjust the lowest scaling frequency differently between balanced and power saver.
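For the curious, the knobs involved look roughly like this from sysfs (a sketch, assuming a driver that exposes boost toggling, e.g. amd-pstate on 6.11+; some drivers expose it per policy instead of globally):

```bash
# Is core performance boost currently enabled? (1 = on)
cat /sys/devices/system/cpu/cpufreq/boost

# Some drivers expose it per cpufreq policy instead
cat /sys/devices/system/cpu/cpufreq/policy0/boost

# Turning it off by hand, roughly what the next PPD will do in power saver
echo 0 | sudo tee /sys/devices/system/cpu/cpufreq/boost
```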
I could see an argument for dropping ABM 1 from PPD balanced+battery, so that PPD power saver on AC does ABM 1 and PPD power saver on battery is the only thing that does ABM 3.
But will this help the majority of people? I don’t know. We don’t have any data to know how many people use power saver and how frequently. It’s all just guessing.
Oh, and in terms of the next power-saving features we’ll be seeing, you should read up on IPS2 and Panel Replay.
Thanks for sharing @Mario_Limonciello, that’s quite a drastic power reduction. I can’t help but wonder if it changes for productivity tasks like text editing, though, given the differences in pixel change frequency. If that’s part of a recorded presentation, I would be interested in watching it.
There seems to be a lot of negativity around this feature, and I wanted to share my perspective. I would argue that there most definitely is a use case for 0.5 W of savings, especially if that can be increased to 1.45 W when applied more aggressively. The vast majority of the time I’m spending on my laptop, I do not need the full fidelity the display can offer, and I have gone to great lengths to minimize my power draw. An additional 0.5-1.45 W of savings translates to up to an additional 1.5 to 5.5 hours of battery life… which is as long as some laptops last in total.
I understand that the image quality trade-off for power savings may not be worth it for many of you, but for people like me it definitely is.
Back before the Framework came out I created a graph of potential battery life estimates. As you can see, as total power usage decreases, battery life in hours increases non-linearly. Meaning if your laptop is already running pretty efficiently, a whole 0.5-1.45 W of savings can make quite a large difference. Unless someone has specifically attempted to improve their battery life, I would guess they’re sitting around 10-15 W of draw, whereas mine regularly sits between 6 and 7 W, which makes those savings visibly beneficial.
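To put rough numbers on that (assuming, say, a 61 Wh battery; scale for your own pack): runtime is just capacity divided by average draw, so at 12 W you get about 5 h and shaving off 1 W adds roughly half an hour, while at 6 W you’re already at about 10 h and the same 1 W saving adds nearly 2 full hours. Each saved watt is worth more the lower your baseline draw.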
All of that said, I do think regular users could benefit from greater visibility on the trade-offs being made with this technology.
I actually use only Power Saver mode unless I need more power. Even balanced seems to drain the battery quite a bit more. And I’m not sure in what context, but I read that only in Power Saver mode can the governor/scaler even adapt to the workload. So I’ve never had any reason to leave Power Saver mode when I’m not doing CPU-heavy tasks.
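(For reference, power-profiles-daemon ships a small CLI if you want to check or switch profiles from a terminal:)

```bash
powerprofilesctl get            # current profile, e.g. power-saver
powerprofilesctl list           # available profiles and the driver backing them
powerprofilesctl set balanced   # switch without opening the settings UI
```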
No idea what you mean by IPS2, but Panel Replay is exactly what I meant. I kind of doubt that’s something we can make use of any time soon, though. A motherboard with eDP 1.5 support probably has to drop first, and then displays that can actually make use of it.
The panel itself needs to support Freesync to start.
Fwiw the Framework 16 panel supports Panel Replay. There were some problems with it, though, so it’s disabled by default in 6.10. If they’re fixed in time, perhaps it can be re-enabled in 6.11.
IPS2 isn’t available on the current-gen platforms; it requires updated display controller hardware.
You’ll first see it on future ones. I’m just mentioning it because it’s a pretty big power saving when turned on.
You’re probably mixing up the cpufreq governor and the PPD power saver state in that paragraph.
EPP can only be tuned when the cpufreq governor allows it; that governor, for historical reasons, is named powersave, which is unrelated to the PPD power-saver profile.
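Concretely, both are visible in sysfs (standard cpufreq paths; the EPP files only appear with a driver like amd-pstate in active mode):

```bash
# The governor; with amd-pstate this must be "powersave" for EPP to apply
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

# Current EPP value, and the values the platform accepts
cat /sys/devices/system/cpu/cpu0/cpufreq/energy_performance_preference
cat /sys/devices/system/cpu/cpu0/cpufreq/energy_performance_available_preferences
```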
If you’re seeing pretty high power consumption in balanced on battery, such that you’d rather operate in power saver, I would wonder if you’ve got some inefficient userspace or some long-running background tasks.
Without scheduler changes you would probably be better off manually setting the affinities of such tasks, and maybe even capping the performance of the CPU cores they’re on. But this is getting into pretty custom territory, and you’ll need to profile your workloads to find out whether it really makes sense.
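A minimal sketch of what that could look like (the PID and core numbers are made up for illustration; frequency caps are per cpufreq policy and need root):

```bash
# Pin a long-running background task to two specific cores
taskset -cp 6,7 "$PID"

# Cap the maximum frequency of those cores (value in kHz)
echo 2000000 | sudo tee /sys/devices/system/cpu/cpufreq/policy6/scaling_max_freq \
                        /sys/devices/system/cpu/cpufreq/policy7/scaling_max_freq
```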
No idea. To answer that I’d need a power monitor that tracks every hardware component and every process over the whole charge. Sadly GNOME doesn’t ship such a tool, nor do I know of anything fitting.
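The closest cheap approximation I know of is the whole-system draw the battery itself reports (no per-component or per-process breakdown; note some batteries expose current_now/voltage_now instead of power_now):

```bash
# Instantaneous draw in microwatts while discharging
cat /sys/class/power_supply/BAT*/power_now

# Or sample it periodically while running a workload
watch -n 5 'cat /sys/class/power_supply/BAT*/power_now'
```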
As I suspected, the higher power savings at higher brightness come from showing dimmer content.
That’s kind of my problem here: will a member of the majority, having never heard about ABM, be able to find out why the screen looks like that, or will they just blame the hardware or software as a whole?
Now Panel Replay sounds very nice.
IPS2 isn’t very google-able; I’m not sure what exactly you mean by that.
That’s definitely true, and I love that the option exists. My objection is entirely to having it enabled by default, since the connection between the deep-fried image and that particular setting is non-obvious if you don’t know ABM is a thing.
That is a good justification to manually enable it.
I’ve done a little test of my own. It seems the issue isn’t ABM itself, but the fact that it’s set to 3 by default, which totally nukes display quality. Manually setting it to 1 and running a grub update (`sysctl -a` for some reason doesn’t output any string with `amdgpu.abmlevel`, or `abmlevel` for that matter, but dmesg shows that `amdgpu.abmlevel=1` is on the kernel command line) doesn’t show any visible differences as far as I can tell. So given that ABM is actually set to 1, I’d change my mind: it can stay active by default, it merely shouldn’t default to the most aggressive mode. Possibly saving 0.5 W without any drawbacks is actually a better deal than merely saving 0.5 W by deep-frying the content.
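For anyone wanting to reproduce this, a minimal sketch (assuming a Debian-style grub setup; `sysctl` only lists `/proc/sys` entries, which is why it never shows module parameters like this one):

```bash
# /etc/default/grub: append the parameter to the default command line
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amdgpu.abmlevel=1"

# Regenerate the config (grub2-mkconfig on non-Debian distros) and reboot
sudo update-grub

# Verify the level that's actually active
cat /sys/module/amdgpu/parameters/abmlevel
```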
This comes back to my comment that people should be using balanced. ABM is zero on AC and 1 on DC while in balanced.
Based on the way you use power states, maybe it would be better to add a “battery level” factor into the decision: only enable ABM once the battery drops to 50%, and increase the intensity as the level drops further. For example, ABM 1 at 50%, ABM 2 at 40%, ABM 3 at 30%.
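A hypothetical sketch of that policy (not what PPD does today; it assumes BAT0 and the amdgpu `panel_power_savings` attribute that newer kernels expose on eDP connectors):

```bash
#!/bin/sh
cap=$(cat /sys/class/power_supply/BAT0/capacity)

if   [ "$cap" -gt 50 ]; then level=0
elif [ "$cap" -gt 40 ]; then level=1
elif [ "$cap" -gt 30 ]; then level=2
else                         level=3
fi

# Apply to every internal panel; writing this knob needs root
for knob in /sys/class/drm/card*-eDP-*/amdgpu/panel_power_savings; do
    echo "$level" > "$knob"
done
```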
I think this could be a cool feature, but if added I think it would need to be configurable so that it can still be always-on/off.
Well, the policy of what’s allowed/disallowed at runtime will remain in the compositor, so you’ll need to work with the compositor devs to add such a knob.
If it’s user definable, sure.
Here, you guys can try this and see what you think.
I use a 24-inch monitor (Eizo) with my desktop PC which also has this ABM feature. The monitor also has an ambient light sensor. In that setup the whole ABM thing is independent of any performance profiles and workloads. That’s how it should be. Let me try to explain.
The optimal backlight brightness always depends on the ambient light, the luminance of the displayed content, and the human eye. It has very little to do with CPU power.
When it’s bound to a performance profile and implemented like it is now (I am using Kubuntu), you’re effectively saying that only heavy workloads need precise color presentation and a bright display. A user running the performance profile is also “forced” to waste energy.
Let’s say I want to play a dark-themed game. I choose the performance profile. Does that mean I need precise colors and a bright backlight? In this use case I would benefit most from ABM, because ABM can not only save energy, it can also lead to better black presentation (everybody knows TFTs have problems with black because of the backlight). I know this from my desktop monitor. On my monitor this feature is bound to a monitor profile (Movie/Gaming) and can be switched off by selecting another profile (sRGB). The monitor can also show the savings in watts.
But I also know the side effects, like in movies when the credits roll and they use small white text on black. Because of the high amount of black, my monitor dims the backlight → white becomes grey (and sometimes suddenly becomes white again).
The next thing is, ABM can only be good when ambient light is taken into account. On Kubuntu my ambient light sensor does not work, so this can make things worse when ABM is on by default. As others mentioned, this feature should be implemented so that the user is informed that it’s there (I only know because I found this thread; I was wondering why my display changed brightness when plugging in the power cable, since I had disabled brightness switching in the energy settings), and it should be configurable (ABM level).
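If you want to check whether the kernel even sees the sensor (desktops usually read it through iio-sensor-proxy):

```bash
# Any illuminance channel exposed by an IIO device?
grep . /sys/bus/iio/devices/iio:device*/in_illuminance* 2>/dev/null

# Live readings, if iio-sensor-proxy is installed
monitor-sensor
```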
Greetings from Germany
You must be referring to a different but similar technology. ABM on AMD systems only works on eDP.
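A quick way to check which kind of connector your panel is on (connector names here are standard DRM):

```bash
# The internal panel shows up as an eDP connector; external ones as DP/HDMI
ls /sys/class/drm/ | grep -i edp
```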