[SOLVED] Color issues in Linux 6.9

I hope all upstreams opt out by default and bury this somewhere in the Energy Saving “Last Resort” settings with a warning about how it might alter content. The entirely-off colors made me think the panel was broken at first.

2 Likes

This should definitely have to be deliberately enabled, with a clear warning. It does save some power, especially at higher brightness.

1 Like

With this commit:

Compositors can set their policies accordingly. For example, when HDR is enabled they will probably want it off.
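For anyone curious how userspace could drive that today: on Linux 6.9 the level is exposed through the amdgpu `panel_power_savings` sysfs attribute on eDP connectors. Here’s a minimal sketch of a compositor-style policy; the exact sysfs path can vary per card, the `hdr_active`/`on_battery` inputs are hypothetical, and writing the attribute needs root:

```python
import glob

def set_abm_level(level: int) -> None:
    """Write an ABM level (0 = off, 1-4 = increasingly aggressive) to the
    panel_power_savings attribute of every internal eDP panel."""
    for attr in glob.glob("/sys/class/drm/card*-eDP-*/amdgpu/panel_power_savings"):
        with open(attr, "w") as f:
            f.write(str(level))

def apply_policy(hdr_active: bool, on_battery: bool) -> None:
    # Hypothetical policy: HDR always wins, otherwise trade a little
    # color accuracy for power only while on battery.
    set_abm_level(0 if hdr_active or not on_battery else 1)
```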

3 Likes

I am not sure the slight power savings are worth the “AMD just makes my screen look like crap on Linux, I guess” sentiment it’ll create among users who can’t track it down to this setting.

It may be low-impact for some eyes/displays, but enabling it by default without any warning may be a bit much.

2 Likes

It’s enabled by default in Windows as well, where it’s called Vari-Bright.

All the more reason to do better than Windows.

I don’t think that’s generally a good justification to do something XD.

Not sure if it’s actually active on my Windows install, but I definitely didn’t notice it looking the way it looks on Linux.

Anyway, that’s just my opinion. I know what it is and how to disable it, but I’m not sure I’d have figured it out if I hadn’t stumbled over it here; I’d probably just have thought the Framework display sucks.

1 Like

Right; we’re not using the exact same policy on Linux as Windows does. Windows is a lot more aggressive about when it’s enabled.

I have seen review complaints about it (for example, The Verge had an article complaining about it on Windows). It’s definitely a polarizing setting: some people really hate it and others don’t mind.

Right; the eventual goal is that the compositor and DE will also have the ability to turn it on and off with a button. The compositor will decide policy as part of its modeset by indicating whether it wants color accuracy or power savings.

Maybe it’s worth @Matt_Hartley putting together an article explaining it for now.

1 Like

Being able to turn it off with a button requires knowing that this is the thing making the display look like that, which is why, IMO, having it on by default is a bit of a problem. If it’s a toggle and it applies instantly, it’s pretty obvious what happened to the display. Otherwise, unless the setting is called “deep-fry my image for a bit of power savings”, it may not be obvious that it’s the one to turn off.
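As a thought experiment, an instant-apply toggle could be as small as this (same assumed sysfs attribute as above, flipping between off and an arbitrary level 3; needs root):

```python
import glob

def toggle_abm(on_level: int = 3) -> None:
    """Flip ABM between off and on_level so the visual change is
    immediate and obviously tied to this one setting."""
    for attr in glob.glob("/sys/class/drm/card*-eDP-*/amdgpu/panel_power_savings"):
        with open(attr, "r+") as f:
            current = int(f.read().strip())
            f.seek(0)
            f.write("0" if current else str(on_level))
```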

Having the option is nice, and it may be less disruptive on other displays or better tuned or something, but having it on by default sounds like a bad idea.

That’s probably a good idea. The screen on the Framework (at least on the 13) is already pretty meh; no need to make it look even worse.

2 Likes

I’m curious what the power savings of the feature actually are. Perhaps I should test the delta.
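If anyone wants a rough software-side number (nothing like the hardware measurements mentioned below), the battery’s sysfs interface is enough for a ballpark: sample the discharge rate with ABM off, then again at each level, under an identical static workload. Attribute names vary by battery; some report power_now directly, others only current_now/voltage_now:

```python
import glob
import time

def read_discharge_watts() -> float:
    """Instantaneous battery discharge in watts, from power_now
    (microwatts) or current_now * voltage_now as a fallback."""
    bat = glob.glob("/sys/class/power_supply/BAT*")[0]
    try:
        with open(f"{bat}/power_now") as f:
            return int(f.read()) / 1e6
    except FileNotFoundError:
        with open(f"{bat}/current_now") as f:
            microamps = int(f.read())
        with open(f"{bat}/voltage_now") as f:
            microvolts = int(f.read())
        return microamps * microvolts / 1e12

def average_draw(seconds: float = 120, interval: float = 2.0) -> float:
    """Average the draw over a window; compare one run per ABM level."""
    samples = []
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        samples.append(read_discharge_watts())
        time.sleep(interval)
    return sum(samples) / len(samples)
```

Keep in mind that sub-half-watt deltas sit close to the noise floor of this method, which matches the margin-of-error point below.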

At least at lower brightness on the 13 it was very little (like <0.5 W at 20% brightness) back when I tested it.

For my use case, that would be an 8% improvement in battery life, assuming it’s 0.5 W. If it’s less than that, though, it becomes difficult to measure and quickly falls within the margin of error.

You can instead turn the brightness down slightly to get a somewhat dimmer but non-deep-fried image. To get those 0.5 W you need to run the very aggressive modes, which are very noticeable; with the bearable ones you get into the barely measurable 0.2-0.3 W region.

Here are the numbers we measured for it on real content using measuring equipment (not software).

This is what was presented at the Display Next Hackfest this year.

Good to know. The question is: is half a watt really worth the significantly decreased display quality? And that’s at 50% brightness; I usually never go that high unless I’m in direct sunlight. And ABM 3, the mode showing significant reductions, produces a much worse image even in these small comparison pictures.

My guess would be that adapting the display refresh rate to the content could be better suited to saving energy without as many drawbacks, especially with the new partial-refresh technology of eDP 1.5 (?). But I have no idea how far the display in use can even go. It most likely won’t be able to scale down to 10, 5, or even 1 Hz like smartphone displays can.

Yeah; it’s not a perfect solution. I don’t disagree there. But the other thing is how much are you REALLY using power saver?

The idea of power saver is to eke out more time from the last 10-20 percent of battery. There are definite compromises made to the experience.

EPP is tuned accordingly. The next version of PPD will be turning off CPB (a feature new in kernel 6.11) and also adjusting the lowest scaling frequency differently between balanced and power saver.

I could see an argument for dropping ABM 1 from PPD balanced-on-battery, having PPD power saver on AC do ABM 1, and making PPD power saver on battery the only thing that does ABM 3.
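Written out as a purely illustrative policy table, that proposal would look something like:

```python
# Hypothetical mapping for the proposal above:
# (PPD profile, on battery?) -> panel_power_savings level.
ABM_POLICY = {
    ("balanced",    False): 0,  # AC: never trade color for power
    ("balanced",    True):  0,  # drop today's balanced+battery ABM 1
    ("power-saver", False): 1,  # mild ABM even on AC in power saver
    ("power-saver", True):  3,  # the only case that gets the aggressive mode
}
```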

But will this help the majority of people? I don’t know. We don’t have any data on how many people use power saver or how frequently. It’s all just guessing.

Oh, and in terms of the next power-saving features we’ll be seeing, you should read up on IPS2 and Panel Replay.

Thanks for sharing @Mario_Limonciello, that’s quite a drastic power reduction. I can’t help but wonder if it changes for productivity tasks like text editing, though, given the differences in pixel-change frequency. If that’s part of a recorded presentation, I would be interested in watching it.

There seems to be a lot of negativity around this feature, and I wanted to share my perspective. I would argue that there most definitely is a use case for 0.5 W of savings, especially if that can be increased to 1.45 W when applied more aggressively. The vast majority of the time I spend on my laptop, I do not need the full level of fidelity the display can offer, and I have gone to great lengths to minimize my power draw. An additional 0.5-1.45 W of savings translates to up to an additional 1.5 to 5.5 hours of battery life… which is as long as some laptops last in total.

I understand that the image quality trade-off for power savings may not be worth it for many of you, but for people like me it definitely is.

Back before the Framework came out I created a graph of potential battery life estimates. As you can see, as total power usage decreases, battery life in hours increases non-linearly. Meaning if your laptop is already running pretty efficiently, a whole 0.5-1.45 W of savings can make quite a large difference. Unless someone has specifically attempted to improve their battery life, I would guess they’re sitting at around 10-15 W of draw, whereas mine regularly sits between 6 and 7 W, where those power savings have a visible benefit.
[Graph: estimated battery life in hours versus total power draw in watts, rising non-linearly as draw decreases]
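The non-linearity is just battery_life = capacity / draw, so a fixed saving buys disproportionately more runtime the lower your baseline is. A quick sketch (assuming a 61 Wh pack; adjust for yours):

```python
CAPACITY_WH = 61.0  # assumed pack size

def hours(draw_w: float) -> float:
    return CAPACITY_WH / draw_w

for base in (15.0, 10.0, 6.5):
    for saved in (0.5, 1.45):
        gain = hours(base - saved) - hours(base)
        print(f"{base:4.1f} W base, {saved:4.2f} W saved: "
              f"+{gain:.2f} h ({gain / hours(base):.1%} more runtime)")
```

At a 15 W baseline, 0.5 W saved is worth about +0.14 h; at 6.5 W the same saving is worth roughly four times as much.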

All of that said, I do think regular users could benefit from greater visibility on the trade-offs being made with this technology.

1 Like

I actually only use Power Saver mode unless I need more power. Even Balanced seems to drain the battery quite a bit more. And I’m not sure in what context, but I read that only in Power Saver mode can the governor/scaler even adapt to the workload. So I’ve never had any reason to leave Power Saver mode when I’m not doing CPU-heavy tasks.

No idea what you mean by IPS2, but Panel Replay is exactly what I meant. I kinda doubt that’s something we can make use of any time soon, though. A motherboard with eDP 1.5 support probably has to drop first, and then displays that can make use of it.