Adaptive Backlight Management (ABM)

Since a number of people are working on getting the best power consumption they can out of their Framework 13 laptops, I wanted to share a little-known amdgpu feature called Adaptive Backlight Management.

Module Parameters — The Linux Kernel documentation

It runs an algorithm internal to the display hardware that dynamically reduces the backlight “brightness” depending upon the content. On content that is predominantly dark it can save up to ~1W at runtime, depending upon how aggressively it’s been configured.

Right now, it’s disabled by default upstream. It can be changed at runtime, but unfortunately no compositor offers a knob or slider for it.
If you’d like to experiment with it and see the effects, you can set amdgpu.abmlevel=NUM on the kernel command line, where NUM is between 1 and 4.

For the best balance of experience and power savings, I’d suggest using 3.
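
If you use GRUB, a minimal sketch of setting it persistently and verifying it after a reboot looks like this (the regeneration command varies by distro, and the sysfs path assumes amdgpu exports the parameter read-only):

    # Append amdgpu.abmlevel=3 to GRUB_CMDLINE_LINUX_DEFAULT in
    # /etc/default/grub, then regenerate the bootloader config:
    sudo update-grub    # or: sudo grub2-mkconfig -o /boot/grub2/grub.cfg
    sudo reboot
    # After rebooting, the active value should show up here (assuming the
    # module exports it via sysfs):
    cat /sys/module/amdgpu/parameters/abmlevel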

36 Likes

Pinning this, thanks Mario.

1 Like

That sounds interesting. Does it only do that on battery, or is the logic completely independent of the power source?

1 Like

From the kernel perspective it’s an interface; the policy is decided by userspace.
So in this case it will apply on both battery and AC. It’s a DRM property, though, and userspace can change it dynamically (the property is called “abm level”). You can see it show up in the drm_info snippet below.

├───Connectors
│   ├───Connector 0
│   │   ├───Object ID: 93
│   │   ├───Type: eDP
│   │   ├───Status: connected
│   │   ├───Physical size: 310x170 mm
│   │   ├───Subpixel: unknown
│   │   ├───Encoders: {0}
│   │   ├───Modes
│   │   │   ├───2560x1440@60.01 preferred driver nhsync nvsync
│   │   │   ├───2560x1440@50.00 driver nhsync nvsync
│   │   │   ├───2560x1440@48.01 driver nhsync nvsync
│   │   │   ├───2560x1440@48.01 driver nhsync nvsync
│   │   │   ├───1920x1200@60.01 driver nhsync nvsync
│   │   │   ├───1920x1080@60.01 driver nhsync nvsync
│   │   │   ├───1600x1200@60.01 driver nhsync nvsync
│   │   │   ├───1680x1050@60.01 driver nhsync nvsync
│   │   │   ├───1280x1024@60.01 driver nhsync nvsync
│   │   │   ├───1440x900@60.01 driver nhsync nvsync
│   │   │   ├───1280x800@60.01 driver nhsync nvsync
│   │   │   ├───1280x720@60.01 driver nhsync nvsync
│   │   │   ├───1024x768@60.01 driver nhsync nvsync
│   │   │   ├───800x600@60.01 driver nhsync nvsync
│   │   │   └───640x480@60.01 driver nhsync nvsync
│   │   └───Properties
│   │       ├───"EDID" (immutable): blob = 149
│   │       ├───"DPMS": enum {On, Standby, Suspend, Off} = Off
│   │       ├───"link-status": enum {Good, Bad} = Good
│   │       ├───"non-desktop" (immutable): range [0, 1] = 0
│   │       ├───"TILE" (immutable): blob = 0
│   │       ├───"CRTC_ID" (atomic): object CRTC = 0
│   │       ├───"scaling mode": enum {None, Full, Center, Full aspect} = None
│   │       ├───"underscan": enum {off, on, auto} = off
│   │       ├───"underscan hborder": range [0, 128] = 0
│   │       ├───"underscan vborder": range [0, 128] = 0
│   │       ├───"max bpc": range [8, 16] = 16
│   │       ├───"abm level": range [0, 4] = 0
│   │       ├───"Colorspace": enum {Default, BT709_YCC, opRGB, BT2020_RGB, BT2020_YCC} = Default
│   │       ├───"HDR_OUTPUT_METADATA": blob = 0
│   │       ├───"vrr_capable" (immutable): range [0, 1] = 1
│   │       ├───"Content Protection": enum {Undesired, Desired, Enabled} = Undesired
│   │       └───"HDCP Content Type": enum {HDCP Type0, HDCP Type1} = HDCP Type0
1 Like

Well, levels 3 and 4 look horrible on my setup (especially on the desktop; video playback looked somewhat better, but still noticeable). Level 4 did reduce power consumption at 20% brightness by pretty much exactly 0.5W, both on my fairly dark desktop and when playing back a fairly bright 4K 60 fps file in Kodi.

Testing level 2 now (it still looks worse than 0, but at least you have to look for it), though I don’t have high hopes.

1 Like

Well, in an unexpected turn, level 2 drew 0.6W less than level 0 in the same tests.

Not sure what’s going on there; my hunch is that it’s just generally dimmer, but I can’t really measure that.

I’ll leave this feature off for now.

Edit: never mind, test error; it looks like my display brightness was somehow set wrong.

How about “1”? Still too aggressive for you?

I do think this is the kind of thing that could really benefit from a daemon (like PPD) that enacts it only under specific criteria.

Think of how a cell phone’s “power saver” kicks in at 10% battery when you’re not plugged in.

Maybe a daemon could have a policy to turn it on when the ACPI platform profile is set to power saver and you’re on battery below 30%, or something like that.
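
As a purely hypothetical sketch, the policy check for such a daemon could be as simple as this (the BAT0 name and sysfs paths vary by machine, and actually applying the level would still need DRM-master cooperation):

    # Hypothetical policy check; adjust BAT0 and paths for your machine.
    pct=$(cat /sys/class/power_supply/BAT0/capacity)
    status=$(cat /sys/class/power_supply/BAT0/status)
    profile=$(cat /sys/firmware/acpi/platform_profile)
    if [ "$status" = "Discharging" ] && [ "$pct" -lt 30 ] && [ "$profile" = "low-power" ]; then
        echo "policy: raise abm level (the actual set has to go through the compositor)"
    fi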

1 Like

2 is already non-aggressive enough.

Idk, the way levels 3 and 4 looked, you’re better off just turning down the brightness.

Maybe it needs some special tuning for the specific panel or something, because it looked like it just turned down the backlight and turned the image brightness way up, which made it look washed out af. But it might be tunable to look less bad.

Just give me an OLED panel already XD

Level 2 with the correct brightness setting gave more expected results: a reduction of about 0.3W in both tests.

Doesn’t really seem worth it at this point.

1 Like

Pretty neat, thanks for the tip!

Setting ‘3’ seems to have pushed me squarely into the 4 watt zone for my average idle draw.

The effect on the display is quite noticeable in some situations; it seems subtle most of the time, but it may be the kind of thing that causes some extra eye strain. I’ll run with it for a while and see how it is to live with.

Is there an easy way to update this after booting?

None of the compositors support changing it AFAIK.

1 Like

Hi, I created a post on Reddit (https://www.reddit.com/r/framework/comments/18auiuj/display_brightness_of_the_amd_13/) because the lowest brightness level is still very high.
@Matt_Hartley pointed me to this topic, but I don’t think it’s really the same thing. I tried using 4, but it hasn’t solved the issue; it was still too bright, and in fact I didn’t like that it was adaptive. Any suggestions? I don’t understand whether it’s a hardware limit and whether there is a viable software solution (on Linux).

Yeah, it’s not the same thing; I wouldn’t expect this to influence the lowest possible brightness.

1 Like

Does amdgpu’s adaptive backlight management offer any significant advantage over something like wluma, which sets the screen brightness via an algorithm using the ambient light sensor and screen content?
wluma uses the export-dmabuf Wayland protocol and Vulkan on the GPU to run its algorithm, which is extremely resource-efficient.

I’m not aware of any analysis of it, but they are of course different algorithms that run differently.

The ABM method is hardware-based and will work with X, Wayland, or a console.

It sounds like what you shared tries to be resource-light, but I think someone would need to benchmark both under the same workloads to confirm the power consumption impact of each.

1 Like

Makes sense. Once I receive my Framework 16, I’ll probably do some comparison tests and post them here.

Does anyone know what API the kernel exposes for userspace to switch the ABM level? It’s mentioned in the docs, but there’s no link or follow-up. I’m considering writing a quick CLI tool for it.

It uses the DRM API, but the property can only be changed by the DRM master (the compositor).
So you would need to write a change for your compositor.

For example in GNOME this is mutter.
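
For one-off experiments without patching a compositor, I believe modetest from libdrm can set connector properties when run from a bare VT where nothing else holds DRM master. A hedged sketch; 93 is the eDP connector’s object ID from the drm_info dump earlier in the thread, so substitute your own:

    # Run from a VT with no compositor active; use your own connector ID.
    sudo modetest -M amdgpu -w '93:abm level:3'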