[Battery Life] Impact of RAM / memory configuration + extra data

Hello Framework Friends, I decided to dig in deep and see how different RAM configurations impact power consumption, and collected some data on idle usage and video playback inside and outside of different browsers. Results are from my Batch 1 i7-1165G7 laptop only.

Highlighted findings/TLDR:
(discharge rate in watts)

Averages across Fedora 34 + Windows 10:

| Config | Total/All | Browser YouTube 720P | Tears of Steel 720P MKV | Idle |
|---|---|---|---|---|
| 8GB (1x8GB 2133MHz) | 5.58 | 6.6 | 5.4 | 2.5 |
| 16GB (2x8GB 2133MHz) | 5.72 | 6.8 | 5.6 | 2.5 |
| 32GB (1x32GB 3200MHz) | 5.95 | 7.1 | 5.7 | 2.8 |
| 64GB (2x32GB 3200MHz) | 6.33 | 7.5 | 6.2 | 2.9 |
| 40GB (1x8GB 2133MHz + 1x32GB 3200MHz, both running at 2133MHz) | 6.0 | 7.1 | 5.9 | 2.6 |
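
For anyone reproducing these numbers on Linux, the discharge rate can be read straight out of sysfs. A minimal sketch (the BAT1 name is an assumption; check /sys/class/power_supply/ for yours, and note some batteries expose current_now/voltage_now instead of power_now):

```shell
#!/bin/sh
# Convert a sysfs power_now reading (microwatts) into watts.
uw_to_watts() {
    awk -v uw="$1" 'BEGIN { printf "%.2f", uw / 1e6 }'
}

# On a real machine (battery name is an assumption):
#   uw_to_watts "$(cat /sys/class/power_supply/BAT1/power_now)"
uw_to_watts 5580000   # prints 5.58
```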


In apparent order of impact on power consumption (least to greatest):

  1. Using an extra RAM slot:
    • Seems negligible
  2. Size (e.g. 8GB vs. 16GB):
    • Probably depends on the specific running task
  3. Higher frequency:
    • Seems to have a significant impact. I assume this is because the Tiger Lake Xe GPU can run faster with the higher frequency. 2133MHz → 3200MHz bumps 3DMark’s Fire Strike by ~200-300 points. Also, the i7-1165G7’s GPU is faster than the i5-1135G7’s, so I’m curious about i5-1135G7 numbers.
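
To confirm what speed the RAM is actually running at (the configured clock can be lower than the rated one, as with the mixed 40GB kit), dmidecode reports it per stick. This is just a sketch of filtering its output; the field labels are standard SMBIOS names, but exact formatting varies by BIOS:

```shell
#!/bin/sh
# Keep only the interesting lines from dmidecode's memory-device output:
# per-stick size, slot, rated speed, and configured (running) speed.
mem_summary() {
    grep -E 'Size:|Locator:|Speed:'
}

# Usage (needs root):
#   sudo dmidecode --type memory | mem_summary
```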

So I think the biggest culprit may be the higher RAM frequency: by improving Tiger Lake Xe GPU performance, it may actually cause video playback with HW decoding (and general graphics work) to consume more battery. Looks like if you’re trying to squeeze out the most battery, 3200MHz uses significantly more power than 2133MHz.

Compare in-browser YouTube 720P playback in Microsoft Edge (HW Decode) on Windows (shown in [1]):
1x8GB 2133MHz @ 5.2W vs. 2x32GB 3200MHz @ 6.1W

How can we possibly mitigate this?

  • Ability to turn off a RAM slot (trading dual-channel performance for battery life), probably in the BIOS
  • Ability to downclock the RAM (trading performance, with the GPU being a big factor, for battery life), probably in the BIOS


  • Ability to downclock the GPU?
  • Ability to undervolt (which, sigh, isn’t possible at the moment on Tiger Lake-U. Alas Intel, alas.)


  • These are just my findings at this point in time, who knows what will change in the future.
  • Don’t pay too much attention to the Linux vs. Windows averages, as browser hardware decoding on Linux is currently very experimental/alpha/beta/buggy/wonky, whatever one wants to call it.
  • Firefox HW Decode at least on my Fedora install is wonky [2]
  • These numbers are skewed towards idle times and video playback, meaning YMMV depending on your workflow. E.g. CPU/RAM intensive tasks may perform differently.
  • I took averages once the numbers stabilized and erred on the low side; readings were rounded up to the nearest 0.05W, so take that into account in the margin of error.

[1] Full data:

Miscellaneous thoughts:

  • At this point I’ve opened the top panel probably at least 50 times, maybe closer to 100. The battery disconnect option in the BIOS is oh-so-awesome. I have System Setup (BIOS) in my GRUB bootloader. Enter BIOS, Battery Disconnect, lift top panel off (I keep all the bottom 5 screws undone), swap out RAM, set top panel back down to attach magnetically, and leave screws undone. So. Nice.
    Edit: Clarification: I only kept the screws undone during testing and would not recommend them undone during regular use. See my reasoning here.

  • Please ensure the RAM’s inserted all the way. Once I mistakenly didn’t pop it in all the way and sat at a black screen for a minute ("it’s just doing the RAM change detect sequence… wait, this is taking longer than usual… wait…" *checks RAM* "oh sh–…" *pops RAM all the way in… sweats nervously… turns back on* "whew, didn’t short anything" :smiley:)

  • I’ve read that Tiger Lake is very efficient with hardware video decoding, and we can see that it’s pretty good on Windows (MS Edge and Chrome). HW Decoding in browsers on Linux seems…iffy, at the moment, anyways.


SW = Software (software video decoding)
HW = Hardware (hardware video decoding)


  • CPU: i7-1165G7
  • Memory kits used:
    • 16GB (2x8GB), 2133MHz: SK Hynix (timings too lazy to look up/verify) 1.2V model: HMA41GS6AFR8N
    • 64GB (2x32GB), 3200MHz: G.Skill RipJaws Series CL22-22-22-52 1.2V model: F4-3200C22D-64GRS
  • A single USB-C expansion card plugged in, so no HDMI/SD card etc.
  • 10% brightness
  • 50-75% speaker volume
  • Single stick RAM config tested in left slot

Fedora 34 (5.14.0 vanilla kernel for the PSR fix; stability unknown)

  • Using TLP with powersave governor
  • Using intel-media-driver, not libva-intel-driver
  • Microsoft Edge Beta Version 93.0.961.33
  • Firefox version 91.0.2 (64-bit)
    • Wayland Firefox: MOZ_ENABLE_WAYLAND=1 MOZ_DISABLE_RDD_SANDBOX=1 firefox
    • Wayland Firefox seems to use less battery on Sway, e.g. YouTube 720P, no HW Decode: ~11W

[2] Important: Firefox HW Decode at least on my Fedora install doesn’t work as expected:
Something seems wonky as hardware decoding consumes more power than software decoding.

  • I did jump through some hoops
  • There are open bugs
  • Still seems beta-ish at the moment

So take that as you will. I confirmed HW/SW decoding with intel_gpu_top.

Windows 10 (build 19042.1202)

  • Using battery saver profile
  • Microsoft Edge 93.0.961.38 (64-bit)
  • Chrome 93.0.4577.63 (64-bit)
  • Firefox version 91.0.2 (64-bit)
  • Microsoft Movies & TV 10.21061.1012.0

Idle Tests:

I just let the computer idle :slight_smile: To check that it idles successfully, the CPU should mostly be in C9/C10 states.

I’ve found that anything that constantly changes what’s displayed on the screen, like a flashing alert, will prevent being in C9/C10 states and reaching low idle power consumption.

I’m assuming this is due to how Panel Self Refresh (PSR) works, since PSR is needed to reach C9/C10.
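
To verify the machine really is reaching deep idle, the per-core idle-state residency is exposed in sysfs (this is the standard cpuidle layout; state names vary by platform, and package-level C-states like C9/C10 are easier to confirm with turbostat):

```shell
#!/bin/sh
# Dump idle-state residency for one CPU; the deepest states should
# dominate on a machine that idles well. State names (C1, C6, C8,
# C10, ...) vary by platform.
show_residency() {
    base=${1:-/sys/devices/system/cpu/cpu0/cpuidle}
    for d in "$base"/state*; do
        [ -e "$d/name" ] || continue
        printf '%-8s %12s us\n' "$(cat "$d/name")" "$(cat "$d/time")"
    done
}

show_residency
```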

Browser YouTube Tests:

Done by manually selecting 720P (not Auto), so quality doesn’t change mid-video.
The video is almost fullscreen: window maximized, YouTube in theater mode, like so:

I let videos play for a few minutes so things stabilize, the video buffers, etc. Numbers are rough averages of what I observed and may spike while buffering. I’ve noticed periodic slight increases in power consumption, which I assume happen when the browser downloads more of the video.

A larger on-screen video size seems to mean more power consumption.
We can also probably assume some increase in power consumption when playing above 720P.

To check hardware decoding in Chromium-based browsers, open DevTools (More tools → Media):

[screenshot: Edge DevTools showing YouTube hardware decode]

YouTube in mpv with youtube-dl VAAPI HW Decode

mpv --ytdl-format="bestvideo[height<=720][vcodec!=avc1]+bestaudio/best" [youtube-url-without-brackets]

mpv playing YouTube avc1 codec:

CTRL+H toggles hardware decoding on/off.

Confirm with intel_gpu_top that you’re actually hardware decoding. The Video section should show higher than 0%, like so:


This is super interesting. Did not know RAM speeds and RAM configuration affect battery life.

Sorry but what are the units here?

Discharge rate in watts, added to original post for clarification!


I’d be curious to see if the numbers change slightly with fullscreen video. Without having to draw the rest of the user interface, the system is simply upscaling the video to the screen resolution rather than rendering its complex, frequently updating UI, which means the work can probably be offloaded to optimized routines.

I registered on this forum just to ask you this: are there any issues with keeping the screws undone long-term? I am planning to carry a spare battery everywhere, so I want tool-less access to replace the battery if possible.

(My work requires quite a ton of power that drains the battery, and I hate the idea of being tethered to a cable, so battery banks are not a solution either)

I did a quick test with Windows 10 Movies & TV between maximized and fullscreen with a 720P video:

  1. Recorded discharge over a 5 minute period in 10 second intervals.
  2. Averaged the discharge rate over the last 4 minutes (disregarding the first minute, where things may settle).
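
The sample-then-average procedure above can be sketched in shell. The battery path here is an assumption (check /sys/class/power_supply/ for yours); power_now is reported in microwatts:

```shell
#!/bin/sh
# Sample power_now $1 times, every $2 seconds, printing raw microwatts.
log_discharge() {
    i=0
    while [ "$i" -lt "$1" ]; do
        cat "$3"
        sleep "$2"
        i=$((i + 1))
    done
}

# Average microwatt samples (one per line) into watts, skipping the
# first $1 samples so startup noise doesn't skew the result.
avg_watts() {
    awk -v skip="$1" 'NR > skip { sum += $1; n++ } END { if (n) printf "%.3f", sum / n / 1e6 }'
}

# 30 samples x 10 s = 5 minutes; drop the first 6 (the first minute):
#   log_discharge 30 10 /sys/class/power_supply/BAT1/power_now | avg_watts 6
```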

I only did this once, so take this with a bucket of salt :grin:!

Fullscreen consumed 63mW (0.063W) more than maximized. I’d guess that, with Movies & TV at least, one or a combination of these is happening:

  • the app doesn’t take advantage of “exclusive fullscreen” like Windows games, so perhaps the Windows interface/UI is still rendered in the background, resulting in no power savings
  • the Windows UI isn’t that power hungry (and even less when static)
  • I’ve confirmed playing a video on a larger area on screen consumes more power; that extra power draw from maximized to fullscreen may outweigh the savings from not having to render Windows UI
  • only having run this once, results could be within the margin of error. Fullscreen could in fact consume less than maximized

With that, I’m fairly confident the OS UI doesn’t consume much power when static/not updating, as during video playback, and that the power draw delta between maximized and fullscreen (in Windows Movies & TV, at least) is negligible.

For clarification, I kept the screws undone only during the tests so I could swap the memory faster (will update the original post!). I’d be uncomfortable leaving them undone during regular usage because they jut out a bit and can get caught on or perhaps scratch something.

I also noticed if e.g. the left side of my laptop was off the side of my table, and I put slight pressure on the left part of the palm rest area (e.g. when typing), I wouldn’t be able to click the trackpad. Probably because only the magnetic fasteners were holding down the top keyboard/touchpad assembly and not the screws, so that slight amount of flex didn’t allow the touchpad to work properly.

Edit: if you’re careful with the above though, I’d say it’s probably okay! Things are held together pretty well even when the screws are undone.

> Ability to turn off a RAM slot (trading in dual to single channel performance loss for battery life), probably in the BIOS

Tangent: what happens when entering deep sleep on Linux? I imagine, if the memory in use is less than half of capacity, one could theoretically move everything from one of the two slots to the other, then power down the former. I wonder if this would result in significant power savings.

Since it doesn’t look like anybody has mentioned it: I imagine RAM capacity alone doesn’t make a difference, and it’s really down to how many memory chips are on the PCB - in particular dual-rank (memory chips on both sides of the PCB) vs. single-rank (memory chips on only one side).

On the other hand, dual-rank usually performs better, so it’s theoretically possible that the extra power drawn by the dual-rank config would be offset by the CPU and/or iGPU not needing to turbo as high or for as long.

…which makes me wonder if that kind of logic would similarly apply to dual-channel memory, since not only is the power draw considerably greater with dual-channel, the performance increase is also considerably greater.
