Can you share what you had to install? It’s supposed to find and install the missing dependencies for you on deb/rpm/Arch systems.
python-systemd, python-udev, iasl and python-distro… those I am sure of; the rest I do not recall. I also noticed it tried to install some packages with pip, which caused the dependency installation to fail, since you can’t install Python packages system-wide with pip on Arch. Unless you really want to.
Specifically, it needs python-distro to detect the host operating system. Without it, it’ll fall back on installing with pip.
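Roughly, that decision boils down to something like this (an illustrative sketch, not the script’s actual code; the package-manager table below is an assumption):

```python
# Illustrative sketch of why python-distro matters here: without a distro
# ID there is no way to pick the native package manager, so pip becomes
# the only remaining option.
import shutil

try:
    import distro  # packaged as python-distro on Arch
    distro_id = distro.id()  # e.g. "arch", "ubuntu", "fedora"
except ImportError:
    distro_id = ""

INSTALL_CMDS = {
    "arch": ["pacman", "-S", "--needed"],
    "ubuntu": ["apt", "install"],
    "debian": ["apt", "install"],
    "fedora": ["dnf", "install"],
}

cmd = INSTALL_CMDS.get(distro_id)
if cmd is None or shutil.which(cmd[0]) is None:
    # Fallback path: on Arch this fails for system-wide installs
    # (externally managed environment), hence the breakage described above.
    cmd = ["python", "-m", "pip", "install"]
print("would install dependencies with:", cmd)
```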
Ah yeah; I remember having to install python-distro by hand when I tried it on Arch. The tough thing is that python-distro isn’t really needed unless you’re missing the others. But maybe it should just be made into a hard requirement.
I’ve modified the script to make python-distro compulsory. The rest it should now try to install with pacman (it will obviously prompt you, though).
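Something along these lines (a sketch of the new behaviour using the Arch package names mentioned above; the real script may differ):

```python
# Hedged sketch of the "pacman-first" behaviour described above. Package
# names are the ones mentioned in this thread, not necessarily the
# script's full dependency list.
import subprocess
import sys

ARCH_DEPS = ["python-systemd", "python-udev", "iasl"]

try:
    import distro  # noqa: F401  -- now a hard requirement, per the fix above
except ImportError:
    sys.exit("python-distro is required: pacman -S python-distro")

# --needed skips packages that are already installed. No --noconfirm is
# passed, so the user still gets pacman's usual y/N prompt.
# Run as root (or prepend "sudo" to the command).
subprocess.check_call(["pacman", "-S", "--needed", *ARCH_DEPS])
```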
Hi there!
I am considering a board upgrade from the original 11th gen to either this AMD board or the latest Intel one. With this upgrade, I aim to fix only one issue: poor battery life when suspended (s2idle). My understanding is that, if things are configured properly (with the help of @Mario_Limonciello’s script), the AMD board will get the job done.
Do you see any reason to consider the 13th gen Intel solution instead, though?
Cheers,
No, AMD’s way more power efficient and has a vastly superior iGPU.
AFAIK 13th gen has deep sleep.
Looking to order a 13″ AMD soon.
What is now the best setup to get 10-hour battery life running Linux?
Which combo of CPU, Wi-Fi, RAM, etc.?
The newer Intel Core Ultra might be more interesting, as it boasts more capable Arc iGPUs, which provide very similar performance to the Radeon 780M on the AMD APUs. Historically, Intel iGPUs have also consumed less power, while the newer AMD APUs with RDNA graphics have a lot of issues with the GPU’s power consumption.
If you watch a lot of video content on your laptop, you might as well go with the newer Core Ultra, as the impact the RDNA GPU has on battery life can be significant.
Ryzen 7 (larger battery), 32 GB of RAM, default MT/RZ616 Wi-Fi… It’s mostly about running it in power save and watching no video at all to get close to the 10-hour mark.
32 GB on one stick or 2 x 16 GB? Which one is better for the battery?
A single stick uses marginally less power than two at low load (barely above measurement error, ~0.2 W), at the cost of halving your bandwidth. Not worth it imo. Running 4800 instead of 5600 doesn’t even make a measurable difference at low load.
This only applies to AMD; the Intel platforms apparently react quite differently (they apparently respond quite well to slower speeds).
The only reason I’d go with a single 32 GB stick over 2x16 GB is if I knew I’d add a second 32 GB stick relatively soon.
Great answer, thanks!
Thank you, @Bennett_Derrico, @Charlie_6 and @Shijikori.
I know this to be true on Desktop and/or Windows. I just wanted confirmation about the Linux/Ubuntu side of things. Thanks!
11th gen does as well, but the wake-up time is a bit of a pain. And I cannot find an easy way to configure the machine to switch from s2idle to deep after a certain amount of time, to save battery.
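For what it’s worth, selecting which state a plain suspend uses is straightforward; it’s the timed s2idle-to-deep switchover that seems to have no built-in mechanism. A minimal sketch (run as root; the sysfs path is the standard one, and booting with mem_sleep_default=deep makes the choice persistent):

```python
#!/usr/bin/env python3
"""Show and set the kernel's suspend mode via /sys/power/mem_sleep.

This does NOT implement a timed s2idle->deep switchover (the kernel has
no such mechanism as far as I know); it only selects which state a plain
suspend uses. Hypothetical helper, not part of any official tooling.
"""
from pathlib import Path

MEM_SLEEP = Path("/sys/power/mem_sleep")

def current_states() -> tuple[list[str], str]:
    # The file reads like "s2idle [deep]"; brackets mark the active state.
    tokens = MEM_SLEEP.read_text().split()
    active = next(t.strip("[]") for t in tokens if t.startswith("["))
    return [t.strip("[]") for t in tokens], active

def set_state(state: str) -> None:
    states, _ = current_states()
    if state not in states:
        raise ValueError(f"{state!r} not supported; kernel offers {states}")
    MEM_SLEEP.write_text(state)  # needs root; persists only until reboot

if __name__ == "__main__":
    states, active = current_states()
    print(f"supported: {states}, active: {active}")
    # To make 'deep' the default across reboots, boot with the kernel
    # parameter mem_sleep_default=deep instead of writing here.
```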
This sounds intriguing, but I would like to avoid finding myself running 24.04 LTS without it working out of the box because the hardware is too recent. Wouldn’t I run that risk? I am using the machine for work, so I am not that interested in being on the bleeding edge, to be honest.
A quick Google search suggests there are still issues to be sorted out in the I/O, but it does boot and work just fine aside from those I/O issues.
The power efficiency of the cores is pretty insane; hardware decoding on Linux is another story, though. On Windows it’s apparently fine, but on Linux it uses pretty excessive amounts of power compared to even much older Intel systems. Performance-wise it’s pretty great, though; if only they could somehow iron out that video decoding issue.
Hell, the cores are so efficient that software-decoding 720p30 uses less power than my old Intel ThinkPad does with hardware decoding.
So, it turns out this was with hardware decoding disabled and while on power save. Using balanced instead, consumption shoots up to 20 W. Enabling hardware decoding reduces power consumption by about 5 to 8 W.
I have observed an annoying behaviour: when the ACPI platform profile is set to power-save, the contrast and colour accuracy of the screen go to utter garbage. I really dislike this behaviour and would personally rather take the power consumption hit. I could probably tweak the PPD settings so it does not set the ACPI platform profile to power-save, but still.
Yeah, this is ABM (panel power saving). It saves a lot of battery life but can be off-putting to some people.
It can be tuned in two ways, both summarized here:
upower / power-profiles-daemon · GitLab
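For reference, roughly what the two options look like in practice (a sketch based on my reading of that link; the sysfs path and kernel version are my assumptions, so check the link for authoritative details):

```python
# Two knobs for ABM, as summarized in the linked PPD discussion:
#  1. Kernel parameter: boot with amdgpu.abmlevel=0 to disable ABM entirely.
#  2. Runtime sysfs knob (recent kernels): panel_power_savings, 0 (off) to
#     4 (most aggressive), which power-profiles-daemon adjusts along with
#     the power profile. The glob below is an assumed eDP panel path.
from pathlib import Path

for attr in Path("/sys/class/drm").glob("card*-eDP-*/amdgpu/panel_power_savings"):
    print(attr, "=", attr.read_text().strip())
    # attr.write_text("0")  # as root: pin ABM off until reboot
```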
I think I’ll be going with the kernel parameter option. Thanks!