Consistent Wake from Sleep Issues on Linux (Edit: 5x Distros Tested)

It’s honestly funny to me that Linux distros seem to deal with this much better than recent Windows versions. Like, on Debian, I just change a few config settings and boom, my laptop is hibernating when I ask it to, no questions asked. Want to change it to suspend instead? Easy, just toggle one setting. Want to change the sleep method? Just add one kernel parameter and then you don’t have to think about it.

After I set the kernel parameter and edited /etc/systemd/logind.conf, I literally don’t have to think about suspend/hibernate/etc anymore. I press the power button and it hibernates (like I asked it to). I close the lid and it suspends in deep sleep (like I asked it to). I can manually issue poweroff and reboot commands when I desire and it will just…do the right thing.
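
For anyone curious, the changes are roughly along these lines (a sketch, not a recipe; the logind option names and the mem_sleep_default kernel parameter are standard, but the values shown are just an example and the GRUB setup may differ per distro):

    # /etc/systemd/logind.conf (restart systemd-logind or reboot afterwards)
    HandlePowerKey=hibernate      # power button -> hibernate
    HandleLidSwitch=suspend       # lid close -> suspend

    # /etc/default/grub (run update-grub afterwards)
    # mem_sleep_default=deep tells the kernel to prefer "deep" (S3) over s2idle
    GRUB_CMDLINE_LINUX_DEFAULT="quiet mem_sleep_default=deep"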

2 Likes

Nah, this is mostly on the OS Windows and on Intel. Intel for refusing to support the standard sleep states. Windows for being such a clusterf*ck when it comes to suspend/hibernate/etc (and refusing to give control to the user).

On Linux, there might be other issues (though I’ve generally had a decently good time), but suspend/hibernate/etc works fairly well, it seems. Things hibernate when you ask them to. Things suspend when you ask them to. And things poweroff/reboot when you ask them to.

1 Like

For my part, I’ve heard a few things about the Intel side:

  • They have had bugs with suspend/resume for years
  • Tiger Lake’s sleep is much less efficient than previous generations’, bringing with it the additional complexity of deep sleep
  • The architecture is not yet fully power-optimized

Linux has never been about zero bumps in the road, but about giving you the tools to either fix things yourself or collaborate with those who have the know-how to do so.

Deep sleep and hibernate were working for me with less than 3 minutes of effort total on EndeavourOS (arch based).

eGPU handling in Linux still needs some serious polish, but it’s daily-driver functional with minor quirks in X11. Wayland seems to have taken giant steps backwards in the last month or two, though.

It’s not a hardware problem. Period.

1 Like

Expecting every user to be a programmer or computer engineer will do nothing to drive Linux to mainstream. I like the flexibility Linux offers but I understand hesitancy surrounding convincing my friends because none of them want to spend time “fixing” a computer. I like to tinker so it bothers me not a whit. I understand people who don’t wish to tinker. Gatekeeping will not bring about the age of Linux on desktop dominance. Perhaps the Steam Deck will if the UX is good, I pray it is.

2 Likes

Plus, finding the information needed to “fix” the computer requires enough knowledge to ask the right questions, and understand the answers.

Search for answers with the wrong keywords and guess what you find?

Does someone have data on whether “deep” actually uses less power than “s2idle”? I know it should, but I recall someone mentioning on this forum that it didn’t – even though it did take longer to wake up from. It would be good to have some data on this. I guess a reasonable approach is to measure battery voltage, make it sleep for an hour or so, wake it up, and measure voltage again (probably just go with the Wh estimate the system provides)? Subtracting and dividing by the sleep duration should do the trick.

I guess it may matter what kind of expansion cards are plugged in.
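
A rough way to get that data, assuming a single battery that exposes energy_now in µWh (some batteries report charge_now in µAh instead, which would also need a voltage reading):

    # Read remaining energy, suspend for an hour via an RTC alarm, read again.
    before=$(cat /sys/class/power_supply/BAT*/energy_now)   # µWh remaining
    sudo rtcwake -m mem -s 3600                              # suspend for ~1 hour
    after=$(cat /sys/class/power_supply/BAT*/energy_now)
    echo "drained $(( (before - after) / 1000 )) mWh over one hour of sleep"

Running it once with the sleep mode set to s2idle and once with deep should show the difference directly.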

It’s perhaps a bit too bare-bones a system, so every bit of support you have to grab manually, and not a lot of proprietary drivers (for proprietary silicon) exist, because Linux, I believe, is completely open source. And the upside is that because you can tune every bit of it, it can be very performant and stable. Or the total opposite.

As I mentioned earlier, some 90% of computer users have never even seen a command line interface on their computer and know absolutely nothing about them, let alone running gdb install or even diskpart.exe or chkdsk.

(Why should I type random commands floating around the web when I have absolutely no idea what they are about?)

I understand how the command line works after taking an intensive CS course, and I am moderately comfortable using command line tools. But do I want to do that for every install/update? No. And where does gdb install get the files from? It just goes deep.

Windows also has its own very, very deep rabbit holes (not made any easier by two “control panels”), but I believe the many windows and prompts it pops up at least make for a more user-friendly interface than “posting a monochrome cmd window with a few lines of sentences”. And if you are really adept at using either system, like a “linux chad”, the number of “quirks” you get to know can be “scary”.

The kernel, yes, but nothing else is guaranteed

As for familiarity with the CLI, yeah, it is intimidating, and even the GUI interfaces are just bad; look at Synaptic for what I mean. Even compare Ubuntu’s software center to the Windows Store, and the Windows Store is much easier to navigate

There is also the constant fragmentation: how many distros are there? How many countless dev hours go into each distro? How much better would Linux as a whole be if all that investment went into fewer distros? There are like 5 versions of Ubuntu, for goodness’ sake, and the only difference I’m aware of is the DE

I’m not completely against multiple distros, far from it; I think that the big ones each serve a purpose

Big three I can think of are

  1. Debian
  2. Arch
  3. Fedora/RHEL

And maybe the various smaller distros that I have no experience with

  1. BSD-based
  2. Slackware
  3. OpenSUSE

But how many distros are just reskinned versions of the above? How much dev work actually gets pushed upstream? That’s a genuine question because I don’t know. The fragmentation is the greatest strength and biggest weakness of Linux

Microsoft only needs to dev one Kernel and one DE

1 Like

I’d say it is why Linux looks so daunting to the average user

Windows and MacOS each have just one version really (Windows Pro is just DLC for the OS), while the various distros can vary wildly in UX

I think Debian does an admirable job for placing stability above all else and in a server or enterprise environment, I can respect that

But let’s get real, I can’t remember the last time an OS crashed just out of the blue and it wasn’t me screwing around with settings that could cause instability

There really doesn’t need to be this many distros, especially since most are just derivatives of the above with a new skin on top

Except Arch, Arch scares me and I bless the existence of Manjaro, the rest I assume are roughly equivalent in their Plug ‘n Play nature

I wish RHEL’s mentality was more widespread

Good FOSS projects die from lack of funding; “make the product available for free but pay for support” sounds like the only sustainable way to fund dev work

Not like people aren’t used to paying for Windows Licensing anyhow

Well, it’s not that simple. Ubuntu, for example, doesn’t correspond to any single Debian version, since they pull from Debian sid, freeze at some point, and then skin it with their own theme and such. And at some point, they tried to develop an alternative init system (upstart), alternative display server (Mir), and alternative desktop shell (Unity). All were ultimately failures, and it sort of soured the Linux community at large on Ubuntu and Canonical.

Then take Linux Mint, which has both an Ubuntu-based version and Debian-based version, both of which get combined with their own software and defaults (disabling snap, enabling flatpak, driver locator/defaults, etc).

Or take some that are based on none of the above, like Void or NixOS or GUIX or whatever. Or take some more mobile-focused distros like Alpine or postmarketOS.

My point is that yes, maybe there are “too many” distros. But from what I know, most of the distro-specific work gets upstreamed when possible (e.g. Canonical has a bad reputation because they’re perceived as not really contributing to upstream, implying most distro developers do contribute to upstream projects). And while there is a decent amount of fragmentation, I’d argue it’s not really at the distro level per se, but rather at the graphical toolkit (GTK+{2,3,4}, Qt{4,5,6}, wxWidgets, etc) and desktop environment/window manager/compositor (Gnome, KDE, Xfce, LXDE, AwesomeWM, Sway, i3, *Box, etc) levels.

There are lots of different distros because there are lots of different opinions about which release cadence and level of stability is right, how often new software should be introduced into a release, and so on. And I understand that might be intimidating for a potential new user, but I don’t really see any way of solving that, and most new users get redirected to Ubuntu, Linux Mint, and sometimes Fedora. So it’s not like new users are handed the full breadth of choices up front, and that’s okay (the rest will be there waiting when the user is ready).

In the same vein, I understand why there are different desktop environments and window managers/compositors. There are drastically different design choices surrounding how they’re configured, what options are provided, and whether they’re scriptable and heavily extensible. And again, in this area, the choice isn’t really a problem because new users are generally directed either to Gnome or KDE.

The toolkit case, however, is a bit more annoying. This diversity leads to inconsistencies in HiDPI support (GTK+2 is terrible at this, whereas GTK+3 and above are better), theming inconsistencies (Gtk and Qt use different themes and getting them to play nice is a hassle), and mostly feels…stupid. I really wish we only had one graphical toolkit because it would fix some of the papercut issues that keep cropping up.

All of that being said, my point is simply that most of the diversity in distros doesn’t really affect anything because new users tend to enter through a handful of distros and the rest are there waiting for when they’re ready (if they want to explore at all).

2 Likes

And when, not if, M$ screws up something major and makes your PC unusable in the one kernel and one DE, what are you going to do about it? Wait and hope and pray for a patch?

What you are calling a problem (“fragmentation”), I consider the greatest strength of the GNU/Linux ecosystem. When one project in the constellation does something dumb, I have the option of moving to a different project (usually very easily installed alongside or replacing the problematic one entirely), collaborating with the project’s devs, fixing it myself, or writing my own project to replace it.

I will take freedom and choice in what software I use over the security of a corporation curated walled garden any day.

You’ll note I said the same thing

You mistake what I’m saying as a preference for Windows, I’m not saying that at all

What I am saying is that pushing Linux mainstream requires a dominant version of Linux above all others, the default as it were

Fragmentation inhibits that goal, and it is easier for Microsoft to retain its market share so long as Linux remains fragmented and disorganized

Denial of a problem doesn’t make it go away or mitigate its effects

If a distro truly is unique, then power to the devs I say, some distros truly do provide new or otherwise useful functionality like postmarketOS or TAILS

And I would agree that some fragmentation is necessary and desirable and it might be more accurate to describe the issue as you did

I’m frustrated that despite Linux being a superior OS, it lacks any real presence outside of servers

Answering my own question:

Yes, deep does provide significant power savings over s2idle (not counting possible regressions in 5.16 onwards that may need some time to get fixed)
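
For anyone who wants to check which state their machine is actually using, the active one is shown in brackets, and it can be switched for the current boot without touching kernel parameters:

    cat /sys/power/mem_sleep                    # e.g. "s2idle [deep]"
    echo deep | sudo tee /sys/power/mem_sleep   # takes effect until next reboot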

1 Like

We’ve gotten a bit off-topic on the original purpose of this thread, but it’s interesting.

I’d like to inject a thought which I came upon in a Bryan Cantrill talk. He posits that good open source software is more like a mathematical theorem, and gives the example of the Pythagorean theorem. The Pythagorean theorem is a useful mathematical fact, and can be proved any number of ways, but ultimately there’s one “common” form that it takes. The process of “discovering” the Pythagorean theorem was probably littered with dead ends and false starts, though.

In many ways, I think that part of what frustrates people about Linux is that (in the desktop space at least), the development follows a more academic trajectory a la mathematics/science. There are dead ends, there’s duplicated effort, differences of opinion, and public fights about which way is “proper” and who is right. I don’t think you can separate these things from the “openness” of Linux development. It leads to many problems (such as the wake from sleep issues this thread addresses), but it also has many advantages.

I can’t say if that’s a bug or a feature. Just my $0.02.

1 Like

I had this issue on Garuda Linux and it turns out it was the NVMe drive, which I bought from Newegg. For now, I copied my OS back to my 1TB expansion card, and sleeping works when booted from there.
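
If anyone else suspects a drive (or any other device) is what’s breaking sleep, the kernel log around the failed suspend/resume is usually where the evidence shows up. These are generic examples, not a guaranteed diagnosis, and the journalctl one assumes persistent journaling is enabled:

    journalctl -k -b -1 | grep -iE "suspend|resume|nvme"   # previous boot's kernel log
    sudo dmesg | grep -iE "pm:|suspend|nvme"               # current boot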