[RESPONDED] Ubuntu 22.04 unable to launch some apps

Hi there!

I’m a happy user of the Intel 13th Gen Framework Laptop… until the last couple of days, when something went very wrong with my Ubuntu 22.04.3 LTS installation. I update it almost weekly, and I have nothing installed from outside the Ubuntu Software Center.

I use VS Code for some Python work, and I also game on Steam.
For some reason, some of the games stopped launching two days ago. Out of curiosity I checked games installed from the Ubuntu Software Center and realized that some “native” games wouldn’t launch either.
Afterwards I realized that VS Code wouldn’t launch either.

I’m sorry I can’t provide more info, but I see no event log or anything similar to report.
It would be a huge help to get ideas on how to check logs, what to try, or how to proceed.
As a worst-case scenario, would an Ubuntu reinstallation help (with the hope that user space is respected 100%, or after making a backup)? Or would you recommend switching to Fedora? I’ve never used it, but if it is more stable, then I’m happy to switch!

Thanks and sorry in advance!

Actually, if you want some help, we need more details.
My advice is to open a console, figure out the binary name of the app you want to launch, and launch it from the console. Post the error message here so we can identify what is wrong.
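For example (using VS Code’s code command as a stand-in for whatever app is failing; substitute your app’s command name), something like this locates the binary and captures any output:

```shell
# Locate the executable for the app (replace "code" with your app's command name).
command -v code || whereis code

# Launch it from the terminal; merge stderr into stdout so errors appear with
# normal output, and save everything to a log file you can paste into the forum.
code 2>&1 | tee ~/app-launch.log
```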

As general advice, open a console and issue:

sudo apt update && sudo apt dist-upgrade && sudo apt autoremove

This should update all existing software and make sure all dependencies are set.

Hi Jorg!

I did not find the VS Code binary name, but I’ll invest time in getting this info and give it a try. A few minutes ago I tried again and VS Code now launches fine, but I can try to find out more about the Steam games… I’ll post this info in a few hours!

About reinstalling Ubuntu: if I remember well, does the installer offer to keep user data and config?

And in case of installing Fedora, would that be an option as well? (Sorry, I’ve never tried Fedora… and I’m very tempted at the moment.)

It won’t make a difference which distribution you use.
Steam applications (games) usually require Proton (a kind of Windows library-call translator based on Wine). If it is not installed, most Steam games won’t launch.
My bet: for Steam games, reinstall Steam or verify its installation.

For VS Code, I can’t help. I use Emacs for all my coding. And I do a lot :slight_smile:


I also don’t have Visual Studio Code installed, and I’m also not running Ubuntu, but there are a couple of approaches you can take to figuring out where Visual Studio Code’s main executable is.

One approach is to ask your package manager for its definition of the Visual Studio Code package (“package” being basically the Linux term for an installable bundle of software). There’ll usually be “list” and “search” type commands you can use to find the package, and then you can usually get those tools to spit out a list of the files copied onto your system at installation time. On Ubuntu, you’d be choosing between apt and dpkg. The Software Center is a package installation frontend that’s like a friendlier apt, but apt is also friendly.
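To make that concrete, here’s a sketch of what that looks like with apt and dpkg, assuming the .deb package happens to be named code (the name is a guess; the search step is there to confirm it):

```shell
# Search installed packages for anything matching "code".
apt list --installed 2>/dev/null | grep -i code

# Once you have the exact package name, list every file it installed,
# filtering for paths that look like executables.
dpkg -L code | grep /bin/
```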

To give you an easy answer, assuming you’re not using Flatpaks, it looks like the name of VS Code’s main binary is /usr/bin/code. You can type that into a terminal, or just code, and it should run, or at least fail to run with an interesting error message.

If you did install VS Code as a Flatpak, you want the terminal command flatpak list to see the installed Flatpaks, and flatpak run com.microsoft.vscode (for example; the name might be different) to run it from the command line. I don’t think Ubuntu comes with Flatpak by default… I could be wrong. It’s worth mentioning, since Flatpaks are getting pretty common.

I wouldn’t assume that Fedora will help you. If anything, my money would be on you having a slightly easier time in Ubuntu, but really it’s the same: different tools doing mostly the same things.

I would definitely not assume that reinstalling Ubuntu is safe, and more to the point, right now, it doesn’t sound like your system is broken enough for that to be the more direct route to getting it working. Without basic troubleshooting, you’ll wind up doing a lot of reinstallations, wasting a lot of your time.

1 Like

Have you tried launching these games from command-line Steam? Maybe an error message will show up there and provide more clues :slight_smile:
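For what it’s worth, a minimal sketch of that (the app ID below is a made-up placeholder; you can read the real one off a game’s Steam store page URL):

```shell
# Start Steam itself from a terminal so its log output lands here.
steam

# Or launch one game directly via Steam's URL scheme.
# 12345 is a placeholder app ID -- substitute your game's ID.
steam steam://rungameid/12345
```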

It seems the VS Code main binary is on the system or user path, so I can do this:

code .

(Found here!)
This should open VS Code in the current folder from the terminal.
Now it again does not launch :frowning: it seems a bit random for now when it executes and when it does not. And from the terminal I don’t get a single line of exceptions, errors or anything else, just the command prompt ready to run the next command.

I see no code under /usr/bin, and following this I get this on my machine:

$ whereis code
code: /snap/bin/code

Then I continued my search; here it goes:

$ ls -la /snap/bin/code
lrwxrwxrwx 1 root root 13 de des.  13 23:10 /snap/bin/code -> /usr/bin/snap

Those all seem to be symlinks, and the folder containing the binaries seems to be here (found by chance, by the way):

$ ls /snap/code/
147  148  current

current is a symlink to the 148 folder. The two folders follow the same structure:

$ ls 148
electron-launch  etc  meta  snap  usr

Those folders have a more complex structure, and I could not find the main binary that launches all this.

Anyway, as stated, I could run code . and no extra info is shown.
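Since whereis showed this is the snap version, it might be worth poking at it through snap’s own tooling (a sketch; I’m assuming the snap is simply named code, which matches what whereis printed):

```shell
# List installed snaps with their revisions (the 147/148 folders above are revisions).
snap list

# Show details about the code snap: channel, confinement, last refresh.
snap info code

# Run it through snap explicitly; this sometimes surfaces errors that the
# /snap/bin/code wrapper swallows.
snap run code
```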

Thanks a lot for those words; now I know where to focus efficiently! :slight_smile:

It seems I’ll need a bit of time to get the Steam command line up and running.
Meanwhile I launched 0ad in the terminal directly, and after a few errors I got into it.
So, worth trying some Steam games from the command line for sure!

This isn’t an exact solution to your problem, but I’ve had zero issues with VSCodium, which is to VS Code what Chromium is to Google Chrome (basically: community-maintained binaries).

Might be worth a shot.


Only do it if you are up for a learning curve. SELinux can and will make your life hard at first, but the additional security is worth it.

As for your VS Code issue, simply run Codium, which is the open-source version, and install it directly. Do not use a snap or, in the case of Fedora, a flatpak: you will run into networking issues with the IDE that may be insurmountable, or at least a huge pain that can and will crop up frequently after updates.

I feel that I should apologize. I left out any mention of Snaps, which are Canonical’s answer to Flatpaks. I had no idea that Ubuntu’s Software Center would default to installing the snap version of desktop apps. That’s why there’s so much indirection and obscurity. Personally, I feel that, at some level, it’s impolite or dishonest for a graphical program not to dump plenty of diagnostic text to a terminal window while it’s running.

Anyway, the question becomes: where is snap putting any error messages or logs, and/or why aren’t snaps in particular running reliably? You could always uninstall those programs and reinstall them, maybe using apt to install Code or Codium, and downloading Steam from Valve’s website. That would likely avoid the problem without troubleshooting it.
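Concretely, that might look like this (a sketch, assuming the snap is named code; the --classic flag reflects that VS Code’s snap uses classic confinement, which is worth double-checking with snap info; the .deb filename is a placeholder for whatever you download):

```shell
# Remove the snap version of VS Code.
sudo snap remove code

# Either reinstall the snap fresh...
sudo snap install code --classic

# ...or skip snap entirely and install the .deb that Microsoft distributes
# (download it from their site first; filename below is a placeholder).
sudo apt install ./code_download.deb
```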

That is the first thing I disable in Ubuntu → snapd and all related stuff. I keep Flatpak, as I have some software (kdenlive, a video editor) that runs nicely as a flatpak and the devs provide recent packages for it.
I never understood the snap move => it replicates the Android mess under Linux!


No worries, I’m here to ask and learn! And I thank you all for this nice thread!
Sorry if it goes a bit off-topic from the Framework system itself!
It’s impossible to explore every corner of the system at once, so I may keep asking for a while :slight_smile:

So the Software Center is something separate, parallel to the apt package system? That’s a surprise to me… What about dpkg, is it also independent of the other two?
I get a strange feeling when updates appear on Ubuntu, as there are two different applications that show them: the Software Center itself, where I see updates for apps I installed through that tool, and an "Update Manager", which looks a bit more connected to apt (but I have no real idea), where the updates are more Ubuntu OS focused: kernel, translations and some others (Jupyter, Python…).

Why do snaps exist? Is it a way to isolate possible dependency problems from the OS?

I’ll investigate a bit more on Snaps in Ubuntu, and yes, I’ll probably do installations based on what I get from the projects’ sites/repos, as I may get more recent versions (for example, for the Arduino IDE, the 2.x version and not the 1.8 that comes from the Software Center).

Nice! :sunglasses:
I’ll start using this right away and give some feedback on it in some days!

I don’t use Ubuntu (or Debian) enough to be highly confident, but I’m fairly sure dpkg underlies apt, with apt being the higher-level tool you interact with primarily, and dpkg being more of the package management plumbing.

Then I have a hunch that Software Center is meant to be a single unifying GUI that sits on top of both apt and, apparently, snaps.
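To illustrate that split, here’s a quick sketch using a small real package (htop) as the example; the .deb filename shown is illustrative:

```shell
# apt is the high-level layer: it resolves dependencies and downloads
# packages from the configured repositories.
sudo apt install htop

# dpkg is the plumbing underneath: it installs a local .deb file directly,
# and simply fails (rather than fetching) if a dependency is missing.
sudo dpkg -i htop_3.0.5-7_amd64.deb   # filename is illustrative

# dpkg also keeps the bookkeeping: what's installed, and which package
# owns a given file.
dpkg -l | grep htop
dpkg -S /usr/bin/htop
```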

Yeah, that’s the main reason they exist. The point of a Linux distribution, traditionally, is to deliver to you, the user, a collection of various open source / free software projects distributed together (including the Linux kernel itself). That puts a lot of burden on the package maintainers associated with the distro. “First-party” packages in a distribution tend to be built by package maintainers from source code maintained upstream by other people, sometimes with distro-specific patches applied.

The problem appears when you are the upstream developer with an interest in building your software for Linux generally, rather than targeting only one or two distros: maybe because your software is proprietary, or because it’s not popular enough to attract the interest of every distro maintainer the world over, as the case may be.

Flatpaks, Snaps, and there was a third one… AppImage, are each attempts to build a distro-spanning system that is capable of managing different sets of otherwise-conflicting dependencies in parallel, so all those software developers can deploy software to “Linux” without building tons of distro-specific packages.

The main reason for a package system is to reduce the base installation size and make sure you install libraries only once per system.
The biggest advantage is the dependency checking, which is a real pita to maintain. But when installing Ubuntu or Fedora/Red Hat the old-fashioned way, you’ll have only one libc, one libstdc++, etc. installed for all the applications that were compiled against those libraries.
In the end, you can create a small system, as we see with Alpine using only about 15 MBytes of space, or a tiny Ubuntu with 130 MBytes.
If you install an application, the package manager will try to install the missing dependencies; if they are not available, it refuses the installation.
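You can watch that dependency resolution happen without changing anything, e.g. (using vlc as an arbitrary example package):

```shell
# Show the dependency list apt knows about for a package.
apt-cache depends vlc

# Dry-run an install: prints everything that would be pulled in,
# but installs nothing.
apt-get install --simulate vlc
```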

One other advantage/drawback of these OneAppAndDependenciesInOneLargePackage formats is the environment where the app runs. Sometimes it is sandboxed, and you cannot access your home directory if it is not configured right. So it can be a real pain to access your local files.

A real pain in the access, some might say…

I tend to avoid all three of those options, as I find RPMs (or DEB or whatever) just tend to run “better”.


This is a niggle, but I’d call snaps, Flatpaks, etc. “packages” along with .deb files, .rpm files, or whatever. I don’t think I’m the only one who’s always read “package” as a general term.

I actually disagree with this too, being a Slackware fan :wink:
I think the main advantage of packaging is having the ability to not only add, but reliably remove and upgrade software.

The alternative is having a bunch of files scattered around your hard drive, with no idea which ones came from where. (ye olde ./configure; make; make install)
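That file-level bookkeeping is queryable, which is exactly what a bare make install never gives you:

```shell
# Ask the package database which package owns a file.
dpkg -S /bin/ls        # Debian/Ubuntu
rpm -qf /bin/ls        # Fedora/Red Hat

# Files dropped by "make install" have no owner on record, so there is
# nothing to answer these queries -- and nothing to drive a clean removal.
```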

Distributions managing dependencies for you I’d call a very significant but secondary benefit, in the traditional package model. You can mostly just fire and forget, but every now and again, awkward conflicts creep in, or some set of circular references breaks the packaging system, or something’s unavailable… or you are motivated to care for some reason about optional dependencies. Then you wind up having to know what you’re doing to an extent.


I was one of the early adopters of Red Hat 1.0 (yes, beta and 1.x :slight_smile: ), and I later added some patches to the package manager because it lacked some functionality: the MD5 checksums for each and every file inside a package (imagine libc being corrupt during a cluster-wide upgrade; yes, that happened to me), and the PGP signature of the packages. After that came the CIT (Cluster Installation Toolkit, the predecessor of the kickstart installer). So I think I have a little experience with this, especially since I did it all to get away from Slackware…

But you are right. The primary reason was to be able to cleanly install and remove a package (if configuration files were changed, they are not removed but backed up in place). However, being able to exactly replicate an installation on different hardware is what made these packages very valuable!
The second reason was to reduce the overall installation size while having the ability to verify not only the files but also their source (through the cryptographic signature). You could just point at package groups to install a system for a specific task (desktop, server, remote X client) while the installer took care of the underlying hardware.
Remember that back then (gosh, I hear my kids saying I’ve just identified myself as old :smiley:), we didn’t have gigabit or fast Ethernet (100 Mbps), only 10 Mbps shared between 120 systems.

So it evolved through all these requirements :} but remained much more stable than any Windows/Mac-type system. I went through all of them and, as the family tech, had to deal with remote tech support for all of them. But I soon refused support for Windows, then Mac. Since then, they are all on Linux.


Wow, Red Hat 1.0!
When I was a young lad, I remember hearing rumors of Red Hat 4, some kind of free operating system; it didn’t sound possible… My earliest Linux memory is attempting to bootstrap stage 1 of Gentoo on a Dell Latitude CPx. It was a valiant, uninformed attempt.

Then I installed Debian.

But why would you leave perfection? (ok, kidding, I kid)

That’s kind of an interesting point about cause and purpose. I guess Linux always has been oriented toward server-first, scale-first… It would make sense if packages were created mainly to manage fleets of machines. I’ve only ever dealt with single machines at a time, and mostly desktops. Would you believe I only have five cows, they all have names, and I let them sit on the couch?

You know, strangely, I find myself missing 3.5" floppies. No wonder file corruption used to be more common.

You’ve got your family using Linux?
I can’t imagine pulling that off. Nice work.