
Minecraft's rather large 'Update Aquatic' content update came out recently along with their fancy new launcher, and getting it on Linux and keeping it up to date is now a snap.

To be clear, I'm talking about the Snap packaging format that comes built into Ubuntu, Solus and more. If you don't have Snap package support, it's easy to install, and it gives you access to quite a lot of games and applications across different distributions, all kept up to date for you.
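If you need to add it first, it's usually a single package away. As a rough sketch on an apt-based system like Ubuntu or Debian (most other distributions carry snapd under the same name in their own repositories, so adjust the package manager accordingly):

sudo apt install snapd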

Minecraft has been available as a Snap for a while, but it was using the outdated and rather ugly old launcher, which wasn't a great user experience. As of today, it has been updated to give you the new and improved experience.

You can grab it from the Snapcraft store or, if your distribution supports it like Ubuntu does, it can be found by simply searching for "Minecraft" in the Software Centre.
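If you prefer the terminal, installing it should be as simple as the following, since the package in the Snap store is simply named "minecraft":

sudo snap install minecraft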

If you already have it installed, you can wait for it to update automatically or quickly do it yourself in a terminal like so:

snap refresh minecraft
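Running snap refresh on its own, with no package named, updates every snap on your system in one go:

snap refresh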

Naturally, you can also download Minecraft directly from the official site, but personally I prefer the experience of just hitting install and having it all done then and there.

Also, it seems Mojang have already fixed the minor text distortion issue I reported to them recently—great!


cRaZy-bisCuiT Jul 24, 2018
Quoting: Exidan
Quoting: TheSHEEEP
Quoting: Exidan
Quoting: PJ Got to admit I have mixed feelings when I read news like this.
On the one hand, it is awesome to hear about new ways of getting your software without the hassle of hunting dependencies, configs etc., and I appreciate diversity in Linux.
But on the other hand - damn it, can't we agree on a single universal package format, not 3? It has this deb vs rpm stench all over it. Certainly I'd be happier if all the effort had gone into making a single working, universally recognized format before adding new ones. Possibly Flatpak - even though personally I enjoy AppImages the most (due to their simplicity) - as it seems the most widely accepted across distros and does not bear the usual Canonical controversy mark...
I don't like how they handle libraries and dependencies. Isn't the whole point of the Linux ecosystem to avoid redundancy? If they ship every library with the snap (and they look for the shipped library first, before looking into the system), they will end up with a whole lot of redundancy.
The worst point about Linux (well, next to the fragmentation) is that terrible idea of avoiding redundancy by assuming you just have to have the right versions of the right libraries.
It is completely impractical when you actually want to distribute software.

When you distribute software, your software was built against certain versions of certain libraries.
There is simply no way to guarantee that a user has those certain versions of those certain libraries on their computer. Nor is there a way to guarantee that there will always be your specific required version (architecture, version, etc.) available anywhere.
Nor is it realistic to expect devs to make sure that there is a PPA or whatever with exactly the versions they need.
Nor can you be sure that none of the symlinks on a user's system is somehow broken, pointing to a wrong version, etc.
Nor can you be sure that some update to a library won't break compatibility.
Nor can devs be expected to always make sure their software works with the most recent versions of all dependencies - devs must be able to move on to new projects, not maintain their old projects forever.
There are thousands of problems with this approach and it just barely works for open source projects IF and only if they are well maintained - for all others, it really doesn't. It is a "weakest link" approach - all goes well until the weakest link in the chain breaks - and "weakest link" approaches are generally terrible.

The ONLY way to make sure your distributed software works as intended is to distribute the exact versions of dependencies with it. Or use Docker or smth. similar (though that isn't applicable for all cases).
I'd rather have some megabytes "wasted", if what I get is software that is guaranteed to work on my users' machines without a hassle and without influencing anything else on those machines.

Oh, and because I know some tinfoil hat will come with the security argument:
If one of my dependencies has a security problem, I can update that dependency and forward that update to users. It is my responsibility as a dev to watch out for stuff like that.
But 95% of all software doesn't even do anything that could pose a security threat even if there was an exploit. And for the other 5% this happens so rarely that using a different approach doesn't come close to the benefits of distributing dependencies with your software.
hmm... I always liked the point of no redundancy about Linux, and it's one of the strongest arguments against using Windows. And fragmentation on Linux? Only if your HDD is nearly full, really. I don't see another way to do it (besides the "Linux way").
Anyway, I guess we just have total disparity of opinions on this matter lol
I'm with you, bros! Even though dependency hell is real every once in a while, especially on rolling release distros, where some packages are updated quickly but not everything that depends on them.

The same goes for old, unmaintained packages.
TheSHEEEP Jul 24, 2018
Quoting: Exidan
Quoting: TheSHEEEP
Quoting: Exidan
Quoting: PJ Got to admit I have mixed feelings when I read news like this.
On the one hand, it is awesome to hear about new ways of getting your software without the hassle of hunting dependencies, configs etc., and I appreciate diversity in Linux.
But on the other hand - damn it, can't we agree on a single universal package format, not 3? It has this deb vs rpm stench all over it. Certainly I'd be happier if all the effort had gone into making a single working, universally recognized format before adding new ones. Possibly Flatpak - even though personally I enjoy AppImages the most (due to their simplicity) - as it seems the most widely accepted across distros and does not bear the usual Canonical controversy mark...
I don't like how they handle libraries and dependencies. Isn't the whole point of the Linux ecosystem to avoid redundancy? If they ship every library with the snap (and they look for the shipped library first, before looking into the system), they will end up with a whole lot of redundancy.
The worst point about Linux (well, next to the fragmentation) is that terrible idea of avoiding redundancy by assuming you just have to have the right versions of the right libraries.
It is completely impractical when you actually want to distribute software.

When you distribute software, your software was built against certain versions of certain libraries.
There is simply no way to guarantee that a user has those certain versions of those certain libraries on their computer. Nor is there a way to guarantee that there will always be your specific required version (architecture, version, etc.) available anywhere.
Nor is it realistic to expect devs to make sure that there is a PPA or whatever with exactly the versions they need.
Nor can you be sure that none of the symlinks on a user's system is somehow broken, pointing to a wrong version, etc.
Nor can you be sure that some update to a library won't break compatibility.
Nor can devs be expected to always make sure their software works with the most recent versions of all dependencies - devs must be able to move on to new projects, not maintain their old projects forever.
There are thousands of problems with this approach and it just barely works for open source projects IF and only if they are well maintained - for all others, it really doesn't. It is a "weakest link" approach - all goes well until the weakest link in the chain breaks - and "weakest link" approaches are generally terrible.

The ONLY way to make sure your distributed software works as intended is to distribute the exact versions of dependencies with it. Or use Docker or smth. similar (though that isn't applicable for all cases).
I'd rather have some megabytes "wasted", if what I get is software that is guaranteed to work on my users' machines without a hassle and without influencing anything else on those machines.

Oh, and because I know some tinfoil hat will come with the security argument:
If one of my dependencies has a security problem, I can update that dependency and forward that update to users. It is my responsibility as a dev to watch out for stuff like that.
But 95% of all software doesn't even do anything that could pose a security threat even if there was an exploit. And for the other 5% this happens so rarely that using a different approach doesn't come close to the benefits of distributing dependencies with your software.
hmm... I always liked the point of no redundancy about Linux, and it's one of the strongest arguments against using Windows.
But what is the argument?
It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.

Quoting: Exidan And fragmentation on Linux? Only if your HDD is nearly full, really. I don't see another way to do it (besides the "Linux way").
No, you misunderstood. I didn't mean the fragmentation of disks.
I mean the fragmentation of communities and development resources. A hundred distros, where just a handful would be so much better, because those few would have received much more dev resources.
Imagine if all the work that goes into the small distros with a handful of users each were focused and organized to benefit only a few distros. Those few distros would be MUCH better than they are now, and very likely so flexible and configurable that none of the smaller distros would even be requested by anyone.
But no, everyone has to bake their own little ego cake, and the community as a whole, as well as the spread of Linux, suffers from it. Developers shy away from "all those distros" (as mistaken as that impression might be), manufacturers don't even get the idea to pick a distro for distribution because there is no "official" distribution, etc.

I very much recommend this article: https://www.dedoimedo.com/computers/linux-fragmentation-sum-egos.html


Last edited by TheSHEEEP on 24 July 2018 at 3:48 pm UTC
Ananace Jul 24, 2018
Quoting: Kels I get that flatpaks and snaps are easy to install, but what's never been clear to me is: are they as easy to update as PPAs, for instance? Or do I have to go to every single piece of software I use and keep them updated manually and individually?

Both Flatpak and Snap come with built-in updating as far as I know, as well as repository-based installing, which also happens to be the reason I prefer them over AppImage with its install-less design.

Now I can't speak for Snap - can't easily install it on my dist - but for Flatpak it's as simple as `flatpak update` to update one or all installed applications. Or just use KDE Discover / GNOME Software and let it update the apps for you.
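For a single application the same command just takes the application ID, something like this (org.example.App being a stand-in for whatever app you installed):

flatpak update org.example.App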
marcus Jul 24, 2018
Quoting: TheSHEEEP But what is the argument?

It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.

Security. I'm with Exidan on this, but from a slightly different angle.

It obviously does not work under Windows. On Windows every application ships its own special funny version of ffmpeg or libjpeg, for example. Do you think this ever gets updated to fix bugs? I think adopting the Windows approach (and in the end all these snaps, flatpaks, docker images (let's subsume them under "containers") and whatnots are nothing else + a bit more isolation) is a bad thing. It encourages packaging libraries and then abandoning them because "the application works".

In a distro, a library that has a vulnerability either gets updated or (lacking updates) removed. Sure, this breaks your application if the ABI breaks or if there is no fixed version, but for good reason! It removes a vulnerability from your system.

Use an upstream-supported library version in your application and you are fine. Admittedly there are libraries that have never heard of stable APIs/ABIs or semantic versioning, but that's a problem of its own, and those container formats are not a solution to it but part of the problem. They suggest that breaking APIs and ABIs is fine: you can just package the right version, and the library dev can then go on developing without having to backport fixes to the old code.

This is a huge problem, and going that route will invite many of the security holes we find in the Windows world into the Linux world. If you want to do this, at least be honest: statically link the libraries in. Because in the end, all those funny container formats are doing the same... just not using static linking. Containers are already there. They are called static binaries. No dependency hell required.


Last edited by marcus on 24 July 2018 at 8:31 pm UTC
TheSHEEEP Jul 25, 2018
Quoting: marcus
Quoting: TheSHEEEP But what is the argument?

It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.
It obviously does not work under Windows. On Windows every application ships its own special funny version of ffmpeg or libjpeg, for example. Do you think this ever gets updated to fix bugs? I think adopting the Windows approach (and in the end all these snaps, flatpaks, docker images (let's subsume them under "containers") and whatnots are nothing else + a bit more isolation) is a bad thing. It encourages packaging libraries and then abandoning them because "the application works".
"The application works" is the one and only important criterium.
So what if some dependent library gets an update? If the update is important for your application, you can just update it to the recent version.
If the update isn't important to your application (which is the vast majority of cases), you don't need to do anything.

Quoting: marcus In a distro, a library that has a vulnerability either gets updated or (lacking updates) removed. Sure, this breaks your application if the ABI breaks or if there is no fixed version, but for good reason! It removes a vulnerability from your system.
But we're not talking about distros here, we're talking about distributing applications. For the OS itself I actually think this variant of relying on external packages makes sense. But not for single applications that do not even want to become part of the system (like games).
So what if ffmpeg, the standalone executable, has a security leak in a very weird circumstance? The only thing I do in my application is use it for a very specific use case that is fully under my control. I don't put my version of it into some system folder; only my application uses it. My application's way of using it isn't insecure at all. Or even better, I use the dynamic library, not the executable, so there's no way anyone can use it directly - unless you start copying my application's dependencies into your system folder, in which case it is clearly the user's fault and not my problem any more.
A very clear case of my application not needing the update.

Quoting: marcus They suggest that breaking APIs and ABIs is fine. You can just package the right version and the library dev can then go on developing without having to backport fixes to the old code.
Breaking APIs and ABIs IS FINE. You can take a look at Windows API programming prior to Windows 8 to see the absurd legacy shit Windows devs had to deal with because Microsoft was afraid to break APIs.
And the second sentence is exactly the point - it is less work for everyone involved without any downside. Plus the library dev still has to backport if too many users of the library for some reason cannot switch to a newer version.

Quoting: marcus This is a huge problem and going that route will invite many of the security holes we find in the Windows world into the Linux world.
This is mostly hearsay without any basis. The reason Linux has fewer breaches than Windows is first and foremost that it is a smaller target - if it ever becomes big, that will change in an instant - and secondly that no application can just go and change system files, run cronjobs, etc. without the user's approval.
The "annoying" thing of having to type your password each time is actually way more secure than Windows' popup where you have to click a button.
And lastly, Linux distros are developed differently, with a lot more different eyes on the code. Leaks are more easily found and fixed this way. Open source development is, in the end, more secure.

Quoting: marcus If you want to do this, at least be honest: statically link the libraries in. Because in the end, all those funny container formats are doing the same... just not using static linking. Containers are already there. They are called static binaries. No dependency hell required.
This is terrible advice and shows that you have zero experience developing software for end users.
First of all, some software you cannot even link statically without breaching its license (FFmpeg, for example, forces you to either give away the object files of your project or go open source yourself - check this).
Also, software (usually) needs updating. That usually means your own code has changed, and sometimes it means a dependency has updated and you wanted that change.
If you link all your dependencies statically, each and every update will be - depending on the size of your dependencies - gigantic. While space is virtually unlimited, bandwidth unfortunately still isn't. And this propagates - from your repo (if you keep binaries there, maybe in Git LFS) to the build server, to the install build server (if separate), to the CDN, to every user. Trust me, that is a big no-no.

The only benefit of static linking is that people won't know what libraries you use (if for some reason you want to hide that???) - so it is actually less honest than dynamic linking.
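You can check that honesty yourself with ldd, which lists the shared libraries a dynamically linked binary pulls in - run it on a statically linked binary and you get next to nothing (./mygame is just a placeholder for whatever you want to inspect):

ldd ./mygame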


Last edited by TheSHEEEP on 25 July 2018 at 5:27 am UTC
marcus Jul 25, 2018
Quoting: TheSHEEEP
Quoting: marcus
Quoting: TheSHEEEP But what is the argument?

It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.
It obviously does not work under Windows. On Windows every application ships its own special funny version of ffmpeg or libjpeg, for example. Do you think this ever gets updated to fix bugs? I think adopting the Windows approach (and in the end all these snaps, flatpaks, docker images (let's subsume them under "containers") and whatnots are nothing else + a bit more isolation) is a bad thing. It encourages packaging libraries and then abandoning them because "the application works".
"The application works" is the one and only important criterium.

No. This might be the case for you, but others care about the security of their system as well. I'd prefer it if an application did not silently expose a vulnerability. If it suddenly stops working with my system libjpeg, then I'll notice. I will not notice if it just bundles its own broken libjpeg.

Quoting: TheSHEEEP So what if some dependent library gets an update? If the update is important for your application, you can just update it to the recent version.
If the update isn't important to your application (which is the vast majority of cases), you don't need to do anything.
How do I update a library in a Snap/Flatpak/Docker container manually? How do I keep track of all the different installed versions of a library in different applications? This *is* exactly the job of the system package manager. This is (at least for me) one of the key advantages of Linux - centrally managed libraries and applications that are monitored for vulnerabilities by maintainers.

Quoting: TheSHEEEP
Quoting: marcus In a distro, a library that has a vulnerability either gets updated or (lacking updates) removed. Sure, this breaks your application if the ABI breaks or if there is no fixed version, but for good reason! It removes a vulnerability from your system.
But we're not talking about distros here, we're talking about distributing applications. For the OS itself I actually think this variant of relying on external packages makes sense. But not for single applications that do not even want to become part of the system (like games).
So what if ffmpeg, the standalone executable, has a security leak in a very weird circumstance? The only thing I do in my application is use it for a very specific use case that is fully under my control.

And this is the dangerous misconception. What about a multiplayer game that uses avatars? This is not under your control. I can inject an infected jpeg that exploits the library there. The same goes for networking libraries. OpenSSL, for example, which a lot of applications depend on, directly exposes all network-based communication to exploitation if it has a bug that is not fixed.

Quoting: TheSHEEEP
Quoting: marcus They suggest that breaking APIs and ABIs is fine. You can just package the right version and the library dev can then go on developing without having to backport fixes to the old code.
Breaking APIs and ABIs IS FINE. You can take a look at Windows API programming prior to Windows 8 to see the absurd legacy shit Windows devs had to deal with because Microsoft was afraid to break APIs.

And the second sentence is exactly the point - it is less work for everyone involved without any downside. Plus the library dev still has to backport if too many users of the library for some reason cannot switch to a newer version.

No. It is not. The downside is a security vulnerability shipped with your application. You may not care about this, but others, like me, do.

And breaking ABIs and APIs *is* fine if you use semantic versioning. And best if you also maintain security fixes for the old ABI/API version for at least some time, to give downstream a chance to update. Especially in the "agile" world of today, this often seems to no longer be the case.

Quoting: TheSHEEEP
Quoting: marcus This is a huge problem and going that route will invite many of the security holes we find in the Windows world into the Linux world.
This is mostly hearsay without any basis. The reason Linux has fewer breaches than Windows is first and foremost that it is a smaller target - if it ever becomes big, that will change in an instant - and secondly that no application can just go and change system files, run cronjobs, etc. without the user's approval.
The "annoying" thing of having to type your password each time is actually way more secure than Windows' popup where you have to click a button.
This is not a myth. A number of recognized publications have analyzed exactly this problem of vulnerabilities introduced by bundled libraries. Take this one from the reputable Oakland security conference as an example: http://legacydirs.umiacs.umd.edu/~tdumitra/papers/OAKLAND-2015.pdf

I do not dispute that there are other mechanisms that make Linux more secure (though the password vs. click-a-button thing is not one of them -- as a sysadmin I see way too many people who mindlessly type sudo and enter their password). And sure, you do not immediately get root access, but neither do you usually get that on Windows. But having local execution capabilities is an important first step for any attack, and should not be invited by having old, unmaintained code --- no matter in which application.

Here are some examples from some of the games shipped by Steam that I have installed:
ShadowOfMordor, Alien Isolation, Life is Strange, Borderlands 2 --- OpenSSL 1.0.1 14 Mar 2012
Expeditions: Conquistador, Element4L, TIS-100, Oxenfree --- OpenSSL 0.9.8o (32 bit) and 1.0.0g (64 bit)
Pillars Of Eternity --- OpenSSL 1.0.0g
Talos Principle --- OpenSSL 1.0.1i (Last patch release of 1.0.1 was 'u')
Millie The Centipede --- OpenSSL 0.9.8o

OpenSSL 1.0.1 is unmaintained and does not receive fixes. The last release was made roughly 2 years ago... 1.0.0 is even older and has been unmaintained longer, as has 0.9.8...

These games use the network (at least Borderlands 2 does). The same goes for the Steam API here, btw. They should really provide security updates for it... No sane distribution still ships OpenSSL 1.0.1. They all moved on to 1.0.2. Steam itself uses 1.0.2j, btw. We are at 'o' as the current patch release --- which is also what my distro installs.
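If you want to check your own library, the rough idea - not necessarily my script line for line - is to grep the game files for the version string OpenSSL embeds in its binaries (the path assumes Steam's default library location):

# List every embedded OpenSSL version string under the Steam library
grep -r --text -o -E 'OpenSSL [0-9]+\.[0-9]+\.[0-9]+[a-z]?' ~/.steam/steam/steamapps/common | sort -u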

Quoting: TheSHEEEP And lastly, Linux distros are developed differently, with a lot more different eyes on the code. Leaks are more easily found and fixed this way. Open source development is, in the end, more secure.

Actually, this is a fallacy that has often been debunked, and it does not even help if the updates it enables are never installed because applications ship old library versions.

Quoting: TheSHEEEP
Quoting: marcus If you want to do this, at least be honest: statically link the libraries in. Because in the end, all those funny container formats are doing the same... just not using static linking. Containers are already there. They are called static binaries. No dependency hell required.
This is terrible advice and shows that you have zero experience developing software for end users.
First of all, some software you cannot even link statically without breaching its license (FFmpeg, for example, forces you to either give away the object files of your project or go open source yourself - check this).
Also, software (usually) needs updating. That usually means your own code has changed, and sometimes it means a dependency has updated and you wanted that change.
If you link all your dependencies statically, each and every update will be - depending on the size of your dependencies - gigantic. While space is virtually unlimited, bandwidth unfortunately still isn't. And this propagates - from your repo (if you keep binaries there, maybe in Git LFS) to the build server, to the install build server (if separate), to the CDN, to every user. Trust me, that is a big no-no.

The only benefit of static linking is that people won't know what libraries you use (if for some reason you want to hide that???) - so it is actually less honest than dynamic linking.

But security-wise it's the same. (You are right about the license issue, though; I forgot about that. Btw: I don't link statically, but I also don't ship my dependencies myself...) Btw: many Steam games statically link OpenSSL (all of those listed above that are not from Feral, if my script worked correctly).

Let's agree to disagree here: you value "it just works" more than "it is secure", and for me it's the other way around. It's just a matter of different priorities.


Last edited by marcus on 25 July 2018 at 6:17 am UTC
TheSHEEEP Jul 25, 2018
Quoting: marcus How do I update a library in a Snap/Flatpak/Docker container manually?
I don't know, but I was under the impression that a library installed by Snap for my program is just available to my program, not to others. I may be wrong, of course. In which case, I really don't see what makes Snap different from the good old apt-get, just installing stuff globally for everyone.

Quoting: marcus How do I keep track of all the different installed versions of a library in different applications?
If you install a game that has some /lib folder with libraries it uses and puts its dependencies there, nothing will be "installed" on your system to be abused by other programs. Someone would manually have to replace the system version with that game's old version (ignoring for a moment that you usually can't just do that), and if someone has the access rights to do THAT, they'd also have the power to just replace the system version with their own binary entirely, making the whole point irrelevant.
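That is typically wired up with a small launcher script next to the binary, roughly like this sketch (mygame and its lib folder are placeholders, not any specific game's layout):

#!/bin/sh
# Resolve the directory this script lives in
HERE="$(dirname "$(readlink -f "$0")")"
# Prefer the game's bundled libraries over the system ones
export LD_LIBRARY_PATH="$HERE/lib:$LD_LIBRARY_PATH"
exec "$HERE/mygame" "$@"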

Quoting: marcus
Quoting: TheSHEEEP So what if ffmpeg, the standalone executable, has a security leak in a very weird circumstance? The only thing I do in my application is use it for a very specific use case that is fully under my control.
What about a multiplayer game that uses avatars? This is not under your control. I can inject an infected jpeg that exploits the library there.
And that infected jpg will not get you anywhere, as neither the server you upload to nor the users who will see it later will ever "execute" the .jpg file on their computers.
There is a reason no game allows its players to just upload any file, and if a file is uploaded, it is at the very least converted or checked. Because if that isn't done, that would indeed be an exploit - but not one that would have been prevented by updating a library.

Quoting: marcus This is not a myth. A number of recognized publications have analyzed exactly this problem of vulnerabilities introduced by bundled libraries. Take this one from the reputable Oakland security conference as an example: http://legacydirs.umiacs.umd.edu/~tdumitra/papers/OAKLAND-2015.pdf
That paper brings examples that completely defeat the point:
Quote: The user is running two versions of Adobe Reader, a default up-to-date version and a vulnerable version. The attacker convinces the user to install a Firefox add-on that looks benign. The malicious add-on has filesystem access through the XPCOM API [4]. It locates the vulnerable and patched versions of the Adobe Reader library (nppdf32.dll) and overwrites the patched version with the vulnerable one.
Note the part about convincing the user to install an add-on.
As soon as the user willingly says "yes" to something, aka gives access rights, everything can happen.
Instead of swapping the vulnerable and patched versions of some library, the add-on might as well have just put its own version into the system (which would have been a far better idea from the hacker's perspective). That there was an unpatched version somewhere on the user's PC is completely irrelevant, as there would be countless other ways.
That the paper tries to put the blame on a fully irrelevant fact just shows that an example has been picked to "prove" the point they wanted to prove to begin with. Just another example of biased "research".
There is no protection for users screwing up their own security.

Quoting: marcus Here are some examples from some of the games shipped by Steam that I have installed:
ShadowOfMordor, Alien Isolation, Life is Strange, Borderlands 2 --- OpenSSL 1.0.1 14 Mar 2012
Expeditions: Conquistador, Element4L, TIS-100, Oxenfree --- OpenSSL 0.9.8o (32 bit) and 1.0.0g (64 bit)
Pillars Of Eternity --- OpenSSL 1.0.0g
Talos Principle --- OpenSSL 1.0.1i (Last patch release of 1.0.1 was 'u')
Millie The Centipede --- OpenSSL 0.9.8o

These games use the network (at least Borderlands 2 does). The same goes for the Steam API here, btw. They should really provide security updates for it... No sane distribution still ships OpenSSL 1.0.1. They all moved on to 1.0.2. Steam itself uses 1.0.2j, btw. We are at 'o' as the current patch release --- which is also what my distro installs.
This proves exactly my point.
Nobody, ever, has been or will be hacked or in any way influenced by a game not using the most recent version of OpenSSL. Otherwise, these games would have updated their dependencies, been removed from Steam, made it to the news, etc.
This isn't true for every kind of program or application, obviously. Especially programs that are themselves important parts of operating systems are better served by automatically using the latest version.
But a game, and many other kinds of "non-system" software, certainly don't belong to that category.

Quoting: marcus Let's agree to disagree here: you value "it just works" more than "it is secure", and for me it's the other way around. It's just a matter of different priorities.
I do value my security.
I just also realize that my security isn't in the least reduced by my games coming packaged with their own dependencies. Because there is no realistic way to abuse those dynamic libs lying in some /lib folder of my installed game if I don't go ahead and install shady software from untrusted sources.
What I don't do is put on a tinfoil hat, fearing and preparing for theoretically possible attacks that never did and never will happen to me.


Last edited by TheSHEEEP on 25 July 2018 at 10:49 am UTC
PJ Jul 25, 2018
Quoting: Exidan I don't like how they handle libraries and dependencies. Isn't the whole point of the Linux ecosystem to avoid redundancy?

Well, those new package formats are IMO not meant to replace the old ones, only to work alongside them.
While the old ones work great for open source apps, as it is easy to grab new code and recompile it with the libraries you have, that is really not doable for proprietary apps - games, for example.
With the variety of distros, each with its own libraries, configs etc., there was no way for any sane publisher to cater to all of them... And thus we didn't have access to many proprietary apps, or they were limited to specific distros when it came to support (usually RHEL for pro apps - see Modo, Maya, Substance, Mari...).
New packaging options that bundle libraries etc. can be a means of solving this issue.
Heck, they are also awesome in other cases as well. For example, I recently updated my workstation to openSUSE Leap 15 and added Steam shortly after so I could have some game fun in my free time. It turned out that the initial Steam package for Leap 15 was awful - while the app itself launched and worked, a huge chunk of the games didn't because of the way libraries had been linked. In the past I'd have been screwed, or forced to try to untangle this mess. But nowadays I just installed it through Flatpak and that's it. ATM I'm not even considering switching it - works like a charm. All games so far start without issues. The only extremely minor nitpick is that the tray icon does not work - I get a "missing" icon space there. (Side note - I really hope Steam switches to Flatpaks as their official way of handling their packages - and damn, please make them not create their own.)
I really do hope more commercial software for Linux gets distributed like that and saves me time trying to make things work.

That's why I'm a bit annoyed we have 3 universal formats, not one. Again, it forces publishers to investigate and make a choice. And they may simply consider it a waste of time or a bit too risky.
It would be awesome if the major foundations behind open source and Linux could agree on one and actively promote it. In this area, I think the "I know better" attitude should stop.
marcus Jul 25, 2018
Quoting: TheSHEEEP
Quoting: marcus
Quoting: TheSHEEEP So what if ffmpeg, the standalone executable, has a security leak in a very weird circumstance? The only thing I do in my application is use it for a very specific use case that is fully under my control.
What about a multiplayer game that uses avatars? This is not under your control. I can inject an infected jpeg that exploits the library there.
And that infected jpg will not get you anywhere, as neither the server you upload to nor the users who will see it later will ever "execute" the .jpg file on their computers.
There is a reason no game allows its players to just upload any file, and if a file is uploaded, it is at the very least converted or checked. Because if that isn't done, that would indeed be an exploit - but not one that would have been prevented by updating a library.

I think you lack some basic knowledge here of how exploits for media formats work. They are usually completely valid, or at least seemingly valid, files that trigger internal buffer overflows in the processing libraries. An infected jpeg is not an executable. It triggers code paths in the library responsible for rendering the jpeg, and this *will* be done on your machine.

Quoting: TheSHEEEP
Quoting: marcus This is not a myth. A number of recognized publications have analyzed exactly this problem of vulnerabilities introduced by bundled libraries. Take this one from the reputable Oakland security conference as an example: http://legacydirs.umiacs.umd.edu/~tdumitra/papers/OAKLAND-2015.pdf
That paper brings examples that completely defeat the point:
Quote: The user is running two versions of Adobe Reader, a default up-to-date version and a vulnerable version. The attacker convinces the user to install a Firefox add-on that looks benign. The malicious add-on has filesystem access through the XPCOM API [4]. It locates the vulnerable and patched versions of the Adobe Reader library (nppdf32.dll) and overwrites the patched version with the vulnerable one.
Note the part about convincing the user to install an add-on.

You concentrate on a point of the paper that was not even the subject of the discussion here. Look at section II.A, and specifically all the discussion around deployment delay. This is exactly the problem bundled libraries invite.
The inactive-program-versions part of the paper was not the subject of this discussion. It is only relevant for circumventing the auto-patching techniques of bundled libraries, which are clearly not in effect here (for Steam games or Snaps/Flatpaks).

Quoting: TheSHEEEP Nobody, ever, has been or will be hacked or in any way influenced by a game not using the most recent version of OpenSSL. Otherwise, these games would have updated their dependencies, been removed from Steam, made it to the news, etc.
This isn't true for every kind of program or application, obviously. Especially programs that are themselves important parts of operating systems are better served by automatically using the latest version.
But a game, and many other kinds of "non-system" software, certainly don't belong to that category.

Sorry, but this is just reckless and shows that you put a completely different weight on security than I do. I am not saying that I wouldn't use games because of this, but denying the risks and making such sweeping statements as you do ("Nobody, ever, has been or will be hacked [...]") is just reckless.

Quoting: TheSHEEEP
Quoting: marcus Let's agree to disagree here: you value "it just works" more than "it is secure", and for me it's the other way around. It's just a matter of different priorities.
I do value my security.
I just also realize that my security isn't in the least reduced by my games coming packaged with their own dependencies. Because there is no realistic way to abuse those dynamic libs lying in some /lib folder of my installed game if I don't go ahead and install shady software from untrusted sources.
What I don't do is put on a tinfoil hat, fearing and preparing for theoretically possible attacks that never did and never will happen to me.

See above. Call me all the funny things you like, but denying that these risks exist and are real is dangerous. You abuse those vulns simply by sending packets that trigger them to your game server's port (OpenSSL vulns), by hand-crafting user-provided images (libjpeg), or by hand-crafting user-provided strings like names (UI-rendering vulns).
Scoopta Jul 26, 2018
Can we please kill off snappy already? Canonical needs to stop making their own standards. Everyone else has decided to use Flatpak. It would be really nice not to have two universal package managers. They've pretty much dropped Mir because of their switch to GNOME, but snappy seems to be sticking around.