Latest Comments by TheSHEEEP
Nightdive Studios have released Forsaken Remastered with Linux support
31 Jul 2018 at 7:46 pm UTC
Well, that was a blast from the past.
I played the original on the N64 - and I did quite like it.
Space RPG 'Star Traders: Frontiers' to leave Early Access next week
26 Jul 2018 at 12:49 pm UTC
MY BODY IS READY!
Seriously, these guys have never disappointed.
The big Steam Client update is out for everyone with the new Steam Chat
25 Jul 2018 at 11:05 am UTC
Quoting: demon
This gives you the old interface where you can actually go offline for the friends-thingy.

You can still go invisible in the new chat - and set notifications so they won't annoy you if someone starts playing some game.
Snap! The new Minecraft launcher now has another easy way to be installed on Linux
25 Jul 2018 at 10:40 am UTC
Quoting: marcus
How do I update a library in a SNAP/FLATPAK/docker container manually?

I don't know, but I was under the impression that a library installed by SNAP for my program is just available to my program, not to others. I may be wrong, of course. In which case, I really don't see what makes SNAP different from the good old apt-get, just installing stuff globally for everyone.
Quoting: marcus
How do i keep track of all the different installed versions of a library in different applications?

If you install a game that has some /lib folder with libraries it uses and puts its dependencies there, nothing will be "installed" on your system to be abused by other programs. Someone would manually have to replace the system version with that game's old version (ignoring for a moment that you usually can't just do that), and if someone has the access rights to do THAT, they'd also have the power to just replace the system version with their own binary entirely, making the whole point irrelevant.
Quoting: marcus
And that infected jpg will not get you anywhere, as neither the server you upload to nor the users that will see it later will ever "execute" the .jpg file on their computers.
Quoting: TheSHEEEP
So what if ffmpeg, the standalone executable, has a security leak in a very weird circumstance? The only thing I do in my application is using it for a very specific use case that is fully under my control.

What about a multiplayer game that uses avatars? This is not under your control. I can inject an infected jpeg that exploits the library there.
There is a reason no game allows its players to just upload any file, and if a file is uploaded, it is at the very least converted or checked. Because if that isn't done, that would indeed be an exploit. But not one that would have been prevented by updating a library.
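Roughly what "converted or checked" looks like in practice - a minimal sketch, assuming Pillow is available; the function name and size limit are made up for illustration:

```python
# Minimal sketch of "convert or check" for uploaded avatars.
# Assumes Pillow is installed; MAX_SIDE and the function name are made up.
from io import BytesIO
from PIL import Image

MAX_SIDE = 256  # clamp avatars to a sane size

def sanitize_avatar(raw_bytes: bytes) -> bytes:
    """Decode the upload and re-encode it as a fresh PNG.

    Re-encoding means the stored file is one our own encoder produced,
    so any payload hidden in malformed headers or trailing data is gone.
    """
    probe = Image.open(BytesIO(raw_bytes))
    probe.verify()  # reject anything that is not a valid image at all
    img = Image.open(BytesIO(raw_bytes))  # verify() invalidates the object, reopen
    img = img.convert("RGBA")
    img.thumbnail((MAX_SIDE, MAX_SIDE))
    out = BytesIO()
    img.save(out, format="PNG")
    return out.getvalue()
```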
Quoting: marcus
This is not a myth. A number of recognized publications analyzed exactly this problem of the vulnerabilities introduced by bundled libraries. Take this one from the reputable OAKLAND security conference as an example: http://legacydirs.umiacs.umd.edu/~tdumitra/papers/OAKLAND-2015.pdf [External Link]

That paper brings examples that completely defeat the point:
The user is running two versions of Adobe Reader, a default up-to-date version and a vulnerable version. The attacker convinces the user to install a Firefox add-on that looks benign. The malicious add-on has filesystem access through the XPCOM API [4]. It locates the vulnerable and patched versions of the Adobe Reader library (nppdf32.dll) and overwrites the patched version with the vulnerable one.

Note the bold part (the user being convinced to install the add-on).
As soon as the user willingly says "yes" to something, aka gives access rights, everything can happen.
Instead of swapping the patched version of some library for the vulnerable one, the add-on might as well have just put its own version into the system (which would have been a way better idea from the hacker's perspective). That there was a non-patched version somewhere on the user's PC is completely irrelevant as there would be countless other ways.
That the paper tries to put the blame on a fully irrelevant fact just shows that an example has been picked to "prove" the point they wanted to prove to begin with. Just another example of biased "research".
There is no protection against users screwing up their own security.
Quoting: marcus
Here are some examples from some of the games shipped by Steam that I have installed:
ShadowOfMordor, Alien Isolation, Life is Strange, Borderlands 2 --- OpenSSL 1.0.1 14 Mar 2012
Expeditions: Conquistador, Element4L, TIS-100, Oxenfree --- OpenSSL 0.9.8o (32 bit) and 1.0.0g (64 bit)
Pillars Of Eternity --- OpenSSL 1.0.0g
Talos Principle --- OpenSSL 1.0.1i (Last patch release of 1.0.1 was 'u')
Millie The Centipede --- OpenSSL 0.9.8o
These games use the network (at least Borderlands 2 does). Same goes for the steam API here btw. They should really provide security updates for it ... No sane distribution still ships OpenSSL 1.0.1. They all moved on to 1.0.2. Steam itself uses 1.0.2j btw. We are at 'o' as the current patch release --- which is also installed by my distro.

This proves exactly my point.
Nobody, ever, has been or will be hacked or in any way influenced by a game not using the most recent version of OpenSSL. Otherwise, these games would have updated their dependencies, been removed from Steam, made it to the news, etc.
This isn't true for every kind of program or application, obviously. Especially programs that are themselves important parts of operating systems are better served by automatically using the latest version.
But a game and many other kinds of "non-system" software certainly don't belong to that category.
Quoting: marcus
Let's agree to disagree here: You value "It just works" more than "it is secure" and I do the other way around. It's just a matter of different priorities.

I do value my security.
I just also realize that my security isn't in the least reduced by my games coming packaged with their own dependencies. Because there is no realistic way to abuse those dynamic libs lying in some /lib folder of my installed game if I don't go ahead and install shady software from untrusted sources.
What I don't do is put on a tinfoil hat, fearing and preparing for theoretically possible attacks that never did and never will happen to me.
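For what it's worth, version strings like the ones marcus lists can be read straight out of the bundled libraries, since OpenSSL embeds them as plain text. A rough sketch - the Steam path below is an assumption, adjust it to your own install:

```python
# Rough sketch: find bundled libssl copies under a Steam library and print the
# embedded "OpenSSL x.y.z date" version string (it is stored as plain text).
# The path is an assumption - point it at your own steamapps/common.
import re
from pathlib import Path

STEAM_COMMON = Path.home() / ".steam/steam/steamapps/common"
VERSION_RE = re.compile(rb"OpenSSL \d+\.\d+\.\d+[a-z]? +\d{1,2} [A-Z][a-z]{2} \d{4}")

for lib in STEAM_COMMON.rglob("lib*ssl*.so*"):
    match = VERSION_RE.search(lib.read_bytes())
    if match:
        print(f"{lib}: {match.group().decode()}")
```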
Snap! The new Minecraft launcher now has another easy way to be installed on Linux
25 Jul 2018 at 5:14 am UTC
Quoting: marcus"The application works" is the one and only important criterium.Quoting: TheSHEEEPBut what is the argument?It does obviously not work under Windows. On Windows every application ships its special funny version of ffmpeg or libjpeg for example. Do you think this ever gets updated to fix bugs? I think adapting the Windows approach (and in the end all these snaps, flatpacks, docker images (lets subsume them under "containers") and whatnots are nothing else + a bit more isolation) is a bad thing. It encourages packaging libraries and then abandoning them because "the application works".
It saves space? In a time when space is disk virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.
So what if some dependent library gets an update? If the update is important for your application, you can just update it to the recent version.
If the update isn't important to your application (which is the vast majority of cases), you don't need to do anything.
Quoting: marcus
In a distro, a library that has a vulnerability gets either updated or (lacking updates) removed. Sure this breaks your application if the ABI breaks or if there is no fixed version, but for good reason! It removes a vulnerability from your system.

But we're not talking about distros here, we're talking about distributing applications. For the OS itself I actually think this variant of relying on external packages makes sense. But not for single applications that do not even want to become part of the system (like games).
So what if ffmpeg, the standalone executable, has a security leak in a very weird circumstance? The only thing I do in my application is using it for a very specific use case that is fully under my control. I don't put my version of it into some system folder, only my application uses it. My application's way of using it isn't insecure at all. Or even better, I use the dynamic library, not the executable, so there's no way anyone can use it directly - unless you start copying my application's dependencies into your system folders, in which case it is clearly the user's fault and no longer my problem.
A very clear case of my application not needing the update.
Quoting: marcus
They suggest that breaking APIs and ABIs is fine. You can just package the right version and the library dev can then go on developing without having to backport fixes to the old code.

Breaking APIs and ABIs IS FINE. You can take a look at Windows API programming prior to Windows 8 to see the absurd legacy shit Windows devs had to deal with because Microsoft was afraid to break APIs.
And the second sentence is exactly the point - it is less work for everyone involved without any downside. Plus the library dev still has to backport if too many users of the library for some reason cannot switch to a newer version.
Quoting: marcus
This is a huge problem and going that route will invite many of the security holes we find in the Windows world into the Linux world.

This is mostly hearsay without any basis. The reason Linux has fewer breaches than Windows is first and foremost that it is a smaller target - if it ever becomes big, that will change in an instant - and the second part is that no application can just go and change system files, run cronjobs, etc. without the user's approval.
The "annoying" thing of having to type your password each time is actually way more secure than Windows' popup where you have to click a button.
And lastly, Linux distros are developed differently, with a lot more different eyes on the code. Leaks are more easily found and fixed this way. Open source development is, in the end, more secure.
Quoting: marcus
If you want to do this, be at least honest: statically link the libraries in. Because in the end, all those funny container formats are doing the same.... just not using static linking. Containers are already there. They are called static binaries. No dependency hell required.

This is terrible advice and shows that you have zero experience developing software for end users.
First of all, some software you cannot even link statically without breaching its license (FFmpeg, for example, forces you to give away the object files of your project, or go open source yourself - check this [External Link]).
Also, software (usually) needs updating. That usually means your own code has changed, and sometimes it means a dependency has updated and you wanted that change.
If you link all your dependencies statically, each and every update will be - depending on the size of your dependencies - gigantic. While space is virtually unlimited, bandwidth unfortunately still isn't. And this propagates - from your repo (if you keep binaries there, maybe in Git LFS) to the build server, to the install build server (if separate), to the CDN, to every user. Trust me, that is a big no-no.
The only benefit of static linking is that people won't know what libraries you use (if for some reason you want to hide that???) - so it is actually less honest than dynamic linking.
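To see that transparency in practice: ldd will happily tell you which shared objects a dynamically linked game binary actually resolves and whether each comes from the game's own lib folder or from the system - something a static binary can't show you. A quick sketch, the paths are placeholders:

```python
# Quick sketch: list which shared objects a dynamically linked binary resolves
# and whether each comes from the game's own lib folder or from the system.
# Both paths are placeholders - point them at a game you actually have.
import subprocess
from pathlib import Path

game_dir = Path.home() / "Games/SomeGame"      # assumption
binary = game_dir / "SomeGame.x86_64"          # assumption

ldd_out = subprocess.run(["ldd", str(binary)], capture_output=True, text=True).stdout
for line in ldd_out.splitlines():
    if "=>" not in line:
        continue
    name, _, resolved = line.partition("=>")
    path = resolved.strip().split(" ")[0]
    origin = "bundled" if str(game_dir) in path else "system"
    print(f"{name.strip():30} -> {path} ({origin})")
```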
Snap! The new Minecraft launcher now has another easy way to be installed on Linux
24 Jul 2018 at 3:45 pm UTC
Quoting: Exidan
But what is the argument?
Quoting: TheSHEEEP
hmm... I always liked the point of no redundancy about linux, and one of the strongest arguments against using windows.
Quoting: Exidan
The worst point about linux (well, next to the fragmentation) is that terrible idea of avoiding redundancy by assuming you just have to have the right versions of the right libraries.
Quoting: PJ
got to admit I have mixed feelings when I read news like this. I don't like how they handle libraries and dependencies. isn't the whole point of the linux ecosystem to avoid redundancy? if they ship every library with the snap (and they look first for the shipped library before looking into the system), they will end up with a whole lot of redundancy.
On the one hand it is awesome to hear about new ways of getting your software without the hassle of hunting dependencies, configs etc and appreciate diversity in Linux.
But on the other hand - damn it, can't we agree on a single universal package format, not 3? It has this deb vs rpm stench all over it. Certainly I'd be happier if all the effort went into making a single working, universally recognized format before adding new ones. Possibly Flatpak, even though personally I enjoy AppImages the most (due to its simplicity) - as it seems the most widely accepted across distros and does not bear the usual Canonical controversy mark...
It is completely impractical when you actually want to distribute software.
When you distribute software, your software was built against certain versions of certain libraries.
There is simply no way to guarantee that a user has those certain versions of those certain libraries on their computer. Nor is there a way to guarantee that there will always be your specific required version (architecture, version, etc.) available anywhere.
Nor is it realistic to expect devs to make sure that there is a PPA or whatever with exactly the versions they need.
Nor can you be sure that none of the symlinks on a user's system is somehow broken, pointing to a wrong version, etc.
Nor can you be sure that some update to a library won't break compatibility.
Nor can devs be expected to always make sure their software works with the most recent versions of all dependencies - devs must be able to move on to new projects, not maintain their old projects forever.
There are thousands of problems with this approach and it just barely works for open source projects IF and only if they are well maintained - for all others, it really doesn't. It is a "weakest link" approach - all goes well until the weakest link in the chain breaks - and "weakest link" approaches are generally terrible.
The ONLY way to make sure your distributed software works as intended is to distribute the exact versions of dependencies with it. Or use Docker or smth. similar (though that isn't applicable for all cases).
I'd rather have some megabytes "wasted", if what I get is software that is guaranteed to work on my users' machines without a hassle and without influencing anything else on the users' machines.
Oh, and because I know some tinfoil hat will come with the security argument:
If one of my dependencies has a security problem, I can update that dependency and forward that update to users. It is my responsibility as a dev to watch out for stuff like that.
But 95% of all software doesn't even do anything that could pose a security threat even if there was an exploit. And for the other 5% this happens so rarely that using a different approach doesn't come close to the benefits of distributing dependencies with your software.
It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.
Quoting: Exidan
and fragmentation on linux? only if your hdd is nearly full, really. I don't see any other way to do it (besides the "linux way").

No, you misunderstood. I didn't mean the fragmentation of disks.
I mean the fragmentation of communities and development resources. A hundred distros, where just a handful would be so much better, because each would have received far more dev resources.
Imagine if all the work that goes into the many small distros with a handful of users each were focused and organized to benefit only a few distros. Those few distros would be MUCH better than they are now, and very likely so flexible and configurable that none of the smaller distros would even be requested by anyone.
But no, everyone has to bake their own little ego cake, and the community as a whole as well as the spread of Linux suffer from it. Developers shy away from "all those distros" (as mistaken as that impression might be), manufacturers don't even consider picking a distro to ship because there is no "official" distribution, etc.
I very much recommend this article: https://www.dedoimedo.com/computers/linux-fragmentation-sum-egos.html [External Link]
Snap! The new Minecraft launcher now has another easy way to be installed on Linux
24 Jul 2018 at 7:52 am UTC Likes: 6
Quoting: Exidan
The worst point about linux (well, next to the fragmentation) is that terrible idea of avoiding redundancy by assuming you just have to have the right versions of the right libraries.
Quoting: PJ
got to admit I have mixed feelings when I read news like this. I don't like how they handle libraries and dependencies. isn't the whole point of the linux ecosystem to avoid redundancy? if they ship every library with the snap (and they look first for the shipped library before looking into the system), they will end up with a whole lot of redundancy.
On the one hand it is awesome to hear about new ways of getting your software without the hassle of hunting dependencies, configs etc and appreciate diversity in Linux.
But on the other hand - damn it, can't we agree on a single universal package format, not 3? It has this deb vs rpm stench all over it. Certainly I'd be happier if all the effort went into making a single working, universally recognized format before adding new ones. Possibly Flatpak, even though personally I enjoy AppImages the most (due to its simplicity) - as it seems the most widely accepted across distros and does not bear the usual Canonical controversy mark...

It is completely impractical when you actually want to distribute software.
When you distribute software, your software was built against certain versions of certain libraries.
There is simply no way to guarantee that a user has those certain versions of those certain libraries on their computer. Nor is there a way to guarantee that there will always be your specific required version (architecture, version, etc.) available anywhere.
Nor is it realistic to expect devs to make sure that there is a PPA or whatever with exactly the versions they need.
Nor can you be sure that none of the symlinks on a user's system is somehow broken, pointing to a wrong version, etc.
Nor can you be sure that some update to a library won't break compatibility.
Nor can devs be expected to always make sure their software works with the most recent versions of all dependencies - devs must be able to move on to new projects, not maintain their old projects forever.
There are thousands of problems with this approach and it just barely works for open source projects IF and only if they are well maintained - for all others, it really doesn't. It is a "weakest link" approach - all goes well until the weakest link in the chain breaks - and "weakest link" approaches are generally terrible.
The ONLY way to make sure your distributed software works as intended is to distribute the exact versions of dependencies with it. Or use Docker or smth. similar (though that isn't applicable for all cases).
I'd rather have some megabytes "wasted", if what I get is software that is guaranteed to work on my users' machines without a hassle and without influencing anything else on the users' machines.
Oh, and because I know some tinfoil hat will come with the security argument:
If one of my dependencies has a security problem, I can update that dependency and forward that update to users. It is my responsibility as a dev to watch out for stuff like that.
But 95% of all software doesn't even do anything that could pose a security threat even if there was an exploit. And for the other 5% this happens so rarely that using a different approach doesn't come close to the benefits of distributing dependencies with your software.
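For the record, "distribute the exact versions" doesn't have to mean anything fancy - a tiny launcher that puts the bundled libs first is already enough. A minimal sketch, all paths and names are placeholders, not any real game:

```python
#!/usr/bin/env python3
# Minimal launcher sketch: prefer the libraries bundled next to the game binary.
# Folder layout and binary name are placeholders.
import os
import sys
from pathlib import Path

here = Path(__file__).resolve().parent
bundled = here / "lib"
binary = here / "bin" / "game.x86_64"

env = dict(os.environ)
# Put the bundled libs first so the exact versions the game was built against
# win over whatever the distro happens to ship.
env["LD_LIBRARY_PATH"] = os.pathsep.join(
    [str(bundled)] + ([env["LD_LIBRARY_PATH"]] if env.get("LD_LIBRARY_PATH") else [])
)
os.execve(str(binary), [str(binary)] + sys.argv[1:], env)
```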
Egosoft have confirmed that X4: Foundations will be on Linux
23 Jul 2018 at 9:26 am UTC
Quoting: scaine
...snip...

Yeah, that was pretty much my experience as well (at first I thought I remembered X3, but those bad memories were actually from Rebirth).
So, nah. I'll give up on this. Probably the worst £25 I've spent in a long time. X4 looks better, but if they don't do something about how clunky and unhelpful everything feels, I'll be staying clear.
Extremely clunky, extremely confusing, extremely buggy and very often unintentionally funny - and all of that waaayyy past release. One can only imagine the state it must have been in at launch.
Will definitely be awaiting some impressions of the released game before trying it.
Card-based strategy game Faeria gets massive “2.0” update, moves away from a F2P model
23 Jul 2018 at 6:34 am UTC
Quoting: Janne
My main issue is that you can't reset your account and start over. I played it a little when it first came out. Then forgot about it for a long time. Now, if I want to play it again, I can't restart with the tutorial; I'm dropped in where I was last time I played. But I no longer remember how to play - I need to redo that introduction to get back up to speed. I can't, so I no longer play.

Just play a few games against AI or puzzles and you'll get back into it.
Though I agree it would be nice if they just let you play the tutorial again.
Retro FPS 'Ion Maiden' is officially getting multiplayer, a delay in the final release and a limited run boxed copy
22 Jul 2018 at 8:44 am UTC
Quoting: Mblackwell
*Phew* Glad you were able to find the right file. I wonder if we are able to build in such a way that it wouldn't be a problem. If I have time I'll point someone at your issue so maybe it won't happen to someone else.

There is a way. Rpath: https://en.wikipedia.org/wiki/Rpath [External Link]
What I do whenever I write some program that is supposed to run on Linux is to compile and link my own binaries with Rpath set to a specific path (like $ORIGIN/lib) and copy all* dynamic libs the binaries depend on into that lib folder, as well as the dynamic libs those libs depend on (recursively).
Then I use patchelf ( https://nixos.org/patchelf.html [External Link] ) to adjust the Rpath of each lib in that /lib folder to point to $ORIGIN.
Done. Your binaries and the libs they depend on will look in that lib folder first before trying to load any potentially incompatible libs on the user's system.
If that sounds like quite a bit of work, it is, but I wrote a Python script for it that just gets executed after each build.
And all of this would be completely unnecessary if it was just standard on Linux (as it is on Windows) to look for libs first in your own folder before checking system folders. Oh, well.
* A few libraries you cannot just copy there, like OpenGL, GCC etc., as they depend on the user's system. Wouldn't make too much sense to package the build computer's NVidia version of libGL.so for all users ;)
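For the curious, that script boils down to something like the following - a stripped-down sketch of the approach, not the actual script, and the skip-list is deliberately incomplete (see the footnote above):

```python
# Stripped-down sketch of the bundling step: collect the shared libraries a
# binary needs (ldd already resolves them transitively), copy them into a lib
# folder next to the binary, and point every Rpath at $ORIGIN via patchelf.
# The skip-list is deliberately incomplete - see the footnote about libGL & co.
import shutil
import subprocess
from pathlib import Path

SKIP = ("libGL", "libc.so", "ld-linux", "libpthread", "libdl", "libm.so")

def collect_deps(binary: Path) -> dict:
    """Return {soname: resolved path} for everything ldd can resolve."""
    deps = {}
    out = subprocess.run(["ldd", str(binary)], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "=>" not in line:
            continue
        name, _, rest = line.partition("=>")
        path = rest.strip().split(" ")[0]
        name = name.strip()
        if path.startswith("/") and not name.startswith(SKIP):
            deps[name] = Path(path)
    return deps

def bundle(binary: Path) -> None:
    lib_dir = binary.parent / "lib"
    lib_dir.mkdir(exist_ok=True)
    for name, src in collect_deps(binary).items():
        dst = lib_dir / name
        shutil.copy2(src, dst)
        subprocess.run(["patchelf", "--set-rpath", "$ORIGIN", str(dst)], check=True)
    subprocess.run(["patchelf", "--set-rpath", "$ORIGIN/lib", str(binary)], check=True)

if __name__ == "__main__":
    bundle(Path("build/mygame"))  # placeholder binary path
```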