Latest Comments by TheSHEEEP
Snap! The new Minecraft launcher now has another easy way to be installed on Linux
25 Jul 2018 at 5:14 am UTC

Quoting: marcus
Quoting: TheSHEEEPBut what is the argument?

It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.
It does obviously not work under Windows. On Windows every application ships its special funny version of ffmpeg or libjpeg for example. Do you think this ever gets updated to fix bugs? I think adapting the Windows approach (and in the end all these snaps, flatpacks, docker images (lets subsume them under "containers") and whatnots are nothing else + a bit more isolation) is a bad thing. It encourages packaging libraries and then abandoning them because "the application works".
"The application works" is the one and only important criterion.
So what if some dependent library gets an update? If the update is important for your application, you can just update it to the recent version.
If the update isn't important to your application (which is the vast majority of cases), you don't need to do anything.

Quoting: marcusIn a distro, a library that has a vulnerability gets either updated or (lacking updates) removed. Sure this breaks you application if ABI breaks or if there is no fixed version, but for good reason! It removes a vulnerability from your system.
But we're not talking about distros here, we're talking about distributing applications. For the OS itself I actually think this variant of relying on external packages makes sense. But not for single applications that do not even want to become part of the system (like games).
So what if ffmpeg, the standalone executable, has a security hole in some very obscure circumstance? The only thing I do in my application is use it for a very specific use case that is fully under my control. I don't put my version of it into some system folder; only my application uses it. My application's way of using it isn't insecure at all. Or even better, I use the dynamic library, not the executable, so there's no way anyone can use it directly - unless you start copying my application's dependencies into your system folders, in which case it is clearly the user's fault and no longer my problem.
A very clear case of my application not needing the update.

Quoting: marcusThey suggest that breaking APIs and ABIs is fine. You can just package the right version and the library dev can then go on developing without having to backport fixes to the old code.
Breaking APIs and ABIs IS FINE. You can take a look at Windows API programming prior to Windows 8 to see the absurd legacy shit Windows devs had to deal with because Microsoft was afraid to break APIs.
And the second sentence is exactly the point - it is less work for everyone involved without any downside. Plus the library dev still has to backport if too many users of the library for some reason cannot switch to a newer version.

Quoting: marcusThis is a huge problem and going that route will invite many of the security holes we find in the Windows world into the Linux world.
This is mostly hearsay without any basis. The reason Linux has fewer breaches than Windows is first and foremost that it is a smaller target - if it ever becomes big, that will change in an instant - and secondly that no application can just go and change system files, run cronjobs, etc. without the user's approval.
The "annoying" thing of having to type your password each time is actually way more secure than Windows' popup where you have to click a button.
And lastly, Linux distros are developed differently, with many more eyes on the code. Vulnerabilities are more easily found and fixed this way. Open source development is, in the end, more secure.

Quoting: marcusIf you want to do this, be at least honest: statically link the libraries in. Because in the end, all those funny container formats are doing the same.... just not using static linking. Containers are already there. They are called static binaries. No dependency hell required.
This is terrible advice and shows that you have zero experience developing software for end users.
First of all, some software you cannot even link statically without violating its license (FFmpeg, for example, forces you to either give away the object files of your project or go open source yourself).
Also, software (usually) needs updating. That usually means your own code has changed, and sometimes it means a dependency has updated and you wanted that change.
If you link all your dependencies statically, each and every update will be - depending on the size of your dependencies - gigantic. While space is virtually unlimited, bandwidth unfortunately still isn't. And this propagates - from your repo (if you keep binaries there, maybe in Git LFS) to the build server, to the installer build server (if separate), to the CDN, to every user. Trust me, that is a big no-no.

The only benefit of static linking is that people won't know what libraries you use (if for some reason you want to hide that???) - so it is actually less honest than dynamic linking.

Snap! The new Minecraft launcher now has another easy way to be installed on Linux
24 Jul 2018 at 3:45 pm UTC

Quoting: Exidan
Quoting: TheSHEEEP
Quoting: Exidan
Quoting: PJgot to admit I have mixed feelings when I read news like this.
On the one hand it is awesome to hear about new ways of getting your software without hassle of hunting dependencies, configs etc and appreciate diversity in Linux.
But on the other hand - damn it, can't we agree on a single universal package format, not 3? It has this deb vs rpm stench all over it. Certainly I'd be happier if all the effort would went into making a single working, universally recognized format before adding new ones. Possibly Flatpak, even though personally I enjoy AppImages the most (due to its simplicity) - as it seems the most widely accepted across distros and does not bear the usual Canonical controversy mark...
I don't like how they handle libraries and dependencies. isn't the whole point of the linux ecosystem to avoid redundancy? if they ship every library with the snap (and they look first for the shipped library before looking into the system), they will end up with a whole lot of redundancy.
The worst point about Linux (well, next to the fragmentation) is that terrible idea of avoiding redundancy by assuming users just have the right versions of the right libraries.
It is completely impractical when you actually want to distribute software.

When you distribute software, your software was built against certain versions of certain libraries.
There is simply no way to guarantee that a user has those certain versions of those certain libraries on their computer. Nor is there a way to guarantee that there will always be your specific required version (architecture, version, etc.) available anywhere.
Nor is it realistic to expect devs to make sure that there is a PPA or whatever with exactly the versions they need.
Nor can you be sure that none of the symlinks on a user's system is somehow broken, pointing to a wrong version, etc.
Nor can you be sure that some update to a library won't break compatibility.
Nor can devs be expected to always make sure their software works with the most recent versions of all dependencies - devs must be able to move on to new projects, not maintain their old projects forever.
There are thousands of problems with this approach and it just barely works for open source projects IF and only if they are well maintained - for all others, it really doesn't. It is a "weakest link" approach - all goes well until the weakest link in the chain breaks - and "weakest link" approaches are generally terrible.

The ONLY way to make sure your distributed software works as intended is to distribute the exact versions of its dependencies with it. Or use Docker or something similar (though that isn't applicable in all cases).
I'd rather have some megabytes "wasted" if what I get is software that is guaranteed to work on my users' machines without hassle and without influencing anything else on their machines.

Oh, and because I know some tinfoil hat will come with the security argument:
If one of my dependencies has a security problem, I can update that dependency and forward that update to users. It is my responsibility as a dev to watch out for stuff like that.
But 95% of all software doesn't even do anything that could pose a security threat even if there was an exploit. And for the other 5% this happens so rarely that using a different approach doesn't come close to the benefits of distributing dependencies with your software.
hmm... I always liked the point of no redundancy about linux, and one of the strongest argument against using windows.
But what is the argument?
It saves space? In a time when disk space is virtually free (except for SSDs, disks are basically giveaways), this is a non-issue.
It obviously works for Windows, and binaries on Windows can be strangely large.

Quoting: Exidanand fragmentation on linux? only if your hdd is nearly full, really. I don't see other way to do it (besides the "linux way").
No, you misunderstood. I didn't mean the fragmentation of disks.
I mean the fragmentation of communities and development resources. A hundred distros, where just a handful would be so much better because they would have received much more dev resources.
Imagine if all the work that goes into all the small distros with a handful of users each were focused and organized to benefit only a few distros. Those few distros would be MUCH better than they are now, and very likely so flexible and configurable that none of the smaller distros would even be requested by anyone.
But no, everyone has to bake their own little ego cake, and the community as a whole as well as the spread of Linux suffers for it. Developers shy away from "all those distros" (as mistaken as that impression might be), manufacturers don't even get the idea to pick a distro for distribution because there is no "official" distribution, etc.

I very much recommend this article: https://www.dedoimedo.com/computers/linux-fragmentation-sum-egos.html

Snap! The new Minecraft launcher now has another easy way to be installed on Linux
24 Jul 2018 at 7:52 am UTC Likes: 6

Quoting: Exidan
Quoting: PJgot to admit I have mixed feelings when I read news like this.
On the one hand it is awesome to hear about new ways of getting your software without hassle of hunting dependencies, configs etc and appreciate diversity in Linux.
But on the other hand - damn it, can't we agree on a single universal package format, not 3? It has this deb vs rpm stench all over it. Certainly I'd be happier if all the effort would went into making a single working, universally recognized format before adding new ones. Possibly Flatpak, even though personally I enjoy AppImages the most (due to its simplicity) - as it seems the most widely accepted across distros and does not bear the usual Canonical controversy mark...
I don't like how they handle libraries and dependencies. isn't the whole point of the linux ecosystem to avoid redundancy? if they ship every library with the snap (and they look first for the shipped library before looking into the system), they will end up with a whole lot of redundancy.
The worst point about Linux (well, next to the fragmentation) is that terrible idea of avoiding redundancy by assuming users just have the right versions of the right libraries.
It is completely impractical when you actually want to distribute software.

When you distribute software, your software was built against certain versions of certain libraries.
There is simply no way to guarantee that a user has those certain versions of those certain libraries on their computer. Nor is there a way to guarantee that there will always be your specific required version (architecture, version, etc.) available anywhere.
Nor is it realistic to expect devs to make sure that there is a PPA or whatever with exactly the versions they need.
Nor can you be sure that none of the symlinks on a user's system is somehow broken, pointing to a wrong version, etc.
Nor can you be sure that some update to a library won't break compatibility.
Nor can devs be expected to always make sure their software works with the most recent versions of all dependencies - devs must be able to move on to new projects, not maintain their old projects forever.
There are thousands of problems with this approach and it just barely works for open source projects IF and only if they are well maintained - for all others, it really doesn't. It is a "weakest link" approach - all goes well until the weakest link in the chain breaks - and "weakest link" approaches are generally terrible.

The ONLY way to make sure your distributed software works as intended is to distribute the exact versions of its dependencies with it. Or use Docker or something similar (though that isn't applicable in all cases).
I'd rather have some megabytes "wasted" if what I get is software that is guaranteed to work on my users' machines without hassle and without influencing anything else on their machines.

Oh, and because I know some tinfoil hat will come with the security argument:
If one of my dependencies has a security problem, I can update that dependency and forward that update to users. It is my responsibility as a dev to watch out for stuff like that.
But 95% of all software doesn't even do anything that could pose a security threat even if there was an exploit. And for the other 5% this happens so rarely that using a different approach doesn't come close to the benefits of distributing dependencies with your software.

Egosoft have confirmed that X4: Foundations will be on Linux
23 Jul 2018 at 9:26 am UTC

Quoting: scaine...snip...

So, nah. I'll give up on this. Probably the worst £25 I've spent in a long time. X4 looks better, but if they don't do something about how clunky and unhelpful everything feels, I'll be staying clear.
Yeah, that was pretty much my experience as well (at first I thought I remembered X3, but those bad memories were actually from Rebirth).
Extremely clunky, extremely confusing, extremely buggy and very often unintentionally funny - and all of that waaayyy past release. One can only imagine the state it must have been in at launch.

Will definitely be awaiting some impressions of the released game before trying it.

Card-based strategy game Faeria gets massive “2.0” update, moves away from a F2P model
23 Jul 2018 at 6:34 am UTC

Quoting: JanneMy main issue is that you can't reset your account and start over. I played it a little when it first came out. Then forgot about it for a long time. Now, if I want to play it again, I can't restart with the tutorial; I'm dropped in where I was last time I played. But I no longer remember how to play - I need to redo that introduction to get back up to speed. I can't, so I no longer play.
Just play a few games against AI or puzzles and you'll get back into it.

Though I agree it would be nice if they just let you play the tutorial again.

Retro FPS 'Ion Maiden' is officially getting multiplayer, a delay in the final release and a limited run boxed copy
22 Jul 2018 at 8:44 am UTC

Quoting: Mblackwell*Phew* Glad you were able to find the right file. I wonder if we are able to build in such a way that it wouldn't be a problem. If I have time I'll point someone at your issue so maybe it won't happen to someone else.
There is a way. Rpath: https://en.wikipedia.org/wiki/Rpath
What I do whenever I write a program that is supposed to run on Linux is compile and link my own binaries with the rpath set to a specific path (like $ORIGIN/lib) and copy all* dynamic libs the binaries depend on into that lib folder, as well as the dynamic libs those libs depend on (recursively).
Then I use patchelf ( https://nixos.org/patchelf.html ) to adjust the rpath of each lib in that lib folder to point to $ORIGIN.
Done. Your binaries and the libs they depend on will look in that lib folder first before trying to load any potentially incompatible libs on the user's system.

If that sounds like quite a bit of work, it is, but I wrote a Python script for it that just gets executed after each build.

And all of this would be completely unnecessary if it was just standard on Linux (as it is on Windows) to look for libs first in your own folder before checking system folders. Oh, well.

* A few libraries you cannot just copy there, like OpenGL, the GCC runtime etc., as they are specific to each system. Wouldn't make much sense to package the build computer's NVidia version of libGL.so for all users ;)
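For illustration, a minimal sketch of such a post-build script in Python. The binary path, lib folder, and the system-library blacklist are illustrative assumptions, and it requires `patchelf` and `ldd` to be installed:

```python
import shutil
import subprocess
from pathlib import Path

# Libraries that must come from the user's system and never be bundled
# (driver- and toolchain-specific, see the libGL footnote above).
# This blacklist is illustrative, not exhaustive.
SYSTEM_PREFIXES = ("libGL", "libEGL", "libc.so", "ld-linux", "libgcc",
                   "libstdc++", "libm.so", "libdl.so", "libpthread")

def should_bundle(libname: str) -> bool:
    """True if a dependency belongs in the app's private lib folder."""
    return not any(libname.startswith(p) for p in SYSTEM_PREFIXES)

def bundle(binary: Path, libdir: Path) -> None:
    """Copy non-system dependencies next to the binary and patch rpaths."""
    libdir.mkdir(parents=True, exist_ok=True)
    # Make the binary search $ORIGIN/lib before the system paths.
    subprocess.run(["patchelf", "--set-rpath", "$ORIGIN/lib", str(binary)],
                   check=True)
    # ldd prints the full transitive closure as "name => path (addr)" lines,
    # so a single pass also covers the libs that the libs depend on.
    out = subprocess.run(["ldd", str(binary)], capture_output=True,
                         text=True, check=True).stdout
    for line in out.splitlines():
        if "=>" not in line:
            continue  # vdso / dynamic loader lines have no "=>"
        name, _, rest = line.strip().partition(" => ")
        path = rest.split()[0] if rest.strip() else ""
        if path.startswith("/") and should_bundle(name):
            dest = libdir / name
            shutil.copy2(path, dest)
            # Bundled libs resolve their own deps from the same folder.
            subprocess.run(["patchelf", "--set-rpath", "$ORIGIN", str(dest)],
                           check=True)

# Hypothetical usage after a build:
# bundle(Path("build/mygame"), Path("build/lib"))
```

The blacklist is the fiddly part in practice: anything tied to the driver or the C runtime has to be resolved on the user's machine.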

The Linux version of Civilization VI should get cross-platform online play in the next few weeks
21 Jul 2018 at 1:05 pm UTC

Quoting: KristianAn int/int fraction? That sounds like a Ratio in Haskell: http://hackage.haskell.org/package/base-4.11.1.0/docs/Data-Ratio.html
Yeah, pretty much.

The Linux version of Civilization VI should get cross-platform online play in the next few weeks
21 Jul 2018 at 6:46 am UTC Likes: 1

Ahh, the good old cross platform floats.
Many ways around that problem, but every workaround is kinda "meh".
I've seen special float libraries (which are then a wee bit slower), some decide to ditch floats entirely for the cross-platform communication part (IMO the best solution, if possible), and some just use an int/int fraction and convert where necessary (which has the advantage of much better precision).

But yes, if that isn't planned from the beginning, it IS a problem for cross-platform play later on.
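The int/int idea is easy to sketch with Python's `fractions.Fraction` (the same concept as Haskell's `Data.Ratio`); the starting values below are made up for illustration:

```python
from fractions import Fraction

# 1000 fixed simulation ticks with exact rational arithmetic.
# Every platform computes the bit-identical result - no float drift.
pos = Fraction(1, 3)      # position as an int/int fraction
vel = Fraction(7, 10)     # velocity per tick
for _ in range(1000):
    pos += vel
# Exact: 1/3 + 1000 * 7/10 = 2101/3, everywhere.

# The naive float version accumulates a rounding error on every tick,
# and in C/C++ even the per-tick results can differ across platforms
# (x87 vs SSE, FMA contraction, compiler flags).
fpos = 1 / 3
for _ in range(1000):
    fpos += 0.7
```

Keeping the authoritative simulation state rational and converting to float only for rendering is what makes the lockstep networking deterministic.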

Stardew Valley's Multiplayer Update will be out with full Linux support on August 1st
21 Jul 2018 at 6:27 am UTC

Quoting: PhiladelphusFun as this game was before, it's even better in multiplayer. Been playing with a friend since the beta came out, we've made it through our first year so far and have been having a blast.
Yeah, similar here.
The pacing of the game is just really nice for multiplayer. You can decide to do something together throughout the day or split up and only do the farming stuff together in the morning and/or evening.

Slime Rancher adds in happy little helper drones
19 Jul 2018 at 10:53 am UTC

Quoting: liamdawe
Quoting: TheSHEEEPThe game is just ridiculously cute. I played it more than a year ago, and the cute sounds, art and animation just bring a smile to your face.
It's a bit like watching a video of puppies.

Mechanically, I found the game to be rather shallow. There is great breadth here and that can keep you busy for a while, but extremely little depth.
Maybe not unexpected and maybe that fits the cute style, but one should not expect this game to last for a very long time.
Have you played it since any of the recent content updates? Some of them have been quite big!
No, I didn't play it recently, just picked up news of some updates.

All of them introduced basically just "more". More slimes, more recipes, more areas.
That's what I mean by breadth.
None of that makes any of the game mechanics more involved or deep than "pair blob A + B, get their valuable shitplort, build more stuff for even more plort variants".

At some point, you just lose interest, no matter how many more kinds of cute slimes the game throws at you.
Though that point was late enough even a year back to get my money's worth from the game.