
I recently talked about the Steam release of Smith and Winston but I didn't realise until late last night, that the developer actually made a very interesting blog post about supporting multiple platforms.

Interesting enough that it warranted an extra post to talk a little about it. Why? Well, a bit of a situation started when game porter Ethan Lee made a certain Twitter post, a joke aimed at developers who see Linux as "too niche" while practically falling over themselves to get their games on every other new platform that appears. This Twitter post was shared around (a lot), some developers (like this) ended up mentioning how Linux doesn't sell a lot of games, and it continued spreading like wildfire.

There have been a lot of counter-arguments for Linux too, like this and this and this and this, and a nice one thrown our way too. Oh, and we even spoke to Tommy Refenes, who said the next SMB should come to Linux too, so that was awesome. Additionally, Ethan Lee also wrote up a post about packaging Linux games, worth a read if you're new to packaging for Linux.

Where was I again? Right, the blog post from the developer of Smith and Winston about how they support Windows, Mac and Linux. They go over details about how they do so, from using SDL2 which they say "takes 90% of the pain away from platform support" to the cross-platform rendering library bgfx. It's just a really interesting insight into how developing across multiple platforms doesn't have to be overly difficult.

I especially liked these parts:

I’ve been writing games and engines for 30+ years so none of this is new, I have a lot of experience. But you only get the experience by doing it and not making excuses.

By forcing the game through different compilers (Visual C++, Clang and GCC) you find different code bugs (leave the warnings on max). By forcing the runtime to use different memory allocators, threading libraries and rendering technologies you find different runtime bugs. This makes your code way more stable on every platform. Even if you never deploy your code to a non windows platform just running it on Linux or macOS will expose crashes instantly that rarely appear on Windows. So delivering a quality product on the dominant platform is easier if you support the minor platforms as well.

They also clearly mention that they might not even make their money back on the Linux port of Smith and Winston. However, they're clear that the other benefits (code quality, easier porting to other platforms and so on) help make up for it. This is a similar point to one the people from Stardock made on Reddit.

See the post here in full. If you wish to check out their game, Smith and Winston, it's available on Steam in Early Access.

cprn 10 January 2019 at 5:22 pm UTC
TL;DR: Using only one environment gives you an idea of what works in that specific environment. Only by fully understanding another environment can you form an opinion about how good whatever you've used until now really is, and which of your practices are good in general as opposed to just being good in isolation.

I worked on a 2D Python pet project with a weird glitch that every now and then made all animations jump a pixel for 2-3 frames. IMHO it looked good and gave the game a unique feeling, but the original developer wasn't happy about how it occasionally f*cked up his precious pixel perfect collisions. I ported the rendering bits of what he used to PySDL2 for an entirely unrelated reason - to see how it'd impact performance - but surprisingly that not only got rid of the glitch but also gave a bunch of nice warnings revealing that his XY math returned floats (and as we all know, or at least should know, floats that are supposed to represent integers aren't always equal to those integers). I'm not going to dis the formerly used rendering code, which is part of a pretty okay library, but if I had never ported it to something else we wouldn't have known what caused the glitch, because the formerly used library just floored all pixel coordinates. The argument my colleague gave for not going with SDL2 from the very beginning was the one of limited resources in a one-man army studio, mostly time, and a reluctance to waste them on learning "new stuff" when he could just re-use those little bits he wrote himself for some other game, thus having something he knew by heart and could debug easily, etc.
F.Ultra 10 January 2019 at 8:11 pm UTC
Beamboom: I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run smoother on a different set of libraries if the bug is related to that other library?

I don't get this?

This is a good question: at a very basic level, different compilers warn about different things. Although the C/C++ standard defines very clearly what counts as an error, compilers differ in what they merely warn about. Often this isn't that important and isn't going to save your bacon, but a good developer listens to their compiler. There is a reason the compiler isn't happy, and it can reveal false assumptions; you'll find yourself saying "I have no idea how that ever ran".

Different compilers also have different standard libraries, and these can be more or less forgiving. Visual C++'s STL (Standard Template Library) has extensive debugging output in debug builds that catches errors quickly and precisely, at the expense of debug builds being very slow. This also uses more memory, and changed memory usage can hide or expose different kinds of bugs. So macOS and Linux not having these is "good" in the sense of being different, and in this case the difference is what's valuable (I am not saying one compiler is better than another, just different).

Another big difference between standard libraries is the way they allocate memory within the application heap. The OS gives your application a chunk of memory that only it can use, called a heap, and the application allocates and frees blocks within that heap. Each standard library has different algorithms for how allocate and free work, and you can even replace them with your own if you are brave/foolish/clever/stupid/genius. So when you allocate a piece of memory, use it, free it, and then illegally use this piece of freed memory, you get different behaviour. Anecdotally, on Windows you get away with this a lot more often than on UNIX, where you will more often (but not always) crash almost instantly, making it easier to track down the problem. On embedded platforms (consoles for example) where memory is tighter you also get different behaviour, as the OS vendor will tweak the memory allocator to be more aggressive about recycling memory than on a desktop where "memory is limitless".

Hope that helps and vaguely makes sense?

Matches 100% my experience from 30+ years of coding. Especially fun with the "change of memory regions" is when the code crashes reliably, but when you add some simple printf()s to write out some values just before the crash happens, it stops crashing.

And I'd say that the single most important thing that happened to me when I switched from Windows to Linux back in the day was getting access to Valgrind. That is one very very good tool!
aluminumgriffin 10 January 2019 at 10:48 pm UTC
Beamboom: I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run smoother on a different set of libraries if the bug is related to that other library?

I don't get this?

Each dialect, compiler, and platform behaves somewhat differently - this is pretty much why every coder has their favourite compiler.

Three things that tend to bite people hard if they don't regularly jump platforms are memory alignment, sizes of datatypes, and bit-order/byte-order.

Using a spoiler tag to hide a slightly more in-depth explanation of two of the issues.

The "int" datatype: depending on which compiler, dialect, version of the language, and platform you're on, it varies in size. Historically it was "the native word size of the platform", meaning that on 16-bit machines it should be 16-bit, on 32-bit platforms 32-bit, and on 64-bit platforms 64-bit. However, this does not hold true today, since it is now only guaranteed to hold -32768 to 32767 (16-bit, signed).
Note that this already makes it weird for machines with a word size smaller than 16 bits; to make it even funnier, on 32-bit machines it can be either 16-bit or 32-bit, and on 64-bit machines it is normally (almost - but not quite - always) 32-bit as well.
So, an "int" can only be assumed to be "at least 16-bit", and on embedded you really should read the datasheets and specs anyway.
Add to that that you can often change the compiler's behaviour by selecting different methods of packing.
And yes, "int" is the most common datatype.

Memory alignment:
Take the simple declaration "int a, b;" and tell me how that is arranged in memory. Is it 4 bytes (16-bit × 2), 8 bytes (32-bit × 2), or 16 bytes (64-bit × 2; this might happen either if the int is 64-bit or if it is aligned to match memory boundaries)? Also, does 'a' come before 'b' in physical memory? Is there something between them (padding and such)? And if 'a' overflows in a way that isn't caught, will that alter 'b' or cause a memory access violation?

Even funnier is when the runtime of your compiler does not exactly match the settings of the specific build of the libraries you're using (so yes, you can end up with b=a+a; working, but calling a function that does b=a+a; crashing - even when fed the exact same datatypes and values).

Long story short - every place where you've made an assumption can bite you when you jump platforms (this is why you often see a "datatypes.h" in multi-platform projects)

How this makes stuff run smoother: assume an overflowing variable isn't caught by the runtime of one compiler but instead overwrites whatever is next in memory. With this compiler it will corrupt the data in the following variable, and this corruption can cause undesired behaviour in a place that isn't even near the code that overflowed (even worse, the undesired behaviour can surface in perfectly correct code). But if you try the same code in a compiler with a stricter runtime, it will crash at the overflow itself.
fabertawe 11 January 2019 at 12:14 pm UTC
I liked the look of this game anyway and will definitely be buying when it's out of Early Access (or earlier, funds and time permitting).
F.Ultra 12 January 2019 at 1:29 pm UTC
Also 100% hilarious to read all the mansplaining tweets from Windows devs to Ethan Lee "informing" him on how impossible it is to develop for Linux and how fragmented it is. Having ported games to Linux for over a decade has apparently not given him any form of experience in the field, so let's inform him!!!