
After Canonical announced earlier this year that they would be ending 32-bit support, then adjusted their plans after the backlash, they've now posted which packages they intend to continue supporting.

Canonical's Steve Langasek posted a list on their Discourse forum of packages for which they "have been able to determine there is user demand based on the feedback up to this point" and which they will "carry forward to 20.04". The list also implicitly covers other packages that those directly listed may depend on.

Additionally, their methodology for picking the packages included ensuring that some well-known applications continue working, such as Unity, Godot, printer drivers and more. The list includes noteworthy items like SDL 2, Wine, DXVK, Steam, some Mesa packages, a few open source games and so on.

See the full post here, where Langasek asked for feedback if you feel essential 32-bit packages are missing from the list. It's good to see some clarity surrounding it; hopefully this won't cause any issues now.

Article taken from GamingOnLinux.com.
Tags: Distro News
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I kept coming back to check on the progress of Linux until Ubuntu appeared on the scene, which helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
The comments on this article are closed.
36 comments
Redface 17 Sep, 2019
Quoting: Shmerl
Using containers with frozen libraries isn't a good solution either. You want to benefit from all the innovation that goes into Mesa, Wine and the rest of the gaming stack. So either 32-bit libraries need to be maintained, or there must be some architecture translation of x86_32 into x86_64.
As far as I remember, libraries that interface with drivers have to be the same version, so NVIDIA and Mesa will have to be a current version in the container. The programs in the container run on the same kernel.
And in the Ubuntu world, frozen does not mean unmaintained, even if it sometimes seems like it or even is. Security and other bug fixes are backported to the "frozen" version.
Shmerl 17 Sep, 2019
Quoting: slaapliedje
The dumb thing is these packages are mostly handled by the build system. So there isn't actually a person who manually builds these, they just hit a build server, and you know Debian isn't going to drop 32bit support anytime soon, Ubuntu is just trying to be like Apple.

Problems might start creeping in when upstream (i.e. library developers) decides that supporting 32-bit is too much of a burden. It's usually not as simple as "just build it".


Last edited by Shmerl on 17 September 2019 at 7:46 pm UTC
Shmerl 17 Sep, 2019
Quoting: Redface
As far as I remember then libraries that interface with drivers will have to be the same version, so nvidia and mesa will have to be a current version in the container. The programs in the container run on the same kernel.
And in the Ubuntu world frozen does not mean unmaintained, even if it sometimes seems like it or even is. Security and other bugfixes will be backported to the "frozen" version.

Why do you need containers then, if the libraries in them will be recent? Multiarch already works fine for that. Containers make sense for the frozen case, when there are no more upstream updates coming.
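Multiarch here refers to Debian/Ubuntu's ability to install i386 packages alongside the native amd64 ones. A minimal sketch of how it is enabled, assuming a Debian/Ubuntu system with root access (the package names at the end are illustrative, not from the thread):

```shell
# Guarded sketch: enable the i386 foreign architecture alongside amd64.
# Only runs on a Debian/Ubuntu system where we actually have root.
if command -v dpkg >/dev/null 2>&1 && [ "$(id -u)" -eq 0 ]; then
    dpkg --add-architecture i386
    dpkg --print-foreign-architectures   # should now list i386
    # After 'apt update', 32-bit libraries install side by side with 64-bit
    # ones, e.g.: apt install libgl1:i386 libsdl2-2.0-0:i386
fi
```

The `:i386` suffix is what tells apt/dpkg to pull the 32-bit build of a library rather than the native one.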


Last edited by Shmerl on 17 September 2019 at 7:45 pm UTC
Redface 17 Sep, 2019
Quoting: slaapliedje
The dumb thing is these packages are mostly handled by the build system. So there isn't actually a person who manually builds these, they just hit a build server, and you know Debian isn't going to drop 32bit support anytime soon, Ubuntu is just trying to be like Apple.

The build system is set up to support 32-bit installs on 32-bit i386 processors, which Debian supports for all its distributions, and Ubuntu still does for LTS 16.04 and LTS 18.04. But since there is no 32-bit installer for 18.04 or newer, and no upgrade path for those on 32-bit processors, most 32-bit packages since 18.10 will never be installed by a user.

But they can still fail to build, and a maintainer has to look into why.

There are also really many of those. Ubuntu has around 200 packages on the list for now; let's say that grows to 500.
I earlier counted 33,365 available 32-bit packages on 19.04, so call it roughly 28,000 packages' worth of wasted effort and resources. And Ubuntu always has around five different releases supported or under development, so we are at around 100,000 package builds that no one uses.

Apart from maintainer time this uses electricity, storage and bandwidth, which all could be used for something useful instead.

Apple is, AFAIK, completely disabling running 32-bit programs except inside virtual machines, and soon there will be no macOS version with 32-bit support that is itself still supported. Ubuntu wants to stop building packages no one uses, and in the future find another solution for the few hundred packages left that are needed to run all kinds of 32-bit programs. Even in the first plan they wanted to make sure users could still run 32-bit programs; the solution for that just wasn't ready yet, so they came up with this plan.
So really not like Apple.
How is that similar?
Redface 17 Sep, 2019
Quoting: Shmerl
Why do you need containers then, if libraries there will be recent? Multiarch already works fine for that. Containers make sense for frozen case, when there are no more upstream updates coming.

Mixed cases: some libraries do not always stay backward compatible, while the kernel tries to never break userspace.

So the games might work with current graphics drivers but not with some network library or whatever.

Have you never encountered games or other programs that have problems with newer libraries?

I am not convinced that this is the way forward for running old games, but it is better than virtual machines, since containers can get access to all the parts of the system you want.

And containers were the original plan. For now, until at least 20.04, Ubuntu will build all requested packages for 32-bit, and who knows, maybe for the next release after that too, but I understand they do not want to commit to that now. 20.04 will be supported until April 2025, so that is already a long time they will stick to this new scheme, for that release at least.
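The container idea can be sketched with Docker (my assumption; the thread doesn't name a container runtime, and the public i386/ubuntu image is just one example). It also illustrates the point above about the kernel: the 32-bit userland runs unmodified on the 64-bit host kernel:

```shell
# Guarded sketch: run a 32-bit Ubuntu userland in a container.
# Assumes Docker and network access; the i386/ubuntu image is illustrative.
if command -v docker >/dev/null 2>&1; then
    docker run --rm i386/ubuntu:18.04 dpkg --print-architecture  # prints: i386
    docker run --rm i386/ubuntu:18.04 uname -m   # prints the host arch, e.g. x86_64,
                                                 # because the container shares the host kernel
fi
```

Note the second command: `uname -m` reports the host's architecture, which is exactly why driver-facing libraries in the container still need to match the host.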


Last edited by Redface on 17 September 2019 at 8:16 pm UTC
slaapliedje 17 Sep, 2019
Quoting: Redface
[...] So really not like Apple.
How is that similar?
It is similar because they wanted to ditch 32-bit outright, until everyone threatened to stop using it.
slaapliedje 17 Sep, 2019
Quoting: Redface
[...] So really not like Apple.
How is that similar?
By the way, the i386 packages and the packages for the 32-bit version of the distribution are the same thing, so it is Ubuntu's own fault for ditching the 32-bit install ISO. Debian still supports it (as well as many other architectures), and Ubuntu just rebuilds the Debian packages. So is it REALLY that much more effort for them to continue to do so?
tonR 17 Sep, 2019
Quote
Additionally, their methodology for picking the packages included ensuring some well-known apps continue working like Unity, Godot, printer drivers and more.
Well, to me it sounds like Canonical wanted to try a "pay to maintain" model with some "profitable entities".

We need to understand that 32-bit architectures and applications are (and remain) a unique situation in computing history, having grown in popularity during the PC market boom of the late 1990s and early 2000s, together with the popularity and reliability of Windows XP, which no Windows before or after has matched. Like Android apps today, many software developers at the time built their programs around Windows XP and 32-bit. Many of those programs are still in use, reliable, sometimes even up to date, and more importantly, many of them run better under Wine than on some modern versions of Windows.

So, let's say 32-bit "things" (and all the arguments for their existence) will stay around for a very long time.
slaapliedje 17 Sep, 2019
Quoting: Shmerl
Quoting: slaapliedje
The dumb thing is these packages are mostly handled by the build system. So there isn't actually a person who manually builds these, they just hit a build server, and you know Debian isn't going to drop 32bit support anytime soon, Ubuntu is just trying to be like Apple.

Problems might start creeping in, when upstream (i.e. library developers) will decide, that supporting 32-bit is too much of a burden. It's not as simple as "just build it" usually.
Never actually seen it that way. I have seen it the other way around, where things won't build on 64-bit because of machine-specific code and such, but most languages are abstracted enough for that. And since 64-bit CPUs are just an extension of the old x86, I can't imagine there being many cases of "we can't compile this on 32-bit, but it works on 64-bit just fine!"; unless they start requiring more than 4 GB of memory, there is no reason for it.
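The 4 GB figure follows directly from pointer width: a 32-bit pointer can address at most 2^32 bytes. A quick shell check of the arithmetic; the `-m32`/`-m64` flags at the end are my illustration (not from the thread) of how the same C source builds for either target when gcc-multilib is installed:

```shell
# A 32-bit pointer addresses at most 2^32 bytes:
echo $(( 1 << 32 ))           # 4294967296 bytes
echo $(( (1 << 32) >> 30 ))   # 4 (GiB)
# With gcc-multilib installed, the same C file builds for either target
# (illustrative file names):
#   gcc -m64 -o prog64 prog.c
#   gcc -m32 -o prog32 prog.c
```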
Shmerl 18 Sep, 2019
In theory, but in practice it's not always that simple.