Latest Comments by etonbears
X4: Foundations for Linux will not see a beta at release today, still coming though
6 Dec 2018 at 12:18 pm UTC
Quoting: rustybroomhandle
Quoting: linuxjacques
They need time to start thinking about when they are going to start working on the Linux beta?
Well, if their Linux dev is still the same guy they hired prior to Albion Prelude, you don't have to worry, he's really good. Plus, he moved across to the core dev team and will be making sure that Linux and Mac ports are not made difficult. And, of course, using Vulkan also helps on Linux.
Doesn't sound like it's coming any time soon. :-|
I like their games and will continue to buy them, but not until it works in a supported fashion on Linux.
Intel's new discrete GPU will have a focus on Linux gaming
6 Dec 2018 at 11:57 am UTC
Quoting: Brisse
Quoting: Whitewolfe80
I know Wayland has progressed, but the best way to play games on Linux is still with Xorg. Yes, I know it's old as dirt, but it works. Wayland needs about another 3 to 5 years of development to reduce its impact and to integrate better into Steam etc. Just my opinion.
x.org is still preferable but Wayland is progressing, and eventually even old games running through XWayland will probably run just as well on Wayland as on native x.org. This recent merge request [External Link] is a great step in the right direction for GNOME on Wayland, but currently it doesn't help much in games because XWayland applications still seem to be stuck at 60fps; it's something that's being worked on, though.
Yes, once XWayland is in a good state, the issue between the project team and NVidia will probably become more important, unless the Nouveau team can get close in performance. I am led to believe (no proof, of course) that some of the NVidia performance advantage is due to recognising the executable and adapting the driver code paths to suit it, which will be difficult to match with Nouveau, especially without full documentation.
Do you have any idea how big the performance gap is for your setup? (assuming you are prepared to use the proprietary driver for testing)
Intel's new discrete GPU will have a focus on Linux gaming
4 Dec 2018 at 11:32 am UTC Likes: 1
Quoting: Purple Library Guy
Quoting: iiari
Well, one thing to consider is the Wayland issue. Yeah, I know, Wayland is taking forever to dominate, but it is getting used more and more, and adoption of this sort of thing tends to accelerate after a certain point, so if we're talking time horizons like 2020 . . .
Quoting: pete910
2020 is indeed far away in computer years... I'm pretty happy to wait and see how Intel's new GPU turns out first.
You're going to be waiting a while :P Just go get an AMD card!
However, and this almost certainly isn't the place for this, I don't see the point of an AMD card. I'm thinking of getting a new desktop, and it appears that on Linux even the fastest AMD card is slower than the Nvidia 1070 I've been running for the past two years without issue. So unless you're very devoted to the idea of FOSS drivers, and you're not cost-constrained, why go AMD? Again, taking the FOSS vs non-FOSS question out of the equation... Honestly wondering.
My understanding of just what the problem is is fuzzy, but I hear Nvidia don't play well with Wayland.
Wayland uses a MESA API to allocate memory; NVIDIA want them to add EGLStreams as well. Neither want to do what the other asks. *shrug*.
I assume the open source driver, Nouveau, does work with Wayland, but then you lose a lot of the performance advantage from buying NVIDIA in the first place.
Valve have adjusted their revenue share for bigger titles on Steam
3 Dec 2018 at 6:36 pm UTC Likes: 3
As most people know from the leak of Valve's handbook for new recruits in 2012, they operate a policy where each employee is encouraged to work on what they think is most important for their customers, and there is no real project management in the strict sense.
This fits with the fact that, for Valve, everything seems to happen when it happens rather than to any externally obvious plan. What is clear is that since Gabe made his initial views on Windows 10 known, there has been a shift of emphasis within Valve: they have focused on hardware projects, on making Linux a practical gaming environment, and on expanding Steam support for Mac and Linux, in addition to their previous core competences of Steam on Windows and games development.
As a result there has not been much games output, although they did release a new version of the Source engine in 2015 and now their Artifact card game. There has almost certainly been activity on other games that we have not seen, but I suspect those projects were starved of the necessary personnel, who chose to work on projects they considered more important. It must be quite hard to work at Valve if you really believe in a game project but see the people you need drift off to other projects. This may explain why there are periodic rumours of work on Half-Life and other games but nothing concrete emerges, and maybe it is also why some people left.
It is obviously quite frustrating for passionate gamers who see other publishers' titles turned into franchises that produce regular releases, while Valve doesn't seem to care. But then there have been many changes in gaming over the last 10 years to accommodate a mass-market customer base that is very different from when Half-Life was released. We will certainly get more games from Valve ( Gabe says so, and it is first on the list of what they do on their home page ), but that doesn't mean we will necessarily get the sort of games we individually want.
Valve have adjusted their revenue share for bigger titles on Steam
1 Dec 2018 at 5:34 pm UTC Likes: 3
Quoting: Kimyrielle
What can I say? It's consistent with non-digital businesses. Large corporations get tax breaks and subsidies. Small companies get lots of red tape and will be milked by the taxman. By making the rich richer, Steam is just doing what everyone else does.
True, but it is more a reflection of the balance between costs and value. If big publishers could ship Xbox and PlayStation titles direct without paying MS and Sony, they probably would.
Valve probably charge more than their service is worth to a large developer, but are quite valuable to a smaller concern.
Steam Link hardware officially walks the plank, there's an app for that
21 Nov 2018 at 11:24 am UTC
Valve seems to have taken a scattergun approach to encouraging their concept of open gaming, but I still think they would prefer a SteamOS device. I can't see many companies wanting to produce one with a cheap alternative available, even if streaming has a lot of issues for some people.
I guess Steam Play of games via Proton is also much easier to validate on SteamOS-driven partner hardware.
Maybe we will see another round of Steam Machines in 12 months or so.
Canonical have released some statistics from the Ubuntu installer survey
19 Oct 2018 at 3:34 pm UTC
TL;DR. Counting the number of hardware threads available as the number of CPUs is not a particularly accurate measure, but it's probably good enough for the purposes of this sort of survey.
Long Opinion:
The actual information collected is quite sparse; I have included, as an example, the full information generated by ubuntu-report on the machine I am using at the end of this post. It does fully report processor geometry ( Sockets/Cores/HW Threads ), but doesn't even include, for example, the clock speed the cores can run at.
Anyone who has looked at articles from the likes of Tom's Hardware, AnandTech and Phoronix will know that it is actually very hard to compare CPU performance through simple metrics that take no account of chip architecture.
The physical CPU that you drop in a motherboard socket these days bears no relation to the physical CPU of 10-15 years ago, when 2-way and 4-way multi-socket motherboards were the only real way to get higher performing PCs. Those CPUs had a single core plus cache memory, etched on a single piece of silicon, and that's it.
The reason we have this CPU terminology problem today is the development of the MCM ( multi-chip module ), whereby several separately etched pieces of silicon, or one piece etched in several steps, are electrically integrated into a single physical package.
A modern physical CPU contains not just multiple copies of what used to be called a CPU, but also now contains the functions that used to be performed by separate physical chips on the motherboard ( the Northbridge and Southbridge ), and even GPU logic and memory. The MCM is also the reason why there are so many different physical CPUs to choose from, as Intel and AMD can assemble different components into a physical CPU for different purposes ( server, desktop, router etc ).
The state of the art for this integration is something like the upcoming Zen 2 range of processors from AMD, which will be able to ship a single physical server CPU with 64 cores, each with 2 hardware threads. Ubuntu-report would count this single physical package as "128 CPUs".
It is important to realize that putting multiple CPU cores into a single physical package will, under almost all circumstances, be higher performing and cheaper than the equivalent number of 1-core CPUs in separate sockets. The only limitations to MCM integration are the number of input/output pins available in your socket design, and the ability of a cooling solution to remove heat from the package.
So there is often very little to be gained from even 2 physical CPU sockets on a motherboard today, and you are generally talking about eye-watering price points if you go down this route.
As each CPU core provides an independent ability to perform computations, I don't think it is unreasonable to count it as an independent CPU. I have more reservations about hardware threads; although you can use them to schedule more than one concurrent workload on a CPU core, each of those workloads will take longer, so hardware threads will not always improve performance; it depends on the workloads available.
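To make the counting concrete, here is a minimal sketch of the arithmetic behind a "CPUs" figure like the one in the sample data below. The function name is my own invention; ubuntu-report itself simply records the lscpu-style geometry fields.

```python
# Sketch: how a "CPUs" count falls out of processor geometry.
# The function name is illustrative, not part of ubuntu-report.
def logical_cpus(sockets: int, cores_per_socket: int, threads_per_core: int) -> int:
    """Number of hardware threads, i.e. what gets reported as "CPUs"."""
    return sockets * cores_per_socket * threads_per_core

# The i7-3820 in the sample data below: 1 socket x 4 cores x 2 threads.
print(logical_cpus(1, 4, 2))    # 8, matching "CPUs": "8"
# A single-socket 64-core, 2-thread server part, as described above:
print(logical_cpus(1, 64, 2))   # 128
```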
====================================================================
Sample Ubuntu-Report Data Collected
====================================================================
{
"Version": "18.04",
"OEM": {
"Vendor": "To Be Filled By O.E.M.",
"Product": "To Be Filled By O.E.M."
},
"BIOS": {
"Vendor": "American Megatrends Inc.",
"Version": "P2.30"
},
"CPU": {
"OpMode": "32-bit, 64-bit",
"CPUs": "8",
"Threads": "2",
"Cores": "4",
"Sockets": "1",
"Vendor": "GenuineIntel",
"Family": "6",
"Model": "45",
"Stepping": "7",
"Name": "Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz",
"Virtualization": "VT-x"
},
"Arch": "amd64",
"GPU": [
{
"Vendor": "10de",
"Model": "1b82"
}
],
"RAM": 32.9,
"Disks": [
128,
128,
500.1,
512.1
],
"Partitions": [
107.6,
0.1,
124.9,
500,
479.7
],
"Screens": [
{
"Size": "597mmx336mm",
"Resolution": "2560x1440",
"Frequency": "59.95"
}
],
"Autologin": false,
"LivePatch": true,
"Session": {
"DE": "ubuntu:GNOME",
"Name": "ubuntu",
"Type": "x11"
},
"Language": "en_GB",
"Timezone": "Europe/London"
}
A new stable Steam Client update is out, with fixes for Steam Play and more
12 Oct 2018 at 11:30 am UTC
I don't know the game in question, but it would be strange if an update to Proton did that.
Proton is an application like any other game on Steam, whereas your game saves are in a Windows disk emulation specific to that game.
I'm not saying it's impossible, but it would be a really bad regression for Proton updates to wipe game prefixes, as they contain all game settings and saves...
If you didn't manually save in your initial session, is it possible the game had not reached the first auto-save checkpoint? Some games work that way.
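For reference, each title run under Proton keeps that per-game "Windows disk emulation" in its own prefix under the Steam library, keyed by app ID. A small sketch of where it lives; the library path and the app ID 377160 (Fallout 4) are just examples, so substitute your own.

```python
# Locate the Proton prefix for one title. Settings, saves and the
# emulated Windows registry all live inside the pfx/ directory.
# Paths and the app ID here are examples only.
from pathlib import Path

def prefix_dir(library: Path, app_id: int) -> Path:
    """Proton prefix for one Steam title inside a given Steam library."""
    return library / "steamapps" / "compatdata" / str(app_id) / "pfx"

print(prefix_dir(Path.home() / ".steam/steam", 377160))  # e.g. Fallout 4
```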
Valve have pushed out a new Steam Play beta with DXVK 0.80 and more
1 Oct 2018 at 7:48 pm UTC
Quoting: GuestI deleted the 3.7beta folder to download the newest version to make sure I got the new version and found that I had to go into the folder and extract the tar.gz myself. So if you aren't able to get any games running look in that folder to make sure it's extracted.Valve seem to be maintaining 2 Proton Folders, 3.7 Stable and 3.7 Beta, which get updated automatically with maintenance builds. You will note that if you choose a proton build, you can only select from the most recent stable and beta builds. This is probably a good thing, otherwise your disk would fill up with point releases.
There is currently no "proper" way through the Steam Client to move or delete a Proton build, but I have found that if you delete both the Proton directory and its "appmanifest" file, it will be re-created correctly when it is next required ( but note my previous comment concerning NTFS ). The appmanifests are simple text files in the steamapps directory, named according to the Steam product ID, one for each installed product - you may have to open them all to find the correct one.
If you forgot to remove the appmanifest, Steam probably got confused trying to update a non-existent Proton directory. I can also confirm that it will update the beta with new releases automatically; you don't actually need to do anything :D
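The clean-up described above can be sketched as follows. This is a demonstration in a throwaway directory with a mocked-up layout and a made-up app ID; on a real system you would point at your steamapps directory instead.

```python
# Demonstration of removing both halves of a Proton install: the build
# directory under common/ and its matching appmanifest. Steam then
# re-creates both the next time a Windows title needs Proton.
# Layout and app ID are mocked up for illustration.
import shutil, tempfile
from pathlib import Path

lib = Path(tempfile.mkdtemp()) / "steamapps"
(lib / "common" / "Proton 3.7 Beta").mkdir(parents=True)
(lib / "appmanifest_0000.acf").write_text('"installdir"\t"Proton 3.7 Beta"\n')

# Find which manifest names a Proton build by scanning their contents,
# then delete the build directory and the manifest together.
for manifest in lib.glob("appmanifest_*.acf"):
    if '"Proton' in manifest.read_text():
        shutil.rmtree(lib / "common" / "Proton 3.7 Beta")
        manifest.unlink()

print(sorted(p.name for p in lib.iterdir()))  # ['common']
```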
Valve have pushed out a new Steam Play beta with DXVK 0.80 and more
29 Sep 2018 at 9:41 pm UTC Likes: 1
Haven't tried the latest version yet, but have played a new character in Fallout 4 up to level 70+ using the previous Proton beta. Many games, like F4, still need significant manual intervention to set up, mainly due to missing Windows API implementations, and long-standing issues with automatically integrating wine with the wide variety of Distro/UI combinations.
It doesn't help that the previous Proton beta linked against libgnutls.so.26 rather than libgnutls.so, meaning secure sockets networking fails completely unless your Distro contains this exact version of the library :(
Edit:
Another point to mention is that Proton/Steam Play doesn't seem to like NTFS-formatted partitions. It is safest to download any Windows game you want to try out to an ext4-formatted disk or partition. Each version of Proton is actually installed as a separate application under .../steamapps/common/, using the Steam Folder of your most recent download ( if you have more than one Steam Folder ). This can lead to different versions of Proton ending up in different Steam Folders, which might cause unexpected problems.
In my case, all my Steam games are usually installed to an NTFS-formatted SSD, so EVERYTHING failed to run under Proton/Steamplay until I formatted an old HDD with ext4 and moved Proton and the Windows games into a Steam Folder there.
Edit 2:
Proton 3.7.7 overwrites the previous beta, and as it still has the problem with linking to the wrong secure sockets library, you need to set up the symbolic link again.
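The symbolic-link workaround above amounts to giving the loader the old soname Proton links against as a link to the gnutls library your distro actually ships. Recreated here in a throwaway directory; the file names and the .so.30 version are illustrative stand-ins, and the real paths vary by distro.

```python
# Illustrative recreation of the gnutls workaround in a temp directory.
# On a real system the target would be your distro's gnutls library
# (version varies); the soname Proton expects is libgnutls.so.26.
import os, tempfile

libdir = tempfile.mkdtemp()
target = os.path.join(libdir, "libgnutls.so.30")  # stand-in for the distro's copy
open(target, "w").close()
link = os.path.join(libdir, "libgnutls.so.26")    # old soname Proton links against
os.symlink(target, link)
print(os.path.islink(link), os.readlink(link) == target)  # True True
```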