Latest Comments by etonbears
Arch users have a kernel available that will allow AMDGPU and the PRO driver on older cards
21 Aug 2016 at 2:25 pm UTC
The AMD Beta driver currently available on their website has worked fine for my R9 290 on the current Ubuntu kernel ( 4.4, I think ) since it was released several weeks ago after first appearing in the SteamOS brewmaster repo.
It's the first time in years I have had a stable, fast graphics driver and access to the VTs, which always seemed to disappear when I loaded Catalyst. It may still be beta, but I have had no crashes, and I now get a perfect task-switching experience under Unity ( where before, flickering and z-order problems abounded ).
So, my experience has been very positive, so far, and I think it reflects well on the decision to unify the kernel portion of the AMD graphics stack.
Feral Interactive are teasing another new Linux & Mac port with a new clue
4 Jun 2016 at 9:57 pm UTC
Quoting: Samsai: You could also ask why CA would have Feral port TW:Warhammer when they started porting their own games with TW:Attila?
Quoting: GBee: ...Saints Rows are not in-house ports. They were done by VP (Virtual Programming).
Why would Volition go to a third party to port Red Faction Guerrilla when they already have in-house experience of porting their games to Linux with the Saints Row series?
Linux usage on Steam is better than people think
6 Mar 2016 at 9:37 pm UTC
The real problem with the Steam Hardware Survey is in not really knowing what it reflects. Many Linux users have complained that the survey does not reflect their usage, while others point out that it is random. But as we don't know what proportion of the user base Valve aim to hit with the survey each month, this is difficult to judge.
Personally, I have been surveyed 3 times on Linux and about 6 times on Windows / Wine in about 6 years of having a Steam account. Subjectively I would say that I have had fewer surveys per hour of Linux gaming, but this may be because I rarely restart Linux ( just suspend ) compared with how frequently I used to restart Windows. Therefore I rarely restart Steam under Linux, and it is only on a Steam start that I have ever seen the survey.
Similarly, the Steam Hardware Survey only captures a snapshot of "active" Steam users, not the passive ones that just use it for digital distribution and play offline; and finally, of course, the survey does not capture those that don't want to have Steam at all, but still play games.
But, even assuming the survey is biased against Linux, it is unlikely that the market is bigger than about 2% of gamers at present, and that will only change as people get machines that are pre-loaded with Linux like the Steam Machines. And that will be a slow build over a number of years. We shouldn't expect more than a small percentage of Windows users to change, as most either don't care or don't have the necessary confidence/skills.
It would be interesting to see Microsoft disable non-Windows Store applications in Windows 10, just to see what the reaction from developers and users is, but I really don't think they are so brave that they will risk destroying their dominant position in the only client market where they are relevant. I would expect the current situation to continue, and just hope that enough developers continue to support Linux ( even though the returns are not great ) on the understanding that it keeps Microsoft honest.
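A rough way to see how the sampling argument plays out, assuming simple random sampling of active users. Valve don't publish their sample size, so the 100,000 figure below is purely an assumption for illustration:

```python
import math

def margin_of_error(share, sample_size, z=1.96):
    """Approximate 95% margin of error for a surveyed share
    (normal approximation to the binomial)."""
    return z * math.sqrt(share * (1 - share) / sample_size)

# If roughly 1% of an assumed 100,000 surveyed users report Linux,
# the share is pinned down to within about 0.06 percentage points.
moe = margin_of_error(0.01, 100_000)
print(f"+/- {moe * 100:.2f} percentage points")
```

In other words, with any plausibly large sample the month-to-month wobble in the Linux share is small; the bigger unknowns are the ones above — who gets sampled at all, and who never sees the survey because they never restart Steam.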
XCOM 2 strategy game reviewed on Linux
13 Feb 2016 at 11:14 pm UTC
Well, I'm enjoying XCOM 1, even though I'm no good at it, so it sounds like XCOM 2 will be a buy, if I ever finish XCOM 1....
Nvidia pushes more OpenGL and Vulkan blog posts for developers
13 Feb 2016 at 10:53 pm UTC Likes: 1
Quoting: Mountain Man: From what I understand, documentation and support have been the two biggest obstacles to OpenGL. Hopefully things are starting to turn around.
Quoting: barotto: Also drivers' low quality. And decades of stratified features: there are at least 10 possible ways to do something in OpenGL, but only one is the fast path, and this path is very narrow and oftentimes difficult to identify (because of poor documentation and drivers' idiosyncrasies). So the net result is, generally, shoddy performance.

As far as I am aware, OpenGL drivers ( at least the AMD and NVidia binaries ) perform exactly as you would expect. They do have bugs ( as all code does ), but in general they correctly implement the current OpenGL programming model ( 4.x ), and provide backward compatibility both for previous OpenGL programming models ( 1.x, 2.x, 3.x ) on current generation hardware, and for all OpenGL programming models on older generation hardware. A lot of the backward compatibility, of necessity, is implemented by the driver more slowly than if using the current programming model on current hardware ( surprise! ), because, amongst other reasons, any OpenGL feature where hardware acceleration is not available will be implemented in software on the CPU.
This is actually no different to the situation with D3D on Windows. Does anyone think that writing a 2016 game using the D3D8 API would result in a stellar experience? Probably not, but a lot of OpenGL code still uses compatibility contexts and long obsolete functions associated with them from the same era as D3D8. Perhaps this is the fault of the way OpenGL has historically been specified. There has always been a view that they should try to accommodate every existing OpenGL developer when updating the specification, by not breaking things; I think this just leads to confusion as to what you should and should not use.
The other issue ( apart from using the wrong APIs through ignorance ) that causes developers to write sub-optimal code is the inability of the open source drivers to support the current programming model, and the variability of the open source software versions in the different Linux distributions used. This would be irrelevant except that Intel graphics hardware is still about 20% of the gaming market on Linux, which is itself already a tiny market. If developers want to cover the whole Linux market, there is a strong argument to use older APIs, which are generally not performant on modern AMD and NVidia hardware.
However, it is fair to say that it is not necessarily obvious for a programmer new to OpenGL to work out what they should and should not use because there is no single source pushing out information and tools as there is with D3D. The canonical information source is the Khronos specifications, but these are quite dry as they are intended for driver implementers. AMD and NVidia are, of course, also good sources of tools and information, but they will not always be unbiased.
But the best way to learn OpenGL is simply to read a lot of books. If you read just one, you will probably get a skewed view. Each author has a particular goal in mind, and can only give you a subset of the information you need to be a competent 3D coder.
For the avoidance of doubt, if you are writing OpenGL code in 2016, you should be using OpenGL 4.x APIs only, you should be using a "core" OpenGL context ( which removes access to all the really obsolete APIs and ways of doing things ), and in general you should be organizing your data in large buffers, preferably using bindless resources and indirect drawing commands. If you do this, you will probably see little performance benefit from moving to Vulkan ( although there may be other reasons you may want to switch ). This is what NVidia say ( the second blog link in the above article ) and what AMD say ( "OpenGL Superbible" seventh edition, written by Graham Sellers, the AMD rep at Khronos for OpenGL and Vulkan - also the Vulkan specification editor ).
Unfortunately, not all 2016 OpenGL games will follow ( or be able to follow ) these recommendations.
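To make the "large buffers and indirect drawing" point concrete: glMultiDrawElementsIndirect consumes a GPU-side buffer of packed DrawElementsIndirectCommand structs, one per draw, so many objects can be issued with a single API call. A sketch of that command layout, in Python with the struct module purely to show the bytes (real code would build this in C and upload it with glBufferData; the example draw values are made up):

```python
import struct

# DrawElementsIndirectCommand is five consecutive GLuints:
# count, instanceCount, firstIndex, baseVertex, baseInstance.
# Little-endian chosen here for determinism; real code would use
# the native layout the GPU expects.
CMD = struct.Struct("<5I")

def pack_draws(draws):
    """Pack (count, instances, first_index, base_vertex, base_instance)
    tuples into one buffer suitable for glMultiDrawElementsIndirect."""
    return b"".join(CMD.pack(*d) for d in draws)

# Two draws — e.g. one cube, then 100 instanced cubes — in one buffer.
buf = pack_draws([(36, 1, 0, 0, 0), (36, 100, 36, 24, 1)])
assert len(buf) == 2 * CMD.size  # 20 bytes per command
```

The point is that the per-draw state lives in data rather than in API calls, which is exactly the "organize your data in large buffers" advice above.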
Vulkan webinar to take place this month, hour session talking about the API and SDK
10 Feb 2016 at 9:52 pm UTC Likes: 1
Quoting: Invisible: I am still waiting for this awesome graphics api :D
Quoting: berillions: Do you know if Blizzard will support their games for Linux, because they are a member of the Khronos Group??
Quoting: Invisible: They said not: https://www.gamingonlinux.com/articles/president-of-blizzard-responds-to-the-linux-petition-petition-owner-creates-childish-response.5080
Quoting: berillions: Thx for the fast answer. I really can't understand why this big company is not supporting Linux. They can only benefit from us Linux users -.-

That's easy: major disruption of processes, and additional QA, testing and support costs that are not justified by the revenue. Linux native is still a tiny market for games, and will remain so until reasonable numbers of games-capable PCs ship with Linux rather than Windows by default. That may be never.
I am more concerned that the companies producing high quality games for Linux continue to do so, because I doubt that sales are particularly good. I take the view that I will monitor, but ignore game companies that don't produce Linux games, instead buying anything resembling AAA games that are ported to Linux.
Many of these are games I would likely not have bought if I was still buying in the larger-choice Windows game market, but now that I have found new games/series to like, I am less bothered that some of my old favourites are not available.
New Linux & SteamOS gamer survey for February
9 Feb 2016 at 9:17 am UTC
Quoting: BlackBloodRum: Voted. :-)

For now. We can always hope that this year's new 'Zen' CPUs put a bit more competition in the market... :)
I'll be the one person in the world still using an AMD CPU in the survey :-P.
XCOM 2 tested on R7 370 with Crimson and RadeonSI
6 Feb 2016 at 11:08 pm UTC Likes: 1
Quoting: PerkeleenVittup: I wonder how swiftly Feral will react to these problems? It's very important to make this much more playable, and soon, as it's one of those day 1 titles for Linux gaming. I suppose it's not the drivers, but the game itself, as other platforms also suffer the same way.
Quoting: edddeduckferal: Any rendering issues in the game are usually down to the drivers; we worked on AMD support (and Intel) throughout development, but by release some driver issues still remained for AMD hardware. As the drivers are improved (or workarounds for driver bugs are found) then we can add official support. The reason the game renders as well as it does on AMD right now with both the Crimson and Mesa drivers is down to the work completed during development by our developers and testers; however, there are still issues outstanding preventing support. As with all our other games, we'll keep you all updated if/when AMD support is announced in the future as the issues above are closed off.

While I'm sure there are cases where driver implementations are at odds with the specifications, or where edge and corner cases in the myriad driver code paths have not been found before, I don't think it is fair that so much is blamed on the drivers.
Today's high-end GPUs can process data at > 300 GBytes/sec while the PCIe bus and the single CPU core feeding commands and data to/from the GPU can only process about 6GBytes/sec. Conversely, Intel graphics, AMD APUs ( and the current console generation ) don't have to suffer the PCIe bus at all, and all CPU and GPU cores process data in the same memory. These massively different hardware environments actually require an application/game to use an adaptive design depending on the hardware present, but not all do.
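To put rough numbers on that gap, a quick back-of-the-envelope sketch using the figures above (the 512 MiB asset size is just an assumed example, not from any particular game):

```python
# Rough transfer-time comparison: ~6 GB/s over PCIe / one feeding CPU
# core, versus ~300 GB/s once the data is resident in VRAM.
GB = 10**9

def transfer_seconds(nbytes, bandwidth_gb_s):
    """Idealised time to move nbytes at a given bandwidth."""
    return nbytes / (bandwidth_gb_s * GB)

asset = 512 * 1024**2  # an assumed 512 MiB batch of textures/meshes
pcie = transfer_seconds(asset, 6)
vram = transfer_seconds(asset, 300)
print(f"PCIe: {pcie*1000:.0f} ms, VRAM: {vram*1000:.1f} ms")
```

At 60 fps a frame is about 16 ms, so even this idealised PCIe transfer costs several whole frames — which is why a design that streams data across the bus mid-frame stalls a discrete GPU but goes unnoticed on an APU sharing memory.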
Traditional game engine design has the CPU ( preferably using cores other than the one feeding OpenGL ) responsible for processing global game activity, game AI, collision detection etc for each frame, and only once the "state" of a frame is determined does the engine use the GPU to render that state as an image in the framebuffer, transferring additional data to the GPU as needed. This sort of design still works OK for Intel graphics and AMD APU because there the GPU shares CPU memory directly; but modern discrete GPUs are often heavily under-used with such a design, and can suffer significant stalling while waiting for the CPU to complete its tasks or PCIe to transfer data.
Recognition of this problem is why we have OpenGL 4.x, which is designed to allow developers to treat the GPU as a much more general purpose processor, performing tasks that would otherwise be carried out on the CPU. The more of a game's tasks that can be run on a discrete GPU, the less likely that GPU is to be restricted by waiting on the CPU and PCIe bus. However, if you don't adapt the game for the OpenGL 4 model and continue to use the OpenGL 3.x model ( to cater for those installations that only have OpenGL 3, for example ), then you are almost certain to have unexpectedly poor performance with newer GPUs.
One of Feral's other ports, Shadow of Mordor, is a good example of a demanding game that works very well. If the AMD drivers were "bad" and "buggy", as a lot of people seem to think, this game would be a disaster. In fact ( sample of one ), it worked flawlessly. Frame rates were consistent, load times low, rendering seemed perfect. I struggled to understand why Feral said AMD was not supported.
As far as I understand it, the reason why drivers are being continually updated is to add bespoke code paths for the actual way a given game or application uses the driver, as opposed to the way it was supposed to use it. You could argue as to who is at fault here. It should be crystal clear from the specification how a developer should use it, and how a driver will behave, but I'm not sure it always is.
What does seem to be true is that NVidia do a better job of updating their driver suite to provide per-game optimised driver configurations. But whether this is because they have a better mechanism for this in their drivers, or just because they dedicate more resources to the task, I don't know.
Just as I don't think it reasonable to always blame the drivers, I think that blaming Feral ( as some people are ) is also unreasonable. Ultimately, they are porting an existing design, with all its faults and limitations, often also using engines or middleware from third parties. The quality of what comes out of Feral's process is certainly no worse than the PC game industry norm; it is a separate question as to whether you consider that norm acceptable.
XCOM 2 released for SteamOS & Linux, port report included
5 Feb 2016 at 12:58 pm UTC
Quoting: tuubi: Blaming drivers for performance problems across operating systems and graphics hardware vendors is disingenuous at best.
Quoting: Polochamps: But in this case, if you look at the graphical demands of the game and the performance output on the higher-end cards, one may think that this isn't just about bugs but a driver issue as well.
Quoting: tuubi: You read the article?
Quoting: Polochamps: Yes I did. Still, if the game does not perform well on any combination of hardware or software, there's no reason to blame anyone but the developer. If you feel the need to blame someone, that is.

From a developer viewpoint, GPU drivers are black boxes, and the Unreal Engine is another black box that uses the first black box. In such circumstances, trying to force the black boxes into optimal behaviour is quite difficult, particularly given the variety of hardware in use, which leads to different issues on different configurations.
That's not to say developers have no responsibility, and couldn't do more before release. At the very least, ironing bugs out through an opt-in private beta program of pre-order customers would seem a sensible step.
XCOM 2 released for SteamOS & Linux, port report included
5 Feb 2016 at 12:21 pm UTC Likes: 1
Quoting: LukeNukem: Sounds like it's the old OpenGL issue of only compiling shaders when they get used (hence bombing framerates for new weapons).

Compiling, linking and loading shaders to the GPU are all active programmer choices ( i.e. you actively make OpenGL API calls ), so they can be completed ahead of time if you manage your own shaders.
If you use an engine, the same can probably be achieved by rendering every item type once to force the shaders to be compiled and activated. This would obviously look naff unless you can render the geometry with full transparency, or turn off framebuffer updates.
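The eager-versus-lazy pattern being described can be sketched in miniature. Plain Python stands in for the driver here; the ShaderCache class and its timings are illustrative only, not a real GL API — in actual OpenGL the warm-up step would call glCompileShader/glLinkProgram (and possibly issue a dummy draw) for each material during loading:

```python
import time

class ShaderCache:
    """Toy illustration of lazy vs eager shader compilation."""
    def __init__(self):
        self._programs = {}

    def _compile(self, name):
        time.sleep(0.01)  # stand-in for a slow driver compile/link
        return f"program:{name}"

    def get(self, name):
        # Lazy path: the first use of a material pays the compile
        # cost mid-frame — that is the framerate hitch.
        if name not in self._programs:
            self._programs[name] = self._compile(name)
        return self._programs[name]

    def warm_up(self, names):
        # Eager path: pay all compile costs during the load screen.
        for name in names:
            self.get(name)

cache = ShaderCache()
cache.warm_up(["rifle", "sword", "shield"])  # done while loading
t0 = time.perf_counter()
cache.get("rifle")  # first in-game use: already compiled, no hitch
assert time.perf_counter() - t0 < 0.005
```

The design choice is simply where the unavoidable compile cost lands: hidden in a load screen, or scattered as stutters through the first minutes of play.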