Latest Comments by EKRboi
New Nvidia 349.12 Beta Driver Released
26 Mar 2015 at 9:32 pm UTC

Quoting: filsdHow do you guys install the latest nvidia driver?
I'm on ubuntu 14.04 and the Xorg Edgers PPA bugs on installation. :/
I had issues with edgers multiple times when I used xubuntu.

I don't know if there is another PPA these days that carries the Nvidia drivers, but back when I used xubuntu I used to install/maintain my own by downloading them straight from Nvidia and installing with the .run file. It is not the "preferred" Linux way of using the package manager, but I never ran into issues other than occasionally being on too new a kernel and needing to patch the drivers. If you are using the Ubuntu-provided kernel, that wouldn't be an issue for you anyway.

~rough steps from memory, I use Arch these days.~
1. download driver from Nvidia (64-bit 349.12 beta HERE [External Link])
2. Save any work and close all programs for good measure
3. Change to tty1 (ctrl+alt+f1)
4. sudo service lightdm stop (or whatever DM they use these days)
5. sudo apt-get purge 'nvidia*' (don't remember exactly what the package was called)
6. cd to dir with nvidia_driver_you_want_2_install.run
7. chmod +x nvidia_driver_you_want_2_install.run
8. sudo ./nvidia_driver_you_want_2_install.run
9. follow onscreen prompts
10. reboot and pray ;)

To uninstall just use "sudo ./nvidia_driver_you_want_2_install.run --uninstall" instead, and then you can reinstall from the ubuntu/ppa if you need/want to.
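The steps above can be sketched as a small script. Treat it as a sketch from memory: the driver filename and the lightdm display manager are assumptions for a stock Ubuntu 14.04 setup, and by default it only prints each command so you can review the sequence; set CONFIRM=1 to actually run it.

```shell
#!/bin/sh
# Sketch of the steps above for Ubuntu 14.04. The driver filename and the
# lightdm display manager are assumptions -- adjust for your system.
# By default this only PRINTS each command; set CONFIRM=1 to really run it.
DRIVER="NVIDIA-Linux-x86_64-349.12.run"

run() {
    echo "+ $*"                          # show the command
    [ "${CONFIRM:-0}" = "1" ] && "$@"    # execute only when CONFIRM=1
    return 0
}

# Do this from a text console (Ctrl+Alt+F1) after saving your work.
run sudo service lightdm stop            # stop the display manager
run sudo apt-get purge 'nvidia*'         # remove packaged/PPA drivers
run chmod +x "$DRIVER"
run sudo "./$DRIVER"                     # follow the on-screen prompts
run sudo reboot
```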

*DISCLAIMER* Don't attempt this if any of those steps make you uncomfortable and/or you are not confident you can recover from something going wrong.

New Nvidia 349.12 Beta Driver Released
26 Mar 2015 at 4:24 pm UTC

Quoting: WorMzy
Quoting: BeamboomCome on Nvidia, add Nvidia 3d Vision support to your Linux drivers please. I got this sweet little set of glasses collecting dust here. :(
Ah, but surely Windows users are crying out for this feature too. After all, feature parity is something that the nvidia devs are very determined to maintain [External Link]...
LOL. I bet they wish they had never used that sentence. I throw it around their Linux forum all the time.

Our control panel is exactly the sa.. err.. I mean at parity with the Windows control panel. Oh wait.. No it's not. We don't have a sane way to do per-app/game settings.. I swear half of the settings in there do absolutely nothing except fill a checkbox. Hell, you can't even set up SLI in there.

We have surround sli mode.. Oh yea.. No we don't because nvidia locks that to quadro cards in Linux because $$..

When you do enable SLI in Linux you can only use a single monitor.. Not like it matters.. Most of the time enabling SLI actually causes worse performance. (This is supposedly an OpenGL problem??)

I'm sure the list goes on... Those are the few that piss me off the most though.

Streaming Service Twitch Compromised, Change Your Passwords
24 Mar 2015 at 9:31 pm UTC

+1 to getting the email. I just got it about an hour ago.

BioShock Infinite Released For Linux, Thanks To Virtual Programming
20 Mar 2015 at 2:58 am UTC

Quoting: dubigrasuGreat :)

Edit: Still a bit undecided/confused, according to
;On PC, pool size only gets used if -ReadTexturePoolFromIni is passed in on the commandline. Otherwise it is auto calculated based on your video card memory.
I need to use the -ReadTexture....parameter if I want a custom PoolSize.

However, if I do that the usage never goes beyond 1000mb for me despite the PoolSize being set to 3000.
While if I only set a custom (3000) PoolSize without using the -ReadTexture...parameter, the usage easily goes beyond 2000mb by the end of the level.

I played one level a few times using both methods and it always seemed somehow better without the -ReadTexture parameter, and even the stutter seems to go away.

Hm
I think I can concur. Like you said, the notes in the ini and what is actually happening seem to conflict. I am getting more VRAM usage (and less stuttering) by simply setting the pool size to 3000 than with the pool size at 3000 AND using the -ReadTexturePoolFromIni launch option.
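For anyone following along, the whole tweak being compared here boils down to one ini edit plus an optional launch option. A sketch only: the file is XEngine.ini in the game's config directory, and the exact path varies by install.

```ini
; XEngine.ini -- raise the texture streaming pool.
; Keep the value roughly 600MB below your card's total VRAM.
[TextureStreaming]
PoolSize=3000
```

Then, optionally, add -ReadTexturePoolFromIni to the game's launch options in Steam, though as discussed above it may actually behave better without it.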

Sadly, nothing I've tried allows me to run the game on 3 monitors in Linux. I know performance wouldn't be good even on 1 GTX 970, but I had hoped that if Nvidia one day made SLI on Linux actually work right, or in 4 years when I can run it on 3 monitors with a single GPU, it would work ;(

In Exilium, A Fun Looking RPG Released On Steam For Linux, Some Thoughts
19 Mar 2015 at 9:39 pm UTC

Well.. I've been asking for more hack-n-slash the last couple of weeks after blasting through Torchlight II. I wish it had amazing reviews, but I'll probably pick it up anyways just to help show there are those of us who want these types of games on Linux. Plus, maybe I'll get some enjoyment out of it for $7.49.

BioShock Infinite Released For Linux, Thanks To Virtual Programming
19 Mar 2015 at 4:38 pm UTC Likes: 1

Quoting: dubigrasuWell, some of the issues reported for the Linux port I've seen (and can still see) reported by Windows users too, for example the black screen at first boot, occasional hangs and stuttering, so I'm not sure if VP can do something about it.
I personally never had the hang issue in Windows, but I did have the black screen and the stuttering.

About VRAM usage:
Googled a bit about it and it seems that it boils down to increasing the PoolSize in the "XEngine.ini" file. The default for me was "400" and I raised it to 3000.
(it should be a number around 600MB lower than your card's total VRAM.)
Previously I never saw the VRAM go past 1000mb; now it at times goes to 2700mb, but not higher than that. The textures are flushed every time a new level gets loaded.
The stuttering (still present) is also reduced.

It seems that this must be used in conjunction with the
-ReadTexturePoolFromIni
parameter added to the game's launch options.
Supposedly it reduces stutter, but I'm still undecided about this one.

There are a bunch of other tweaks (some old) floating around, but the "PoolSize" is the only one I've seen to have some positive effect.

Edit:
Found some interesting info about how memory is managed (in DefaultEngine.ini file):
[TextureStreaming]
UseTextureFileCache=TRUE
; We now auto calculate the texture pool size on PC.
; The equation is basically TexturePoolSize = Detected video memory - size of frame buffers - estimate for other resource useage like vertex buffers.
; TexturePoolSizeReductionMB is the estimate of how much we'll need for resources than the frame buffers
TexturePoolSizeReductionMB=40
;On PC, pool size only gets used if -ReadTexturePoolFromIni is passed in on the commandline. Otherwise it is auto calculated based on your video card memory.
PoolSize=400
; hard coded "safe" max texture pool size if running in low or very low
LowPCTexturePoolSizeMB=256
Ha, I remember that now from when the game first came to Windows and it wasn't properly auto calculating the pool size for some people. The same thing must be happening here. I changed the pool size to 3000, which is more than enough for this game, and used -ReadTexturePoolFromIni in the launch options, and it is now using 1600mb of VRAM at the beginning of the game (about right, it uses 2.5gb in Windows running 3-monitor surround on Ultra) and the MEGA stutter and subsequent MEGA FPS hit are no longer there. There is still a little stutter in some places but I would consider it playable now. The way it was acting last night, there would be no way I could play through it.
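Spelled out, the auto-calculation described in those ini comments amounts to the following. A sketch only: the frame-buffer figure used below is an illustrative guess, not a measured value from the game.

```python
def auto_texture_pool_mb(detected_vram_mb: int,
                         frame_buffer_mb: int,
                         reduction_mb: int = 40) -> int:
    """TexturePoolSize = detected VRAM - frame buffers - other-resource estimate.

    reduction_mb mirrors TexturePoolSizeReductionMB from DefaultEngine.ini.
    """
    return detected_vram_mb - frame_buffer_mb - reduction_mb

# Illustrative: a 4GB card with ~60MB assumed for frame buffers
print(auto_texture_pool_mb(4096, 60))  # 3996
```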

Thanks for the reminder!

BioShock Infinite Released For Linux, Thanks To Virtual Programming
19 Mar 2015 at 4:21 am UTC

It finally allowed me to install it today and I just got around to testing it out. I'm pretty damn impressed. It is not perfect and needs some work, but I'm still impressed.

Intel i7-5930K, 16GB RAM, GTX 970

When the game is not streaming in textures it is parked @ 60fps, utilizing 50-75% of my GPU on the Ultra preset. There is an annoying stutter when it is streaming in textures; it happens very often and when it does it tanks the FPS for a second or two.

Now on to my theory of why it is happening: it is hardly loading anything into VRAM. With the game running on one monitor, Steam on another, and a terminal (htop) and nvidia-settings open on the other, I have not seen it use more than 900mb of my VRAM (200mb of which is the desktop). It should be far more than that @ Ultra settings. So I don't think it is pre-loading textures or even keeping textures from prior scenes around, which means it has to pull those textures from disk every time a scene really changes.

If that is the case, and it's not that for some crazy reason nvidia-settings is reporting VRAM usage wrong for this game only, then I can't imagine they did this on purpose; it has to be a bug that slipped in not long ago. So hopefully it will get fixed quickly. If they can fix that then I think it is going to be rock solid 60fps all day long, and some people (myself included) probably have some words to eat concerning eON... :whistle:

*I would still prefer a native port obviously

Bioshock Infinite Early Linux Port Report
18 Mar 2015 at 1:07 am UTC

Grrr. I'm starting to think all the Bioshock: Infinite released for Linux articles and comments are just some elaborate scheme to mess with my head and the game is not really out! :D

Not really, but I seriously still can't install it. It says it's not available for my platform. I know this stuff happens from time to time when someone forgets to switch a depot or two, but I've seen nobody else saying anything about it this time.

EDIT* Guess I should have thought to look at Steam's forums. Others are having problems. Apparently those who pre-ordered the game are currently out of luck.

BioShock Infinite Released For Linux, Thanks To Virtual Programming
17 Mar 2015 at 9:09 pm UTC

Quoting: GuestSticking with GL4.x allows VP to use certain features that really improve performance. I suspect that's why - they could possibly have stuck with GL3.3, but they have to draw a line somewhere - and there's really no reason now for new things to stick with anything less than 4.x, especially on a PC.
I think it is a DX11-only game as well, so if I'm not mistaken OGL4+ is necessary. Not to mention it is a pretty demanding game, so even if OGL4 were not necessary and they could have used extensions on top of, say, 3.3 to make everything "work", it is likely, like you said, that they need the features of 4.x to get respectable performance out of it.

The old consoles were keeping DX9 "alive".. 2014 saw a lot of DX11-ONLY games now that the new consoles have moved on. If you don't have DX11/OGL4-capable hardware at this point, it is really time to start looking into an upgrade as soon as financially possible. Pretty sure all current DX11/OGL4 hardware will be DX12/Vulkan compatible as well. No matter what OS you play games on, you will want to be ready for those.

BioShock Infinite Released For Linux, Thanks To Virtual Programming
17 Mar 2015 at 8:41 pm UTC

Well, talk about unexpected. Obviously we knew it was coming, but I kinda like the surprise. I would be testing it out right now if Steam would let me. It is telling me it is not available on my platform =/