Latest Comments by slaapliedje
NVIDIA to support VESA Adaptive Sync with 'G-SYNC Compatible' branding
8 Jan 2019 at 4:21 pm UTC

Quoting: mahagr
@slaapliedje Both my monitors claim the same, but they also say 60Hz, which means that the G-Sync frequency isn't set by the graphics card. The only way to make sure G-Sync is truly being used is to enable the FPS counter on your monitor and start a game. If the FPS readout drops to 1-10 while the game is loading, G-Sync is turned on. Of course, another way to know is just to play something that isn't running at >60 FPS.

The most common monitors right now are cheapo 24" (1080p). For 1440p I would buy a 27-28" monitor, and for 4K I would take nothing less than a 32" monitor, just to keep font scaling at 1x while still being able to read the text. As for screen resolution, I would personally use 1440p for gaming and 4K for productivity, unless you have the money to buy a GTX 1080 Ti+ level graphics card.

As for 32" screens, I really love them. It may take a few weeks to get used to them (especially if you're used to full screen windows), but oh boy, how much you can fit onto a single screen!
Mine say 144Hz on both, but I've also set them both to 144Hz within Gnome-shell's display configuration. I did just test Mad Max with the visual indicator checked in the nvidia settings (only one monitor on/detected by Gnome) and it said it was in BLIT mode with vsync on, but it also doesn't allow over 60Hz, so I'm guessing it doesn't actually support anything beyond standard vsync.

As far as the size of monitors goes, I have two 27" LG 4K monitors at work (they even support FreeSync) and I LOVE the font size and everything. When I can fit multiple VMs to work on, plus tons of terminals and browsers etc., and can still read them just fine from the foot or so away, they're the perfect size. I also have a Varidesk that I raise up, and two 32" monitors simply wouldn't work on that.

Granted, I also got Lasik (mainly for VR) and can now see without glasses, so I have decent eyesight, but generally speaking Gnome scales very well on HiDPI monitors.
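For anyone wanting to repeat the test above from a script: the NVIDIA driver honors a couple of environment variables for sync behaviour. A minimal launcher sketch — the variable names are real NVIDIA driver variables, but whether G-Sync actually engages still depends on your monitor and driver version, and the game binary in the usage line is just a placeholder:

```shell
# run_with_gsync: launch a program with G-Sync and vsync requested via the
# NVIDIA driver's environment variables
run_with_gsync() {
    __GL_GSYNC_ALLOWED=1 __GL_SYNC_TO_VBLANK=1 "$@"
}

# Usage (binary name is a placeholder): run_with_gsync ./MadMax
run_with_gsync echo "launched"
```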

NVIDIA to support VESA Adaptive Sync with 'G-SYNC Compatible' branding
8 Jan 2019 at 2:52 am UTC

Quoting: Guest
Quoting: jarhead_h
For most computer desks that screen size is ~~32~~ 24in.
There, fixed that for you.

More seriously, I haven’t been able to find statistics about monitor sizes.
If I were to take a somewhat educated guess, I'd say 22-27" is probably the most popular. 32" monitors are ridiculously large for being so close to one's face.

NVIDIA have put out a new Vulkan beta driver with better pipeline creation performance
8 Jan 2019 at 2:23 am UTC

Quoting: mahagr
Quoting: dubigrasu
At the same time, I don't remember the drivers being stuck in performance mode while in desktop mode; sure, the modes alternated depending on desktop activity, but stuck at max power, no.
For me, my GTX 1080 Ti is always in the P0 state (max clocks) until I switch to the console. There's a known issue for this in the nVidia bug tracker, as well as a public thread on the nVidia forums. I am running my computer with two 4K G-Sync monitors, which seems to make the issue worse. Windows shares the same power management code, but the difference is that on Windows the driver knows whether more draw calls are coming down the pipeline, which allows the graphics card to drop to a lower power state earlier.

I agree that keeping the clocks high is great when you're gaming (but only if your game needs 100% of your GPU), but it's not great if your card runs hot 24/7 and never stops the fans because power saving doesn't work for a few users.

Do not get me wrong: I am, and have been, an nVidia user for a long time (I just threw away a broken 8800 GT card along with some other old hardware) and I will likely be using nVidia graphics cards in the future, too.

PS. Regarding my first comment on nVidia Linux driver quality: that came from an nVidia employee I know. I have also worked at a few companies where the main reason not to release source code was bad code quality (not because of bad workers, but because there was no time to polish the code).
Try this?

https://forums.geforce.com/default/topic/454144/geforce-drivers/190-42-linux-how-to-change-powermizer-settings-to-underclock-gpu-/
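To see the p-state issue described above for yourself, `nvidia-smi` can report the current performance state. A small sketch: the `nvidia-smi` flags are real, but it obviously needs NVIDIA hardware to run, so the parsing helper below is exercised on a sample line instead:

```shell
# On real hardware you would run:
#   nvidia-smi --query-gpu=pstate,clocks.gr --format=csv,noheader
# parse_pstate: pull the performance state out of one line of that CSV output
parse_pstate() {
    echo "$1" | cut -d, -f1 | tr -d ' '
}

# Sample line as the idle bug shows it: stuck in P0 (max clocks) on the desktop
parse_pstate "P0, 1911 MHz"   # -> P0
```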

NVIDIA to support VESA Adaptive Sync with 'G-SYNC Compatible' branding
8 Jan 2019 at 2:16 am UTC

Quoting: mahagr
Quoting: slaapliedje
What if you have two G-Sync monitors?
Yup, I have two 4K G-Sync monitors (for work, not for gaming). You will not get any game to run on both monitors -- well, I did get one Windows game to stretch across both like that by accident, but still no G-Sync (I believe it was in windowed mode). So I can safely say that I've not found any way to enable G-Sync on multiple monitors. You really need to disable the other monitor to get G-Sync to work at all.

I've heard that Windows has the same issue -- unless you get the game to run on both displays.
Truly, 4k monitors are fantastic for work, and suck for gaming :P

Windows has an option to allow G-Sync in Fullscreen mode, or in Fullscreen and Windowed mode.

Both my monitors claim they are in G-Sync mode, for whatever that's worth. Dell S2716DG are what I have.

NVIDIA to support VESA Adaptive Sync with 'G-SYNC Compatible' branding
8 Jan 2019 at 2:14 am UTC

Quoting: rkfg
Quoting: slaapliedje
From all I've seen, G-Sync works fine on both monitors, though I never got the 'gsync indicator' to work; it's been there since long before the 4xx drivers though, contrary to what the other person said.
Are you sure it works? What does the Graphics API Indicator say? I said it appeared in 4xx because a friend of mine tested 396.54 (IIRC) and there were no indicator options at all in nvidia-settings, and that's the last 3xx branch version.

The G-Sync indicator appears in the top right corner, it's a green label that says "G-SYNC" in a thin ugly (for my taste) font, nothing fancy like in the Windows driver. If it's not there (and the option in nvidia-settings is on) you probably don't have G-Sync enabled and only have the regular V-Sync.

Quoting: dpanterTLDR: Multi-monitor G-Sync in Linux sucks goat nads. My next card will be AMD. Period.
Is Freesync better in this regard? I thought it's a display hardware or Xorg's limitation as it's all the same on Windows. But I haven't researched that.
Like I've said, I've clicked on the check mark for the indicator but nothing happens. It does in Windows, but it also has an option for Full screen, or Full screen and Windowed.

What I have been going by is that it actually says the monitor is in G-Sync Mode when I click on the settings. Otherwise it just says 144Hz.

Now, here is what I have in my .nvidia-settings-rc file; but again, it doesn't seem to make the indicator work:
#
# /home/$USER/.nvidia-settings-rc
#
# Configuration file for nvidia-settings - the NVIDIA X Server Settings utility
# Generated on Mon Jan  7 19:12:17 2019
#

# ConfigProperties:

RcFileLocale = C
DisplayStatusBar = Yes
SliderTextEntries = Yes
IncludeDisplayNameInConfigFile = Yes
ShowQuitDialog = Yes
UpdateRulesOnProfileNameChange = Yes
Timer = PowerMizer_Monitor_(GPU_0),Yes,1000
Timer = Thermal_Monitor_(GPU_0),Yes,1000
Timer = Memory_Used_(GPU_0),Yes,3000
Timer = Memory_Used_(GPU_1),Yes,3000
Timer = Thermal_Monitor_(GPU_1),Yes,1000
Timer = PowerMizer_Monitor_(GPU_1),Yes,1000

# Attributes:

skint:1.0/SyncToVBlank=1
skint:1.0/LogAniso=0
skint:1.0/FSAA=0
skint:1.0/TextureClamping=1
skint:1.0/FXAA=0
skint:1.0/AllowFlipping=1
skint:1.0/FSAAAppControlled=1
skint:1.0/LogAnisoAppControlled=1
skint:1.0/OpenGLImageSettings=1
skint:1.0/FSAAAppEnhanced=0
skint:1.0/AllowGSYNC=0
skint:1.0/ShowGSYNCVisualIndicator=1
skint:1.0/ShowGraphicsVisualIndicator=1
skint:1[DPY:DP-0]/Dithering=0
skint:1[DPY:DP-0]/DitheringMode=0
skint:1[DPY:DP-0]/DitheringDepth=0
skint:1[DPY:DP-0]/ColorSpace=0
skint:1[DPY:DP-0]/ColorRange=0
skint:1[DPY:DP-0]/SynchronousPaletteUpdates=0
skint:1[DPY:DP-1]/Dithering=0
skint:1[DPY:DP-1]/DitheringMode=0
skint:1[DPY:DP-1]/DitheringDepth=0
skint:1[DPY:DP-1]/ColorSpace=0
skint:1[DPY:DP-1]/ColorRange=0
skint:1[DPY:DP-1]/SynchronousPaletteUpdates=0
skint:1[DPY:HDMI-0]/Dithering=0
skint:1[DPY:HDMI-0]/DitheringMode=0
skint:1[DPY:HDMI-0]/DitheringDepth=0
skint:1[DPY:HDMI-0]/ColorSpace=0
skint:1[DPY:HDMI-0]/ColorRange=0
skint:1[DPY:HDMI-0]/SynchronousPaletteUpdates=0
skint:1[DPY:DP-2]/RedBrightness=0.000000
skint:1[DPY:DP-2]/GreenBrightness=0.000000
skint:1[DPY:DP-2]/BlueBrightness=0.000000
skint:1[DPY:DP-2]/RedContrast=0.000000
skint:1[DPY:DP-2]/GreenContrast=0.000000
skint:1[DPY:DP-2]/BlueContrast=0.000000
skint:1[DPY:DP-2]/RedGamma=1.000000
skint:1[DPY:DP-2]/GreenGamma=1.000000
skint:1[DPY:DP-2]/BlueGamma=1.000000
skint:1[DPY:DP-2]/Dithering=0
skint:1[DPY:DP-2]/DitheringMode=0
skint:1[DPY:DP-2]/DitheringDepth=0
skint:1[DPY:DP-2]/ColorSpace=0
skint:1[DPY:DP-2]/ColorRange=0
skint:1[DPY:DP-2]/SynchronousPaletteUpdates=0
skint:1[DPY:DP-3]/Dithering=0
skint:1[DPY:DP-3]/DitheringMode=0
skint:1[DPY:DP-3]/DitheringDepth=0
skint:1[DPY:DP-3]/ColorSpace=0
skint:1[DPY:DP-3]/ColorRange=0
skint:1[DPY:DP-3]/SynchronousPaletteUpdates=0
skint:1[DPY:DP-4]/RedBrightness=0.000000
skint:1[DPY:DP-4]/GreenBrightness=0.000000
skint:1[DPY:DP-4]/BlueBrightness=0.000000
skint:1[DPY:DP-4]/RedContrast=0.000000
skint:1[DPY:DP-4]/GreenContrast=0.000000
skint:1[DPY:DP-4]/BlueContrast=0.000000
skint:1[DPY:DP-4]/RedGamma=1.000000
skint:1[DPY:DP-4]/GreenGamma=1.000000
skint:1[DPY:DP-4]/BlueGamma=1.000000
skint:1[DPY:DP-4]/Dithering=0
skint:1[DPY:DP-4]/DitheringMode=0
skint:1[DPY:DP-4]/DitheringDepth=0
skint:1[DPY:DP-4]/ColorSpace=0
skint:1[DPY:DP-4]/ColorRange=0
skint:1[DPY:DP-4]/SynchronousPaletteUpdates=0
skint:1[DPY:DP-5]/Dithering=0
skint:1[DPY:DP-5]/DitheringMode=0
skint:1[DPY:DP-5]/DitheringDepth=0
skint:1[DPY:DP-5]/ColorSpace=0
skint:1[DPY:DP-5]/ColorRange=0
skint:1[DPY:DP-5]/SynchronousPaletteUpdates=0
skint:1[DPY:USB-C-0]/Dithering=0
skint:1[DPY:USB-C-0]/DitheringMode=0
skint:1[DPY:USB-C-0]/DitheringDepth=0
skint:1[DPY:USB-C-0]/ColorSpace=0
skint:1[DPY:USB-C-0]/ColorRange=0
skint:1[DPY:USB-C-0]/SynchronousPaletteUpdates=0
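(For reference, the dump above has skint:1.0/AllowGSYNC=0 with only the indicator enabled. With G-Sync actually allowed, those two lines would read as follows — same skint:1.0 target as above, attribute names taken straight from the dump:)

```
skint:1.0/AllowGSYNC=1
skint:1.0/ShowGSYNCVisualIndicator=1
```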

NVIDIA have put out the 410.93 driver for Linux today
7 Jan 2019 at 5:58 pm UTC

Quoting: cprn
Quoting: Guest
[...] Also, do not use the compton config together with the nvidia composition options - that combo made my screen tear.
I believe I tried every possible solution but just to be clear: which one helped you with tearing?
  • driver composition options without compton (at all? just without vsync option?) or
  • compton without driver composition options
I'd really like to see a video of what one considers too much tearing and no tearing.

I never notice that much tearing unless I'm trying to wiggle my windows around as fast as possible, or if I'm attempting to play a 4K YouTube video when I have clocked the video card down to minimum for battery saving.

NVIDIA to support VESA Adaptive Sync with 'G-SYNC Compatible' branding
7 Jan 2019 at 5:41 pm UTC

Quoting: mahagr
Just a note on G-Sync with multiple displays: make sure you disable the secondary screen(s) from nVidia settings (or Ubuntu display settings etc.) or G-Sync will not work. Just turning the screen off doesn't work either; it has to be disabled from the X server. This is on the latest Ubuntu, but I believe it applies to all the other distros as well.

It's a bit annoying that you cannot use the secondary screens while gaming, but it is what it is and I can live with it. It would be cool, though, if the secondary screens were automatically turned off, just like in Windows.
What if you have two G-Sync monitors?
Edit: helps if I read the rest of the replies.

From all I've seen, G-Sync works fine on both monitors, though I never got the 'gsync indicator' to work; it's been there since long before the 4xx drivers though, contrary to what the other person said.
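mahagr's workaround of disabling the secondary screen from the X server can be scripted with xrandr. A hedged sketch — the output name, mode and refresh rate in the usage lines are assumptions; list yours with `xrandr --query`:

```shell
# disable_output / enable_output: turn an X output off/on with xrandr
disable_output() { xrandr --output "$1" --off; }
enable_output()  { xrandr --output "$1" --mode "$2" --rate "$3"; }

# Usage (not run here; needs a running X server, and DP-4 is an assumption):
#   disable_output DP-4
#   ...play with G-Sync on the remaining screen...
#   enable_output DP-4 2560x1440 144
```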

The first beta for Lutris 0.5 is out with a refreshed UI and GOG support
6 Jan 2019 at 9:00 pm UTC

Quoting: iiari
Quoting: HavingtoGetSigned
How do I install this in Ubuntu Budgie 18.04 Bionic?
https://linuxconfig.org/install-lutris-on-ubuntu-18-04-bionic-beaver-linux
The instructions are right there on their page...
https://lutris.net/downloads/

NVIDIA have put out a new Vulkan beta driver with better pipeline creation performance
6 Jan 2019 at 8:57 pm UTC

Quoting: mahagr
nVidia drivers are of bad quality, which is likely the main reason why they do not open up the code. I'm frustrated with their drivers, as they force the card to run at maximum power for 45 seconds every time there's an OpenGL draw call. Basically it means that if you install Ubuntu and use the default Gnome (which uses the OpenGL X composite extension), your graphics card never goes into a powersave state and consumes ~4x more power than it should. They have the same issue on Windows, but because of architectural differences it's not as bad there.

I guess they do that because nobody has bothered to implement a proper power saving feature, and because not running the card at maximum power makes the cards look bad in benchmarks. IMHO they really should fix the issue and allow cards to run at optimal clocks, as laptops get more and more common. :)
You sure about that? I've set mine to power save mode and it doesn't seem to jump around when I use Gnome. Granted, I also set the PowerMizer settings on my laptop.
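The PowerMizer settings mentioned here are usually pinned via the driver's RegistryDwords option in xorg.conf. A commonly shared sketch — the `Option "RegistryDwords"` mechanism is real, but the exact dword values below are community lore rather than documented nVidia behaviour, so treat them as an assumption:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Pin PowerMizer to the lowest performance level while on AC power
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x3"
EndSection
```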

NVIDIA have put out the 410.93 driver for Linux today
4 Jan 2019 at 9:35 pm UTC Likes: 1

Allow G-SYNC, assuming you have a G-Sync monitor, will prevent tearing.

I do love my G-Sync monitors, but man, were they pricey.