
How To: An update on fixing screen-tearing on Linux with an NVIDIA GPU

Posted by , 11 January 2017 at 3:57 pm UTC / 16057 views
My original guide on how to help fix screen-tearing on Linux with an NVIDIA GPU is a bit dated, so here’s an even easier way.

Notes
You will likely need the 375.26 driver or newer for this to show up in "nvidia-settings".

These options may cause a loss in performance. For me personally, the loss is next to nothing.

It probably won't work with Optimus right now, but this may be fixed in the future.

What to do
Previously you needed to edit config files, and it was a little messy. Thankfully, NVIDIA has added options to nvidia-settings that essentially do it all for you. The options arrived in a recent driver version, so be sure you're up to date.

Load "nvidia-settings", go to the screen shown below, and hit "Advanced" at the bottom (my screenshot doesn't have the button, as this is what you see after you hit it):
[Screenshot: the nvidia-settings display configuration screen with the Advanced options visible]
Tick the boxes for “Force Composition Pipeline” and “Force Full Composition Pipeline” and then hit "Apply".

You can then enjoy a tear-free experience on Linux with an NVIDIA GPU. It really is that damn easy now.

Note: You will likely need to run nvidia-settings with “sudo” for the below to work.
If you want this applied automatically on startup (without needing to do anything), you can hit "Save to X Configuration File". I have mine located at "/etc/X11/xorg.conf.d/xorg.conf" on Antergos, but your location may be different. I also recommend backing up any existing xorg.conf file first.
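For reference, the saved file carries the setting as a "metamodes" option in the Screen section. A minimal sketch of what that section can look like — the Identifier/Device names and the connector name are placeholder examples, not taken from the article:

```
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "metamodes" "DVI-I-1: 1920x1080_60 +0+0 {ForceFullCompositionPipeline=On}"
EndSection
```

If nvidia-settings writes a full xorg.conf for you, you only need to check that the "metamodes" line includes the ForceFullCompositionPipeline token.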

This step isn't needed, but it's a fun and useful extra!
I also have mine set up as a script bound to a keyboard shortcut, for those times when a game reverts the desktop to a low resolution after exiting, or turns off a second monitor; the shortcut turns everything back on.

For that I manually set the resolution like so:
nvidia-settings --assign CurrentMetaMode="DVI-I-1:1920x1080_60 +0+0 { ForceFullCompositionPipeline = On }, HDMI-0:1920x1080_60 +1920+0 { ForceFullCompositionPipeline = On }"
Edit that for your details, like your resolution and monitor connections (you can see them by running "xrandr --query" in a terminal), and save it under an easy-to-remember filename. You can then set it as a custom shortcut; I use "CTRL+ALT+F4" as it's not used for anything else.
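The command above can be wrapped into a small script for the keyboard shortcut. A sketch, using the article's example connector names (DVI-I-1, HDMI-0), resolutions and offsets — substitute your own from "xrandr --query":

```shell
#!/bin/sh
# Reapply the desired metamode after a game has messed with the display:
# restores the resolution, turns a switched-off second monitor back on,
# and keeps the composition pipeline forced on.
METAMODE="DVI-I-1: 1920x1080_60 +0+0 {ForceFullCompositionPipeline=On}, HDMI-0: 1920x1080_60 +1920+0 {ForceFullCompositionPipeline=On}"

if command -v nvidia-settings >/dev/null 2>&1; then
    nvidia-settings --assign "CurrentMetaMode=$METAMODE"
else
    # Fall back to showing the command on systems without the NVIDIA tools.
    echo "nvidia-settings not found; would run:"
    echo "nvidia-settings --assign CurrentMetaMode=$METAMODE"
fi
```

Save it somewhere convenient, make it executable with "chmod +x", and point your desktop's custom keyboard shortcut at it.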

This has been tested and works for me perfectly across Ubuntu Unity, Ubuntu MATE and Antergos KDE.
Comments

tuubi commented on 11 January 2017 at 9:12 pm UTC

rea987: Because Linux Mint will stick with kernel 4.4 for two years until the next Ubuntu LTS is released. Of course it will be possible to install backports, but it will still be a hassle for end users.
No need for backports. 4.8 is available in the official repositories, and can be installed using Mint's kernels dialog. It only takes a few clicks.


liamdawe
tuubi: BTW: Compton's VSync guide also has this to say about NVIDIA's ForceFullCompositionPipeline option:
Quote: "However it's indicated that it introduces huge (~30%) performance loss on some OpenGL applications."
I guess this info could be old and obsolete though?
That's very outdated information; my own tests have shown a 2-3 FPS difference, which can be written off as run-to-run benchmark variance. Older cards may see more of an impact, but it hasn't been a problem for quite some time.
Good to hear.


knotted10 commented on 11 January 2017 at 9:26 pm UTC

rea987:
knotted10: I've done some research on this issue; it will be fixed with the release of xorg-server 1.19. Having kernel 4.5+ and the latest versions would fix every screen-tearing issue for Optimus laptops. I'm actually waiting for the moment Arch Linux releases that xorg version to the stable repos, and then I'll switch to Manjaro right away.

That's interesting, because Linux Mint will stick with kernel 4.4 for two years until the next Ubuntu LTS is released. Of course it will be possible to install backports, but it will still be a hassle for end users.

iplaygameswearingatux: Guys who have laptops with two graphics cards: don't waste your time reinstalling drivers. For now, NVIDIA isn't really bothering to optimise drivers for laptops. Editing xorg.conf seems like the only way to go for us, but I don't know if it would work.

Yeah, I figured that out a long time ago. But finding a magical xorg.conf for the exact hardware is quite hard...

For me it was pretty straightforward: open Update Manager --> View --> Linux Kernels --> select 4.5.


yzmo commented on 11 January 2017 at 9:33 pm UTC

Somehow none of those checkboxes show up for me...


stan commented on 11 January 2017 at 9:35 pm UTC

tuubi: Compton seems to be a popular choice for Xfce users like us, and for me it works pretty much perfectly, but looking at this I think the optimal settings might be different for different graphics hardware. In fact, the guide suggests that it might be better to use VSync options provided by the driver over Compton's own, if available.

(…)

BTW: Compton's VSync guide also has this to say about NVIDIA's ForceFullCompositionPipeline option:
Quote: "However it's indicated that it introduces huge (~30%) performance loss on some OpenGL applications."
I guess this info could be old and obsolete though?
I just ran the benchmark of Deus Ex: MD to compare the performance with Compton and ForceCompositionPipeline:
- No VSYNC at all (with tearing): 52.8 FPS
- ForceCompositionPipeline/ForceFullCompositionPipeline: 52.1 FPS
- Compton: 51.7 FPS

I ran each test twice and there was very little variation, if any. That's on a GTX 1060 with the NVIDIA 375.26 drivers. Compton options: "--vsync opengl-swc --paint-on-overlay --backend glx --glx-no-stencil". Window manager: Openbox.

FCP and FFCP both remove tearing and give the same framerate.

So that’s a 2% loss for Compton and 1.3% loss for FCP, for my system and this game… But it could be higher for different hardware/software (I measured a 9% loss with Compton on my GTX 660 on Dying Light some months ago).

Edit: for fun and completeness I also ran the benchmark with the ingame vsync:
- double buffering: 32.8 FPS
- triple buffering: 32.2 FPS

Obviously the triple buffering isn’t working as it should (it should return similar values to Compton and FCP).


Last edited by stan at 11 January 2017 at 10:39 pm UTC


jasondaigo commented on 11 January 2017 at 9:36 pm UTC

Sadly Dying Light still gives me motion sickness; no tearing though xD


liamdawe commented on 11 January 2017 at 9:50 pm UTC

yzmo: Somehow none of those checkboxes show up for me...
What driver version, what GPU, is it Optimus?


yzmo commented on 11 January 2017 at 9:53 pm UTC

liamdawe:
yzmo: Somehow none of those checkboxes show up for me...

What driver version, what GPU, is it Optimus?

I'm running 375.26, no Optimus. Perhaps it only works if multiple monitors are present, as NVIDIA treats it as a per-monitor setting?


liamdawe commented on 11 January 2017 at 9:57 pm UTC

yzmo:
liamdawe:
yzmo: Somehow none of those checkboxes show up for me...

What driver version, what GPU, is it Optimus?

I'm running 375.26, no Optimus. Perhaps it only works if multiple monitors are present, as NVIDIA treats it as a per-monitor setting?
No, it works with any number of monitors.

Send a screenshot?


yzmo commented on 11 January 2017 at 10:37 pm UTC

Here:
[Screenshot: yzmo's nvidia-settings window]


liamdawe commented on 11 January 2017 at 10:39 pm UTC

What version of nvidia-settings do you have?

