
Update: You can do this more easily now with the NVIDIA control panel. See this newer article for info.

Thanks to a few different people for their advice (xpander for the initial advice on a script, and HeavyHDx on Twitter), I have finally found a way to stop screen tearing with the NVIDIA proprietary drivers.

I have been struggling with this issue for months across all the different desktop environments I've tried (KDE, GNOME, Cinnamon, Unity), and it has caused me a fair amount of headaches and stress, so I am pleased to finally find a solution. It's not perfect, and it's slightly annoying, but it's quite useful too.

You have probably heard of ForceFullCompositionPipeline before, and that is what I am using. I have two scripts set up on keyboard shortcuts, depending on the resolution I am using (4K or 1080p). Why both, I hear you ask? It's simple: performance in a lot of games at 4K resolution is terrible, and some games have tiny, unreadable text, so I run certain games at 1080p.

Here's where the confusion came from...
The problem with ForceFullCompositionPipeline is that when you play a game whose fullscreen mode changes your desktop resolution, instead of stretching a fullscreen window, ForceFullCompositionPipeline gets reset back to disabled. If you have noticed screen tearing returning at times even while using ForceFullCompositionPipeline, that could well be your issue too. If, like me, you didn't know that, it was probably bugging you a lot. This is also why simply putting it in an xorg config file will not solve it 100%, whereas with this method you can just re-run the script any time you need to.
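A quick way to check whether a game has reset it is to ask the driver what the current metamode is, using the standard query mode of nvidia-settings (the exact output format may vary between driver versions):

nvidia-settings --query CurrentMetaMode

If ForceFullCompositionPipeline no longer appears in the output, it has been reset and you need to re-run the script.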

So, here are the two very simple scripts I run. They are both saved as plain text files and allowed to run as executables (right click -> Properties -> Permissions -> tick "Allow executing file as program").
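If you prefer the terminal, the same permission change can be done with chmod (the filename here is just an example, call yours whatever you like):

chmod +x fix-tearing-4k.sh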

First up is the one for 4K resolution (I have this set to run at start-up, so I don't have to mess with xorg stuff directly):
nvidia-settings --assign CurrentMetaMode="DP-4:3840x2160_60 +0+0 { ForceFullCompositionPipeline = On }, DVI-I-1:1920x1080_60 +3840+0 { ForceFullCompositionPipeline = On }"
And for 1080p resolution:
nvidia-settings --assign CurrentMetaMode="DP-4:1920x1080_60 +0+0 { ForceFullCompositionPipeline = On }, DVI-I-1:1920x1080_60 +1920+0 { ForceFullCompositionPipeline = On }"
If you only have one monitor, you won't need the part after the comma.
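For example, a single-monitor version of the 4K one would simply be:

nvidia-settings --assign CurrentMetaMode="DP-4:3840x2160_60 +0+0 { ForceFullCompositionPipeline = On }"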

You can run the script at any time. Your monitor(s) will blink, and then come back all sorted.
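As for running a script at start-up: one common way on most desktops (the file name and script path below are just examples you would adjust) is to drop a desktop entry into the ~/.config/autostart folder:

[Desktop Entry]
Type=Application
Name=Fix screen tearing
Exec=/home/you/bin/fix-tearing-4k.sh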

You will of course need to change things like "DP-4" and "DVI-I-1" to the connections your monitor is using (or monitors, in my case, as I have two). You can find them by running the "xrandr" command in a terminal. It will give you a list of outputs, like this:

DP-4 connected primary 3840x2160+0+0 (normal left inverted right x axis y axis) 621mm x 341mm
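If the full output is a bit much, you can filter it down to just the connected outputs (the leading space in the pattern stops it matching "disconnected"):

xrandr | grep " connected"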

I hope this helps someone else, as it has been driving me nutty. They are pretty safe scripts to use (I have been testing switching between them constantly), but don't blame me if you blow your computer up.

These two little scripts have literally changed my gaming life on Linux for the better.

Where it becomes even more useful
A nice side-effect of the script: games with poor multi-monitor support, like RunningWithRifles, will actually turn off my main monitor. Hitting the desktop shortcut I set for the script brings that monitor back, while still allowing me to play the game. So not only do you get zero tearing, you also get your normal multi-monitor experience back.

Feel free to share what methods you're using on your favourite desktops. Let's see if we can help each other in the comments.


IDNO May 13, 2016
Option "metamodes" "1920x1080_144 +0+0; {FroceFullCompositionPipeline = On}, viewportin=1920x1080, viewportout=1920x1080+240+0}"

with

Option "TearFree" "on"

it works for me in xorg.conf with vsync off :P

(btw yeah I disabled vsync and stuff, but I have a 144Hz monitor)
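For anyone unsure where those lines live: "metamodes" is an Option inside the Screen section of an xorg.conf (or a file under xorg.conf.d). A minimal sketch, with placeholder identifiers:

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "metamodes" "1920x1080_144 +0+0 {ForceFullCompositionPipeline=On}"
EndSection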
N30N May 13, 2016
Quoting: Guest
The compositor of Xfwm 4.12 has a vsync option, but it doesn't really work.
One of the developers said he had "tear-free compositing working here with GL support on proprietary NVidia driver with current code". So it's likely a misconfiguration, or a new bug that needs to be reported.


Last edited by N30N on 13 May 2016 at 4:43 pm UTC
fagnerln May 13, 2016
I'm happy with compton; I installed it and set it to run at login. Simple. But this is a good tip.

Some people use triple buffering, but one thing that needs more caution there is the VRAM usage.
N30N May 13, 2016
Quoting: liamdawe
I did make a bug report to the Cinnamon developers back in January with no reply: https://github.com/linuxmint/Cinnamon/issues/4990
Your bug report doesn't include enough information to be helpful; there's no mention of hardware, drivers, etc…

You could also try reporting it upstream, in your case that'd be: https://github.com/linuxmint/muffin
N30N May 13, 2016
Quoting: Guest
Quoting: N30N
One of the developers said he had "tear-free compositing working here with GL support on proprietary NVidia driver with current code". So likely a misconfiguration or a new bug needs to be reported.
They are talking about unreleased code, post 4.12.
Oh, yes. Well at least you know a fix is coming (and/or where to get it). ;)


Last edited by N30N on 13 May 2016 at 6:11 pm UTC
leonelpro May 13, 2016
Hi there.

Noob question. How could I run the script at boot in Ubuntu?

Thanks!
Liam Dawe May 13, 2016
Quoting: N30N
Quoting: liamdawe
I did make a bug report to the Cinnamon developers back in January with no reply: https://github.com/linuxmint/Cinnamon/issues/4990
Your bug report doesn't include enough information to be helpful, no mention of hardware, drivers, etc…

You could also try reporting it upstream, in your case that'd be: https://github.com/linuxmint/muffin
While true, no one even replied to ask for more info.

As for "upstream", well Cinnamon is the overall upstream for all of it since it affects Cinnamon directly.
eddie-foss May 13, 2016
[openSUSE Tumbleweed, AMD FX-8350, Nvidia GTX 760]

In my case, I set the nvidia driver to use "Performance Mode" by making a .desktop file to add to the autostart folder with this command [ /usr/bin/nvidia-settings -a '[gpu:0]/GPUPowerMizerMode=1' ].

When I was using KDE 5 some time ago, to avoid screen tearing I set KWin's triple buffering to true with [ export KWIN_TRIPLE_BUFFER=1 ], which I put in the .bashrc in my home directory, as I only have one user and it doesn't need special permissions.

For GNOME I only had screen tearing in Firefox after configuring the nvidia driver for performance mode. To fix tearing in fullscreen Firefox I found this command [ gdbus call --session --dest org.gnome.Shell --object-path /org/gnome/Shell --method org.gnome.Shell.Eval 'Meta.disable_unredirect_for_screen(global.screen);' ], which I also made into a .desktop file to put in the autostart folder.

I hope this helps someone else who's having the same problem.
TheRiddick May 13, 2016
I tried to use Mutter on my 390X; it caused the screen to wobble uncontrollably. I went back to Compton with GPU acceleration and the problem went away.


Last edited by TheRiddick on 13 May 2016 at 9:29 pm UTC
gqmelo May 14, 2016
Quoting: liamdawe
Quoting: gqmelo
Unfortunately for owners of laptops with PRIME (like me), there is as yet no vsync support at all:

https://devtalk.nvidia.com/default/topic/775691/linux/vsync-issue-nvidia-prime-ux32vd-with-gt620-m-/7
Have you tried this script to see if it helps at all?

Yes, I have tried these settings, but there is no way they can work on a laptop with PRIME, because PRIME involves an integrated Intel card together with a dedicated Nvidia one, and the way they interact with each other is different from a single card. There is an Nvidia developer assigned to solve this, but no good news yet.


Last edited by gqmelo on 14 May 2016 at 1:19 am UTC