The Witcher 3 in Wine
Pangaea 19 Feb, 2020
I'd never seen nor heard about the Flying Dutchman, and it looks like you basically have to know about it to come across it. Pretty cool easter egg, though.
DaiKaiser93 24 Feb, 2020
So after playing for a while (75 hrs), and even though I've reached Skellige, I've been having some problems.

Sometimes, when a cut-scene happens or there's too much detail to render (in the forest, for example), I get what I can only describe as a GPU crash: loss of video signal, the fans start to spin faster, the audio continues, but I can't Ctrl+Alt+F4, requiring me to force shutdown the system.

I started forcing the fan speed to 80% with radeon-profile before starting the game, but even so, it still sometimes happens.
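(For anyone wanting to do the same without radeon-profile: on the amdgpu driver, fan speed can be forced through sysfs. A rough sketch, assuming the RX 580 is card0; the exact hwmon path varies per system and this needs root.)

```shell
# Force the GPU fan to ~80% via the amdgpu sysfs hwmon interface.
# Hypothetical paths -- check your own card index and hwmon number.
HWMON=$(ls -d /sys/class/drm/card0/device/hwmon/hwmon*)
echo 1 > "$HWMON/pwm1_enable"   # 1 = manual fan control mode
echo 204 > "$HWMON/pwm1"        # pwm1 is 0-255, so 204 is roughly 80%
```

Setting pwm1_enable back to 2 returns the fan to automatic control.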

I recently upgraded my PSU from an EVGA 500 BT to a Corsair CX750M, believing that maybe my RX 580 was not getting sufficient power.

I have read that the game apparently has memory leaks. Is that a thing?

Has anyone else played the game through Lutris? I can't really run it via Steam as I don't use it.

I apologize if I posted this in the wrong section. It's just that I want to continue playing this game, and I want to hear if anyone has had the same problems.
Shmerl 24 Feb, 2020
I'm playing TW3 (GOG version) using Wine with esync patches + dxvk. I don't really use Lutris in general. It works very well, and I haven't really noticed any memory leaks.
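(For reference, a setup like this can be launched without Lutris with a small script. A minimal sketch, assuming an esync-capable Wine build and DXVK already installed into the prefix, e.g. with DXVK's setup script; the paths are examples, not the actual install locations.)

```shell
#!/bin/sh
# Launch the GOG version of The Witcher 3 through plain Wine.
# WINEPREFIX path and install directory are hypothetical examples.
export WINEPREFIX="$HOME/games/witcher3-prefix"
export WINEESYNC=1   # enable esync; requires a Wine build with the esync patches

cd "$WINEPREFIX/drive_c/GOG Games/The Witcher 3 Wild Hunt/bin/x64" || exit 1
exec wine witcher3.exe
```

With DXVK's D3D11 DLLs in the prefix, the game's DirectX 11 calls get translated to Vulkan; setting DXVK_HUD=fps is a handy way to confirm DXVK is actually in use.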
Pangaea 24 Feb, 2020
Haven't noticed such issues either, and I have a GTX 770 card. If you have the GOG version, I would recommend installing it using this script:

I believe it's basically what shmerl mentioned, but everything is set up for you, so no need to muck about with stuff yourself. It runs very well on my end.

PS: You may also want to download and install this package of packages (heh): it is made by the author of the script, and contains packages that are usually needed for running games on Linux. Maybe you lack something here, and that is what causes the problem?
Shmerl 25 Feb, 2020
Interesting find: Reverse engineering the rendering of The Witcher 3.

Last edited by Shmerl on 25 February 2020 at 2:27 am UTC
catbox_fugue 25 Feb, 2020
nvidia hairworks is the worst thing ever.

Upgrading my GPU recently has me very interested in continuing my Witcher 3 GOTY playthrough.
Shmerl 25 Feb, 2020
I always disable hairworks.
DaiKaiser93 27 Feb, 2020
Quoting: Shmerl
I'm playing TW3 (GOG version) using Wine with esync patches + dxvk. I don't really use Lutris in general. It works very well. I haven't noticed memory leaks really.

I just disassembled my GPU. Apparently I forgot to add one of the thermal pads the last time I changed the thermal paste (about a year ago). I've got some thermal pads arriving tomorrow, so I will update if that was the reason.

EDIT: Just finished playing through The Battle of Kaer Morhen, and with all the effects used I was expecting a couple of crashes, but it ran without a problem.

Ended up adding two layers of Arctic thermal pad on each chip.

Last edited by DaiKaiser93 on 2 March 2020 at 5:27 am UTC
Mohandevir 4 Mar, 2020
Here is the context... I'm on Ubuntu 19.10 with Nvidia driver 440.59 (GTX 1660 Super). I've been trying to play Witcher 3 on the Gnome/KDE/Cinnamon desktops and had a lot of tearing/stuttering issues. I tried the usual "Force Full Composition Pipeline" and it kind of worked (there is still some micro stuttering), but if I try to activate the in-game Vsync, I'm back to square one. Thing is, without Vsync there is no fps cap and my GPU fans tend to skyrocket.

So... This week, I discovered a workaround... Just creating the xorg.conf file (via the nvidia control panel), without the "Force Full Composition Pipeline" option seems to solve my issue... After that, in-game Vsync works great, comparable to when I run my games in the SteamOS-compositor.
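(The resulting file would look something like the fragment below: the sections nvidia-settings writes out, minus any ForceFullCompositionPipeline entry in the Screen section's MetaMode options. This is a hedged sketch of a minimal config, with example identifiers, not a copy of the actual generated file.)

```
# Minimal /etc/X11/xorg.conf, as nvidia-settings might generate it,
# deliberately WITHOUT "ForceFullCompositionPipeline=On" in the metamodes.
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
EndSection
```

With "Force Full Composition Pipeline" enabled, the same Screen section would carry an Option "metamodes" line ending in {ForceFullCompositionPipeline=On}; removing that option hands frame pacing back to the application's own vsync.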

Does it make sense? Or is this a placebo effect? :)

Last edited by Mohandevir on 4 March 2020 at 2:35 pm UTC
Shmerl 5 Mar, 2020
From what I've read, "Force Full Composition Pipeline" was always an ugly, poorly performing hack that Nvidia came up with to compensate for the lack of proper vsync support. That in turn is caused by the fact that they refuse to upstream their driver, and therefore can't integrate it with the proper DRM/KMS interfaces that handle vsync.

So I'm not surprised that it's giving a bad experience.

Last edited by Shmerl on 5 March 2020 at 4:32 am UTC