
NVIDIA have a little present for Linux fans today, with the release of the 435.17 beta driver.

This is a beta driver and it includes quite the highlight: the addition of PRIME render offload support for Vulkan and OpenGL. This is where you might have your Intel GPU running most normal applications, with an NVIDIA chip then powering your games. It's usually found in notebooks and it's been a source of annoyance for NVIDIA notebook owners for a long time, so it's really pleasing to see proper progress like this.

It comes with some caveats though, as it needs a very up-to-date X.Org Server with git commits not available in a normal release yet. However, if you're on Ubuntu 19.04 or 18.04, NVIDIA have provided a PPA. There's a little additional work needed for now too; you can read more about the PRIME render offload support here.
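To give an idea of that extra work: once the driver and the patched X server are in place, offloading is selected per application through environment variables. The variable names below are the ones NVIDIA's PRIME render offload documentation describes; glxinfo and vkcube are just stand-ins for whatever you actually want to run on the NVIDIA GPU.

# Run an OpenGL application on the NVIDIA GPU and confirm the vendor string:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor

# Vulkan applications only need the first variable:
__NV_PRIME_RENDER_OFFLOAD=1 vkcube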

For the rest of what's in this new driver, it has the usual assortment of bug fixes and "experimental support for runtime D3 (RTD3) power management on Turing notebook GPUs". The full changelog can be found here.

32 comments
edo Aug 14, 2019
I have been waiting for this for so many years.
qgnox Aug 14, 2019
Quoting: Ivancillo
Quote: This is where you might have your Intel GPU running most normal applications, with an NVIDIA chip then powering your games.

Does this mean that it only works on laptops with an Intel CPU?

What about Ryzen ones?
It works, I got it running on a Ryzen 7 3750H laptop with a Picasso (Vega) iGPU and an NVIDIA GTX 1660 Ti. This is the /etc/X11/xorg.conf if you have one of those laptops:
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "amd"
    Inactive "nvidia"
    # Lets X create a GPU screen on the NVIDIA card for render offload
    Option "AllowNVIDIAGPUScreens"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    # PCI bus ID of the discrete GPU; find yours with lspci (X wants decimal values)
    BusID "1:0:0"
EndSection

Section "Device"
    Identifier "amd"
    Driver "modesetting"
    Option "TearFree" "true"
    Option "DRI" "3"
    # PCI bus ID of the integrated GPU
    BusID "5:0:0"
EndSection

Section "Screen"
    Identifier "amd"
    Device "amd"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
EndSection
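A quick verification note (my addition, not from the comment): with a layout like the one above you can check that X actually created the NVIDIA GPU screen, and look up your own bus IDs. Note that X expects decimal bus IDs while lspci prints hex.

# Should list the integrated GPU's provider plus an "NVIDIA-G0" GPU screen provider:
xrandr --listproviders

# PCI addresses of both GPUs (hex; convert to decimal for the BusID lines above):
lspci | grep -E "VGA|3D"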
Boldos Aug 14, 2019
You are a decade late, nVidia... Too damn late.
I just bought a new Ryzen 3700U + Vega 10 notebook, and I'm not looking back!
Ivancillo Aug 14, 2019
Quoting: qgnox
It works, I got it running on a Ryzen 7 3750H laptop with a Picasso (Vega) iGPU and an NVIDIA GTX 1660 Ti. [xorg.conf snipped, see above]

Thanks.

On the AMD side, are you using AMDGPU PRO or just AMDGPU?
qgnox Aug 14, 2019
Quoting: Ivancillo
Thanks.

On the AMD side, are you using AMDGPU PRO or just AMDGPU?
amdgpu. You can also swap the modesetting driver for amdgpu in the xorg.conf to get less tearing in apps running on the AMD GPU, but it doesn't affect the NVIDIA offload.
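For anyone wondering what that swap looks like (a sketch based on the comment, reusing the bus ID from the config above): only the integrated GPU's Device section changes, since the amdgpu DDX driver has its own TearFree implementation.

Section "Device"
    Identifier "amd"
    Driver "amdgpu"
    Option "TearFree" "true"
    Option "DRI" "3"
    BusID "5:0:0"
EndSection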
minkiu Aug 15, 2019
Quoting: Nanobang
I gave up futzing with all this a long time ago, shortly after Primus came along. I set my (I-will-never-buy-another) Optimus laptop to "Nvidia" and keep it plugged in. The downside is that it sounds and feels like an idling Harrier Jump Jet. The upside is that it will probably die sooner, and the sooner it dies, the sooner I can look into non-Optimus options. :D

I am in the same boat, although (at least on Fedora with the RPM Fusion drivers) you can disable the GPU at boot time in grub, so if I know I'm not going to game, I just use Intel/Nouveau.

It would be really cool if updating the nvidia drivers created a new grub entry with nvidia disabled. I saw an open ticket (don't remember in which project) for exactly this... but it didn't get much traction.

That said, I tried doing something myself, but can't really seem to get my head around the grubby docs.
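For what it's worth, a minimal sketch of that idea with grubby, assuming Fedora's usual kernel and initramfs paths; the entry title and blacklist arguments are illustrative, not something from the comment:

# Clone the default boot entry and blacklist the NVIDIA/nouveau modules in the copy:
sudo grubby --copy-default \
    --add-kernel=/boot/vmlinuz-$(uname -r) \
    --initrd=/boot/initramfs-$(uname -r).img \
    --title "Fedora (nvidia disabled)" \
    --args "modprobe.blacklist=nouveau,nvidia,nvidia_drm,nvidia_modeset"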
Luke_Nukem Aug 15, 2019
Well... I've been playing No Man's Sky v2.0 through Proton on the beta drivers. Nice, smooth, fast.

Laptop is an MSI GS65-Stealth RTX-2060. Also played a few games through Proton D9VK without issue too... I'm blown away...

Power use with power-management set up drops down to 7-10w for browsing etc. 7w just idling. 4-5w with screen off. Guesstimate 6-10 hours battery time depending on what I'm doing.

To get proper power-management I needed to do:
sudo tee /sys/bus/pci/devices/0000:01:00.0/power/control <<< auto
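Some context on that (my addition, based on the runtime D3 notes in NVIDIA's driver documentation rather than this comment): the experimental RTD3 support is enabled through a kernel module option, and the power/control write above can be automated with a udev rule so it survives reboots. A rough sketch, file paths assumed:

# /etc/modprobe.d/nvidia-pm.conf -- opt in to fine-grained dynamic power management
options nvidia "NVreg_DynamicPowerManagement=0x02"

# /etc/udev/rules.d/80-nvidia-pm.rules -- allow runtime suspend of the NVIDIA GPU
# (0x10de is NVIDIA's PCI vendor ID, 0x030000 the VGA controller class)
ACTION=="bind", SUBSYSTEM=="pci", ATTR{vendor}=="0x10de", ATTR{class}=="0x030000", ATTR{power/control}="auto"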
sigz Aug 15, 2019
Quoting: Leopard
Quoting: sigz
Quoting: Leopard
Bumblebee is trash and not necessary.

Don't say that... Bumblebee helped a lot in the past when there were no other solutions.

No? Bumblebee didn't help with anything. PRIME was a better solution; at least it was reliable.

While Bumblebee was not.

Seems you never knew the time before PRIME existed. Bumblebee was a solution, not a perfect one, but it was here before PRIME. You can't say it's trash.


Last edited by sigz on 15 August 2019 at 12:27 pm UTC
Munk Aug 15, 2019
It would be interesting to experiment with offloading the overhead of desktop rendering to the capable-enough onboard graphics, which otherwise just go unused on a desktop. Unless there's some large overhead to this, which I don't see why there would be, I would expect to see modest performance gains, especially when running multiple high-resolution displays where only one is used for gaming.

Does anyone know if the VRAM is separate as well? If so, that'd be a major boon, especially for AI work. Right now around half of my VRAM can be eaten up just by basic multitasking. I'd love to be able to offload this to system memory and have my VRAM reserved for processes that actually need the performance.

Another interesting thing would be the possibility of discrete GPU driver updates without having to reload X.

As long as your onboard graphics are good enough for your basic desktop tasks, being able to pick and choose which applications use your discrete GPU seems like a major win for desktop users just as much as laptop users.
Luke_Nukem Aug 15, 2019
Quoting: Munk
It would be interesting to experiment with offloading the overhead of desktop rendering to the capable-enough onboard graphics, which otherwise just go unused on a desktop. [snip]

This is unworkable due to hardware differences. Laptops use a mux to direct graphics output and the two GPUs are fairly tightly integrated. A desktop has them as very separate components with separate outputs and memory, so the copy from the graphics card's RAM to the iGPU's RAM would be hideously slow.

But you could hook up two displays :shrug: