
NVIDIA today released a big new stable driver for Linux, version 450.57. It pulls in a whole bunch of features from the recent 450.51 Beta.

Compared with the Beta, it looks mostly the same plus a few extra fixes. However, it's worth a reminder now that it's stable, because everyone can upgrade knowing it's a supported driver version. NVIDIA 450.57 is exciting for a few reasons, one of which is the inclusion of support for NVIDIA NGX, which brings features like DLSS to the Linux drivers.

There's also now Image Sharpening support for OpenGL and Vulkan, support for Vulkan direct-to-display on DisplayPort displays connected via DisplayPort Multi-Stream Transport (DP-MST), various VDPAU improvements, and several PRIME enhancements: support for PRIME Synchronization when displays driven by the xf86-video-amdgpu driver are used as PRIME display offload sinks, along with "Reverse PRIME" support.
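If you want to try the PRIME render offload side of this, NVIDIA's documented environment variables let you run a single application on the NVIDIA GPU while the rest of the desktop stays on the integrated one. A minimal sketch, assuming a correctly configured hybrid setup and that `glxgears` is installed:

```shell
# Run just this one application on the NVIDIA GPU (render offload),
# leaving the rest of the session on the integrated GPU.
# GLX applications need both variables; Vulkan applications only
# need __NV_PRIME_RENDER_OFFLOAD.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
```

Both variables come from NVIDIA's own driver README; without the offload-capable X server and driver bits in place, the command will simply run on the default GPU.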

On the bug fix side, one of the big ones is that it should be a smoother Wayland experience, as NVIDIA fixed a bug that could cause a KDE Plasma session to crash when running under Wayland. They also fixed a bug that prevented X11 EGL displays from being reinitialized. Another KDE issue was solved too: after some investigation, the NVIDIA team found that KDE panels freezing when compositing was disabled was a problem in their driver, so that was fixed as well.

See the release notes here.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. Find me on Mastodon.
The comments on this article are closed.

jens Jul 10, 2020
  • Supporter
Quoting: EikeI don't understand your negativity. What are you fearing?

I guess Nvidia taking over the world ;)

On a more serious note, I guess it is no secret that our @Shmerl has a very strong distaste for closed source technology coming from the likes of Nvidia or Microsoft. I guess he is either trying to use every argument to convince users to stay away from it, or his opinion is already so biased by knowing where it comes from that he can't reason objectively anymore. Could also be both.

@Shmerl, please don't take it personally, I value your knowledge on everything concerning Open Source technology. To quote another user, please "define yourself by the things you love and not by the thing you hate". I would recommend you stay away from topics with the green banner on them.


Last edited by jens on 10 July 2020 at 9:56 am UTC
dubigrasu Jul 10, 2020
Quoting: jens"define yourself by the things you love and not by the thing you hate"
Nice, who said that? (I see it is a quote from something?)
TheRiddick Jul 10, 2020
Quoting: ShmerlSo it doesn't increase quality,

I don't think you understand what's going on one bit. But 'when' everyone is doing things similar to what DLSS does, I'll watch you eat your own hat! :)

Also, as someone pointed out, native everything would be great, but let's face it: unless a magical fairy comes down from the silicon heavens and unleashes a compute power revolution, we aren't going to see consumer CPUs or GPUs handle future graphics very well without some way to 'optimize' performance at higher resolutions or frame rates.

That said, DLSS 2.0 has shown that in some areas a 1440p image can look better than 2160p. It's not universal, but it CAN look decently better in places.


Last edited by TheRiddick on 10 July 2020 at 10:36 am UTC
jens Jul 10, 2020
  • Supporter
Quoting: dubigrasu
Quoting: jens"define yourself by the things you love and not by the thing you hate"
Nice, who said that? (I see it is a quote from something?)

Thanks for the response. Yeah, I guess some places could be a lot nicer if everyone (me included) kept that quote in the back of their minds!

I read it somewhere here on GOL in another discussion; unfortunately I'm not sure anymore where exactly that was, and I have no idea where it originates from (I haven't looked, though).
Projectile Vomit Jul 10, 2020
Quoting: furaxhornyx
Quoting: Projectile VomitI'm still wrestling with this damn Nvidia/Intel hybrid thing. I recently switched to Manjaro (KDE - I love KDE. Leave me alone.) and had never seen this hybrid thing until now. I tried switching to just the Nvidia 440 drivers, and rebooting did not give me the desired results. I am the guy who simply decides to reformat (after backing up everything using a live USB) when I mess up the graphics; I have never been very good at recovering a system from a graphics issue. So I reformatted with Manjaro (I get better results with my music production than from other distros, which may have something to do with the hybrid video drivers, as Nvidia is known not to play nice with audio production, but I'm not entirely sure). I am back on the hybrid drivers and, for now, I'm leaving them. Music production is a bit more important to me than games, at least on this computer (my only computer at this time). I hope a switch that doesn't have me altering files and jumping through hoops comes along soon.

While Manjaro runs fine on my desktop (no hybrid), I had the same problem as you with my MSI laptop. I finally installed Linux Mint on it, and it works OK. I haven't tried making music on it though.

I was using Linux Mint until they dropped their KDE spin. Ever since the loss of GNOME 2.0, I have stuck with KDE.
melkemind Jul 10, 2020
Quoting: Shmerl
Quoting: TheRiddickHave you even tried DLSS 2.0? (and 3.0 down the pipe will be better)

Your comments strongly suggest you've only ever read about it
If you increase the resolution, the reconstructed image can only be an approximation, no matter how much machine learning you throw at it. That's just how it works by definition.

What you're saying might actually matter if we were talking about images of real things. These are computer-generated images in the first place. Using an A.I. to "approximate" them is meaningless to your eyes. If it looks right and runs faster on your machine, why would it matter?
Shmerl Jul 10, 2020
Quoting: herbertAs he said, you're just missing the point. It does increase quality if you set lower settings.

I explained my point. It decreases quality in comparison with using high settings.

Quoting: herbertWhy do you want to waste energy when you can have an almost as good render but with higher FPS?

Because "almost as good" is worse. This whole resolution race is just marketing. If the GPU isn't ready for a higher resolution, no amount of tricks can compensate for the lack of compute power. And instead of wasting die space on those trick ASICs, GPU makers could actually use it for proper compute units.
Shmerl Jul 10, 2020
Quoting: EikeI don't understand your negativity. What are you fearing?

I don't like solutions that lower quality and waste GPU die space while being sold as some kind of super cool feature. That's just what Nvidia does. They know the competition is already head to head with them in compute power, so they resort to tricks like "hey, look, we can bump the resolution more than the competition and it still won't be so horrible". But for that, they stuff the card with ASICs just for those tricks. The alternative (the proper approach) is to keep increasing compute power.

Basically to sum up. DLSS is a marketing and market manipulation tool, it's not a good technology.


Last edited by Shmerl on 10 July 2020 at 3:59 pm UTC
Eike Jul 10, 2020
View PC info
  • Supporter Plus
Quoting: Shmerl
Quoting: herbertAs he said, you're just missing the point. It does increase quality if you set lower settings.

I explained my point. It decreases quality in comparison with using high settings.

Yeah, obviously.
And calculating only for the real resolution decreases quality in comparison with supersampling.
By the way, usual rendering decreases quality compared to raytracing.
So everything below raytracing is to be avoided.

Sure, the feature is a compromise, but every computer rendering is a compromise.


Last edited by Eike on 10 July 2020 at 3:59 pm UTC
Shmerl Jul 10, 2020
Quoting: EikeSure, the feature is a compromise, but every computer rendering is a compromise.

Yes, but the way Nvidia does it is not good technologically. However, since they have more money and resources, they know that once they push this approach (adding more ASICs to the GPU), others will need to either follow it even if it's bad (a cheap and wrong way to address quality) or invest a lot more money in proper compute advancement. So for them it's a sneaky way to get an edge, but for the end user it's a bad deal.

You should think beyond the koolaid logic here. And it's not really about open source vs closed source; it's about technological progress. Nvidia has a lot of power over the market now, and they push garbage approaches because of it.


Last edited by Shmerl on 10 July 2020 at 4:06 pm UTC