
NVIDIA to launch DLSS support for Proton on Linux tomorrow (June 22)


While DLSS has technically been available in the NVIDIA drivers for Linux for some time now, the missing piece was support for Proton, which will be landing tomorrow, June 22.

In one of their GeForce blog posts, they made it very clear:

Today we’re announcing DLSS is coming to Facepunch Studios’ massively popular multiplayer survival game, Rust, on July 1st, and is available now in Necromunda: Hired Gun and Chernobylite. Tomorrow, with a Linux graphics driver update, we’ll also be adding support for Vulkan API DLSS games on Proton.

This was originally revealed on June 1 along with the GeForce RTX 3080 Ti and GeForce RTX 3070 Ti announcements. At least now we have a date for part of this extra support for Linux and DLSS. As stated, this will be limited to games that natively use Vulkan as their graphics API, which is a short list including DOOM Eternal, No Man's Sky, and Wolfenstein: Youngblood. Support for running Windows games that use DirectX with DLSS in Proton will arrive "this Fall".

With that in mind, it's likely we'll see the 470 driver land tomorrow, unless NVIDIA has a smaller driver coming first with this added in. We're excited for the 470 driver as a whole, since it will include support for async reprojection to help VR on Linux, plus hardware-accelerated GL and Vulkan rendering with Xwayland.
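If you want to confirm which driver branch you're actually running before expecting the new Proton DLSS support, here's a minimal sketch. It assumes the `nvidia-smi` utility that ships with the proprietary NVIDIA driver is on your PATH, and falls back gracefully on machines without it:

```shell
#!/bin/sh
# Sketch: report the installed NVIDIA driver version. Vulkan DLSS in
# Proton needs the 470 series (or a special earlier build), so checking
# the driver branch is the first sanity test.
if command -v nvidia-smi >/dev/null 2>&1; then
    # --query-gpu/--format are standard nvidia-smi query options
    driver_version=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader)
else
    driver_version="unknown (nvidia-smi not found; NVIDIA driver not installed)"
fi
echo "NVIDIA driver version: $driver_version"
```

On a machine with the proprietary driver this prints something like `NVIDIA driver version: 470.42.01`; the exact number depends on your installed package.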

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I kept coming back to check on its progress until Ubuntu appeared on the scene and helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
50 comments

slaapliedje 22 Jun
  • Supporter Plus
Quoting: CatKiller
Quoting: slaapliedje
Granted, nvidia doesn't support Mac at all, which I still find amusing.
They can't. On Windows and Linux, the GPU vendor provides the API implementation. On Macs, Apple do. Apple and Nvidia had a falling out, so no more Nvidia hardware in Macs, so no support from Apple for Nvidia hardware in Macs.
Yup, that's what I find amusing. Like two people who used to be best buds letting a woman get between them or something. Though to be fair (to be fai-uh), it isn't like nvidia is hurting for money because of it.
slaapliedje 22 Jun
  • Supporter Plus
Quoting: x_wing
https://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

40 games in ten years... I call that far away from a success.
This is kind of a false pretense. The PhysX engines have been built into the GPUs for years now, and so special support for it is no longer a thing. So 40 sounds about right. New games for the most part just use the hardware if they need/want to.
x_wing 22 Jun
Quoting: slaapliedje
Quoting: x_wing
https://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

40 games in ten years... I call that far away from a success.
This is kind of a false pretense. The PhysX engines have been built into the GPUs for years now, and so special support for it is no longer a thing. So 40 sounds about right. New games for the most part just use the hardware if they need/want to.

40 games, and most of them (if not all of them) sponsored by Nvidia, in ten years. And as far as I know, most of today's game physics still runs on the CPU. So, the idea was to accelerate physics execution using the GPU, but their reluctance to make a standard made them fail, and 15 years after the first release of PhysX we are still using the CPU. IMO, that's a failure.


Last edited by x_wing on 22 June 2021 at 5:30 pm UTC
3zekiel 22 Jun
  • Supporter
Quoting: x_wing
Quoting: 3zekiel
PhysX was a success for a long time, tessellation also made a lot of noise for them, and did give them an edge. OF COURSE it does not last forever - for as long as there is competition - (CUDA has for a very long time though). I am not appreciating it, I am being purely realistic. I don't particularly like it, I don't encourage it as a consumer, but I do understand the rationale from their PoV. And wishing them to do otherwise in their position is, well, wishful thinking.
We can wish all that we want, but R&D costs money, a lot of money. So companies want some payback for it. Nvidia is already doing the effort of supporting most features faster and faster on Linux. And now contributing directly to Proton too so we get even more. So from our point of view, it is a clear win.

https://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

40 games in ten years... I call that far away from a success.

You don't encourage it but you see it as a win. idk, for me it's clear that the best that can happen is for DLSS to share the fate of PhysX, which is quite probable as its implementation requires a lot of resources from Nvidia.

Quoting: 3zekiel
As for the open standard DLSS, it would be useless as of now, and while it might help getting more games with XeSS if Intel does make it good, it would not change much anyway as long as they do not open the background which they won't for very obvious reasons I already pointed out in another message.

To this day, AMD still has no real support for RT on Linux (except in the proprietary driver that no one uses and no developers target). Also, they have a very bad track record in terms of day-one support for the GPUs themselves (yes, they tend to boot now, clap clap, well done, thanks for allowing us to boot your GPU; now also give us all the features and a stable driver). Nvidia has lagged behind on Wayland support (but honestly, from a user perspective this does not matter one bit).
Who even knows when/if FSR will have (good) support on Linux, and even more so on Proton. It might work in reshade though according to GN's video.

And somehow you end up with a rant against AMD using arguments that apply to past releases of Nvidia hardware as well...

All I have to say is that any AMD problems of the past won't change the fact that Nvidia's practices are anti-competitive. You may like them from a corporate point of view, but as an end user you should definitely find them despicable.

Quoting: 3zekiel
They open sourced the headers (of NVAPI), it was in news here multiple times. So I would guess you can do the plumbing behind that. Obviously no implem behind that, just headers. I am not saying it became an "open standard" per se either. It has no frozen version for others to implement etc. It might come, who knows.

Now, ignoring all that, FSR might still help a little with sub-par configs, and it is always nice to have. But it does not seem like support is too hot either - Metro said they wouldn't, and the games which do are not so hot either. Maybe ReShade will save it... And even on consoles, I doubt it does much better than checkerboard.

Correct me if I'm wrong, but I always understood that DLSS was part of NGX, not NVAPI.

Not so sure which part DLSS is in, I just know they specifically open sourced the headers so it could be stubbed into Proton. It was clearly not done to be competition friendly.

For the rant on AMD, ok, I am still salty about the last card I bought in the previous gen (btw, it is really not that old, like 1 or 2 years?). I am back on Nvidia because that was a very painful experience - maybe I had my hopes too high, but well.

And Nvidia is anti-competitive, yes... I mean, I would do the same in their position, and frankly most sane people would, so I have a hard time criticizing them. At the same time, they are not a charity but a company; they are supposed to be making money, not giving kisses and hugs to everyone.
My only wish is that they open source the core driver, which makes absolutely no sense from a business perspective to keep closed. It would keep everyone happy too, as those who do not want proprietary features could ignore them.

As for PhysX, it has been embedded directly in engines for a long time already, and the point for them is not so much whether many games use it or not; at some point it was the cool thing that made you buy an Nvidia GPU. That's really all that matters to them, and it was a clear win on that point. It was also still used in Metro Last Light at least, not sure about Redux. I'd say it is mostly phased out by new techs - I remember it was used for some lighting, which for example would be replaced by RT now. Once a feature like that is used up, you just do the next one. It also profits everyone eventually, since the competitors will implement an alternative, potentially cross-vendor and cross-platform. Or it will just become a de facto standard; depends.

The win I present for us is the subject of the news, that is, Nvidia cares enough about us to support its features here. And I did throw some salt at AMD for their (lack of) RT support, but also OC that came after a long time etc ... (yeah I do not forgive easily, I know).
x_wing 22 Jun
Quoting: 3zekiel
And Nvidia is anti competitive yes ... I mean, I would do the same as them in their position, and frankly most sane people would, so I have a hard time criticizing them. At the same time, they are not a charity, but a company, they are supposed to be making money, not give kiss and hugs to everyone.
My only wish is that they open source the core driver, for which it makes absolutely no sense from a business perspective to keep closed source. It would keep everyone happy too, as those who do not want proprietary features could ignore them.

Nobody is expecting them to behave as a charity. But creating standards is not about charity; it's about creating a sustainable market environment and allowing it to evolve for the better.

Quoting: 3zekiel
As for PhysX, it is embedded in engines directly since a long time already, and the point for them is not so much if many games use it or not, but at some point it was the cool thing that made you buy an Nvidia GPU. It's really all that matters to them, and it was a clear win on that point. It was also still used in metro last light at least, not sure for redux. I'd say it is mostly phased out by new techs - I remember it was used for some lighting, which as an example would be replaced by RT now. Once a feature like that is used up, you just do the next one. It also profits everyone eventually since the competitors will implement an alternative, potentially cross vendor and cross platform. Or it will just become a de facto standard, depends.

As I answered slaapliedje, the innovation was to move physics calculations onto the GPU, but they completely failed, mostly because of their crappy proprietary API strategy.

Quoting: 3zekiel
The win I present for us is the subject of the news, that is, Nvidia cares enough about us to support its features here. And I did throw some salt at AMD for their (lack of) RT support, but also OC that came after a long time etc ... (yeah I do not forgive easily, I know).

Which can also be seen as a marketing move. I mean, you didn't get DLSS Proton support until AMD came up with FSR, and what a coincidence that we get the Nvidia "new Linux feature" on top of the AMD FSR article.
3zekiel 22 Jun
  • Supporter
Quoting: x_wing
Quoting: slaapliedje
Quoting: x_wing
https://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

40 games in ten years... I call that far away from a success.
This is kind of a false pretense. The PhysX engines have been built into the GPUs for years now, and so special support for it is no longer a thing. So 40 sounds about right. New games for the most part just use the hardware if they need/want to.

40 games, and most of them (if not all of them) sponsored by Nvidia, in ten years. And as far as I know, most of today's game physics still runs on the CPU. So, the idea was to accelerate physics execution using the GPU, but their reluctance to make a standard made them fail, and 15 years after the first release of PhysX we are still using the CPU. IMO, that's a failure.

40 big games for such a feature is okay. Also, it is in fact open: https://github.com/NVIDIAGameWorks/PhysX. It does indeed seem to be used a lot on the CPU, but that is likely not because of openness but because it does not make much difference nowadays. From what I read, some games do win from using the GPU (PUBG - https://www.reddit.com/r/PUBATTLEGROUNDS/comments/c17kol/is_it_safe_to_put_the_physx_settings_from_auto/) while others see no difference (Rocket League). No idea why, but well. It seems every Unreal Engine game can potentially use it, and you have an option for whether you put it on auto-GPU or CPU (I am mostly browsing reddit and co, so don't quote me too much either).
So yeah, far from a loss, I would say? Once again, it did turn out to be a win in terms of image. So I doubt Nvidia sees it as a loss.

Overall, it is not a loss for us, since the CPU implementation seems to be pretty fast now. So win-win?

It is the same story, in a way, as G-Sync vs FreeSync. Nvidia spearheaded the effort, did the R&D and marketing, then locked it in. It pushed competitors, who already had a reference so it was easier for them, to propose an alternative. And bam, we got FreeSync. Better yet, Nvidia gracefully allowed the use of FreeSync on their GPUs (which means not so locked in and evil; they could really just have dropped the price on G-Sync and "sponsored" it to death), and turned G-Sync into a quality indicator: not supported == we did not test it, and likely the panel quality is meh, expect flicker; FreeSync validated (or whatever the name) == we tested it and it's cool; G-Sync == massively tested, with very high quality standards. Now, whether you want to pay for the difference between "validated" and G-Sync is your own affair. There is some gain, but for me it is clearly not worth it. Other people might think it is a vital difference.

But in the end, everyone wins: we have a feature which would probably never have been implemented if not spearheaded by them, and a "seal of quality" now that they adopted the new standard. That is also why I do not particularly hate them; they do spearhead a lot of the stuff we have as standard today.
They are also fixing the stupid stuff (the virtio lock) they did before. So I am much more "kind" to them than a year or a year and a half ago.

I do appreciate that AMD is trying to catch up, and that they open the result, thus participating in getting everyone together after the front runner opened the path. I appreciate their code-drop approach less, but many companies do that... so I can't completely blame them either.
So far, if you want real open source support (as in, working upstream and ahead of time), only Intel does that. Will they still do it with DG2? Time will tell; if so, and if XeSS is good and supported, then count me in.
3zekiel 22 Jun
  • Supporter
Quoting: x_wing
Quoting: 3zekiel
The win I present for us is the subject of the news, that is, Nvidia cares enough about us to support its features here. And I did throw some salt at AMD for their (lack of) RT support, but also OC that came after a long time etc ... (yeah I do not forgive easily, I know).

Which can also be seen as a marketing movement. I mean, you didn't get DLSS Proton support until AMD came up with FSR and what a coincidence that we get the Nvidia "new linux feature" on top of the AMD FSR article.

Of course it is a marketing move, but it means we matter, which is by far the most important thing. If we did not, they would just not implement it. They also supported virtio properly, and are coming along on multiple other features (Wayland, NvFBC when using G-Sync or G-Sync compatible, etc).
Also, they dropped the headers quite a while ago, so I don't think it was originally with FSR in mind. Most likely the date is because of FSR, not the feature. At that time there was no reaction from the community though, no implem or anything that I could see at least, so it seems they lost patience and pushed it themselves. Maybe they just did not open it up enough too.

What you are saying here is that competition is good and makes companies more consumer friendly. Which I 100% agree on. I am also not wishing death on AMD or anything. I just want them to push forward more. For now, I am still disappointed by their Linux support, and hope for them to do better. I also wish they were cleaner in their marketing for FSR. And I want Intel to enter the market in full force too, and set another standard of good support. Hell, if a 4th one could enter, it would be even better. Strong competition will enforce differentiation, but at the same time will accelerate the standardization of the most important differentiators (since every competitor will cooperate to take the crown and make a standard to share it).
CatKiller 22 Jun
Quoting: x_wing
So, the idea was to accelerate physics execution using the GPU but their reluctance to make a standard made them fail and 15 years after they first release of Physx we are still using the CPU.
No, the idea was that you'd buy a separate card just for accelerating physics calculations. But that was silly: no one was going to buy a card just for that, and no one was going to put support into their game for something that no one had. So Nvidia bought the company and made it so that you could run those calculations on the GPU that you already had. Then they open sourced it some time later.
x_wing 22 Jun
Quoting: CatKiller
Quoting: x_wing
So, the idea was to accelerate physics execution using the GPU but their reluctance to make a standard made them fail and 15 years after they first release of Physx we are still using the CPU.
No, the idea was that you'd buy a separate card just for accelerating physics calculations. But that was silly: no one was going to buy a card just for that, and no one was going to put support into their game for something that no one had. So Nvidia bought the company and made it so that you could run those calculations on the GPU that you already had. Then they open sourced it some time later.

IIRC, the first sample of PhysX I saw was in 2005, and it was from the former company that created the tech, using dedicated hardware, which was at a very early stage (I'm almost sure their dedicated solution never got to the market). The moment Nvidia bought that company, their strategy was to implement that solution on the GPU. So Nvidia wanted to move physics calculation onto the GPU as a use case of GPGPU. But they fucked up with that proprietary API that only became open source long after the hype was gone. That's my point.

Time will tell what will win. But I'm confident to say that Nvidia will fuck up once again.
CatKiller 22 Jun
Quoting: x_wing
IIRC, the first sample of PhysX I saw was in 2005, and it was from the former company that created the tech, using dedicated hardware, which was at a very early stage (I'm almost sure their dedicated solution never got to the market).


The PPUs definitely existed. I doubt that many got sold, because the business case for them was rubbish, but you could get pre-built gaming machines with them in. The technology was also in a bunch of console games before Nvidia bought Ageia.

Quote
The moment Nvidia bought that company, their strategy was to implement that solution on the GPU. So Nvidia wanted to move physics calculation onto the GPU as a use case of GPGPU.

Of course they did. Buying an extra PPU was silly, but GPGPU is great. And of course they wanted it to be a market differentiator to make back the purchase price, particularly as Intel had just bought Havok at the time.