Lossless Scaling is a very popular application for Windows, and the new lsfg-vk project aims to bring Lossless Scaling's Frame Generation to Linux.
It's a new project and very much in-development, so parts may not work as expected. The developer has been rapidly fixing issues as they come in.
The installation process is a bit involved though: it's quite a manual process right now, so you need to be pretty comfortable with Linux to actually make use of it. You also need to have purchased Lossless Scaling on Steam, as it makes use of a legacy version of the application.
Hopefully, now that it has been revealed, more Linux hackers can jump in and improve the process for everyone.
On the GitHub Wiki, the developer has posted a write-up on how they actually achieved it, which is really interesting if you love the technical side of how people do things. Like the "Porting LSFG to native Vulkan" article, which mentions the "psychological torture" the dev went through "to make this project work".
You can find it on GitHub.


It's important that these types of projects that address such issues exist. Hopefully it can be implemented in ProtonUp-Qt and ProtonPlus eventually.
Seems to be important to some people though. Maybe it's the size of my screen or something.
And if I can't tell the difference on a still image, I'm definitely not going to notice in-game.
Actually, I think you're much more likely to notice it in-game, but not as a positive thing. It shows up as dynamic artefacts, which you could never see in screenshots. And I'm so tired of those that I'm actually considering a blanket refund policy on anything that needs upscaling to perform. And unfortunately for game devs, that likely includes everything that needs ray tracing.
"And I'm so tired of those that I'm actually considering a blanket refund policy on anything that needs upscaling to perform."

I think upscaling is fine when used properly. I remember Serious Sam Fusion had an upscaling option already in 2017, which allowed me to enjoy the game on my shitty laptop back then.
Frame generation is what baffles me, though. It introduces artefacts, lag, and other self-induced problems just to get that artificially high FPS number.
Here is an article that sums it up quite nicely. A bit lengthy and ranty, but worth a read:
https://blog.sebin-nyshkim.net/posts/nvidia-is-full-of-shit/