
Oh deary me, NVIDIA have a bit of a wildfire on their hands here, with NVIDIA DLSS 5 being compared to AI-generated slop "art".

Perhaps lumping their previous, rather good upscaling and frame generation tech together with something else entirely, something that completely changes characters' faces, was not the best idea, huh? Who could have seen this coming? Apparently not NVIDIA.

NVIDIA are now doing a little damage control, posting in the replies of their own video with a pinned comment that notes:

Important to note with this technology advance - game developers have full, detailed artistic control over DLSS 5's effects to ensure they maintain their game's unique aesthetic. The SDK includes things like intensity, color grading and masking off places where the effect shouldn't be applied. It's not a filter - DLSS 5 inputs the game’s color and motion vectors for each frame into the model, anchoring the output in the source 3D content.

Even Bethesda are doing some damage control of their own, as Starfield was one of the games recently shown off. In a post on X/Twitter replying to Digital Foundry, they noted:

Appreciate your excitement and analysis of the new DLSS 5 lighting here. This is a very early look, and our art teams will be further adjusting the lighting and final effect to look the way we think works best for each game. This will all be under our artists’ control, and totally optional for players.

Across the internet, a whole lot of people and game developers have begun (rightly so) absolutely ripping into NVIDIA for DLSS 5 and what it's doing to game visuals. NVIDIA say it's "not a filter", but it's hard not to laugh at how it changes character faces into exactly what you would imagine a generative AI beautification looksmaxxing tool would produce. Or yassifying, if you will. I'm learning a lot of new silly words thanks to this. This is the kind of stupid generative AI filtering I would have expected from some sort of AI porn generation website, not from the likes of NVIDIA. Who cares about art direction when you can plump up a character's lips, right?

Some favourite funnies from the situation include a number of embedded posts (sources are linked in the original article).

I feel like I could go on forever, as I keep chuckling away at NVIDIA creating what is possibly one of the funniest video-game related meme templates recently.

Anyway, here's the video if you missed all the fun:

(Embedded YouTube video)

Thanks NVIDIA, I hate it.

Article taken from GamingOnLinux.com.
Tags: AI, Misc, NVIDIA
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly checked on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
25 comments

Johnologue 2 hours ago
The "RTX on" meme has fully turned against them
GoEsr 2 hours ago
Quoting alka.setzer: Nvidia explicitly said that they were running two 5090s, one for rendering and the other for DLSS 5 processing, so they could maintain fluid fps in the demos. They also said they had DLSS 5 running on a single card in the lab (they didn't say which, could be an RTX 6090). DLSS 5 is supposed to come out in Q4.
That still doesn't really answer it. As I said before, the performance issue could be the result of needed changes conflicting with other aspects of the pipeline. DLSS runs on the tensor cores, so running it on a second card means the tensor cores on the primary card are doing nothing, unless they're actually running the upscaling/frame gen split off from this stuff because of conflicts/bugs.
If they've only just gotten it working on a single card in lab settings (the 60 series isn't coming until the end of 2027 at the earliest, so it's not that), to me that would indicate something besides raw computational cost, or they'd never be able to get it out by Q4 for anything except a 5090.
My guess is there are either bugs that cause it to interfere with upscaling, or they just haven't been able to get the newer model to fit in the cache of the tensor cores.

Last edited by GoEsr on 17 Mar 2026 at 8:57 pm UTC
Laephis 2 hours ago
"Sloptracing" is my new favorite word to describe all this.
kit89 2 hours ago
If they had taken high-fidelity renderings of the characters provided by the artists (they don't need to be photoreal) and shown it running on a 4060, with the low-definition character replaced by the high-definition version, then I think that would have been impressive.

Instead we saw exaggerated features that didn't align with the original characters, and that required two 5090s.
GoEsr 2 hours ago
That actually sounds like DLSS 1.0 where the model was trained on each game. If they let developers train the model on their own ground-truth, I wouldn't really have a problem with this, but who knows how much computational power that would require.