
AMD came out of the gates swinging at Computex 2021 with new chips, new tech and plenty more, including: AMD 3D chiplet technology, AMD Ryzen 5000 G-Series desktop APUs, next-gen gaming laptops with the new AMD Radeon 6000M Series Mobile Graphics, and their DLSS competitor, FidelityFX Super Resolution.

There's quite a lot to unpack here and we're still going through it, so we will update the article if we missed anything vital. The big one is no doubt FidelityFX Super Resolution, an open source spatial upscaling technology comparable to NVIDIA DLSS (which is coming to Proton!). Being open source is quite exciting, although it isn't yet: AMD said that "in due course" it will be released under the GPUOpen branch under the MIT license.

AMD are betting big with the FidelityFX Super Resolution tech, clearly firing shots at NVIDIA by making it fully cross-platform: it supports DirectX 11 & 12 and Vulkan, and it will even run on NVIDIA GPUs. AMD say that when it's released, "FSR can be ported onto multiple platforms without restriction".
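AMD haven't published any FSR code or documentation yet, so to make clear what "spatial upscaling" means in contrast to DLSS's ML-and-motion-vector approach, here's a minimal, purely illustrative sketch of the general idea: render the frame at a lower internal resolution, scale it up using only the pixels of that same frame, then sharpen to recover some edge contrast. The function names and the simple bilinear-plus-unsharp-mask filtering below are our own stand-ins, not AMD's actual algorithm.

```python
# Illustrative sketch of a *spatial* upscaler: upscale using only the current frame.
# This is NOT AMD's FSR (unreleased at the time of writing); it just shows the idea.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: float) -> np.ndarray:
    """Upscale an H x W x C image by `scale` on each axis with bilinear interpolation."""
    h, w, _ = img.shape
    new_h, new_w = int(h * scale), int(w * scale)
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Crude unsharp mask to restore some of the edge contrast the upscale blurs away."""
    blur = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# e.g. render internally at 1440p, present at 4K (1.5x per axis)
low_res_frame = np.random.rand(1440, 2560, 3)  # stand-in for the game's rendered frame
output_frame = sharpen(bilinear_upscale(low_res_frame, 1.5))
print(output_frame.shape)  # (2160, 3840, 3)
```

The key contrast with DLSS is that nothing here needs previous frames, motion vectors or a trained network, which is also why a spatial approach can run on pretty much any GPU.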


AMD continue pushing the boundaries of their processor tech with the introduction of AMD 3D chiplet technology. What could be a real breakthrough in packaging technology combines AMD's chiplet architecture with 3D stacking, which they claim "provides over 200 times the interconnect density of 2D chiplets and more than 15 times the density compared to existing 3D packaging solutions" and which they've been developing in collaboration with TSMC. They showed it in a real-world application too, demonstrating the 3D bonding on a Ryzen 5000 Series processor prototype. AMD claim they will begin production of products using these 3D chiplets by the end of this year.

We're finally seeing AMD bring their next-generation APUs to the desktop for system builders too, with the AMD Ryzen 5000 G-Series desktop APUs. They've split them between consumer models and business models, with the consumer models being the ones we care about here.

The AMD Ryzen 5000 G-Series desktop APUs will be available "later this year".

On top of that, AMD also announced the new AMD Radeon 6000M Series Mobile Graphics. Based on RDNA 2, they say it delivers "up to 1.5x" higher performance, or "up to 43 percent" lower power at the same performance, compared with the original RDNA architecture. It also brings their AMD Infinity Cache and Ray Tracing over to next-gen laptops.
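As a quick back-of-envelope check on how those two "up to" figures relate (using round placeholder numbers, not measurements of any actual part): 1.5x more performance at similar power and 43% less power at equal performance both describe a big performance-per-watt jump, they just sit at different points on the voltage/frequency curve.

```python
# Rough arithmetic only; AMD's "up to 1.5x" and "up to 43%" are marketing figures,
# and the baseline numbers here are arbitrary placeholders.
rdna_perf, rdna_power = 100.0, 100.0

# Claim 1: up to 1.5x performance (read here as: at comparable power)
perf_per_watt_1 = (rdna_perf * 1.5) / rdna_power            # 1.50x the baseline

# Claim 2: up to 43% lower power at the same performance
perf_per_watt_2 = rdna_perf / (rdna_power * (1 - 0.43))     # ~1.75x the baseline

print(f"claim 1 implies ~{perf_per_watt_1:.2f}x perf/W, claim 2 implies ~{perf_per_watt_2:.2f}x perf/W")
```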

Model               | Compute Units & Ray Accelerators | GDDR6 | Game Clock      | Memory Interface | Infinity Cache
AMD Radeon RX 6800M | 40                               | 12 GB | 2300 MHz @ 145W | 192-bit          | 96 MB
AMD Radeon RX 6700M | 36                               | 10 GB | 2300 MHz @ 135W | 160-bit          | 80 MB
AMD Radeon RX 6600M | 28                               | 8 GB  | 2177 MHz @ 100W | 128-bit          | 32 MB
"At Computex, we highlighted the growing adoption of our high-performance computing and graphics technologies as AMD continues setting the pace of innovation for the industry," said Dr. Su. "With the launches of our new Ryzen and Radeon processors and the first wave of AMD Advantage notebooks, we continue expanding the ecosystem of leadership AMD products and technologies for gamers and enthusiasts. The next frontier of innovation in our industry is taking chip design into the third dimension. Our first application of 3D chiplet technology at Computex demonstrates our commitment to continue pushing the envelope in high-performance computing to significantly enhance user experiences. We are proud of the deep partnerships we have cultivated across the ecosystem to power the products and services that are essential to our daily lives."

If you want to catch the whole thing, AMD's full Computex keynote is up on YouTube.


Purple Library Guy Jun 3, 2021
Quoting: scaine
Quoting: Guest
Quoting: Shmerl
I think for an anti-feature, it's good enough to end DLSS. Because it works everywhere and will also get better over time. The other side of it - it's general purpose. While AI/ML is more limited to specific use cases. So it is better in some cases and worse in others. Everything is a trade off.

This sounds like it was written by someone who doesn't understand either of these features. Hardware-assisted AI/ML being "limited" to specific use cases while an upscaling tech being "general purpose"? Give me a break...

Honestly, you sound like you don't know how Machine Learning and Artificial Intelligence works. It's typically incredibly focused.
Personally, I think a more honest name for it would be Artificial Instinct. 'Cause like, instinct is this stuff animals do without actually being smart or thinking or figuring it out, where through evolution they became really good at some particular task they need to be good at to survive. Machine Learning AI seems to be this forced evolution thing where you let the algorithms survive that are good at some specific task, until you've evolved a black box that can do that specialized thing really well but has no general reasoning ability. So it's instinct, but for some (marketing) reason we call it "intelligence".
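To make the "forced evolution" picture above concrete, here's a toy sketch: mutate a candidate, keep it only if it scores better on one fixed task, repeat. Note that real production ML, DLSS included, is trained with gradient descent rather than evolutionary search, but the end result is the same kind of narrowly specialised black box. Everything here is illustrative and doesn't describe any vendor's training pipeline.

```python
# Toy "survival of the fittest" on a single fixed task.
import random

TARGET = [0.1, -0.4, 0.7, 0.2]   # the one specific "task": match this fixed vector

def fitness(candidate):
    """Higher is better: negative squared error against the fixed target."""
    return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

random.seed(0)
best = [random.uniform(-1.0, 1.0) for _ in TARGET]
for _ in range(5000):
    mutant = [g + random.gauss(0.0, 0.05) for g in best]
    if fitness(mutant) > fitness(best):      # keep the mutant only if it beats the best so far
        best = mutant

print([round(g, 3) for g in best])           # ends up very close to TARGET...
# ...yet the result encodes nothing transferable: change TARGET and it's useless.
```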
scaine Jun 3, 2021
  • Contributing Editor
  • Mega Supporter
Quoting: Guest
Quoting: scaine
Quoting: Guest
Quoting: Shmerl
I think for an anti-feature, it's good enough to end DLSS. Because it works everywhere and will also get better over time. The other side of it - it's general purpose. While AI/ML is more limited to specific use cases. So it is better in some cases and worse in others. Everything is a trade off.

This sounds like it was written by someone who doesn't understand either of these features. Hardware-assisted AI/ML being "limited" to specific use cases while an upscaling tech being "general purpose"? Give me a break...

Honestly, you sound like you don't know how Machine Learning and Artificial Intelligence works. It's typically incredibly focused. Using ML/AI in an implementation of anything doesn't magically make it versatile or flexible.

However, that said, the point of this technology is to enable fast frame rates of complex scenes (potentially with ray tracing thrown in) at high resolutions. That's a pretty focused target. So I'm not really sure why an upscaler like FSR is somehow more "general purpose" than DLSS.

No, I know how ML works because I use it in my daily job. Don't try to lecture me how these things work.

I've liked your detailed responses that follow this post, because they explain how DLSS works. Thank you.

But this initial response was you dismissing other people's opinions snarkily. I've no time for that attitude. The air of superiority and mocking of other people's comments is infuriating. And you continue to do so. Language like "the other user" (they have a name), "Tried to pretend", "gimme a break", etc. So condescending.

As for avoiding Gsync being silly - Toms Hardware ran an article pointing out that fully gsync-compatible monitors still require hardware to reach that certification. That hardware carries a premium, of course. Might be pennies, who knows? But I know I'm not going to use it, so I'm happy to avoid paying for it.
(source: https://www.tomshardware.com/uk/features/gsync-vs-freesync-nvidia-amd-monitor)
Nocifer Jun 3, 2021
Quoting: Purple Library Guy
Quoting: scaine
Quoting: Guest
Quoting: Shmerl
I think for an anti-feature, it's good enough to end DLSS. Because it works everywhere and will also get better over time. The other side of it - it's general purpose. While AI/ML is more limited to specific use cases. So it is better in some cases and worse in others. Everything is a trade off.

This sounds like it was written by someone who doesn't understand either of these features. Hardware-assisted AI/ML being "limited" to specific use cases while an upscaling tech being "general purpose"? Give me a break...

Honestly, you sound like you don't know how Machine Learning and Artificial Intelligence works. It's typically incredibly focused.
Personally, I think a more honest name for it would be Artificial Instinct. 'Cause like, instinct is this stuff animals do without actually being smart or thinking or figuring it out, where through evolution they became really good at some particular task they need to be good at to survive. Machine Learning AI seems to be this forced evolution thing where you let the algorithms survive that are good at some specific task, until you've evolved a black box that can do that specialized thing really well but has no general reasoning ability. So it's instinct, but for some (marketing) reason we call it "intelligence".

Ah, thanks for bringing up my favorite pet peeve (OK, one of many really) of the last ~5 years or so: the fact that we went from AI = "Skynet wakes up and takes over the world" to AI = "my coffeemaker can deduce how many sugars I take my coffee with and suggest new recipes for my morning coffee based off a list it downloaded from the coffeemaker vendor's server".

I wouldn't even call this "instinct", because instinct implies the subconscious ability to adapt to new situations by creating new responses on the fly. I'd just call it for what it is: highly specialized deterministic algorithms programmed by highly specialized people to do highly specialized tasks, with the ability to better adapt to these tasks by using a preprogrammed set of highly specialized criteria to perform a highly specialized form of reflection.

In other words the key differentiating factor when compared to normal software is "highly specialized", so kudos to the developers that have dedicated thousands of man-hours to make such a tech possible, but otherwise AI/ML is simply just another form of good ole programming. Nothing more, nothing less, despite what 21st century marketing-speak would have us believe.
Nocifer Jun 3, 2021
Quoting: Guest"Instinct" doesn't imply that because in its nature, an instinct is a genetically encoded sequence of reactions to specific stimuli. It can be as simple as a plant reacting to the change of humidity and as complex as humans striving to preserve their race.

Sure, but instinct can evolve on its own and create new reactions and motives over time (and over generations), while our algorithms can't, for the simple reason that we don't adequately understand the natural mechanism that creates these new reactions and motives, so we can't perfectly imitate it (just as we can't imitate e.g. abstract thought and convert it into electrical signals - we simply don't know enough about how the brain works). As it stands, no matter how complex our "AI" appears to be, its limit to "create" is only what we've programmed it to be able to create, and no more. So the more "specialized" the person creating the AI is, the more "specialized" that AI can ultimately be.

Quoting: Guest
People don't have to be "highly-specialized" to implement a neural-network or a genetics algorithm - as you said, they're just regular algorithms - "good ol' programming".

Well alright, if you say so - I was just trying to be gracious and not call them "good ole script kiddies" :P But joking aside, I'm not bashing the quality of these algorithms, or their usefulness, or their limitations, or how easy or hard it is to create them and/or train them; I'm only bashing the marketing-focused change in lingo: what we market today as AI/ML, and how we market it to the masses, is parsecs away from actual intelligence, let alone actual awareness, i.e. what "AI" originally used to mean. And IMHO we shouldn't be calling it instinct either because even instinctive reactions in nature are (or can be) infinitely more complex and more nondeterministic than what we can currently do with our AI/ML stuff.
Purple Library Guy Jun 3, 2021
Quoting: Guest
Quoting: scaine
I've liked your detailed responses that follow this post, because they explain how DLSS works. Thank you.

But this initial response was you dismissing other people's opinions snarkily. I've no time for that attitude. The air of superiority and mocking of other people's comments is infuriating. And you continue to do so. Language like "the other user" (they have a name), "Tried to pretend" , "gimme a break", etc. So condescending.
Don't you think it's also very condescending and sassy to dismiss innovative technology as "something over-specialized and good-as-dead"?
Do you not draw a distinction between dismissing a technology and dismissing a person? If you are the literal personification of Machine Learning, such that an attack on it is an attack on you, I'm sure we all apologize, and I for one would like to express my starstruck feelings at finally meeting an actual Platonic Ideal.


Last edited by Purple Library Guy on 3 June 2021 at 4:26 pm UTC
sub Jun 3, 2021
Quoting: scaine
Just being Nvidia-only is good enough for me to fully get behind FSR. And the AMD showcase video for it was pretty impressive given how young the technology is (dunno what user "sub" was talking about above, claiming that FSR doesn't look as good - not only is there barely any difference, the whole point of these technologies is that they won't look as good, but you'll get 100%+ FPS out of them at high-res, and if you can only tell the difference in a side-by-side video, then that's clear "good enough").

I'm a strong AMD supporter and I absolutely hate the politics of Nvidia exploiting vendor-lockins.

Let's see how this does when it ships. I'd be more than happy to be proven wrong here,
as the open approach across vendors is the way to go, imho.
scaine Jun 3, 2021
  • Contributing Editor
  • Mega Supporter
Quoting: Shmerl
Quoting: scaine
Just being Nvidia-only is good enough for me to fully get behind FSR. And the AMD showcase video for it was pretty impressive given how young the technology is (dunno what user "sub" was talking about above, claiming that FSR doesn't look as good - not only is there barely any difference, the whole point of these technologies is that they won't look as good, but you'll get 100%+ FPS out of them at high-res, and if you can only tell the difference in a side-by-side video, then that's clear "good enough").

I'm a strong AMD supporter and I absolutely hate the politics of Nvidia exploiting vendor-lockins.

Let's see how this does when it ships. I'd be more than happy to be proven wrong here,
as the open approach across vendors is the way to go, imho.

Yep, I'm with you. I was just surprised you found FSR to be poor quality. I was watching that video and thinking, holy cow, I can't tell the difference, but the framerates are 60%+ better! And the way you can choose quality or framerates, very nice. I hope it succeeds.
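For a sense of where framerate gains of that size can come from, here's a small sketch: the game shades far fewer pixels at the internal resolution and the upscaler fills in the rest. The mode names and per-axis scale factors below are hypothetical placeholders (AMD hadn't published FSR's actual presets at this point), and the real gain depends on how shader-bound the game is.

```python
# Hypothetical quality modes: per-axis factor by which the internal resolution is reduced.
TARGET_W, TARGET_H = 3840, 2160   # 4K output

hypothetical_modes = {"quality": 1.3, "balanced": 1.7, "performance": 2.0}

for name, factor in hypothetical_modes.items():
    internal_w, internal_h = round(TARGET_W / factor), round(TARGET_H / factor)
    pixel_share = (internal_w * internal_h) / (TARGET_W * TARGET_H)
    print(f"{name:12s} renders {internal_w}x{internal_h} "
          f"(~{pixel_share:.0%} of the 4K pixel count) before upscaling")

# Shading only ~25-60% of the pixels is roughly where "60%+ better framerates"
# can come from, if the game is mostly limited by per-pixel shading work.
```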
sub Jun 3, 2021
Quoting: scaine
Quoting: Shmerl
Quoting: scaine
Just being Nvidia-only is good enough for me to fully get behind FSR. And the AMD showcase video for it was pretty impressive given how young the technology is (dunno what user "sub" was talking about above, claiming that FSR doesn't look as good - not only is there barely any difference, the whole point of these technologies is that they won't look as good, but you'll get 100%+ FPS out of them at high-res, and if you can only tell the difference in a side-by-side video, then that's clear "good enough").

I'm a strong AMD supporter and I absolutely hate the politics of Nvidia exploiting vendor-lockins.

Let's see how this does when it ships. I'd be more than happy to be proven wrong here,
as the open approach across vendors is the way to go, imho.

Yep, I'm with you. I was just surprised you found FSR to be poor quality. I was watching that video and thinking, holy cow, I can't tell the difference, but the framerates are 60%+ better! And the way you can choose quality or framerates, very nice. I hope it succeeds.

We'll see. I indeed find the tech demo rather washed out (compared to DLSS 2.0 tech demos I saw).

You'll notice that from left to right there are fewer and fewer high-contrast sharp edges in the scene itself.
I claim the choice of the tech demo and its segmentation was not by chance.
While it suggests "hey, it's the same scene you see", it looks to me that the scene was specifically crafted to hide the problematic bits.
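To illustrate the point about high-contrast edges: a purely spatial upscaler can only invent in-between values from neighbouring pixels, so a smooth gradient survives almost untouched while a hard black-to-white step turns into a small blur. The tiny sketch below uses plain linear interpolation and is purely illustrative; it says nothing about what FSR's actual filtering does.

```python
import numpy as np

def upscale_1d(row: np.ndarray, scale: int) -> np.ndarray:
    """Linearly interpolate a 1-D row of pixel intensities to `scale` times the length."""
    x_new = np.linspace(0, len(row) - 1, len(row) * scale)
    return np.interp(x_new, np.arange(len(row)), row)

hard_edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # black-to-white step
gradient  = np.linspace(0.0, 1.0, 6)                  # smooth ramp

print(np.round(upscale_1d(hard_edge, 2), 2))  # fractional greys (~0.27, ~0.73) appear: the edge smears
print(np.round(upscale_1d(gradient, 2), 2))   # still an even ramp: effectively nothing lost
```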
Purple Library Guy Jun 3, 2021
Quoting: Guest
Quoting: Purple Library Guy
Quoting: Guest
Quoting: scaine
I've liked your detailed responses that follow this post, because they explain how DLSS works. Thank you.

But this initial response was you dismissing other people's opinions snarkily. I've no time for that attitude. The air of superiority and mocking of other people's comments is infuriating. And you continue to do so. Language like "the other user" (they have a name), "Tried to pretend" , "gimme a break", etc. So condescending.
Don't you think it's also very condescending and sassy to dismiss innovative technology as "something over-specialized and good-as-dead"?
Do you not draw a distinction between dismissing a technology and dismissing a person? If you are the literal personification of Machine Learning, such that an attack on it is an attack on you, I'm sure we all apologize, and I for one would like to express my starstruck feelings at finally meeting an actual Platonic Ideal.

Have you read the rest of my comment? If you did, then you have my answer.
Yeah. You said "a partially emotional response was justified and expected" but you didn't give a rationale for why that should be the case. "You work in the field" is not the same as "You are individually under attack when anyone says anything about that field". I have a good friend who works for an oil company; he doesn't get defensive any time someone worries about pipeline spills--in fact, he's the first to point out that all pipelines leak.
And you said "It's not my job to educate the layman". Well, nice of you to go above and beyond, and condescend to correct the unwashed, I suppose.
scaine Jun 3, 2021
  • Contributing Editor
  • Mega Supporter
Quoting: sub
Quoting: scaine
Quoting: Shmerl
Quoting: scaine
Just being Nvidia-only is good enough for me to fully get behind FSR. And the AMD showcase video for it was pretty impressive given how young the technology is (dunno what user "sub" was talking about above, claiming that FSR doesn't look as good - not only is there barely any difference, the whole point of these technologies is that they won't look as good, but you'll get 100%+ FPS out of them at high-res, and if you can only tell the difference in a side-by-side video, then that's clear "good enough").

I'm a strong AMD supporter and I absolutely hate the politics of Nvidia exploiting vendor-lockins.

Let's see how this does when it ships. I'd be more than happy to be proven wrong here,
as the open approach across vendors is the way to go, imho.

Yep, I'm with you. I was just surprised you found FSR to be poor quality. I was watching that video and thinking, holy cow, I can't tell the difference, but the framerates are 60%+ better! And the way you can choose quality or framerates, very nice. I hope it succeeds.

We'll see. I indeed find the tech demo rather washed out (compared to DLSS 2.0 tech demos I saw).

You'll notice that from left to right there are fewer and fewer high-contrast sharp edges in the scene itself.
I claim the choice of the tech demo and its segmentation was not by chance.
While it suggests "hey, it's the same scene you see", it looks to me that the scene was specifically crafted to hide the problematic bits.

Well, it's a demo, so you wouldn't expect otherwise, right? They'll be savvy to show the tech doing its best, naturally.

But my point is that side by side... ok, maybe you can see the difference. But I struggled to see any meaningful difference, so I can absolutely guarantee that when a) I'm immersed in game and b) there's no side-by-side comparison, I'm definitely not going to see any difference.

But I do care about framerate. I do definitely see when framerate starts to suffer. So any tech that can deliver solid framerates while still looking beautiful at 4K... that gets my vote.

I mean, for me, comparing FSR to DLSS is like those reviews that compare iPhone to Android. I just don't care. I'm never going to buy Nvidia, or Apple... so I admit that I'm extremely narrow-minded about this stuff. FSR isn't as good as DLSS? Okay, then. So what? I guarantee that FSR is better than not having FSR, and that's all that matters to me, personally!


Last edited by scaine on 3 June 2021 at 9:47 pm UTC