
NVIDIA have now formally announced the Ampere GPU architecture


Today, NVIDIA released a series of videos with CEO Jensen Huang, standing in for the cancelled GTC conference, in which the Ampere GPU architecture was revealed.

Turing is the current generation, powering GPUs like the GeForce 20 series with their dedicated Ray Tracing cores and the GeForce 16 series without them, and Ampere is what's going to succeed it. Presenting from his own kitchen due to the Coronavirus, NVIDIA CEO Jensen Huang goes over various hardware advances, from Ray Tracing to upscaling and more. To be clear though: no new GeForce has been announced yet, so there's nothing on the consumer side to really get excited about just yet.

Part 1 of the video series is available on YouTube.

See the whole playlist of the videos on YouTube.

If you're interested in a deep-dive on NVIDIA Ampere, they announced the NVIDIA A100 GPU based on the new NVIDIA Ampere GPU architecture, which you can read more about on the NVIDIA devblog post. It sounds like an absolute monster. They also announced the DGX A100, which houses eight of the new A100s, "delivering 5 petaflops of AI performance and consolidating the power and capabilities of an entire data center into a single flexible platform for the first time". The first batch of these DGX systems is going on to help fight COVID-19.

Now that Ampere itself is formally announced, the next generation of GeForce GPUs hopefully isn't too far away either. NVIDIA are staying secretive about it for now. Whenever the new Ampere-powered GeForce cards arrive, we will let you know.

Article taken from GamingOnLinux.com.
About the author:
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I kept coming back to check on its progress until Ubuntu appeared on the scene and helped me to really love it. You can reach me by emailing GamingOnLinux directly, or find me on Mastodon.
The comments on this article are closed.
15 comments

randyl May 14, 2020
Quoting: tpau: "Not necessarily, it may just not be part of the keynote and come independently later."
Yeah, it could come later and probably will. I think it's how much later that people are skeptical about, not to mention the quality of that announcement.

This is just my opinion, but I think AMD's historically poor market position compared to Nvidia has pushed them to be more open and to embrace open source. That's just my perception. If Nvidia faces similar struggles in the future, it could push them to be more open source friendly too, although I think they have a mountain of tech debt in their driver stack that makes doing so daunting. Also just an opinion.
CatKiller May 14, 2020
There never was a statement about opening the source to their drivers: that's just the fevered imaginings of wishful thinkers.

They were going to have a talk called "Open Source, Linux Kernel, and NVIDIA" in which they said:
Quote: "We'll report up-to-the-minute developments on NVIDIA's status and activities, and possibly (depending on last-minute developments) a few future plans and directions, regarding our contributions to Linux kernel; supporting Nouveau (the open source kernel driver for NVIDIA GPUs, that is in the Linux kernel), including signed firmware behavior, documentation, and patches; and NVIDIA kernel drivers."

So maybe they have a plan to eventually stop making things hard for the nouveau people, and maybe they'll start actually making things easier for the nouveau people, but no announcement about open sourcing their own drivers has been made or implied.

I'm sure it would have been quite an interesting talk, though.


Last edited by CatKiller on 14 May 2020 at 6:54 pm UTC
Shmerl May 14, 2020
Either way, it didn't happen, they've said nothing about it since, and they haven't fixed the situation with Nouveau either. Talk is cheap and no talk is even cheaper ;)


Last edited by Shmerl on 14 May 2020 at 7:25 pm UTC
CatKiller May 15, 2020
Quoting: randyl: "It better be powerful, the die size is enormous. From the blog link in the article, '...with a die size of 826 mm²'.

For reference, TU102 (2080 Ti) has a die size of 754 mm² and TU104 (2070 Super, 2080) has a die size of 545 mm²."

The Volta V100, which is the predecessor of the part talked about in the keynote, has a die size of 815 mm².

The talk was about their HPC and datacentre stuff, and what they'd done with the $7 billion purchase of Mellanox, rather than GPUs. I'm quite interested in the actual details of the Ampere architecture, which I think are due on Tuesday.
randyl May 15, 2020
Quoting: CatKiller: "Quoting: randyl: 'It better be powerful, the die size is enormous. From the blog link in the article, "...with a die size of 826 mm²". For reference, TU102 (2080 Ti) has a die size of 754 mm² and TU104 (2070 Super, 2080) has a die size of 545 mm².'

The Volta V100, which is the predecessor of the part talked about in the keynote, has a die size of 815 mm².

The talk was about their HPC and datacentre stuff, and what they'd done with the $7 billion purchase of Mellanox, rather than GPUs. I'm quite interested in the actual details of the Ampere architecture, which I think are due on Tuesday."
GV100 is also a large, power-hungry, expensive, space-consuming die. I feel like you reiterated my point with the comparison. They created a huge, hot, costly processing unit instead of improving the architecture like AMD has done. Piling on more transistors and power to eke out more performance doesn't excite me that much.

If anything, I think Nvidia should be concerned because Intel is right around the corner with a die shrink and new server and discrete consumer cards. It's the server cards that should really worry Nvidia, but they'll have pressure in both the enterprise and consumer markets. In the last several months Intel has put massive effort into their video driver stack as well, improving performance of existing graphics chips by up to 30%. Nvidia cannot say the same on any platform. And this is important for discrete offerings, because server architecture is what trickles down into consumer products.