
NVIDIA announce the RTX 3090, RTX 3080, RTX 3070 with 2nd generation RTX


Today the 'Ultimate Countdown' from NVIDIA ended with the announcement of the RTX 3090, RTX 3080 and RTX 3070, all of them absolute monsters with 2nd generation RTX. Powered by the new Ampere architecture, this sounds like a big generational leap. It's easy to be excited about it, and I am.

It's not just about raw power either: the pricing of the two main cards, the RTX 3080 and RTX 3070, puts them well in line with the current 20xx generation, which is pretty amazing given the difference in performance. Of course, we need to take all of it with a pinch of salt until independent benchmarks are done.


Meanwhile, the RTX 3090 is aimed at replacing NVIDIA's previous TITAN GPUs. A truly overkill GPU, which they claim will let you game at 8K.

Full specifications:

                                 | GeForce RTX 3090              | GeForce RTX 3080              | GeForce RTX 3070
NVIDIA CUDA® Cores               | 10496                         | 8704                          | 5888
Boost Clock (GHz)                | 1.70                          | 1.71                          | 1.73
Standard Memory Config           | 24 GB GDDR6X                  | 10 GB GDDR6X                  | 8 GB GDDR6
Memory Interface Width           | 384-bit                       | 320-bit                       | 256-bit
Ray Tracing Cores                | 2nd Generation                | 2nd Generation                | 2nd Generation
Tensor Cores                     | 3rd Generation                | 3rd Generation                | 3rd Generation
NVIDIA Architecture              | Ampere                        | Ampere                        | Ampere
NVIDIA DLSS                      | Yes                           | Yes                           | Yes
PCI Express Gen 4                | Yes                           | Yes                           | Yes
NVIDIA G-SYNC®                   | Yes                           | Yes                           | Yes
Vulkan RT API, OpenGL 4.6        | Yes                           | Yes                           | Yes
HDMI 2.1                         | Yes                           | Yes                           | Yes
DisplayPort 1.4a                 | Yes                           | Yes                           | Yes
NVIDIA Encoder                   | 7th Generation                | 7th Generation                | 7th Generation
NVIDIA Decoder                   | 5th Generation                | 5th Generation                | 5th Generation
VR Ready                         | Yes                           | Yes                           | Yes
Maximum Digital Resolution       | 7680x4320                     | 7680x4320                     | 7680x4320
Standard Display Connectors      | HDMI 2.1, 3x DisplayPort 1.4a | HDMI 2.1, 3x DisplayPort 1.4a | HDMI 2.1, 3x DisplayPort 1.4a
Multi Monitor                    | 4                             | 4                             | 4
HDCP                             | 2.3                           | 2.3                           | 2.3
Length                           | 12.3" (313 mm)                | 11.2" (285 mm)                | 9.5" (242 mm)
Width                            | 5.4" (138 mm)                 | 4.4" (112 mm)                 | 4.4" (112 mm)
Height                           | 3-Slot                        | 2-Slot                        | 2-Slot
Maximum GPU Temperature (in C)   | 93                            | 93                            | 93
Graphics Card Power (W)          | 350                           | 320                           | 220
Recommended System Power (W)     | 750                           | 750                           | 650
Supplementary Power Connectors   | 2x PCIe 8-pin                 | 2x PCIe 8-pin                 | 1x PCIe 8-pin

On top of all that, they're also the first GPUs to come with hardware decode support for the AV1 codec, which is very promising as AV1 is more efficient and more feature-filled than older popular codecs.

Pricing / Availability

  • RTX 3090 - £1,399 / $1,499 - Available September 24th
  • RTX 3080 - £649 / $699 - Available September 17th
  • RTX 3070 - £469 / $499 - Available October (no exact date given)

Each will also have a special Founders Edition available, with an increased price.

Going by price-to-performance ratio, the RTX 3070 sounds absolutely ridiculous. If (big if) it truly has RTX 2080 Ti-level performance for a vastly cheaper sum, then it alone could help usher in a new level of gaming performance for a great many people. More GPUs based on Ampere will be coming, as these are just the first. I can't help but think about the RTX 3060, whenever they announce one that is. Considering the power of the RTX 3070, the RTX 3060 is quite likely to be a great deal for people after both performance and an affordable price.

It also makes me even more curious about Intel, who have announced their Xe-HPG gaming GPUs arriving next year. Then we have AMD too, with RDNA 2 supposedly coming this year. So much to look forward to for hardware enthusiasts!

Since NVIDIA already support ray tracing on Linux with their own extensions, and they already have early support for the vendor-neutral provisional ray tracing extensions in Vulkan, it's going to be very interesting to see just how far developers will push it. Ideally though, we need more games on Linux that actually use it.
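If you're curious whether your current driver already exposes any of this, you can ask Vulkan directly. Below is a minimal sketch in C (my own example, not anything from NVIDIA; the file name rt_check.c is just made up) that creates an instance and reports which ray tracing device extensions each GPU advertises. NVIDIA's vendor extension is VK_NV_ray_tracing, while the cross-vendor ones start with VK_KHR_ray_tracing (the provisional extension at the time of writing, plus its later successors). You can get much the same answer by grepping the output of vulkaninfo.

```c
/* rt_check.c (hypothetical example): list each Vulkan device and whether it
 * reports the NVIDIA or Khronos ray tracing device extensions. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "rt-check",
        .apiVersion = VK_API_VERSION_1_1,
    };
    VkInstanceCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "Failed to create a Vulkan instance\n");
        return 1;
    }

    /* Find every Vulkan-capable GPU on the system. */
    uint32_t gpu_count = 0;
    vkEnumeratePhysicalDevices(instance, &gpu_count, NULL);
    VkPhysicalDevice *gpus = malloc(gpu_count * sizeof(*gpus));
    vkEnumeratePhysicalDevices(instance, &gpu_count, gpus);

    for (uint32_t i = 0; i < gpu_count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);

        /* Enumerate the device extensions this driver exposes. */
        uint32_t ext_count = 0;
        vkEnumerateDeviceExtensionProperties(gpus[i], NULL, &ext_count, NULL);
        VkExtensionProperties *exts = malloc(ext_count * sizeof(*exts));
        vkEnumerateDeviceExtensionProperties(gpus[i], NULL, &ext_count, exts);

        int nv_rt = 0, khr_rt = 0;
        for (uint32_t j = 0; j < ext_count; j++) {
            if (strcmp(exts[j].extensionName, "VK_NV_ray_tracing") == 0)
                nv_rt = 1;
            /* Prefix match covers the provisional VK_KHR_ray_tracing and
             * the later VK_KHR_ray_tracing_pipeline. */
            if (strncmp(exts[j].extensionName, "VK_KHR_ray_tracing",
                        strlen("VK_KHR_ray_tracing")) == 0)
                khr_rt = 1;
        }
        printf("%s: VK_NV_ray_tracing=%s VK_KHR_ray_tracing*=%s\n",
               props.deviceName,
               nv_rt ? "yes" : "no",
               khr_rt ? "yes" : "no");
        free(exts);
    }

    free(gpus);
    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Assuming the Vulkan headers and loader are installed, something like `gcc rt_check.c -o rt_check -lvulkan` should build it. On NVIDIA's proprietary driver you would expect at least the vendor extension to show up; the Khronos ones depend on how new the driver is.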

For those of you who really do love the serious technical side of things, NVIDIA has announced a whitepaper on the RTX 30 series, which will be available on September 17. They say it will cover every aspect of the new cards in 'exhaustive detail'.

Other Windows-only stuff was announced too of course: RTX for Fortnite, a new 'NVIDIA Broadcast' application for livestreamers with effects like virtual backgrounds and noise removal, another RTX-powered application for video editing, and more. Not that any of it matters for Linux users and gamers, sadly.

You can watch the whole thing below and see the announcement here.

Tags: Hardware, NVIDIA
About the author
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. Find me on Mastodon.
55 comments

randyl Sep 2, 2020
When my GTX 970 started giving me problems I picked up a 1660 Ti to hold me over. It's a decent card that I'll use in another build, but the 6 GB of memory is a limiter for a few games that want 8 GB+ for high-res textures. The RTX 3070 is very attractive, while the 3080 and 3090 don't appeal to me at all. The price is right for the 3070 as well.

Power consumption and heat are considerations for me. I'm waiting to see what AMD brings to the table with RDNA 2 before I form any strong opinions. Plus, I want to see Phoronix and Anandtech benchmarks before I get too excited.
a0kami Sep 2, 2020
Quoting: Shmerl
Oh, an interesting tidbit. Firefox started supporting VAAPI hardware accelerated video decoding both on Wayland and X11 now. But it won't work on Nvidia blob due to it lacking dmabuf support.

That's quite interesting, could you elaborate on that please?
WJMazepas Sep 2, 2020
I'm looking forward to seeing RDNA2 results now. They should have 50% more perf/W, and the CUs having 25% more performance at the same clock, combined with TSMC's 7nm+, should give them enough horsepower to fight these cards.

The only thing we have no idea about is how they compare on RT. Nvidia really stepped up their game here.
WJMazepas Sep 2, 2020
Quoting: Guest
Quoting: kokoko3k
I just hope old GPU prices go down, but it never happens.

They'd really prefer to price gouge you at all points in time, not just at release.

All that needs to be done is:

a) passing laws that force companies to put all of their money fairly into workers' wages, products, and expansion, and stop allowing investors and CEOs to hoard money. Oh, and make it so investors aren't allowed to enslave companies once they've already "made it" and no longer need help.

Or b) other economic systems such as government-funded research, which is already common in many countries (and often gets monopolized by private corporations).


How in the hell did you go from "old GPU prices don't go down" to government-funded GPU research?

Seriously, this is an Nvidia thing: they stop producing old cards and only make their newer ones available, but AMD still sells RX 580s at a really cheap price these days.
Shmerl Sep 2, 2020
Quoting: a0kami
That's quite interesting, could you elaborate on that please?

dmabuf is a Linux kernel API that allows sharing memory buffers between drivers and devices, for example between a video decoder and the GPU:

https://www.kernel.org/doc/html/latest/driver-api/dma-buf.html

In order to effectively use hardware video decoding, you need something like that. Open source drivers can use it fine. Nvidia blob can't use it, since it requires GPL compatibility.

Firefox is using dma-buf for hardware video acceleration over VAAPI, which works fine with AMD and Intel (and Nvidia with Nouveau). It doesn't work with the blob.


Last edited by Shmerl on 2 September 2020 at 3:15 am UTC
Shmerl Sep 2, 2020
What is interesting is AV1 support. AMD shouldn't slack with that and should add AV1 to RDNA 2 cards.
3zekiel Sep 2, 2020
Quoting: dubigrasu
Interesting bit from Nvidia labs:
(from the video posted in op)

Actually, you touch on something I see all the time in the silicon industry: all the internals are done on Linux, or at least open source RTOSes when it comes to smaller targets (although even then the SDK is usually developed on Linux too). I don't know many OS/driver/runtime devs who enjoy working on anything other than Linux. But you have some pressure from above to support losedow$ and such... In some companies you reach a ridiculous point where you have an "official" losedow$ PC that you must have because it's the rules, and a grey-zone Linux one (still paid for by the company though) to actually work on. In others you can only work in a VM, which is even worse.
CatKiller Sep 2, 2020
Quoting: gardotd426
Nope. No DLSS or RTX on Linux (except for Quake II with RTX since it's native, but effectively no).

No DLSS (or alternative) is going to really, really hurt Linux adoption going forward now that the consoles and Nvidia and supposedly AMD will now all support it, and it's probably a bigger deal than RTX.

You might not care about RTX, or DLSS, or Gsync/Freesync on multiple monitors, or HDR support, etc, but odds are 95% of people will care about at least ONE of the things like that that Linux has absolutely no answer for, and most of them there's not even an answer on the horizon.

Both RTX and DLSS are possible on Linux. If someone were making a Linux-native game they could use both of them right now. The ray tracing has been part of Vulkan since it became a thing, and Nvidia started including their NGX library with their driver recently.

The issue is getting other people to use them. With the Linux gaming market being small, spending time working on a feature that can only be used by a subset of that market is a tough sell; it's only profitable at all if you can avoid any speed bumps to the development process.

The other side of it is translating work done for Windows (which is almost entirely using DirectX) into something that will work on Linux. Structurally the Vulkan and DirectX implementations of ray tracing are (deliberately) very similar, and there's a tool (for the initial developers, not really for Wine) to automatically convert from one to the other, but it still takes work. There's no shortage of other work that also needs to be done. I imagine it will get there eventually, but I don't expect that it's a high priority since, again, it only helps a subset of users, and then the ones that are hardest to help because of the proprietary nature of the driver.

Given that reluctance, the sensible thing for Nvidia to do would be to contribute the translation for ray tracing between DirectX and Vulkan to VKD3D, since they're intimately familiar with both halves, and to contribute to Wine an implementation of the DLSS calls that uses their library. But they probably won't do that, because Nvidia, and because it's not a particularly profitable segment. Which means we have to wait for someone else to get round to it. In principle Valve could badger Nvidia into doing more, sooner.
a0kami Sep 2, 2020
Thank you Shmerl!
lqe5433 Sep 2, 2020
Quoting: Shmerl
I'm waiting for RDNA 2 as well. Even if Nvidia will claim 3x increase in performance, it won't impress me until they'll upstream their drivers.

I think they will never open source their drivers. They have a lot of new tech, which is top secret to them.
The comments on this article are closed.