
As if you could forget, right? Today the real next generation in gaming begins with the release of the NVIDIA GeForce RTX 3080, the first card built on the desktop Ampere architecture.

Need a reminder of just how ridiculously powerful the RTX 3080 is? Here are some specs:

GeForce RTX 3080
NVIDIA CUDA Cores: 8704
Boost Clock: 1.71 GHz
Standard Memory Config: 10 GB GDDR6X
Memory Interface Width: 320-bit
Ray Tracing Cores: 2nd Generation
Tensor Cores: 3rd Generation
Maximum GPU Temperature: 93°C
Graphics Card Power: 320 W
Recommended System Power: 750 W
Supplementary Power Connectors: 2x PCIe 8-pin

Additional details: the cards support the latest Vulkan, OpenGL 4.6, HDMI 2.1, DisplayPort 1.4a, HDCP 2.3 and PCI Express 4.0, plus hardware decoding of the AV1 codec.

Stock is expected to be quite limited, especially since NVIDIA didn't take pre-orders, so stores will likely sell out quickly. Even so, here are a few places where you might be able to grab one. Some of the sites are under quite heavy load due to high traffic, so prepare to wait a bit. I've seen plenty of "website not available" issues today while trying to gather links.

UK

USA

Feel free to comment with more and we can add them in.

Driver Support

Along with the release, NVIDIA also put out a brand new Linux driver, version 455.23.04. This is a beta driver, so there may be some rough edges they still need to iron out. It brings in support for the RTX 3080, RTX 3090 and the MX450.

On top of the new GPU support, it also has a bunch of fixes and improvements, including support for a device-local VkMemoryType, which NVIDIA says can boost performance in DiRT Rally 2.0, DOOM Eternal and World of Warcraft when run through DXVK with Steam Play. Red Dead Redemption 2 on Steam Play also gets a fix for a bug that was causing excessive CPU use.
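
If you're wondering what that means in practice: a device-local memory type that can also be written directly by the CPU is exactly the kind of thing a Vulkan application, or a translation layer like DXVK, looks for when uploading data. The snippet below is just a minimal hand-written sketch (not NVIDIA's or DXVK's actual code) showing how an application can enumerate the driver's memory types and pick one that is both device-local and host-visible:

#include <vulkan/vulkan.h>
#include <stdint.h>
#include <stdio.h>

/* Minimal sketch: find a memory type that is both device-local and
 * host-visible. Assumes 'gpu' is a VkPhysicalDevice you already
 * obtained in your own Vulkan setup code. */
static uint32_t find_device_local_host_visible(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);

    const VkMemoryPropertyFlags wanted =
        VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT |
        VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;

    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        if ((props.memoryTypes[i].propertyFlags & wanted) == wanted) {
            printf("Memory type %u is device-local and host-visible\n", i);
            return i;
        }
    }
    return UINT32_MAX; /* no such type exposed by this driver/GPU */
}

Without a type like that, uploads generally have to go through a separate host-visible staging buffer before reaching VRAM, which is likely why games running through DXVK see a benefit.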

The VDPAU driver also gained support for decoding VP9 10- and 12-bit bitstreams, although it doesn't yet support 10- and 12-bit video surfaces. NVIDIA also updated Base Mosaic support on GeForce to allow a maximum of five simultaneous displays, up from three. For PRIME users there are some great sounding fixes included too, so you should see a smoother experience there.
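
If you want a rough way to check whether your setup exposes the new VP9 decode capability, here's a small hand-rolled query against the public VDPAU API (my own sketch, not something from the release notes): it asks the decoder whether VP9 profile 2, the profile used for 10/12-bit 4:2:0 content, is supported. It needs a reasonably recent vdpau header for the VP9 profile constants.

#include <stdio.h>
#include <stdint.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void)
{
    /* Standard libvdpau setup: create a device on the default X screen. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "No X display\n"); return 1; }

    VdpDevice device;
    VdpGetProcAddress *get_proc_address;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &device,
                              &get_proc_address) != VDP_STATUS_OK) {
        fprintf(stderr, "Could not create a VDPAU device\n");
        return 1;
    }

    /* Look up the capability query entry point, then ask about VP9 profile 2. */
    VdpDecoderQueryCapabilities *query_caps = NULL;
    get_proc_address(device, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES,
                     (void **)&query_caps);
    if (!query_caps) { fprintf(stderr, "Query entry point unavailable\n"); return 1; }

    VdpBool supported = 0;
    uint32_t max_level, max_macroblocks, max_width, max_height;
    query_caps(device, VDP_DECODER_PROFILE_VP9_PROFILE_2, &supported,
               &max_level, &max_macroblocks, &max_width, &max_height);

    printf("VP9 profile 2 (10/12-bit) decode: %s\n",
           supported ? "supported" : "not supported");
    return 0;
}

Built with something like gcc vp9check.c -lvdpau -lX11.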

Some bits were removed for SLI too, like the "SFR", "AFR" and "AA" modes, but SLI Mosaic, Base Mosaic, GL_NV_gpu_multicast and GLX_NV_multigpu_context are still supported. There are also plenty of other bug fixes.

What's next?

Today is only the start, with the RTX 3090 going on sale on September 24 and the RTX 3070 following later in October. There's also been a leak (as always) of an RTX 3060 Ti, which is also due to arrive in October. Based on the leak, the upcoming RTX 3060 Ti will have 4864 CUDA cores and 8GB of GDDR6 (no X) memory clocked at 14Gbps, with a memory bandwidth of 447 GB/s, which means even the 3060 Ti is going to kick butt.
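
As a quick sanity check on that leaked bandwidth figure, assuming a 256-bit memory bus (that bus width is my assumption, not part of the leak quoted here):

256 bits × 14 Gbps per pin ÷ 8 bits per byte = 448 GB/s

which lines up with the number above.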

Are you going for Ampere, sticking with what you have or waiting on the upcoming AMD RDNA 2 announcements? Do let us know in the comments.

71 comments

tuubi Sep 18, 2020
Quoting: emphy
Quoting: The_Aquabat
Quoting: bisbyx — compared the 3070 to the 2080 Ti, then compared the 3080 to the 2080 (non-Super). They primed your brain so when you hear "twice as fast" you think "twice as fast as the 2080 Ti"... even though 25% faster than the 2080 Ti is roughly what their graph (shown in the article) says.

Interesting, I didn't notice the 2080 Ti at first glance; they put it in the furthest part of the graph. Another marketing technique here??

That is just a consequence of the 2080 Ti's price. For the marketing trick you need to look at the other side of the graph. I'll leave it for you to discover; it's quite subtle.
You mean how the price axis starts at $200?
x_wing Sep 18, 2020
Quoting: emphy — That is just a consequence of the 2080 Ti's price. For the marketing trick you need to look at the other side of the graph. I'll leave it for you to discover; it's quite subtle.

More than that: somehow people believe that Nvidia GPU prices were reduced... while the truth is that they just kept the same price per tier.
Mohandevir Sep 18, 2020
Quoting: Shmerl
Quoting: The_Aquabat — If you make such a bold claim like twice the performance, you have to prove it.

I knew from the start that the 2x improvement claim was just marketing. It simply sounded unrealistic.


Quoting: The_Aquabat — So I expect Big Navi to score some wins, at least on compute tasks and maybe some games optimized for AMD, and it is definitely not even close to "dead on arrival"

Regarding AMD, they gave projected improvements in their RDNA 2 slides and I don't expect it to be far off:



That's for performance per watt. What's not known yet is how powerful their highest-end card will be, using that improvement and however many compute units they put in it. It could be quite a leap from the 5700 XT.

Question:
The RX 5600 XT supports Radeon Rays (AMD ray tracing)... What's the state of it on Linux?


Last edited by Mohandevir on 18 September 2020 at 5:31 pm UTC
pete910 Sep 18, 2020
Quoting: Mohandevir — The RX 5600 XT supports Radeon Rays (AMD ray tracing)... What's the state of it on Linux?

No it doesn't, not on a hardware level.
Mohandevir Sep 18, 2020
Quoting: pete910 — No it doesn't, not on a hardware level.

Aaaaah! Radeon Rays Audio... What the...?!

Ok! Sorry then.


Last edited by Mohandevir on 18 September 2020 at 5:45 pm UTC
Shmerl Sep 18, 2020
Right, ASICs for ray tracing will be added only in RDNA 2.
Dunc Sep 18, 2020
Well, I'm still pretty pleased with my 960. Mind you I got it for free, which gives it an infinite price/performance ratio. Beat that!
emphy Sep 19, 2020
Quoting: tuubi — You mean how the price axis starts at $200?

That's part of it.
TobyGornow Sep 19, 2020
Using Gamers Nexus numbers and comparing the 2080 FE vs the 3080 FE, under FurMark we have 235 W vs 324 W raw consumption, which gives us an 89 W increase (+38%), and according to LTT we see an average 30% increase in fps (at 1440p).

I'm really not impressed; they just cranked the power up to eleven and I don't get the hype... Good for 4K players and VR maybe (respectively 2.24% and 1.7% on the Steam hardware survey).

What I'm really hoping for is that AMD delivers on their power efficiency promises and gives us a 50% increase (I sincerely doubt it) for the same power consumption as the 5700 XT. For me, that's where they can differentiate themselves with RDNA 2: not joining this stupid race and keeping things reasonable price-wise and consumption-wise.

Nvidia, a machine for pigs?

EDIT: Grammar


Last edited by TobyGornow on 19 September 2020 at 12:54 pm UTC
wvstolzing Sep 20, 2020
I see one 'generational leap' on that graph, and it's the Olympic-record leap in ridiculous pricing taken by the 2080 Ti.