As if you forgot, right? Today, the real next generation in gaming begins, with the release of the NVIDIA GeForce RTX 3080 as the first in the desktop Ampere architecture.
Need a reminder of just how ridiculous and powerful the RTX 3080 is? Here's some specs:
| GEFORCE RTX 3080 | |
| --- | --- |
| NVIDIA CUDA® Cores | 8704 |
| Boost Clock (GHz) | 1.71 |
| Standard Memory Config | 10 GB GDDR6X |
| Memory Interface Width | 320-bit |
| Ray Tracing Cores | 2nd Generation |
| Tensor Cores | 3rd Generation |
| Maximum GPU Temperature (°C) | 93 |
| Graphics Card Power (W) | 320 |
| Recommended System Power (W) | 750 |
| Supplementary Power Connectors | 2x PCIe 8-pin |
Additional details: the new cards support the latest Vulkan, OpenGL 4.6, HDMI 2.1, DisplayPort 1.4a, HDCP 2.3 and PCI Express 4.0, along with support for the AV1 codec.
Stock is expected to be quite limited, especially since there were no pre-orders, so stores will likely sell out quickly. Even so, here's a few places where you might be able to grab one. Some of the sites are under heavy load due to high traffic, so be prepared to wait a bit. I've seen plenty of "website not available" errors today while gathering links.
UK
USA
Feel free to comment with more and we can add them in.
Driver Support
Along with the release, NVIDIA also put out a brand new Linux driver, version 455.23.04. This is a beta driver, so there may be some rough edges they still need to iron out. It brings in support for the RTX 3080, RTX 3090 and the MX450.
On top of new GPU support, it also has a bunch of fixes and improvements including support for device-local VkMemoryType, which NVIDIA said can boost performance with DiRT Rally 2.0, DOOM: Eternal and World of Warcraft with DXVK and Steam Play. Red Dead Redemption 2 with Steam Play should also see a bug fix that was causing excessive CPU use.
The VDPAU driver also gained support for decoding 10- and 12-bit VP9 bitstreams, although it doesn't yet support 10- and 12-bit video surfaces. NVIDIA also updated Base Mosaic support on GeForce to allow a maximum of five simultaneous displays, rather than three. For PRIME users, there are some great-sounding fixes included too, so you should see a smoother experience there.
Some bits were removed for SLI too like "SFR", "AFR", and "AA" modes but SLI Mosaic, Base Mosaic, GL_NV_gpu_multicast, and GLX_NV_multigpu_context are still supported. There's also plenty of other bug fixes.
What's next?
Today is only the start, with the RTX 3090 going up on September 24 and the RTX 3070 later in October. There's also been a leak (as always) of an RTX 3060 Ti, which is also due to arrive in October. Based on the leak, the upcoming RTX 3060 Ti will have 4864 CUDA cores and 8GB of GDDR6 (no X) memory clocked at 14Gbps, with a memory bandwidth of 447GB/s, which means even the 3060 Ti is going to kick butt.
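As a quick sanity check on those leaked numbers, peak memory bandwidth is simply the per-pin data rate times the bus width. Here's a minimal sketch in Python, assuming a 256-bit bus for the 3060 Ti and a 19Gbps GDDR6X rate for the RTX 3080 (neither figure is stated in the article or the leak, so treat both as assumptions):

```python
# Rough sketch: peak GPU memory bandwidth from per-pin data rate and bus width.
# The 256-bit bus (3060 Ti) and 19Gbps GDDR6X rate (RTX 3080) are assumptions,
# not figures from the article or the leak.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin rate (Gbps) * bus width (bits) / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(14, 256))  # 448.0 GB/s, matching the leaked ~447GB/s
print(memory_bandwidth_gbs(19, 320))  # 760.0 GB/s for the RTX 3080
```

The 448GB/s result lining up with the leaked ~447GB/s is what suggests the 256-bit bus in the first place.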
Are you going for Ampere, sticking with what you have or waiting on the upcoming AMD RDNA 2 announcements? Do let us know in the comments.
Quoting: ShmerlSeparating the drivers and having separate releases for them would improve that a lot. It doesn't make much sense to wait for a big package with dozens of modules to be tested and make it into the stable release/update if all you want is to upgrade a single piece of your hardware which only needs the update of a tiny part of the package. And from a user's point of view it doesn't matter who's to blame. The only thing that matters is: can you use the hardware on release date or not.

Quoting: Alm888Hey, AMD! Did you see that? That's how it is done! A driver on day 1!
They publish their dkms driver (which is open source) pretty much on day one. They can't force the upstream kernel release schedule, or Mesa's release schedule, or llvm's and so on.
So you can be in a situation where upstream projects are lagging behind while the needed support is already public. What could be improved, though, is distros providing their own bundled support in such cases until upstream catches up. But most distros don't care about such use cases.
Quoting: peta77Separating the drivers and having separate releases for them would improve that a lot.
Not necessarily. It's a trade off. Having stuff out of tree brings its own issues even if you get the benefit of having an independent release schedule.
At least with ACO, Mesa is less dependent on llvm for Vulkan now.
Quoting: EMO GANGSTERHas been working great for me for the better part of a year now. I also game on a 1080p TV and it runs everything I own at maximum settings with power to spare.

Quoting: emphyThese cards are all way out of range of what I consider to be acceptable pricing, no matter how good the performance.
What's much more interesting to me is that the rx5700xt numbers are not all that much behind in the reviews (price/performance ratio) and that I am already seeing those cards in the used market for *very* reasonable prices, which I expect to get even more tempting in the coming few months.
Have the drivers gotten better for these cards? I'm hoping they drop in price when RDNA2 comes out. I game on a 1080p TV, so I don't need a crazy powerful card.
Quoting: pete910Just to note, while I agree on the late OSS driver support on distros like *buntu, there was nothing stopping you from using AMD's prop driver.
I ordered the 5700XT on launch day and used the AMD proprietary driver on 18.04 LTS; a few weeks later there was an update to 18.04 LTS and the driver no longer worked.
Last edited by Breeze on 17 September 2020 at 7:29 pm UTC
I'm sure efficiency must have increased somewhat; my card is now four years old (says the internet). To be honest I don't have any numbers and don't invest much time in it. Usually I take a look at what models lie in my desired power range (< 100 W), then fire up some of those "GPU X vs. GPU Y" comparisons and am disappointed by the unimpressive performance gains.
My gut says it's a different story with CPUs. Still running on an X4 860K. Just compared it to a Ryzen 5 3600, which has a lower TDP (65 vs. 95 W) and, from some benchmarks I found, 2 to 5 times the performance.
Am I just missing some products here, or is the GPU market really evolving so differently?
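For what it's worth, that CPU comparison works out to a sizeable efficiency jump. A rough sketch, treating TDP as a loose proxy for actual power draw (which it isn't, strictly):

```python
# Rough perf-per-watt comparison from the numbers above:
# 2x-5x the performance at 65 W TDP vs. 95 W TDP.
# TDP is only a loose proxy for real power consumption.

def perf_per_watt_gain(speedup: float, old_tdp_w: float, new_tdp_w: float) -> float:
    """Factor by which performance-per-watt improves when performance scales
    by `speedup` while TDP goes from old_tdp_w to new_tdp_w."""
    return speedup * old_tdp_w / new_tdp_w

print(round(perf_per_watt_gain(2, 95, 65), 1))  # prints 2.9 (low end)
print(round(perf_per_watt_gain(5, 95, 65), 1))  # prints 7.3 (high end)
```

So even the pessimistic benchmark figure is roughly a 3x perf-per-watt improvement, which is well beyond what typical GPU generation jumps deliver.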
Quoting: NagezahnI'm sure efficiency must have increased somewhat; my card is now four years old (says the internet). To be honest I don't have any numbers and don't invest much time in it. Usually I take a look at what models lie in my desired power range (< 100 W), then fire up some of those "GPU X vs. GPU Y" comparisons and am disappointed by the unimpressive performance gains.

The answer to your question is Navi. I upgraded to a 5600 XT from a 570 and it uses the same power with an 80% performance increase; coming from a 470 it would be more like 90% or more for the same watts.
It idles at 11 watts and normally needs around 100 watts in games (it can go up to 135 watts, but pulls around 100 watts or less most of the time when gaming). Also remember that some 470s can pull up to 160 watts.
Last edited by Koopacabras on 17 September 2020 at 8:07 pm UTC
Quoting: NagezahnStill running an RX 470 (I didn't even remember and had to lspci for it), usually enough for the kind of games I play. Every time there is some new GPU generation announced, I hope for a reasonably priced model that has lower power consumption and significantly higher performance so I might consider an upgrade. But with every new generation I have the feeling that only monsters are created, or budget cards for office scenarios.
I'm sure efficiency must have increased somewhat; my card is now four years old (says the internet). To be honest I don't have any numbers and don't invest much time in it. Usually I take a look at what models lie in my desired power range (< 100 W), then fire up some of those "GPU X vs. GPU Y" comparisons and am disappointed by the unimpressive performance gains.
My gut says it's a different story with CPUs. Still running on an X4 860K. Just compared it to a Ryzen 5 3600, which has a lower TDP (65 vs. 95 W) and, from some benchmarks I found, 2 to 5 times the performance.
Am I just missing some products here, or is the GPU market really evolving so differently?
In fact, the gaming GPU market is driven by two selling points atm: 4K 144Hz monitors and RTX. If you don't plan to jump on those bandwagons, you won't feel as compelled to upgrade your GPU.
Personally, I have a 24in 1080p 75Hz monitor and a 50in 1080p 60Hz HDTV that I will keep for the foreseeable future. My GTX 960 4GB is still serviceable, if I accept the fact that I won't play at Ultra settings (Very High in the large majority of cases, Medium in DE:MD). So... the RTX 3080 has absolutely no appeal to me. Overpriced, overpowered. In fact, I'm looking for the best RDNA2 GPU (to be ready for Gamescope/Wayland) that fits a 450W PSU. That's what I have in my Cougar QBX.
Last edited by Mohandevir on 17 September 2020 at 8:20 pm UTC
Quoting: BreezeQuoting: pete910Just to note, while I agree on the late OSS driver support on distros like *buntu, there was nothing stopping you from using AMD's prop driver.

I ordered the 5700XT on launch day and used the AMD proprietary driver on 18.04 LTS; a few weeks later there was an update to 18.04 LTS and the driver no longer worked.
Well, that's not good, I admit, but you did have day one support, so the reply to Breeze is true in that he/she is talking complete bullshit!
Where is Nvidia's day one support with their OSS driver? Oh, wait....