
NVIDIA have announced the TITAN Xp and it's a monster

NVIDIA have announced their new top-end GPU, the 'TITAN Xp', based on their Pascal architecture, and it's an absolute monster of a card. NVIDIA only announced the 1080 Ti back in March, so I was wondering if they would do another TITAN.

It has 3,840 CUDA cores, a memory speed of 11.4 Gbps and a massive 12 GB of GDDR5X memory with an impressive memory bandwidth of 547.7 GB/s. It supports a max resolution of 7680x4320 at 60Hz, for those high-definition displays you're all hoarding.
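If you're curious where that bandwidth figure comes from, it's roughly the effective per-pin data rate multiplied by the memory bus width. A quick sketch (the 384-bit bus width is an assumption based on the card's 12 GB configuration, and NVIDIA's 11.4 Gbps figure is rounded, which explains the small gap against the quoted 547.7 GB/s):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) x bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

# Assumed 384-bit GDDR5X bus; 11.4 Gbps effective data rate as announced.
print(memory_bandwidth_gbs(11.4, 384))  # ~547.2 GB/s, close to the quoted 547.7
```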

You can find the full specification and more info here. It will cost a small fortune at £1,159.00 so you might want to sell a few limbs.

Would you be looking to buy one? I can only imagine the performance levels with something like that.

My 980 Ti is sounding a little old right now, but I still personally want to move to an AMD GPU to take advantage of the open source Mesa drivers. Still, I can't help feeling excited by how far GPUs have progressed in recent years, to the point where a behemoth like this is possible. Article taken from GamingOnLinux.com.
Tags: Hardware, NVIDIA
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. Find me on Mastodon.
The comments on this article are closed.
32 comments

FireBurn Apr 10, 2017
Quoting: Shmerl
They probably put it out to look better when Vega comes out. But it's hard for them to compete on price. I'm getting a Vega GPU as soon as the kernel / Mesa are in shape to support it. Seems like there will be a delay with that, because the DC code hasn't been merged yet.

If you're pairing it with built-in Intel or AMD graphics, you should be able to do without DC and use PRIME to offload graphics rendering to the dGPU. It works rather well on my laptop since fencing was added (i.e. no tearing).
NovenTheHero Apr 11, 2017
Don't AMD cards still suck pretty bad on Linux? I haven't had an ATI card since my 9800 pro back in the day because of the horrible driver support.