
Intel giving hints at a possible Intel Xe dedicated GPU release in June 2020


Raja Koduri from Intel put out a bit of a teaser on Twitter recently for their upcoming dedicated GPU.

In the Twitter post, which was retweeted by the official Intel Graphics Twitter account, was the below image, which has the date of June 2020 on the license plate. Not exactly cryptic; it's a pretty clear teaser towards a release date for the Intel Xe, or whatever they actually end up calling it once it's out. That's pure speculation on my part of course, but it would line up given who sent the tweet and Intel previously saying the Xe series will be out in 2020.

We've yet to see any solid information on exactly how powerful they will be. What we do know, though, is that they should get first-class Linux support, as Intel has been working on their drivers on Linux. They've talked openly before about their commitment to open source and their focus on Linux gaming too, so it's quite exciting.

NVIDIA and AMD could use more GPU competition; the more we have, the more it should push them to improve both their hardware and their prices for future generations.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
19 comments

mirv 7 October 2019 at 8:01 pm UTC
Shmerl: There is enough demand in datacenters though, so I don't think they will oversaturate the market. It's a good thing.

I still think that while Intel might put a showing into rendering, their main target will no doubt be compute. I think it's Intel that has really been pushing SYCL (although someone please correct me if I'm wrong there). I see them eyeing CUDA and wanting to take some of that market. And yeah, there's plenty of room for someone else to compete in there.
TheRiddick 7 October 2019 at 10:36 pm UTC
Apparently NVIDIA will have their next series of cards out in Q1 2020, so it will be interesting. Honestly, the only reason to go NVIDIA is for top-end card performance at 4K; if you're a 1080p or 1440p gamer then there is absolutely no reason to consider only them. Intel and AMD are releasing mid-range cards; there will be no top end.

I don't care what AMD says; by the time their big Navi comes out it's going to be way too late and cost WAY too much. That's how it's been for quite some time now and I see no reason why it would change.
Shmerl 7 October 2019 at 11:45 pm UTC
TheRiddick: I don't care what AMD says; by the time their big Navi comes out it's going to be way too late and cost WAY too much. That's how it's been for quite some time now and I see no reason why it would change.

How aren't NVIDIA 2080 cards too much, though? They are already crazily priced, not something I'm interested in paying for a GPU, and I doubt they'll lower those prices. So if AMD prices theirs the same way (like they did with the Radeon VII), it's not likely I'm going to get those cards either.

TheRiddick: Honestly, the only reason to go NVIDIA is for top-end card performance at 4K; if you're a 1080p or 1440p gamer then there is absolutely no reason to consider only them.

IMHO, for monitors a higher refresh rate (matched with framerate in games) is more valuable than a higher resolution, so 4K is quite a red herring to justify such expensive cards. I.e. I'd take something like 2560x1440 at over 100 fps over 4K with a lower framerate. Even top-end cards would struggle to push high framerates at 4K on max settings in demanding games. It will probably still take a few GPU generations for 4K to become usable at high framerates, so there's no rush.


Last edited by Shmerl on 7 October 2019 at 11:49 pm UTC
GustyGhost 8 October 2019 at 12:24 am UTC
If Intel plan on getting into the consumer dGPU space, they will very likely be strong-armed by Hollywood and friends into implementing a "secure content path", all for "the benefit of the users". Get ready for more firmware- or hardware-level DRM.
Arten 8 October 2019 at 5:43 am UTC
Why is that license plate on a Tesla? Do they want to compete with the Nvidia Tesla in datacenters?
razing32 8 October 2019 at 7:37 pm UTC
GustyGhost: If Intel plan on getting into the consumer dGPU space, they will very likely be strong-armed by Hollywood and friends into implementing a "secure content path", all for "the benefit of the users". Get ready for more firmware- or hardware-level DRM.

OK, that comment flew over my head.
A bit of context, please?
GustyGhost 11 October 2019 at 11:32 pm UTC
razing32: A bit of context, please?

You cannot use your current GPU without loading a proprietary blob firmware which uses encryption to make sure that the GPU ultimately obeys somebody else that isn't you. Part of this (mis)functionality is required for other "protection" schemes such as HDCP. Who would pressure for such (mis)functionality to be integrated into hardware and firmware?
sub 12 October 2019 at 10:32 am UTC
GustyGhost:
razing32: A bit of context, please?

You cannot use your current GPU without loading a proprietary blob firmware which uses encryption to make sure that the GPU ultimately obeys somebody else that isn't you. Part of this (mis)functionality is required for other "protection" schemes such as HDCP. Who would pressure for such (mis)functionality to be integrated into hardware and firmware?

Plus, aren't those companies trying to hard-lock segmentation into their platforms?
Providing some features only on one (premium) product and not on others allows them to use (almost) the same silicon for a wide range of products, which can significantly lower costs.

I guess this is also one argument against a fully open stack for at least some companies; looking at you in particular, Nvidia.
razing32 13 October 2019 at 12:37 pm UTC
Thanks.
Guess I learned something new.