
Intel's new discrete GPU will have a focus on Linux gaming


Not exactly surprising, but Intel have stated that Linux gaming will be a focus for their new discrete GPU.

After having a chat with Intel, HotHardware mentioned this:

We should also mention that Ari underscored that Linux gaming will be a focus for Intel as well.

It's not really surprising, given Intel have a history of supporting Linux that goes back quite some years. According to the dedicated fellow over at Phoronix, they're also working on a new GPU driver, which may well be groundwork towards supporting their new dedicated GPU.

I've personally been debating getting an AMD GPU for my next upgrade, but considering how long Intel have supported their open source drivers on Linux, I'm pretty happy to wait and see how Intel's new GPU turns out first.

Additionally, I do hope Intel's new GPU sees some success. We've been stuck for too long with only really AMD and NVIDIA on the desktop, and they could both do with some more competition. Their new GPU isn't due until some time in 2020, so we still have a while to wait.

Hat tip to Nod.

Tags: Hardware, Intel
The comments on this article are closed.
19 comments

hardpenguin Dec 3, 2018
That's a yay from me, go Mesa, go opensource, thanks Intel :D
Brisse Dec 3, 2018
Best of luck Intel. The competition is needed, and the FOSS support is appreciated.
pete910 Dec 3, 2018
Quote: I'm pretty happy to wait and see how Intel's new GPU turns out first.

You're going to be waiting a while :P

Just go get an AMD card!
Liam Dawe Dec 3, 2018
Quoting: pete910
Quote: I'm pretty happy to wait and see how Intel's new GPU turns out first.

You're going to be waiting a while :P

Just go get an AMD card!
My 980ti doesn't need replacing, and won't until after those are released anyway, so I'm not in any rush.


Last edited by Liam Dawe on 3 December 2018 at 1:57 pm UTC
iiari Dec 3, 2018
Quoting: pete910
Quote: I'm pretty happy to wait and see how Intel's new GPU turns out first.

You're going to be waiting a while :P

Just go get an AMD card!
2020 is indeed far away in computer years...

However, and this almost certainly isn't the place for this, I don't see the point of an AMD card. I'm thinking of getting a new desktop, and it appears that on Linux even the fastest AMD card is slower than the Nvidia 1070 I've been running for the past two years without issue. Unless you're very devoted to the idea of FOSS for drivers, and you're not cost constrained, why go AMD? Again, taking the FOSS vs non-FOSS question out of the equation... Honestly wondering.


Last edited by iiari on 3 December 2018 at 2:45 pm UTC
Purple Library Guy Dec 3, 2018
Quoting: iiari
Quoting: pete910
Quote: I'm pretty happy to wait and see how Intel's new GPU turns out first.

You're going to be waiting a while :P

Just go get an AMD card!
2020 is indeed far away in computer years...

However, and this almost certainly isn't the place for this, I don't see the point of an AMD card. I'm thinking of getting a new desktop, and it appears that on Linux even the fastest AMD card is slower than the Nvidia 1070 I've been running for the past two years without issue. Unless you're very devoted to the idea of FOSS for drivers, and you're not cost constrained, why go AMD? Again, taking the FOSS vs non-FOSS question out of the equation... Honestly wondering.
Well, one thing to consider is the Wayland issue. Yeah, I know, Wayland is taking forever to dominate, but it is getting used more and more and adoption of this sort of thing tends to accelerate after a certain point, so if we're talking time horizons like 2020 . . .
My understanding of just what the problem is is fuzzy, but I hear Nvidia don't play well with Wayland.


Last edited by Purple Library Guy on 3 December 2018 at 6:02 pm UTC
Luke_Nukem Dec 3, 2018
AMD's graphics performance to power use ratio is way off. That's one of the primary reasons I won't go AMD, even though I want to. Their APU game looks decent though.

So I'm definitely keen to see this new card. It might be something that gives them a bit of a boost.
svartalf Dec 3, 2018
I distinctly remember their LAST attempt at going back into the discrete market and how lackluster it was overall.

One of the problems was that they didn't do enough R&D experimentation before announcing it. They foolishly thought you could just field stripped-down CPUs (it's doable, actually, just not in the way THEY did it) and never saw the problem they were going to run into.

http://libre-riscv.org/3d_gpu/

Here's a set of musings from part of the RISC-V crowd that run along similar lines to some I've had myself, and continue to have. One of the telling things is this:

Quote: nyuzi is a modern "software shader / renderer" and is a replication of the intel larrabee architecture. it explored the concept of doing recursive software-driven rasterisation (as did larrabee) where hardware rasterisation uses brute force and often wastes time and power. jeff went to a lot of trouble to find out why intel's researchers were um "not permitted" to actually put performance numbers into their published papers. he found out why :) one of the main facts that jeff's research reveals (and there are a lot of them) is that most of the energy of a GPU is spent getting data each way past the L2/L1 cache barrier, and secondly much of the time (if doing software-only rendering) you have several instruction cycles where in a hardware design you issue one and a separate pipeline takes over (see videocore-iv below)

Hm... there's a reason they went down in flames: they got broken on the wheel of bandwidth. They couldn't get the graphics data in and out of the pipelines fast enough with the bus design they made.
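
To make the brute-force vs recursive point a bit more concrete, here's a rough sketch of the two approaches. It's purely illustrative (it assumes counter-clockwise triangles and power-of-two tile sizes, and none of the names come from the linked page or any real driver): brute force evaluates the edge functions at every single pixel, while the recursive version classifies whole tiles against each edge and only descends into tiles that straddle an edge.

# Purely illustrative sketch -- not Intel's, Larrabee's, or nyuzi's
# actual code. Contrasts brute-force rasterisation with the recursive,
# tile-classifying kind the quote describes.

def edge(a, b, p):
    # Signed edge function: >= 0 when p is on the inside of edge a -> b
    # for a counter-clockwise triangle.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def brute_force(tri, width, height):
    # Brute force: test every pixel against all three edges.
    a, b, c = tri
    return [(x, y) for y in range(height) for x in range(width)
            if edge(a, b, (x, y)) >= 0
            and edge(b, c, (x, y)) >= 0
            and edge(c, a, (x, y)) >= 0]

def rasterise(tri, x0, y0, size, out):
    # Recursive: classify a size x size tile by its four corner samples.
    a, b, c = tri
    corners = [(x0, y0), (x0 + size - 1, y0),
               (x0, y0 + size - 1), (x0 + size - 1, y0 + size - 1)]
    fully_inside = True
    for e0, e1 in ((a, b), (b, c), (c, a)):
        vals = [edge(e0, e1, p) for p in corners]
        if all(v < 0 for v in vals):
            return                      # whole tile outside one edge: reject
        if any(v < 0 for v in vals):
            fully_inside = False        # tile straddles this edge
    if fully_inside:
        out.extend((x, y) for y in range(y0, y0 + size)
                   for x in range(x0, x0 + size))  # accept the whole tile
    else:
        half = size // 2                # partial cover: split into quadrants
        for dy in (0, half):
            for dx in (0, half):
                rasterise(tri, x0 + dx, y0 + dy, half, out)

# Both produce the same coverage for this 16x16 framebuffer:
out = []
tri = ((0, 0), (15, 0), (0, 15))
rasterise(tri, 0, 0, 16, out)
assert sorted(out) == sorted(brute_force(tri, 16, 16))

Hardware does the brute-force loop absurdly fast in a fixed-function pipeline; do the recursive version in software and every one of those tile decisions costs instruction cycles plus cache traffic, which is exactly the L1/L2 energy problem the quote is on about.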

Now, if they duplicate something like AMD's Southern Islands with a decent enough rasterizer and peel part of the crap out of AMD's design, they might have something; that's squarely in the space AMD and NVIDIA are currently living in, and it might even consume less power.

If they try something Larrabee-ish, or something a bit bolder like Adapteva's mesh coupled with something more like a VideoCore IV, it might have better legs, and maybe even hand you rendering methods other than the GL/Vulkan/Metal/DX ones.

There might be other paths. But unless they've been experimenting with silicon in the form of high-end, high-Fmax (1GHz or better) FPGAs or fully taped-out silicon, this is sadly going to be another hyped-up attempt to "stay relevant" in that space for investors, much like Larrabee before it.
Whitewolfe80 Dec 3, 2018
Mmm, makes sense from Intel's point of view. They are never going to take NVIDIA's place in Windows desktop gaming, but on Linux it's a much easier task, especially if they hit the ground with as good or better gaming performance.
svartalf Dec 3, 2018
Quoting: Whitewolfe80
Mmm, makes sense from Intel's point of view. They are never going to take NVIDIA's place in Windows desktop gaming, but on Linux it's a much easier task, especially if they hit the ground with as good or better gaming performance.

Oh, it most certainly does. But the thing is... they claimed this once before.