
With Intel's brand new dedicated GPU due next year, they've begun talking up their efforts to get Linux support in early.

On Twitter, their team highlighted their work:

Our journey toward a new visual computing experience is underway, and that includes a commitment to the #OpenSource community. Local memory implementation is the first of many steps toward robust Linux support for our future discrete graphics solutions.

The post links to this set of patches, which reads:

In preparation for upcoming devices with device local memory, introduce the concept of different memory regions, and a simple buddy allocator to manage them. At the end of the series are a couple of HAX patches which introduce a fake local memory region for testing purposes. Currently smoke tested on a Skull Canyon device.
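For anyone curious what a buddy allocator actually does: it hands out power-of-two sized blocks from a region, splitting larger blocks on allocation and merging freed blocks back together with their "buddy" when both halves become free. The real work is in the i915 patches linked above; the snippet below is only a minimal, self-contained sketch of the general technique in C, with made-up names and a tiny fixed-size region purely for illustration, not Intel's actual implementation.

```c
/* Minimal sketch of a power-of-two buddy allocator, illustrating the idea
 * behind the "simple buddy allocator" mentioned in the patch series.
 * This is NOT the i915 code; names, sizes and structure are illustrative. */
#include <stdio.h>
#include <stdbool.h>

#define MAX_ORDER 4                      /* region = 2^MAX_ORDER minimum-size blocks */
#define NBLOCKS   (1u << MAX_ORDER)

/* free_lists[o] holds offsets of free blocks of size 2^o units */
static int free_lists[MAX_ORDER + 1][NBLOCKS];
static int free_count[MAX_ORDER + 1];

static void push_free(int order, int off)
{
    free_lists[order][free_count[order]++] = off;
}

/* Remove a specific offset from a free list if present; true on success. */
static bool pop_off(int order, int off)
{
    for (int i = 0; i < free_count[order]; i++) {
        if (free_lists[order][i] == off) {
            free_lists[order][i] = free_lists[order][--free_count[order]];
            return true;
        }
    }
    return false;
}

/* Allocate a block of 2^order units; returns its offset, or -1 if full. */
static int buddy_alloc(int order)
{
    int o = order;
    while (o <= MAX_ORDER && free_count[o] == 0)
        o++;                              /* find the smallest free block that fits */
    if (o > MAX_ORDER)
        return -1;

    int off = free_lists[o][--free_count[o]];
    while (o > order) {                   /* split down, keeping the upper halves free */
        o--;
        push_free(o, off + (1 << o));
    }
    return off;
}

/* Free a block of 2^order units at 'off', coalescing with its buddy where possible. */
static void buddy_free(int off, int order)
{
    while (order < MAX_ORDER) {
        int buddy = off ^ (1 << order);   /* the buddy differs only in bit 'order' */
        if (!pop_off(order, buddy))
            break;                        /* buddy still in use: stop merging */
        off &= ~(1 << order);             /* merged block starts at the lower offset */
        order++;
    }
    push_free(order, off);
}

int main(void)
{
    push_free(MAX_ORDER, 0);              /* the whole region starts as one free block */
    int a = buddy_alloc(1);               /* 2 units */
    int b = buddy_alloc(2);               /* 4 units */
    printf("a=%d b=%d\n", a, b);
    buddy_free(a, 1);
    buddy_free(b, 2);
    printf("region fully merged again: %s\n",
           free_count[MAX_ORDER] == 1 ? "yes" : "no");
    return 0;
}
```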

Intel have traditionally been pretty great with their Linux support and so this isn't exactly surprising. Even so, it's very pleasing to see them hype this up so we know we're getting first-class support.

It's exciting; we've long needed another horse to enter the race. 2020 is certainly going to be interesting. We've no idea what their target audience will be for it though; hopefully the price will be reasonable.

Could you see yourself buying an Intel discrete GPU?

I'm still rocking my NVIDIA 980 Ti which, thankfully, still has a good amount of time left. I've been considering an AMD GPU for a while, but it seems waiting another year might be worth it.

razing32 Feb 18, 2019
Interesting.
Will be curious about the future reviews.
Xaero_Vincent Feb 18, 2019
If GVT-g is supported with their consumer discrete cards like their iGPUs now, then I'll definitely consider one in my next build.
zimplex1 Feb 18, 2019
This may be autistic, but I'll never pair anything by AMD with NVIDIA or Intel. Which is why both my CPU and GPU are AMD.
Scoopta Feb 18, 2019
Quoting: zimplex1: This may be autistic, but I'll never pair anything by AMD with NVIDIA or Intel. Which is why both my CPU and GPU are AMD.
Yeah, I'm in the same boat right now. I've always had AMD and I really like their stuff, especially seeing as their FOSS drivers are awesome. I'll never touch nvidia unless they have a serious change of heart regarding Linux, and Intel is a maybe depending on how competitive the price to performance is. If I'm being honest though, I don't see myself leaving team red anytime soon.


Last edited by Scoopta on 18 February 2019 at 9:51 pm UTC
Zelox Feb 19, 2019
Intel GPU, yes I would buy it! Well, if it was for gaming, and for a reasonable price for my wallet.
Some GPUs are waaay too expensive at the moment.

It's also gonna be fun to see how Intel will market their GPU.


Last edited by Zelox on 19 February 2019 at 12:40 am UTC
Purple Library Guy Feb 19, 2019
Quoting: zimplex1: This may be autistic, but I'll never pair anything by AMD with NVIDIA or Intel. Which is why both my CPU and GPU are AMD.
So stuff has to match? Not so much autistic as OCD-ish, I guess. Kind of gives AMD the advantage there, since they're the only outfit seriously doing both things at the moment. But with stuff like this article, you could end up in a position to go Intel + Intel.


Last edited by Purple Library Guy on 19 February 2019 at 3:30 am UTC
TheRiddick Feb 19, 2019
Given that Intel GPU drivers are open source, and we can assume their new dedicated products will follow an open source route, then yes, I could buy their GPUs, but only if they're competitive on price and performance, which they won't be!

Plus, to top it off, I'm only looking at GPUs significantly faster than a 1080 Ti, which Intel has said no to; they are just going to have RX 580-competing products...
KuJo Feb 19, 2019
Supporting the commercial underdog is necessary for diversity in the markets. That is the reason why I have preferred, and will continue to prefer, AMD products. And, by the way, we are not at the technical end of performance when looking at the Ryzen CPUs. The Vega GPUs are a little behind what Nvidia shows, but they still have the performance to play in the high regions counted in FPS. By the way, similar to the CPUs from Intel, I expect medium to significantly higher prices than with AMD. They will probably be in similar regions as the Nvidia cards.

Therefore: as long as there is no performance or price/performance ratio that makes everything else look just crude and old and knocks my socks off, I will continue to support the commercial underdog and AMD.
appetrosyan Feb 21, 2019
Finally!

Intel has had an open source graphics stack for far longer than AMD/ATI. If they can fit into the power vacuum left by AMD in the high end, and not let Ngreedya fill it in with more overpriced rock, I'm all for it.

I'd still much rather prefer a RISC-V-style solution, one that comes from a company that never abused FOSS licenses and has no proprietary drivers, but Intel is good enough.
Shmerl Feb 21, 2019
Quoting: appetrosyan: If they can fit into the power vacuum left by AMD in the high end,

I don't think there is any vacuum. AMD are working on a new architecture for the high end, so Intel will be competing like everyone else.