
Intel reveals Arc Graphics A-Series desktop GPU specifications


As Intel readies up their dedicated desktop graphics cards, they have released fresh details on the specifications, so here's what's about to arrive. The details came in a short Q&A video; I'll spare you from diving into them right here, but the video is below if you wish to listen in.

Just like their CPUs, they will be split across different tiers: Intel Arc 3, 5, and 7 graphics, with 7 being "High Performance Gaming", 5 being "Advanced Gaming" and 3 being "Enhanced Gaming". Here are the specs:

When it comes to the higher-end though, they're split between 8GB and 16GB. Intel said that partners will mostly ship with 8GB, but their Limited Edition GPU doubles it to 16GB. That's about the only difference they've revealed in the line-up. They also said that each card is fully featured with HDR, variable refresh rate, XeSS, and video encoding for AV1 and other popular codecs.

You can see their full video below:

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
16 comments

Jpxe 9 Sep
I'm excited! I hope these will be great cards for Linux gaming
denyasis 9 Sep
Me too. I might seriously consider one as a card upgrade depending on how they do benchmark- and thermal-wise.
Ehvis 9 Sep
It'll be good to see some practical numbers. In the video I see that the Arc 7 is rated for 225 W, which is substantially less than the heavy hitters from AMD and NVidia, so I wonder how much performance they squeeze out of that.

Of course, for me personally there are other things that are important now. How would this work with VR on Linux? Would it support async reprojection for VR? Will it support VRR in the same seamless manner as my NV GPU does G-SYNC now?
Quoting: Guest
The only way Intel will get my attention is if

#1 their pricing is "pre-pandemic." I'm not paying $1000 for a card that I could buy for $250 2 years ago. This is going to be a rough time for Intel due to this because their costs will be higher and they are going to want to recoup all that R&D and material overhead bump somewhere.

#2 they offer a decent number of outputs. I run 12 screens on my machine and the only GPU that comes close to my needs is a WX6800 with a piddly 6 outs, i.e. I'd still need TWO and they cost like $6K!

#3 Obviously they need to be on par with other brands' performance. This will probably be a pipe dream given my use case, which means I and others like me are probably not going to care about this, and it won't be competitive in a way that brings choice or better prices to the market as a whole.

#4 The drivers will have to be hella good. Right now, because I run multi-GPU, I can delegate work with no disruption. Watching videos on one GPU, surveillance and logs on another, and playing Tarkov on the main GPU: max performance, no BS... a single GPU will need to juggle many different video demands with no hiccups to catch my eye.

I definitely agree with your first point, however, while I can't say I disagree, points #2 - #4 feel way out of proportion for a company's first foray into Graphics cards. You're talking about running TWELVE screens and splitting workflow between those screens... a task you seem to claim can only be handled by two of one specific card. There's no way ARC could come close to that out of the proverbial gate (I'd be floored if wrong). You are most definitely not the "market as a whole", but, rather, a very niche customer.

Like I said, though, I very much agree with your first point. Price-wise, ARC definitely needs to target their pricing very aggressively. Intel is a big company, to be sure, but they are a serious underdog in the graphics market. NVIDIA has decades on Intel, and so does AMD to be fair. Furthermore, AMD, I feel, is heftily outperforming Intel in integrated graphics as well, and Intel has been doing IG for some time.

At minimum, I feel ARC has to be able to handle 1080p, 60Hz, High Quality at a cost of $50 to $100 less than either AMD's or NVIDIA's comparable cards, and maybe double or triple that price gap at 4K. Intel is fighting a serious uphill battle and their best bet is to target the more budget-conscious gamers willing to take the risk on new hardware.

That said... I remain very much a 1080p/60Hz gamer myself, and still, I probably wouldn't consider ARC for a generation or two.
Bumadar 9 Sep
Drivers, drivers, drivers oh and drivers.

It does not matter how good the hardware is; if the drivers are not up to par then it's pretty useless. They are very far behind the NVIDIAs and AMDs of this world right now on all three of their mentioned tiers of gaming, so tbh I think waiting until the next round might be smarter.


Last edited by Bumadar on 9 September 2022 at 6:58 pm UTC
denyasis 9 Sep
I think those are some good points. I think one thing in Intel's favor might be their driver stability. Intel graphics have "just worked" with relatively few problems compared to NVIDIA/AMD for a very, very long time.

I would assume a good part of that is that their IG isn't exactly targeting anything bleeding edge, but from the comments above, it doesn't seem they are necessarily targeting high end right away (is there a business reason for that? It doesn't make sense to me).

Either way, I'm interested in seeing more. If they can match performance of cards released in the last 3-4 years and it's relatively problem free, I think that would be a huge success.
The Intel GPUs are a letdown after all the hype. Hopefully, in the future, they will create healthy competition for NVIDIA and AMD so the consumer can benefit in the end.
Quoting: denyasis
from the comments above, it doesn't seem they are necessarily targeting high end right away (is there a business reason for that? It doesn't make sense to me).
I think more of a technical reason: That shit is really, really fucking hard to do. You need to build up expertise over years.
kit89 10 Sep
Does anyone know when these cards will be released in the EU, specifically the UK?
MiZoG 10 Sep
The A380 performs worse in gaming than the Radeon RX 6400 and consumes ~92 W, almost 30% more than the AMD card. That's not a very promising start. I would have been tempted by a low-profile offering. Not this...
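For what it's worth, the commenter's percentage can be sanity-checked with a couple of lines. The RX 6400 figure below is merely implied by the quoted numbers, not stated in the article or independently measured:

```python
# Sanity check of the power figures quoted in the comment above.
# Assumption: the A380's ~92 W draw and the "almost 30% more" claim are
# taken at face value; the RX 6400 number is derived from them.
arc_a380_w = 92.0        # quoted gaming power draw of the Arc A380
claimed_ratio = 1.30     # "almost 30% more" than the Radeon RX 6400

implied_rx6400_w = arc_a380_w / claimed_ratio
print(f"Implied RX 6400 draw: {implied_rx6400_w:.1f} W")  # -> about 70.8 W
```

So the comparison is consistent with the RX 6400 drawing roughly 70 W in gaming under those assumptions.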