
Intel Arc A770 GPU releases October 12th


Along with announcing their new Raptor Lake desktop CPUs, Intel has also finally announced the release date of their Arc A770 GPU: October 12th. This is their top-end GPU, which they claim "delivers 65% better peak performance versus competition on ray tracing".

Coming in at $329, it sits in line with the RTX 3060, which they previously compared it against. They're facing some fierce competition, with NVIDIA and AMD already well established in the market, although given the insane prices on NVIDIA's Ada Lovelace cards, perhaps more people will be looking at alternatives?

Very little else was said that I can find, beyond what they've previously revealed, which is pretty much everything: all the specifications are already available on their website. As a refresher, their Limited Edition is mainly the card with 16GB of GDDR6; they said previously that most of their partners will offer it with 8GB.

You can find out more on the Intel website. The XeSS SDK is also now available on GitHub.

I'm keen to see how Intel does with its first proper generation of discrete GPUs; a third vendor like this has been sorely needed in the GPU space. With open source drivers too, there's a lot for Linux users to like about it.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. Find me on Mastodon.
The comments on this article are closed.
25 comments

redman Sep 28, 2022
Quoting: ElectricPrismI'm thinking of buying one for a server or something. I very much hope the GPU War of the future is AMD vs Intel and Nvidia can go away (mostly) -- which, since EVGA is no longer making NVIDIA cards, seems likely, as they were #1.

I've phased out nearly all Intel CPUs as I've been in a Thank You AMD phase, so this is a weird turn for me -- I just acknowledge Intel has had an open source driver since forever.

Open source drivers mean a much higher probability I will buy something.

Nvidia is not going anywhere... They rule the AI and data science world with CUDA; AMD is second class there, if you can even say it has anything. Perhaps Intel and AMD can take the gamer market and the low-cost market, but where the big bucks are spent is on things like the A100.

Just my humble opinion!
psycho_driver Sep 28, 2022
Quoting: kit89I wish they had stuck it in the 200-250 bracket on price.

I think this is where the 580 will eventually settle? I agree, though: this used to be the sweet spot for price/performance in GPUs, and since the bitcoin mining craze it's been a dead zone.


Phlebiac Sep 28, 2022
Quoting: GuestSomeone, please explain that Intel math in the first picture, where 45 and 47 stand next to each other, and 47 is somehow 25% over 45.

I think they were trying to say that their benchmark score for that game improved by 25% in their latest beta drivers. In other words, their performance was terrible in that scenario, and they figured out how to fix it, rather than dropping it off the comparison list. ;-)
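To illustrate the arithmetic behind that reading: comparing the two bars directly doesn't give 25%, but a driver-to-driver improvement does. A minimal sketch (the 37.6 fps older-driver score is a hypothetical value back-calculated for illustration, not a figure from Intel's slide):

```python
def percent_change(old: float, new: float) -> float:
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100.0

# Comparing the two bars directly: 47 is only ~4.4% above 45, not 25%.
direct = percent_change(45, 47)

# If instead the 47 fps result replaced an older-driver score of
# roughly 37.6 fps, the "25% better" claim lines up.
driver_uplift = percent_change(37.6, 47)
```

So the slide only makes sense if the 25% refers to the new beta drivers versus Intel's own earlier drivers, not versus the competitor's adjacent bar.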
jordicoma Sep 28, 2022
I would like to see an Imagination (the old PowerVR) desktop card. I don't know how good the Linux drivers are.
https://www.imaginationtech.com/products/gpu/img-cxt-gpu/
They invented the hardware ray tracing accelerator, and since it can run in mobile, it could probably scale well to desktop with good power efficiency.
llorton Sep 28, 2022
Keeping an eye on the A380, I wonder how it compares to my RX560 and if it could be an upgrade.
Phlebiac Sep 28, 2022
Quoting: jordicomaI would like to see an imagination (the old powervr) desktop card.

They tried that years ago; it was a total flop. Of course, the same could be said for Intel...
sarmad Sep 28, 2022
Quoting: denyasisI'm definitely interested. But I want to see some good solid benchmarks. I know Ray tracing is all the rage, but I'd really like to see how it compares to cards from the last few years in general gaming.

If it's a good step up from, say, my 1070 Ti, and has a decent cooler, at that price, that's an upgrade I can afford!

I would say don't count on ray tracing just yet. I was trying Quake the other night and was getting around 500fps without ray tracing on my RTX 3060 laptop. After switching on ray tracing, that went down to around 50fps. This was on an FHD screen; on a WQHD (3440x1440) screen that number went down to around 25fps. So the penalty you pay for RT is still pretty large, and can probably only be afforded on the RTX 40 series or something. If you have a big screen (a big TV, for example), then 4K resolution plus a higher frame rate gives you a better payoff than ray tracing.
denyasis Sep 29, 2022
Quoting: sarmad
Quoting: denyasisI'm definitely interested. But I want to see some good solid benchmarks. I know Ray tracing is all the rage, but I'd really like to see how it compares to cards from the last few years in general gaming.

If it's a good step up from, say my 1070ti and has a decent cooler, at that price, that's an upgrade I can afford!

I would say don't count on ray tracing just yet. I was trying Quake the other night and was getting around 500fps without ray tracing on my RTX 3060 laptop. After switching on ray tracing, that went down to around 50fps. This was on an FHD screen; on a WQHD (3440x1440) screen that number went down to around 25fps. So the penalty you pay for RT is still pretty large, and can probably only be afforded on the RTX 40 series or something. If you have a big screen (a big TV, for example), then 4K resolution plus a higher frame rate gives you a better payoff than ray tracing.

Thanks for the info! I must offer my apologies as well; I realized my post was not written well. I'm not really interested in ray tracing, but rather in general purpose gaming benchmarks. I should have been clearer about what I'm looking at. Ray tracing is neat and all, but like you indicated, the tech doesn't really seem ready for general use.
Matombo Sep 29, 2022
XeSS. I'm gona call it Xtreme-SS! Oh wait ... I shouldn't do that ...
syylk Sep 29, 2022
Quoting: redman
Quoting: ElectricPrismI'm thinking of buying one for a server or something. [...]

Nvidia is not going anywhere... They rule the AI and data science world with CUDA; AMD is second class there, if you can even say it has anything. Perhaps Intel and AMD can take the gamer market and the low-cost market, but where the big bucks are spent is on things like the A100.

Just my humble opinion!
Glad someone mentioned it.

Nvidia is printing money in the tensor core field. They absolutely dominate the HPC/Top500 rankings, and on their homepage "AI" is far more prominent than games. If you check it right now, besides the GTC announcements, the "Solutions" and "For You" dropdowns both suggest what nV is concentrating on.

At this point in time, their gaming products are just a side-effect of their main focus, not the other way around.