Intel Finally Shows Off Actual Arc Alchemist Desktop Graphics Card

silversurfer

Level 85
Thread author
Verified
Honorary Member
Top Poster
Content Creator
Malware Hunter
Well-known
Aug 17, 2014
10,048
In the past few months, Intel has talked quite a lot about its upcoming discrete graphics cards for desktop PCs, without actually showing them in person, at least until now. At its Intel Extreme Masters pro gaming tournament in Texas today, the company demonstrated its Arc Alchemist 700-series board, powered by the ACM-G10 GPU. Only time (and lots of our testing) will tell if this card or one of its variants earns a spot on our best graphics cards for gaming list.

Specifically, the card that Intel showcased at IEM is called the Intel Arc Limited Edition. It was photographed and tweeted by Bryce_GfxDriverGuru, an Intel Arc Community Advocate, so consider the demonstration official. Intel showed a rendering of its Arc Limited Edition graphics board back in March and has since revealed some important details about the product.

First up, we know that the board is based on the 'big' Arc Alchemist GPU known as the ACM-G10, and it features up to 32 Xe cores (or 4096 stream processors, if you wish). Secondly, we also know that the graphics processor is paired with 16GB of GDDR6 memory. Thirdly, the board has four DisplayPorts and one HDMI port.
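(For anyone doing the math on that conversion: each Alchemist Xe core contains 16 vector engines with 8 FP32 lanes apiece, i.e. 128 "stream processors" per core, so 32 x 128 = 4096.)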

One of the mysteries about the Arc Limited Edition product is whether it is powered by the Arc Alchemist A770 or the A780 GPU model. The latter is obviously the faster of the two, but whether it is used on the Limited Edition model is something that even the folks at VideoCardz, who tend to keep their ears close to the ground, do not seem to know.
 

SpiderWeb

Level 10
Verified
Well-known
Aug 21, 2020
468
What's the price, and even more important, do they support raytracing? Intel could not have chosen a worse time to enter the GPU market, when AMD and Nvidia are aggressively transitioning to raytracing, which is computationally expensive and requires lots of AI and software trickery. I mean, Intel has the engineers, but do they have the willpower to understand that they need to be competitive to have a chance in this industry?
 
  • Like
Reactions: Vasudev

cruelsister

Level 42
Verified
Honorary Member
Top Poster
Content Creator
Well-known
Apr 13, 2013
3,133
I mean, Intel has the engineers, but do they have the willpower to understand that they need to be competitive to have a chance in this industry?
As they have had great issues moving off the 14nm node for their CPUs (and the move to 7nm has once again been delayed until 2023-2024), I wouldn't hold my breath for any earth-shaking developments.

Intel has cutting-edge PR, but products? Not so much...
 

Local Host

What's the price, and even more important, do they support raytracing? Intel could not have chosen a worse time to enter the GPU market, when AMD and Nvidia are aggressively transitioning to raytracing, which is computationally expensive and requires lots of AI and software trickery. I mean, Intel has the engineers, but do they have the willpower to understand that they need to be competitive to have a chance in this industry?
Raytracing does not rely on AI whatsoever (Nvidia's RT cores do not use AI, they just make the raytracing calculations faster), and it is a niche feature with huge performance costs on both AMD and Nvidia. Either way, I can't see anyone buying Intel GPUs no matter how good they end up being.
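To make that concrete, here is a toy sketch (my own illustration, not anything from Nvidia) of the kind of ray-triangle intersection math that RT cores run in fixed-function hardware. A frame fires millions of these tests; the RT cores just answer them faster than shader code would, and there is no AI anywhere in it.

Python:
# Moller-Trumbore ray/triangle intersection: the sort of arithmetic RT cores
# accelerate in fixed-function hardware. No AI anywhere in sight.
import numpy as np

def ray_triangle_hit(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit distance t along the ray, or None if it misses."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                 # ray is parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det
    return t if t > eps else None

# Ray fired straight down the -Z axis at a triangle lying in the Z=0 plane.
print(ray_triangle_hit(np.array([0.2, 0.2, 1.0]), np.array([0.0, 0.0, -1.0]),
                       np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                       np.array([0.0, 1.0, 0.0])))   # -> 1.0 (a hit, one unit away)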

People are already reluctant to switch between AMD and Nvidia, not to mention Intel drivers are extremely basic. If Intel tries to apply anti-consumer tactics like they do in the CPU market, they'll find Nvidia won't be as kind as AMD in terms of lawsuits.
 

Vasudev

Level 33
Verified
Nov 8, 2014
2,224
If Intel had released the cards 6 months ago with sub-$400 pricing inclusive of taxes, I'd have bought one and tried it in an eGPU, because $3,000 for low-end GPUs from the red or green team didn't make sense, thanks to scalper prices. Anyway, I don't think people are ready to buy them, since RDNA3 and Lovelace are incoming, which means Intel's cards are DOA. If prices are sub-$300 there might be some sales, but most of those will be OEM models, since Intel CPU + Intel GPU bundle pricing will be a sweet deal.
 
  • Like
Reactions: plat

SpiderWeb

Level 10
Verified
Well-known
Aug 21, 2020
468
Raytracing does not rely on AI whatsoever (Nvidia's RT cores do not use AI, they just make the raytracing calculations faster), and it is a niche feature with huge performance costs on both AMD and Nvidia. Either way, I can't see anyone buying Intel GPUs no matter how good they end up being.

People are already reluctant to switch between AMD and Nvidia, not to mention Intel drivers are extremely basic. If Intel tries to apply anti-consumer tactics like they do in the CPU market, they'll find Nvidia won't be as kind as AMD in terms of lawsuits.
Oh, I thought I heard Jensen say that they use deep learning because the GPU simply cannot raytrace everything, so it just renders a patchy/noisy image and their software "predicts" the blanks and denoises the image? Introducing the NVIDIA RTX Ray Tracing Platform

You might ultimately be right. My point is, Nvidia and AMD understand that they bit off more than they could chew by pushing raytracing when we still don't have the computational power, so they often just use computational approximation to achieve the raytracing look, which takes skill to do right but, when done right, is barely distinguishable from the real thing. Even though raytracing in games is still a gimmick, I don't think people are willing to pay $$$ for a high-end GPU that doesn't support it when the other GPUs do.
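Here's a toy sketch I put together of what I mean by "patchy/noisy image plus prediction": trace far too few rays per pixel, then let a denoiser fill in the blanks. A plain 3x3 box blur stands in for Nvidia's trained denoiser here, purely for illustration.

Python:
# Toy version of "trace too few rays, then denoise": a 2-sample Monte Carlo
# estimate of a smooth gradient is very noisy, and even a dumb 3x3 box blur
# (standing in for a real, trained denoiser) recovers most of the image.
import numpy as np

rng = np.random.default_rng(0)
H, W, SAMPLES = 64, 64, 2                      # only 2 "rays" per pixel

truth = np.linspace(0.0, 1.0, W)[None, :].repeat(H, axis=0)   # ideal result
samples = truth[..., None] + rng.normal(0.0, 0.5, size=(H, W, SAMPLES))
noisy = samples.mean(axis=-1)                  # what the cheap render produces

padded = np.pad(noisy, 1, mode="edge")         # 3x3 neighbourhood average
denoised = sum(padded[dy:dy + H, dx:dx + W]
               for dy in range(3) for dx in range(3)) / 9.0

for name, img in (("noisy", noisy), ("denoised", denoised)):
    print(name, "mean abs error:", round(float(np.abs(img - truth).mean()), 3))
# The denoised frame lands much closer to the ground truth than the raw render.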
 
  • Like
Reactions: Vasudev

Local Host

If Intel had released the cards 6 months ago with sub-$400 pricing inclusive of taxes, I'd have bought one and tried it in an eGPU, because $3,000 for low-end GPUs from the red or green team didn't make sense, thanks to scalper prices. Anyway, I don't think people are ready to buy them, since RDNA3 and Lovelace are incoming, which means Intel's cards are DOA. If prices are sub-$300 there might be some sales, but most of those will be OEM models, since Intel CPU + Intel GPU bundle pricing will be a sweet deal.
I wonder where you live to see low-end GPUs at $3,000; I haven't seen a single low-end GPU beyond $200. Honestly, the high-end GPUs have been around $900-1,500 during the whole inflation, and no GPU was even near $2,000.
Oh, I thought I heard Jensen say that they use deep learning because the GPU simply cannot raytrace everything, so it just renders a patchy/noisy image and their software "predicts" the blanks and denoises the image? Introducing the NVIDIA RTX Ray Tracing Platform

You might ultimately be right. My point is, Nvidia and AMD understand that they bit off more than they could chew by pushing raytracing when we still don't have the computational power, so they often just use computational approximation to achieve the raytracing look, which takes skill to do right but, when done right, is barely distinguishable from the real thing. Even though raytracing in games is still a gimmick, I don't think people are willing to pay $$$ for a high-end GPU that doesn't support it when the other GPUs do.
You're confusing ray tracing with DLSS, which is used to alleviate the RT performance loss. I personally care as much about ray tracing as I cared about 3D, which means I don't, and most of my social circle pretty much agrees with me.
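Just to show what I mean by DLSS being about performance rather than about ray tracing itself, here's a caricature of the idea (my own toy sketch; the real thing is a trained network fed with motion vectors, not a dumb resize):

Python:
# The core DLSS idea in caricature: shade only a quarter of the pixels, then
# reconstruct the full frame. Real DLSS uses a trained network plus motion
# vectors; nearest-neighbour duplication here is only a stand-in.
import numpy as np

H, W = 8, 8                                      # output (display) resolution
low_res = np.arange((H // 2) * (W // 2), dtype=float).reshape(H // 2, W // 2)

high_res = low_res.repeat(2, axis=0).repeat(2, axis=1)   # upscale 2x per axis

print("pixels actually shaded:", low_res.size)   # 16
print("pixels sent to screen: ", high_res.size)  # 64 -> ~4x less shading work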
 
Last edited:

SpiderWeb

Level 10
Verified
Well-known
Aug 21, 2020
468
I wonder where you live to see low-end GPUs at $3,000; I haven't seen a single low-end GPU beyond $200. Honestly, the high-end GPUs have been around $900-1,500 during the whole inflation, and no GPU was even near $2,000.

You're confusing ray tracing with DLSS, which is used to alleviate the RT performance loss. I personally care as much about ray tracing as I cared about 3D, which means I don't, and most of my social circle pretty much agrees with me.
You are right that they are only using AI through DLSS in real-time raytracing, but it plays an essential role in RTX, and you make it sound like it wasn't created by Nvidia in the course of trying to achieve real-time raytracing. RTX is a product of all of Nvidia's pipelines working in unison to achieve the effect. They do have an AI raytracing denoiser (OptiX), but for now it's only being utilized in Maya, Cinema 4D, DAZ, Blender-type applications. And they have used deep learning, heuristics, and approximation to optimize their raytracing pipeline. Regardless, my point is that I do not believe it's a gimmick like 3D. Lighting is the final frontier in making graphics look more believable, raytracing is the solution, and if Intel is not taking this seriously, they will simply not sell GPUs.


Time stamp 1:00:50 (hour)

 
  • Like
Reactions: Vasudev
