Intel Arc is still doomed, and this could be the final nail in the coffin

Intel Arc Pro A-series GPUs (Image credit: Intel)

Oh, Intel Arc, you were supposed to be something special. I remember first hearing about Intel’s big plans for entering the discrete GPU market, back when I was but a fresh-faced freelancer breaking into tech journalism. It was exciting: finally, a third major player in the graphics card game, something to light a fire underneath Nvidia and AMD, driving competition and innovation. You were the Chosen One, Arc.

But like Obi-Wan leaving his scorched mentee beside a river of lava, I have to walk away. I've wanted Intel Arc to be good for so long, and now it's finally here, and I feel nothing but hollow disappointment. I'm not angry with Intel; the tech giant could never have foreseen the devastation that COVID would wreak on its big GPU plans. But it sucks, and the news I read today only confirmed that downfall.

Yes, there's more bad news on the Arc front, just days after I penned this article about how Intel's GPU venture was on the back foot before the fight even began. Long-standing graphics market analysis firm Jon Peddie Research recently published an editorial on the state of AXG, Intel's Accelerated Computing Systems and Graphics Group. Spoiler alert: it's not good.

Gunnir's new Arc A380 Photon GPU (Image credit: Gunnir, Intel)

Will Gelsinger put Arc out of its misery?

Jon Peddie’s report began by highlighting just how much Intel has cut costs in recent years. Intel CEO Pat Gelsinger has demonstrated that he’s willing to kill off projects that don’t turn a profit, from Optane to the firm’s ill-fated drone department. Intel’s most recent financial report confirmed that the Arc project has cost it $2.1 billion, which would put it squarely on Gelsinger’s chopping block.

It gets worse. Peddie's analysis estimated that the true figure could be $3.5 billion or more: a staggering amount of money for Intel to lose, with very little to show for it. The delays were bad enough, but for Arc to struggle upon release even after years of preparation was nothing short of a disaster.

Quiet releases in Asian markets (mostly in pre-built PCs and laptops, with very few discrete desktop GPUs going on sale) didn't help either. With little to no fanfare around the launch, many consumers have either found it hard to care about Arc or missed the boat entirely and aren't even aware it exists. News that one of Intel's laptop production partners (as yet unnamed) was dropping the project did nothing to help matters.

Intel has clearly been running damage control, with chief architect Raja Koduri frequently taking to Twitter to assuage people's concerns and assure them that yes, Arc is coming for real, and it'll be good. Intel even released a video showcasing the Arc A750 (which is not yet available to consumers) just barely outperforming Nvidia's RTX 3060 on average across almost 50 games.

Pat Gelsinger on a conference stage (Image credit: Horacio Villalobos for Corbis/Getty Images)

Where do we go from here?

Beating the eighteen-month-old RTX 3060 isn't going to be enough, and all the ersatz laughter in that YouTube clip won't convince me otherwise. Let's face it: Nvidia's RTX 4000 series will be here before long, AMD's next-gen GPUs too, and when that happens Intel will be in serious trouble.

There are some small (very small) glimmers of hope. If Intel sells off its AXG arm in line with Peddie's recommendation, there's a chance it could be snapped up by a firm other than AMD or Nvidia, potentially creating new competition for the two GPU giants. If Intel decides to keep it, the recently announced Arc Pro A-series GPUs might see the company's discrete graphics tech find a home in business workstations and laptops.

I won't hold my breath, though. With everything that's gone wrong, I agree with Jon Peddie: Intel should cut its losses and put the Arc program to bed. Focusing on integrated graphics looks like the smarter move right now, as a counter to AMD's push into CPU-integrated graphics and Apple's mighty M1 and M2 chips.

When you look at what modern game consoles can do with AMD chips and no discrete graphics card, it makes you wonder how much longer dedicated GPUs will be around. I hate to say it, but this might be exactly the right time for Intel to cut and run. Whether Team Blue will do that remains to be seen; their messaging right now certainly makes it seem like they're committed to the bit, but at this point? I really hope they're not.

Christian Guyton
Editor, Computing

Christian is TechRadar’s UK-based Computing Editor. He came to us from Maximum PC magazine, where he fell in love with computer hardware and building PCs. He was a regular fixture amongst our freelance review team before making the jump to TechRadar, and can usually be found drooling over the latest high-end graphics card or gaming laptop before looking at his bank account balance and crying.

Christian is a keen campaigner for LGBTQ+ rights and the owner of a charming rescue dog named Lucy, having adopted her after he beat cancer in 2021. She keeps him fit and healthy through a combination of face-licking and long walks, and only occasionally barks at him to demand treats when he’s trying to work from home.
