Intel could kill performance PC graphics in 2015

Are traditional graphics cards doomed?

Visual computing. Remember that? It was the buzzphrase from Intel's IDF shindig in San Francisco in 2008. No, I can't remember that far back either. I had to leaf through the dusty TechRadar annals to dig up the date.

Of course, the next big thing for Intel back then, and one of the supposed main drivers of visual computing, was the ill-fated Larrabee graphics chip. It looked extremely exciting at the time, what with its programmable architecture and x86 origins.

But it was an idea ahead of its time. PC graphics, it turns out, is pretty competitive. And Larrabee wasn't up to snuff. So it got nixed. Or rather, it morphed into a co-processor for highly parallelised industrial computing.

As far as I'm aware, nobody has really been buying Larrabee-based products as yet. It's still in the evaluation phase, with a commercial product, codenamed Knights Corner and to be sold under the Xeon Phi brand, due out later this year or maybe in 2013.

Ye olde Larrabee

But that doesn't matter. What does is that the brave new world of vector-accelerated visual computing promised by Larrabee is still on Intel's roadmap for desktop PCs. It may yet have a huge impact on your computing experience, though perhaps not as you'd imagined.

There are three closely linked issues in play here. The first is the question of when, if ever, integrated graphics will be good enough for high quality gaming. The second is whether today's model of discrete CPU and graphics chip will be entirely usurped by system-on-a-chip (SoC) designs before that can happen. The third is whether Intel will still support discrete graphics for consumer PCs when it does.

What we can say for certain is that none of this will happen with Intel's next big architectural shift, known as Haswell. You can read all about Haswell shortly in a TechRadar deep dive written by yours truly, so keep your scanners peeled for that.

But in this context, Haswell doesn't do anything all that exciting in terms of either graphics or integration for desktop PCs. It has much more powerful 3D hardware, but it's largely the same old GenX architecture as seen in the current Intel HD Graphics, just with a load more cores.

Haswell, Broadwell and Skylake

The follow-up to Haswell is Broadwell, which is essentially a 14nm die shrink of the 22nm Haswell, as decreed by Intel's Tick-Tock development cadence: new architecture, then die shrink, as the marketing patois goes.

That means it carries over the Haswell graphics architecture. It's no child of Larrabee, in other words. However, it is taking a step closer to SoC status by moving the PCH chip (otherwise known as the southbridge) onto the CPU package for all consumer models (some Haswell chips for ultrabook PCs will also have the PCH inside the CPU package).

Where things get really interesting, however, is the next chip along the line, known as Skylake, due out in 2015. Now, it's a little too far out for any of this to be set in stone. But my understanding is that Skylake will both be a true single-chip SoC and finally deliver on the Larrabee promise of a fully flexible graphics pipeline, one that blurs the line between CPU and GPU.

The question is, with Skylake's greater level of integration, will Intel leave the door open for attaching ultra high bandwidth peripherals, ie graphics cards? Or will it take the unilateral decision that its integrated graphics is good enough?
