Multi-core processors: hype or the real deal?

Multi-core processors offer great potential but aren't without their problems. We look at what they are

Of equal and growing importance, AMD is also working collaboratively with the software community to develop better tools for programming within both homogeneous and heterogeneous processing environments as part of our Accelerated Computing initiative. We look forward to these methods becoming part of a cross-industry, standards-based approach to overcoming the challenges and achieving the full potential of parallel processing."

A new computing paradigm

Last year, Intel demonstrated an 80-core chip that was capable of exchanging data at a speed of one terabyte a second. The company hopes to have these chips ready for commercial production within five years. Industry experts have suggested that chips like these will produce a totally new computing paradigm. I asked Intel's James Reinders whether he agreed with this view.


"Yes, I'm a huge believer that large numbers of cores will change many things. Mainframes, mini-computers, personal computers – more performance has always brought new computing paradigms. For me, I can't understand why anyone would think what we have today is 'good enough' and will be all there is in the future!"

So what is this new paradigm? What new ways will people be using computers in the future thanks to the multi-core revolution? "It takes a little imagination to envision the future. It always has," Reinders commented. "My imagination has thoughts on how it will work out. I tend to think of five things.

"First, speculation – the computer does things it thinks I will want, so it's ready when I actually ask for it. Second, modelling – the computer models the world it resides in, and adapts better to it. A far too simple example is having my computer default to 'USA' when I'm using it in the USA; a more sophisticated model is having it learn my face and notice it in photos, videos, etc. Why can a five-year-old do this so well, but no computer seems to even try to do it?

"Third, virtual reality at a level far beyond what we experience today; making that commonplace, replacing current computer graphics as completely as the VGA/CGA replaced the original monochrome text displays. Fourth, speech recognition. And lastly, eliminating the wait-cursor (the Windows hourglass). I'm sure they will turn out to be only part of what happens, and many days I worry that wait-cursors are as inevitable as death and taxes. We'll see."
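To make the first of Reinders' ideas concrete, here is a minimal sketch of speculation in software: spare cores precompute results the user will probably ask for next, so the answer is ready the moment the request actually arrives. The names (`SpeculativeRenderer`, `render_page`) are illustrative, not from any real product.

```python
# A toy illustration of speculative precomputation on spare cores/threads:
# while serving one request, quietly start computing the likely next ones.
from concurrent.futures import ThreadPoolExecutor

def render_page(page: int) -> str:
    # Stand-in for an expensive operation (layout, decoding, etc.).
    return f"rendered page {page}"

class SpeculativeRenderer:
    def __init__(self, workers: int = 4):
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.pending = {}  # page -> Future for work started speculatively

    def request(self, page: int) -> str:
        # Use a speculative result if one was started (waiting for it if
        # it hasn't finished); otherwise compute on demand.
        future = self.pending.pop(page, None)
        result = future.result() if future else render_page(page)
        # Speculate: the user will probably view the next two pages.
        for nxt in (page + 1, page + 2):
            if nxt not in self.pending:
                self.pending[nxt] = self.pool.submit(render_page, nxt)
        return result

r = SpeculativeRenderer()
print(r.request(1))  # computed on demand; pages 2 and 3 start in background
print(r.request(2))  # served from the speculative work
```

If the guess is wrong, the wasted work simply ran on cores that would otherwise have been idle, which is exactly why the technique becomes attractive as core counts grow.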