Do new CPUs threaten Nvidia's future?

What price performance?

For proof, he points to the contrasting fortunes of CPU and GPU pricing in recent years. The current CPU price war proves consumers are not sold on the latest high-performance multi-core chips, and yet buyers continue to pay a stiff premium for high-end 3D chips.

There's also no doubting the enormous difference between a high-end graphics card and an entry-level item in terms of the end-user experience. It's much, much larger than the typical gap between budget and premium CPUs. If you want decent graphics performance, discrete will be the way to go for years to come.

But what about the threat from Larrabee, Intel's first real effort to crack the discrete graphics market, due out late next year? Is the assumption that ray-tracing will be the next big thing in 3D graphics accurate? Not exactly, according to Kirk, who says, "there's nothing new about ray-tracing".

Historically, ray-tracing hasn't been used for real-time rendering because it is extraordinarily computationally expensive, and that remains the case today. For many operations rasterisation does a very good job and does it 100 times faster than ray-tracing. However, there are areas where ray-tracing can be used efficiently to increase realism.
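To see where that cost comes from, consider the core operation of any ray tracer: testing a ray against an object. The sketch below (illustrative Python, not anything from Nvidia or Intel) shows the quadratic solve behind a single ray-sphere test. A brute-force tracer repeats this for every pixel's ray against every object in the scene, before it has shaded anything at all.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic that must be evaluated for every ray/object pair.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# A naive tracer repeats this test width * height * objects times --
# millions of quadratic solves per frame before shading, reflection or
# shadow rays are even considered. Rasterisation avoids this per-ray
# search entirely, which is where its speed advantage comes from.
hit = ray_sphere_hit((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 1.0)
```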

Hybrid rendering

The future according to Kirk is therefore much more likely to involve a hybrid approach to rendering. "Ray-tracing will not replace rasterisation. But it will add to our bag of tricks." In any case, Kirk says, there's no reason to assume that Nvidia's GPUs won't be extremely good at ray-tracing. Either way, the implication is that Intel's Larrabee will have an extremely tough fight on its hands.
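One common way such a hybrid could work is to let the fast rasteriser resolve what each pixel sees, then spend rays only on the effects rasterisation handles poorly, such as mirror reflections. The sketch below is a minimal illustration of that idea, not Nvidia's actual pipeline; the `raster_hit` record format and `trace_ray` callback are assumptions invented for the example.

```python
def shade_pixel(raster_hit, scene, trace_ray):
    """Hybrid shading: start from the rasteriser's cheap result and pay
    the ray-tracing cost only where it buys visible realism.

    raster_hit: per-pixel output of the raster pass (illustrative keys:
                'color', 'reflectivity', 'point', 'reflect_dir').
    trace_ray:  a ray-tracing fallback, called selectively.
    """
    color = raster_hit["color"]
    k = raster_hit["reflectivity"]
    if k > 0.0:  # only reflective surfaces trigger a traced ray
        reflected = trace_ray(raster_hit["point"],
                              raster_hit["reflect_dir"], scene)
        color = tuple((1.0 - k) * c + k * r
                      for c, r in zip(color, reflected))
    return color

# Demo with a stand-in tracer that always returns white.
mirror = {"color": (0.0, 0.0, 0.0), "reflectivity": 0.5,
          "point": (0.0, 0.0, 0.0), "reflect_dir": (0.0, 0.0, 1.0)}
blended = shade_pixel(mirror, None, lambda p, d, s: (1.0, 1.0, 1.0))
```

The design point is that the expensive path is opt-in per pixel: most of the frame never touches the tracer, which is what makes the hybrid affordable in real time.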

As for the suggestion that CPUs and GPUs are converging towards a single, floating-point solution, Kirk simply isn't having it. "Even the latest multi-core CPUs only offer a small fraction of the floating point power of Nvidia's fastest GPUs," he says. If anything, this performance advantage will mean so-called general-purpose applications for Nvidia's GPUs (known as GPGPU for short) are likely to win an increasing share of the market for computationally intensive workloads.

"We've gained lots of traction in the scientific community. Molecular modelling, astrophysics, climate modelling - all of these are highly parallel tasks that demand much more performance than is currently available."
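What makes tasks like molecular modelling "highly parallel" is that each body's result depends only on the input data, not on any other body's result. The toy n-body force sum below (plain Python, invented for illustration) shows the shape: the outer loop has no cross-iteration dependencies, so in a real GPGPU kernel each iteration would run as its own GPU thread, thousands at a time.

```python
def pairwise_forces(positions):
    """O(n^2) gravitational-style force sum over 2D point masses.

    Each body's force reads all positions but writes only its own
    result, so the outer loop is embarrassingly parallel -- on a GPU
    it would map to one thread per body; here it runs serially.
    """
    n = len(positions)
    forces = []
    for i in range(n):                       # <- the parallel dimension
        fx = fy = 0.0
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            r2 = dx * dx + dy * dy + 1e-9    # softening avoids div-by-zero
            inv = 1.0 / (r2 * r2 ** 0.5)     # 1 / r^3, for F ~ d / r^3
            fx += dx * inv
            fy += dy * inv
        forces.append((fx, fy))
    return forces
```

This is the kind of workload where a chip with hundreds of floating-point units beats a handful of fast general-purpose cores: the arithmetic dominates and every thread does the same work on different data.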

Not that Kirk thinks that GPUs will replace CPUs. He accepts that the need for truly general-purpose processors will remain for the foreseeable future. But so will the demand for the increasingly flexible and powerful co-processor that is the modern GPU - preferably Nvidia's.