Do new CPUs threaten Nvidia's future?
Chief scientist David Kirk talks ray tracing, Fusion and Intel
What price performance?
For proof, Kirk points to the contrasting fortunes of CPU and GPU pricing in recent years. The current CPU price war shows consumers are not sold on the latest high-performance multi-core chips. And yet buyers continue to pay a stiff premium for high-end 3D chips.
There's also no doubting the enormous difference between a high-end graphics card and an entry-level one in terms of the end-user experience. It's far larger than the typical gap between budget and premium CPUs. If you want decent graphics performance, discrete graphics will be the way to go for years to come.
But what about the threat from Larrabee, Intel's first real effort to crack the discrete graphics market, due out late next year? Is the assumption that ray tracing will be the next big thing in 3D graphics accurate? Not exactly, according to Kirk, who says, "there's nothing new about ray-tracing".
Historically, ray tracing hasn't been used for real-time rendering because it is extraordinarily computationally expensive. That remains the case today. For many operations, rasterisation does a very good job and does it 100 times faster than ray tracing. However, there are areas where ray tracing can be used efficiently to increase realism.
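To see where that cost comes from, consider the inner loop of a naive ray caster. The sketch below is purely illustrative, not Nvidia's or Intel's code; the primaryRays kernel, the camera model and the grid of spheres are all assumptions for the example. The point it makes is simple: every pixel's ray is tested against every object, so the work scales with pixels times objects before a single bounced ray is cast.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

struct Sphere { float cx, cy, cz, r; };

__global__ void primaryRays(const Sphere* spheres, int numSpheres,
                            float* depth, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Camera at the origin looking down +z; one primary ray per pixel.
    float dx = (x - width  * 0.5f) / width;
    float dy = (y - height * 0.5f) / height;
    float dz = 1.0f;

    float nearest = 1e30f;
    // The expensive part: every ray is tested against every object,
    // so total work scales with (pixels x objects).
    for (int i = 0; i < numSpheres; ++i) {
        Sphere s = spheres[i];
        // Ray-sphere test: solve |t*d - c|^2 = r^2 for the ray parameter t.
        float a = dx*dx + dy*dy + dz*dz;
        float b = -2.0f * (s.cx*dx + s.cy*dy + s.cz*dz);
        float c = s.cx*s.cx + s.cy*s.cy + s.cz*s.cz - s.r*s.r;
        float disc = b*b - 4.0f*a*c;
        if (disc >= 0.0f) {
            float t = (-b - sqrtf(disc)) / (2.0f * a);
            if (t > 0.0f && t < nearest) nearest = t;
        }
    }
    depth[y * width + x] = nearest;   // nearest hit, or "miss" sentinel
}

int main()
{
    const int W = 640, H = 480, N = 256;
    Sphere* spheres; float* depth;
    cudaMallocManaged(&spheres, N * sizeof(Sphere));
    cudaMallocManaged(&depth, W * H * sizeof(float));
    for (int i = 0; i < N; ++i)       // a 16x16 grid of small spheres
        spheres[i] = { (i % 16) - 8.0f, (i / 16) - 8.0f, 20.0f, 0.4f };
    dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
    primaryRays<<<grid, block>>>(spheres, N, depth, W, H);
    cudaDeviceSynchronize();
    printf("depth at centre pixel: %f\n", depth[(H / 2) * W + W / 2]);
    return 0;
}
```

Rasterisation sidesteps that loop entirely by projecting each triangle once and filling the pixels it covers, which is the shape of the speed gap Kirk is describing.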
Hybrid rendering
The future, according to Kirk, is therefore much more likely to involve a hybrid approach to rendering. "Ray-tracing will not replace rasterisation. But it will add to our bag of tricks." In any case, Kirk says, there's no reason to assume that Nvidia's GPUs won't be extremely good at ray tracing. Either way, the implication is that Intel's Larrabee will have an extremely tough fight on its hands.
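What might that bag of tricks look like in practice? One common hybrid pattern is to let rasterisation settle which surface each pixel sees, then spend rays only on effects rasterisation handles poorly, such as hard shadows. The sketch below assumes a raster pass has already produced per-pixel surface positions; the shadowPass kernel, the single-sphere occluder and every name in it are hypothetical, chosen only to show the division of labour.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

// Does the segment from surface point p toward the light hit the occluder?
__device__ bool occluded(Vec3 p, Vec3 light, Vec3 c, float r)
{
    Vec3 d  = { light.x - p.x, light.y - p.y, light.z - p.z };
    Vec3 oc = { p.x - c.x, p.y - c.y, p.z - c.z };
    float a = d.x*d.x + d.y*d.y + d.z*d.z;
    float b = 2.f * (oc.x*d.x + oc.y*d.y + oc.z*d.z);
    float k = oc.x*oc.x + oc.y*oc.y + oc.z*oc.z - r*r;
    float disc = b*b - 4.f*a*k;
    if (disc < 0.f) return false;
    float t = (-b - sqrtf(disc)) / (2.f * a);
    return t > 1e-4f && t < 1.f;      // hit lies between surface and light
}

// One shadow ray per pixel; the rasterised lighting result would then
// simply be multiplied by the shade factor this pass produces.
__global__ void shadowPass(const Vec3* pos, float* shade, int n,
                           Vec3 light, Vec3 sphereC, float sphereR)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    shade[i] = occluded(pos[i], light, sphereC, sphereR) ? 0.f : 1.f;
}

int main()
{
    const int n = 8;
    Vec3* pos; float* shade;
    cudaMallocManaged(&pos, n * sizeof(Vec3));
    cudaMallocManaged(&shade, n * sizeof(float));
    for (int i = 0; i < n; ++i)       // stand-in for rasterised hit points
        pos[i] = { (float)i - 4.f, 0.f, 0.f };
    shadowPass<<<1, 32>>>(pos, shade, n,
                          Vec3{0.f, 10.f, 0.f},      // light overhead
                          Vec3{0.f, 5.f, 0.f}, 1.f); // occluder in between
    cudaDeviceSynchronize();
    for (int i = 0; i < n; ++i) printf("%.0f ", shade[i]);
    printf("\n");
    return 0;
}
```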
As for the suggestion that CPUs and GPUs are converging on a single floating-point architecture, Kirk simply isn't having it. "Even the latest multi-core CPUs only offer a small fraction of the floating point power of Nvidia's fastest GPUs," he says. If anything, this performance advantage means so-called general-purpose applications for Nvidia's GPUs (GPGPU for short) are likely to win an increasing share of the market for the most computationally intensive workloads.
"We've gained lots of traction in the scientific community. Molecular modelling, astrophysics, climate modelling - all of these are highly parallel tasks that demand much more performance than is currently available."
Not that Kirk thinks GPUs will replace CPUs. He accepts that the need for truly general-purpose processors will remain for the foreseeable future. But so will the demand for the increasingly flexible and powerful co-processor that is the modern GPU - preferably Nvidia's.