Can a PC graphics card be too powerful? As cards have become ever larger, more expensive and complex in recent years, that question has become a key one. Intriguingly, the two big players in computer graphics – AMD and Nvidia – have different answers.
Nvidia has been sticking to its traditional approach of simply building the biggest graphics chips possible – the latest being the monumental GeForce GTX 480.
AMD, however, has gone in the opposite direction. With both of its previous flagship GPUs – the Radeon HD 3800 and Radeon HD 4800 – AMD wound back the afterburners and delivered slightly smaller, more cost-effective chips.
According to AMD, the aim is to maximise performance at the relatively accessible £200 price point. That makes for a more relevant graphics card than the sort of £400 monster few can afford.
It was slightly surprising, therefore, to find that AMD's latest uber-GPU – the Radeon HD 5870 – is a bit of an old-school bruiser. Launched late last year, the 5870 represents a near-doubling of the Radeon HD 4800's specifications. The stream shader count explodes from 800 for the Radeon HD 4800 to a scarcely believable 1,600.
Likewise the 5870 packs twice as many texture and render outputs, at 80 and 32 respectively. Of course, the 5870 was also AMD's first GPU to deliver support for DirectX 11, Microsoft's latest multimedia API.
With that comes a requirement for new hardware features. Most notable is the tessellator – a sort of geometry creation engine designed to improve the realism of 3D graphics. Think curvier, more detailed surfaces and fewer boxy-looking objects and you'll get the idea.
Inevitably, the added features only make the 5870 even more complex. Put it all together and you have a chip containing 2.15 billion transistors. Not only is that well over double the 4800's count, it's also around 80 per cent more than that of any currently available CPU. Make no mistake: this is an incredibly complex chip.
Power vs value
The problem is, despite the use of the latest 40nm silicon production process, at around 330mm² the 5870 is much larger than the 265mm² 4800. Larger chips are of course more expensive. Even now, several months after launch, the cheapest 5870s command more than £300.
With that in mind, you might think that Sapphire is going in entirely the wrong direction with its newest Radeon HD 5870 board. At £399 it's the most expensive we've seen yet. What could possibly justify the added cost?
Most obvious is the doubling of graphics memory from 1GB to 2GB. Cards with added graphics memory often tend to be marketing gimmicks, but Sapphire would argue that extra memory can be a major benefit when running games at ultra-high resolution and detail settings – just the sort of workloads this card is designed for.
For good measure Sapphire has also upped the core and memory clock speeds from 850MHz and 4.8GHz to 926MHz and 4.9GHz. So even without the added memory, it ought to be faster than a standard 5870. And so it proves.
Admittedly, the advantage in most benchmarks is marginal: perhaps one or two more frames rendered per second. The real difference comes when running the most technically advanced and demanding 3D games. The classic example here is Crytek's magisterial Crysis: Warhead. It's been around for nearly 18 months, but it's still the most visually stunning 3D engine yet created.
More importantly, it consumes astonishing amounts of graphics memory, particularly when running at super-high resolutions such as 2,560 x 1,600. Crysis: Warhead demands so much memory that the standard 1GB 5870 can't cope and is forced to fall back onto system memory to accommodate some of the texture, geometry and shader data. Such swapping of data over the PCI Express bus can really hurt performance, explaining why Sapphire's 2GB card is sometimes 50 per cent faster when running Crysis.
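As a rough illustration of why 1GB can run out at that resolution, here is a back-of-envelope tally. The frame buffer maths is straightforward; the texture, geometry and shader figures are purely assumptions for the sketch, not measured numbers from Crysis: Warhead.

```python
# Back-of-envelope estimate of VRAM use at 2,560 x 1,600.
# Asset pool sizes below are illustrative assumptions, not game data.

def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=3):
    """Colour, depth and back buffers for one frame at a given resolution."""
    return width * height * bytes_per_pixel * buffers

frame = framebuffer_bytes(2560, 1600)     # roughly 47MB per set of buffers
aa_multiplier = 4                         # assume 4x multisample AA
textures = 800 * 1024**2                  # assumed texture pool
geometry_and_shaders = 200 * 1024**2      # assumed geometry/shader data

total = frame * aa_multiplier + textures + geometry_and_shaders
print(f"Estimated VRAM footprint: {total / 1024**2:.0f}MB")
```

With even these modest assumptions the total clears 1GB, so a 1GB card would be forced to spill the remainder into system memory over PCI Express – the swapping penalty described above.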
But is that worth nearly £100 extra? Rationally speaking, it's hard to justify. But for those willing to spend £300 in the first place, the future-proofing that comes with 2GB of memory might actually make sense...