ATI Radeon HD 5870

Given the immense specifications, the fact that the 5870 turns out to be the fastest single-GPU graphics card ever comes as no surprise. But there are some intriguing details to note.

Take Far Cry 2, something of a staple shooter on the PC. Whatever the resolution, the 5870 beats Nvidia's GeForce GTX 285 by a bigger margin than the GTX in turn beats AMD's outgoing champ, the Radeon HD 4890.

Shift the focus to a more strategy-orientated game such as World in Conflict and the pecking order predictably tightens up. The 5870 is still on top, however.

But what about that most lethal destroyer of graphics performance, the shader-soaked visual masterpiece that is Crytek's Crysis: Warhead?

Well, the 5870 is the first card to average over 30fps and thereby deliver tolerably playable frame rates at full HD resolution with all the eye candy enabled.

Never mind the fact that Crysis: Warhead is a rather tedious game to play. The real-time visuals as powered by a Radeon HD 5870 must rate as one of the wonders of the modern world.

What's more, despite the ability to crank out these awesome frame rates, the 5870 manages to consume less power than the GTX 285, both at idle and at full pelt.


Indeed, it's worth remembering that all of the above is achieved running existing games under DirectX 10. If AMD is to be believed, it's a simple matter for developers to port their existing titles to DirectX 11 and unleash a further performance boost as great as 25 to 30 per cent.

All of which means the new Radeon HD 5870 is a no-brainer if money is no object. It's the very fastest graphics chip available to mankind and that knowledge is a beautiful thing if there's one humming away in your PC.

Back in the real world, affordability does matter. If AMD had hit the same £200 price point as it achieved with the 4870 at launch, we'd be hailing the 5870 as the greatest graphics card of all time.

It's still a very, very good card. But at £300 it's simply much less relevant. The passage of time will no doubt fix the pricing problem. But by then, who knows what magic Nvidia may have worked with its own DX11 monster.
