ATI Radeon HD 3870 X2 review

Double your gaming pleasure, double your gaming fun?

TechRadar Verdict

Much better than we were expecting. But we'd still rather have a powerful single-GPU card.

Pros

  • Huge performance – when it works
  • It seems to work most of the time

Cons

  • Dependent on Crossfire driver profiles
  • Big, hot and very power hungry


Need some extra speed? Just bung on a few more cores. That's the prevailing philosophy guiding the development of PC processors these days. In truth, it's a pretty effective approach and seems to be the only viable option for making the most of the ever-burgeoning transistor counts that shrinking silicon production technology has enabled.

The same sort of thinking has been creeping into graphics technology of late, too. First, NVIDIA rolled out its dual-GPU SLI technology back in 2004, only to be swiftly matched by the Crossfire platform from its main rival ATI. Since then, we've had predictable rounds of tit-for-tat multi-GPU willy-waving, culminating in triple-card SLI from NVIDIA and ATI's quad-card Crossfire upgrade.

Reality Check

When these multi-GPU solutions work correctly, the performance gains can be truly spectacular. And yet, SLI and Crossfire have utterly failed to gain any real traction in the market. Take Valve's recent Steam hardware surveys, for example: just 0.75 to 1.5 per cent of PC gamers are currently running PCs with more than one GPU.

It's into that context that ATI's latest flagship video card arrives, the twin-GPU Radeon HD 3870 X2. Essentially, it's little more than a pair of Radeon HD 3870 GPUs crammed onto a single board. Indeed, AMD itself concedes that in terms of multi-GPU performance scaling, it has no advantage over a pair of individual Radeon HD 3800 series cards running in tandem courtesy of Crossfire.

Surprisingly, however, ATI has managed to crank up the clockspeeds. Compared with the 775MHz frequency of the single-chip card, both of the X2's GPUs run at a screaming 825MHz. Memory speeds, by contrast, have taken a bit of a dip, down from 2.25GHz to 1.8GHz. Of course, AMD would say that's not an issue with two GPUs sharing the load. Don't worry, though - we won't be taking its word for it.
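To put those figures in perspective, here's a quick back-of-the-envelope comparison - our own illustrative sketch, using only the clock speeds quoted above:

```python
# Rough comparison of the X2's clocks against a single Radeon HD 3870,
# using the figures quoted above. Illustrative arithmetic only.
core_3870, core_x2 = 775, 825    # core clocks in MHz
mem_3870, mem_x2 = 2250, 1800    # effective memory clocks in MHz

core_gain = (core_x2 / core_3870 - 1) * 100   # roughly +6.5 per cent
mem_deficit = (1 - mem_x2 / mem_3870) * 100   # a full 20 per cent down

print(f"Core clock: +{core_gain:.1f}%, memory clock: -{mem_deficit:.0f}%")
```

In other words, the core clock bump is modest while the memory deficit is substantial - which is exactly why we're not taking AMD's word for it.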

Architecturally, the X2's GPUs are standard Radeon HD 3870 parts. So each surprisingly compact 55nm chip gets its own 512MB of graphics memory, along with the same 320 stream shaders and 16 texture and render output units as the single-GPU card. They also pack the most advanced 3D feature set on the planet, thanks to the 3800 Series' unique support for DirectX 10.1. ATI says the key improvement over Microsoft's plain old DirectX 10 API is more realistic lighting. For now, though, no games have been written for DirectX 10.1, so the jury remains out.

Power hungry

What's not up for debate is the size of the HD 3870 X2. This is one big, butch, beefy board. The circuit board is the same length as NVIDIA's big ol' GeForce 8800 GTX. But thanks to the utterly epic cooling solution required to keep those high-clocked GPUs in check, the X2 is much thicker. Unsurprisingly, it also guzzles more gas - fully 60 watts more than an 8800 GTX under load.

It's an imposing card, therefore, with a frankly terrifying on-paper specification. But does it deliver the pixel-pumping goods in practice? Yes, but with a large, 12-storey neon-lit caveat, which we'll come to in a moment. When the video driver correctly identifies the 3D application and applies a Crossfire profile, this big graphics stick really pounds out the pixels, beating the 8800 GTX with ease. Most impressive is the way it holds itself together at silly resolutions, such as 2560x1600. You might expect the 256-bit memory buses and 512MB (per GPU) of graphics memory to buckle under that sort of strain. Perish the thought.
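For a sense of the strain involved, here's a quick illustrative sketch - our own arithmetic, not ATI's figures - of the theoretical memory bandwidth each GPU has to play with, and what a single 2560x1600 frame costs:

```python
# Theoretical peak memory bandwidth per GPU: bus width in bytes multiplied
# by the effective memory clock. Figures as quoted earlier in this review.
bus_bits = 256                   # per-GPU memory bus width
eff_clock_gtps = 1.8             # effective data rate, billion transfers/sec
bandwidth_gbs = bus_bits / 8 * eff_clock_gtps   # 57.6 GB/s per GPU

# One 2560x1600 frame at 32 bits per pixel, before overdraw and AA pile on
frame_mb = 2560 * 1600 * 4 / (1024 ** 2)        # roughly 15.6 MB

print(f"{bandwidth_gbs:.1f} GB/s per GPU; {frame_mb:.1f} MB per framebuffer")
```

That's a lot of pixels shuffling through a relatively modest pipe, which makes the X2's composure at that resolution all the more impressive.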

Can't wait?

Problem is, the dependence on driver profiles means dual-GPU scaling can be unreliable. In theory, ATI pledges to maintain comprehensive support for all the most important games. In practice, however, it's not unusual to have to wait for a week or two after the launch of a game before Crossfire support is enabled. Similarly, routine patches and software updates can also cause problems.

The obvious example from our benchmarks is Call of Duty 4. Due to technical issues with the Steam platform, we had to benchmark the demo rather than the full game - not a problem for a single-GPU card. The demo clearly confused the Crossfire driver, however, and the result was single-GPU performance.

To be fair, these problems apply equally to NVIDIA's rival SLI multi-GPU technology. In fact, it's NVIDIA's old dual-GPU card, the GeForce 7950 GX2, that gives us most cause for concern. From the beginning it suffered from poor driver support, and things only got worse once the GX2 ceased production. Will ATI be any keener to invest time and resources in driver support for the X2 when it reaches the end of its life?
