CrossFire and SLI have finally come of age.
Just a couple of years ago, you were hard-pushed to squeeze 50 per cent more gaming power out of your system by adding an identical card to the mix. And even then, the setup was often impossibly fiddly.
It worked perfectly with some games, and just, well, not at all with others. You'd see diminishing returns on your frame rate investment to the point where you may as well plump for a big, fat, flagship card.
But ATI and Nvidia have both honed the tech to a point where we're now seeing major benefits to the polygon punt – in some cases, close to twice the power for two cards – and it's been a learning process for both outfits.
Architectural innovation and API-leveraging aside, they've had to work their driver releases to really get two cards on-song. And just as crucially, they've had to invest more heavily than ever before in close working relationships with game developers.
That sort of commitment doesn't come cheap, especially when you consider the tight milestones that hardware, driver and game-development teams have to work to. But both Nvidia and ATI have grasped the nettle and built these relationships, their engineers jetting off to work on-site with game studios to ensure this stuff actually works.
The result is a process of fine-tuning that enables dual-card setups, finally, to fly like they should with the games we all play.
And yet, each company seeks to differentiate, offering a slightly different proposition. Elegant ATI looks to court the multimedia market with Eyefinity and power efficiency, while the poke-focussed Nvidia, as ever, seeks to shift more pixels at a greater speed than the competition, and attempts to make 3D Vision a household technology.
Our remit for the following shootout, as ever, is to ascertain which twin GPU setup offers the greater value proposition. Both companies have their mid-range DX11 cards on the table now, so let the games commence…
Life with SLI and CrossFire has never been easier. Provided you have a supporting motherboard and you're not limiting the capabilities of these fine DX11-capable GPUs with something more archaic than a decent Core 2 Duo, you'll see some serious gains from a dual-card setup.
And we really don't mean to sound glib when we say that CPUs prior to Core 2 Duo will hold back your graphics card. Not only will an older CPU limit the speed capabilities of a new graphics card, but there's DX11 to take into account.
DX11 doesn't just set new standards for tessellation and stream-processor leverage. It aims to make your CPU work harder for the game engine, with multi-threaded rendering. The more cores, and the more threads per core in your CPU, the better your gaming performance will be. A fancy Phenom or Core i7 will really take the load off your GPU, and let it do its thing unhindered.
But before we go anywhere near the benchmarks of our stable of cards, and what they mean in terms of a buying decision, let's pause to examine your motivations.
Why a dual-card setup?
In the great flow-chart of whether or not you want two graphics cards, you should first ask yourself why. There are two good reasons to plump for a pair of cards, which depend entirely on your financial situation.
The first is the prospect of an inexpensive future upgrade, which requires that you have already bought a motherboard with dual-card capabilities, with future-proofing in mind, you clever thing. If one card is good for the price, why not save up for a couple of months and add another card to your system for a major increase in frame rates?
It's a compelling proposition, especially when retailers are constantly and incrementally driving down the prices of cards to keep up with the competition. And now that SLI and CrossFire offer really solid benefits for doubling up, it's become a more valid motivation than ever before.
The second reason to invest in two cards comes down to overall value in a single purchase. Can you really get more performance from a twin-card setup than from a more expensive single card? That's largely down to your budget, which we'll come to in a while.
The purpose of our tests is to find the current best mid-range dual-card setup, so let's examine what the cards offer.
The cards on test
We're looking at paired sets of four distinct cards: ATI's HD 5750 and HD 5770, and Nvidia's GTS 450 and GTX 460. All four of these cards differ in price, performance and capabilities. In a sense, there's never been a richer crop of mid-range cards – and a more confusing series of choices to make.
And by mid-range, we mean cards that drive mid-range resolutions – between 1,680 x 1,050 and 1,920 x 1,080 – 22-24-inch monitors, basically. Regardless of every other feature a graphics card may possess, we're primarily interested in gaming performance, and when that's your key metric, the native resolution of your monitor is a major deciding factor.
Your price/performance sweet spot lies with the cards that run games happily at your native resolution.

The cheap seats

At the budget end of our scale are the HD 5750 and the GTS 450 – both pared-down versions of the GPUs boasted by their bigger DX11 stablemates.
Traditionally, this means fewer cores/stream processors and a lower memory bandwidth than higher-end cards in the series, and this is true of these budget barnstormers as well. They just can't shift as much data as their freer-breathing bigger brothers, which means lower frame rates.
The compensating factor, as ever, is price. 1GB examples of both the HD 5750 and the GTS 450 (these are the ones we tested, over their 768MB counterparts) can be found for less than a hundred quid.
The middle brothers to the HD 5750 and GTS 450 are the HD 5770 and GTX 460. Specs-wise, they have exactly what you'd expect: a bigger memory interface with considerably more memory bandwidth, and greater power consumption to boot.
Here's where the prices really start to differ as well, with the 1GB version of the HD 5770 coming in around £60 cheaper, on average, than the similarly-equipped GTX 460. So while competition is as tight as can be at the budget end, the upper-midrangers start to pull away, and a different story begins to emerge. Just what is it that makes these graphics cards so different?
We ought to look at some benchmarks. But first, a word on compatibility.
The games we used to run our benchmarks were: DiRT 2, Far Cry 2, Just Cause 2 and the synthetic DX11 benchmark, Heaven 2.0 Benchmark.
The only game in which we saw curious results from a dual-card setup was DiRT 2 on CrossFire. The difference between ATI's and Nvidia's dual-card setups is that CrossFire often requires Application Profile packages to be installed alongside the driver (quite why the two can't be combined into a single package is a mystery).
The result, if you haven't downloaded and installed the latest Application Profiles from the AMD website, is a drop in performance in those games that require them.
Our benchmarks were performed using the latest Catalyst and Forceware drivers, and in all other tests with all our other games, everything was ticketyboo. Catalyst 10.9, it appears, doesn't yet have an Application Profile for DiRT 2, which meant we actually saw a dip in performance with two cards, as the game struggled to leverage the hardware.
We take these results with a pinch of salt, however. The cards are perfectly capable of running the game in CrossFire mode, and the Catalyst 9.x packages had their own hotfix. One just hasn't been introduced for 10.9 yet, and when it is, we'll see that rise in performance.
It's just a little irksome that we need discrete Application Profiles at all with CrossFire, seeing as SLI just works out of the box with commensurate performance gains over one card, and no requirement to download extra software.
So let's start with the humble HD 5750. Alone, the card runs games at a respectable rate at middling resolutions. But at nearly 40fps in Far Cry 2 at 1,920 x 1,080 with high AA and AF, and all the effects turned on? That's great DX10 performance for under a hundred pounds.
The card is also whisper-quiet, and while the fan speed scales up to keep things cool when the pressure is on, it never really becomes audible.
Whack two together in CrossFire, however, and you'll see a 65-90 per cent increase in frame rates across the board, depending on the game, and scores that outperform any single card on test here bar the GTX 460. That high-resolution Far Cry 2 test jumps to 54fps, just shy of the holy grail that is 60fps. And if you're running at 1,680 x 1,050, the setup nearly cracks the 100fps mark. Impressive indeed.