TechRadar Verdict
Finally, a genuinely affordable but powerful card with full DirectX 10 support
Pros
+ Unprecedented shader power at this price point
+ Slim single-slot form factor and significantly reduced power consumption
+ Added DX10.1 functionality

Cons
- Anti-aliasing still performed in the shader units
- Quad-card Crossfire support a bit of a gimmick
Wherever Nvidia treads, ATI must surely follow. Or is it the other way round?
With their latest 3D chipset launches, Nvidia and AMD's PC graphics subsidiary have once again mirrored each other extremely closely.
Nvidia in disguise?
There's an absolutely uncanny resemblance between ATI's new Radeon HD 3800 series and Nvidia's GeForce 8800 GT.
More than anything, both chipsets aim to bring the performance and features of current high end DirectX 10 3D cards down to a much more mainstream price point. The only differences are in the fine details and final pricing.
For the 3800 series, ATI has essentially subjected its existing flagship DirectX 10 GPU, the rather underwhelming Radeon HD 2900 XT (also known as R600), to a die shrink from 80nm to a finer 55nm node. That alone makes for a much smaller, cheaper and more power efficient GPU. ATI reckons both performance per watt and bang for buck are doubled with the 3800 series.
It also makes for cards that work with a wider range of systems. The entry level Radeon HD 3850 model is a single-slot board, while the top-end Radeon HD 3870 is a full dual-slotter. However, like its slimmer sibling, the 3870 requires only one six-pin supplementary power connector. The implication is obvious: you don't need a pricey 1,000 Watt power supply to run these cards, even in multi-GPU configurations.
More to the point, you won't need to spend much on the boards themselves, at least compared with the outgoing 2900 XT. The 3850 is yours for just £110 including VAT. The 3870 will sell for as little as £140.
As for speeds and feeds, core clock frequencies for the 3850 and 3870 are 670MHz and 775MHz, respectively. Meanwhile, the 3870 packs 512MB of exotic GDDR4 memory running at 2.25GHz. The 3850 makes do with 256MB of more pedestrian 1.66GHz GDDR3 memory.
Architecturally, however, the two models are identical and mostly match the outgoing Radeon 2900 XT. All 320 stream processors are present along with 16 texture units and 16 render output units.
Less memory
The only victim of ATI's quest for a smaller, more affordable chip is the memory controller, downgraded from 512-bit to 256-bit. For the beefier 3870 chipset, ATI claims that has been offset by the faster memory and optimisations to the memory controller.
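To put that trade-off in rough numbers, peak memory bandwidth is simply the bus width in bytes multiplied by the effective memory data rate. The quick sketch below uses the memory speeds quoted above for the 3870 and 3850; the roughly 1.65GHz effective GDDR3 rate for the 2900 XT is our assumption from commonly published specs, not a figure ATI quotes here.

```python
# Back-of-the-envelope peak memory bandwidth comparison (GB/s).
# bandwidth = bus width in bytes x effective memory data rate
def bandwidth_gbs(bus_bits, data_rate_ghz):
    return bus_bits / 8 * data_rate_ghz

cards = {
    "Radeon HD 2900 XT": (512, 1.65),  # assumed ~1.65GHz effective GDDR3
    "Radeon HD 3870":    (256, 2.25),  # 2.25GHz GDDR4, per the specs above
    "Radeon HD 3850":    (256, 1.66),  # 1.66GHz GDDR3, per the specs above
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.1f} GB/s")
```

On those figures the 3870 retains roughly two thirds of the 2900 XT's raw bandwidth, which is why ATI leans on the GDDR4 speed bump and controller optimisations to close the gap.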
ATI has also used the six months or so since the launch of the 2900 XT to add a few new features. First up is support for the upcoming 10.1 revision of Microsoft's DirectX platform. There is much debate over the value of DirectX 10.1 compared with DirectX 10.
ATI claims it enables more realistic lighting courtesy of high-performance global illumination. Nvidia, whose cards do not support 10.1, unsurprisingly begs to differ. Only time, and the introduction of games supporting the new 10.1 standard, will tell.
The 3800 series is also ATI's first card to support the PCI Express 2.0 interface. It doubles the available bandwidth for shunting data to and from graphics cards. In single-card scenarios it's mostly about providing headroom for future architectures. However, it is relevant today for bandwidth-hungry multi-GPU setups.
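For a sense of scale, here is the arithmetic behind that doubling, assuming the standard 16-lane graphics slot and the 8b/10b encoding both PCI Express generations use (so each lane delivers one byte of data for every ten bits on the wire).

```python
# Rough per-direction bandwidth for a 16-lane graphics slot.
# PCIe 1.x signals at 2.5GT/s per lane, PCIe 2.0 at 5.0GT/s; 8b/10b encoding
# means 80% of the raw signalling rate is usable data.
def pcie_gbs(lanes, gt_per_s):
    return lanes * gt_per_s * (8 / 10) / 8  # GB/s per direction

print(f"PCIe 1.x x16: {pcie_gbs(16, 2.5):.0f} GB/s")  # ~4 GB/s
print(f"PCIe 2.0 x16: {pcie_gbs(16, 5.0):.0f} GB/s")  # ~8 GB/s
```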
And it just so happens that ATI is rolling out the mother of multi-GPU solutions with 3800 Crossfire. Part of the company's overall "Spider" platform, the key feature is support for two, three and four-card configurations running in parallel. ATI reckons a performance boost of up to 3.2x is possible compared with a single graphics card.
Indeed, we're surprised to note that setting up Crossfire with the 3850 is extremely painless. Simply shut down, drop in the second card and ATI's software will automatically detect and install it on reboot. The only further step is a brief jaunt into the control panel to flick the 'enable' switch.
More's the pity, therefore, that once configured, the pair of 3850s confirmed our doubts about multi-GPU technology. Like Nvidia's 8800 GT cards in SLI, they failed to consistently accelerate Crytek's stunning new game, Crysis.
Still, for just over £100, the 3850 in particular offers staggering value for money. In most regards, it's clearly a high-end GPU. It sports no fewer than 666 million transistors and comes pretty close to matching the Radeon HD 2900 XT for pure performance. Lest you have forgotten, the latter is a card that sold for as much as £300 just six months ago.
So long as you don't plan to pair the 3850 with an ultra-high resolution monitor, it's an awfully nice little card. At this price point, most gamers will likely connect it to an LCD monitor in the increasingly popular 20 to 22-inch widescreen range, most of which sport 1680 x 1050 pixel grids. That's a resolution this card can handle with ease in all but the very latest and most demanding games.
Our only reservation involves anti-aliasing. Like the 2900 XT, the 3800 series suffers from rather slow, shader-based anti-aliasing. Combining that weakness with lower memory bandwidth is hardly ideal.
Finally, ATI has upgraded the 3800's UVD video decode engine and improved the chip's power management. The former remains the only solution capable of providing hardware acceleration for all types of HD DVD and Blu-ray discs (Nvidia cards lack full support for the VC-1 codec).