AMD HD3D technology: what you need to know

AMD HD3D technology explained
AMD's open platform for stereoscopic 3D

Stereoscopic 3D might not be to everyone's tastes, but it's certainly a lucrative business.

For the past few years Nvidia has enjoyed a virtual stereo 3D monopoly on the PC with its 3D Vision technology, but AMD has belatedly entered the ring with its own technology called HD3D.

For those who have heavily invested in Nvidia's technology it might be too little too late, but AMD's new offering introduces a number of innovations that make it worth considering – even if you think stereo 3D is just an expensive way to get a headache.

The biggest attraction of HD3D is AMD's commitment to making it an open platform that supports many different standards. Unlike Nvidia's approach, where you're tied to Nvidia 3D Vision-certified hardware, HD3D's open ecosystem should allow for a more diverse choice of hardware.

It's pretty encouraging that AMD seems to have identified one of consumer stereoscopic 3D's biggest problems, and the reason why so many of us have so far held back from embracing stereo 3D: the vast and confusing array of incompatible 3D standards. If AMD's mission with HD3D is to allow us to buy 3D hardware from various different manufacturers, without us having to worry whether it will all work together, that can only be a good thing for us consumers – and might lead to more of us setting up our rigs for 3D.

How it works


When it comes to three-dimensional gaming, the bulk of the stereo horsepower comes from the HD3D driver. The driver uses a quad-buffer to produce stereo 3D. Usually with non-stereo 3D graphics (monoscopic) the driver uses double-buffering.

This is where the GPU renders content to one area of memory, known as the back buffer, while a second buffer – the front buffer – holds the image currently being delivered to the monitor or other display. Quad-buffering essentially doubles this arrangement, producing two images – one for the left eye and one for the right – for every frame generated.
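The double versus quad-buffer distinction can be sketched in a few lines. This is purely a conceptual illustration – the function and buffer names below are invented for the example and aren't part of any AMD driver API:

```python
# Conceptual sketch of double- vs quad-buffering; all names are
# illustrative, not real driver code.

def render_frame(scene, eye=None):
    """Stand-in for GPU rendering: returns a labelled 'image'."""
    return f"{scene}:{eye}" if eye else f"{scene}"

def double_buffered(scene):
    # Monoscopic: render into the back buffer while the front buffer
    # is shown; a swap makes the finished image the display output.
    back = render_frame(scene)
    front = back  # swap
    return [front]

def quad_buffered(scene):
    # Stereoscopic: a back/front pair per eye, so every generated
    # frame yields two images - one for each eye.
    back_left = render_frame(scene, "left")
    back_right = render_frame(scene, "right")
    front_left, front_right = back_left, back_right  # swap both pairs
    return [front_left, front_right]

print(double_buffered("frame1"))  # one image per frame
print(quad_buffered("frame1"))    # two images per frame, one per eye
```

The point of the sketch is the output shape: a quad-buffered pipeline hands the display two images for every frame a double-buffered one would produce.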

AMD's Direct3D engineering team produced an API that supports OpenGL and DirectX 9, 10 and 11. AMD's open approach allows middleware partners, such as DDD and iZ3D, to convert games from monoscopic to stereoscopic 3D. This takes some of the pressure off AMD to ensure games are compatible with HD3D and has led to an already impressively long list of compatible games.

Launching into HD3D

If you want to set up HD3D, the first thing you need to do is make sure that you've got a supported Radeon graphics card. Compatible cards include the ATI Radeon HD 5000 series (running Catalyst 10.10 or later) and the AMD Radeon HD 6000 series or above.

You can also use HD3D if you have an HP Envy 17 3D with ATI Mobility Radeon HD 5850, an MSI Wind Top AE2420 All-in-One with ATI Mobility Radeon HD 5730, or a Lenovo IdeaPad y560d with ATI Mobility Radeon HD 5730.

You'll also need a 3D-capable display device, like a 3D monitor, TV or projector. There are plenty of supported devices – AMD publishes a full list on its website. If the display device is a 3D TV or a 3D projector then it needs to support HDMI 1.4a, and you'll need an HDMI 1.4a cable.

Unless you can afford a glasses-free 3D display, you'll need the chunky eyewear too. For 3D gaming you'll also want middleware: either DDD's TriDef 3D driver or the iZ3D driver.

Thanks to AMD's partnership with the middleware driver makers, you can get 50 per cent off either driver if you have HD3D-compatible hardware. Make sure you have AMD Catalyst 10.10 or later installed from AMD's website, then install either the DDD TriDef 3D or iZ3D driver and select 'AMD – AMD HD3D Technology (HDMI 1.4a)' or '120Hz 3D Devices' respectively.

You'll then need to set your desktop resolution to 1,920 x 1,080, 24Hz or 1,280 x 720, 60Hz. Using either TriDef 3D or iZ3D you need to select the game profile and point the software to the game's executable file, and then launch the game itself from inside the 3D software.
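Those two desktop modes match the frame-packed 3D formats mandated by HDMI 1.4a. As a quick illustration – not any real driver API, just a hypothetical helper built from the figures in the text – a mode check might look like this:

```python
# Illustrative only: the mode list mirrors the two desktop modes
# quoted above. Nothing here is a real AMD or HDMI API.
HD3D_MODES = {
    (1920, 1080, 24),  # 1080p at 24Hz per eye
    (1280, 720, 60),   # 720p at 60Hz per eye
}

def supported(width, height, refresh_hz):
    """Hypothetical check: is this a valid HD3D desktop mode over HDMI 1.4a?"""
    return (width, height, refresh_hz) in HD3D_MODES

print(supported(1920, 1080, 24))  # True
print(supported(1920, 1080, 60))  # False - not a frame-packed HDMI 1.4a mode
```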

AMD has outsourced much of the work involved in creating compatible games to the third-party middleware companies iZ3D and DDD. Headline titles include Call of Duty: Modern Warfare 2, Mass Effect 2 and World of Warcraft, and the list of compatible games is growing.

You can be pretty confident that the biggest new releases will be compatible, but for a full list of compatible games go to the iZ3D and TriDef websites.

It's not all down to third parties though. AMD is working directly with games developers on native support for HD3D. This year AMD worked closely with Eidos to support native stereo 3D in Deus Ex: Human Revolution. This meant that Eidos rendered images for both the left eye and the right eye in game, and was able to utilise HD3D's quad-buffer without the need for middleware.

Hopefully as the HD3D standard gains popularity, more games will include native support as they are released. So how does AMD's plucky little David compare to Nvidia's Goliath?

In terms of performance and raw power, Nvidia's head start in stereo 3D certainly gives 3D Vision the edge. HD3D's reliance on the HDMI 1.4a specification leaves it with a maximum TMDS throughput of 10.2Gb/s, allowing for 1080p gaming at 24 frames per second for each eye, or 720p at 60 frames per second per eye.

This is a lot lower than 3D Vision Surround's dual-link DVI connection, which is capable of resolutions up to 5,760 x 1,080 across three monitors. AMD hopes that as more new monitors begin to support DisplayPort 1.2, HD3D will be able to get around HDMI 1.4a's limitations with a bandwidth of 17.28Gb/s – enough for 1080p at 60 frames per second per eye.

HD3D vs 3D Vision


AMD's embrace of open standards really sets HD3D apart from Nvidia's 3D Vision. Not being tied to one manufacturer's certified hardware gives us far greater scope to build a 3D-capable rig that meets our needs.

It can also be more cost effective – not only can we shop around for the best components at the best prices, we can also avoid some of the hidden costs associated with Nvidia 3D Vision. You don't need a proprietary 3DTV Play driver to play 3D Blu-rays, and you don't have to rely on expensive USB emitters.

You do need to pay for third party middleware drivers, though. Still, there is enough potential in HD3D to get us excited about its future, and to cause Nvidia some concern over 3D Vision's future.