Nvidia G-Sync vs AMD FreeSync

While Nvidia was first to come up with the idea of the GPU controlling the monitor's refresh rate, AMD has struck back hard with its own solution, called FreeSync. It is based on an open standard: DisplayPort 1.2a. AMD collaborated with VESA to add Adaptive-Sync to the DisplayPort standard, which allows compatible GPUs and monitors to automatically negotiate the optimal refresh rate for the monitor. It thus requires no proprietary hardware, which in turn keeps costs lower than Nvidia's offerings. AMD even goes so far as to claim that G-Sync will reduce frame rates, rather than making things better.
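
Conceptually, the Adaptive-Sync negotiation boils down to the monitor advertising a supported refresh range and the GPU driving any rate inside it. Here is a minimal sketch of that idea; the class, function names and 40-144 Hz range are hypothetical illustrations, not any real driver API or the actual DisplayPort data structures:

```python
from dataclasses import dataclass

@dataclass
class MonitorCaps:
    """Capabilities a display might advertise; hypothetical field names."""
    min_refresh_hz: float
    max_refresh_hz: float
    supports_adaptive_sync: bool

def negotiate_refresh(frame_rate_hz: float, caps: MonitorCaps) -> float:
    """Pick the refresh rate the GPU drives for the current frame."""
    if not caps.supports_adaptive_sync:
        # No Adaptive-Sync: the panel runs at a fixed refresh rate.
        return caps.max_refresh_hz
    # Clamp the GPU's frame rate into the panel's advertised range.
    return max(caps.min_refresh_hz, min(frame_rate_hz, caps.max_refresh_hz))

# Example: an assumed 40-144 Hz Adaptive-Sync panel tracking a 90 fps game.
panel = MonitorCaps(40.0, 144.0, True)
print(negotiate_refresh(90.0, panel))  # -> 90.0
```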

The key technical difference between G-Sync and FreeSync, apart from the licensing requirements (or lack thereof!), is the way each handles GPU output that lies outside the refresh rate range of the monitor. FreeSync is limited to matching the monitor's refresh rate to the GPU's frame rate via Adaptive-Sync.

It cannot perform the additional refresh rate tricks that G-Sync's hardware module can when the GPU's frame rate strays outside the monitor's range; G-Sync, for example, can redisplay the previous frame to keep the panel refreshing when frame rates drop below the panel's minimum. Therefore, when a FreeSync GPU's frame rate falls outside its monitor's refresh rate range, it defaults back to working with or without VSync, as per the user's preference, which means there will be tearing or stuttering again.
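
To make that fallback concrete, here is a minimal sketch of the out-of-range behaviour described above, assuming a hypothetical 40-144 Hz panel; the constants and function are illustrative, not a real driver interface:

```python
MONITOR_MIN_HZ = 40.0   # assumed lower bound of the panel's variable range
MONITOR_MAX_HZ = 144.0  # assumed upper bound

def present_frame(frame_rate_hz: float, vsync_enabled: bool) -> str:
    """Describe how a frame is displayed under a FreeSync-style scheme."""
    if MONITOR_MIN_HZ <= frame_rate_hz <= MONITOR_MAX_HZ:
        # In range: the panel refreshes the moment the frame is ready,
        # so the refresh rate simply tracks the frame rate.
        return f"adaptive refresh at {frame_rate_hz:.0f} Hz"
    # Out of range: adaptive sync no longer applies, so behaviour
    # reverts to the user's VSync preference.
    if vsync_enabled:
        return "VSync on: wait for the next fixed refresh (stutter possible)"
    return "VSync off: scan out immediately (tearing possible)"

for fps in (30.0, 90.0, 200.0):
    print(f"{fps:.0f} fps -> {present_frame(fps, vsync_enabled=True)}")
```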

Oh great, another format war

It is early days in the world of variable refresh rate graphics, so your options are limited if you want to try either technology. According to AMD, there are eight FreeSync monitors on the market, and Nvidia reports six G-Sync monitors are now available.

It is almost impossible to compare any of these monitors given the vastly differing specifications, but generally the G-Sync monitors are more expensive than the FreeSync models. As for GPUs, Nvidia's website says that any GPU from the 600 series onwards will support G-Sync. Meanwhile, compatible AMD Radeon GPUs are limited to the "R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs".

An important consideration for gamers is that we tend to upgrade monitors far less regularly than GPUs. So whatever type of variable refresh rate technology you choose now is going to determine your choice of GPU brand for a long time to come.

As ever, when big brands get into a format war, consumers become collateral damage.