Nvidia G-Sync vs AMD FreeSync

Calling all gamers, we have good news and bad news. The good news is that AMD and Nvidia have both solved the problem of screen tearing and frame stuttering in demanding PC games. The bad news is that they each have their own solution, and the two are not compatible with each other – in short, we have a format war… again.


What's the problem?

The source of the problem when displaying high-performance PC games is that monitors have a constant refresh rate – 75Hz, for example, means the screen is updated 75 times per second. Meanwhile, graphics cards (GPUs) redraw the screen at a variable rate that depends on the computational load they're bearing.
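To make the mismatch concrete, here is a rough sketch comparing a monitor's fixed refresh interval with some illustrative GPU frame times – the numbers are made up for the example, not measurements:

```python
# Sketch: a monitor updates at a fixed interval, while GPU frame times vary
# with load. Frame times below are illustrative, not real measurements.

REFRESH_HZ = 75
refresh_interval_ms = 1000 / REFRESH_HZ  # ~13.33 ms between screen updates

# Hypothetical per-frame render times for a GPU under varying load (ms)
gpu_frame_times_ms = [10.0, 12.5, 18.0, 25.0, 14.0]

for t in gpu_frame_times_ms:
    fps = 1000 / t
    status = "faster than refresh" if t < refresh_interval_ms else "slower than refresh"
    print(f"frame took {t:5.1f} ms ({fps:5.1f} fps) -> {status}")
```

Whenever a frame takes longer than the refresh interval, the GPU and monitor fall out of step – which is exactly the timing drift described below.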


This difference in timing means that the current frame on the monitor and the current frame on the GPU drift out of sync. As a result, partway through sending a frame to the monitor, the GPU moves on to the next frame.


This switch appears as a discontinuity in what you see on-screen. Usually, the discontinuity travels down the screen on consecutive frames as the phase difference between the GPU and monitor shrinks. This discontinuity is what we call tearing, and in extreme cases there can be several tears on screen at once.


Most PC games offer something called VSync as a way to reduce the tearing effect. VSync effectively limits the frame rate of the GPU: if a particular frame takes too long to render and misses its slot on the monitor, the GPU delays sending any graphics data to the monitor until the next screen refresh.
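The VSync scheduling rule above can be sketched in a few lines – this is a simplified, illustrative model (the `next_vsync_display_time` helper is hypothetical, not a real driver API):

```python
import math

# Simplified model of VSync: a finished frame is only displayed at the next
# refresh boundary; if rendering overruns one interval, it waits for the
# following one. Illustrative sketch, not how any real driver is implemented.

def next_vsync_display_time(render_done_ms, refresh_interval_ms=1000 / 60):
    """Return the time (ms) the frame actually appears on screen under VSync."""
    intervals = math.ceil(render_done_ms / refresh_interval_ms)
    return intervals * refresh_interval_ms

# A frame finished at 20 ms on a 60Hz monitor (16.67 ms interval) misses the
# first refresh and is held until the second one, ~33.3 ms in.
print(next_vsync_display_time(20.0))
```

That waiting for the next boundary is where VSync's extra latency comes from, as the next paragraph explains.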


Brilliant! Problem solved then? Well, not quite. VSync is not perfect: delaying frames causes stuttering and lag precisely when the GPU is under the heaviest processing load, which is also when a gamer needs the most responsiveness. Hence, despite VSync being the only remedy for tearing, many gamers choose to disable it in order to get the most responsive system, ugly tearing and all.

Nvidia to the rescue, kind of

Since 2014, Nvidia has been promoting its solution to the VSync problem, which it has dubbed G-Sync. The basic concept of G-Sync is that the GPU controls the refresh rate of the monitor. Because the monitor and GPU are always in sync, there is never any tearing or stuttering. Prior to this, Nvidia had already been working on Adaptive VSync.

As PC Perspective notes, there are three regimes in which any variable refresh rate GPU/monitor system needs to operate: A) when the GPU's frames per second is below the minimum refresh rate of the monitor; B) when the GPU's frames per second is between the minimum and maximum refresh rates of the monitor; C) when the GPU's frames per second is greater than the maximum refresh rate of the monitor.
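The three regimes are easy to express as a tiny classifier – a sketch with an invented helper name and an illustrative 30-144Hz refresh range:

```python
# Sketch: classify which regime (A, B, or C) a GPU/monitor pair is in,
# given the monitor's supported refresh range. Function name and the
# example refresh range are assumptions for illustration.

def refresh_regime(gpu_fps, min_hz, max_hz):
    if gpu_fps < min_hz:
        return "A"  # below the monitor's minimum refresh rate
    if gpu_fps <= max_hz:
        return "B"  # within the variable refresh window
    return "C"      # above the monitor's maximum refresh rate

# Example: a hypothetical monitor with a 30-144Hz variable refresh range
print(refresh_regime(20, 30, 144))   # regime A
print(refresh_regime(90, 30, 144))   # regime B
print(refresh_regime(200, 30, 144))  # regime C
```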


Case B mentioned above is straightforward – the GPU simply sets the monitor's refresh rate equal to its frames per second.

When a G-Sync compatible GPU and monitor are operating in case C, Nvidia has decided that the GPU should default to VSync behaviour. In case A, however, G-Sync sets the monitor's refresh rate to an integer multiple of the current frames per second coming from the GPU. This is similar to VSync's frame-delaying strategy, but the whole-number multiplier has the advantage of keeping the GPU in step with the monitor.
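The case A strategy can be sketched as choosing the smallest integer multiple of the GPU's frame rate that lands inside the monitor's supported refresh range – a simplified model (in real G-Sync hardware this logic lives in the monitor's module, and the function and numbers here are illustrative):

```python
# Sketch of the case-A strategy: find the smallest integer multiple of the
# GPU's frame rate that falls within the monitor's refresh range, so each
# frame is simply shown multiple times. Simplified, illustrative model.

def case_a_refresh(gpu_fps, min_hz, max_hz):
    multiplier = 1
    while gpu_fps * multiplier < min_hz:
        multiplier += 1
    rate = gpu_fps * multiplier
    return rate if rate <= max_hz else None  # None: no valid multiple fits

# A GPU at 20 fps on a hypothetical 40-144Hz monitor: each frame is shown
# twice, so the panel refreshes at 40Hz while staying in step with the GPU.
print(case_a_refresh(20, 40, 144))  # -> 40
```

Because the refresh rate is an exact multiple of the frame rate, each new frame always arrives on a refresh boundary – the "keeping in step" advantage described above.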

The (somewhat literal) price of this solution is that Nvidia requires a proprietary chip in every G-Sync compatible monitor. The undesirable result is that G-Sync monitors cost more, both for the extra electronics and for the associated license fees paid to Nvidia. Finally, G-Sync is not supported by AMD GPUs.