Nvidia GeForce GTX 680 review

The top-end Kepler GPU is here for you, Mr Gamer

Slower shaders, lower power, more speed? Really?



So, the Nvidia GeForce GTX 680 is a pretty quick card then?

Yes, and it also bodes well for the cards that are yet to come.

The EVGA Precision tool that Nvidia provided for benchmarking allowed for some serious overclocks (with GPU Boost giving even more on top of that), with most games running just under the 1,300MHz mark.

True, the AMD cards, Tahiti in particular, have a huge amount of overclocking headroom, but you have to invalidate your warranty to get there.

The GTX 680 will give you automatic access to those higher clocks without any such worry.

You can also let the GPU draw up to 32% more power, which in turn means the GPU Boost will ensure even higher regular clocks.

But for the mid-range cards it's the Frame Rate Target that interests us most.

The average PC gamer may not want to overclock their new investment, but letting the GPU decide when it needs to up its clocks and voltage - within its capabilities - and simply telling it to aim for 30fps makes getting great performance simple.
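To make that idea concrete, here's a minimal sketch of how a Frame Rate Target loop could work: nudge the core clock up when the game dips below the cap, and relax it when there's headroom. The function name and all the clock figures are our own illustrative assumptions - GPU Boost itself also weighs power draw and temperature, which this sketch ignores.

```python
# Illustrative sketch only, not Nvidia's driver code: adjust the core
# clock each frame to chase a chosen frame rate target.

def frame_rate_target(measured_fps, clock_mhz, target_fps=30,
                      base_clock=1006, boost_ceiling=1300, step=13):
    """Return an adjusted core clock (MHz) aiming at target_fps.

    Numbers are assumptions for illustration: a 1,006MHz base clock,
    a ~1,300MHz boost ceiling and 13MHz adjustment steps.
    """
    if measured_fps < target_fps and clock_mhz < boost_ceiling:
        # Falling short of the target: boost the clock to catch up.
        clock_mhz = min(clock_mhz + step, boost_ceiling)
    elif measured_fps > target_fps and clock_mhz > base_clock:
        # Overshooting: back off towards base clock to save power.
        clock_mhz = max(clock_mhz - step, base_clock)
    return clock_mhz
```

The appeal for a non-overclocker is that the only knob they touch is the target frame rate; the card handles the rest.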

The fact that GPU Boost is so temperature-dependent means that aftermarket coolers, specifically water-cooled units, will deliver much higher performance.

It does mean, though, that our Scandinavian cousins will have faster cards than us on average, and you're also likely to see frame rates drop in the summer.

But this is Nvidia back to its PC gamer-supporting best.

This is a card for gamers, this is a card that's built to be the best right now and built for the next generation of games. If Epic's Samaritan demo is anything like what Unreal Engine 4 can give us, the GTX 680 is well-placed for the future.

We liked

It's great to see the Nvidia GeForce GTX 680 coming in with relatively low power draw. Eschewing the usual 8-pin/6-pin PCIe power connector combo in favour of a pair of 6-pins is good news for your electricity bill.

Maybe not for the PSU manufacturers wanting to push their 1,500W beasts though.

It's fast enough too – hitting the 1GHz GPU clock that seems to be the 28nm norm. Nvidia has obviously used the time to engineer the GTX 680 up to a point where it can just about stay ahead of the AMD Radeon HD 7970.

We have to say we're also rather enamoured with the GPU Boost technology.

AMD's HD 7000 series cards may have serious overclocking headroom, but Nvidia's GTX 680 gives you access to it without necessarily needing you to mess around with your expensive new hardware yourself.

There are other impressive bits of surrounding tech too – like the Frame Rate Target and Adaptive VSync, a method of dynamically turning VSync on and off to avoid both tearing and the stutter associated with dropping frame rates with standard VSync on.
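The Adaptive VSync trade-off described above boils down to a per-flip decision, which a sketch makes plain. This is our own simplified illustration, not the actual driver logic: sync the next frame only if the last one rendered faster than the refresh interval, so you get tear-free output when the card is keeping up, and avoid standard VSync's hard drop to half rate when it isn't.

```python
# Illustrative sketch only: per-frame Adaptive VSync decision.

def adaptive_vsync(frame_time_ms, refresh_hz=60):
    """Return True to sync the next flip (no tearing), or False to
    present immediately (no VSync-induced stutter).

    At 60Hz the refresh interval is ~16.7ms; a frame that took longer
    than that would miss the refresh, so we skip syncing rather than
    stall down to 30fps as standard VSync would.
    """
    refresh_interval_ms = 1000.0 / refresh_hz
    return frame_time_ms <= refresh_interval_ms
```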

We disliked

There's a niggling concern at the back of our collective minds about the fact it is such a low-powered card.


Sure, it's quick enough to be called the fastest GPU of this generation, but a full-fat, 250W version, with a chunkier GPU housing more SMX modules, must surely have been created in the labs.

That maybe should have been Nvidia's top-end, £400 card.

This relatively small GPU, with its narrower memory bus and lower power draw, seems more like a GTX 670.

So essentially that makes us a little narked about the price.

It's £400 because that's the standard for top-end GPUs now, despite the fact that it's arguably more of a mid-range part relative to the performance the technology could realistically manage.

It's £400 because Nvidia can charge that much, and because AMD's Radeon HD 7970 hasn't pushed it to create something even more spectacular.

Verdict

That shouldn't take away from the fact that we've got a card in front of us that's cool, quiet and less power-hungry than Nvidia's normal top-end GPUs.

It's not one of the power-crazed GPU behemoths we're used to from Nvidia, but it still has the performance chops and some neat extra tricks.