Nvidia’s RTX 4080 GPU comes in two flavors, as you’re likely aware, and Team Green has just provided some benchmarks which are a starting point for illustrating the relative performance that these graphics cards will deliver.
As flagged up by Overclock3D (via VideoCardz), both the RTX 4080 16GB and the lower-tier 12GB model were benchmarked in three games – A Plague Tale: Requiem, F1 22 and Microsoft Flight Simulator (all of which are among the first titles with DLSS 3 support).
As measured by Nvidia, in those games, the RTX 4080 16GB is 21%, 27% and 21% faster than the 12GB graphics card respectively (about 23% quicker on average). This is running at 4K resolution with no ray tracing (rasterization performance, with DLSS turned on), in a PC with an Intel Core i9-12900K processor.
What’s interesting is benchmarks for current-gen graphics cards are also included, and we see that in the battle of the RTX 3080 versus the RTX 3070, the former is 29%, 30%, and 15% faster in those same tests, averaging at 25% quicker.
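As a quick sanity check on those averages (a trivial sketch, using only the percentage deltas quoted above):

```python
# Nvidia's quoted three-game deltas: RTX 4080 16GB over RTX 4080 12GB
ada_deltas = [21, 27, 21]
# RTX 3080 over RTX 3070 in the same three tests
ampere_deltas = [29, 30, 15]

ada_avg = sum(ada_deltas) / len(ada_deltas)          # 23.0
ampere_avg = sum(ampere_deltas) / len(ampere_deltas)  # ~24.7, rounds to 25

print(round(ada_avg), round(ampere_avg))  # -> 23 25
```

So the two generational gaps land within a couple of percentage points of each other.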
Analysis: Nvidia’s layering and naming strategy laid bare?
Why consider the difference between the RTX 3080 and 3070 here? Well, with the RTX 3080 averaging 25% faster than the 3070, we can draw a direct comparison to the RTX 4080 16GB versus 12GB, where the former is 23% quicker on average.
Which is to say that the difference between the two RTX 4080 GPUs is almost the same as the gulf between the RTX 3080 and 3070 (at least in this three-game set of benchmarks). Or to put it another way, the lower-tier RTX 4080 fits more or less with the performance of what could have been the RTX 4070.
As you may realize, the specs of the RTX 4080 12GB are considerably cut down – it actually uses a different GPU to the 16GB flavor, a chip that was expected to power the RTX 4070 – and one theory is that Nvidia intended this to be the 4070, and cranked it up to be a 4080 variant late in the day. It's a theory which the above comparison to those RTX 3000 cards rather backs up.
Of course, we do have to bear in mind that 4K resolution is where the more powerful RTX 4080 will shine brighter, with more grunt required from the GPU than lesser resolutions. But still, these benchmarks seem to underline the performance gap between the two 4080 models, and how the lower-end offering seems very much to be positioned in the performance envelope of a would-be (should-have-been?) RTX 4070.
What we also see here is that the RTX 4080 12GB really needs that boost from DLSS 3 to outdo the RTX 3090 Ti – without the frame rate booster, it’s actually a bit slower than the current fastest Ampere card. (Remember, the 3090 Ti is limited to DLSS 2, which doesn’t deliver the same boost as the DLSS 3 frame generation the 4080 benefits from when upscaling comes into play – but upscaling won’t be the norm, of course.)
At any rate, when you look at the relative pricing of the RTX 4080 16GB versus 12GB, that’s in line (more or less) with the performance difference here. But the price tag pinned on the RTX 4080 12GB is still causing quite the controversy online, with the most affordable initial Lovelace offering being far from affordable in any real sense. And of course, if this model had been an RTX 4070, as some have theorized was the original plan, that frustration would’ve been compounded (to say the least – bearing in mind the RTX 4080 12GB costs 80% more than the MSRP of the RTX 3070).
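That 80% figure checks out against the launch MSRPs – $899 for the RTX 4080 12GB and $499 for the RTX 3070 (US pricing, as announced):

```python
# Launch US MSRPs (as announced by Nvidia)
rtx_4080_12gb_msrp = 899
rtx_3070_msrp = 499

# Premium of the 4080 12GB over the 3070's original asking price
premium = (rtx_4080_12gb_msrp - rtx_3070_msrp) / rtx_3070_msrp
print(f"{premium:.0%}")  # -> 80%
```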
Part of Nvidia’s strategy here is to keep selling RTX 3000 models, of course – Team Green referred to the overlap between Lovelace and Ampere as a ‘layered’ launch – but the company certainly needs to be careful not to alienate folks too much with pricing, particularly as we move deeper into what seem to be more economically challenging and uncertain times. Otherwise, gamers may well turn to AMD and whatever RDNA 3 brings forth soon, including an opportunity for Team Red to pile on the pressure with pricing – one which we sincerely hope Nvidia’s big rival takes.
Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - 'I Know What You Did Last Supper' - was published by Hachette UK in 2013).