Nvidia’s next-gen Ampere GPUs could be way better at ray tracing, with cheaper flagship prices

(Image credit: Future)

A whole bunch of fresh rumors have just emerged concerning Nvidia’s next-gen Ampere graphics cards, and a huge boost is being promised on the ray tracing front, alongside the prospect of slightly cheaper top-end models.

This comes from Wccftech, and we should make it crystal clear upfront that a lot of the info divulged seems pretty sketchy – with even the tech site itself underlining that everything should be taken with a grain of salt. Although we might be tempted to keep a bucket of the stuff handy…

At any rate, this speculation comes snaking down the graphics grapevine because Nvidia has apparently been chatting to graphics card manufacturers, communicating some ‘big picture’ details of what to expect from Ampere GPUs.

So what should we (allegedly) expect? RTX 3000 graphics cards (as they will presumably be named) are set to come out in the first half of 2020, with more video RAM on-board, plus clock speeds will purportedly run around 100MHz to 200MHz faster than current Turing GPUs – with Ampere achieving this while being more power-efficient to boot.

Such are the benefits of Samsung’s 7nm EUV process used by Ampere, the argument runs, and these next-gen GPUs will also sport bigger frame buffers than Turing. (Although equally, note that it’s not clear if Ampere cards will indeed be produced by Samsung on that 7nm process – all the rumors surrounding this are fairly muddied based on what Nvidia has said in the past, as we’ve discussed previously.)

One slight fly in the alleged spec ointment is that the next-gen cards will run at even lower voltages than Turing, perhaps under 1.0V, and there’s chatter about this potentially dampening overclocking prospects.

Worried about rays

Nvidia is also apparently doubling down on ray tracing with the RTX 3000 cards, and given how heavily invested the company already is in the technology with its current GPUs, that wouldn’t be a surprise.

The issue with ray tracing currently lies in the performance hit for the added visual finery, of course, and so Nvidia is promising “massive” improvements on that front. The RTX 3000 series will have ray tracing cores that are faster and more power-efficient – and presumably a lot faster, given the language employed here.

That said, it’s one thing to promise huge improvements in frame-rates when running ray tracing in games, and another to actually deliver them.

As well as the drive with ray tracing, Ampere will also push hard on the (traditional) rasterization front to achieve better gaming performance, Wccftech observes.

The final piece of speculation chewed over here is perhaps one that gamers will find most interesting, namely pricing.

We talked in the past about how using Samsung’s 7nm EUV process (if indeed that is what pans out) means that chips are cheaper to produce, which raises the prospect of more affordable graphics cards than Turing gave us.

And there’s good news and bad news here – all of it rumored of course – with the broad picture being that Ampere will generally cost around the same as Turing. That would be disappointing, but at the same time, not surprising.

However, the glimmer of hope is that the high-end cards, meaning the RTX 3080 and RTX 3080 Ti (assuming this is what they’re called) could be slightly cheaper than their Turing counterparts.

Trying to dissect potential pricing to any real degree at this point, though, is something of a fool’s errand, given that Nvidia will need to take into account exactly what movements have happened in the GPU market over the course of 2020. And indeed how AMD is pitching whatever new graphics cards it might have coming out, including an alleged ‘Nvidia killer’ targeting the high-end of the GPU spectrum – which could have something to do with the rumored pricing here.

Still, it’s interesting to hear Nvidia’s alleged intentions at this point regarding price tags. And there may be extra pressure to keep pricing competitive given that Intel is expected to join the GPU race perhaps as early as mid-2020, and will need to do something to make an entrance against the two established players…

And one obvious high-impact move could be to go cheap, or at least cheaper (and indeed for Intel to implement a seriously slick multi-GPU system, which could make for an affordable upgrade path, as other speculation runs).

Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - 'I Know What You Did Last Supper' - was published by Hachette UK in 2013).