Nvidia's RTX 5000 GPUs are a confusing proposition for gamers like me

So, Nvidia is definitely making another generation of desktop GPUs - currently bearing the codename ‘Ada Lovelace-Next’. The current generation of RTX 4000 cards has been a mixed bag, to say the least, and these new GPUs aren’t slated for release until 2025 - but I’m already a bit concerned about the whole situation.

Before we get started, I’m going to make a key assumption here. The ‘Next’ codename suggests that these new GPUs could simply be a hardware refresh of some description - hypothesized by some to be RTX 4000 Super cards - but given the 2025 release date, I’d be shocked if it didn’t turn out to be a whole new generation: namely, RTX 5000.

Now, Nvidia has been ramping up performance pretty impressively with its current generation of GPUs - the RTX 4090 is, after all, the most powerful consumer graphics card on the planet right now, and the recently released RTX 4060 offers a solid value proposition for PC gamers on a budget. But it does leave me thinking: where do we go from here?

Stepping up

Ultimately, no gamer really needs an RTX 4090. Even for 4K gaming, the RTX 4070 Ti is perfectly sufficient. It’s frankly unlikely we’re going to see 8K gaming become the norm in just two years’ time, especially when the Steam Hardware Survey shows that the majority of PC gamers are still using cards best suited to 1080p.

Intel's Arc A770 GPU is - after several much-needed driver updates - a solid choice for 1080p gaming. (Image credit: Intel)

The best cheap graphics cards are far more appealing to gamers in the current economic climate - so the prospect of a potentially even more expensive RTX 5000 series has me feeling doubtful. That’s partly because I don’t see a concrete need for even more powerful GPUs, especially when so many gamers are still rocking older components.

It’s not a problem unique to Nvidia, of course; the tech hardware industry’s relentless march of progress means that just about everything gets superseded less than two years after its release. I often wonder if CPU and GPU makers could learn a little from the home console space, where a generation can last five or six years before a new wave of gaming machines hits the market.

AI is the future (again)

Another big factor in my reluctance around new GPUs is the current prevalence of AI-powered upscaling technology. Nvidia spearheaded the charge into this new arena with the incredible DLSS, which renders a game at a lower internal resolution (1080p, for example) and then uses AI to upscale it to your target resolution in real time.
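
For a sense of the numbers involved, here’s a quick, purely illustrative Python sketch using the per-axis render scales commonly cited for DLSS 2’s quality modes - treat the exact ratios below as approximations, not official Nvidia specifications:

# Illustrative only - commonly cited per-axis render scales for DLSS 2's
# quality modes; the exact values are approximations, not an official API.
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(target_w, target_h, mode):
    """Return the lower resolution the game renders at before AI upscaling."""
    scale = DLSS_SCALES[mode]
    return round(target_w * scale), round(target_h * scale)

# At a 4K (3840x2160) target, Performance mode renders internally at 1920x1080.
for mode in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: {w}x{h} -> 3840x2160")

At a 4K target, Performance mode works out to a 1920x1080 internal render - which is exactly why upscaling can make 4K playable on mid-range hardware.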

With DLSS 3 available on RTX 4000 cards, offering frame-generation capabilities that boost your framerates even further, it’s honestly hard to see why any new hardware needs to exist at all. If Nvidia instead held off on next-gen GPUs for a while and poured money into improving its upscaling tech and ensuring that every major game has full DLSS 3 support, the need for new cards would practically evaporate, at least for a few more years.

A man's hand holding up the Nvidia RTX 4060

Got an RTX 4000 GPU? Congrats, you have DLSS 3 - and that's a bigger deal than any new hardware. (Image credit: Future / John Loeffler)

All of the best graphics cards offer some form of upscaling nowadays, whether it’s DLSS or the competing FSR and XeSS solutions from AMD and Intel. This tech just keeps getting better, too; odds are that by the time we reach 2025, upscaling will offer even larger performance boosts than it does today, with more widespread support in PC games.

And quite frankly, I don’t want to feel obliged to buy a new GPU every two years. There’s been speculation that Nvidia might eventually exit the consumer graphics market in favor of the big bucks offered by AI developers for machine-learning training hardware, which further erodes my confidence in the need for more gaming GPUs. Given the current AI boom, though, I wouldn’t begrudge Team Green in the slightest for making such a move.

In any case, my RTX 4080 probably isn’t going anywhere for the next four or five years. After all, I’m perfectly happy with it - I’ve got a 4K monitor and no plans to upgrade that either. As they say, if it ain't broke...

Christian Guyton
Editor, Computing

Christian is TechRadar’s UK-based Computing Editor. He came to us from Maximum PC magazine, where he fell in love with computer hardware and building PCs. He was a regular fixture amongst our freelance review team before making the jump to TechRadar, and can usually be found drooling over the latest high-end graphics card or gaming laptop before looking at his bank account balance and crying.

Christian is a keen campaigner for LGBTQ+ rights and the owner of a charming rescue dog named Lucy, having adopted her after he beat cancer in 2021. She keeps him fit and healthy through a combination of face-licking and long walks, and only occasionally barks at him to demand treats when he’s trying to work from home.