This graphics card generation is over and it was mostly trash

With the release of AMD's latest graphics cards, the current generation of graphics card launches looks to be at its effective end. Sure, there might be some variants hitting the market next year, but for the most part Nvidia and AMD have released the overwhelming bulk of their product stacks, and I've had the privilege of reviewing them all.

As components editor, I've spent countless hours at our testbench in NYC running and rerunning benchmark tests, swapping out graphics cards in my gaming PC at home (and performing clean driver installs over and over again), and recording enough data into a carefully maintained and formatted spreadsheet to give an accountant a bad case of envy.

I've performed the old fractional cross-multiplication formula to calculate how much better one score is relative to another, read through dozens of spec sheets multiple times, and even written automated batch scripts to make my life marginally easier so I don't have to click "Run" on a benchmark's GUI every two minutes for hours each day during the review process.
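
Neither the math nor the automation is anything fancy, incidentally. Here's a minimal Python sketch of both, purely for illustration — the benchmark executable and the fps figures below are hypothetical placeholders, not anything from our actual test suite:

```python
import subprocess

def relative_gain(new_score: float, old_score: float) -> float:
    """How much better new_score is than old_score, as a percentage."""
    return (new_score / old_score - 1) * 100

# e.g. a card averaging 144 fps against a predecessor's 120 fps:
print(f"{relative_gain(144, 120):.1f}% faster")  # prints "20.0% faster"

# The automation amounts to looping a (hypothetical) command-line
# benchmark instead of clicking "Run" in a GUI every two minutes.
for run in range(3):
    subprocess.run(["benchmark.exe", "--preset", "4k"], check=True)
```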

I know this generation of graphics cards inside and out, as well as the generations that came before, and I feel I'm qualified to say to those who've been following these GPU launches that they aren't wrong to feel that there has been something seriously off this time around. Take it from me, this generation of graphics cards mostly sucked.

From overpowered to overpriced

We kicked things off this go-around with the Nvidia GeForce RTX 4090, unquestionably the best graphics card for consumers (or prosumers, really) on the planet in terms of raw performance. There's very little that card can't do, except maybe save you money on your electric bill. Or offer a 100% guarantee that it won't melt your power cables.

Nvidia seems to have taken a page from Intel's playbook when faced with a resurgent AMD nipping at its heels: more power, literally. Just as Intel Alder Lake CPUs revived Team Blue's fortunes against AMD's rival Ryzen processors by pushing so much electricity through the processor that there isn't a consumer CPU cooler on the market capable of keeping the chip from boiling water, Nvidia scored a knockout blow to start this gen off by putting out a 450W behemoth of a GPU that actually had credible reports of melting 12VHPWR cable connectors pretty much from day one. 

Nvidia pointed to user error as the culprit, but the fact that the Nvidia RTX 4080, Nvidia RTX 4070 Ti, and others down the line didn't have the same problem is curious indeed. Maybe Nvidia RTX 4090 users are simply too reckless, whereas the rest of Nvidia's users read the instructions carefully.

Then there was the RTX 4080, which went on sale for the same price as the RTX 3080 Ti did back in 2021. While certainly powerful, it was badly priced from the jump, and sales of that GPU haven't been good at all when for just $400 more you could get the aforementioned RTX 4090. It also highlighted the biggest complaint against Nvidia from gamers: that it is a cynical, greedy megacorp that charges whatever it wants because it knows gamers have few other options. Fair or not, Nvidia's pricing strategy certainly fuels that particular fire.

AMD's response to this was the AMD Radeon RX 7900 XTX, which smartly kept things below $1,000 (a threshold already beyond what a mainstream consumer should have to pay for a single PC component). AMD stumbled on the pricing front too, however, with the AMD Radeon RX 7900 XT, which was just $100 cheaper than AMD's flagship card but wasn't nearly good enough on its own to justify buying over the RX 7900 XTX.

Nvidia came back with the RTX 4070 Ti, which at least got Lovelace down below $1,000, and it was emblematic of a problem that was going to dog Nvidia for the rest of the generation: too little VRAM. With just 12GB of GDDR6X VRAM, the RTX 4070 Ti was able to crank out 4K gaming, but it was really the last card Nvidia launched that could get away with too little VRAM for its target resolution.

The RTX 4070, RTX 4060 Ti, and RTX 4060 would all suffer unnecessarily for lack of VRAM, but especially the RTX 4060 Ti, which ended up being the biggest disappointment of the entire generation, performance-wise.

Over at AMD, things were quiet for most of 2023 after its successful 7900 XTX/XT launch. It wasn't until midyear that the AMD Radeon RX 7600 dropped, ahead of the more powerful 7800 XT and 7700 XT cards.

This was a smart move on AMD's part, especially as it aggressively priced the RX 7600 for the budget-conscious, easily making it the best cheap graphics card of the generation and stealing a lot of the thunder from Nvidia's RTX 4060 (which was one of Nvidia's bright spots this year, in my opinion) before it launched.

Finally, the midrange offerings from Nvidia and AMD are a mixed bag, to say the least. The Nvidia RTX 4070 is a great 1440p graphics card, but its price pushes the very definition of "midrange" to the extreme. Meanwhile, the AMD Radeon RX 7800 XT is the best 1440p graphics card in this class and excellently priced, but its marginal gen-on-gen performance gains will undoubtedly leave many wondering what could have been.

As for the AMD Radeon RX 7700 XT, at just $50 less than the RX 7800 XT, there's not really any reason to buy this card at this price, much like the RX 7900 XT.

Nvidia also dropped DLSS 3 with Frame Generation, which is amazing, but it is effectively behind a very high paywall (only RTX 4000-series GPUs can use it) that makes it pretty much irrelevant for most gamers out there. AMD, meanwhile, just announced FSR 3 with its own frame generation tech that is due to hit soon, but it's still too early to tell how that will pan out in the end.

So, essentially, we've got a graphics card that can potentially melt your power cables if you plug them in wrong, several overpriced cards, and a couple of cards that were barely an improvement over the ones they replace. This is not a recipe for a stellar generation of graphics cards. So what the hell happened?

Moore's Law is dead, and card manufacturers need to level with us

I can't speak to what drove the pricing on many of these cards. My best guess is someone somewhere got the vapors and started scribbling numbers next to the RTX 4080 and RX 7900 XT and no one thought to revisit them, but it could just be greed. You should always give greed its due as a powerful motivator.

But the high prices of these cards might have been easier to stomach had their performance justified them. The RTX 4090 is outrageously expensive, but it is so powerful that its performance-to-price value actually makes it one of the best 4K graphics cards if what you care about is getting the most for your money. The RTX 4080 can't say the same.
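
To make that value argument concrete, here's the back-of-the-envelope comparison I mean, as a quick Python sketch. The launch MSRPs are real, but the frame rates are made-up placeholders rather than my benchmark results:

```python
# Frames per $100 as a crude value metric. Only the launch MSRPs
# ($1,599 and $1,199) are real; the fps values are placeholders.
cards = {
    "RTX 4090": {"avg_fps": 140.0, "price_usd": 1599.0},
    "RTX 4080": {"avg_fps": 100.0, "price_usd": 1199.0},
}

for name, card in cards.items():
    value = card["avg_fps"] / card["price_usd"] * 100
    print(f"{name}: {value:.2f} fps per $100")
# RTX 4090: 8.76 fps per $100
# RTX 4080: 8.34 fps per $100
```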

Further down the stack, the performance gains were largely a giant shrug at consumers along with a bill for several hundred dollars or pounds. And this latter part is what really concerns me here.

Moore's Law, the shorthand rule that transistor density should roughly double every two years (more transistors mean more performance, but the two don't scale 1:1, so twice as many transistors doesn't make a chip twice as fast), has long since died, despite assurances to the contrary from Intel, Nvidia, and AMD.
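
For a sense of the arithmetic behind that rule, here's a minimal sketch; the starting density is a made-up round number, not any real process node:

```python
# Moore's Law as simple arithmetic: density doubles roughly every two years.
def projected_density(base_mtr_per_mm2: float, years: float) -> float:
    """Transistor density the doubling rule predicts after `years` years."""
    return base_mtr_per_mm2 * 2 ** (years / 2)

# Starting from a hypothetical 100 million transistors per mm^2:
for years in (2, 4, 6):
    print(f"after {years} years: {projected_density(100, years):.0f} MTr/mm^2")
# after 2 years: 200, after 4 years: 400, after 6 years: 800
# — a doubling cadence fabs can no longer sustain
```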

These companies are reaching the physical limit of the performance gains they can get from transistor density. Honestly, that isn't their fault, and I'd defend them on this front if it weren't for the fact that they all keep pretending—and telling consumers—that they've found a way around this problem.

If we have to accept maybe 10-15% gen-on-gen improvements, that's fine, but these companies should level with consumers instead of pretending otherwise. Marketing a 5% improvement as "the fastest ever!!!!" might technically be true, but we all know that kind of marketing trick can sour customers real quick. Better to focus on value-add features and emphasize those rather than overpromise on performance you can't physically achieve anymore.

What will future graphics card generations offer?

My hunch is that this is exactly where things are going to go. 

Nvidia is already very keen on emphasizing DLSS as the solution to this problem, and AMD is likewise touting its own tech. I expect that AMD will introduce more AMD-card-dependent technology in the RDNA 4 generation, rather than continue to be card-agnostic the way FSR is right now. AI will also factor into this now that AMD is starting to ramp up its AI hardware in other areas.

I've also said previously that Nvidia isn't long for the graphics card scene thanks to the absolute motherlode of AI money pouring in, which will likely see Nvidia shift resources in that direction rather than toward the RTX 5000-series GPUs to come. Whatever it does release will likely rely even more heavily on AI tech to boost performance, but that will depend on developer implementation, which has been an issue since this technology kicked off a few years back.

In short, if there's anything that could come to define the next generation of graphics cards, it will probably be much bigger AI integration and advances. How much of a frame rate boost that will give remains to be seen, but it's the only logical move any of these companies can make.

Oh, yeah, and don't count out Intel Arc Battlemage; I suspect that GPU series will end up being a sleeper hit in a couple of years' time.

John Loeffler
Components Editor

John (He/Him) is the Components Editor here at TechRadar and he is also a programmer, gamer, activist, and Brooklyn College alum currently living in Brooklyn, NY.

Named by the CTA as a CES 2020 Media Trailblazer for his science and technology reporting, John specializes in all areas of computer science, including industry news, hardware reviews, PC gaming, as well as general science writing and the social impact of the tech industry.

You can find him online on Bluesky @johnloeffler.bsky.social