With the RTX 4080 on the horizon, should you still buy an older, cheaper GPU?
We're not out of the woods just yet
As we stand on the precipice of a new generation of graphics cards from Nvidia and AMD, a familiar discussion has taken root online – is the previous generation of GPUs still worth buying when something newer, shinier and more powerful is on the horizon?
I can clear that up for you: uh, yeah.
The argument remains the same as it has for the past few generational releases – that it’s pointless to buy a product on the verge of becoming ‘outdated’, and that prices often don’t fall enough to make it worth the money. There is logic to this argument, but it fails to take into account a huge issue: the GPU market has not followed logic for a long time.
At least not in the ways that matter. Yes, in years gone by you could walk down to your local brick-and-mortar computing store and grab a new graphics card with relative ease, be that a brand new model at MSRP or the previous generation it was replacing at a tasty discount. But the last few years have proved that you simply cannot predict the future of a market, GPU or otherwise.
When GPUs like the Nvidia GeForce RTX 2070 and RTX 2080 were first released, plenty of PC gamers and computing enthusiasts believed the insanely high prices couldn't justify an upgrade from the previous generation, and that waiting for the 3000 series to drop was the more sensible option – and then all hell broke loose.
History could repeat itself
The Ampere series of cards like the GeForce RTX 3060 had a divisive price point – some found it reasonable, while others lamented the apparent death of affordable graphics cards – but a terrible cocktail of world issues also made them almost impossible to find.
The crypto market saw a boom in Ethereum, which made the cards very attractive to miners, who snapped them up in bulk for use in cryptomining farms, while the Covid-19 pandemic caused supply chain issues and a chip shortage that affected almost everything in the tech world, from computing components to cars and appliances.
All this scarcity inflated the price of GPUs to wild levels, with the GeForce RTX 3080 hitting an average resale price of almost three times its original MSRP during the height of the Great GPU Shortage. AMD fared a little better, though even Team Red was still blighted by shortages and scalpers.
One difference in how the two companies approached the situation concerned cryptomining, with Nvidia eventually re-releasing updated models of almost its entire Ampere series equipped with anti-mining limiters.
These LHR (low-hashrate) cards weren’t completely uncrackable, but they may have helped dissuade folks from snapping up mountains of RTX 3060s. On the other hand, AMD acknowledged the situation and stated that once a customer buys a GPU, they’re free to do what they like with it.
A big issue, however, is that this scarcity didn’t just affect that generation of graphics cards – it also inflated the price of almost every GPU on the market. The argument for waiting until the next generation of graphics cards was released in order to snap up a cheaper, older model or a fairly priced new release disappeared almost overnight.
It's your money, your requirements and your choice
Will this happen again? It's hard to say.
The shortage was caused by a variety of issues that just happened to occur at the same time, but should Covid-19 cause more supply chain constraints, it's likely that both Lovelace and RDNA 3 GPUs could see their prices skyrocket due to demand. The crypto market could also well recover, given its volatility, so don't assume we’re out of the woods just yet.
I do have a simpler argument in all of this though: it's worth buying a new GPU if it’s worth it to you.
Older GPUs still very much have a place in the market right now. You only have to look at the Steam Hardware Survey to see how many gamers are still using cards that are several generations old at this point, and depending on the games you play, it's likely you don’t actually need an especially powerful graphics card. Most first-person shooters and battle-royale style games intentionally keep their system requirements low to attract more players, for example.
I wrote a piece several months ago off the back of the RTX 3070 Ti getting poor reviews. It was marked down for its price and performance, but the joy it has brought me is beyond value. I won’t feel bad for my choices given the circumstances – graphics cards were harder to find than gold dust. Would I have preferred an RTX 3090? Sure, but did I need one? Absolutely not.
With the cost of living rising in many countries around the world, there is simply no point in upgrading to the ‘next big thing’ just for the sake of doing so, and both Ampere and RDNA 2 GPUs will still be relevant and capable for years to come.
If you see an especially good deal on a cheap graphics card in the coming weeks, don’t let early adopters convince you to wait for Lovelace or RDNA 3 unless you’re happy to do so – we simply don’t know what the market will look like when they launch.
Jess is a former TechRadar Computing writer, where she covered all aspects of Mac and PC hardware, including PC gaming and peripherals. She has been interviewed as an industry expert for the BBC, and while her educational background was in prosthetics and model-making, her true love is in tech and she has built numerous desktop computers over the last 10 years for gaming and content creation. Jess is now a journalist at The Verge.