Elitism can run rife in the PC gaming community. This isn't to say everyone within it has an overly inflated superiority complex, but for what essentially boils down to being a hobby, a loud minority of people can take things a little too seriously when it comes to hardware.
At least, that's my opinion on the matter, and one I hold for things outside of computing too. From cars to phones, I've never seen the need to own 'the best' of something just for the sake of it, especially when my requirements are fairly low. Still, until recently I was rocking a second-hand GeForce GTX 1070 Ti, and as well as it served me, I found it had started to fall short of even my minimum needs (as well as make some rather unsavory noises).
- Check out our guide for where to buy the RTX 3070 Ti
Want vs Need
Upgrading your graphics card is easier said than done these days though. Thanks to a dastardly mix of cryptominers buying up available hardware for farms and the global shortage of silicon, a 'perfect storm' was created that resulted in even the best cheap graphics cards – many several years old – becoming some of the most highly sought-after hardware on the market. It didn't take long for scalpers to jump on board to rapidly inflate prices either, just to really rub some salt into the wounds.
Like many, I had my heart set on grabbing an Nvidia GeForce RTX 3080, highly praised in reviews for striking a great balance between performance and price, and absolutely one of the best graphics cards you're going to find.
While the Nvidia GeForce RTX 3090 was significantly out of my budget even before scalpers had messed around with the market, I additionally dismissed it because I'd feel bad having something that powerful in my rig knowing it was mostly going to be put to work running The Witcher 3 at 1080p or rigging up Vtubing models, as opposed to delivering 8K performance.
As it turns out, I should have set my sights even lower. Even in our own TechRadar review, the Nvidia GeForce RTX 3070 Ti failed to impress, scoring only three out of five stars, with similar commentary behind its low scores across other media sites. Comparing the RTX 3070 Ti vs the RTX 3080, the former was the next step down from the card I coveted, and it didn't seem worth the cash I'd be saving.
In this current market where 'beggars can't be choosers', I never did get my RTX 3080. Instead, I've been lucky enough to try both the Nvidia GeForce RTX 3060 and then the RTX 3070 Ti, and the experience genuinely made me re-evaluate a few things.
The step from using a rickety GTX 1070 Ti to a brand-spanking-new RTX 3060 was a significantly bigger jump than I ever imagined. I'm not naive – I knew it was going to be an improvement – but it really did prove that DLSS is basically witchcraft.
Thanks to the AI upscaling I was able to actually play Cyberpunk 2077 at an enjoyable framerate, and while the raytracing capabilities of the entry-level GPU won't be fighting alongside beefier cards any time soon, I wouldn't shut up about Minecraft RTX for a few weeks.
The RTX 3070 Ti has been critiqued as an unwise investment by many given where it sits between other GPUs, costing 20% more than the RTX 3070 for around an 8% performance boost. It's also only $100 cheaper than the RTX 3080 – assuming you have the luxury of treating that as a small sum of cash – leading to the opinion that you're actually better off buying either of the non-Ti-flavored cards.
Not that those recommended retail prices set by Nvidia actually matter right now. I'm starting to sound like a parrot when I mention how awful the market is at the moment for anyone trying to buy a new graphics card, but the longer the shortage lasts, the more it feels like it's never going to end. Actually getting my hands on my RTX 3070 Ti was a mixed bag of emotions, thanks to some weird sense of guilt that I even had one and the slight disappointment that I hadn't secured the more powerful GPU I wanted.
And then I installed my RTX 3070 Ti into my rig, used it for a few days, and it gave me a fresh perspective on things.
Putting pen to digital paper
I've had experience jumping into digital art through borrowed hardware, and while the RTX 3060 was happy to run software like Photoshop, I was having a less enjoyable time with 3D rendering and sculpting, an old hobby of mine from university. The RTX 3070 Ti has not only allowed me to play AAA games at a great quality level, but also rekindled my love for digital art mediums for the first time in years, making demanding applications accessible to me on a daily basis.
It's hard to shake the 'bigger number means better everything' mentality that PC gaming exudes, but I absolutely love this GPU. Outside of benchmarking hardware for my job, I've started to switch off framerate displays in games, because it became clear after a week that everything I played ran exactly how I needed it to for me to have fun. Everything was maxed out, my FPS (whatever that may be across every title) was buttery smooth, and I could even play around with ray tracing in the games I play that had the feature. Very nice, very shiny.
I don't want to list statistics at you across a few popular titles, because that defeats the point I'm trying to make: chasing insane framerates, or comparing how a card stacks up against other offerings from Nvidia or AMD, shouldn't get in the way of you just having fun. I wasn't having fun when I was using my sickly GTX 1070 Ti, and now I am, and that's all I care about. Now that I've achieved that, why on earth would I need to upgrade to a beefier product?
If I had snubbed the RTX 3070 Ti in favor of waiting for an RTX 3080, not only would I have overpaid for my happiness, but I'd likely still be waiting for stock to become available. If you're lucky enough to run into one of the newly released graphics cards, I'd implore you not to think too hard about the numbers. The increase in performance matters, but probably less than you think when it comes to just having a good time.
Jess is a former TechRadar Computing writer, where she covered all aspects of Mac and PC hardware, including PC gaming and peripherals. She has been interviewed as an industry expert for the BBC, and while her educational background was in prosthetics and model-making, her true love is in tech and she has built numerous desktop computers over the last 10 years for gaming and content creation. Jess is now a journalist at The Verge.