I tested the RTX 5060 - is 8GB of VRAM really enough in 2025?

The Asus Dual OC RTX 5060 being held by a person's left hand.
(Image credit: Future)

It's almost here: Nvidia's RTX 5060 has been a hotly discussed upcoming GPU, with arguments on both sides about pricing, performance, and of course, the much-maligned 8GB of video memory.

Well, I've got my grubby little mitts on one of these new GPUs ahead of its official launch on May 19, and Nvidia was kind enough to let me take it home and slap it into a build. Granted, a lot of details are still embargoed (look out for our full review here at TechRadar in the near future), but Team Green did give me limited permission to talk about some early test results with a pre-agreed pool of games, so here's the scoop.

Setting things up

Before I discuss the game performance, I'll start by laying out the testing parameters. At $299 (other regional prices TBC), the RTX 5060 doesn't represent a generational price increase over the venerable RTX 4060 - which was itself cheaper than the RTX 3060 before it. As such, I wanted to put it in an appropriate system; the rig I used for testing features an AMD Ryzen 5 3600, 16GB of Crucial Ballistix DDR4 RAM, and a midrange X570 motherboard from ASRock.

Nothing fancy here, in other words. The xx60 GPUs from Nvidia have always sat at the budget-to-midrange end of the GPU scale; they've perhaps never been true budget cards, but they aim to offer strong value for money at a sensible price for gamers who can't afford to splash thousands on a high-end card. It's worth noting that there's no Founders Edition model of the RTX 5060, so the exact card I'm using here is the Asus Dual OC RTX 5060 8GB.

The Asus Dual OC RTX 5060 inside a PC case.

The RTX 5060 can fit snugly inside most PC cases, and only requires a standard 8-pin power connection - ideal for gamers looking to upgrade from an older GPU. (Image credit: Future)

Next up, the actual game settings. I wanted to really put this card through its paces with a broad variety of tests, but Nvidia was quite insistent about the use of its shiny new AI features. As such, DLSS 4's resolution upscaling is set to the 'Quality' preset at 1080p output, and Multi Frame Generation (MFG) is used for all of the tests - yes, I have thoughts on this, and you can read them further down. All of the following games were tested at their respective maximum in-game graphical presets at 1080p.
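To put that 'Quality' preset in context: DLSS doesn't render the game at your output resolution; it renders at a lower internal resolution and then upscales with AI. Here's a minimal sketch of the arithmetic, using the commonly cited per-axis scale factors for the standard presets (treat the exact ratios as approximate, as they can vary by game and DLSS version):

```python
# Rough sketch of DLSS upscaling arithmetic. Scale factors are the
# commonly cited per-axis ratios for each preset; treat as approximate.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Estimate the resolution the GPU actually renders before upscaling."""
    scale = DLSS_SCALE[preset]
    return round(out_w * scale), round(out_h * scale)

# 1080p output with the Quality preset, as used in these tests:
print(internal_resolution(1920, 1080, "Quality"))  # (1280, 720)
```

In other words, at the Quality preset the card is really rendering at around 720p internally, which goes some way towards explaining the healthy framerates below.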

Delivering some Doom

The first game on the testing rig was Doom: The Dark Ages. While its predecessor Doom Eternal earned a positive reputation for running well on lower-end hardware, this new techno-medieval romp is a little more demanding. Still, at the 'Ultra Nightmare' graphics preset with DLSS 4 and MFG enabled, I was getting crisp framerates of more than 220fps, with lows never dropping below 200. The Dark Ages absolutely slaps, by the way.

It looks fantastic in motion, too. DLSS has come a long way since its public release more than six years ago, and old complaints about visual tearing and glitching simply don't apply here. The frame generation works great too; even in fast-paced arena battles against the hordes of Hell, I couldn't spot any visual issues, and with the graphics cranked up to maximum, everything looked very pretty (when it wasn't covered in blood and viscera, anyway).

A screenshot of the rideable dragon in Doom The Dark Ages

It should surprise absolutely nobody that Doom: The Dark Ages is very much My Kind Of Game. (Image credit: id Software/Bethesda)

The same was true of Marvel Rivals, which saw framerates comfortably sitting around the 250-260fps mark, with some dips (though never below 200) during particularly intense firefights. Again, the game looked great, with no readily discernible visual issues. I did experience two crashes across about two hours of playtime, but we can reasonably put that down to early driver instability; this is technically an unreleased GPU running on beta drivers, after all.

The perils of generating your own frames

All looking very good so far, then? After all, the majority of PC gamers with new Nvidia GPUs are using DLSS or frame-gen in some capacity. It's free framerate; why wouldn't you? But there are some important caveats surrounding upscaling and frame generation, and those caveats did rear their heads in my testing.

Running the built-in benchmark for Cyberpunk 2077 - in its ridiculously demanding RT Overdrive preset, no less - returned a perfectly good average of 121fps, but once I actually jumped into the game, I hit occasional stuttering during high-action sequences like car chases and intense gunfights.

It was never unplayable, to be clear. Visual fidelity and clarity remained very good with the DLSS Quality preset, with no obvious tearing or artifacting. We're talking about very brief, very occasional drops into the 30-40fps range, quite likely caused by a combination of the heavy VRAM demands of Cyberpunk's RT Overdrive mode (on an 8GB card) and the key drawback of frame generation: it's only ever as effective as the base rendering rate.

DLSS vs native performance figures with Frame Generation

DLSS and Multi Frame Generation are something of a revolution, but they're not entirely without drawbacks. (Image credit: Nvidia)

See, when MFG runs at maximum capacity (exclusively on RTX 50-series GPUs), it offers up to 4x frame generation. In other words, for every individual frame rendered directly by the GPU, you get three AI-generated frames ('fake frames', as some online detractors have dubbed them), which are then inserted between the rendered frames to multiply your final framerate.

The problem is that the result is directly tied to the original rendered framerate. If you've already got a healthy 60fps-plus, then great: you're getting an extra 180 frames per second for free. But if your framerate is bottoming out in the single digits, no amount of MFG will produce enough extra frames to salvage the game's playability. Thankfully, Cyberpunk never quite got that bad in my 1080p testing, but it makes me wonder how effective the RTX 5060 will really be when it comes to 1440p gaming.
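To put concrete numbers on that base-rate dependency, here's a quick back-of-the-envelope sketch (my own illustration, not Nvidia's figures; frame pacing and generation overheads are ignored for simplicity):

```python
# Back-of-the-envelope: frame generation multiplies the rendered framerate,
# so a weak base framerate stays weak. Overheads and pacing are ignored.
def mfg_output_fps(base_fps: float, multiplier: int = 4) -> float:
    """Total displayed fps: each rendered frame plus (multiplier - 1) generated frames."""
    return base_fps * multiplier

for base in (60, 30, 8):
    out = mfg_output_fps(base)
    print(f"{base:>2} rendered fps -> {out:>3.0f} displayed fps ({out - base:.0f} AI-generated)")

# 60 rendered fps -> 240 displayed fps (180 AI-generated)
# 30 rendered fps -> 120 displayed fps (90 AI-generated)
#  8 rendered fps ->  32 displayed fps (24 AI-generated)
```

Note that input latency remains tied to the rendered frames, which is why a single-digit base framerate can't really be 'fixed' by frame generation: an 8fps base still responds like 8fps, no matter how smooth the output looks.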

Override it out

There's one other major caveat to consider here, and that's developer support for DLSS and MFG. Developers need to build support for Nvidia's nifty performance-boosting features into their games, and while Doom: The Dark Ages, Marvel Rivals, and Cyberpunk all include that support, the last two games I tested didn't feature integrated MFG functionality.

Fortunately, Nvidia has a workaround for this: DLSS Override, which lets users force-activate both resolution upscaling and frame generation on a driver level, effectively bypassing the need for dev implementation. Less fortunately, the results of this tool are... mixed, to say the least.

A screenshot of the city of Paradis in Avowed.

Avowed looks fantastic in still screenshots, but in motion with DLSS Override things weren't quite as clean. (Image credit: Obsidian / Xbox Game Studios)

Hogwarts Legacy, with MFG enabled via the Override setting in the Nvidia App, was getting a pretty consistent 170-180fps. Visual fidelity was solid, with only a very small amount of tearing here and there, mostly around bright particle effects. It was certainly playable; I'd say these extremely minor graphical issues were worth the massively boosted framerate. With DLSS Quality and the Ultra graphics preset, it still looked great at 1080p.

Avowed, on the other hand, did not seem to enjoy being overridden nearly as much. While MFG and DLSS 4 granted it an excellent average of 189fps, the bright and colorful scenery of the Living Lands was rife with small but noticeable blurring and artifacting, especially in busy environments like the port city of Paradis.

Closing thoughts

This was a case where I felt DLSS Override wasn't really worth it; it'll vary a lot from game to game, of course, and the fact that Override exists and functions at all is still hugely impressive to me. But Avowed had a little too much visual jank for my liking with Override's frame-gen running, to the point where I'd rather just drop the graphical settings down a notch or two and accept a lower framerate in exchange for more consistent visuals.

A screenshot from Hogwarts Legacy

Hogwarts Legacy ran reasonably well using DLSS Override to 'force' frame generation, but that's not the case for every game that lacks native MFG support. (Image credit: WB Games)

Still, at the asking price, the RTX 5060 is a definite improvement over the RTX 4060, thanks in no small part to the introduction of MFG - and while that will only reliably help your performance in supported games, more and more devs are now working to include MFG functionality in their titles.

As for that 8GB question... honestly, I still don't think it's enough, especially considering that 1440p is becoming a more popular resolution for PC gamers. If we don't get a desktop RTX 5050, this card can be considered the bottom rung of Nvidia's Blackwell desktop GPU stack, so it's not entirely unreasonable for it to remain a 1080p-focused card, but this is 2025: it's time we step up our VRAM game a bit, folks.

Christian Guyton
Editor, Computing

Christian is TechRadar’s UK-based Computing Editor. He came to us from Maximum PC magazine, where he fell in love with computer hardware and building PCs. He was a regular fixture amongst our freelance review team before making the jump to TechRadar, and can usually be found drooling over the latest high-end graphics card or gaming laptop before looking at his bank account balance and crying.

Christian is a keen campaigner for LGBTQ+ rights and the owner of a charming rescue dog named Lucy, having adopted her after he beat cancer in 2021. She keeps him fit and healthy through a combination of face-licking and long walks, and only occasionally barks at him to demand treats when he’s trying to work from home.
