After five years, ray tracing still isn't worth the bad performance

Real-time ray tracing in Cyberpunk 2077
(Image credit: CD Projekt RED)

Ray tracing has been mainstream for over five years now, having originally debuted with the Nvidia RTX 20-series (Turing) graphics cards in 2018. Two successor GPU generations, the RTX 30-series (Ampere) and 40-series (Ada Lovelace), have iterated on the technology, and it has even found its way to the Xbox Series X and PS5.

The goal of this real-time rendering technique is to deliver a more realistic and immersive approach to in-game lighting, reflections, and shadows. There’s no doubting that it’s impressive when it works, but the performance of modern games using ray tracing, on both consoles and PC, means that it’s just not there yet for the vast majority of us.

Ray tracing works by simulating the path of light as it bounces from sources off objects in a scene. It remains an incredibly hardware-intensive task even for high-end gaming PCs and the likes of the Xbox Series X and PS5, because each frame takes far longer to render in real time given the sheer number of rays being cast and calculated simultaneously. While PC hardware has become more capable of handling the tech at target resolutions such as 1440p and 4K (2160p), the same cannot be said for current-gen consoles, which are feeling the strain less than three years into their lifespans.
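For a sense of why the workload balloons, here’s a minimal back-of-the-envelope sketch in Python. It’s a hypothetical illustration rather than how any particular engine counts its rays, but it shows how the cost grows multiplicatively with resolution, rays per pixel, and bounce depth.

# A minimal back-of-the-envelope sketch (not a real renderer): it only
# illustrates how the workload scales with resolution, rays per pixel,
# and the number of bounces each ray is allowed. All figures here are
# illustrative assumptions, not measurements from any game or GPU.

def rays_per_frame(width: int, height: int, rays_per_pixel: int, bounces: int) -> int:
    """Rough upper bound on ray-scene intersection queries for one frame."""
    primary_rays = width * height * rays_per_pixel
    # Every bounce can spawn follow-up rays for reflections, shadows, etc.
    return primary_rays * (1 + bounces)

# Example: 4K output, 1 ray per pixel, 2 bounces.
per_frame = rays_per_frame(3840, 2160, 1, 2)
print(f"{per_frame:,} rays per frame")                  # ~24.9 million
print(f"{per_frame * 60:,} rays per second at 60fps")   # ~1.5 billion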

The key difference between the ray tracing performance available on console and on some of the best graphics cards from AMD and Nvidia is their implementation of upscaling. In Nvidia’s case, DLSS 3 uses the onboard Tensor cores (AI accelerators) to render each frame at a lower internal resolution and then reconstruct it at the target resolution, with artificial intelligence filling in the blanks, while Frame Generation inserts additional AI-generated frames in between. In contrast, AMD’s FidelityFX Super Resolution (and its in-driver counterpart, Radeon Super Resolution) is less hardware-bound and open source, meaning it works on a wider array of GPUs. By upscaling from a lower resolution, the likes of the RTX 4080 and the Radeon RX 7900 XTX largely uphold their end on ray tracing, but the performance drops are still noticeable, making the sacrifice difficult to defend.
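To put that in concrete terms, here’s a short Python sketch of the arithmetic behind those upscalers: the GPU shades a much smaller internal frame and reconstructs the output from it. The per-axis scale factors are the commonly cited values for DLSS’s quality modes (FSR uses similar ratios), so treat the output as an approximation rather than a benchmark.

# A rough sketch of what upscaling buys you. The per-axis scale factors are
# the commonly cited values for DLSS's quality modes (FSR uses similar
# ratios); treat them as approximations, not exact figures for any one game.

SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution the GPU actually shades before upscaling."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    shaded = (w * h) / (3840 * 2160)  # fraction of native 4K pixels shaded
    print(f"{mode:>17}: {w}x{h} internal (~{shaded:.0%} of native pixel count)")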

Ray tracing on console 

Ray tracing in Star Wars Jedi: Survivor

(Image credit: EA / Respawn)

While some of the best RTX games can look incredible on console, the performance leaves too much to be desired. Look at recently released games such as the Dead Space remake, Star Wars Jedi: Survivor, and Elden Ring, whose patch 1.09 brought the tech to FromSoftware’s open-world adventure on both systems. You’re looking at a maximum of 30fps in all three games, with resolutions ranging from sub-1080p to 1440p, falling well short of the claimed 4K 120fps output the consoles are technically capable of. Ray tracing looks great, but the trade-off is lower image quality and juddery framerates.

It’s not unique to newer releases either. Substantial hits to framerate and resolution have persisted since both machines debuted. You only need to cast your mind back to the likes of Control and Cyberpunk 2077 (patch 1.61), which run at 1440p and 30fps, the standard for the vast majority of ray-traced titles available on both consoles. At a time when 30fps just isn’t good enough for Xbox Series X and PS5, ray tracing seems more like a compromised afterthought than a must-have feature.

Ray tracing on PC  

Ray tracing in Miles Morales

(Image credit: Insomniac Games)

With console performance like this, it’s hard to argue against the idea that ray tracing works best with the vastly more powerful hardware of current-generation graphics cards. However, games are far from perfect even on the latest GPUs from AMD and Nvidia. The Nvidia RTX 4080 is a $1,199 / £1,080 / AU$1,740 graphics card armed with 16GB of GDDR6X VRAM and 9,728 CUDA cores. It’s tremendously powerful, but there’s still a noticeable hit when playing some of the best PC games with ray tracing enabled.

Marvel’s Spider-Man: Miles Morales, originally a PS5 launch title, received a PC port in late 2022. The latest high-end Ada graphics card can play the game at 4K averaging 120fps with ray tracing disabled, but the framerate halves to around 60fps when enabling the likes of real-time reflections and shadows. Cyberpunk 2077, the poster child for ray tracing and widely considered the benchmark touted by Nvidia, remains inconsistent three years after release, as the game’s optimization is far from stellar. With DLSS pushed to Performance mode, the title hovers around 60fps (via Gaming Bench). You’re looking at a roughly 50% reduction in framerate, and a heavy reliance on AI upscaling, to achieve playable framerates at the 4K target resolution.

Now, the vast majority of PC gamers don’t have machines running such a high-end video card. In fact, according to the most recent Steam hardware survey, the most widely used ray tracing-capable video card is actually the RTX 3060, the mainstream, mid-range Ampere GPU that launched back in early 2021. Starting at $329 / £300 / AU$499, it undercuts the PS5 and Xbox Series X on price, and it’s geared toward 1080p and 1440p. The performance hit here tracks with the roughly 50% impact of ray tracing seen on both consoles and high-end video cards. You only need to look as far as Cyberpunk 2077 and Marvel’s Spider-Man: Miles Morales, which average around 60fps at 1080p on the most-used mainstream GPU without ray tracing, then hover around the 30-40fps mark at that same resolution with it enabled (via ShadowSeven on YouTube).

Ray tracing looks incredible. However, it remains a sacrifice for playable framerates; something has to give, whether that’s playing at a lower resolution, halving the framerate, or relying extensively on upscaling. The compromise may be worth it on console and PC if you’re okay with 30fps (and even lower in some instances), but to maintain 60fps at 1440p and above there’s a serious cost involved, one that still makes ray tracing hard to justify and difficult to recommend.

Whether you use ray tracing or not, you'll get the best experience with the best monitors for PS5, the best monitors for Xbox Series X, and the best gaming monitors for PC. 

Aleksha McLoughlin
Contributor

Aleksha McLoughlin is an experienced hardware writer. She was previously the Hardware Editor for TechRadar Gaming until September 2023. During this time, she looked after buying guides and wrote hardware reviews, news, and features. She has also contributed hardware content to the likes of PC Gamer, Trusted Reviews, Dexerto, Expert Reviews, and Android Central. When she isn't working, you'll often find her in mosh pits at metal gigs and festivals or listening to whatever new black and death metal has debuted that week.