Intel’s XeSS frame rate boosting tech – the rival to Nvidia DLSS and AMD FSR – has been put through its paces and proved a sterling performer, which must come as something of a relief for Team Blue after a rough ride with its Arc graphics cards lately.
Digital Foundry’s testing was a thorough affair, using a tailored build of Shadow of the Tomb Raider that supports XeSS, and it uncovered some huge frame rate jumps in certain scenarios, with more modest but still worthwhile boosts in others.
For example, an 88% increase in frame rate was found at 4K resolution with XeSS ‘performance’ mode – the setting that favors frames per second (fps) over image quality. That means compared to the game running at native 4K, the upscaled 4K with XeSS was almost twice as fast, a blistering increase (with the caveat that using performance mode, the graphics quality is obviously notched down compared to native 4K).
Using ‘quality’ mode, though, which aims to keep the image quality at a similar level to native 4K, there was still a 47% increase in the frame rate, which is very impressive. Even ‘ultra quality’, which pushes hardest for a near-native 4K picture, still witnessed a 23% performance uplift, which is well worth having.
At 1440p less benefit was recorded – not surprising seeing as 4K is obviously far more stressful on the graphics card – but performance mode still ushered in a 52% frame rate boost, which is pretty nifty. In quality mode, a 26% uptick in fps was observed.
So what about the other facet of XeSS, the image quality achieved versus native? Well, Digital Foundry found that Intel’s upscaling tech performed admirably, and indeed was a worthy match – give or take – for Nvidia DLSS (with both running in quality mode, of course).
The odd glitch was noted in certain modes when examining static images, such as shimmering artifacts, but hopefully Intel will iron out these kinds of small flaws before too long. DLSS can exhibit similar issues, but Digital Foundry clearly showed they were more prevalent with XeSS – and when both suffered, DLSS did so to a noticeably lesser extent.
When the game was in motion, mind, XeSS offered some impressive clarity on a par with DLSS, and in fast-moving scenes – which can be hard for upscaling tech to track – XeSS also performed well. That’s in marked contrast to FSR 2.0, which struggled more with motion compared to DLSS.
Analysis: The best of both DLSS and FSR rolled into one?
The upshot is that this is a real achievement for Intel with its first run at XeSS. Just like DLSS, it’s a temporal upscaling solution that uses AI for refinement (whereas FSR 2.0 does not employ those machine learning chops – although that could change in the future, if the rumors are right). Given that, we hoped to see similar results to DLSS, but it was far from certain that Intel could pull this off. However, from this first in-depth look at XeSS, it seems Intel has done just that, which is great news.
Particularly as the big advantage of XeSS is that it works not just with Intel’s Arc graphics cards, but also with the firm’s integrated graphics and with rival GPUs from AMD and Nvidia. That’s because Intel has taken the laudable route of making XeSS an open standard (as AMD did with FSR, commendably for both firms – unlike Nvidia, whose proprietary DLSS works only with its own GPUs).
There are caveats about support for other graphics cards, of course: it’s only relevant to more modern GPUs (those that support HLSL Shader Model 6), and there are some downsides elsewhere. Namely, the results aren’t quite up there quality-wise with using an Arc graphics card, and Digital Foundry showed an example of an RTX 3070 exhibiting slightly more sluggish frame times with XeSS (though the impact isn’t huge in this respect, by any means). Certainly, even with some drawbacks, it’s still great to have a frame rate boosting option for those with non-RTX Nvidia graphics cards (or indeed AMD GPUs).
The long and short of it is, then, that Intel XeSS already looks to be a worthy rival for DLSS, and with further honing – and given that it works across a much wider range of GPUs – it might just be Intel’s secret weapon. We’ll need to see how it performs in other games, naturally, and we’d also like to see those Arc A7 graphics cards hitting the shelves soon – not to mention Team Blue redoubling its efforts to nail the Arc graphics drivers at a faster pace.
Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - 'I Know What You Did Last Supper' - was published by Hachette UK in 2013).