Intel’s DLSS competitor, XeSS, appears to be a hit

Digital Foundry published an in-depth look at the upscaling technology that will be included with Intel’s upcoming Arc GPUs and compared its performance to Nvidia’s offering, Deep Learning Super Sampling (DLSS). Based on the tests they’ve run so far, Intel’s Xe Super Sampling, or XeSS for short, seems to be doing a reasonable job of holding its own against more mature technologies – although it’s worth noting that Digital Foundry only tested with Intel’s highest-end board, the Arc A770, and mostly in a single game, Shadow of the Tomb Raider.

The idea behind XeSS and similar technologies is to run a game at a lower resolution and then use machine learning algorithms to upscale the result so it looks much better than more basic scaling methods would allow. In practice, this lets you run games at higher frame rates or enable demanding effects like ray tracing without giving up a huge amount of performance: your GPU actually renders fewer pixels, then augments the resulting image, often using dedicated hardware. For example, according to Digital Foundry, on a 1080p screen XeSS will render the game at 960 x 540 in its highest-performance (i.e., highest-FPS) mode and at 720p in its “Quality” mode before upscaling the image to your monitor’s native resolution. If you want more details on exactly how it works, I recommend checking out Digital Foundry’s full analysis, which is published on Eurogamer.
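To get a sense of how much work those internal resolutions save the GPU, here’s a quick sketch of the pixel math. The mode names and resolutions come from the article; everything else is plain arithmetic.

```python
# Sketch: internal render resolutions Digital Foundry reports for XeSS
# on a 1080p display, and what fraction of native pixels each mode
# actually renders. Resolutions are from the article; the rest is
# just arithmetic.
NATIVE = (1920, 1080)
MODES = {
    "Performance": (960, 540),   # highest-FPS mode, per the article
    "Quality": (1280, 720),      # the "720p" Quality mode, per the article
}

def pixel_fraction(res, native=NATIVE):
    """Fraction of the native pixel count the GPU actually renders."""
    return (res[0] * res[1]) / (native[0] * native[1])

for mode, res in MODES.items():
    print(f"{mode}: {res[0]}x{res[1]} -> {pixel_fraction(res):.1%} of native pixels")
```

The Performance mode renders exactly a quarter of the pixels of native 1080p (half the width, half the height), which is where most of the frame-rate headroom comes from.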

In this case, Nvidia’s DLSS (left) and Intel’s XeSS (center) did a better job of retaining detail in these fishnets than the game’s built-in anti-aliasing (right).
Image: Digital Foundry

In Digital Foundry’s testing, XeSS performed this task quite well when running on the Arc A770 (the technology will also be usable on other non-Arc graphics cards, including those built into Intel processors and even Nvidia cards). It delivered a solid boost in frame rates over running the game at native 1080p or 4K, and there wasn’t the huge drop in quality you’d expect to see without any sort of upscaling. Put side by side with the results from Nvidia’s DLSS, which is more or less the gold standard for AI-powered upscaling at this point, XeSS retained a comparable amount of sharpness and detail in many areas, such as foliage, character models, and backgrounds.

Digital Foundry found that XeSS added two to four milliseconds to frame times, i.e., how long each image stays on screen before it’s replaced by the next. This could, in theory, make the game feel less responsive, but in practice the higher frame rate helps even things out.
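The trade-off is easy to see with some back-of-the-envelope arithmetic. The 4 ms overhead below is the top of the range Digital Foundry measured; the render times are hypothetical numbers chosen only to illustrate the relationship between frame time and FPS.

```python
# Sketch of the frame-time / FPS trade-off described above.
# The 4 ms XeSS overhead comes from Digital Foundry's reported range;
# the render times (25 ms native, 12 ms at the lower internal
# resolution) are hypothetical, for illustration only.
def fps(frame_time_ms: float) -> float:
    """Convert a per-frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

native_ms = 25.0            # hypothetical: native-resolution render time
upscaled_render_ms = 12.0   # hypothetical: render time at the lower resolution
xess_overhead_ms = 4.0      # upper end of the reported 2-4 ms cost

total_ms = upscaled_render_ms + xess_overhead_ms
print(f"Native: {fps(native_ms):.1f} FPS")
print(f"XeSS:   {fps(total_ms):.1f} FPS (incl. {xess_overhead_ms} ms overhead)")
```

In this hypothetical case, even after paying the 4 ms upscaling cost, the total frame time is still well below the native one, so the game both renders and responds faster overall.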

That said, XeSS had a few issues that were either absent or noticeably less severe with DLSS. Intel’s technology particularly struggled with fine detail, sometimes showing shimmering moiré patterns or banding. These kinds of artifacts could definitely be distracting depending on where they showed up, and they got worse as Digital Foundry pushed the system, asking it to upscale lower and lower resolution images to 1080p or 4K (something you might have to do with particularly demanding games). Nvidia’s technology wasn’t entirely immune to these issues, especially in modes that prioritize performance over image quality, but they certainly seemed less prevalent. XeSS also added very noticeable jitter to water and mild ghosting to some models in motion.

GIF showing the same scene rendered using three different technologies. In Intel’s version, a shimmering pattern appears on a man’s shirt that isn’t present in the other versions.

This type of motion issue might not be easy to spot in stills, but if you’re actually playing a game, it’ll likely stick out like a sore thumb. Intel’s version is in the middle, Nvidia’s is on the left, and an unscaled version is on the right.
GIF: Digital Foundry

Intel also struggled to keep up with Nvidia in a few specific areas – most notably, DLSS handled Lara Croft’s hair noticeably better than XeSS did. That said, there were a couple of cases where the XeSS results looked better to me, so your mileage may vary.

XeSS is obviously still in its infancy, and the Arc GPUs it will primarily support are only just starting to come out, so it’s hard to say how it will perform on Intel’s lower-end desktop cards or the laptop GPUs that have been around for a few months. It’s also worth noting that, as with DLSS, XeSS won’t work with all games: so far, Intel’s site lists only 14 compatible games, compared to the roughly 200 titles that DLSS works with (though the company says it’s working with “many game studios,” including Codemasters and Ubisoft, to get the technology into more games).

Still, it’s nice to have at least a taste of how it will work and to know that it is, at the very least, competent. Without naming names, other early attempts at this kind of technology didn’t hold up against DLSS nearly as well as XeSS does. While we still don’t know whether Intel’s GPUs will actually be any good (especially compared to the upcoming RTX 40 cards from Nvidia and RDNA 3 cards from AMD, which has its own upscaling technology called FSR), it’s good to know that at least one aspect of them is a success. And if Intel’s cards end up being underpowered for gaming, XeSS may actually be able to help with that; small wins, really.
