HDMI on Xbox Series X: what you need to know

Xbox Series X
(Image credit: Future)

The Xbox Series X is now available to buy, offering a big step up in capability over previous home consoles. 

Unfortunately, stock issues are still preventing many people from finding an Xbox Series X (or Xbox Series S). But whether you’ve been lucky enough to track one down yet or not, it’s worth knowing that the new Xbox Series X’s HDMI capabilities – and its TV capabilities – are different to what you’d expect from a previous-gen console. 

Both the Xbox Series X and Series S come with an HDMI 2.1 connection as standard. It's worth noting, though, that while the Xbox Series X ships with a cable that supports HDMI 2.1, the Xbox Series S only comes with an HDMI 2.0 cable – which we'll look at in more detail below.

This allows the consoles to support a maximum of 120fps (frames per second) in games running at 4K resolution, which translates to a huge leap forward in gameplay, speed and performance. For comparison, the Xbox One X could only hit 4K/60fps, meaning the bar has been raised significantly for this console generation.

Not only does HDMI 2.1 allow for much better frame rates, it also means the console can output 4K and even 8K video to a compatible TV. The biggest caveat here, however, is that 8K gameplay will be limited to 60fps – although you likely won't be playing any 8K games for a while.

There’s a lot to get excited about when it comes to HDMI connections and the Xbox Series X, as well as a lot to learn, which is why we’ve created this guide. 

Xbox Series X/S ports and cables

Microsoft's Xbox Series X comes with an HDMI 2.1 port at the back of the console and a matching cable in the box. 

The Series S, on the other hand, comes with an HDMI 2.1 port but an HDMI 2.0 cable ("high speed" rather than "ultra high speed"). This means that the cheaper console can play games in 4K, but it won't be able to play games at 4K/120fps with its included cable.

Now, it's important to note that even if your TV doesn't support an HDMI 2.1 connection, you can still plug the console into a standard HDMI port on a TV or monitor – just with more picture limitations.

This all comes down to the differences between these two port/cable standards. The HDMI 2.1 standard supports a higher bandwidth of data transfer, at 48Gbps (gigabits per second) compared to the 18Gbps supported by HDMI 2.0b.

More bandwidth means more data, which means support for higher resolutions and faster frame rates.
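
To put those bandwidth figures in context, here's a rough back-of-the-envelope sketch in Python. It counts only the active pixel data for each video mode (a real HDMI signal also carries blanking intervals, audio and other overhead, so the true requirements are a little higher), and the effective-throughput numbers are approximations based on each standard's encoding overhead rather than official spec values.

```python
# Rough, uncompressed video bandwidth estimates - active pixels only,
# ignoring blanking, audio and protocol overhead.

GBIT = 1_000_000_000

# Approximate usable throughput after encoding overhead:
# HDMI 2.0 carries 18Gbps raw with 8b/10b encoding,
# HDMI 2.1 carries 48Gbps raw with leaner 16b/18b encoding.
HDMI_2_0_EFFECTIVE = 18 * GBIT * 8 / 10    # ~14.4 Gbps
HDMI_2_1_EFFECTIVE = 48 * GBIT * 16 / 18   # ~42.7 Gbps

def video_rate(width, height, fps, bits_per_pixel):
    """Approximate data rate of an uncompressed video mode."""
    return width * height * fps * bits_per_pixel

modes = {
    "1080p/120 (8-bit)": video_rate(1920, 1080, 120, 24),
    "4K/60 (8-bit)":     video_rate(3840, 2160, 60, 24),
    "4K/120 (10-bit)":   video_rate(3840, 2160, 120, 30),
    "8K/60 (10-bit)":    video_rate(7680, 4320, 60, 30),
}

for name, rate in modes.items():
    fits_20 = "yes" if rate <= HDMI_2_0_EFFECTIVE else "no"
    fits_21 = "yes" if rate <= HDMI_2_1_EFFECTIVE else "no"
    print(f"{name:>18}: ~{rate / GBIT:4.1f} Gbps | fits HDMI 2.0: {fits_20} | fits HDMI 2.1: {fits_21}")

# 4K/120 works out at roughly 30Gbps before overheads - far beyond what
# HDMI 2.0 can carry, but comfortably within HDMI 2.1's budget. 8K/60
# overshoots even 48Gbps uncompressed, which is why it leans on chroma
# subsampling or Display Stream Compression in practice.
```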

HDMI 2.1 vs HDMI 2.0

Put simply, different HDMI standards can carry different amounts of data, and it's that bandwidth which determines the video and audio quality the cable can deliver to your screen.

The difference between HDMI 2.0 and 2.1 will be noticeable to anyone with a discerning eye. An Xbox Series X game running at 4K/120fps looks and plays magnificently – the kind of performance a lot of PC gamers have enjoyed for a while now.

It's hard to convey just how different it looks until you actually see it. If you're instead playing an Xbox Series X game through a TV's HDMI 2.0b port, though, you'll find that while you can play in 4K, you won't be able to do so at 120fps.

On the Series S, you'll be playing games at 1440p (also known as QHD), and it can hit 120fps at that resolution. If your TV does support 4K, however, you can still watch films and other media in 4K, and the Series S will upscale games to 4K where compatible.

To put this leap in quality into perspective, the previous HDMI 2.0b standard used on most modern TVs does allow up to 4K video at 60 frames per second, but to hit higher frame rates like 120fps you have to drop the resolution down to 1440p or Full HD.

HDMI 2.1 removes this barrier, meaning you can play in Ultra HD quality while still having silky-smooth frame rates on your games.

Gears Tactics

(Image credit: Microsoft)

TV HDMI support

The capabilities of your TV are crucial here too. If your TV doesn't support 4K resolutions, for example, you won't be able to play next-generation games at 4K. Most 4K TVs tend to come with three or four HDMI 2.0 ports as standard, whereas newer models are starting to ship with anywhere between one and four HDMI 2.1 ports – it really varies between TV brands, so check that any set you're considering has the inputs you need for your intended level of play.

Even HD TVs, though, tend to come with HDMI 2.0 ports.

Frame rates vary between televisions, too. Most TVs come with a 60Hz or 120Hz panel (enabling 60fps or 120fps play respectively), though some cheaper models make do with a 50Hz panel – somewhat limiting for Xbox Series X games such as The Falconeer, which can hit 120fps at 4K if your TV supports it.

It's important to note that while the Series X uses HDMI 2.1 as its connection, that doesn't mean your TV also needs to support HDMI 2.1. You can play over an HDMI 2.0 port and cable, just with a 4K/60fps performance ceiling.

8K may be on your mind too. Any 8K TV will almost certainly ship with the newer HDMI port – but 8K sets are still expensive, and 8K gaming isn't quite on the horizon yet.

A good example to use here is the difference between Monster Hunter: World on Xbox One and Xbox Series X. In this example, performance has nothing to do with the HDMI cable – it comes down to the console's power and your TV's refresh rate.

Using a Full HD TV with a 60Hz refresh rate, you can play Monster Hunter: World at around 60fps on Xbox Series X through the console's backwards compatibility.

If you play it on the same TV using a previous-gen Xbox One, you won't hit 60fps, because that console simply isn't powerful enough. In this case, the HDMI port and cable don't matter.

In general, however, if you want the best results from your shiny new next-gen console, you'll want to have a TV which also supports the HDMI 2.1 connection so you can enjoy 4K gaming at 120fps.

You might also want to check out the capabilities of HDMI on PS5.