While 2015 has seen 4K Ultra HD TVs launch into mainstream consumer-land, the people making the content have been focusing more on the future roll-out of HDR, or 'high dynamic range'.
So what do we mean by HDR? The DR bit, or dynamic range, of a TV describes the contrast ratio the device is capable of displaying – how bright are the whites and (also importantly) how black are the blacks.
If HDR is done well, we get a dramatically better picture than today's TV – not just much brighter but also with much clearer detail.
For the purposes of the TV business we use an average, everyday TV for reference and call that SDR or 'standard dynamic range'. This would be a TV that can display something in the region of 200-300 nits.
'Nits' is shorthand for brightness measured in 'candelas per square metre' and also scans better than 'cd/m2'.
That reference screen will have a contrast ratio of... well, that depends on who you ask and how they measured it, but a good estimate would be something like 1,000:1.
Future HDR services will give you a very much brighter screen and a vastly wider contrast ratio. Exactly what these figures will be is still a hot topic, but early consumer tests suggest that, all being well, the general public are in for a treat. We're looking at an increase in picture quality which is far more impressive than either the move from standard definition to high-def or from high-def to 4K.
Eye of the beholder
The human visual system (HVS) itself is very complex, and still not fully understood, but evolution has driven the HVS to get a lot of important visual cues from contrast. Feed the HVS a better contrast ratio and it significantly increases the feel of 'being there'.
Now, HDR is likely to arrive at the same time as a much wider colour gamut (WCG) and a move to 10-bit (1,024 rather than the current 256 sampling steps) delivery to the home. These three enhancements (HDR, WCG and 10-bit) are collectively known as 'HDR+'.
Taking these three new factors together, we have a whole new level of TV quality coming.
Manufacturers will make the most incredible claims when it comes to their HDR capabilities. As always, caveat emptor. However, to give some idea of what we, the people behind the content, think HDR should be, let's take one of the many competing ideas for HDR TV - Dolby Vision. It's a proprietary scheme that Dolby is currently pushing. Indeed the new Star Wars film, The Force Awakens, has just been announced as the latest Dolby Vision compatible movie.
Dolby's lab built a liquid-cooled display capable of showing tens of thousands of nits. In testing, its research suggests that viewers are most happy with a display capable of something like 20,000 nits.
For reference, a 100 watt incandescent lightbulb pumps out around 18,000 nits. Clearly a full image at such brightness would be severely uncomfortable but maybe a pinpoint reflection from the curves on a shiny new, bulletproof Aston Martin, for example, would be something to behold.
Too much power
These kinds of brightness levels are not practical, however; the EU eco-police, for example, wouldn't be too happy about the power requirements of such devices. So let's move out of the lab and back to everyday life.
Based on my own experience and that of acquaintances within the industry, and factoring in power consumption and realistic manufacturing potential, my non-scientific estimate is that to call itself bright enough for HDR, a display needs to be around 1,500-2,500 nits.
Such a display will look great and not cause power station outages every time a popular live event is about to start. That also looks like a good range of numbers to combine with a wider colour gamut.
Some manufacturers are quoting around 1,000 nits at the top-end thus far. So, still a work in progress then.
Then we must think about the contrast ratio. It's all well and good having lots of nits, but just making an image brighter is not good enough; we also need deep, dark, 'true' blacks. If you just make the picture brighter you blow out much of the detail, losing the extra sharpness HDR is able to offer.
Some manufacturers, though, are claiming an incredible figure of a 1,000,000:1 contrast ratio. It's difficult to pin down exactly what measurement techniques have been used to arrive at that figure, so rather than try to drill down into them one by one, maybe it's better to understand why 1,000,000:1 is such a magic number.
Keeping the numbers general, 10,000:1 is around the amount of contrast that can be perceived comfortably at any one time by the human visual system.
In film terms this is equivalent to 13.3 f-stops. Colour negative film captures around 14 f-stops, and film prints are projected at a contrast ratio of around 10 f-stops, or 1,000:1. The extra latitude in the negative therefore gives the creatives more contrast detail to work with prior to striking prints.
Meanwhile, away from screens and in day-to-day life, the eye adjusts constantly to differing lighting conditions and is actually capable of a total contrast ratio of some 1,000,000:1 (20 f-stops).
That's why 1,000,000:1 is the magic number and the film industry has known this stuff for decades.
Why am I referring to f-stops and film all the time? The TV industry is still playing catch-up with film when it comes to HDR. With no set of standards decided yet, the best reference available is the decades of work the film industry has done in deciding what HDR means.
Also, using f-stops as a reference makes it clear that these contrast ratio numbers sit on a logarithmic scale, which is the same way that both film and our vision respond to light.
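That relationship is easy to check. Here's a minimal Python sketch (my own illustration, not anything from a standards body) converting contrast ratios to f-stops - each stop is a doubling of light, so the conversion is simply log base 2:

```python
import math

def ratio_to_stops(contrast_ratio):
    """A contrast ratio maps to f-stops via log base 2: each stop doubles the light."""
    return math.log2(contrast_ratio)

# Checking the figures quoted above:
print(round(ratio_to_stops(10_000), 1))     # 13.3 stops - comfortable at one adaptation
print(round(ratio_to_stops(1_000_000), 1))  # 19.9, i.e. roughly 20 stops - the eye's full range
print(round(ratio_to_stops(100_000), 1))    # 16.6 stops - the straw-poll HDR target
```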
It should also be noted that manufacturers' claims of 1,000,000:1 contrast ratios may be based on some kind of arbitrary or even linear scale. Until they come clean and tell us their measurement methods, we should remain sceptical.
Unless they, by some miracle of engineering, have developed a paradigm shift in technology and nobody else in the business knows about it...
In reality the eye is happy with around 10,000:1 once it has adjusted to a given lighting situation. To simulate the way the eye adjusts in real life, we need enough dynamic contrast in our TV display to cope with both low-light night shots and bright daylight shots without adjusting the set.
We also want a TV that can deliver more contrast than colour negative film. Another straw-poll estimate suggests that a ratio close to 100,000:1 (16.6 f-stops) would give enough latitude to do so.
So all in all, let's say that a good HDR TV would have a wider colour gamut than today and be capable of delivering around 1,500-2,500 nits with a contrast ratio (independently measured) of something close to 100,000:1.
Right now, our favourite TVs aren't capable of such levels of brightness. Samsung's SUHD JS9500 can hit peak luminance of around 1,000 nits, while the top two OLEDs, from LG and Panasonic respectively, are batting around 450 nits.
HDR is what we want, not 4K
The full count of 4K Ultra HD pixels is not necessary in a normal viewing environment (as I've spoken about before). It's the quality of the pixels, not the quantity, that matters.
For years we have found our best content on film. The reason for this is the ability of film to collect as much colour and contrast as is needed to produce seriously beautiful imagery. Digital sensors of the past just couldn't match it.
Times have changed and so have our methods for displaying the content. However, the same wish applies: we want quality imagery and therefore quality pixels.
To achieve this we need to aim towards mimicking the colour and contrast capabilities of the human visual system. Better pixels give us deeper, more natural images. More contrast gives us a greater perception of sharpness and resolution, and is more attention-grabbing when used creatively.
And the advantages of more natural colour need no explanation.
For the moment, sticking to HDR alone, it is widely accepted among professionals that HDR is a far more exciting technology than 4K's numerical boosting of pixel quantity.
In consumer testing, most people prefer HD HDR images to 4K SDR. They're simply more pleasing to the eye. You can of course combine HDR and 4K - however, it's still the HDR part which appears to be the most arresting enhancement.
HDR is also better for the reality of delivery.
Although some are claiming 4K can be delivered at around 15Mbps across the network, most who care about quality imagery will suggest much higher bandwidth requirements - perhaps 25Mbps or more for 4K, especially for content like sports.
That's going to be expensive and (for some service providers) extremely difficult to implement.
More nits needs more bits
Then comes the opportunity we touched on earlier, something the streaming guys are really relishing - the move to 10-bit.
Even though our current TV content will also benefit from brighter, deeper pixels, the extra contrast ratio will highlight the fact that content is currently broadcast in 8-bit. Having only 256 luminance levels per pixel to work with leads to banding artefacts; by moving to 10-bit (1,024 levels per pixel), we eliminate banding as far as practically possible.
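As a rough illustration (a sketch of my own, not broadcast-chain code), quantising the same smooth luminance ramp at 8 and 10 bits shows how many distinct steps each depth offers - the coarser the steps, the more visible the banding on a bright, high-contrast display:

```python
def quantize(value, bits):
    """Snap a 0..1 luminance value to the nearest of 2**bits levels."""
    levels = 2 ** bits  # 256 at 8-bit, 1,024 at 10-bit
    return round(value * (levels - 1))

# A smooth horizontal gradient, sampled finely enough to hit every level
ramp = [i / 9999 for i in range(10000)]

print(len({quantize(v, 8) for v in ramp}))   # 256 distinct steps
print(len({quantize(v, 10) for v in ramp}))  # 1024 distinct steps across the same ramp
```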
Again, the digital film industry worked this out about 25 years ago.
The problem is that the broadcast industry is not ready to deliver 10-bit yet. The industry as a whole largely complies with a set of standards put in place by the Advanced Television Systems Committee (ATSC). The body is meeting this month to talk about how to deliver ATSC 3.0, which is needed for 10-bit broadcast.
This suggests there is some work to be done, infrastructure to be developed and so on - a time-consuming process.
It doesn't stop broadcasts of current content being enjoyed by people with HDR screens, but it's not taking full advantage of the opportunity HDR presents.
Way to go OTT
The streaming ('over the top' or OTT) guys, like Netflix and Amazon, have no such issue. They can deliver whatever they want as long as it can fit down the pipes.
Delivering a 10-bit HD HDR stream over IP needs arguably very little, if indeed any, extra bandwidth compared to HD SDR at 8-bit.
This is because 8-bit content is actually quite hard to encode: its coarser steps introduce banding and noise that the encoder has to spend bits on, whereas 10-bit gives the encoder fewer artefacts to deal with. So, in the end, 8-bit and 10-bit come out at about the same payload.
That's much more practical than trying to squeeze 4K Ultra HD SDR into the same pipe.
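A back-of-the-envelope comparison of raw, pre-compression data rates bears this out. The 1.5 samples-per-pixel figure below assumes 4:2:0 chroma subsampling, and real codecs will change the absolute numbers, but the ratio is the point: 10-bit adds only 25% per sample, while 4K quadruples the pixel count:

```python
def raw_bits_per_frame(width, height, bits_per_sample, samples_per_pixel=1.5):
    """Raw bits per frame; 1.5 samples/pixel approximates 4:2:0 chroma subsampling."""
    return width * height * samples_per_pixel * bits_per_sample

hd_hdr_10bit = raw_bits_per_frame(1920, 1080, 10)
uhd_sdr_8bit = raw_bits_per_frame(3840, 2160, 8)

print(uhd_sdr_8bit / hd_hdr_10bit)  # 3.2 - 4K SDR carries over 3x the raw data of 10-bit HD
```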
Add to that the fact that most content is already made in 10-bit (and has been for some time), and the streamers can also start to claim the high ground when it comes to quality delivery of almost all content, including remastered film. They could even stream live content without much extra effort.
No need for UHD
In conclusion, I believe we really may not need that much 4K to be sent to the 4K TVs in the home.
We can capture the content with as high-resolution a camera as possible, allowing for better quality subsampled images, but for normal viewing conditions that Ultra HD resolution is just not needed.
However, beautifully crisp and sharp contrast detail in the form of HDR should give us a real reason to buy a new TV set. Then, looking forward, by adding WCG and 10-bit to create HDR+, we have a very colourful, crisp and bright future ahead of us.
Hopefully the content streamers will start to move faster on this and wake the TV manufacturers up to the opportunity also.
Further reading: Ericsson white paper - Understanding Ultra High Definition Television
Bill Scanlon is an experienced film and broadcast professional with a background stretching back over 25 years in the industry. He is a strong advocate for 'Quality Pixels' throughout film and broadcast, and an industry advisor and regular panellist, drawing on his background in 3D, CGI/visual effects and, most importantly now, new technologies like Ultra HD and how they are enabling the merging of the film and broadcast disciplines.
He was also the executive producer on the first live Ultra HD international broadcast of a major sporting event.