Putting human vision models into computer video display

[Image: eeColor enhancements on the left; original image on the right]

Image processing technology has achieved remarkable breakthroughs: more vivid colors, richer detail and higher definition images, adding up to better resolution and a broader range of available colors at a lower cost per pixel. But despite these stunning advances in visual display, it has been impossible to accurately reproduce what the human eye would see when viewing the same scene directly.

No matter how advanced the technology, there has always been a difference between seeing something on a screen and seeing it in real life. The human eye has the advantage in perception because it compensates on the fly for differences in lighting conditions, whether the viewer is stationary or in motion.

There's no doubt that the future of television and video display rests in higher definition. Most recently, 4K TV, also known as Ultra HD, offers dramatic improvements with four times the pixels of a standard 1080p full HD television (3840 x 2160 versus 1920 x 1080, doubling the resolution in each dimension).

What's next, though, isn't just adding more pixels to the display and supporting larger color gamuts. The most dramatic improvement comes from an entirely different approach, one that begins with a study of how the human eye organically perceives and processes color.

The human eye isn't just RGB

The original color standards defined a limited range of colors by mixing different intensities of red, green and blue (RGB) light emitted from rare earth phosphors grouped into sets of three. This system has persisted over time, but it cannot reproduce every visible color: matching some pure spectral colors would require a negative amount of one of the primaries, and a physical display cannot emit negative light.
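A quick way to see this limit is to convert a pure spectral color into sRGB. The sketch below, in Python, uses the published CIE 1931 color-matching values for 500 nm light and the standard XYZ-to-linear-sRGB matrix; the negative red component that falls out is exactly the "negative amount of a color" that no physical display can produce.

```python
# A pure spectral color at 500 nm falls outside the sRGB gamut.
# Both the CIE 1931 color-matching values and the XYZ -> linear sRGB
# matrix below are standard published figures.

# CIE 1931 2-degree observer color-matching values at 500 nm
X, Y, Z = 0.0049, 0.3230, 0.2720

# Standard XYZ -> linear sRGB conversion (IEC 61966-2-1)
r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z

print(f"linear sRGB = ({r:.3f}, {g:.3f}, {b:.3f})")
# linear sRGB = (-0.616, 0.612, 0.222)
# The negative red value means no mixture of the sRGB primaries can
# match this color; a display simply clips it to the gamut edge.
```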

Nonetheless, it worked well and has been extended a number of times. The most common standard continues to be sRGB, although some newer color emitters in display devices are capable of creating more colors than the standard defines.

It's also important to note that the move from analog to digital displays came at a cost. In the real world, human eyes are not digital (unless you are a character from Star Trek). The natural color spectrum is analog, and every color in the frequency range of visible light is possible.

Digital displays impose an artificial limitation on the color gamut because they must rely on discrete digital values. They also treat the entire screen as a single unit, applying only crude brightness adjustments across the board, which leads to a perception of some colors as simply "wrong" in certain lighting environments.
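To illustrate that discreteness: a standard 8-bit channel offers only 256 levels per primary, so distinct real-world intensities collapse onto the same code value. The toy function below is illustrative only, not taken from any particular display pipeline.

```python
# An 8-bit channel can represent only 256 discrete levels per primary,
# forcing a smooth analog gradient onto a coarse staircase.

def quantize_8bit(intensity: float) -> int:
    """Map a continuous intensity in [0, 1] to one of 256 code values."""
    clamped = max(0.0, min(1.0, intensity))
    return round(clamped * 255)

# Two physically different intensities become indistinguishable:
print(quantize_8bit(0.501))  # 128
print(quantize_8bit(0.503))  # 128
```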

The human eye adjusts how it sees colors based on brightness and on the color of the viewing light. Technological displays, unlike the human eye, do not differentiate between regions that should be adjusted (such as shadows) and those that should not.
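To make that contrast concrete, here is a hypothetical sketch of the two behaviors: the single global gain a typical display control applies, versus a shadow-aware lift that brightens only the dark regions. The gain and cutoff values are arbitrary, chosen purely for illustration.

```python
import numpy as np

def global_gain(image: np.ndarray, gain: float = 1.3) -> np.ndarray:
    """What a typical display control does: one gain for every pixel."""
    return np.clip(image * gain, 0.0, 1.0)

def shadow_aware_lift(image: np.ndarray, gain: float = 1.3,
                      shadow_cutoff: float = 0.25) -> np.ndarray:
    """Brighten dark regions while leaving midtones and highlights alone.
    The extra gain fades to zero as pixel values approach the cutoff."""
    weight = np.clip(1.0 - image / shadow_cutoff, 0.0, 1.0)
    local_gain = 1.0 + (gain - 1.0) * weight
    return np.clip(image * local_gain, 0.0, 1.0)

scene = np.array([0.05, 0.20, 0.50, 0.90])  # shadow .. highlight luminances
print(global_gain(scene))        # [0.065 0.26  0.65  1.   ] - highlight clips
print(shadow_aware_lift(scene))  # [0.062 0.212 0.5   0.9  ] - highlights intact
```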

Also, digital standards do not take ambient light into account, so a display viewed in bright light will look less colorful than it would in a dimly lit theater. The human eye does something that technology has until now been unable to do: it adjusts its perception of colors based on the level of ambient light.
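One way such compensation could work in software is to raise saturation as the surroundings brighten, since bright ambient light washes out perceived color. The sketch below is purely illustrative: the lux range and the maximum boost are invented values, not measurements from any real device.

```python
import colorsys

def compensation_factor(ambient_lux: float) -> float:
    """Map ambient illuminance to a saturation boost between 1.0 and 1.5.
    The endpoints (dim room, bright sun) are illustrative assumptions."""
    DIM_ROOM, BRIGHT_SUN = 50.0, 10_000.0
    t = (ambient_lux - DIM_ROOM) / (BRIGHT_SUN - DIM_ROOM)
    return 1.0 + 0.5 * max(0.0, min(1.0, t))

def compensate_pixel(r, g, b, ambient_lux):
    """Scale a pixel's saturation in HSV space by the ambient factor."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(1.0, s * compensation_factor(ambient_lux))
    return colorsys.hsv_to_rgb(h, s, v)

print(compensate_pixel(0.8, 0.3, 0.3, ambient_lux=50))    # dim room: unchanged
print(compensate_pixel(0.8, 0.3, 0.3, ambient_lux=8000))  # bright: more saturated
```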

Putting human vision technology onto the digital screen

Applying the physical models of human vision to the computer or television display comes closer to natural vision than any other image technology on the market. This new era of real-time color processing, first developed by Entertainment Experience for its eeColor software application in partnership with Rochester Institute of Technology, is now a reality. The new model displays a vibrancy that has never before been possible, even in Ultra HD.

The technology applies real-time light sensors to automatically restore any quality that might be lost to subpar lighting or bright sunlight, making it the first display technology capable of delivering equally vibrant images in any lighting environment.
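The article doesn't spell out the processing pipeline, but a common engine for this kind of real-time, per-pixel color remapping is a three-dimensional lookup table (LUT): ambient readings select or blend between precomputed correction cubes, and every pixel is corrected by trilinear interpolation between the eight nearest table entries. The sketch below shows the generic technique only; treating it as eeColor's actual implementation would be an assumption.

```python
import numpy as np

def apply_3d_lut(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """rgb: shape (..., 3) with values in [0, 1].
    lut: shape (n, n, n, 3) cube mapping input color to corrected color."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)   # continuous LUT coordinates
    lo = np.minimum(pos.astype(int), n - 2)  # lower lattice corner
    f = pos - lo                             # fractional offsets, (..., 3)

    out = np.zeros(rgb.shape, dtype=float)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                # Trilinear weight for this corner of the surrounding cell
                wr = f[..., 0] if dr else 1.0 - f[..., 0]
                wg = f[..., 1] if dg else 1.0 - f[..., 1]
                wb = f[..., 2] if db else 1.0 - f[..., 2]
                w = np.asarray(wr * wg * wb)[..., np.newaxis]
                corner = lut[lo[..., 0] + dr, lo[..., 1] + dg, lo[..., 2] + db]
                out += w * corner
    return out

# Sanity check with an identity LUT: output should equal input.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
pixels = np.array([[0.30, 0.55, 0.80]])
print(apply_3d_lut(pixels, identity))  # ~ [[0.30 0.55 0.80]]
```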