First, there was widescreen. As we migrated to LCDs, between a fifth and a third more screen space was tacked on to the sides of our panels.
Even long-standing acronyms for pixel resolutions such as XGA were stretched out with a 'W' prefix to denote their extra girth.
Now an even wider format looms. The longer, lower 16:9 ratio of width to height, common on HDTV sets, is already the screen ratio of choice for netbooks and has featured in Sony's laptop line-ups for years.
On one hand, wide-scale adoption of the standard could be enormously beneficial. As it stands, there are far too many screen resolutions and abbreviations. It's reckoned that fewer than one in five consumers can correctly choose between the 720p and 1080p standards so intensively marketed on the high street.
So, in the more obfuscatory world of computer monitors, what chance do they have? By encouraging a shift towards one aspect ratio, you're simplifying the options. From a technical point of view, content producers could then concentrate on one format without fear of black bars or scaling issues.
Ride this train of thought to its conclusion and you find economies of scale for manufacturing too, since similarly sized panels can be shared by monitor and TV product lines.
This, of course, raises the question of why the two standards have arisen in the first place. The most commonly quoted reason why computer monitors adopted 16:10 as the norm is that it's almost perfect for viewing two full A4 pages of text side by side.
There's a lot to be said for that – one of the key features in Windows 7 is quick resizing of documents to fill exactly half the screen, which Microsoft thinks is one of the most common tasks performed on a PC.
There are other more esoteric – and probably spurious – arguments, as well. For instance, 16:10 is close to the mathematical 'golden ratio', whereby the sum of two quantities divided by the larger equals the larger divided by the smaller. The ratio crops up regularly in art and is considered aesthetically pleasing because it matches the shape of the human field of vision.
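For the curious, the claim is easy to check with a few lines of arithmetic. The sketch below (an illustration, not part of the original argument) solves the golden-ratio definition and compares both screen formats against it:

```python
import math

# The golden ratio phi satisfies (a + b) / a = a / b for quantities a > b > 0,
# which rearranges to the quadratic phi^2 = phi + 1, giving:
phi = (1 + math.sqrt(5)) / 2  # ~1.618

for name, w, h in [("16:10", 16, 10), ("16:9", 16, 9)]:
    ratio = w / h
    print(f"{name}: {ratio:.3f}, off the golden ratio by {abs(ratio - phi):.3f}")
```

A 16:10 screen (1.600) misses the golden ratio by under 0.02, while 16:9 (about 1.778) is nearly ten times further off, which is presumably why the argument is only ever made for 16:10.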
The trade-off
As for the 16:9 standard, its origins can be traced back to 1980 and a film engineer called Kerns H Powers, who worked for the standardisation board SMPTE. The year before, Woody Allen had insisted that the TV version of his film Manhattan shouldn't be cropped to fill the smaller screen format. Trying to shoe-horn movie prints on to TV screens proved a headache.
After looking into all the options, Powers adopted 16:9 as the screen ratio that best accommodated all the different frame sizes then being shot for cinema.
Modern 16:9 monitors also handle games well. The extra width helps to fill the peripheral vision, aiding immersion in a game world. On the other hand, they're not so good at displaying documents. A 24in 16:9 panel stands as tall as a 22in 16:10 one, so it's less adept at making side-by-side comparisons of files in portrait format.
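That height comparison follows from Pythagoras: for a given diagonal, a panel's height is the diagonal times the height term of the aspect ratio over the length of the ratio's hypotenuse. A quick back-of-the-envelope check (illustrative only, not from the original article):

```python
import math

def panel_height(diagonal_in, w, h):
    """Height in inches of a w:h panel with the given diagonal, via Pythagoras."""
    return diagonal_in * h / math.hypot(w, h)

h_169 = panel_height(24, 16, 9)    # a 24in 16:9 panel: ~11.8in tall
h_1610 = panel_height(22, 16, 10)  # a 22in 16:10 panel: ~11.7in tall
print(f"24in 16:9 = {h_169:.2f}in, 22in 16:10 = {h_1610:.2f}in")
```

The two heights come out within about a tenth of an inch of each other, so the extra two inches of diagonal on the 16:9 panel buy width, not height.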
Even on a 30in 16:10 monitor, users naturally place their windows in the centre of the screen, so extending the sides is simply increasing the amount of mostly unused desktop space. Has the decision to standardise already been made, though?
Over the last six months nearly every major manufacturer has released a 16:9 laptop with Blu-ray viewing in mind. Dell, LG and Iiyama have refreshed their 16:9 desktop screens amid enough fanfare to suggest something is up.
In the end, it's likely economics will win – if it becomes more cost effective to standardise all TVs and monitors, that's how things will go.