There's a delightful story that does the rounds regarding one of the founding fathers of Linux.
It's said that during the early days of the open-source operating system's development, this fellow took to attending conferences in complete silence.
He refused all attempts to communicate by any means other than hand gestures. Instead, he pointed at things.
Apocryphal or not, the tale remains highly relevant today. Our hero's beef was with the windows-based graphical interface metaphor and its knack for turning us into mouse-pointing morons.
Fast-forward a decade or two and astonishingly little has changed. The windows GUI has, you might say, proven to be extremely gluey.
The classic case study is Microsoft's own Windows. Admittedly, early versions of Windows would seem pretty alien to today's users – but that's an illusion. Look past the clunky graphics and Windows 95 is largely identical to Windows 7, Redmond's latest and greatest OS.
Icons, taskbar, the folder metaphor – all are essentially the same as they were 15 years ago. That's a long time in any industry, but it's an absolute eternity in information technology.
Along the way, Microsoft has flirted with a few interesting new features. Early betas of Vista included widespread use of virtual folders and the promise of a fully vectorised and hence scalable graphical interface, for instance. But in the end, the retail build of Vista was yet another reskin of Windows NT, just a bit prettier.
Linux and Apple's Macintosh operating systems have scarcely been any more innovative. More user-friendly and configurable? Perhaps. More polished? Certainly. But both remain firmly rooted in the window-juggling keyboard-and-mouse camp. Compared to the enormous advances made in computer hardware, it's all a bit bizarre.
Back in 1995, a single-core Pentium processor running at 100MHz or so was your lot. That's an in-order, 3.1-million-transistor chip with 16kB of cache memory (split evenly between instructions and data), for goodness' sake. Today, we're up to six cores, multiple GHz, over a billion transistors and cache pools nigh on double digits in MB. If you think that's merely a matter of scale rather than a new paradigm per se, what about features such as virtualisation or hardware-accelerated 3D graphics?
That's to say nothing of the rapid rise of LCD monitors and more recently solid-state drives. By any sane metric, computer hardware has been in a constant state of revolution. It's utterly relentless. So, not to put too fine a point on it, what gives with GUIs?
The answer, frankly, is that I don't know. Over the years, I've visited several labs dedicated to advanced interface research, including those of Microsoft and Intel. I've even interviewed luminaries from the heyday of interface research, including some who worked at the fabled Xerox PARC lab in Palo Alto. The very people who invented the GUI, in other words.
In fact, I reckon I've spoken to all the right people. I've played with all the latest table-top, touchscreen human-machine interfaces. But I remain essentially clueless. Nothing I've seen or heard of is obviously the next big thing.
At this point, Apple's iPad inevitably heaves into view. A remarkable device in many ways, it's no good for data input or content creation and therefore doesn't offer a plausible alternative for desktop computing. What it does do, however, is underline just how painful the Windows interface is.
Once you've danced around a few of your favourite websites courtesy of the iPad's delightfully responsive screen, the scrolly-scrolly, pointy-clicky PC experience seems pretty laughable.
Even a good smartphone can make the PC feel clumsy; I often prefer reading emails on mine. Replying to them is out of the question, but as a viewing device it's very pleasant and provides temporary relief from what is becoming an overly familiar and oppressive desktop computing experience.
You could say the differences are largely arbitrary, but trawling emails on my phone feels like a break from work. That's got to say something about the tiredness of the windows metaphor.
Microsoft's mooted Courier device looked interesting, too. It might just have combined some of the better aspects of mobile touch interfaces with something extra by way of casual content creation – at least until it was cancelled. If you haven't seen it in action already, I reckon the demo videos are still worth a Google.
I'm still not convinced touchscreens in any form are suitable for heavy-duty desktop computing. But then I don't claim to have the answers. All I know is that something new is long overdue. I'm tired of pointing at things.