Processor

The phrase 'silicon chip' is associated so closely with computing that most people assume it's always been that way and always will be. Yet in the early days of computing, several technologies came and went before microprocessors established themselves as the dominant force.

First we had valve-based computers, then computers made by wiring together individual transistors. This approach then gave way to the use of small integrated circuits before the microprocessor came along. Now some experts are suggesting that the silicon chip's 40 years in the limelight might be drawing to a close, as several competing technologies are waiting in the wings.

We'll take a look at some of these alternatives, but first we need to understand why the silicon-based microprocessor might just be running out of steam.

The microprocessor's history is one of phenomenal innovation. After all, in those 40 years chips have increased in speed a couple of million times – a feat that has involved, among other things, shrinking the minimum feature size from 10 microns (10,000nm) to 32nm and increasing the transistor count from 2,300 to around one billion.
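To put those numbers in perspective, here's a back-of-envelope sketch in Python. Every figure in it is quoted above; only the arithmetic is ours.

```python
# Back-of-envelope scaling, using only the figures quoted above.
feature_1971_nm = 10_000          # 10 microns in the early days
feature_now_nm = 32               # today's 32nm process

linear_shrink = feature_1971_nm / feature_now_nm   # ~312x narrower
area_shrink = linear_shrink ** 2                   # ~98,000x less area

transistors_1971 = 2_300          # the first microprocessor
transistors_now = 1_000_000_000   # around one billion today
count_growth = transistors_now / transistors_1971  # ~435,000x

print(f"Features are {linear_shrink:,.0f}x narrower")
print(f"Each feature takes {area_shrink:,.0f}x less area")
print(f"Transistor counts are up {count_growth:,.0f}x")
```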

Other trends haven't continued though – most notably that of increasing clock speed, which has been stuck at slightly over 3GHz for almost a decade. This hints at the possibility that other obstacles may prove equally insurmountable.

Take feature size as an example. Certainly the imperative to miniaturise chips to clock them ever faster no longer applies, but unless features continue to shrink, the race to provide ever larger caches, more cores and innovative new architectural features will result in some seriously large chips. Because semiconductors are manufactured using an optical process to etch the circuit onto the silicon, the progressive reduction in feature size has involved using light of an ever-shorter wavelength.

Today, deep ultraviolet is used instead of visible light, and the technology has been stretched by employing clever techniques that allow the creation of features much smaller than the wavelength of the light. Even so, we have to accept that eventually the fundamental laws of physics will draw the era of fabrication using photolithography to a close, and even with the planned shift to extreme ultraviolet, it's been suggested that 11nm could be about the limit.
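To get a feel for how sub-wavelength printing is possible at all, the resolution of a projection system is commonly estimated with the Rayleigh criterion, CD = k1 × wavelength / NA. The sketch below uses typical published figures for 193nm immersion lithography rather than numbers from this article:

```python
# Rayleigh criterion: smallest printable feature CD = k1 * wavelength / NA.
# These are typical published figures for 193nm immersion lithography,
# quoted as an illustration rather than taken from the text.
wavelength_nm = 193   # deep-ultraviolet ArF laser
na = 1.35             # numerical aperture, boosted by water immersion
k1 = 0.27             # process factor; ~0.25 is the practical floor

cd_nm = k1 * wavelength_nm / na
print(f"~{cd_nm:.0f}nm features from {wavelength_nm}nm light")  # ~39nm
```

With the process factor already near its theoretical floor, it's clear why a shorter wavelength is the only lever left to pull.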

After that, we're well into uncharted territory. There's been talk of using electron beams or X-rays, and IBM has carried out research into chips that assemble themselves using molecular 'scaffolding', but so far all these alternatives are nothing more than research projects.

Hot topic

Then we have power consumption – the very thing that prevented chips from being clocked much faster than 3GHz. The more electrical power a processor consumes, the more heat it generates, and unless that heat can be removed, it will fry.

Admittedly, advanced techniques for moderating power consumption have been introduced in recent years and, as a result, top-end chips have dropped to 130W from around 150W a few years ago, despite massive performance gains. Even so, unless manufacturers have further power-saving tricks up their sleeves, it's clear that there's going to be trouble ahead.

Needless to say, if everything else stays the same, doubling the number of cores doubles the power consumption. However, if the feature size shrinks from 32nm to 22nm in the process, the area of silicon stays about the same – each transistor occupies (22/32)², or roughly half, the area it did before, so twice as many fit in much the same footprint.

Fortunately, with the move to 22nm, Intel will be introducing its lower-power 3D transistors. This will cancel out that doubling of power from the same area, but it's a trick that can be done only once. When we bear in mind that the power density is already approaching 100W/cm² – a figure significantly greater than that of an electric hotplate – it's clear that the future is going to be a hot one.
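The arithmetic behind those claims is easy to check. In the Python sketch below, the 130W figure comes from earlier in the article, while the 1.3cm² die area is our own illustrative assumption, chosen simply to show where a figure of around 100W/cm² can come from:

```python
# The scaling argument in rough numbers. The die area is an
# illustrative assumption, not a figure quoted in the article.
area_scale = (22 / 32) ** 2        # ~0.47x area per transistor at 22nm
cores_factor = 2                   # doubling the number of cores
die_area_change = cores_factor * area_scale   # ~0.95x: about the same

chip_power_w = 130                 # top-end TDP quoted earlier
die_area_cm2 = 1.3                 # assumed die size (illustration only)
power_density = chip_power_w / die_area_cm2   # ~100 W/cm2

print(f"Die area with twice the cores: {die_area_change:.2f}x")
print(f"Power density: ~{power_density:.0f} W/cm2")
```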

There's a host of esoteric cooling methods, some of which are well known in the overclocking community, but as the cost of preventing processors burning escalates, we have to question whether there's a better way. All we've seen so far could be described as technological hurdles, but there are more fundamental limits to what the laws of physics will allow, and these are even more worrying.

The fact is – and you'll have to take our word on this – as dimensions get smaller, electrons start to behave in ways that can only be described as downright weird because quantum effects come into play. Even the renowned physicist Niels Bohr admitted that quantum behaviour was peculiar when he famously declared, "If quantum mechanics hasn't profoundly shocked you, you haven't understood it yet".

For example, electrons can be in two places or in two states at the same time. This can actually be used to good effect, as we'll see when we look at quantum computers, but on the reverse side of the coin, a major drawback for electronic circuits is that electrons can miraculously appear on the opposite side of a thin barrier even though there's no hole in it. If that thin barrier happens to be a layer of silicon dioxide, which is used as an insulator in chips – well, let's just say we've got problems.
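To get a feel for why thin insulating layers are such a worry, the textbook estimate for tunnelling through a rectangular barrier is T ≈ e^(−2κd), where d is the barrier thickness. The sketch below assumes a barrier height of about 3.2eV – a commonly cited figure for the silicon/silicon-dioxide interface, not one from this article – and the point is less the absolute numbers than how savagely the leakage grows as the oxide thins:

```python
import math

# Rectangular-barrier tunnelling estimate, T ~ exp(-2*kappa*d).
# The 3.2eV barrier height is an assumption (a commonly cited value
# for the silicon/SiO2 interface), used here purely for illustration.
HBAR = 1.055e-34                 # reduced Planck constant, J*s
M_E = 9.11e-31                   # electron mass, kg
BARRIER_J = 3.2 * 1.602e-19      # assumed barrier height, joules

kappa = math.sqrt(2 * M_E * BARRIER_J) / HBAR   # ~9.2e9 per metre

for d_nm in (3.0, 2.0, 1.0):
    t = math.exp(-2 * kappa * d_nm * 1e-9)
    print(f"{d_nm:.0f}nm oxide: tunnelling probability ~ {t:.0e}")
```

On these assumptions, every nanometre shaved off the oxide multiplies the leakage by a factor of tens of millions – exactly the sort of behaviour that turns an insulator into a sieve.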

Rising prices

Ironically, for an industry that turns over almost $50 billion, it's financial rather than technological or scientific hurdles that economists believe might herald the end of the line for silicon chips. Intel is reportedly spending $6-8 billion on new facilities to manufacture chips based on the forthcoming 22nm process.

Yet this is the cost of an upgrade that could be described as evolutionary rather than revolutionary. We'll let you draw your own conclusions as to how much manufacturing costs would escalate if photolithography as we know it had to give way to a totally new method of producing silicon microprocessors.

We might have painted a rather bleak picture here, but there is a glimmer of hope – or, to be accurate, several specks of light – on the horizon.

In research establishments around the globe, scientists are intent not only on giving silicon technology a new lease of life, but on offering a totally new technology for when today's silicon is eventually pensioned off.

Some initiatives stick with electronics but provide a more efficient material than silicon from which to build circuits. Others abandon the flow of electrons entirely, looking instead to the use of photons or chemicals to carry the signals required to perform arithmetic or logical operations. Others still – and here we could mention quantum and hypercomputers – adopt an entirely alien model of computation.

As we turn our attention to technologies that might knock the silicon chip off its pedestal, don't be too quick to dismiss them on the basis of performance. Some of the alternatives have shown speed beyond our wildest expectations while others are pedestrian, but many of these technologies are still in their infancy and all we've seen are their first steps.

Indeed, some experts might well have dismissed the Intel 4004 – the world's first microprocessor, introduced in 1971 – for much the same reasons. It averaged just 60,000 four-bit instructions per second and could address 4kB of memory, but IBM's contemporary 360/195 mainframe could rattle through 10-15 million 32-bit instructions in the same time and access 4MB of memory. From humble beginnings…