Beyond silicon: the processor tech of 2035

The sci-fi tech that will revolutionise the PC

Plastic chips

Although most everyday plastics are electrical insulators, several plastics have been produced that are conducting or have semiconductor properties similar to silicon. As a result, complete electronic circuits can be made out of plastics and, because they can be printed from solution using equipment similar to a desktop inkjet printer, manufacturing costs are minuscule compared to fabricating silicon chips.

What's more, the circuits are flexible – they've already found applications in roll-up displays for use in e-readers. The difficulty with this approach is that plastic transistors aren't as reliable as silicon ones – they don't all turn on and off at the same voltage.

And unlike a stuck pixel in a display, which barely matters, a single non-functioning transistor in a processor is a show-stopper. Earlier this year, scientists at Belgium's Imec research centre announced the first plastic processor, using an extra gate to tame errant transistors.
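The problem and the fix can be pictured with a toy model: switching thresholds scatter from device to device, and a bias on a second gate pulls each transistor back towards its nominal switching point. The sketch below is purely illustrative – the voltages, the spread and the perfect-trim assumption are our own, not Imec's:

```python
import random

NOMINAL_VT = 1.0   # target threshold voltage (illustrative value)
V_DRIVE = 1.2      # logic-high gate voltage (illustrative value)

def conducts(v_gate, v_threshold):
    """A transistor turns on when the gate voltage exceeds its threshold."""
    return v_gate > v_threshold

def back_gate_trim(v_threshold):
    """Hypothetical second-gate bias that pulls a drifted threshold
    back to the nominal value (idealised perfect trim)."""
    return NOMINAL_VT

random.seed(42)
# Plastic transistors: thresholds scattered widely around the nominal value.
thresholds = [random.gauss(NOMINAL_VT, 0.3) for _ in range(1000)]

# Without correction, any device whose threshold drifted above the
# drive voltage never turns on -- fatal in a processor.
dead_before = sum(1 for vt in thresholds if not conducts(V_DRIVE, vt))

# With the extra gate trimming each device, all of them switch.
dead_after = sum(1 for vt in thresholds
                 if not conducts(V_DRIVE, back_gate_trim(vt)))

print(dead_before, dead_after)
```

With a spread of 0.3 V, a sizeable fraction of untrimmed devices miss the 1.2 V drive voltage entirely, which is why even one such transistor kills a chip; after trimming, none do.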

With rock-bottom prices, plastic processors will find new applications, as one of the developers explained: "Wrapped around food and pharmaceuticals, they might indicate that your tuna is rancid or that you forgot to take your pills".

Cloud computing

Provided it did the job, the majority of people wouldn't be particularly bothered about whether the processor in their PC was made from silicon or nanotubes, and whether it worked by simply executing a series of instructions in sequence or by combining esoteric chemicals. The ultimate in removing us, as users, from the nitty gritty of what goes on behind the scenes, though, is the concept of cloud computing.

Locally, all you have on your tablet or netbook is sufficient computing power to drive the display and transmit data to and from the cloud. You'll never know exactly what's in the cloud, but in addition to the storage that today's internet provides, it also offers enough computing resources to carry out the task at hand.

Cloud computing is in its infancy, but even so, the chances are that not all of the computing resources you currently use will involve x86 processors running Windows. Sure, they'll contain silicon chips, but they may well be UNIX or Linux servers.

If some of the technologies we've discussed here come to fruition, they too could go into the cloud melting pot. Thanks to memristors, artificial neural networks might come of age and analogue computers may be reborn.

Neither is a general-purpose model of computation, though, so they're not going to replace silicon. And unless you're serious about simulation or facial recognition, you're not going to want to buy a neural or analogue co-processor for your laptop. In the cloud, however, people and businesses could pay for these specialist resources only when they needed them.

The new analogue

Back in the early days of electronic computing, digital computers lived alongside their analogue counterparts. The idea was that while digital computers were good at some tasks, analogue computers were good at others.

Analogue computers were used mostly for simulation, and were programmed using patch leads to wire up elements like summers and integrators that worked on continually varying voltages to solve differential equations.
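That integrator-and-summer wiring maps neatly onto a stepped digital simulation. As a rough sketch – the mass-spring-damper equation and the coefficients are our own illustrative choice, not from any particular machine:

```python
# Mass-spring-damper: x'' + c*x' + k*x = 0, wired as a summing
# junction feeding two integrator blocks, then stepped in time.
k, c = 1.0, 0.2      # spring and damping coefficients (illustrative)
x, v = 1.0, 0.0      # initial displacement and velocity
dt = 0.001           # time step

for _ in range(10_000):        # simulate 10 seconds
    a = -c * v - k * x         # summer: combines the feedback signals
    v += a * dt                # first integrator: acceleration -> velocity
    x += v * dt                # second integrator: velocity -> displacement

print(round(x, 3))             # displacement after 10 s of damped oscillation
```

On a real analogue computer the two integrators are op-amp circuits working on continuously varying voltages, so the "loop" runs instantaneously rather than in discrete steps – which is exactly why these machines were so fast at simulation.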

Simulation could be done digitally, but without fast computers this was a very slow job. As digital computers became ever faster, analogue computers were eventually sidelined – until now.

As we explained above, memristors are an up-and-coming technology for digital computers, but they may just give a new lease of life to analogue computers too. Digital computers might have come on in leaps and bounds since the days of their analogue counterparts, but solving some of the planet's toughest simulation exercises still requires vast computing resources like the Japanese K computer, which uses 68,544 eight-core processors and cost $1.25bn.

According to the authors of a recent academic paper on memristors, a new generation of analogue computers, working alongside digital supercomputers, could be exactly what's needed for large-scale simulations like those used in climate change research.