10 ways PCs will change over the next 25 years

6. Turn anything into a display


The screens in our current devices tend to define their overall size. This is true of everything from laptops to tablets to mobile phones to all-in-one desktop PCs. For stationary machines this isn't much of an issue, but for mobile computers the built-in screen tends to constrain both portability and usefulness.

Devices like the Sony Tablet S2 attempt to resolve this problem with a folding split screen, while other, smaller devices are designed to be used either on their own or in conjunction with a larger screen for the best effect.

The latest generation of tablets, for instance, boasts mini-HDMI connectors that let each device drive a large television or projector. In the future, advanced forms of projection should enable far more flexible usage patterns.

We've already seen standalone projectors as small as a pack of cards, while Sony has recently released the Handycam HDR-PJ10, a camcorder that boasts an integrated LED projector. Future units will feature brighter lamps with shorter throws, reduced battery consumption and more useful projection angles.

Digital whiteboard technology will come as standard in the future, so you'll be able to interact with anything you display. Thanks to the reduction in price of high-res panels, certain surfaces will act like TFT screens; plug your mini computer into the wallpaper and you've got a 30ft display.

7. Interact with PCs naturally

Science fiction has a tendency to define our technological aspirations, and has even had an effect on what is researched and implemented. Flying cars, holographic chess and sentient artificial intelligence have all been addressed by literature and movies, and are being actively researched too - however impractical they might be.

But one area where PCs are still struggling to catch up with sci-fi - and aging sci-fi, at that - is the way we interact with our computers. Natural speech input is possible today, but it's not commonplace and it's far from flawless.

The idea of having to train the software to recognise your voice by reading out a list of words isn't appealing to many people either. The accuracy of Google's voice search on mobile phones shows that untrained speech recognition is possible, although the audio is processed in server farms, showing that a lot of grunt is needed to do things properly.
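
To give a flavour of how this works today, here's a minimal sketch using the third-party SpeechRecognition library for Python - the audio is captured locally, but the recognition itself happens on Google's servers, with no per-user training:

```python
# A minimal sketch of untrained, server-side speech recognition using the
# third-party SpeechRecognition library (pip install SpeechRecognition;
# microphone capture also needs PyAudio).
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # calibrate for background hiss
    print("Say something...")
    audio = recognizer.listen(source)

try:
    # Ships the recorded audio off to Google's web speech API - the heavy
    # lifting happens in the server farm, not on your machine.
    print("You said:", recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Sorry, that was unintelligible")
except sr.RequestError as err:
    print("Couldn't reach the recognition service:", err)
```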

Given another 25 years of processor development, this shouldn't be an issue. Unfortunately there's not much that can be done about the awkward embarrassment that comes with talking to a computer.

One key ingredient of human communication is body language, and it's possible that the inevitable successors to Microsoft's Kinect could provide the missing element, letting speech recognition draw cues about the subtler meanings of our words. Combine that with gesture input, and the days of simply telling your computer what to do shouldn't remain a dream forever.
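
As a rough illustration of that idea, the sketch below uses a detected gesture to disambiguate a spoken command. The gesture labels and command rules are invented for illustration, standing in for a real speech engine and a Kinect-style body tracker:

```python
# A hypothetical sketch of multimodal disambiguation: body language acts
# as a cue that changes how an ambiguous spoken command is interpreted.
# The gesture labels and rules are invented, not a real API.

def interpret(utterance: str, gesture: str) -> str:
    """Resolve an ambiguous spoken command using body language as context."""
    if utterance == "put that there":
        if gesture == "pointing":
            return "move the selected object to the indicated spot"
        return "ask the user what 'there' refers to"  # no referent to resolve
    if utterance == "stop":
        # An urgent gesture and a casual one imply very different intents.
        return "halt immediately" if gesture == "raised_palm" else "pause playback"
    return "fall back to literal speech interpretation"

print(interpret("put that there", "pointing"))  # move the selected object...
print(interpret("stop", "none"))                # pause playback
```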

8. Better than Pixar graphics

[Image: The Witcher 2]

If we had to pick the area where the physical makeup of the PC has changed the most in the last 25 years, we'd be hard pushed to beat the advances in 3D acceleration. The first consumer 3D graphics card of note, the 3Dfx Voodoo, appeared 15 years ago and changed the face of gaming, and there have been hints that 3D acceleration could change the interfaces of tomorrow as well.

Early efforts like VRML may have struggled, but with GPUs now making it into CPUs, the market for 3D interfaces has grown and is ripe for exploiting. Budget graphics cards will struggle for relevance in the next few years as the graphics capabilities of the APU/CPU increase, but at the higher end of the market we can expect graphics cards to keep pushing the envelope of realistic rendering.

Cinema-level rendering techniques like true sub-surface scattering, deep shadow maps and ambient occlusion require significant processing power even on simple models; as polygon counts hit the billions and screens move to UHDTV resolution, far more advanced thousand-core GPUs will be needed.
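
A back-of-the-envelope sum shows why. The per-pixel cost below is an illustrative guess at what cinema-level effects demand, not a measured figure:

```python
# Rough shading budget for cinema-quality rendering at UHDTV resolution.
# The ops-per-pixel figure is an illustrative assumption about the cost
# of effects like sub-surface scattering and ambient occlusion.
width, height = 7680, 4320        # UHDTV resolution
fps = 60                          # target frame rate
ops_per_pixel = 100_000           # assumed shader ops for cinema-level effects

ops_per_second = width * height * fps * ops_per_pixel
print(f"{ops_per_second / 1e12:.0f} TFLOPS of shading throughput")
# ~199 TFLOPS - roughly two orders of magnitude beyond a 2011 flagship
# GPU, before billions of polygons are even transformed.
```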

With no obvious let-up in the ongoing and productive battle between AMD and Nvidia, there's no reason to doubt that in 25 years' time the lines between cinema, 3D gaming, and even desktop interfaces will be blurred to the point of non-existence. Whatever else happens, the future is certainly going to be beautiful.

9. Game on the go

[Image: OnLive]

The concept of the PC as a gaming platform has survived assaults from many angles over the last 25 years. Gaming PCs have responded well to these attacks, with improved graphics, higher resolutions and ever-advancing processors making them arguably the best gaming platform currently available - particularly compared to the static world of consoles.

However, it seems unlikely that the gaming desktop will survive another quarter century unscathed. One of the biggest threats is forming right now, in the shape of game streaming services like OnLive and Gaikai.

By 2036 the teething problems of such technologies should have been resolved, letting you play high-end games on pretty much any hardware - PC, phone, console and so on. That means you could start a game during your lunch break at work, carry on playing on the train home, and finish it in front of your television.

And with ubiquitous access, everyone should be playing against everyone else anyway. There is one real problem with these services, though: the inherent latency of transferring user actions to the servers.

This will mean that so-called 'twitch gamers' who enjoy first-person shooters will need advanced networking technology; a separate internet channel for super-low-latency transfers will be absolutely essential.
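
To see why latency dominates the argument, consider a rough budget for a single streamed frame, from button press to pixels on screen. Every figure here is an illustrative assumption; real numbers vary with codec, network and display:

```python
# A rough latency budget for one streamed frame. All figures are
# illustrative assumptions, not measurements.
budget_ms = {
    "input capture and upload": 5,
    "network round trip": 30,         # player <-> data centre
    "game simulation and render": 16, # one frame at 60fps
    "video encode": 10,
    "video decode": 5,
    "display refresh": 8,
}

total = sum(budget_ms.values())
print(f"Total: {total} ms")  # 74 ms

# Twitch gamers tend to notice anything much beyond ~100ms in total, so
# the network hop leaves little headroom - hence the case for a
# dedicated low-latency channel.
```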

10. Be recognised everywhere

We're used to being targeted online by advertisers based on our previous activity, tracked through cookies, but extending this to the real world isn't such an outlandish idea.

The idea was best visualised by the 2002 film Minority Report, but this isn't science fiction. Tests carried out five years ago in Tokyo used RFID tags to create a user-centric advertising environment: offers were pushed to users' mobile phones as they walked near specific shops, and could be made more attractive if the recipients didn't show interest.
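
A toy sketch of that escalating-offer logic might look like the following; the shopper IDs and discount steps are invented for illustration:

```python
# A toy model of Tokyo-style proximity advertising: an RFID tag (or,
# later, a recognised face) identifies a passer-by, and the offer
# sweetens each time they walk past without responding. All IDs and
# discount steps are invented.
from collections import defaultdict

ignored = defaultdict(int)      # times each shopper ignored each shop's offer
DISCOUNTS = [5, 10, 15, 20]     # percent off, escalating with indifference

def offer_for(shopper_id: str, shop: str) -> str:
    step = min(ignored[(shopper_id, shop)], len(DISCOUNTS) - 1)
    return f"{shop}: {DISCOUNTS[step]}% off today, just for you!"

def walked_past(shopper_id: str, shop: str, responded: bool) -> None:
    print(offer_for(shopper_id, shop))
    if not responded:
        ignored[(shopper_id, shop)] += 1  # sweeten the next offer

for _ in range(3):
    walked_past("tag-0451", "Coffee Shop", responded=False)
# Coffee Shop: 5% off... then 10%... then 15%
```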

Replace RFID tags with rudimentary facial recognition and the system is no longer an opt-in experience. Advertisements targeted at you based on what you've been doing could soon get tiresome, but at least you won't be bombarded by adverts of no interest or relevance to you. Enjoyed that rollerball match? Why not go to another, or download footage of the game you attended?

A key concept behind all this is what's often termed the 'internet of things', which describes the connections between various electronic devices and the data they have access to. It's a concept many futurologists are convinced will be necessary to tie all our devices together into a coherent technological future.
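
In practice, most internet-of-things plumbing boils down to devices publishing and subscribing to shared data streams. A minimal sketch using the paho-mqtt Python client follows; the broker address and topic names are placeholders, not a real deployment:

```python
# A minimal 'internet of things' sketch: devices publish readings to a
# shared broker, and anything interested subscribes. Uses the paho-mqtt
# client (1.x-style API); broker and topics are placeholders.
import paho.mqtt.client as mqtt

def on_message(client, userdata, message):
    # Any device on the network can react to any other device's data.
    print(f"{message.topic}: {message.payload.decode()}")

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.net", 1883)

client.subscribe("home/+/temperature")              # every room's sensor
client.publish("home/lounge/temperature", "21.5")   # one sensor reporting in

client.loop_forever()  # dispatch messages until interrupted
```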

There is a potential danger here though, as outlined by futurologist Ian Pearson: "If the internet of things is not done properly you can just end up with a 1984-type surveillance state." We don't see that happening in 2036, but the potential is there.