Gaming technology often drives innovation across all computing fields. When Nintendo spent 22 billion yen (about €163 million) on research and development way back in 2005, the fruits of that research produced the Wii, which has sold 27 million units worldwide. But what technology can we expect in the consoles and PCs of tomorrow? Read on to find out more about 3D goggles that place you inside your game, headphones that replicate a surround-sound environment and even gaming networks that try to predict what you're going to do next...
Controllers of the future
If you want an idea of how PC technology will change over the next 10 years, look closely at innovations in game controllers. Why? The first point of contact with a PC is always the controller (keyboard, joystick), and if that experience is exceptional, then the memory, processing power, multi-threaded computing and high-end software will also work better for the end user.
Think of the mouse: it changed every computing paradigm when Douglas Engelbart invented it in the 1960s. Even data centres have had to go through a radical transformation in the past 20 years as the mouse has become the dominant method of computer control, so much so that – in the next 20 years – a data centre may become like a remote power plant that mouse-wielding network administrators control from afar.
Look to games first
So what will be the new paradigm-altering controllers? They're mostly found in gaming. The Nintendo Wii remote is one contender, although it has some limitations. Most of the games on the Wii have radically simplified graphics because the Pixart motion-tracking technology is not capable of precise movement – it's not necessarily because Nintendo decided to eschew graphics realism. The PS3 has some new innovations, such as the controller's motion-sensing ability, but it can be difficult to use in certain games, such as Lair and Warhawk.
Novint Technologies has designed the Novint Falcon to showcase how a 3D controller that moves in any direction can change PC gaming for the better. Before we mention the benefits, we should be clear: this technology is in its infancy. It's often difficult to move in 3D space and keep your bearings, and the games that come with the controller are subpar in both gameplay and graphics (think of a technology showcase similar to Nvidia's demos).
Only when we tested several third-party commercial games – such as Half-Life 2: Episode 2 – using a free Novint driver did we see the potential of this technology. The device moves in a four by four by four inch space and offers two pounds of force resistance. You can feel the weight of the shotgun that you're using to mow down an alien Strider on a rampage.
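In essence, that force feedback boils down to computing a resisting force each frame and clamping it to the hardware's two-pound limit. Here's a minimal Python sketch of the idea, using a simple spring model of our own invention rather than anything from Novint's actual driver:

```python
def haptic_force(pos, stiffness=1.0, max_force=2.0):
    """Spring-style restoring force toward the centre of the Falcon's
    workspace (origin at the centre of the four-inch cube).

    Illustrative only: real haptic drivers model object surfaces
    and weights, not just a single spring."""
    # Hooke's law: the force opposes displacement from the rest point
    force = [-stiffness * axis for axis in pos]
    # The hardware can only resist with about two pounds of force
    magnitude = sum(f * f for f in force) ** 0.5
    if magnitude > max_force:
        scale = max_force / magnitude
        force = [f * scale for f in force]
    return force
```

At the centre of the workspace you feel nothing; the further you push, the harder the device pushes back, up to its two-pound ceiling.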
From games to everyday computing
So how could the device work for general computing? In science, free-form movement encourages experimentation. Imagine taking a tour of the 3D world in Google Earth. A mouse only moves in straight and diagonal lines, so a 3D controller means you don't need to constantly adjust your position to move in a free-form fashion. "I think all of gaming, PC or console, is going to move towards 3D interaction like the Wii or the Novint Falcon," says Tom Anderson, CEO of Novint. "When you combine advanced 3D touch like that of the Novint Falcon with 3D stereo displays and 3D sound, you'll have a very compelling gaming experience."
Logitech-owned 3Dconnexion offers the SpaceExplorer 3D controller, which has similar potential to the Novint Falcon. The device feels rugged and metallic: you can immerse yourself in a gaming environment without wondering whether it will fall off the table. There are six movement types – pan left and right, zoom up and down, and rotate left and right. We reprogrammed the buttons for greater freedom of movement before trying it out in Halo 2. The controller worked well for driving vehicles in the game, as it gave a better sense of momentum.
3Dconnexion offers an SDK for developers who want to experiment with gaming control. You can find it at: www.3dconnexion.com/news/press/032007_SDK.php.
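In spirit, a six-axis SDK hands your application a set of axis deltas each frame, which you then translate into camera motion yourself. A rough Python sketch of that translation step – the axis names and camera structure here are our own illustrative assumptions, not 3Dconnexion's actual API:

```python
def apply_six_axis(camera, axes, sensitivity=0.01):
    """Fold one frame of six-axis input into a simple camera state.

    `axes` holds raw deltas for pan, zoom and rotation; the key names
    are assumptions for illustration, not real SDK identifiers."""
    camera['x'] += axes.get('pan_x', 0) * sensitivity
    camera['y'] += axes.get('pan_y', 0) * sensitivity
    camera['zoom'] += axes.get('zoom', 0) * sensitivity
    camera['yaw'] += axes.get('rot_y', 0) * sensitivity
    return camera
```

The point is that all six axes update the view in one pass, which is what gives these controllers their free-form feel compared with a mouse.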
The Saitek Cyborg Command Unit, which is a more targeted gaming peripheral, reveals how an ancillary controller can also create more immersion. When using the controller you can concentrate one hand entirely on movement and the other on shooting with the mouse. The device has 144 programmable buttons, making it easy for you to find a 'natural' way to launch grenades or pick up objects in the game. In Crysis multiplayer matches, you'll notice considerably improved movement and higher kill counts thanks to more precise control.
The Command Unit highlights a key point about computing control. Our brains are wired to split functions into halves, which is why any future input devices should focus on one aspect of control – such as movement or clicking – and not attempt to use one general device for all control and movement. It also explains why the most complex keyboards and controllers, such as those that allow you to map keys to a particular game, often fail in the marketplace.
In the future, controllers should become more focused on the task at hand. Imagine a photo-editing controller that lets you quickly apply Gaussian blur to an image with just one click. Music and video producers have already figured out this categorisation and controllers now exist simply for fast-forwarding through video scenes or audio clips.
Controlling a PC is one thing; becoming fully immersed in an environment is more challenging. Once again, Nintendo is the clear leader in offering peripherals that enhance gaming. The Wii Fit makes you think more about weight loss, posture and balance than the fact that you are standing on a small white platform only a few centimetres high.
Successful immersion requires two factors: the hardware interface must become a natural extension of your interaction, and the software must interact fluidly with the hardware peripheral. 3D displays, 3D surround sound audio and head-mounted displays all point to how computing will change in the next few years.
No hardware add-on has as much potential – and yet is as disappointing – as the 3D display. These monitors could pull gamers, video producers and even software developers into a more realistic realm. In this environment, a video producer could interact with video – creating special effects, pulling files out of easy-to-organise bins and assembling footage – as though the film rolls were sprawled out on a table before him.
Limitations of 3D displays
Interestingly, the main limitation of 3D technology has more to do with human-computer interaction than with the display technology itself. In fact, 3D displays have matured gradually and solved several complex problems. For example, glass manufacturing is now much cheaper, and 3D goggles look less like the instruments an eye doctor uses to test your eyesight and more like sunglasses. Meanwhile, recent first-person shooters use 3D polygons that can be easily mapped to three dimensions in space.
The hurdle is to do with physics. Your eyes have an amazing ability to focus: when you see an object in the distance, your eyes turn in their sockets to focus on it accurately. We are all literally 'drawn to the light', focusing on the brightest object. The most common method used to create a 3D image on screen is called 3D Stereo, and it involves two planes of glass.
The Samsung HL56A650 DLP 59in display and the 22in iZ3D 3D Monitor both use this method. The back screen controls colour intensity and the front screen controls polarisation – the effect of offsetting the image so that it looks 3D. The double-pane glass approach requires that you wear goggles and sit directly in front of the monitor.
At the moment, 3D Stereo causes fatigue after repeated use. The reason: your eyes focus on objects at depth as though they are emanating from the screen (a phenomenon called parallax), yet at the same time you are focusing on the brightest – and, incidentally, only – light at the surface of the screen, causing eye strain.
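The offset the front screen applies can be described with simple geometry: a point at the screen plane needs no offset, while points further behind it need the two eye views shifted apart. A hedged Python sketch of the standard disparity relationship – a simplification of what real stereo drivers compute:

```python
def stereo_disparity(eye_separation, screen_dist, object_dist):
    """On-screen offset (same units as eye_separation) between the
    left- and right-eye views of a point at object_dist.

    Zero at the screen plane, growing as objects recede behind it;
    a negative value means the point pops out in front of the screen."""
    return eye_separation * (object_dist - screen_dist) / object_dist
```

For a typical 6.5cm eye separation and a screen 2m away, an object 4m away needs the two views offset by about 3.3cm, which is exactly the offsetting work the polarising front pane performs.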
Some viewers have strabismus, a misalignment of the eyes that makes it impossible to perceive the 3D effect from 3D Stereo technology. However, it's not all bad news: 3D Stereo is amazingly realistic, not as costly as other 3D monitor technologies and requires no software re-tooling. Samsung uses DDD technology for the software drivers that create its 3D Stereo effect.
Nvidia is also planning to release 3D Stereo technology. When we tested it at CES in Las Vegas, it was clear that the company has tweaked the polarisation and software drivers for more realistic 3D modelling. Real-time strategy games such as Age of Empires took on an ultra-realistic 3D perspective.
Another innovator is SeeReal (www.seereal.com), which is using holography for 3D display rendering. Instead of using two panes of glass, holography modulates the timing, intensity and brightness of the light emitted from a standard LCD for each RGB sub-pixel, and then reconstructs the 3D image in space. It's still an untested technology, but the advantage is that the recreated images don't cause eye strain.
Visual displays are only one aspect of immersion and innovation on a PC. What you hear must match the more realistic environment, or the experience will be lessened. Second Life is a good example of how the video and audio in a virtual environment are not always in sync.
Too often, the graphics look relatively realistic but the two-channel audio – not to mention the obnoxious chatter between pre-teens – almost ruins the experience. When a tank lumbering along in Crysis moves out of a distant ravine to surprise you from behind, it should fill the audio spectrum in those locations and make gradual auditory movements.
Several surround sound headphone sets simulate these movements accurately for gaming: the Saitek Cyborg 5.1, the Sennheiser RS-130 and the Sony MDR-DS8000. Experiencing high-quality 3D audio while gaming creates an otherworldly auditory experience that matches the action on screen.
So how can a two-channel headphone set mimic 3D surround sound audio? Once again, as with 3D display technology, it's all about the physics. Surround sound headphones split two-channel audio and isolate frequencies to map locations inside the audio spectrum. The source material does not have to be encoded for surround sound.
The Sennheiser RS-130 headphones use a complex algorithm to detect the timing differences in a two-channel audio source, adding depth to the audio so that certain sounds – explosions, distant music, lightning bolts on the horizon – appear to be coming from behind or to the side.
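The simplest of these timing tricks is the interaural time difference: delay one channel by a fraction of a millisecond and the brain places the sound off to one side. A toy Python illustration of the principle – not Sennheiser's actual algorithm, which also shapes frequencies:

```python
def apply_itd(samples, delay_samples):
    """Return (left, right) channels built from a mono source, with
    the right channel delayed by delay_samples.

    Sound that reaches the left ear first is perceived as coming
    from the listener's left; real headsets combine this with
    frequency filtering for front/back cues."""
    left = list(samples)
    right = [0.0] * delay_samples + list(samples[:len(samples) - delay_samples])
    return left, right
```

At a 44.1kHz sample rate, a delay of around 30 samples corresponds to the roughly 0.7ms it takes sound to travel the width of a human head.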
The Ultrasone HFI-780 headphones go a step further: they use an audio driver that sends audio around your ear instead of into the ear canal. This creates a more natural surround-sound experience that's immediately obvious compared to headphones that change the frequency of audio electronically to create the sensation of 3D sound.
Jump into the game
Video goggles attempt to do away with the separate entities of LCD screen and headphones when creating a virtual environment. It's an admirable goal: goggles mean more portable computing and a lighter impact on the environment, since less power is required.
Video goggles also have incredible benefits for gamers. The display can be massive and portable. You could watch 1080p HD video anywhere you go simply using goggles and a mobile phone. In gaming, the effect is highly engaging to the senses because you're transported by the personal nature of the display and aren't as easily distracted by your surroundings.
The eMagin Z800 3DVisors simulate the experience of watching a 105in display from 12ft away. Their OLED panels are thin and emit their own bright light. The panels use a CMOS silicon substrate that buffers the data being fed to the goggles for each pixel and then reconstructs the image so that there is absolutely no flicker during viewing.
The 3DVisors also provide head tracking using a USB connection – so you can move the mouse with your head. This means that when you look around, the display moves at the same time.
The goggles are designed for PC use and include a standard VGA connection cable. The benefits are profound: more portability and more immersion. Yet when testing the product, it became clear that there's something not quite right about the goggles. They tend to cause nausea in a first-person shooter because it's so easy to lose your bearings.
An LCD monitor actually draws you into a game more effectively, because you can keep your bearings from the stationary objects around you as you slog through an open-air sewer looking for alien bugs to blast into oblivion.
The Vuzix iWear AV920 goggles – which do not provide head tracking – are similar to the 3DVisors in that they project a huge image (in this case, about 62in). The AV920 will work with an Apple iPod Video or a portable DVD player (or any RCA video source) and run for about five hours on one battery charge. Instead of using OLED, the AV920 uses two 640x480 LCD displays that you look through for a projected image.
The LCD projection technology is still at an early stage: the goggles don't support widescreen viewing and aren't wireless, so you have to use at least one AV cable to connect to your mobile device. However, the iWear AV920 gives a brighter and crisper experience than the 3DVisors, thanks to a wider colour gamut of one million colours. During a multiplayer match of Enemy Territory: Quake Wars on an Xbox 360, the AV920's display looked crisp and colourful, although, like the 3DVisors, the goggles did cause slight nausea.
It's possible that video goggles will never become a viable viewing technology for gaming – or any other computing technology – because the display seems to float in space, meaning that you can't get your bearings.
One solution to this problem may be to re-create stationary objects at high resolution. If the goggles emitted a table and desk in the room, with the moving image running on a virtual wall or even a virtual television, your eyes would adjust to the movement.
Say goodbye to lag
The fastest-moving technology for PCs and consoles has to do with networking and multiplayer gaming. Several companies are focusing on latency issues, trying to solve the problems that occur when your bandwidth can't keep up with the intensity and frame rate of the on-screen action.
The D-Link DGL 4500 X-Treme N Gaming Router uses GameFuel technology to give priority to packets used for multiplayer games and reduce throughput for other activities – such as downloading a file. The reason: when a download takes a few minutes longer, you don't notice, but when you're coming around a corner ready to fire a rocket launcher in Halo 3 and the screen pauses for a second due to latency, it's a little bit more obvious. Streaming media adaptors such as the Roku Netflix Player and the Apple TV also have the same latency problem: you can easily spot stuttering video frames.
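In principle, this kind of quality-of-service scheduling is just a priority queue: packets tagged as game traffic always leave the router before bulk-download packets. A toy Python sketch in the spirit of GameFuel – not D-Link's actual firmware logic:

```python
import heapq

class PriorityPacketQueue:
    """Toy QoS scheduler: lower priority numbers depart first, and a
    sequence counter keeps FIFO order within each traffic class.

    The class names and priority values are illustrative assumptions."""
    PRIORITY = {'game': 0, 'voip': 1, 'download': 2}

    def __init__(self):
        self._heap = []
        self._seq = 0

    def enqueue(self, traffic_class, packet):
        heapq.heappush(self._heap,
                       (self.PRIORITY[traffic_class], self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]
```

Queue up a chunk of a file download and then a burst of game packets, and the game packets jump the queue: exactly the behaviour that keeps that rocket launcher responsive while a download trickles on in the background.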
One way of solving the problem lies in the game software itself. In the new Massively Multiplayer Online (MMO) game Age of Conan: Hyborian Adventures, there is very little lag during online multiplayer sessions because the online game code constantly predicts the players' movements and fills in polygons as you move. Even if you experience a slight slowdown over your broadband connection, you're less likely to see any pausing on screen than in previous-generation MMOs.
This technology will eventually make its way into other computing paradigms: pervasive networks that are aware of your location, voice-over-IP applications, video chats over the Internet and scientific simulations between geographically dispersed laboratories can all benefit immediately from latency-reducing programming techniques. For the most part, it's a predictive technology: the software is smart enough to know what should happen next, both graphically and programmatically, even if the network is not running fast enough to present the next image.
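The classic form of this prediction is dead reckoning: between network updates, the client extrapolates each player's last known position and velocity, then corrects when fresh data arrives. A minimal Python sketch of the extrapolation step – real MMO code layers smoothing and correction on top of this:

```python
def predict_position(position, velocity, dt):
    """Dead reckoning: extrapolate the last known position forward by
    dt seconds using the last known velocity.

    A straight-line sketch; real engines blend the prediction back
    toward the server's authoritative position when an update lands."""
    return tuple(p + v * dt for p, v in zip(position, velocity))
```

If a rival player was last seen at the origin moving at one metre per second east and two north, half a second later the client draws them half a metre east and one north, so the screen never freezes waiting for the network.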
Overall, gaming technologies are increasingly looking outside the box and redefining what computing means. Over the next 20 years, we can expect more 3D manipulation that uses any surface for a display, a drag-and-drop programming mindset for more free-form interaction with objects on the screen, and predictive networking technologies that use artificial intelligence to predict your needs and future actions.