Meet the real-life human cyborgs

The US military's HULC exoskeleton

Stelios Arcadiou has an ear growing out of his arm. Rob Spence has a video camera hidden in his false eye. Jerry Jalava's finger is a detachable USB drive. Kevin Warwick – yes him – likes nothing better than sticking radio chips under his skin or connecting his central nervous system to robot arms.

Four very different men with four very different kinds of technology, but they all have one thing in common: they're cyborgs.

We've had cyborgs for a long time - the term was coined in 1960 by scientists Manfred Clynes and Nathan Kline to describe people whose bodily functions are aided or controlled by technology. By that definition, anyone with a pacemaker or a hearing aid is a cyborg.

In recent years, however, we've gone beyond using tech to fix bits of us when they break. Increasingly, we're using technology to expand the possibilities of the human body and to blur the lines between (wo)man and machine.

But do we really need ears in our arms?

Ear we go

Stelios Arcadiou, aka Stelarc, probably isn't the template for Humans 2.0: his extra ear, grown in a lab from cells, is part of an ongoing performance art project designed to make us think. In interviews, he explains:

"I'm speculating on ways that individuals are not forced to, but may want to, redesign their bodies - given that the body has become profoundly obsolete in the intense information environment it has created…

"We shouldn't have a Frankensteinian fear of incorporating technology into the body, and we shouldn't consider our relationship to technology in a Faustian way - that we're somehow selling our soul because we're using these forbidden energies. My attitude is that technology is, and always has been, an appendage of the body."

Rob Spence is making a point, too. As the Toronto filmmaker explains: "I am a filmmaker who lost an eye so naturally I decided to modify my prosthetic eye into a video camera. I am not a lifecaster. I will use the eye-cam the same way I use a video camera now - or the same way any filmmaker would use a camera-enabled cell phone."

Spence is working on a documentary "about how video and humanity intersect, especially with regards to surveillance."

ALL-SEEING EYE: Rob Spence's Eyeborg project uses a secret video camera implanted in his false eye

That doesn't mean artificial eyes and embedded cameras aren't coming. At MIT, researchers have developed a digital eye that enables the blind to see. At first, it was a giant machine costing $100,000. Then, a $4,000 desktop system. Now it's portable and costs around $500.

Elsewhere at MIT you'll find SixthSense, a wearable computer that uses a camera as an input device and nearby objects as display screens. The current prototype costs just $350 to build.

Rob Spence's eye uses a camera sensor developed by OmniVision, which specialises in high quality cameras for medical devices such as endoscopes. The firm is also working closely with Stanford University's Daniel Palanker on the Retinal Prosthesis project, a hugely complex and ambitious attempt to use sub-retinal implants to restore blind people's sight.

As an OmniVision spokesperson told us, the firm "agreed to participate in the project to jump-start and/or fuel research to provide vision for the blind."

Palanker has published a number of scientific papers detailing the project, and they make fascinating reading. In Design of a high-resolution optoelectronic retinal prosthesis [PDF] he explains how "an image from a video camera is projected by a goggle-mounted collimated infrared LED-LCD display onto the retina, activating an array of powered photodiodes in the retinal implant." In essence, the camera and implant stand in for the eye's damaged photoreceptors, turning video into patterns of stimulation the retina can pass on to the brain.
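Palanker's software isn't described in the article, but a toy sketch gives a feel for the first step of that pipeline: squeezing a full camera frame down to the handful of "pixels" a small photodiode array could actually deliver. Everything here - the 40x40 grid, the scaling - is invented for illustration, not taken from the real implant.

```python
# Toy illustration of the pipeline described above: a greyscale camera frame is
# block-averaged down to the resolution of a small photodiode grid, and each
# cell's brightness stands in for a stimulation intensity. The grid size and
# scaling are made-up example values, not the real prosthesis parameters.
import numpy as np

def frame_to_stimulation(frame, grid=(40, 40)):
    """Downsample a 2-D greyscale frame (0-255) to a coarse grid of
    stimulation intensities between 0.0 and 1.0."""
    rows, cols = grid
    h, w = frame.shape
    # Crop so the frame divides evenly into grid cells, then average each block.
    trimmed = frame[: h - h % rows, : w - w % cols].astype(float)
    bh, bw = trimmed.shape[0] // rows, trimmed.shape[1] // cols
    cells = trimmed.reshape(rows, bh, cols, bw).mean(axis=(1, 3))
    return cells / 255.0

if __name__ == "__main__":
    # Stand-in for a real camera frame: 640x480 greyscale noise.
    fake_frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
    stim = frame_to_stimulation(fake_frame)
    print(stim.shape)  # (40, 40) grid of values in [0, 1]
```

The real system does far more than this, of course - the goggle display, the infrared projection and the powered photodiodes are all hardware - but the sketch shows why resolution, not optics, is the hard limit on what such an implant can show.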

Hands-on technology

When Jerry Jalava was fitted with a prosthetic finger after a motorbike accident, he decided to make the finger more useful - by turning it into a USB drive containing Linux and some key applications.

"I'm planning to use the other prosthetic as a shell for the next version, which will have [a] removable fingertip and RFID tag," he writes.

Prosthetics have come a long way in recent years, with amputees now able to take advantage of myo-electric prosthetics that respond to the body's own muscle signals. In May, for example, Dawn O'Leary was fitted with a prosthetic arm that offers fine motor control similar to that of a real arm.

Sensors on her skin pick up the electrical signals from her remaining muscles and use them to operate the digits, enabling her to carry out complex tasks such as grasping the handle of a cup.
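The article doesn't describe the control software, but the basic idea behind myo-electric control can be sketched in a few lines: rectify the muscle signal, smooth it into an envelope, and trigger the hand when the envelope crosses a threshold. The sampling rate, window and threshold below are illustrative guesses, not values from O'Leary's arm.

```python
# Toy myo-electric controller: rectify a simulated surface EMG trace, smooth it
# into an envelope, and open or close a gripper when the envelope crosses a
# threshold. All numbers here are made up for the sake of the example.
import numpy as np

FS = 1000          # samples per second (assumed)
WINDOW = 100       # 100 ms moving-average window
THRESHOLD = 0.3    # envelope level that triggers a grip

def envelope(emg):
    """Full-wave rectify the signal and smooth it with a moving average."""
    rectified = np.abs(emg)
    kernel = np.ones(WINDOW) / WINDOW
    return np.convolve(rectified, kernel, mode="same")

def gripper_commands(emg):
    """Turn an EMG trace into a per-sample open/close command stream."""
    return ["close" if level > THRESHOLD else "open" for level in envelope(emg)]

if __name__ == "__main__":
    t = np.arange(0, 2, 1 / FS)
    # Simulated signal: quiet muscle for one second, then a contraction burst.
    emg = 0.05 * np.random.randn(len(t))
    emg[len(t) // 2:] += 0.6 * np.sin(2 * np.pi * 80 * t[len(t) // 2:])
    commands = gripper_commands(emg)
    print(commands[250], commands[1750])  # "open" at rest, "close" during the burst
```

Real prosthetic controllers are far more sophisticated - several muscle sites, pattern recognition, proportional speed - but the rectify-smooth-threshold loop is the core of how a skin sensor becomes a moving finger.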

Researchers in Chicago have gone even further. The Neural Engineering Center for Artificial Limbs has developed techniques that combine myo-electric limbs with nerve rerouting - a procedure known as targeted reinnervation - to deliver even finer motor control, and some patients can even feel the objects they grip or touch. You can see the technology in action on YouTube.

TOUCH AND FEEL: Jesse Sullivan operates a bionic arm via nerve signals [Image from RIC video]
