'When are the metal ones coming for me?'
Wired for War author PW Singer talks about robots in war
When should we salute our metal masters?
Let's not beat around the bush. The future vision described in the book is pretty frightening, isn't it? "Yes, it's pretty darn scary. But then again, the very first line of Wired for War is 'Because robots are cool.' That is my answer as to why someone writes a book about robots… [It's] also written in a way that isn't meant to scare, but to approach this important topic with a sense of both excitement as well as foreboding."
So how real is the threat of AI with evil intent, given the reactions of those Singer spoke to on this topic in the book? "Perhaps you should rephrase the question as, 'So when should we salute our metal masters?'" he jokes.
"Look, you can't write a book about robots and war without having to deal with the 'When are the metal ones coming for me?' question. Essentially, four conditions would have to be met. First, the machines would have to have some sort of survival instinct or will to power.
"In the Terminator movies, for instance, Skynet decides to launch a nuclear holocaust against humans in a bizarre form of self-defence, after frightened humans attempt to take it offline when it reaches sentience.
"Second, the machines would have to be more intelligent than humans, but have no positive human qualities (such as empathy or ethics)," he continues.
"The third is that the machines would have to be independent, able to fuel, repair, and reproduce themselves without human help. And, fourth, humans would have to have no useful fail-safes or control interface into the machines' decision-making. We would have to have lost any ability to override, intervene, or even shape the machines' actions."
Could it happen?
Singer believes these conditions would present too high a barrier in the short term. "For example, most of the focus in military robotics today is to use technology as a substitute for human risk and loss," Singer explains. "That is the very opposite of giving them any survival instinct.
"Second, machines may well reach human-level intelligence someday, perhaps even sooner than most expect, given the rapid doubling effect of Moore's Law, which sees the power of our technology double just under every two years. But it is not certain.
"Third, while our real-world robotics are becoming incredibly capable, they all still require humans to run, support, and power them." Singer reaches for an example – the Global Hawk drone, the replacement for the manned U-2 spyplane.
"It can take off on its own from New York, fly 3,000 miles on its own to London, stay in the air 24 hours, using its surveillance and intelligence gathering systems to hunt for a terrorist over the entire city, then fly back 3,000 miles on its own to New York, and land on its own. But the drone still needs humans on the ground to gas and repair it.
"Fourth, there are enough people spun up about the fears of a robot takeover that the idea that no one would try to build in any fail-safes is a bit of a stretch. Most importantly, perhaps, the whole idea of a machine takeover rests on a massive assumption.
"As many roboticists joke, just when the robots are poised to take over humanity, their Microsoft software programs will likely freeze up and crash.
"The counter to all of this, of course, is that eventually a super-intelligent machine would figure out a way around each of these barriers. In the Terminator storyline, for example, the Skynet computer is able to trick, manipulate, or blackmail humans into doing the sorts of things it needed (for example, emailing false commands to military units or putting humans in concentration camps), as well as rewrite its own software (something happening today with evolutionary software)."
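Singer's aside about Moore's Law can be made concrete with a quick back-of-the-envelope calculation. The sketch below (a hypothetical illustration, not from the book) simply compounds the "doubling just under every two years" rule to show how fast that curve climbs:

```python
# Back-of-the-envelope Moore's Law illustration (hypothetical numbers):
# if computing capability doubles roughly every two years, how much
# cumulative growth does that imply after a given number of years?

def moores_law_factor(years, doubling_period=2.0):
    """Return the multiplicative growth factor after `years`."""
    return 2 ** (years / doubling_period)

for years in (2, 10, 20, 40):
    print(f"{years:>2} years -> {moores_law_factor(years):,.0f}x")
# ->  2 years -> 2x
#    10 years -> 32x
#    20 years -> 1,024x
#    40 years -> 1,048,576x
```

The point of the exercise is the one Singer makes: over a single decade the compounding gives roughly a 32-fold jump, which is why "sooner than most expect" is a live possibility even though human-level machine intelligence is not certain.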
Singer says the idea that we would not learn our lessons from science fiction is somewhat voided by the fact that real world military expediency has us carrying out research into all sorts of systems that science fiction directly warns us about.
"This is nothing new. HG Wells' warning of what he called an 'atomic bomb' in the anti-war story The World Set Free instead served as the inspiration for the Manhattan Project. As I talk about in my book, one robotics firm was actually asked a few years ago by the military if they could design a robot that looked like the 'Hunter-Killer robot of Terminator.'
"It wasn't such a silly request. The design would actually be quite useful for the sort of fights we face now in Iraq and Afghanistan," Singer concludes.
Dan (Twitter, Google+) is TechRadar's former Deputy Editor and is now in charge at our sister site T3.com. Covering all things computing, internet and mobile, he's a seasoned regular at major tech shows such as CES, IFA and Mobile World Congress. Dan has also been a tech expert for many outlets including BBC Radio 4, 5Live and the World Service, The Sun and ITV News.