Member states of the United Nations are meeting this week in Geneva to discuss the legality of lethal autonomous weapons - or "killer robots".
As robotic technology improves, more and more countries are working to integrate it into their defence systems in a variety of ways. But the ethics of such capabilities remain the subject of intense debate.
That's why the UN's 'Convention on Certain Conventional Weapons' exists. It creates rules for certain weapons so that wars, if they're fought, are fought as humanely as possible. Landmines, incendiary weapons and blinding laser weapons are all examples of weapons that the convention restricts or prohibits.
The members of that convention are meeting to determine how to frame the debate over what types of autonomous weapons should be added to that list. The discussion will likely involve hours of wrangling over how to define terms like "lethal", "autonomous" and "weapon". A similar conference a year ago failed to reach agreement.
In the hope of influencing the process, the Red Cross has issued a statement insisting that, to comply with existing humanitarian law, such weapons must be predictable and that countries must ensure the authority and power to kill remains with humans, not algorithms.
That would mean, the group says, "strict operational constraints with respect to the task carried out, the targets attacked, the operational environment, the geographical space and time of operation, the scope to enable human oversight of the operation of the weapon system, and the human ability to deactivate it if need be."
Cyber or Electronic Warfare
Human Rights Watch has released a similar call, with senior arms division researcher Bonnie Docherty saying: "Humans should retain control of weapons systems and individual attacks, not only of overall operations… Mandating human control would help avoid threats to the fundamental moral principles over the decision to use force."
Last month, even the Pentagon weighed into the debate, releasing a statement insisting it would never delegate to a machine the authority to make lethal decisions. "The only time we'll delegate [such] authority [to a machine]", said deputy defense secretary Bob Work, "is in things that go faster than human reaction time, like cyber or electronic warfare."
No formal decisions will be made in Geneva this week on whether the technology should be outlawed. Instead, the meeting's goal is to build a base of knowledge about the technical, ethical, legal, operational, security and other concerns surrounding autonomous weapons.
The final decision is likely to come at the treaty's Fifth Review Conference, held every five years. That will take place in December 2016.