‘Drones shouldn’t decide who lives or dies’

An X-47B Navy drone touches down aboard the nuclear aircraft carrier USS George H. W. Bush. (AP Photo/Steve Helber)

Published Oct 15, 2014

Johannesburg - University of Johannesburg law professor Hennie Strydom on Wednesday advised against the use of programmed drones and robots in armed conflicts.

“The concern is that the critical function, the use of force, is controlled by a computer,” he told reporters in Johannesburg.

“At the moment it (autonomous system weapons) has only been used on machines against machines and has not been used on humans. But in principle (these) weapons that are now being developed can in future be used against human beings as well.”

He was speaking to the media ahead of his lecture on the implications of armed drones and autonomous weapons systems.

According to the United Nations Office of the High Commissioner for Human Rights (OHCHR), unmanned combat air vehicles (UCAVs), commonly known as drones, enable those who control lethal force not to be physically present when it is deployed, but rather to activate it by computer in faraway places and out of the line of fire.

The OHCHR adds that lethal autonomous robotics (LARs) are robots programmed to take action on their own once they sense a threat.

As such, targeting decisions could be taken by the robots themselves.

Strydom questioned the practicality of using such weapons in warfare and the implications of “machines deciding on who lives and dies”.

“Can they really make the distinction between legitimate targets and civilians? Can they distinguish a hunter from an enemy combatant?”

Strydom said the advancement of technology in aid of human life was welcome, but human involvement should never be replaced by machines.

Drones retained a level of human involvement, whereas a LAR was akin to a computer deciding which target posed a danger and whether to eliminate that danger.

“Isn't there inherently something wrong with machines taking the decision to kill human beings? If they can take our lives, they can take all kinds of other decisions,” he said.

“Do we want to concede that kind of decision of 'are you going to live or die?' to machines? It also affects human dignity as well. Do we want to live in a world where robots are killing people?”

The use of drones and LARs had changed the landscape of warfare, with countries such as the United States able to launch an offensive against a target hundreds of kilometres away without fear of losing personnel.

Strydom said that although there had never been a case of LARs being used offensively in war, there should be laws to govern how such machines were used.

“Autonomous systems are used as shields to fend off incoming attacks or if there is a plane coming in. In an offensive manner, autonomous systems weapons have not been used as of yet but the closest thing can be the X-47B and BAE Systems Taranis,” he said.

The BAE Systems Taranis is a British demonstrator programme for unmanned combat air vehicle technology, while the Northrop Grumman X-47B is a demonstration unmanned combat air vehicle designed for carrier-based operations.

Sapa
