Imagine a weapon that’s been given the leeway to sense and discern a military threat on the battlefield, instantaneously — much faster than a human, by the way — with orders to kill, on its own. Seek and destroy — no controls.
Military planners around the world are struggling with the ethics and the practicality of killer weapons on the battlefield. How much freedom can they give unmanned killer weapons? Should they give them any at all?
Some military officials believe a human being should always be in the decision-making loop, to oversee such a weapon.
Nobel Peace laureate Jody Williams sees only black and white on this issue — she is helping lead a campaign for a new international treaty to ban killer weapons, once and for all.
At a news conference Monday at the UN, Williams said these lethal autonomous weapons “are crossing a moral and ethical Rubicon and should not be allowed to exist and be used in combat or in any other way,” the Military Times reported.
The American peace activist said the Campaign to Stop Killer Robots, which started in 2013 and now has 130 groups in 60 countries supporting it, is trying to gain support from governments and people everywhere to step up pressure for a treaty, “so we don’t see these weapons unleashed on the world.”
Williams, who shared the 1997 Nobel Peace Prize for her key role in the successful campaign for a treaty banning land mines, came to New York with members of the killer robot campaign to meet with diplomats from the UN General Assembly’s disarmament committee, the Military Times reported.
They brought with them a robot that spoke out against killer robots, both to the news conference and to a side event for UN member nations afterward.
“A machine is not a moral anything,” Williams said. “So, allowing machines, in theory through algorithms, to decide what they will target and what they will attack is one of the huge reasons why we consider [this] to be crossing the Rubicon, and grossly unethical and immoral.”
“Machines should be in the service of human beings,” she said. “Human beings should not be in the service of machines.”
Liz O’Sullivan, who resigned from the artificial intelligence company Clarifai Inc. over a project that could be used to build autonomous weapons and is now technical director of the Surveillance Technology Oversight Project, said she is devoting her life to preventing killer robots.
“There is absolutely nothing stopping a nation or any group of people from creating their own version of fully autonomous weapon systems today,” she said. “They wouldn’t work very well. They might not be safe, but they certainly do exist.”
O’Sullivan said it seems every branch of the military is working on its own version of these weapons, the Military Times reported.
“There are autonomous boats that will potentially have the ability to fire, there are autonomous drones … and also vehicles,” she said. “The easiest technological problem to solve is through the drones, so that’s most likely what we’ll see first, and the rest as technology advances.”
O’Sullivan said that is what is known just from public information and there are almost certainly classified programs.
“These killer robots … aren’t a future problem,” she said. “They’re possible today, and they’re something that we need to work to control right now.”
Mary Wareham, advocacy director of Human Rights Watch’s arms division and coordinator of the Campaign to Stop Killer Robots, said, “It is abundantly clear that we are moving very swiftly in the direction of fully autonomous weapons, which is why we’re calling for diplomacy to speed up.”
In mid-November the parties to the Convention on Conventional Weapons will meet in Geneva, where they could agree to start treaty negotiations.
But Wareham said that committee operates by consensus and while “the vast majority of countries want to move forward,” they are being held back by the countries most advanced in developing autonomous weapons.
She said the United States has led in developing these weapons, followed by Russia and China, but other countries are also “heavily invested,” including South Korea, Israel, the United Kingdom “to some extent” and perhaps Turkey and Iran.