
How Does the Story of 'Killer Robots' End?

"The technology makes it possible, but the politics makes it happen."

A man kneeling to photograph an autonomous weapons system. (Photo: DARPA)

In the next episode of the dystopian sci-fi soap that global current events have become, killer robots will fight our wars, choosing targets as they please, without human guidance. World leaders have growing access to autonomous weaponry as the militaries they control advance into new technological territory.

Experts are growing concerned, so concerned, in fact, that a coalition called the Campaign to Stop Killer Robots was founded to address this very issue. And this year, the 123 nations that make up the international Convention on Conventional Weapons will address the challenges raised by autonomous weapons able to act outside meaningful human control.

That said, there’s no cause for panic just yet. Starting a new military program around newly developed weapons could take anywhere from ten to twenty years, says Paul Scharre, senior fellow and director of the Future of Warfare Initiative at the Center for a New American Security.

“The current president gets to decide what to do with the military, but is very limited in his ability to change the scope of the military today,” Scharre says. Presidents actually lay the foundation for militaries that future presidents will inherit.

With the current technology, President Donald Trump could engage military forces, but he couldn’t actually declare war; only Congress has that power. If the president were responding to a crisis, for example, he wouldn’t be allowed to keep armed forces abroad for more than 60 days without a declaration of war by Congress, thanks to a little thing called the War Powers Act of 1973.

The last president to sign a declaration of war was Franklin D. Roosevelt, who signed the declaration of war against Japan after the Pearl Harbor attack in 1941. With the advent of autonomous weaponry, however, the limits on presidential power to dictate longer-term military engagement have become hazier.

In 2011, for example, the White House asserted that President Barack Obama didn’t need Congressional approval to continue a military campaign in Libya because “U.S. operations [did] not involve sustained fighting or active exchanges of fire with hostile forces, nor [did] they involve U.S. ground troops.”

Since only robotic systems, rather than human soldiers in harm’s way, were operating in Libya, the White House argued, the War Powers Act didn’t apply. “That’s really fascinating from a checks and balances authority perspective,” Scharre says. “To assert that the president doesn’t need approval from Congress to fight a war simply because no lives are at risk is, I think, a very troubling prospect.”

And the threat of autonomous weapon systems waging war for us could be even more cause for concern. “Passing a treaty doesn’t guarantee that countries refrain from building a weapon. In past examples, countries said they wouldn’t do certain things in time of war and when war happened, they did so anyway,” says Scharre. “That’s the nature of war, it’s a failure of cooperation. It’s possible autonomous weapons aren’t a great idea, but that’s the nature of military competition and distrust among nations.”

Scharre estimates that at least 30 countries possess defensive systems with high levels of automation, while P.W. Singer, a fellow at New America and author of Wired for War, estimates that about 80 countries have military robotics. These defensive systems have modes that can be switched on and off, and can track and shoot down incoming objects on their own in scenarios that demand an immediate response.

The Phalanx

One example of an autonomous weapon system is the Phalanx, which is used on American naval ships to target incoming missiles. When the system senses a missile, it automatically switches on and attacks.

The Harpy

Another autonomous weapon system is the Harpy, an Israeli-developed “fire-and-forget” drone that flies in circles, looking to detect and destroy radar emitters. When the United States invaded Iraq in 2003, for instance, the Harpy found and destroyed radar systems so that American forces could fly into Iraqi airspace without being attacked.

An unmanned vessel optimized to robustly track quiet diesel-electric submarines. (Photo: DARPA)

Yet another example is ACTUV, an unmanned anti-submarine vessel designed to track and trail target submarines. There is also the MQ-25 Stingray, a carrier-launched unmanned aircraft that grew out of UCLASS, the Unmanned Carrier-Launched Airborne Surveillance and Strike program.

“If you go back 15 years, our forces were going to Afghanistan after the 9/11 attacks and they had a handful of unmanned aerial systems, or drones, and none of them were armed,” says Singer. Today, he says, there are more than 10,000 drones in the U.S. military inventory and another 12,000 unmanned systems on the ground. Just as everything from smart homes to self-driving cars is becoming automated, military robotics is following the same trend, he explains. “You shouldn’t be surprised that things happening in civilian life are moving to war,” Singer says.

Only now, the politics are also different. “You had these little ground robots before, but people weren’t as interested in using them,” he says. There was debate over whether they should be armed, but after 9/11, the debate ended with a consensus that they should be. “The technology makes it possible, but the politics makes it happen,” says Singer.

According to a Human Rights Watch release called “Ban ‘Killer Robots’ Before It’s Too Late,” experts predict that weapons systems could become fully autonomous within 20 to 30 years. While fully autonomous weapons don’t exist yet, “high-tech” militaries like those of the United States, China, Germany, Israel, Russia, the United Kingdom, and South Korea have been developing “precursors that illustrate the push toward greater autonomy for machines on the battlefield.”

The Campaign to Stop Killer Robots calls for a preemptive ban on weapons that don’t yet exist, explains Mary Wareham, who coordinates the campaign. “We acknowledge there are systems out there now with various levels of autonomy in critical functions, we just don’t call any of them killer robots just yet,” she says.

“[Killer robots] would remove the human from the kill decision. That crosses fundamental moral and ethical lines we don’t believe should be crossed,” Wareham tells Inverse.

The concern isn’t just about using fully autonomous weapons systems on the battlefield; they could also be deployed in local law enforcement and at the border.

“There is a lot of money being sunk into weaponizing artificial intelligence. At the moment, the U.S. keeps saying, ‘don’t worry, there will always be a human involved,’ but involved how? To what extent over critical functions?” Wareham wonders.

Governments have been discussing killer robots at the Convention on Conventional Weapons since the end of 2013, and the United States has been a key player in those talks thus far.

The scores of countries involved in these talks may eventually open formal negotiations on a cooperative international approach to killer robots, but with a new presidential administration in place, Wareham says the jury is still out.

“A new international treaty on killer robots is going to happen, I just can’t say when and what it would cover and who would sign up,” Wareham says. “The U.S. is participating in those talks. We’ll know soon if it’s changed its position.”

In the meantime, she adds, the Campaign to Stop Killer Robots aims to raise awareness and build support for its objective: a preemptive ban on the development, production, and use of fully autonomous weapons systems. Wareham works to build a coalition, bringing new groups to the campaign, especially from the Global South. In the last three years, she’s secured more than 3,000 signatures from artificial intelligence and robotics experts, including 20 Nobel laureates, supporting a call for the ban. Nineteen countries have also come out in support of the ban, she says.

“We’re not trying to prohibit existing weapons systems, and it’s not our job, it’s the job of the government,” says Wareham. And though officials say a human will continue to be involved in the weapons systems, the campaign worries that policy will weaken. “It’s still early enough where we can do something about it,” she says. “But we’re running out of time.”
