If we don’t adopt legislation soon to stop killer robots from proliferating, they could become commonplace. That’s the message of a new report from the peace advocacy nonprofit PAX. The report claims weapons manufacturers and countries are moving toward producing autonomous weapons systems, and warns that these systems could open the door to immoral and unethical use.
PAX surveyed 50 arms producers, asking whether they were working on autonomous weapons and whether they were “committed to not contributing to the development of lethal autonomous weapons.” From there, PAX rated each company on how concerning its practices were, ranking 30 of the 50 as “high concern.” The report was published this week.
Those 30 companies include American defense contractors like Lockheed Martin, Boeing, and Raytheon. The report, titled “Slippery Slope,” claims these companies are working on technology that could be used for killer robots and that they lack “clear policies” outlining how to keep killer robots in check.
The kinds of weapons these companies are working on range from autonomous combat drones to autonomous submarines and tanks. Many are also developing technology that would allow killer robots to swarm together.
What’s notable about this report is how much detail these companies were willing to give PAX. They often gave specific names for the weapons they were working on. It’s as if they’re not worried about being held accountable, and they’re probably right. The United States has longstanding ties to defense contractors on the list.
"The development of such weapons would have an enormous effect on the way war is conducted and it has been called the third revolution in warfare, after gunpowder and the atomic bomb.
As former Google employee and anti-killer robot activist Laura Nolan told Inverse in September, these machines are poised to become the next “weapons of mass destruction.” She said they will likely start out as weapons only the richer countries have but will eventually be obtained by terrorist groups and rogue nations.
PAX doesn’t believe all military use of AI is problematic. The group considers using AI to assist soldiers on duty generally acceptable, as long as they can easily take control of whatever machine they’re operating. The problem arises when AI allows a machine to autonomously target and kill humans.
In order to address this growing problem, PAX is calling for these weapons companies to establish ethics committees to review what they’re producing and ensure “the principle of meaningful human control is an integral part of the design and development of weapon systems.”
The Campaign to Stop Killer Robots is calling for all countries to join a treaty that would ban the use of killer robots in war zones. It’s worth noting that we could soon find ourselves living among killer robots, as many American police departments have expressed interest in using robots for their work in recent years. If we don’t start passing laws and signing treaties to stop these weapons from proliferating soon, we may end up trying to halt their spread after it’s already too late.