A professor of artificial intelligence has called on the United Nations to impose a global ban on lethal autonomous weapons, fearing the “terrible and terrifying prospect” of robots killing civilians. Toby Walsh, from the University of New South Wales in Australia, wants attendees at Friday’s UN disarmament conference in Geneva to commit to formal discussions over a ban.
“The reason I have been motivated to do this is simple,” Walsh said in a story published Thursday. “If we don’t get a ban in place, there will be an arms race. And the end point of this race will look much like the dystopian future painted by Hollywood movies like The Terminator.”
Walsh points to existing bans on chemical and biological weapons as a model. Neither ban has been perfectly effective, but both have greatly reduced the likelihood of those weapons appearing in combat. Walsh has long advocated such a ban: he was among the signatories of a 2015 open letter calling for one, and on Thursday he spoke at the UN for a third time to urge countries to consider the idea.
“All technology can be used for good or bad,” Walsh said in his article. “We need to make a conscious and effective decision soon to take the world down a good path.”
It’s not just on the battlefield that people are concerned about killer machines. Brad Wardell, CEO of Stardock, warned in September that private companies deploying autonomous security robots could drive a sharp increase in global inequality. As wealth accumulates with those who control autonomous robots, he argued, those machines could be used to crush any uprising among the masses.
Beyond a global ban, A.I. ethicists are turning to guidelines to help encourage “best practices.” A draft IEEE report published this week recommends that weapons without any meaningful human control be considered unethical. But any rules around A.I. will ultimately fall to the designers, as hard-coding ethical rules into machines is considered a less practical approach. A UN ban, if implemented, would likely mean arms corporations avoiding autonomous weapon designs altogether.