
The Ground War Will Be Autonomous

It's only a matter of time -- and whether robots will kill without human oversight.

A collage of three soldiers of the U.S. Army and a tank on a red background.
Photo by Kimberly Bratic / U.S. Army Tank Automotive Research Development & Engineering / Inverse photo illustration

Last week, an armed coalition began its fight to take back the city of Mosul, Iraq, from Islamic State terrorists. And in thoroughly modern fashion, the early hours of the mission were broadcast via Facebook Live. We saw convoys of trucks, plumes of black smoke, and armed soldiers. But soon, live streams like that one will include far fewer people, as more remote-controlled weapons hit the ground.

That reality isn’t here yet, but it might not be far off, says Dr. Robert Sadowski, the chief roboticist at the U.S. Army Tank Automotive Research, Development, and Engineering Center (TARDEC). (It’s the same shop that developed the hydrogen-powered truck that caused a stir earlier this month.)

While it’s pretty cool to imagine remote-controlled robo-tanks, like the one below, rolling through the dirt, at the heart of TARDEC’s R&D is keeping more soldiers alive. So far, one American — Navy SEAL Chief Petty Officer Jason C. Finan — has been killed. His Humvee rolled over an improvised explosive device on October 20 as it was exiting a minefield.

In future wars, Humvees like Finan’s might not have any people riding in them.

A ground vehicle being developed by TARDEC's "Extending the Reach of the Warfighter through Robotics" project.

U.S. Army Tank Automotive Research, Development and Engineering Center

At TARDEC’s facility in Warren, Michigan, the work underway is similar to much of what’s going on with commercial self-driving cars, with the big difference being that the U.S. military needs to be able to operate its vehicles in harsh conditions, without paved roads, and under enemy fire. And once the technology is perfected, another hurdle is ahead: The soldiers may need some convincing.

“There is a very healthy skepticism in the user community that says I can rely on my battle-buddy next to me; I don’t know if I can rely on my battle-robot,” Sadowski tells Inverse. “I can always ask the soldier next to me, ‘what are you thinking?’ And it’s not like a robot is going to come back and say, ‘based on this algorithm, I’m pointing right here.’”

Earlier this month, Sadowski outlined TARDEC’s plans for emerging autonomous ground systems at the annual meeting of the Association of the U.S. Army. Sadowski’s “autonomous kit” — a combination of sensors that gives unmanned vehicles situational awareness — was its centerpiece.

“Any single sensor that you’re going to employ isn’t going to be enough,” Sadowski says. “We mix a variety of sensors on the platforms in order for them to operate efficiently on the battlefield and on U.S. highways and roads.”

That combo includes cameras, infrared sensors, “plain automotive radars, like the kind used in collision warning and braking assistance in commercial vehicles,” and Lidar, a laser-based sensor that measures distance and is often used in 3D mapping.

“Most people are familiar with pictures of the Google car, or the Apple car – you’ll see that big thing on top that spins. That’s the Lidar. The really advanced systems are very expensive, produce a lot of data, and they’ve slowly but surely shrunk down the number of sensors to try and get the cost down,” Sadowski says. “What we’re trying to demonstrate is that it’s not just any single technology. You have to fuse the data appropriately.”
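To make the idea of “fusing the data” concrete, here is a minimal sketch of one common approach: weighting independent distance estimates from Lidar, radar, and a camera by how noisy each sensor is, so the most trustworthy reading dominates. This is an illustration only, not TARDEC’s software; the sensor names, noise figures, and readings below are assumptions.

```python
# Hypothetical illustration of inverse-variance sensor fusion.
# None of these numbers come from TARDEC; they are assumptions for the sketch.

def fuse_distance(readings):
    """Combine distance estimates (meters) from several sensors.

    `readings` is a list of (distance, std_dev) pairs, one per sensor.
    Each estimate is weighted by 1 / variance, so the least noisy
    sensor contributes the most to the fused value.
    """
    weights = [1.0 / (std ** 2) for _, std in readings]
    total = sum(weights)
    fused = sum(w * d for (d, _), w in zip(readings, weights)) / total
    fused_std = (1.0 / total) ** 0.5
    return fused, fused_std

# Example: three sensors report the range to the same obstacle.
sensor_readings = [
    (42.3, 0.1),   # Lidar: precise at this range
    (41.8, 1.5),   # automotive radar: noisier, but works in dust and rain
    (44.0, 3.0),   # camera-based estimate: least reliable for raw distance
]

distance, uncertainty = fuse_distance(sensor_readings)
print(f"Fused range: {distance:.1f} m (+/- {uncertainty:.2f} m)")
```

A real autonomy kit would run a probabilistic filter over time rather than a single-shot average, but the principle is the same: weight each sensor by how much it can be trusted under the current conditions.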

The key to Sadowski’s work is ensuring that the end user, in his case the soldier, trusts and accepts the platform he offers them, and that it’s developed enough that no bugs appear in the field.

As but one example, he mentions the BigDog, a four-legged robot developed by Boston Dynamics (later bought by Alphabet) that set the internet on fire in demonstrations that showed it hauling supplies, and even remaining upright after being kicked and shoved around by humans. There was just one problem: “It never went through the noise reduction, post-engineering effort that needed to be done,” Sadowski says. “So it sounded like a chainsaw.”


TARDEC’s work on its autonomous vehicles is still in the research and development phase. And Sadowski’s team is testing what it calls a “leader/follower” system that would be used in a logistics capacity rather than in combat. “What we do is have a [manned] lead vehicle that’s followed by a series of trucks that right now we have safety drivers in, but the goal is to get two people out of the cab. That testing right now is ongoing at Fort Bliss,” he says.

More intensive “leader/follower” tests will occur in the next six months, but the idea is that at some point, you could have a ten-truck convoy with two humans in the lead truck and nine automated trucks following behind.
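For readers curious how leader/follower might work in software, here is a simplified, hypothetical sketch: each follower truck stores the breadcrumb trail broadcast by the lead vehicle and adjusts its speed to hold a fixed gap behind the truck in front. The waypoint format, gap distance, and controller gain are assumptions for illustration, not details of the Fort Bliss system.

```python
# Hypothetical sketch of a leader/follower convoy speed controller.
# Details (gap, gain, waypoint format) are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float      # meters east of a local origin
    y: float      # meters north of a local origin
    speed: float  # lead vehicle speed at this point, m/s

class Follower:
    def __init__(self, desired_gap_m=25.0, gain=0.4):
        self.desired_gap_m = desired_gap_m  # distance to hold behind the vehicle ahead
        self.gain = gain                    # proportional gain on the gap error
        self.breadcrumbs = []               # trail broadcast by the lead vehicle

    def receive_waypoint(self, wp: Waypoint):
        """Store the lead vehicle's recorded path as it is broadcast."""
        self.breadcrumbs.append(wp)

    def command_speed(self, gap_to_vehicle_ahead_m: float) -> float:
        """Pick a speed: match the leader, then correct for the gap error."""
        if not self.breadcrumbs:
            return 0.0  # nothing to follow yet
        lead_speed = self.breadcrumbs[-1].speed
        error = gap_to_vehicle_ahead_m - self.desired_gap_m
        # Too far behind -> speed up; too close -> slow down. Never reverse.
        return max(0.0, lead_speed + self.gain * error)

# Example: a follower that has drifted 10 m too far back closes the gap.
truck = Follower()
truck.receive_waypoint(Waypoint(x=120.0, y=40.0, speed=8.0))
print(truck.command_speed(gap_to_vehicle_ahead_m=35.0))  # 8.0 + 0.4 * 10 = 12.0 m/s
```

A fielded system would also steer along the stored breadcrumbs and enforce safety limits on the gap; this sketch covers only the speed-holding logic.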

“Killer Robots”

The most controversial subject within autonomous machines is, of course, lethality. Human Rights Watch, and others, have called for a preemptive ban on autonomous weapons systems, or “killer robots.” Although the Pentagon doesn’t support that campaign, a Department of Defense directive requires that human beings make the final targeting decision when it comes to carrying out a potentially lethal strike. Many in the military, however, are pushing for accelerated autonomy in various weapons systems.

Sadowski, for his part, sees incredible value in semi-autonomous platforms, including unmanned vehicles with weapons on them. “I absolutely want to have first contact [with the enemy] with an unmanned platform,” he says.

Though it’s not exactly clear what that will look like — and Sadowski repeatedly stresses that he doesn’t see fully autonomous lethal weapons on the horizon — he offers a fairly straightforward example of what he’s describing.

If an unmanned tank had enough autonomy that it could be driven remotely and its weapon could be fired remotely, that could give U.S. troops a “greater protective bubble.”

A "Dismounted Soldier Autonomy Tools," or DSAT, being lowered by a helicopter.

U.S. Army Tank Automotive Research, Development and Engineering Center

“Creeping autonomy — just like the public has seen in cars and trucks over the decades — will steadily expand what the vehicles and weapons are capable of doing on their own,” he says.

The kind of remotely operated weapons Sadowski is talking about likely wouldn’t fall under the proposed autonomous weapons ban, says Peter Asaro, co-founder of the International Committee for Robot Arms Control and a supporter of a “killer robot” ban.

“I believe the real concern is that there is meaningful human control over the firing of weapons in each and every attack,” Asaro tells Inverse. “In the case of remote-control tanks, how much information is available to the human making the targeting decisions? How many tanks are they supervising? What is the quality of their situational awareness, in light of being physically remote?”

There should also be some assurance that a semi-autonomous weapon can’t be given greater autonomy with little oversight. “There is also a question as to how easily a tank could be turned into an autonomous weapon by updating its software,” says Asaro.

An often-overlooked issue in the field of autonomy in military operations is the role of automation bias, that is, the tendency for humans to blindly trust computers they’ve become accustomed to.

Virtually everyone agrees that autonomy will continue to spread across all platforms: vehicles, weapons, and cyber systems, both offensive and defensive. What exactly that will look like remains to be seen, but as of now there is still no end in sight to the conflict that began on 9/11, and has become so ingrained in U.S. policy that experts now refer to it as the “Forever War.”

The battle for Mosul that’s being waged right now could take weeks, months, or years. And no one knows what comes next. If history is any guide, the U.S. military will continue to operate in Iraq for the foreseeable future. At some point, a decade or more down the line, the tanks Sadowski is helping to develop now could be on the front line of another complicated battlefield.

Sadowski acknowledges that putting too much trust in our war machines could be dangerous, but he thinks we’re not there yet.

“I would worry long-term about people thinking they can just rely on the robots, because that gets you into the problem of spoofing robots and whatever else,” he says. “It depends how you view the character of future warfare.”
