Science

Future Soldiers Will Fight Alongside War Machines, but Present Soldiers Want Dogs

The Pentagon thinks human-machine teams are the future of ground war. But robots aren't loyal.

National Museum of American History/Flickr

Soldiers say there’s no bond stronger than the one connecting those who have faced combat together. This is core to the military’s “no one left behind” ethos, but it’s also at odds with the Pentagon’s research and development spending priorities. If human-machine teams are the future of ground warfare — and the U.S. is betting they are — what do those bonds look like? The best model out there is the Marine teams working with dogs, but canines have a lot more in common with humans than robots do, so the question of what role affection will play in the future of war remains unresolved.

The problem right now is that no one will lay down their life for their Roomba. And though some people form sentimental attachments to, say, their first car, that enthusiasm does not constitute a relationship. This is concerning if you’re a military thinker looking to send automatons into the field alongside soldiers. If soldiers can’t trust robots, and robots can’t rely on soldiers to prioritize their safety, teams will fail to form.

Speaking at the Sea Air Space expo last week, Deputy Assistant Secretary of the Navy for Unmanned Systems Frank Kelley imagined a world in which a Marine feels as protected and protective fighting alongside a robot as alongside a dog. “We need to be prepared with these relationships, with these warfighting relationships, relationships with Marines and sailors and machines alongside,” Kelley said.

A military patrol dog doing her thing.

Arctic Warrior/Flickr

One of the key developments in automated technology is the ability to learn and adapt. If a machine can respond to a command in real time, and anticipate what its human commander would want, there is real potential for trust. The thinking goes: if a robot seems to react to a person in a specific, individualized way, that person will see the robot as something closer to an individual.

There are two unmanned machines the Marines are currently testing, according to a Military.com report. One, called Spot, is a “quadruped that can take commands and execute them, but struggles to react to changing situations” developed by Boston Dynamics. As the name suggests, the machine was created to take the place of military working dogs, and it comes with a camera to scan a room for occupants. The other can transport gear, but it requires humans on both ends of its journey to input instructions.

The pairing of humans and robots won’t be limited to ground operations, either. The Air Force is pushing for pilots to have the ability to control drone swarms from their cockpit. “In the future, drones may be fully operated from the cockpit of advanced fighter jets such as the Joint Strike Fighter or F-22,” Air Force Chief Scientist Greg Zacharias reported earlier this week.

As with robo-dogs and anti-mine devices, scientists are pushing for increased autonomy in drone swarms. These machine wingmen could assist with targeting decisions, collect reconnaissance, or carry additional weapons payloads.

For now, pilots and sensor operators on the ground control unmanned aerial vehicles, often in teams of several people. As automated drone technology improves, the Air Force hopes that a single pilot will be able to command an entire squadron of robotic wingmen operating at various levels of autonomy. The military has already used automated helicopters in Afghanistan to transport cargo from one base to another. Compared with self-driving cars, self-flying helicopters are a fairly simple problem: the terrain is simpler, and the environment is less crowded.

For now, part of the problem with human-robot bonding comes from a lack of familiarity. Operators who are relatively unfamiliar with the capabilities and limitations of a machine are much less likely to trust it in a moment of extreme stress. But the opposite is also true. Within anti-explosive units, there are often high levels of trust and anthropomorphizing of the machines, almost to the extent that the devices become pets. “Soldiers formed such a strong bond with their explosive-disposal robots that they insist getting the same robot back after it is repaired or become sad if their damaged robot cannot be repaired,” a report from the Army Research Laboratory found.

Military dogs aren't just tools. They are soldiers' friends.

DMA Hawaii/Flickr

In effect, the canine model is both the future and the past of combat relationships. But there is a clear caveat: While many current soldiers had exposure to dogs growing up, few had prolonged or meaningful interactions with advanced robotics. Soldiers can be trained to use unmanned systems, but they’re not culturally conditioned to do so. And that’s where next-generation civilian technology makes a difference. A.I. systems and robots, from Alexa and Google Home to TK and TK, are increasingly a felt presence in American homes. For future fighters, interacting naturally with technology will be second nature and, more critically, a domestic given. Soldiers will increasingly come pre-conditioned to deal with robotic battlefield partners.

The increasingly competitive civilian marketplace will also change war machines for good. As companies attempt to differentiate their products by improving their interactivity, functionalities designed to build consumer loyalty have become important and will become more so. Specific functionalities will make bots seem less like products of mass production and more like beings worthy of protection. Robot types will become roughly analogous to dog breeds, exhibiting different abilities and dispositions. Rather than fading in relevance, the dog-soldier model will become increasingly desirable.

Still, problems arise when human operators put too much faith in the machines they use. That phenomenon is called automation bias, and it stems from the idea that computers are smarter than people, so it makes sense to defer to a machine’s recommendation. This is not a problem with dogs, which is part of what makes them so effective on the front lines: Dogs make recommendations, while robots make demands. This represents perhaps the most serious UX problem on Earth; no one wants bad coding to result in human casualties.

For now, most soldiers would rather have a dog with them than a robot. But as anyone who gets anxious without their smartphone at their side knows, it doesn’t take long to become completely dependent on a mechanical device. Maybe one day that dependence will extend to our robotic best friends. Maybe one day that dependence will extend to conflict zones.