
The True Cost of Westworld's Robot Sex

Within the first ten minutes of HBO’s Westworld, we witness the prelude to rape. A tall man in black, played menacingly by Ed Harris, drags Evan Rachel Wood’s screaming Dolores towards an empty barn by the collar of her demure blue dress. She’s still screaming as he throws her onto a pile of hay.

“But it’s okay,” we assure ourselves. “She’s just a robot.”

Westworld’s human guests aren’t obligated to uphold any sense of morality. The park’s sales pitch is a reversion to primal urges. In Michael Crichton’s pessimistic view, the breakdown of normal society leads immediately to sex and violence, enjoyed simultaneously, even synergistically. Hosts — that’s Westworld’s term for robots — are raped, graphically and without pause, because consent, as a concept, doesn’t apply to them. And perhaps that’s the case. But it doesn’t mean everyone involved in a robot rape comes away unscathed. Research suggests that the real victims of Westworld’s barbaric sexual culture would be human.

“I’m not concerned about robots. I’m concerned about human behavior,” Kate Darling, Ph.D., a research specialist and robot-human interaction expert at the MIT Media Lab, told Inverse in a discussion about Westworld. To Darling, the no-holds-barred sexual free-for-all posited in the show is an interesting thought experiment, but it doesn’t explain what might happen after visitors leave the park.

Two things could happen, she says: Either sex robots would continue to serve as a healthy outlet for our unhealthy urges, or they would whet people’s appetites for unsavory sexual fare. Both options have their own troubling implications, but the latter is more immediately problematic.

Patrick Lin, Ph.D., the director of the Ethics and Emerging Sciences Group at California Polytechnic State University, fears that sex robots might make us less human. “The fear is that people will become less in the habit of seeking consent and more into imposing their will and desires on others,” he told Inverse in an e-mail. A Westworld-like scenario presents a slippery slope similar to the one posed by violent films or video games, and perhaps an even more troubling one, because sex robots are so “visceral” and “immersive” that they blur the line between virtual and physical reality.

“I don’t think these fears have played out, but robots and virtual reality are much more immersive and visceral than previous technologies, so we don’t really know,” he says.

Sex robots — at least the sophisticated, highly anthropomorphic kind — haven’t been around long enough for us to tell whether they’ll affect our sexual interactions with each other. In her research, Darling is beginning to explore how human-on-robot violence might shape behavior outside the lab, and the preliminary results are less Hobbesian than we feared. In a 2014 experiment, Darling instructed participants to play with a set of Pleos — cute, wide-eyed robot dinosaurs the size of small cats — and then to tie them up, strike them, and kill them. Afterward, participants, especially the strongly empathic ones, described the experience as “disturbing,” says Darling, who took this as evidence that humans are somewhat confused by lifelike robots. In follow-up experiments using insect-like robots called Hexbug Nanos, she observed similarly hesitant, compassionate reactions.

But in Westworld, there doesn’t seem to be any confusion. In one scene, a human couple beams at the death of a swarthily handsome robot they killed in a “shootout,” then poses with its corpse for a photo. Whether they’ll take those behaviors home with them remains to be seen, but Crichton’s assertion — that humans are inherently shitty — suggests they will.

It might seem pessimistic, but the evidence suggests that the language of consent will simply be absent from encounters between humans and robots.

“Then the question is, if it does have an impact on people’s perceptions of consent and their behavior toward humans, what’s the best way to go about preventing that?” Darling asks. Should we, for example, program sex robots to say no? That depends on what we want from our machines: Sex robots are tools designed to deliver satisfaction. They could deliver an education as well, if that’s something we decide we need.

Some ethicists would argue that robots should, indeed, say no. “It is important for robots to say ‘no’ to us,” Lin says, noting that it’s the job of a “smart tool” to refuse our orders when it knows better. This already happens all the time: Autonomous cars turn right when turning left poses danger; nurse robots press critical medications on patients who refuse to take them. A sex robot that demands consent could be useful for stamping out our baser behaviors, in the same way that meditation apps train us to stop drifting off or exercise trackers buzz when we get too lazy.

But if we’re anything like the humans in Westworld, the last thing we’ll want is for our sex robots to be didactic. Not only would that defeat their purpose — aren’t they just Fleshlights and vibrators in elaborate casing? — but programming them with consent-seeking behavior would require a difficult admission on our part: that we need to be stopped.

But it’s just not clear that sex robots are the way to teach those lessons. “It might be that we just simply need to have a more general conversation as a society about this issue, which is obviously a bigger deal than just the sex robots,” says Darling. Keith Abney, Ph.D., a lecturer at California Polytechnic’s department of philosophy, likens the situation to the roboticist Ron Arkin’s suggestion that we build child sex robots as “outlets” for pedophiles, so that no actual children are harmed. “I’m not sure that’s a good idea, but not because of concern for the robots themselves, but because of what it means for the character of the humans involved,” he told Inverse in an e-mail.

“This isn’t a science fiction question,” Darling says. “This is a question for now.”
